The best are the cables that not only have all that but are also lined internally with whatever that grease is that helps insulate. After terminating a few of those it felt like I had been playing in lard.
Instead of buying a 1080, I took a hard look at it and realized there's really nothing out there that I can't play at 60+ FPS with everything cranked to Ultra at 1440p. But I did pick up an MSI Ghost GS60 with that Prime Day Amazon Warehouse coupon for $915. Pretty awesome to have a portable machine that can maintain 60 FPS on Medium at 1080p in GTA V.
High refresh is pretty sweet even if you're not requiring the most extreme twitch latency. A recent, specific example is Doom, where the camera motion blur looks fantastic at higher framerates but is an overbearing smear at 60, IMO.
Actually camera rotations in general are the most noticeable improvements at high refresh. They look super chunky to me now at 60hz.
Even with my GTX 1080 I still have a few titles that fall well short of 60 FPS at 4K (especially CPU-dependent games), but frankly I'd much rather be at 40 FPS than 20 FPS or lower.
So, aside from the fact there's some Thunderbolt magic involved, would it be possible to build some sort of Mini-ITX solution that could interface with a laptop and do similar things? I looked up the prices for the Stealth + Core recently (like, a month ago maybe), and I want to say the Core itself was on the order of $600 *before adding a vidcard*.
The "Thunderbolt magic" is what makes the whole thing work, period. It is literally some PCI-e lanes going over the cable, and the box on the other ends splits it up.
You would be talking about, like, a KVM switch? But that wouldn't be automatic.
It's probably USB-C, which is basically the whole reason USB-C is being touted as next-gen technology.
You can have mostly dumb terminals, then carry devices like that to increase graphics and give you storage and all that. Or a laptop that is a laptop and when you get home, you dock it and it becomes a gaming desktop with full GPU power.
I'm hoping USB-C takes off like mad.
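For a rough sense of scale on those tunneled lanes, here's a quick back-of-the-envelope comparison of nominal PCIe bandwidth by generation and lane count. These are nominal link rates only; real eGPU throughput comes in lower once the Thunderbolt controller and protocol overhead are accounted for.

```python
# Nominal per-lane PCIe rates -- illustrative only; real eGPU throughput is lower
# once Thunderbolt controller and protocol overhead are accounted for.

LANE_RATES = {
    # generation: (raw signalling rate in GT/s, encoding efficiency)
    "PCIe 2.0": (5.0, 8 / 10),     # 8b/10b encoding
    "PCIe 3.0": (8.0, 128 / 130),  # 128b/130b encoding
}

def usable_gbps(gen: str, lanes: int) -> float:
    """Nominal usable bandwidth in Gbit/s for a generation and lane count."""
    raw_gtps, efficiency = LANE_RATES[gen]
    return raw_gtps * efficiency * lanes

for gen, lanes in [("PCIe 2.0", 4), ("PCIe 3.0", 4), ("PCIe 3.0", 16)]:
    gbps = usable_gbps(gen, lanes)
    print(f"{gen} x{lanes:<2}: ~{gbps:5.1f} Gbit/s (~{gbps / 8:4.1f} GB/s)")
```

The takeaway is that an x4 link over a cable is a fraction of a desktop x16 slot, though in practice many GPUs don't come close to saturating x16 anyway.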
People have been making their own external GPUs using PCIe cards for years now. So you can totally do it yourself for less than $600, or there are other companies selling boxes that do it as well.
making patch cables with solid core outdoor cat6 is madness
always terminate to jacks on solid runs that aren't going to be moved
It was for the outdoor point to point wireless system. Had a dozen of them to link up all the buildings.
not using wireless bridges instead?
Were they daisy-chained or something?
Ok, bad choice of words. To be more specific, we linked all the buildings on the campus with these nifty Ubiquiti NanoBeam bridges. I started with a hub-and-spoke sort of layout to link the central building to several buildings within line of sight. The problem was, not all the buildings were within line of sight; two were blocked by hilly terrain. We had to relay the NanoBeams through switches to get all the buildings linked up. The system is incredible, though. Those NanoBeams are rated for 10+ km as long as there's line of sight. All the buildings were within 500 m of each other, so I didn't even need to bust out the alignment tool. In fact, I have them broadcasting at half power and I'm still seeing a solid ~-60 dBm with all the summer foliage grown in. We ran outdoor cable to each of them. Getting through both the outer layers without cutting into the twisted pairs is tough. On one I had literally one more try left before we'd have to rerun the cable.
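For anyone wondering how ~-60 dBm at half power over ~500 m pencils out, here's a quick free-space path loss sketch. Every input value below is an assumed illustrative figure, not the actual settings or datasheet numbers for these radios.

```python
import math

# Back-of-the-envelope link budget for a short 5 GHz point-to-point hop.
# All input values below are illustrative assumptions, NOT the actual radio
# settings from the post -- adjust to whatever your hardware is really set to.

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB."""
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_hz) - 147.55

distance_m = 500          # building-to-building distance from the post
freq_hz = 5.8e9           # assumed 5.8 GHz channel
tx_power_dbm = 20         # assumed "half power" transmit setting
antenna_gain_dbi = 16     # assumed dish gain, per end
misc_loss_db = 10         # assumed foliage / pointing / cable losses

rx_dbm = (tx_power_dbm + 2 * antenna_gain_dbi
          - fspl_db(distance_m, freq_hz) - misc_loss_db)
print(f"Path loss: {fspl_db(distance_m, freq_hz):.1f} dB")
print(f"Expected receive level: {rx_dbm:.1f} dBm")   # lands around -60 dBm
```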
The only justification I can see for more than the 1070 is to do 4k gaming at maxed 60Hz, and since not even the Titan can quite do that, why bother.
Also note I don't play games that see benefits from >60 FPS, like CS:GO.
All games see visual benefit from greater than 60 FPS if you have the refresh rate to actually display it.
Yeah okay but 1) I doubt my capacity to discern between 60 and 144 FPS and 2) I don't think Civ 5 will be notably improved by the difference even if I could see it.
1) So did I until I got my ROG Swift. Just trust me. I think the move from 60hz to 144hz is actually better for overall enjoyment of visuals than the move from 1080p to 1440p.
2) Agree on that much.
What I want to stress though is that the difference is very noticeable.
Even moving my mouse around on my two side 1080p (60hz) monitors feels janky and jerky on the Windows desktop compared to my primary 144hz monitor.
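If you want numbers for why panning and mouse movement feel chunky at 60Hz, the arithmetic is pretty simple. A tiny sketch follows; the pan speed is just an assumed figure.

```python
# Why camera pans and mouse movement look "chunky" at 60 Hz: at a fixed on-screen
# speed, each refresh shows a bigger jump. The 2000 px/s pan speed below is an
# illustrative assumption.

pan_speed_px_per_s = 2000

for refresh_hz in (60, 144):
    frame_time_ms = 1000 / refresh_hz
    jump_px = pan_speed_px_per_s / refresh_hz
    print(f"{refresh_hz:>3} Hz: {frame_time_ms:5.1f} ms/frame, "
          f"~{jump_px:4.1f} px between successive frames")
```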
I will have to get back to you after this weekend. I haven't had a chance to really test my new monitor yet, other than "yep, this makes Fallout 4 look a lot better," which could also be the G-Sync or the 1440p; too many changes over my old monitor to know which ones are making the difference. My money is still on
1440p > G-Sync >> higher FPS
Cost is also a factor there. G-sync is great, but it can also be the single most expensive element in the equation--G-sync can literally double the price of an above-average monitor.
Going 4K was, kind of ironically, a lot cheaper than G-Sync in my case.
My current plan is to try to ride out my 7870 and 1920x1200 monitor until the cost of a GPU and monitor that can do 4K acceptably is less than a rent/mortgage payment. That's the jump I want to do.
I've got a ROG Swift, a 2600k and a 980. No problem with 1440p @144hz here, though many modern games won't run at a high enough frame rate for 144hz to make a difference for me.
Gsync without the higher FPS is not that useful. It just removes screen tearing, that's all. And I never really noticed much screen tearing.
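For what it's worth, here's a simplified frame-pacing sketch of what a fixed 60Hz display does to a game running below the refresh rate, versus a variable-refresh one. The vsync model here is an assumption; real driver and buffering behavior varies.

```python
import math

# Simplified frame-pacing sketch: a game producing a new frame every ~22.2 ms
# (~45 FPS) on a fixed 60 Hz display vs a variable-refresh display.
# Model assumption: on the fixed display each finished frame waits for the next
# 16.67 ms refresh boundary; with variable refresh the panel refreshes the
# moment the frame is ready. Real drivers/vsync modes differ in the details.

render_ms = 1000 / 45         # time to produce each frame
refresh_ms = 1000 / 60        # fixed 60 Hz refresh period
n_frames = 8

ready = [i * render_ms for i in range(1, n_frames + 1)]

fixed = [math.ceil(t / refresh_ms) * refresh_ms for t in ready]  # wait for vsync
vrr = ready                                                      # shown immediately

def intervals(times):
    return [round(b - a, 1) for a, b in zip(times, times[1:])]

print("fixed 60 Hz present intervals (ms):", intervals(fixed))  # uneven mix of ~16.7 and ~33.3
print("variable refresh intervals  (ms):", intervals(vrr))      # steady ~22.2
```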
I also don't think 4k @ 60hz is worth it over 1440p @ 144hz. The latter makes games look much, much better, and doesn't introduce as many annoyances dealing with desktop applications as 4k does. Until we start seeing high refresh 4k monitors, I won't bother with it.
Speaking of monitors: I'm putting together an upgrade list, hopefully to be ordered this weekend. Going with an i7-6700K and a 1080, which will be a nice jump from my 670. I have no idea what monitor to look for, though, to take advantage of the new card. I'm assuming I want a 1440p 144Hz with G-Sync? Any suggestions?
I also don't think 4k @ 60hz is worth it over 1440p @ 144hz. The latter makes games look much, much better, and doesn't introduce as many annoyances dealing with desktop applications as 4k does. Until we start seeing high refresh 4k monitors, I won't bother with it.
I personally disagree, but that's also because I desperately want that added real estate, and because I actually had a video card that could consistently hit 60 frames a second. Like most people, I can't drop an additional $300 for G-Sync (though I wish I could, and I can understand why people are pissed off about G-Sync versus FreeSync now), and 144Hz without G-Sync can be...problematic.
I was actually completely set on getting a 1440p monitor, until I used one regularly and realized it was, to me, a very small step up from the 1080p I've had for half a decade now. There's a reason UI scaling is easier on it: it's not that different overall. I did have to put a lot more effort into UI scaling for a few specific games (I'm looking at you, Creative Assembly).
More work for me? A little. The superior option for me? Without a doubt.
You would want to go for 32" minimum for a 4k monitor. I doubt you can see the difference below that at typical desk distances.
Closer to 30" really, but it's not a bad guideline. That being said, because of my desk my monitor's a good bit closer than most peoples, I'm sure. Simply switching between 1080p and 4K makes it ridiculously apparent at my distance.
I meant between 1440p and 4k. I doubt you could see that on a 27" monitor at desk distance, for example.
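If you want to put rough numbers on "can you see it at desk distance," here's a quick pixels-per-degree sketch. The viewing distance and the ~60 px/deg "20/20 acuity" benchmark are assumptions and rules of thumb, not measurements.

```python
import math

# Quick pixels-per-degree estimate: how fine each monitor looks from a given
# viewing distance. The 24 in (~60 cm) distance and the ~60 px/deg acuity
# benchmark are assumed, commonly cited rules of thumb.

def pixels_per_degree(h_px: int, v_px: int, diagonal_in: float, distance_in: float) -> float:
    ppi = math.hypot(h_px, v_px) / diagonal_in
    one_degree_in = 2 * distance_in * math.tan(math.radians(0.5))
    return ppi * one_degree_in

distance_in = 24  # assumed typical desk distance

for name, h, v, diag in [("27\" 1440p", 2560, 1440, 27),
                         ("27\" 4K   ", 3840, 2160, 27),
                         ("32\" 4K   ", 3840, 2160, 32)]:
    ppd = pixels_per_degree(h, v, diag, distance_in)
    print(f"{name}: ~{ppd:4.1f} px/deg (20/20 acuity benchmark ~60 px/deg)")
```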
I need help.
Don't buy the 1080, it's not that big of an upgrade.
Wait for the inevitable 1080ti and THEN upgrade.
Seriously though, are there any games you're playing that you aren't maxing out on graphics?
do it
This sounds just responsibly irresponsible enough to be the right course of action.
Buy the 1080. Then give me your 980 Ti for giving you such good advice. I'll pay the shipping!
https://www.youtube.com/watch?v=fjTXEnRRMF0
Like, for example, here's an enclosure for $200: https://www.amazon.com/Akitio-AMZ-T2PC-TIA-AKTU-Thunder2-PCIe/dp/B00LTAUTHE and then you add a PCIe riser and a PSU.
Or if you Google around, you can find a few different PC DIY forums where they go over the options.
Excellent. Now I know when to replace my motherboard.
Yes, on everything bleeding edge ultramax.
I am running 144Hz at 1440p, and in most games I can get up to around 100 FPS at nearly ultra-max settings, with a little tweaking.