Thanks, that worked. I had to reinstall it in its own partition, which made all the other files recoverable, but overall it's a little small to work with. I'm thinking of just getting a larger drive. Once I do, I can do a clean install on that new drive without any problems, right? And then use the older one as a backup or other random storage? What's the correct order of operations?
When you get a new drive, the way I would handle it is:
1. Disconnect the old drive.
2. Install the new drive.
3. Install Windows. You can use the same install media you just used; a clean install will be the only option. Check the advanced settings to make sure the entire disk is being formatted into a new NTFS partition. If the disk is unformatted, it will probably prompt you to create a new partition, which you should set to the full size of the disk. If the disk is already formatted, delete any existing partitions and create a new one.
4. Reconnect the old drive when the install process is over.
5. Move any files you want to keep over to the new disk.
6. Use the Disk Management utility (accessible by typing 'disk management' into the Start menu; it will probably be listed as something silly like 'Create and format hard disk partitions'). Delete all of the partitions on the old hard drive, then create a new partition that takes up the full capacity of the drive and format it as NTFS.
You can use the old drive for whatever you want at this point.
The only trouble you might run into is with step 4, when you reconnect your old drive. The old drive will still be bootable, and which drive the computer boots from is decided by the BIOS, so if it ends up booting into the old install, restart the computer and change the boot device priority in the BIOS.
Fan noise is a factor. It actually woke me up the next morning.
I know I'm not satisfied with the high temp results, but I also know it isn't a normal result anyway.
But, I thank you for the advice. We'll see soon enough.
I don't see anything out of the ordinary in what you're reporting. I think you're fine. I'm guessing some tweaks via Link will bring your temps down; but like you said, Prime95 is specifically meant to stress the system. If it stayed around 80°C I think that's pretty solid.
---
In other news, I got an email from my wife today. She wants me to help her pick out monitors for work (she can tell them what to buy). In addition, I may have convinced her to take one of my Monoprice monitors for home use; which means I can start looking at the 1440p monitors in earnest. Is the one linked a few pages ago still on sale?
I want a second monitor because my current one is quite old, and having a bunch of monitors at work has been awesome.
So.... what's looking good for the thrifty hobbyist at the 1680 x 1050 or above range? I don't need 4k or anything.
I guess one thing I'd like recommendations about too is a nice stand-thing or arm to attach them to. Currently when I type my monitor jiggles and dangit that's annoying. I want some method of reducing that. Would an arm even do that? I guess it's more the desk that's the issue.
when will the new cards show up on sites for ordering?
I want to say Friday for the 1080 and middle of next month for the 1070.
Do any of you fine guys know where I could find any information on why GPUs currently go with 3x DisplayPort and 1x HDMI, instead of something more practical for the mass consumer base like 2x HDMI and 2x DisplayPort?
Seems like the list of people needing to connect three GSync monitors simultaneously is probably pretty small compared to the list of people who might like to run two HDMI displays like a TV and a monitor.
when will the new cards show up on sites for ordering?
I want to say Friday for the 1080 and middle of next month for the 1070.
27th May for Founder's Edition 1080 @ $699 (which nobody should buy)
Unspecified date for aftermarket 1080, rumours say June.
June 10th for Founder's Edition 1070 @ $449 (which nobody should buy)
June 10th for aftermarket/third party 1070 @ $379 (which everybody below a 980 should seriously consider)
Do any of you fine guys know where I could find any information on why GPUs currently go with 3x DisplayPort and 1x HDMI, instead of something more practical for the mass consumer base like 2x HDMI and 2x DisplayPort?
Seems like the list of people needing to connect three GSync monitors simultaneously is probably pretty small compared to the list of people who might like to run two HDMI displays like a TV and a monitor.
My guess is that it's because it's easier to convert from DisplayPort to HDMI than the other way around.
I don't think there are universal 'everyone should consider' metrics.
To my mind, if you're gaming at a resolution where your current card can't keep a respectable minimum frame rate, then sure, consider going for the new hotness.
Most people on 1920x1080 are doing fine on GTX 970s, 780s, and 980s, and 1440p is still holding strong on an overclocked GTX 980 or a 980 Ti.
DX12 isn't really prevalent yet, and VR is still very much for early adopters. While it's nice that Nvidia is delivering technologies that improve those environments this generation, they don't yet benefit most gamers.
And in my case, it seems that none of the new Nvidia cards are going to deliver 50+ minimum fps at 2160p outside of SLI.
So I'm quite content to be doing 1080p on my big HDTV on a GTX 970 until I can upgrade to a 4K TV and a single GPU that can drive it.
Most people on 1920x1080 are doing fine on GTX 970s, 780, 980. 1440P is still holding strong on a GTX 980 OCed or a 980Ti.
Doom just released and I'm playing it at 1920x1080 with everything set to high, no AA, and 16X AF and getting anywhere from 40-130fps. Most of the time it's at a solid 60.
I'm doing this on a 3GB R9 280x, which is a two year old rebadge of a three and a half year old overclocked version of a four year old GPU, sitting in a system with a five year old i5 2500 (running at 3.3GHz stock). I've even been able to play some newer games supersampled at 2560x1440 with decent results. Also, this 280x is the ASUS version that has artifacting issues at higher clockspeeds, so I have it underclocked to reference speeds. These insanely long console generations are a godsend for people like me; you can get away with some pretty neat stuff on some pretty ancient hardware.
I could see the virtue in buying a 1070 for 1080p gaming if you had a 144hz monitor and wanted to hit a consistent 144fps or something though.
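To put a number on that 144 fps target: the per-frame time budget shrinks fast as the target frame rate rises, which is why a *consistent* 144 fps is so much harder than 60. A quick sketch (plain arithmetic, nothing card-specific):

```python
# Per-frame time budget at a given target frame rate.
# The whole frame (game logic + rendering) has to fit inside this budget
# every single frame to hold the target without dips.

def frame_budget_ms(fps: float) -> float:
    """Milliseconds available to produce one frame at the given frame rate."""
    return 1000.0 / fps

for fps in (60, 144):
    print(f"{fps:>3} fps -> {frame_budget_ms(fps):.2f} ms per frame")
# 60 fps leaves ~16.67 ms per frame; 144 fps leaves only ~6.94 ms.
```

So the card has less than half the time per frame, with no room for occasional slow frames.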
What is going on with DX12? I thought it was supposed to be the Messiah of PC gaming.
As an aside, I think the reason I may have thought my CPU was automatically OC'd was that during the Prime tests the turbo kicked in and then never turned off. I'm pretty sure I was confusing the 6700k turbo speed with the FX-8370 speed.
There's a handful of games with DX12 support out right now. I've played a couple -- Rise of the Tomb Raider, which gives somewhat of a boost over DX11 in certain areas, and Forza 6 Apex, which is fantastic. DX12 is different enough from previous APIs that simply bolting it onto an existing renderer isn't going to give you much. You have to plan well in advance, and very carefully, to take advantage of its low-level hardware access, and that's not going to happen overnight (or, you know, if at all).
Basically, "Hey, when is Epic going to get a good DX12 renderer in UE?"
Huh. Just realized that the new cards don't have 2 DVI ports. I've never actually had to think about the ports on my GPU.
Guess I can just track down an HDMI to DVI adapter and use that since I only have 1 HDMI port between both of my monitors and it's used for my consoles.
My guess is that it's because it's easier to convert from display port to hdmi than it is to do it the other way around.
Exactly. DisplayPort is a far more capable connection type but can be backwards compatible with HDMI. HDMI's bandwidth is much more limited than DP's and can't push 4K@60Hz.
It's like, when the Geforce 6800 came out with dual DVI, and there were people asking why. Who's going to use two DVI monitors instead of VGA? Why does it have two of these things?
I have no problem shifting from one format to another with connectors. Though I'd probably try to find at least one DVI-out port simply because I have a DVI-to-HDMI cable.
Exactly. DisplayPort is a far more capable connection type but can be backwards compatible with HDMI. HDMI's bandwidth is much more limited than DP and can't push 4k@60hz.
Pretty sure HDMI 2.0 can do 4K@60Hz. Earlier versions can't.
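The arithmetic behind that correction is easy to check. A rough sketch below uses the raw active-pixel data rate only (blanking intervals push the real requirement somewhat higher), and the 8/10 factor models TMDS 8b/10b coding overhead:

```python
# Why pre-2.0 HDMI can't carry 4K@60Hz: the uncompressed pixel data rate
# exceeds what HDMI 1.4's TMDS links deliver after coding overhead,
# while HDMI 2.0's higher link rate has headroom to spare.

def data_rate_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Raw uncompressed video data rate in Gbit/s (ignores blanking)."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

needed = data_rate_gbps(3840, 2160, 60)   # ~11.9 Gbit/s for 8-bit RGB
hdmi_1_4 = 10.2 * 8 / 10                  # ~8.16 Gbit/s effective after 8b/10b
hdmi_2_0 = 18.0 * 8 / 10                  # ~14.4 Gbit/s effective

print(f"4K@60Hz needs  ~{needed:.1f} Gbit/s")
print(f"HDMI 1.4 gives ~{hdmi_1_4:.1f} Gbit/s -> not enough")
print(f"HDMI 2.0 gives ~{hdmi_2_0:.1f} Gbit/s -> enough")
```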
I made the mistake of turning off my computer and when I turned it back on.......
Nothing....
No post. Just a boot cycle.
So who wants to help me throw parts at this computer until it works?
PSU, MOBO, or CPU?
You decide!
(Though first I'm getting an internal speaker to see if this thing is beeping anything.)
Though I could get another cooler for my old 2500K and see if it boots with this current PSU. If it does, then I just say 'fuck it' and RMA both the MOBO and CPU.
Canadian Pricing for the GTX 1080 Founders edition is (unsurprisingly) brutal. Most cards are sitting at $909 through NCIX, with some almost touching $1000. For whatever reason, the more trusted brands are the cheaper ones, with PNY and Zotac being the most unreasonable. Time to sit on ass, and see what the aftermarket cooler versions are worth, but I have a sinking feeling they won't be any cheaper.
I thought the purpose of the Founder's Edition was that they were going to be made by NVIDIA while the other companies added their own coolers. So what's the difference if we have these Founder's Editions made by Asus and everyone else?
TP-Link has released the first router to support the 802.11ad standard, claiming up to ~4.6 Gbit/s over the 60 GHz band. Will be looking forward to seeing range and stability tests on this new standard.
With how much 5 GHz gets blocked by walls compared to 2.4 GHz, I would think a 60 GHz signal would only be good in the same room as the router.
man, I originally thought you meant 6.0 GHz, not 60. I can't see 60GHz having any kind of wall penetration.
I thought the purpose of the Founder's Edition was that they were going to be made by NVIDIA while the other companies added their own coolers. So what's the difference if we have these Founder's Editions made by Asus and everyone else?
Ya, I'm at a loss as well. I guess something was lost in translation, or the aftermarkets saw an opportunity to just get a slice of that founder edition pie.
Edit: They've already adjusted most pricing down to an even playing field, with only the PNY offering being out of line. Everything besides the box on these looks to be reference.
Did some quick reading on 802.11ad Wi-Fi. Looks like it's designed for very short-range, same-room applications where you want fast speeds: streaming from a tablet to your TV, wireless portable HDDs, file transfers between laptops, etc.
Not really all that useful to have on a router itself; much better for peer-to-peer things.
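For a sense of what that link rate buys in those same-room scenarios, here's a back-of-envelope transfer-time estimate. The 50% efficiency factor is an assumption for illustration; real Wi-Fi throughput sits well below the advertised link rate:

```python
# Rough transfer-time comparison: 802.11ad's ~4.6 Gbit/s link rate vs a
# typical single-station 802.11ac link, for a large same-room file copy.

def transfer_seconds(size_gb, link_gbps, efficiency=0.5):
    """Estimated seconds to move size_gb gigabytes over a wireless link."""
    return size_gb * 8 / (link_gbps * efficiency)

file_gb = 25  # e.g. a Blu-ray-sized file
print(f"802.11ad (4.6 Gbit/s link): ~{transfer_seconds(file_gb, 4.6):.0f} s")
print(f"802.11ac (1.3 Gbit/s link): ~{transfer_seconds(file_gb, 1.3):.0f} s")
```

Under those assumptions, roughly a minute and a half versus five minutes, which is why it's pitched at peer-to-peer transfers rather than whole-house coverage.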
I could see the virtue in buying a 1070 for 1080p gaming if you had a 144hz monitor and wanted to hit a consistent 144fps or something though.
That's what is going on.
Exactly. DisplayPort is a far more capable connection type but can be backwards compatible with HDMI. HDMI's bandwidth is much more limited than DP and can't push 4k@60hz.
Cuz it's mo better that's why.
It probably will happen.
I feel like it's going to be Thunderbolt eventually. USB-C can't do PCIe passthrough.
Well, Thunderbolt 3 actually uses a USB-C connector; a Thunderbolt 3 port does Thunderbolt and/or USB 3.1 Gen 2.
No post, just a boot cycle. It does this nonstop unless I switch off the PSU.
That sounds like a bad mainboard. What model is yours? Can you give us your specs?
5820K
Gigabyte X99P-SLI
EVGA 16 GB RAM
Corsair 750W
Gigabyte GTX 980
http://www.newegg.com/Product/Product.aspx?Item=N82E16833704301