Just out of curiosity, do you use Webroot on your computer? Or a Microsoft keyboard?
Nope and nope. To be honest, I don't even know what Webroot is, so I expect I'm not using it.
Anyway, I'll try a full format and reinstall to see if it works. If not, it's not a biggie; I use the second PC fairly rarely. Still, I'd rather have a functioning second PC than a big, unwieldy paperweight.
"Nothing is gonna save us forever but a lot of things can save us today." - Night in the Woods
If you're using an Nvidia graphics card and updated to 375.57, you may have noticed that some things in Windows 10 are suddenly kind of broken. Specifically, the tile system and the Mail and Calendar apps. If you don't use them, no problem; if you do, it's a known issue.
Given my history with Windows 10 glitches of seemingly every variety, this was a very worrying moment, followed by relief when my guess to uninstall all my Nvidia software (and revert temporarily to Windows 10's basic drivers) immediately fixed the issue after Microsoft's troubleshooting didn't.
To be fair, at least this is just the tiles crashing and not the TileDataLayer corruption that's been a problem in Windows 10.
The one time I have ever had issues with GPU drivers was back when I was running a GTX 560ti and had just updated to Windows 10.
Microsoft had decided that they were going to include drivers in Windows Update, which was an idea. Things worked out ok for a while until Nvidia released a GeForce driver that broke multi-monitor support. Microsoft, I guess, didn't get the memo that a new driver had been released, and Nvidia didn't get the memo that their shit was broke, so for like a week there was a tug of war going on that basically looked something like this:
10 GeForce Experience sees that your GPU driver is out of date and there is a new driver available and downloads it
20 That driver breaks multi monitor support
30 Windows Update sees that the new driver is different from the one it has on file
40 Windows Update decides that since the driver you are running is different from the one it has on file, the driver you are running must be outdated
50 Windows Update 'updates' you to the outdated driver that they have on file
60 GOTO 10
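The fatal step is line 40 of that loop: "different" is not the same as "outdated." A minimal sketch of the distinction in Python, with made-up version numbers:

```python
# Hypothetical driver versions -- the real numbers don't matter here,
# only the comparison logic does.
installed = "358.91"   # newer driver, installed by GeForce Experience
on_file = "358.50"     # older driver Windows Update has on file

# Broken logic (line 40 above): any mismatch counts as "outdated",
# so the newer driver gets "updated" back to the older one.
needs_update_broken = installed != on_file  # True -> downgrade!

# Correct logic: parse the components and actually compare versions.
def parse(version):
    return tuple(int(part) for part in version.split("."))

needs_update_correct = parse(installed) < parse(on_file)  # False -> leave it alone

print(needs_update_broken, needs_update_correct)  # True False
```

With an actual version comparison on file, step 40 never fires and the GOTO loop never starts.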
Software like GeForce Experience and Catalyst do an excellent job of keeping people's GPU drivers up to date, so I don't know why Microsoft thought they needed to get involved. If a person isn't using either of those programs and they aren't manually installing new GPU drivers, then odds are they aren't much of a gamer, don't care about new features or performance increases for a specific new game, and will probably be fine using whatever driver they got from the DVD in the box.
The two GPUs I have had since then have both been AMD, so I didn't know until a while ago that Nvidia started requiring logins to use their software. That is dumb as shit.
That's how I got the update--ironically, it happened to backfire this time. As someone said, GeForce Experience would probably benefit from a "rollback" option for one version, because new drivers usually have some minor bug or another--this one just happened to affect the operating system.
Drivers can have severe security vulnerabilities. Being able to roll out a patch for a vulnerability in, say, a network card driver is kind of a big deal.
I know Nvidia had an issue with their Tegra drivers a while back, but something like that should be opt-in if you are in a non-corporate environment, especially for a GPU, and especially if there is already a piece of software that exists solely to make sure your GPU drivers are constantly up to date.
Having Windows and GeForce Experience in conflict over which drivers were current is absolutely amateur hour stuff; it should never have happened.
Looks like Windows has been reinstalled correctly now, though I haven't yet done all of the setup. What's the best way to find out if one of my drives is borked and, if so, which one?
Shadowfire (Vermont, in the middle of nowhere):
If you open up Event Viewer and check your system logs, look for Errors from source Disk. If you see any that are DR0, that's a good indication that you've got a failing main drive. Otherwise you'd want to run diagnostic tools, and I'm not sure what good ones are available and also free.
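If you'd rather not click through Event Viewer by hand, the same check can be scripted against an exported log. A rough sketch in Python; the CSV columns and sample rows here are invented for illustration (a real export from Event Viewer's "Save All Events As..." has more columns, but Level/Source/Message are what matter):

```python
import csv
import io

# Invented sample export -- real DR0 messages look roughly like this,
# but don't treat these exact strings as authoritative.
sample_log = """Level,Source,Message
Error,disk,"The driver detected a controller error on \\Device\\Harddisk0\\DR0."
Information,Service Control Manager,"The Windows Update service entered the running state."
Error,disk,"The driver detected a controller error on \\Device\\Harddisk0\\DR0."
"""

# Errors from source "disk" are the ones to worry about...
disk_errors = [
    row for row in csv.DictReader(io.StringIO(sample_log))
    if row["Level"] == "Error" and row["Source"].lower() == "disk"
]

# ...and DR0 specifically points at the first physical drive.
dr0_hits = [row for row in disk_errors if "DR0" in row["Message"]]
print(f"{len(dr0_hits)} DR0 disk errors found")  # 2 DR0 disk errors found
```

Per the advice above, any DR0 hits are worth taking seriously; repeated ones usually mean the main drive is on its way out.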
Corporate environments have IT departments; they're the environment that needs automatic updates the least. The GeForce Experience thing at Windows 10 launch was bad, but security updates in mass market software should never be opt-in.
We just had a massive DDoS attack on Friday carried out primarily via compromised devices that used an opt-in security model. It's not good! Microsoft has had trouble with the execution, but they absolutely have the correct strategy here.
We just had a DDoS attack because the IoT is fundamentally broken and there are idiots out there who think that their fridge needs to be able to access Twitter; it has nothing to do with updating GPU drivers or Microsoft. You're confusing the two issues.
Having opt in or opt out driver updates in the context of this conversation doesn't matter when A.) Microsoft doesn't handle firmware updates for your network-connected baby monitor because it doesn't run Windows and B.) your internet connected thermostat probably doesn't have an nVidia GPU inside of it.
Deciding to update a user's drivers for a product you don't manufacture, don't support, and apparently don't bother to check on whether or not you're even updating users to the correct version (there was literally no version checking on MS's GPU drivers) when there is already a freely available piece of software made and supported by the actual manufacturer that does that exact same task better is not the correct strategy.
The IoT is fucking dumb anyway, we've been telling manufacturers for years that they are building vulnerable devices and they do not care, and it's the reason why you will run into things like Can't sign in to Google calendar on my Samsung refrigerator, which is a sentence that no human being should ever have to type or read.
MichaelLC ("In what furnace was thy brain?", Chicago):
Ha, $2,700 for a fridge that requires occasional rebooting.
We should be embarrassed of ourselves for letting it come to this.
The IoT is one attack vector. You still have plenty of others including, yes, John Doe's Windows computer. That may not be as large a source of devices to compromise, but they're there too.
Also FWIW, I love using my internet connected thermostat. It's definitely a first-world situation, though.
We just got a new furnace and the OG Nest we have doesn't work with the furnace (I'll omit details). I'm trying to give it to my brother but I'm not sure he'll even use it.
On second thought, I wonder if there's some sort of adapter kit that lets the Nest work with newer furnaces. I'm guessing No, because the incompatibility would force people to move to the second-gen, but I guess it's worth the cursory search.
Yea, specific to Nest: my parents live in a 60 year old house, and someone wanted to sell my dad their old Nest for $50, but when we looked at the wiring setup, the Nest isn't even compatible with how my parents' heating system is wired. It definitely doesn't work with everything.
About IoT: the name is stupid, but the concept is the future. There are just several issues that vary from minor pains to "bad guys can use IoT devices to DDoS Dyn for lulz." In my opinion, IoT will change things a lot in the future. It isn't about being able to read Twitter on your fridge, but there are practical uses. Think if your fridge had Alexa or something like it built in, instead of having to have a separate device in the kitchen. You take out the last bag of carrots, and then tell your fridge to add carrots to your shopping list right there. Then the next time you're at a grocery store and pull up your list on your phone, carrots are on the list. There are a lot of useful things IoT can do.
The biggest issue, in my opinion, is that right now most IoT things are done by startups with startup mentality. These companies fail/sell themselves at a huge rate, and oftentimes the products they sell for a couple years end up being orphaned and are never updated again. Even from bigger companies, do you really think that Samsung/LG/GE/whoever is going to push software updates to IoT fridges for the 15 years you'll own one? Right now the answer is almost certainly no. What needs to happen is for there to be one standard. For lack of a better metaphor, we need the "Windows platform" of IoT. If there was one standard that everyone worked against, one that could continue to be updated over time regardless of external factors, it would go a long way. The issue is that Google, Apple, Microsoft, and Amazon are all trying to do this, in competition against each other, so instead of one standard we're probably going to end up with 4 standards that don't talk to each other, and not every 3rd party service talks to all 4 of them, and the situation doesn't really get better.
It is almost like it was back in the days of early computing, when you had IBM, Amiga, Commodore, Apple, etc all trying to make the dominant computing platform, and all the 3rd parties had to pick sides until someone did win.
There has been an attempt to use SmartThings or [something else I can't remember the name for whatever reason] as a standard, by making a large percentage of devices work with them. What's interesting is the Philips HUE bulbs seemed to have spurred the drive for standardization (though for the life of me I can't figure out why everyone wants them).
Shadowfire (Vermont, in the middle of nowhere):
I don't care about the bulbs themselves, but it's nice that they create their own mesh network, kind of like Sonos stuff.
This isn't unrelated. You advocated an opt-in mechanism for a feature that delivers security updates. That DDoS attack is an example of why opt-in security is bad. Requiring users to take action to install security-related driver updates will result in significantly more vulnerable systems, just as requiring users to take action to change passwords or install security updates on their IoT devices did.
An Intel driver had a vulnerability in Windows 7 last year. My last Windows laptop didn't have an automatic Intel driver updater installed; the Intel control panel had a button that opened the Intel driver website. As a user, I would have had to have been proactive to hear about the vulnerability and patch it.
Last year there was a vulnerability that could be exploited via malicious printer drivers. In that case, the operating system overriding the user-supplied driver by default is actually exactly the proper behavior to keep the user safe. Sure, there can be a switch to allow overriding the Microsoft supplied driver, but it should default to the off position.
Is this an ongoing problem, or an issue they fixed on launch day? Since the NVidia disaster, I've only ever seen Windows 10 reject unsigned drivers, which is a different (also security-related) issue.
Home routers were also a huge part of the attack. Routers made by huge networking companies, shipped with default passwords that users didn't change. You have to manually install updates on most routers, too.
If only there was a piece of software that automatically checked to see if your GPU driver was up to date oh darn wait there is it's called GeForce Experience.
You're talking about every driver for every device ever made while I am explicitly talking about a single piece of hardware (GPU) from a single vendor (Nvidia) that already has software available (GeForce Experience) that does a better job than Microsoft does. They have no business pushing GPU drivers on people, especially not outdated ones.
They have no business pushing anything on anyone. It's just too much to ask them to get things right the first time. So they should just spam "Hey idiot, install this!" and not abuse it. But that too is a pipe dream and they'll just end up doing what they think is best for them.
Your original statement put emphasis on GeForce, but included the entirety of the automatic driver update feature, the way I read it. So we've probably been talking past each other.
Sure, GeForce Experience can be a thing. I use it myself. It was flat out broken on my system for about six months earlier this year; always said I had the most recent driver and wouldn't even download new updates. The GeForce experience. It was enough of a headache to make me wonder if the issue at Win10 release day was actually a Microsoft problem.
It doesn't seem to conflict with Microsoft's auto update feature anymore, though? Are automatic Windows driver updates and GeForce Experience actively conflicting a problem for anyone, or is it just a "what if it breaks again?" scenario?
The world of security simply doesn't work in a "get things right the first time" manner. Bad people are going to be out there figuring out ingenious new ways of taking control of every facet of every type of computer. Herd immunity is not just a thing for people.
Yeah, that wasn't sarcasm. It's not possible, just shouldn't be an excuse either.
Oh herply derp, I completely misread your post, sorry! I thought you'd written "is it too much to ask...".
It may have already become common knowledge, but the Nvidia hotfix (I think?) drivers version 375.63 do not seem to have the issue. I installed them yesterday and so far, everything seems fine.
So this was a fun one. After the anniversary update I had a memory leak somewhere, so while I was trying to pin it down to a program or a device I would just have to reboot more often than normal. Normal is usually damn near never anymore. If I left the machine running, eventually, for no noticeable reason at all, I'd be using 99% of my 16GB of RAM. The machine would become unresponsive, but a reboot would get it back to working for a time. I wasn't too terribly concerned because it would usually take a couple of days to get to the point where a reboot was required. A minor annoyance at best. Eventually, though, it gnawed at me enough to try to figure out what it was. No programs running, nothing really going on, but there's that insane memory usage. I'll spare you all everything I went through to figure it out. Turned out to be the damn network driver of all things.
After all that, I think I'd be down for more of a push to be forced to update things. The driver was released in September, and if I had already updated it prior to being pushed the anniversary update, I wouldn't have encountered the problem. I honestly wish there was an easier, non-malware-ridden way to update all my drivers easily. I honestly wouldn't mind if Microsoft did a bit more driver pushing to home users. How many people are really keeping up on all their driver updates for every component in the PC? Probably very few of the total market. Many folks will only update if something stops working. But if there's a security flaw that a driver fixes and that's it, then unless the user is informed by the manufacturer, or keeps up on driver updates on their own, that flaw will probably remain on that PC.
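For the "keeping up on all their driver updates" problem, Windows does ship a built-in inventory tool: `driverquery /FO CSV /V` dumps the installed drivers with their link dates, which is enough to spot ancient ones. A hedged sketch that parses that kind of export in Python; the rows and the staleness cutoff below are invented for illustration:

```python
import csv
import io
from datetime import datetime

# Invented sample in the shape of `driverquery /FO CSV` output; a real
# dump has many more columns and hundreds of rows.
sample = """Module Name,Display Name,Link Date
e1d65x64,Hypothetical Ethernet Driver,8/20/2016 1:02:32 AM
nvlddmkm,Hypothetical GPU Driver,10/14/2016 9:00:00 PM
oldsound,Hypothetical Audio Driver,3/1/2009 12:00:00 AM
"""

CUTOFF = datetime(2015, 1, 1)  # arbitrary "probably stale" threshold

stale = []
for row in csv.DictReader(io.StringIO(sample)):
    linked = datetime.strptime(row["Link Date"], "%m/%d/%Y %I:%M:%S %p")
    if linked < CUTOFF:
        stale.append(row["Module Name"])

print("Possibly stale drivers:", stale)  # Possibly stale drivers: ['oldsound']
```

A link date isn't a release date and an old driver isn't automatically vulnerable, but it's a cheap way to see which components nobody has touched in years.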
Also, twitter on a fridge I don't get. Being able to sync the family google calendar to the fridge though could totally be useful if your household contains more than 1 person and some of those people are too young to have their own cell phone yet.
Kind of feel like an old man yelling at the clouds, but the Internet of Things just seems dumb in general. IJS, wtf do I need widgets on my fridge when I can just look at my phone/tablet/pc to get the same information?
I'm kind of on the fence about forced upgrades. While they are generally a force for good, their misuse (free Win 10 was a net positive, imo, but the use of the update system to push it wasn't entirely good) can be bad and damaging to user trust, etc.
“I used to draw, hard to admit that I used to draw...”
IoT most definitely has a place in the home, especially since smartphones have become more ubiquitous. Devices like Ring and Nest allow you to have better control over your house with minimal effort after setup. I think what we're debating over, though, is that the convenience they offer does indeed have a price. And most users AND companies are quick to ignore that fact.
Right, I actually read a sort of fear-mongerish article about the security vulnerabilities of IoT devices (ovens and toasters in particular) that touched on how they are actually a danger vs a help, though a more obvious use of IoT devices and their weaknesses was evidenced in the DDoS attack that happened.
It's definitely a two-edged sword.
IoT devices aren't my worry, not when routers themselves are a much more available and easier target.
Think of how many people just plug them in and hit the easy setup button to get it going; maybe they change the SSID and wifi password, but a lot of people don't bother to change the default admin logins.
And then who ever bothers to update firmware? I'm guilty of this myself, it's just not something I ever really think of checking for unless something is not working right.
Sure, it's always about usefulness. But that comes with the fact that this is an emerging thing, and people/companies are willing to try anything and see what sticks. Do I think an internet connected toaster is something I'd ever want/need? No. But again, a fridge with some connectivity where, if I take the last eggs out of the fridge, I can right away add them to a shopping list that syncs somewhere and ends up on my phone, so when I do get to the store the list is there already. That's useful to me, and I would imagine to a lot of people.
And I know that there are ways of doing this now, mostly around things like the Echo, Google Home, etc, but I'd almost rather have things for specific use cases built into the devices they make more sense for than have to dot amazon dots (see what I did there?) all over the house. I should just be able to tell my fridge, instead of random device near my fridge.
Yelling at the clouds pretty hard here, but do you really need your fridge or tablet grocery list to tell you that you need eggs? :P
lots of people only go shopping once a week/at payday, etc. So if it's something that I might need but I'm not going to the store for a couple weeks and don't use every day, sure. yelling at my fridge to add Ketchup to the grocery list is useful because there's a good chance I'll forget the ketchup bottle is almost empty when I go to the grocery store in a week.
Nvidia in this case, but there is an argument to be made about that.
Also I'm expecting someone to hack that fridge at some point and have it snap pics of people getting their midnight snacks.
We just got a new furnace and the OG Nest we have doesn't work with the furnace (I'll omit details). I'm trying to give it to my brother but I'm not sure he'll even use it.
On second thought, I wonder if there's some sort of adapter kit that lets the Nest work with newer furnaces. I'm guessing No, because the incompatibility would force people to move to the second-gen, but I guess it's worth the cursory search.
yea specific to nest my parents live in a 60 year old house and someone wanted to sell my dad their old nest for $50, but when we looked at the wiring setup the nest isn't even compatable with how my parents heating system is wired. It definitely doesn't work with everything.
about IoT: the name is stupid, but the concept is the future. There are just several issues that very from minor pains to "bad guys can use IoT devices to DDoS dyn for lulz" In my opinion, IoT will change things a lot in the future. It isn't about being able to read twitter on your fridge, but there are practical uses. Think if your fridge had Alexa or something like it built in, instead of having to have a separate device in the kitchen. You take out the last bag of carrots, and then tell your fridge to add carrots to your shopping list right there. Then the next time you're at a grocery store and pull up your list on your phone, Carrots are on the list. There are a lot of useful things IoT can do.
The biggest issue, in my opinion, is that right now most IoT things are done by startups with a startup mentality. These companies fail or sell themselves at a huge rate, and oftentimes the products they sold for a couple of years end up orphaned and never updated again. Even from bigger companies: do you really think Samsung/LG/GE/whoever is going to push software updates to IoT fridges for the 15 years you'll own one? Right now the answer is almost certainly no.

What needs to happen is for there to be one standard. For lack of a better metaphor, we need the "Windows platform" of IoT. If there was one standard that everyone worked against, one that could continue to be updated over time regardless of external factors, it would go a long way. The issue is that Google, Apple, Microsoft, and Amazon are all trying to do this in competition with each other, so instead of one standard we're probably going to end up with 4 standards that don't talk to each other, not every 3rd-party service will talk to all 4 of them, and the situation doesn't really get better.
It's almost like it was back in the days of early computing, when you had IBM, Amiga, Commodore, Apple, etc. all trying to make the dominant computing platform, and all the 3rd parties had to pick sides until someone finally won.
I don't care about the bulbs themselves, but it's nice that they create their own mesh network, kind of like Sonos stuff.
An Intel driver had a vulnerability in Windows 7 last year. My last Windows laptop didn't have an automatic Intel driver updater installed; the Intel control panel just had a button that opened the Intel driver website. As a user, I would have had to be proactive to hear about the vulnerability and patch it.
Last year there was a vulnerability that could be exploited via malicious printer drivers. In that case, the operating system overriding the user-supplied driver by default is actually exactly the proper behavior to keep the user safe. Sure, there can be a switch to allow overriding the Microsoft supplied driver, but it should default to the off position.
Is this an ongoing problem, or an issue they fixed on launch day? Since the Nvidia disaster, I've only ever seen Windows 10 reject unsigned drivers, which is a different (also security-related) issue.
Home routers were also a huge part of the attack. Routers made by huge networking companies, shipped with default passwords that users didn't change. You have to manually install updates on most routers, too.
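The default-credentials problem is mechanical enough to sketch. Here's a minimal, hypothetical audit check (the function name and the credential list are made up for illustration) that flags a device whose admin login still matches a common factory default, which is exactly the condition that let those routers get conscripted:

```python
# Hypothetical audit check: flag devices still using vendor default credentials.
# The list below is illustrative, not an actual vendor database.
COMMON_DEFAULTS = {
    ("admin", "admin"),
    ("admin", "password"),
    ("root", "root"),
}

def uses_default_credentials(username: str, password: str) -> bool:
    """Return True if the login pair matches a known factory default."""
    return (username.lower(), password) in COMMON_DEFAULTS

print(uses_default_credentials("admin", "admin"))    # True: still on the default
print(uses_default_credentials("admin", "hunter2"))  # False: at least it was changed
```

The point is that attackers run essentially this same lookup, just in reverse, against every device they can reach; anything still on the default list joins the botnet.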
Opt-in security is bad.
If only there was a piece of software that automatically checked to see if your GPU driver was up to date... oh darn, wait, there is: it's called GeForce Experience.
You're talking about every driver for every device ever made while I am explicitly talking about a single piece of hardware (GPU) from a single vendor (Nvidia) that already has software available (GeForce Experience) that does a better job than Microsoft does. They have no business pushing GPU drivers on people, especially not outdated ones.
Your original statement put emphasis on GeForce, but included the entirety of the automatic driver update feature, the way I read it. So we've probably been talking past each other.
Sure, GeForce Experience can be a thing. I use it myself. It was flat-out broken on my system for about six months earlier this year; it always said I had the most recent driver and wouldn't even download new updates. The GeForce experience. It was enough of a headache to make me wonder if the issue at Win10 release day was actually a Microsoft problem.
It doesn't seem to conflict with Microsoft's auto-update feature anymore, though? Are automatic Windows driver updates and GeForce Experience actively conflicting for anyone, or is this just a "what if it breaks again?" scenario?
The world of security simply doesn't work in a "get things right the first time" manner. Bad people are going to be out there figuring out ingenious new ways of taking control of any and all facets of all types of computers. Herd immunity is not just a thing for people.
http://steamcommunity.com/id/pablocampy
Same. Mine still says 375.57 is the latest. Waiting for it to update to 375.63 or something even further.
EDIT: Nevermind, it updated today to 375.63. Still not gonna update it juuuust yet though.
Yeah, that wasn't sarcasm. It's not possible, just shouldn't be an excuse either.
Oh herply derp, I completely misread your post, sorry! I thought you'd written "is it too much to ask...".
After all that, I think I'd be down for more of a push to force updates. The driver was released in September, and if I had already updated it prior to being pushed the Anniversary Update, I wouldn't have encountered the problem. I honestly wish there was an easier, non-malware-ridden way to update all my drivers, and I wouldn't mind if Microsoft did a bit more driver pushing to home users. How many people are really keeping up on driver updates for every component in their PC? Probably very few of the total market. Many folks will only update if something stops working. But if there's a security flaw that a driver fixes and that's it, then unless the user is informed by the manufacturer, or keeps up on driver updates on their own, that flaw will probably remain on that PC.
Also, Twitter on a fridge I don't get. Being able to sync the family Google calendar to the fridge, though, could totally be useful if your household contains more than one person and some of those people are too young to have their own cell phone yet.
PSN : Bolthorn
I'm kind of on the fence about forced upgrades. While they're generally a force for good, their misuse can be bad and damaging to user trust (free Win 10 was a net positive, imo, but using the update system to push it wasn't entirely good).
It's definitely a double-edged sword.
Think of how many people just plug them in and hit the easy-setup button to get going. Maybe they change the SSID and WiFi password, but a lot of people don't bother to change the default admin login.
And then who ever bothers to update firmware? I'm guilty of this myself; it's just not something I ever really think to check for unless something isn't working right.
And I know there are ways of doing this now, mostly around things like the Echo, Google Home, etc., but I'd almost rather have things for specific use cases built into the devices they make more sense for than have to dot Amazon Dots (see what I did there?) all over the house. I should just be able to tell my fridge, instead of a random device near my fridge.
Lots of people only go shopping once a week, at payday, etc. So if it's something I might need but won't buy for a couple of weeks and don't use every day, sure: yelling at my fridge to add ketchup to the grocery list is useful, because there's a good chance I'll forget the ketchup bottle is almost empty when I go to the grocery store in a week.
Tinfoil hat / "think of the children!" time:
And knowing where your kids play at, where their soccer games are, their names.
And doctor names, and bank details, etc.