
Cloud Computing Is Retarded: Or, The Pendulum Swings Back

Posts

  • chamberlain Registered User regular
    edited March 2009
    You are my new IT hero. If I were still a net admin, I would want to be just like you.

    <3

  • Lizz the Blizz Registered User regular
    edited March 2009
    kildy wrote: »
    *grumble*

    Again, this is really how everything works anyway.

    Think about your office. Think really hard. How much of the stuff you use actually does more than temporary storage/rendering on your system? Of all the apps I have open right now, my Notepad scratch file is the only entirely local application running. Email? Just a local cache and application; all the data is IMAP on the Exchange server. A ton of PuTTY windows, because it would be silly for me to run the data-crunching applications off user desktops. Firefox windows, which are obviously just rendering the end result of code running somewhere else.

    My machine here IS a dumb terminal, it's just got an awful lot of hardware behind it trying to pretend it's not. That said, local machines aren't going away except under very specific circumstances. But even then, centrally managed machines are useful. Enterprise backups of your desktop VM farm. No data loss on hardware failure. Give new box, plug in VM path, run. Run-from-anywhere capability (no assigned desk; just come in, sit down, log in, and your customized desktop is there). Honest dumb terminals are useful for hospitals and educational systems, but centralized storage and management is useful anywhere (and not having to buy bleeding-edge desktops for everyone is a huge perk).

    Cloud computing isn't about replacing the world with dumb terminals. It's about reducing the massive waste inherent in putting overly fat machines on every desktop and in every server. It's an evolution of the idea of VMing things for resource and cost reduction. Grid and cloud computing aren't horribly different technologies; the difference is in how they're applied (and "cloud" is an overly broad marketing term). Grids tend to have a lot of machines, and you allocate the majority of them to working on one massive problem. Clouds are about allocating smaller blocks of processing to larger pools of varied requests. The EC2 cloud is just a fuckton of processing power, and you buy it by time chunk and the number of processors you need for that time. It's pay-per-use burst processing that can integrate with your systems.

    And that should never go away, because the alternative is having to buy 30 new servers plus rack space, power, and cooling because you want to crunch your indexes for a client again this month and that goes beyond normal expected processing needs. I don't WANT a 300-server datacenter for a company that honestly only needs 150 on a regular basis. I'd rather replace it all with some seriously hardcore machines running any form of virtualization, so I can get even more leverage out of the sliding windows of processing power certain systems require. I loved it when I worked at a company that went that route. We had specific processing times on a clock. The old method was to run dozens of machines that ran at 2% processor for 95% of the day and at 80% for the other 5%, doing all the math. The new method was that their effective hardware could run the actual day-to-day operations until that 5% showed up, then steal some resources for a few minutes from trivial systems, like the backup server, that didn't need them at the time. Real-time distribution of hardware resources to the things that need it, instead of massively overspeccing a datacenter Just In Case. Efficiency!


    Exactly!

    I guess your wording is better than mine, but this is pretty much word for word what I meant. :)
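
    To put rough numbers on kildy's burst-capacity point, here's a back-of-envelope sketch in Python. Every figure in it is an assumed placeholder for illustration, not a real price.

        # Owning burst capacity year-round vs. renting it by the hour.
        # All figures are illustrative assumptions.
        OWNED_SERVER_COST = 5000        # purchase price per server, amortized over 3 years
        POWER_COOLING_PER_YEAR = 1200   # power + cooling per server per year
        BURST_SERVERS = 30              # extra boxes needed only for the monthly crunch
        BURST_HOURS_PER_MONTH = 8       # the index-crunch job runs ~8 hours a month
        CLOUD_RATE_PER_HOUR = 0.40      # assumed on-demand rate per server-hour

        owned_yearly = BURST_SERVERS * (OWNED_SERVER_COST / 3 + POWER_COOLING_PER_YEAR)
        cloud_yearly = BURST_SERVERS * BURST_HOURS_PER_MONTH * 12 * CLOUD_RATE_PER_HOUR

        print(f"owning the burst capacity: ${owned_yearly:,.0f}/year")
        print(f"renting it by the hour:    ${cloud_yearly:,.0f}/year")

    Under these assumptions the rented burst capacity comes out nearly two orders of magnitude cheaper, which is exactly the point kildy is making.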

  • Topweasel Registered User regular
    edited March 2009
    kildy wrote: »
    Elitistb wrote: »
    Having centralized servers do all the work is great! For instance, now you only have to monitor that one computer for illegal and/or seditious activities!

    *grumble*

    *snip*

    Cloud computing for a datacenter at a business, where the cloud is owned and run by said business: smart. Cloud computing with slim client machines under the same guidelines: smart. Cloud applications, OSes, and data storage for the masses: retarded. It's a great way to make sure you don't own or control everything you do on your own equipment. We should be getting closer to personal ownership and management, not further away from it.

  • tsmvengy Registered User regular
    edited March 2009
    One thing I've always wondered about distributed computing (SETI@home, folding, etc.) is what the energy use looks like vs. a data center or something like that. It's great to have all that processing power of the distributed network, but what are the energy costs vs. doing something more centralized?

    EDIT: It looks like Folding@home on the PS3 is pretty energy-efficient.

  • kildy Registered User regular
    edited March 2009
    For grandma, storing everything/running everything remotely is kind of a shitty idea.

    For my remote marketing people, being able to VPN in over the web and load up a demo is keen, and it doesn't involve loading an entire fake environment onto their laptop or relying on the client's available machines not to render something oddly.

    It's a corporate thing, and it won't pick up much for home use beyond specialized systems (dumb terminals for things like Pandora are actually kinda neat, if overpriced, and your cable box pretty much IS a dumb terminal). Cloud computing for home users with no advanced needs should start at Google Apps and end at Amazon backups. That means cloud applications (because Office is goddamned expensive for features grandma doesn't need), web-based email (which is really just Google Apps' answer to Outlook), and cloud-based storage for backups (not primary copies, just data backup, because home users never have a good backup system in place and it's cheap to do it this way). Like everything, it will be overly pushed on end users by vendors trying to make more money. But it's only a tool for specific applications, just like VMs shouldn't replace everything on the planet, just large volumes of underused systems. Or their virtual desktop shit, in extremely selective cases.
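
    As one concrete flavor of the "cloud as backup, not primary copy" idea above, here is a minimal sketch that tars up a documents folder and pushes the archive to S3 with boto3. The bucket name and paths are placeholders, and it assumes AWS credentials are already configured.

        import datetime
        import tarfile

        import boto3

        SOURCE_DIR = "/home/grandma/Documents"   # hypothetical folder to protect
        BUCKET = "example-home-backups"          # hypothetical bucket name

        stamp = datetime.date.today().isoformat()
        archive = f"/tmp/documents-{stamp}.tar.gz"

        # Bundle the folder into one compressed archive...
        with tarfile.open(archive, "w:gz") as tar:
            tar.add(SOURCE_DIR, arcname="Documents")

        # ...and ship it off-site. The local copy stays primary.
        boto3.client("s3").upload_file(archive, BUCKET, f"backups/documents-{stamp}.tar.gz")
        print(f"uploaded {archive} to s3://{BUCKET}/backups/")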

  • durandal4532 Registered User regular
    edited March 2009
    I already get annoyed when I have to use my shitty connection to verify Steam applications. Extending that to my OS? No thank you sir.

    At work, of course, it would be lovely. I just don't see there being near enough benefit for me to go through the hassle at home.

    Edit: Gmail is useful, but I don't see there being a huge issue with basic utility programs. It's difficult for grandma to find it without the help of an enterprising youngster, but OpenOffice works perfectly well in place of Word. You can pretty much supply a basic user with every program they need for free.

    Take a moment to donate what you can to Critical Resistance and Black Lives Matter.
  • Darkchampion3d Registered User regular
    edited March 2009
    kildy wrote: »
    *snip*

    And greater efficiency = lower cost. In my current shop we run virtual server clusters simply because we can't afford to waste all of our money buying tons of boxes for retarded apps that demand to run on their own server but sit there 95% idle, 95% of the time.

    And in my last job, working as a contractor before I moved to this one, we were doing diskless computing deployments for school districts, mid-sized businesses, etc. (using Neoware, who are apparently part of HP now...). It vastly decreased the number of problems you had to deal with on the computers (reducing the number of PC technicians needed), and the thin clients were cheaper than fat boxes too.

    Somebody fucks up their PC and calls your helpdesk? Tell them to reboot. Rebooting basically wipes the machine back to zero, since it reloads from your golden image (which has been pre-built for their needs and is easily updated) and then maps My Documents and the desktop.
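
    A minimal sketch of that last mapping step, assuming a Windows thin client and a made-up file server name; real deployments would use the vendor's own tooling, so treat this purely as an illustration of the idea.

        import os
        import subprocess

        FILE_SERVER = r"\\fileserver01"   # hypothetical server holding home shares
        user = os.environ["USERNAME"]

        # Map H: to the user's home share at login; /persistent:no means
        # nothing about the mapping survives on the client after a reboot.
        subprocess.run(
            ["net", "use", "H:", rf"{FILE_SERVER}\homes\{user}", "/persistent:no"],
            check=True,
        )
        print(rf"H: mapped to {FILE_SERVER}\homes\{user}")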

    Our country is now taking so steady a course as to show by what road it will pass to destruction, to wit: by consolidation of power first, and then corruption, its necessary consequence --Thomas Jefferson
  • iTunesIsEvil Cornfield? Cornfield. Registered User regular
    edited March 2009
    Topweasel wrote: »
    Great way to make sure you don't own or control everything you do on your own equipment. We should be getting closer to personal ownership and management not away from it.
    In what way do you not own data that you have stored on a server?

  • kildy Registered User regular
    edited March 2009
    tsmvengy wrote: »
    One thing I've always wondered about distributed computing (SETI@home, folding, etc.) is what the energy use looks like vs. a data center or something like that. It's great to have all that processing power of the distributed network, but what are the energy costs vs. doing something more centralized?

    EDIT: It looks like Folding@home on the PS3 is pretty energy-efficient.

    It's actually more energy-efficient to do it on home machines than in massive datacenters, for two reasons:

    A) Cooling. It costs an absolutely stupid amount to cool a datacenter. It's goddamned ridiculous, and it leads to datacenter space actually being sold by cooling capacity rather than by how many racks you can cram into a room. One of my old companies owned an empty cage in a colo because the power-and-cooling-per-space requirements meant we needed to physically possess more blank space just to put more racks and power into our other cage. <3

    B) Power stepping! Most electronics can step down their power usage based on how much you're actively doing right now. It's why your machine gets hellishly loud for a second while everything spins up when you open a game after browsing the web for a while: variable power draw to save you money. Servers have this ability, but it's mostly turned off by default, to prevent horrible lag on random processing when the machine hasn't been doing much, and because it messes with some applications. I've had a few machines it works well with, but for the most part they run full blast all day long and never think they're idle long enough to drop their power draw. Plus, by default, servers draw a dickload of extra power for all their redundant drive setups and internal fans.
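
    A toy calculation putting both effects together: a datacenter pays a cooling/overhead multiplier (PUE) on every watt, while a home PC that is switched on anyway only adds the incremental watts of the extra load. All the numbers below are illustrative assumptions.

        DATACENTER_PUE = 2.0      # assumed: ~1 watt of cooling/overhead per watt of compute
        SERVER_WATTS = 350        # server running flat out, redundant disks and fans
        HOME_EXTRA_WATTS = 120    # extra draw when a desktop steps up from idle
        HOURS = 1000              # same amount of number-crunching either way

        dc_kwh = SERVER_WATTS * DATACENTER_PUE * HOURS / 1000
        home_kwh = HOME_EXTRA_WATTS * HOURS / 1000   # PUE is ~1.0 in a living room

        print(f"datacenter: {dc_kwh:.0f} kWh   home machines: {home_kwh:.0f} kWh")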

  • nexuscrawler Registered User regular
    edited March 2009
    I remember when I worked at a datacenter and our AC system went down

    It hit like 130 degrees in the server room D:

    People calling every 3 minutes saying their servers crashed.

  • NailbunnyPD Registered User regular
    edited March 2009
    Cloud computing is just new terminology for old philosophies applied to current technological advances. It's neither good nor bad; it depends on the application.

    The idea that a small business with multiple locations can implement a VM infrastructure, with multiple connections to the internet, to effectively centralize computing and replicate data while providing effective disaster recovery is very appealing. It's a throwback to the old mainframe design, but using current technology, and it's proving to be very cost-effective.

    XBL: NailbunnyPD PSN: NailbunnyPD Origin: NailbunnyPD
    NintendoID: Nailbunny 3DS: 3909-8796-4685
  • redx I(x)=2(x)+1 whole numbers Registered User regular
    edited March 2009
    durandal4532 wrote: »
    I already get annoyed when I have to use my shitty connection to verify Steam applications. Extending that to my OS? No thank you sir.

    At work, of course, it would be lovely. I just don't see there being near enough benefit for me to go through the hassle at home.
    It's not so much about that at home. It's about having access to all the same stuff you have at home from your phone, laptop, netbook, game console, and car as well, and from any of those things even when they belong to other people. Basically, anything that can run a browser or a thin client.

    It's really cool, but doing it yourself takes some know-how and a semi-dedicated computer running all the time. It's not necessarily that cool. Let the cloud take care of all that crap.


    Does anyone know what kind of expectation of privacy 'the cloud' has? Would the 4th Amendment cover, say, a VM running from an encrypted virtual HDD that the provider has no access to? What about whatever the GDrive is? Customs and cops have something near carte blanche when it comes to searching digital devices and media these days. If it's something like a thin client, there aren't any files on the device to be found, and worries about it being physically stolen don't really exist.
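
    One way to get the property redx is describing, where the provider holds only ciphertext it cannot read, is to encrypt locally before anything leaves your machine. A minimal sketch using the cryptography package; the file names are placeholders, and a real disk image would be encrypted in chunks rather than read into memory at once.

        from cryptography.fernet import Fernet

        key = Fernet.generate_key()   # keep this local; never upload it
        fernet = Fernet(key)

        # Encrypt the virtual disk before upload...
        with open("disk.img", "rb") as f:        # hypothetical virtual HDD
            ciphertext = fernet.encrypt(f.read())

        with open("disk.img.enc", "wb") as f:    # this is all the provider ever sees
            f.write(ciphertext)

        # ...and decrypt later on any machine that holds the key:
        # plaintext = Fernet(key).decrypt(open("disk.img.enc", "rb").read())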

    They moistly come out at night, moistly.
  • Decius I'm old! I'm fat! I'M BLUE! Registered User regular
    edited March 2009
    iTunesIsEvil wrote: »
    Topweasel wrote: »
    Great way to make sure you don't own or control everything you do on your own equipment. We should be getting closer to personal ownership and management not away from it.
    In what way do you not own data that you have stored on a server?

    In the way that you may not own the server it's being stored on. Think of Google in this application: sure, you "own" the Gmail account, but they own the server it's stored on, and they really could do anything they wanted with it. This is prevalent across most consumer-model cloud computing applications, and it's the reason I don't like cloud computing in general. I want ownership of my data, hook, line, and sinker: the data, the medium used to store that data, and the equipment used to operate and read that medium. Admittedly this is based on my mistrust of others when it comes to something that can directly affect me, but I feel safer when I know exactly what is being done with my data. This is why I operate my own "cloud" in the form of 3TB of NAS on my home network, parked behind a decent firewall.

    Also, if the equipment fails, I'm the one responsible for it. I'm not a company that might declare bankruptcy anytime soon and take all my services down with it. The only way I suddenly blink off the map is if I die, at which point I don't care. The only things that stand between me and my data are catastrophic hardware failure or no electricity.

    I never finish anyth
  • nexuscrawler Registered User regular
    edited March 2009
    You might own a Gmail account, but Google has every right to shut down their servers tomorrow, add new restrictions to the service, or deny you access at any time.

  • Decius I'm old! I'm fat! I'M BLUE! Registered User regular
    edited March 2009
    Bingo!

    I never finish anyth
  • iTunesIsEvil Cornfield? Cornfield. Registered User regular
    edited March 2009
    Does keeping backups not solve that problem, though? The same thing's gonna happen if your HDD grinds to a halt tomorrow, or you get an exceptionally nasty power surge.

  • ElJeffe Moderator, ClubPA mod
    edited March 2009
    So one of the reasons we're talking about this is the OnLive thing and the desire to run games (and other hardware-intensive apps) on central machines rather than on local machines, yes? For consumer applications?

    Won't this kinda, you know, seriously stymie hardware advancements? I mean, I'm a standard gamer. I like my games to look pretty, and so I buy the latest hardware. This creates demand for more advanced hardware. Game (and other software) developers see this trend, and develop for more advanced hardware. This also creates more demand. So chipset developers, graphic card developers, all these guys they keep pushing the envelope with faster, more powerful machines. Hardware improvement++

    Now let's imagine the games are all run on a central server with a single hardware spec. Why upgrade? What do you have to gain? The gamers are all locked into your hardware if they want to play games at all, so it's not like you gain much from upgrading all your hardware every six months. All you do is blow money.

    I suppose you could say the PC gaming model would more closely resemble the console gaming market, where you have a number of companies competing with different hardware sets, and that competition would drive the companies to keep upgrading. But if we look at the console model, we see them changing their hardware every 5 years or so.

    Of course, console developers are constrained to 5-year cycles because console gamers don't want to abandon their system every 6 months. There's a lot of investment in those consoles, so they need to last a while. Maybe you think this wouldn't apply to the service providers, who could upgrade incrementally. But A) they'd have a lot of investment in their hardware, too; B) every upgrade would require extensive bug-testing, which would discourage them from upgrading frequently; and C) seriously, they would likely be lazy about it, because everyone else would also be lazy about it. It's cheaper to just sit there and hawk the same stale hardware specs for a few years.

    But still, let's say the hardware cycle becomes 2 years instead of the 5 that's standard for consoles. We're still going from a 6 month cycle to a 2 year cycle, and the market will have shrunk dramatically. So: massive stagnation.

    But I wouldn't have to run the games on my own machine!

    Whee? I guess?

    I submitted an entry to Lego Ideas, and if 10,000 people support me, it'll be turned into an actual Lego set! If you'd like to see and support my submission, follow this link.
  • redx I(x)=2(x)+1 whole numbers Registered User regular
    edited March 2009
    But most people don't know how to run their own e-mail server, and it would be impractical for a lot of reasons anyway. In some cases, even with the threat of the carpet being yanked out from under you, it's worth the risk. You can mitigate that somewhat (back up everything to a local account via POP), but the risk is there to some extent.

    The same is true of cloud computing generally. There are cases where it's worth the risk, but it's a case-by-case call.
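
    The POP mitigation in practice, sketched with the standard library's poplib; the credentials are placeholders, and it assumes POP access is switched on for the account.

        import poplib

        mail = poplib.POP3_SSL("pop.gmail.com", 995)
        mail.user("you@gmail.com")          # placeholder address
        mail.pass_("app-password-here")     # placeholder credential

        count, _size = mail.stat()
        with open("gmail-backup.mbox", "ab") as out:
            for i in range(1, count + 1):
                _resp, lines, _octets = mail.retr(i)   # one raw message
                out.write(b"\n".join(lines) + b"\n\n")

        mail.quit()
        print(f"saved {count} messages locally")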

    They moistly come out at night, moistly.
  • moniker Registered User regular
    edited March 2009
    NailbunnyPD wrote: »
    Cloud computing is just new terminology for old philosophies applied to current technological advances. It's neither good nor bad; it depends on the application.

    The idea that a small business with multiple locations can implement a VM infrastructure, with multiple connections to the internet, to effectively centralize computing and replicate data while providing effective disaster recovery is very appealing. It's a throwback to the old mainframe design, but using current technology, and it's proving to be very cost-effective.

    Not to mention the benefit of a single working file for anything collaborative. This will be great for the construction industry, since you'll only have to make the contractors aware that you've updated the drawing file, rather than saving it as House_3-26-09 or whatever and FTPing it to everyone again.

  • Demiurge Registered User regular
    edited March 2009
    I think this is great for public libraries and/or net cafés in developing countries where people don't normally have access to the internet. Put a large server in a building in Rwanda, put up 20 terminals in various spots around the city, and let people surf the internet for unbiased news and information, leading to a more educated and knowledgeable population.

  • Decius I'm old! I'm fat! I'M BLUE! Registered User regular
    edited March 2009
    moniker wrote: »
    *snip*

    Not to mention the benefit of a single working file for anything collaborative. This will be great for the construction industry, since you'll only have to make the contractors aware that you've updated the drawing file, rather than saving it as House_3-26-09 or whatever and FTPing it to everyone again.

    Which is something that corporations have been doing for a long time, and is really nothing new. Files stored on file servers, with permissions allowing multiple users or groups of users access to the file. They save it and the changes are saved on the server.

    I think we need a separation between cloud computing and cloud storage (as if we need more buzz terms). Cloud computing is basically remote computing: running applications on a server but outputting them to a client. Nothing new, really. Cloud storage is simply storing data remotely and accessing it locally. Also nothing new.

    I'm detecting an overall theme here, known colloquially as "same shit, different pile."

    I never finish anyth
  • Decius I'm old! I'm fat! I'M BLUE! Registered User regular
    edited March 2009
    I'm still waiting for Larry Ellison to come screaming to the front of the line touting his NC concept (Network Computer, another fancy term coined in the '90s for a dumb terminal) as the wave of the future... again.
    It was Ellison who pushed the NC, right?

    I never finish anyth
  • kildy Registered User regular
    edited March 2009
    Cloud allows more in the back end than a single system, but yeah. The basic distinction between the stuff you already do at work and cloud computing is, well, ownership. It's not even the remote/over-the-internet bit; chances are you're already doing that.

    It's just someone else providing software as a service, hardware as a service, or storage as a service.

  • geckahn Registered User regular
    edited March 2009
    ElJeffe wrote: »
    snip

    You're assuming that there is no competition in the central-server gaming industry. If there is, and I have no doubt there will be, you would see faster advancement in graphics technology.

  • redx I(x)=2(x)+1 whole numbers Registered User regular
    edited March 2009
    Demiurge wrote: »
    I think this for public libraries and/or netcafe's in developing countries where people don't normally have access to the internet. Put a large server in a building in Rwanda, put up 20 terminals in various spots around the city and let people surf the internet for unbiased news and information. Leading to a more educated and knowledgable population.

    Why does it need a large server located in Rwanda? Even if you're scraping the bottom of the barrel for terminals, they ought to be able to render web pages. Run them from a static image; when they shut down, there's no cache. It takes a bit of RAM, but RAM is very cheap these days.

    They moistly come out at night, moistly.
  • tbloxham Registered User regular
    edited March 2009
    You might own a Gmail account, but Google has every right to shut down their servers tomorrow, add new restrictions to the service, or deny you access at any time.

    You might own a copy of Outlook Express, but Microsoft has the right to stop patching and updating it tomorrow.

    There are some problems with cloud computing, but most of them overlap exactly with personal computing. OnLive and whatnot is a silly use, since it requires enormous downstream bandwidth to deliver to the consumer, but there are huge numbers of excellent uses: code that operates on gigs of data but delivers a single-number output, data that needs to be available anywhere, and so forth.
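
    That "gigs in, one number out" shape is easy to see in code: stream the data in fixed-size chunks so the working set stays tiny, and ship back only the final figure. A sketch with a made-up file name; the statistic computed (count of zero bytes) is just a stand-in for real analysis.

        def aggregate(path, chunk_size=1 << 20):
            """Reduce an arbitrarily large file to a single number
            without ever holding more than one chunk in memory."""
            total = 0
            with open(path, "rb") as f:
                while chunk := f.read(chunk_size):
                    total += chunk.count(0)
            return total

        # Run this next to the data; only the one number crosses the wire.
        print(aggregate("huge-dataset.bin"))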

    "That is cool" - Abraham Lincoln
  • ElJeffe Moderator, ClubPA mod
    edited March 2009
    geckahn wrote: »
    ElJeffe wrote: »
    snip

    You're assuming that there is no competition in the central-server gaming industry. If there is, which I have mo doubt there will be, you would see faster advancement in graphics technology.

    Like I said, there will be competition. I just think there would be less competition in a market with a small fraction of the demand.

    I submitted an entry to Lego Ideas, and if 10,000 people support me, it'll be turned into an actual Lego set! If you'd like to see and support my submission, follow this link.
  • tsmvengy Registered User regular
    edited March 2009
    tbloxham wrote: »
    *snip*

    You might own a copy of Outlook Express, but Microsoft has the right to stop patching and updating it tomorrow.

    Except with Outlook Express, your e-mails are stored on your computer, so if Microsoft stopped updating it you'd still be able to use the program to access your e-mail. If Google shuts down, your e-mails and the program you use to read them are gone.

  • iTunesIsEvil Cornfield? Cornfield. Registered User regular
    edited March 2009
    tsmvengy wrote: »
    Except with Outlook Express, your e-mails are stored on your computer, so if Microsoft stopped updating it you'd still be able to use the program to access your e-mail. If Google shuts down, your e-mails and the program you use to read them are gone.
    Google shuts down and my email client still has all of my email on my local drive.

  • geckahn Registered User regular
    edited March 2009
    Yeah, you can use POP with Gmail.

  • redx I(x)=2(x)+1 whole numbers Registered User regular
    edited March 2009
    tbloxham wrote: »
    *snip*

    There are some problems with cloud computing, but most of them overlap exactly with personal computing. OnLive and whatnot is a silly use, since it requires enormous downstream bandwidth to deliver to the consumer, but there are huge numbers of excellent uses: code that operates on gigs of data but delivers a single-number output, data that needs to be available anywhere, and so forth.

    Excellent uses that aren't among the most latency-sensitive tasks in computing, that is. This is streamed gaming, right? I don't see what could possibly go wrong.
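
    To make the latency point concrete, here's a rough input-to-photon budget for streamed gaming versus local rendering; every figure is an illustrative assumption.

        capture_encode_ms = 15      # grab the frame and video-encode it server-side
        network_rtt_ms = 40         # round trip between player and datacenter
        decode_display_ms = 10      # decode and display on the client

        streamed_ms = capture_encode_ms + network_rtt_ms + decode_display_ms
        local_frame_ms = 1000 / 60  # one frame on a local machine at 60 fps

        print(f"streamed overhead:      ~{streamed_ms} ms")
        print(f"one local 60 fps frame: ~{local_frame_ms:.1f} ms")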

    They moistly come out at night, moistly.
  • Feral MEMETICHARIZARD interior crocodile alligator ⇔ ǝɹʇɐǝɥʇ ǝᴉʌoɯ ʇǝloɹʌǝɥɔ ɐ ǝʌᴉɹp ᴉ Registered User regular
    edited March 2009
    I find it very telling that people in this thread are referring to "cloud" computing as two or three different things and none of them are terribly new ideas.

    every person who doesn't like an acquired taste always seems to think everyone who likes it is faking it. it should be an official fallacy.

    the "no true scotch man" fallacy.
  • Topweasel Registered User regular
    edited March 2009
    Feral wrote: »
    I find it very telling that people in this thread are referring to "cloud" computing as two or three different things and none of them are terribly new ideas.

    No, but depending on the implementation you're talking about, the clients would be. Businesses, schools, libraries, and some others are better served by one or more implementations of cloud computing, as long as it's those businesses or their contractors that own and/or control the "cloud"; those ideas have been around forever.

    Home users are the new idea. Anything a person actually has to pay for should be handled completely within the parameters of their own equipment. If a file is deleted, you should be able to know that it is actually deleted. If you purchase a program, it should be installed on your computer. If you pull a trigger in a fricken game, it should pull the trigger then, not 40-80ms later. The worst, though, is the idea of cloud OSes for home users: seriously, the most important part of your system, the one thing that makes all the rest run, loading its services from the net and relying on an internet connection and a server you have no control over.

  • Qingu Registered User regular
    edited March 2009
    Comparing cloud computing to early mainframe/terminal computing misses the metaphorical bull/elephant in the allegorical china shop/room: the existence of the goddamn Internet.

    Here are some things that did not exist with mainframes/terminal computers:

    • Webmail and instant messaging
    • Social networking sites
    • Blogs and personal webpages
    • Online stores

    To say nothing of "cloud" applications like Google Docs. More and more of our culture and economy has moved into this newfangled space between computers. Moreover, the actual computers used to access and create this information are basically irrelevant and interchangeable: I can write my WordPress story just as easily on my MacBook as on my parents' computer.

    The structure and philosophy of the Internet seems to lend itself to a cloud model.

    I also don't see the downside to "centralization" if that centralization entails ubiquitous access and input. Granted, I know nothing about computers, but I honestly think my documents and information are a lot safer in my Gmail and GDocs than they would be as Word files on my laptop.

  • kildy Registered User regular
    edited March 2009
    They're safer in that Google has a vested interest in keeping them safe and a proven track record of building bulletproof data-retention models. That said, the number of home users with reasonable backup solutions is rather low. Personally, I keep most of my information in Gmail and my applications in Steam instead of dealing with a backup schedule and remembering to leave my laptop online at set times.

  • Darkchampion3d Registered User regular
    edited March 2009
    kildy wrote: »
    They're safer in that Google has a vested interest in keeping them safe and a proven track record of building bulletproof data-retention models. That said, the number of home users with reasonable backup solutions is rather low. Personally, I keep most of my information in Gmail and my applications in Steam instead of dealing with a backup schedule and remembering to leave my laptop online at set times.

    I just have a RAID 1 setup for my data drive. Simple, and I don't have to remember to do anything for it to work. But yes, the odds of me losing my Google Docs are much smaller than the odds of something fucking up both of my drives and losing everything on my PC. They have a redundant, enterprise-level backup system; I have a $200 RAID setup. No comparison :)
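
    One caveat with the RAID 1 approach: it only protects you if you notice when a disk dies. A minimal Linux-side check, parsing /proc/mdstat and flagging degraded mirrors (an underscore in the [UU] status means a missing member); treat it as a sketch, not a monitoring system.

        def degraded_arrays(path="/proc/mdstat"):
            """Return the md devices that are running with a missing member."""
            bad, current = [], None
            for line in open(path):
                if line[:1].isalnum() and " : " in line:
                    current = line.split()[0]             # device header, e.g. "md0"
                elif "[" in line and "_" in line.rsplit("[", 1)[-1]:
                    bad.append(current)                   # status like [U_] under this device
            return bad

        broken = degraded_arrays()
        print("degraded: " + ", ".join(broken) if broken else "all mirrors healthy")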

    Our country is now taking so steady a course as to show by what road it will pass to destruction, to wit: by consolidation of power first, and then corruption, its necessary consequence --Thomas Jefferson
  • Topweasel Registered User regular
    edited March 2009
    Qingu wrote: »

    The structure and philosophy of the Internet seems to lend itself to a cloud model.

    I also don't see the downside to "centralization" if that centralization entails ubiquitous access and input. Granted, I know nothing about computers, but I honestly think my documents and information are a lot safer in my Gmail and GDocs than they would be as Word files on my laptop.

    Agreed on the first part. But that doesn't make it a good thing.


    On the second part: accountability. It's safer because the effort of making sure information isn't lost falls into someone else's hands, and the problem is that once you leave a task with someone else, it's hard to get it back. Google keeps a literal history of the interwebs; once they scan a web page, it's in their system for life. How do you know Google or another company isn't doing the same with your documents? How do you know every variation isn't saved for the heck of it? How do you know that when you've deleted something, it's gone forever? How do you know someone isn't reading that copyright form you're finalizing and attempting to steal your idea? What happens when Google or Microsoft thinks you've broken their TOS? Do they give you back access to the documents they've saved? I don't think the discomfort of those answers is worth a peace of mind I can replicate with a CD burner.

  • moniker Registered User regular
    edited March 2009
    Feral wrote: »
    I find it very telling that people in this thread are referring to "cloud" computing as two or three different things and none of them are terribly new ideas.
    There is nothing new under the sun.

    The internet itself wasn't a terribly new idea when it finally came about; we had just finally gotten to the point where it was reasonably viable to implement, and so it got implemented. Getting the various products and services that 'cloud computing' entails to actually work well enough to be broadly accepted is rather new. Hopefully it leads to some beneficial and interesting places.

  • Qingu Registered User regular
    edited March 2009
    Topweasel wrote: »
    *snip*
    I really feel that the Internet has become a new kind of "space," a third category in addition to the entrenched concepts of "public space" and "private space."

    I think Google, in particular, has gone a long way towards defining the moral parameters of this space.

    With respect to the problems you brought up, some of them are incredibly troubling, but they reflect a mindset that regards activity on computers as taking place in "private space." I think this will shift with generations. 10 years ago, if you imagined someone at a computer, you probably conjured up an image of someone hunched over a keyboard in a dark room, blue light from the monitor glowing against his face—in other words, being on the computer implies you're alone. 10 years from now, I bet more people will conjure up images of online gatherings, of people communicating and sharing information with friends in this new "Internet space"—being on the computer implies you are connected.

    Just as everyone knows there are certain things you just don't do in public space, I think people will start to "know" that there are certain things you just don't do in "internet space." At the same time, people have struggled very hard to ensure that public space is used fairly, and I imagine that people will similarly struggle over their "rights" in Internet space to ensure that the controllers of this space are not abusing things like terms of service and unjustly denying access (at least I hope, but the trajectory of net neutrality is giving me good reason to hope.)

  • Feral MEMETICHARIZARD interior crocodile alligator ⇔ ǝɹʇɐǝɥʇ ǝᴉʌoɯ ʇǝloɹʌǝɥɔ ɐ ǝʌᴉɹp ᴉ Registered User regular
    edited March 2009
    moniker wrote: »
    Feral wrote: »
    I find it very telling that people in this thread are referring to "cloud" computing as two or three different things and none of them are terribly new ideas.
    There is nothing new under the sun.
    *snip*

    My primary point is that "cloud computing" doesn't really mean anything.

    Can anybody offer a concise definition of "cloud computing" that is not already covered by a better, more precise term like "distributed processing" or "software-as-a-service?"

    As far as I can tell, from this discussion and others, all "cloud computing" means is "anything that uses the Internet."

    every person who doesn't like an acquired taste always seems to think everyone who likes it is faking it. it should be an official fallacy.

    the "no true scotch man" fallacy.