
The Petabyte Age: The End of Theory & the (exponential) advancement of Science

Zilla360 21st Century. |She/Her| Trans* Woman In Aviators Firing A Bazooka. ⚛️ Registered User regular
edited July 2008 in Debate and/or Discourse
Humans love to create, and all creation is essentially new or remixed data entering a closed thermodynamic system...(true?)
This is a kind of follow-on thread from the god + science thread, since I thought it might be too big of a tangent to post in there. :)

So:
The Petabyte Age:
Sensors everywhere. Infinite storage. Clouds of processors. Our ability to capture, warehouse, and understand massive amounts of data is changing science, medicine, business, and technology. As our collection of facts and figures grows, so will the opportunity to find answers to fundamental questions. Because in the era of big data, more isn't just more. More is different.
http://www.wired.com/science/discoveries/magazine/16-07/pb_theory

I guess the real question that arises is: just how far away is humanity from a computer like Multivac? We already have the capability to simulate atomic bombs and protein folding.
When we eventually hypothesize, theorize, model, and test far faster than any human, on top of this new model of statistical elimination, what role will religion have to play? Will it be that little whisper in our collective ears that tells us to stop, slow down, and [admonish us] for deriving our morality from common sense rather than Deistic Dogma?

I always like to think of that tall tale of a computer supposedly hidden somewhere deep in Tibet, administered by monks, and designed only to discover all the possible names of God. 8-)

But I'm getting ahead of myself. Because we're not there.

Yet. :)

|Ko-Fi Me! ☕😎| PSN | (C.H.I) Ltd. |🏳️⚧️♥️
Zilla360 on

Posts

  • Irond Will WARNING: NO HURTFUL COMMENTS, PLEASE!!!!! Cambridge, MA Moderator mod
    edited July 2008
    No matter how great our data modeling, there will always be unanswered and unanswerable questions that religion will claim to hold the answers to. Also, keep in mind that the information age has seen a stratification of knowledge and education: no matter how smart your scientists and techies on the coastal cities are, there are still whole states full of pigfuckers somewhere who are still handling snakes, and there will probably always be many more of the pigfuckers.

    Irond Will on
  • Popped Collar __BANNED USERS regular
    edited July 2008
    If technology grows so advanced that it's indistinguishable from magic to the average joe, then technology will eventually become the new religion.

    Popped Collar on
  • SanderJK Crocodylus Pontifex Sinterklasicus Madrid, 3000 AD Registered User regular
    edited July 2008
    I don't believe technology will ever be religion. It's treated with apathy in general, distrust by many, and only lauded by few. Sure, people are happy with their new gizmos, but for the overwhelming part they don't care at all how they work. Nor can most people ever grasp how they work, because they simply have not been educated in the specific fields. Technology has grown complex enough that Joe Consumer no longer has the background to understand it. This causes fear and distrust (microwaves/cellphones/powerlines/nuclear plants/genetic modification/biochemistry are dangerous!) and opens up great room for defrauding people (since they don't know what the hell is going on, you can sell them anything with flashy enough advertisement), which in turn causes more fear and distrust once those swindles are exposed. The media is no great aid in this, jumping on everything gone wrong and on any dumbass "study" showing that eating X causes cancer or that living near Y causes brain injury, with reporting in general done by people who don't appear to know 200-year-old science like the laws of thermodynamics (lol, another "water as fuel" story).

    SanderJK on
    Steam: SanderJK Origin: SanderJK
  • muninn Registered User regular
    edited July 2008
    If technology grows so advanced that it's indistinguishable from magic to the average joe, then technology will eventually become the new religion.

    I think it was Asimov who invented that whole technology-as-magic meme.

    muninn on
  • Popped Collar __BANNED USERS regular
    edited July 2008
    SanderJK wrote: »
    I don't believe technology will ever be religion. It's treated with apathy in general, distrust by many, and only lauded by few. Sure, people are happy with their new gizmos, but for the overwhelming part they don't care at all how they work. Nor can most people ever grasp how they work, because they simply have not been educated in the specific fields. Technology has grown complex enough that Joe Consumer no longer has the background to understand it. This causes fear and distrust (microwaves/cellphones/powerlines/nuclear plants/genetic modification/biochemistry are dangerous!) and opens up great room for defrauding people (since they don't know what the hell is going on, you can sell them anything with flashy enough advertisement), which in turn causes more fear and distrust once those swindles are exposed. The media is no great aid in this, jumping on everything gone wrong and on any dumbass "study" showing that eating X causes cancer or that living near Y causes brain injury, with reporting in general done by people who don't appear to know 200-year-old science like the laws of thermodynamics (lol, another "water as fuel" story).

    I don't think modern technology is anywhere close to what we are talking about.

    I'm thinking more of technology in a very distant future. Hundreds of years from now perhaps.

    Popped Collar on
  • electricitylikesme Registered User regular
    edited July 2008
    Ok, that article on "The End of Theory" is complete and utter crap, and slashdot.org already pretty well dissected it. The scientific method is more important than ever in an age where acquiring huge volumes of data is so easy, precisely because it's just that: data. Without a theoretical basis it doesn't actually mean much of anything, and there's no basis for making predictions from it.

    I mean, take the soon-to-be fully online LHC as an example (seriously, when the fuck are they turning it on? There's either a Higgs boson or there isn't; someone tell me now) - that thing is going to kick out terabytes of data per second. However, none of it is worth anything unless we can put it in the context of a theoretical model to establish what it should mean and what we should do next to select the right model. The Wired article implies that for some reason, by virtue of quantity, a correlation should be enough. Why?

    EDIT: Hell, take the article's example of Venter. Comparing DNA sequences on its own gets us absolutely nowhere. The only reason he accomplishes anything worthwhile is because we have the theory of evolution, and specifically a detailed knowledge of how most if not all DNA- and RNA-based organisms on Earth function and produce proteins (as well as a lot of knowledge of how those proteins work - active sites are conserved and often quite independent of the bulk, etc.). What is actually shown is how computers and large data sets can be used within the context of an effective and tested scientific theory to generate useful information rapidly. Absent theory, however, it's worthless.

    electricitylikesme on
  • friarbayliff Registered User new member
    edited July 2008
    Speaking of Asimov, one of my favorite things about the rapidly-evolving technology question is "The Last Question" by Asimov.

    http://www.multivax.com/last_question.html

    Another way to measure the technological advancement of a civilization is the Kardashev scale, which ranks a civilization by the magnitude of the energy consumption of the sum of its activities.

    http://en.wikipedia.org/wiki/Kardashev_scale
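Sagan's continuous interpolation of that scale turns the discrete types into a one-line formula, K = (log10(P) - 6) / 10 with P in watts. A minimal sketch (the ~1.5e13 W figure for humanity is a rough illustrative value, not something from this thread):

```python
import math

# Sagan's continuous interpolation of the Kardashev scale:
#   K = (log10(P) - 6) / 10, with P the civilisation's power use in watts.
# Type I corresponds to 1e16 W, Type II to 1e26 W, Type III to 1e36 W.
def kardashev(power_watts: float) -> float:
    return (math.log10(power_watts) - 6) / 10

# Rough, illustrative figure for humanity's total power use (~1.5e13 W).
print(f"humanity: K = {kardashev(1.5e13):.2f}")
print(f"Type I:   K = {kardashev(1e16):.2f}")
print(f"Type II:  K = {kardashev(1e26):.2f}")
```

On this interpolation humanity sits somewhere around K ≈ 0.7, well short of Type I.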

    It's hard to imagine that we would progress to any of these scenarios any time soon, but considering how technology and human advancement seem to grow exponentially, we could be there a lot sooner than you think. Speaking of which, consider this:

    http://en.wikipedia.org/wiki/Technological_singularity

    None of these things will happen in our lifetime, but if you consider how far we've come since the industrial revolution, and how far we project ourselves as going in the next equivalent timeframe, this kind of stuff starts to seem possible within the next several hundred years (maybe even less).

    Just my thoughts

    friarbayliff on
    Please use caution with the back-doors-open-and-everybody-flies-out button
  • Loren Michael Registered User regular
    edited July 2008
    http://bloggingheads.tv/diavlogs/11693

    ^^^Discussion on the singularity^^^

    http://bloggingheads.tv/diavlogs/11935

    ^^^Also talks about the singularity^^^

    Loren Michael on
  • stilist Registered User regular
    edited July 2008
    I thought this was an interesting response.

    It’s sorta like what ELM said: while machines are great at collecting data, somebody’s gotta be around to ask questions and give the machines projects.

    stilist on
    I poop things on my site and twitter
  • Zilla360 21st Century. |She/Her| Trans* Woman In Aviators Firing A Bazooka. ⚛️ Registered User regular
    edited July 2008
    electricitylikesme wrote: »
    The Wired article implies that for some reason, by virtue of quantity, a correlation should be enough. Why?

    EDIT: Hell, take the article's example of Venter. Comparing DNA sequences on its own gets us absolutely nowhere. The only reason he accomplishes anything worthwhile is because we have the theory of evolution, and specifically a detailed knowledge of how most if not all DNA- and RNA-based organisms on Earth function and produce proteins (as well as a lot of knowledge of how those proteins work - active sites are conserved and often quite independent of the bulk, etc.). What is actually shown is how computers and large data sets can be used within the context of an effective and tested scientific theory to generate useful information rapidly. Absent theory, however, it's worthless.
    Quantum Mechanics, Probability, Chaos Theory (fuzziness, focus), and all the roads not taken, simply for lack of time in which to do so?

    This becomes irrelevant in the Petabyte Age, as 'The Cloud' performs (and becomes ever more capable of) exponentially greater feats of comparative statistical analysis on massive data sets. Not only that, but it gains the ability to iterate and (hopefully) not repeat any past mistakes.

    Humans already iterate on theories based on the size of the available data set; take Newton, and how his theories were improved upon by Einstein. It's like knowing the difference between an apple and our Solar System: both operate at very different scales, and one requires vastly more data to model and describe than the other.
    And I've probably not explained what I meant to say, very well at all... :P

    Zilla360 on
  • poshniallo Registered User regular
    edited July 2008
    I always wonder why 'the singularity' is so talked about. It doesn't seem like something everyone would want to participate in and it seems to be predicated on colossal assumptions about the nature of reality, humanity and technology.

    Also the OP talks about 'common sense' as if it's somehow objective, when it very much isn't.

    It all reminds me of the Rapture.

    poshniallo on
    I figure I could take a bear.
  • amateurhour One day I'll be professionalhour The woods somewhere in Tennessee Registered User regular
    edited July 2008
    I think tech is eventually going to hit an end point, or apex, like anything else. It's nice to dream of a Jetsons future, and it may be attainable one day, but what happens next?

    I just think one day we'll hit a point where we have the things we need, and there may be upgrades for the sake of upgrading, but as far as innovative technology goes, we will have reached an endgame.

    amateurhour on
    are YOU on the beer list?
  • Octoparrot Registered User regular
    edited July 2008
    poshniallo wrote: »
    I always wonder why 'the singularity' is so talked about. It doesn't seem like something everyone would want to participate in and it seems to be predicated on colossal assumptions about the nature of reality, humanity and technology.

    Also the OP talks about 'common sense' as if it's somehow objective, when it very much isn't.

    It all reminds me of the Rapture.

    Yeah, it could always just be a really pinched Gaussian.

    Octoparrot on
  • stilist Registered User regular
    edited July 2008
    amateurhour wrote: »
    I think tech is eventually going to hit an end point, or apex, like anything else. It's nice to dream of a Jetsons future, and it may be attainable one day, but what happens next?

    I just think one day we'll hit a point where we have the things we need, and there may be upgrades for the sake of upgrading, but as far as innovative technology goes, we will have reached an endgame.
    I rather doubt this. Why don’t we just stop now? We already have technology that, if applied correctly, could cure or reduce most diseases, bring everybody up to a high standard of living, generate large amounts of money, move us to sustainable farming, and shift us away from oil. Further, comments similar to yours have been made many times throughout history—‘I can’t imagine anybody needing more than we have’, ‘there’s nothing left to invent’, and so on.

    As long as there’s money to be made and a decent level of education, there will be innovation. Not all of it worthwhile or useful, but it’ll be there.

    stilist on
  • electricitylikesme Registered User regular
    edited July 2008
    Zilla360 wrote: »
    The Wired article implies that for some reason, by virtue of quantity, a correlation should be enough. Why?

    EDIT: Hell, take the article's example of Venter. Comparing DNA sequences on its own gets us absolutely nowhere. The only reason he accomplishes anything worthwhile is because we have the theory of evolution, and specifically a detailed knowledge of how most if not all DNA- and RNA-based organisms on Earth function and produce proteins (as well as a lot of knowledge of how those proteins work - active sites are conserved and often quite independent of the bulk, etc.). What is actually shown is how computers and large data sets can be used within the context of an effective and tested scientific theory to generate useful information rapidly. Absent theory, however, it's worthless.
    Quantum Mechanics, Probability, Chaos Theory (fuzziness, focus), and all the roads not taken, simply for lack of time in which to do so?

    This becomes irrelevant in the Petabyte Age, as 'The Cloud' performs (and becomes ever more capable of) exponentially greater feats of comparative statistical analysis on massive data sets. Not only that, but it gains the ability to iterate and (hopefully) not repeat any past mistakes.

    Humans already iterate on theories based on the size of the available data set; take Newton, and how his theories were improved upon by Einstein. It's like knowing the difference between an apple and our Solar System: both operate at very different scales, and one requires vastly more data to model and describe than the other.
    And I've probably not explained what I meant to say, very well at all... :P
    Actually, you haven't explained anything - you've supported my point. How does the ability to compare large sets of data obviate the need for understanding?

    A hypothesis isn't just "made up" - it is always based on an observation. The point, though, is that it needs to predict a future, different observation - some corollary - so that we can test the hypothesis for its predictive ability. Curve fitting a large data set doesn't do that: it doesn't tell you anything meaningful about what to do to confirm the fit, nor does it give you any reason to believe the curve behaves as modelled outside the data set (in fact, it frequently does not).

    EDIT: Or put it another way - frequently interesting trends are the starting point of a hypothesis. They are not the endpoint of science, nor ever can be.
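The extrapolation point can be made concrete with a small, hypothetical sketch: a high-degree polynomial can fit noisy observations very well inside the observed window while saying nothing reliable outside it. All numbers here are invented for the demo:

```python
import numpy as np

rng = np.random.default_rng(0)

# "Observations": noisy samples of an underlying process (here, a sine).
x = np.linspace(0, 5, 30)
y = np.sin(x) + rng.normal(0, 0.05, x.size)

# A degree-9 polynomial fits the observed window very well...
coeffs = np.polyfit(x, y, 9)
in_sample_err = np.max(np.abs(np.polyval(coeffs, x) - y))

# ...but gives no reason to trust its behaviour outside [0, 5].
x_out = np.linspace(6, 10, 30)
out_err = np.max(np.abs(np.polyval(coeffs, x_out) - np.sin(x_out)))

print(f"max error inside the data:  {in_sample_err:.3f}")
print(f"max error outside the data: {out_err:.3f}")
```

The fit is excellent where the data lives and diverges badly beyond it, which is exactly why a curve fit is not a predictive theory.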

    electricitylikesme on
  • saggio Registered User regular
    edited July 2008
    Correlation, even significant statistical correlation, tells us very little in real terms.

    Theory allows us to leverage what can be learned from statistical correlation in the most effective means possible, which in turn allows us to actually express what we have discovered about the world.
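A minimal illustration of why correlation alone tells us little: two completely independent random walks routinely show large sample correlations. This is a standard spurious-correlation demo, not anything from the thread:

```python
import numpy as np

rng = np.random.default_rng(1)

# Correlate many pairs of *independent* random walks.
corrs = []
for _ in range(200):
    a = np.cumsum(rng.normal(size=500))  # independent walk 1
    b = np.cumsum(rng.normal(size=500))  # independent walk 2
    corrs.append(abs(np.corrcoef(a, b)[0, 1]))

# A large fraction of the pairs look "significantly" correlated anyway.
frac = float(np.mean(np.array(corrs) > 0.5))
print(f"fraction of independent pairs with |r| > 0.5: {frac:.2f}")
```

With no theory to say *why* two series should be related, even a strong correlation coefficient is cheap.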

    saggio on
    3DS: 0232-9436-6893
  • Loren Michael Registered User regular
    edited July 2008
    amateurhour wrote: »
    I think tech is eventually going to hit an end point, or apex, like anything else.

    Like what else?

    Loren Michael on
  • tbloxham Registered User regular
    edited July 2008
    Zilla360 wrote: »
    Humans love to create, and all creation is essentially new or remixed data entering a closed thermodynamic system...(true?)

    This is a kind of follow-on thread from the god + science thread, since I thought it might be too big of a tangent to post in there. :)

    So:
    The Petabyte Age:
    Sensors everywhere. Infinite storage. Clouds of processors. Our ability to capture, warehouse, and understand massive amounts of data is changing science, medicine, business, and technology. As our collection of facts and figures grows, so will the opportunity to find answers to fundamental questions. Because in the era of big data, more isn't just more. More is different.
    http://www.wired.com/science/discoveries/magazine/16-07/pb_theory

    I guess the real question that arises is: just how far away is humanity from a computer like Multivac? We already have the capability to simulate atomic bombs and protein folding.
    When we eventually hypothesize, theorize, model, and test far faster than any human, on top of this new model of statistical elimination, what role will religion have to play? Will it be that little whisper in our collective ears that tells us to stop, slow down, and [admonish us] for deriving our morality from common sense rather than Deistic Dogma?

    I always like to think of that tall tale of a computer supposedly hidden somewhere deep in Tibet, administered by monks, and designed only to discover all the possible names of God. 8-)

    But I'm getting ahead of myself. Because we're not there.

    Yet. :)

    This is the silliest article ever. I actually am a nuclear physicist, and as such have an overwhelming quantity of data available on my computer right here from simulations, detectors, and so forth. I'm talking thousands of gigabytes, and there are others in my building working with orders of magnitude more data.

    However, what you discover is that the more data you have, the more potentially useful it becomes - but the less useful it is, in its raw state, for the exact problem you actually want to solve. 90% of my actual scientific work is producing statistical models and code to operate on the data and extract useful information from the noise and background, or to produce useful ways of displaying the data so that a human can get answers from it.

    Look at it this way: in the keyboard you are using right now are encoded hundreds of billions of billions of petabytes of data - in atomic spins, nuclear decays, and so forth. All the secrets of the stable physical universe at low temperatures lie at your fingertips, and likely many of the secrets of the high-energy universe. Is there a Higgs boson? What is the mass of the neutrino? What is the relationship between gravity and the electroweak and strong forces? All the answers to these questions are encoded in the matter around you. Unfortunately, just like any data that could be produced from our omni scanner, it is nigh on impossible to simply look at, requires huge amounts of processing to understand, and quite a lot of brains to then remake into a useful system which answers your questions.
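The "extract useful information from the noise" step can be sketched with a toy detector: a weak peak invisible in a single noisy run becomes obvious after averaging N runs, since uncorrelated noise shrinks roughly as 1/sqrt(N). The peak shape and all numbers are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

channels = np.arange(100)
# True "physics": a weak peak at channel 50, amplitude 0.3.
signal = 0.3 * np.exp(-((channels - 50) ** 2) / 20)

# One run: noise (std 1.0) swamps the peak.
one_run = signal + rng.normal(0, 1.0, channels.size)

# Average of 10,000 runs: residual noise std is ~0.01, so the peak stands out.
avg_run = signal + rng.normal(0, 1.0, (10_000, channels.size)).mean(axis=0)

print("peak channel found in one run:     ", int(one_run.argmax()))
print("peak channel found after averaging:", int(avg_run.argmax()))
```

The raw data contained the answer all along; it took a statistical procedure, chosen because of what we expect the signal to look like, to get it out.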

    tbloxham on
    "That is cool" - Abraham Lincoln
  • amateurhour One day I'll be professionalhour The woods somewhere in Tennessee Registered User regular
    edited July 2008
    I think tech is eventually going to hit an end point, or apex, like anything else.

    Like what else?

    Civilizations in general have a history of expanding technologically with the elements available to them at the time until they hit an apex and then fall inwards. It's happened over and over again. Obviously there are other factors involved - I won't just blame technology - but it plays an important part in the process.

    Regardless of how amazing things are getting, there's always a breaking point to something. I'm not trying to strawman or kill the argument or say "lol we're gonna apocalypse ourselves so why bother?", I'm just saying that I don't see technology delivering us to a religion free utopian destination.

    amateurhour on
  • BufordHicks Registered User regular
    edited July 2008
    muninn wrote: »
    If technology grows so advanced that it's indistinguishable from magic to the average joe, then technology will eventually become the new religion.

    I think it was Asimov who invented that whole technology-as-magic meme.

    It was Clarke.

    BufordHicks on
  • Zilla360 21st Century. |She/Her| Trans* Woman In Aviators Firing A Bazooka. ⚛️ Registered User regular
    edited July 2008
    So in the short term (20 years from now), you would all agree it's impossible for any computer to observe and theorise on its own, even if equipped with a vast array of sensors and inputs?

    But in the long term...?

    Zilla360 on
  • stilist Registered User regular
    edited July 2008
    Zilla360 wrote: »
    So in the short term (20 years from now), you would all agree it's impossible for any computer to observe and theorise on its own, even if equipped with a vast array of sensors and inputs?

    But in the long term...?
    LISP machines are waving hello from the graveyard.

    stilist on
  • amateurhour One day I'll be professionalhour The woods somewhere in Tennessee Registered User regular
    edited July 2008
    Zilla360 wrote: »
    So in the short term (20 years from now), you would all agree it's impossible for any computer to observe and theorise on its own, even if equipped with a vast array of sensors and inputs?

    But in the long term...?

    Yes, anything is possible. We already have computers that observe and theorise on their own, planning military strategy and playing chess, they're just limited in their functionality.

    Do I think the general public will accept a computer that's connected everywhere, and has the ability to watch us and make decisions?

    no.

    amateurhour on
  • Richy Registered User regular
    edited July 2008
    I should point out that these sensors and endless data storage are all digital. In the end they store stuff in 1-and-0 format. Meanwhile, the world is not digital. So no matter how much we measure and store, we will always only have an approximation of the world. And like any approximation, it will be incomplete and flawed in some respect.
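A toy quantiser makes this concrete: digitising a continuous signal always leaves a quantisation error, bounded by half the step size, no matter how many samples you keep. The bit depth and signal here are arbitrary:

```python
import numpy as np

t = np.linspace(0, 1, 1000)
analog = np.sin(2 * np.pi * 5 * t)        # stand-in for a continuous signal

bits = 8
step = 2.0 / (2 ** bits)                  # full range [-1, 1] split into 2^8 levels
digital = np.round(analog / step) * step  # what a digital sensor actually stores

max_err = np.max(np.abs(digital - analog))
print(f"worst-case error at {bits} bits: {max_err:.6f} (bound: {step / 2:.6f})")
```

More bits shrink the error but never eliminate it, which is the sense in which digital storage is always an approximation.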

    Richy on
  • stilist Registered User regular
    edited July 2008
    Richy wrote: »
    I should point out that these sensors and endless data storage are all digital. In the end they store stuff in 1-and-0 format. Meanwhile, the world is not digital. So no matter how much we measure and store, we will always only have an approximation of the world. And like any approximation, it will be incomplete and flawed in some respect.
    There’s no way to losslessly preserve data because machinery is never sensitive enough, whether analogue or digital.

    stilist on
  • Zilla360 21st Century. |She/Her| Trans* Woman In Aviators Firing A Bazooka. ⚛️ Registered User regular
    edited July 2008
    Richy wrote: »
    I should point out that these sensors and endless data storage are all digital. In the end they store stuff in 1-and-0 format. Meanwhile, the world is not digital. So no matter how much we measure and store, we will always only have an approximation of the world. And like any approximation, it will be incomplete and flawed in some respect.
    Sorry, but technology has already jumped that hurdle. :)

    Zilla360 on
  • Mr_Rose 83 Blue Ridge Protects the Holy Registered User regular
    edited July 2008
    Popped Collar wrote: »
    I don't think modern technology is anywhere close to what we are talking about.

    I'm thinking more of technology in a very distant future. Hundreds of years from now perhaps.
    Then go out into the street right now and ask the first ten people you meet how the lights in their home work.
    I suspect that the very best answer you get, unless you live on a university campus and even then only maybe, will be "electricity." Even when pressed, you are extremely unlikely to get any sort of detailed response involving resistive heating filaments, fluorescence, or any of the other principles which underlie modern lighting.

    Essentially, to the man in the street, modern technology might as well be magic already. They won't ever admit it though.

    Mr_Rose on
    ...because dragons are AWESOME! That's why.
    Nintendo Network ID: AzraelRose
    DropBox invite link - get 500MB extra free.
  • electricitylikesme Registered User regular
    edited July 2008
    I think tech is eventually going to hit an end point, or apex, like anything else.

    Like what else?

    Civilizations in general have a history of expanding technologically with the elements available to them at the time until they hit an apex and then fall inwards. It's happened over and over and over again. Obviously there are other factors involved, I won't just blame technology, but it plays an important part in the process.

    Regardless of how amazing things are getting, there's always a breaking point to something. I'm not trying to strawman or kill the argument or say "lol we're gonna apocalypse ourselves so why bother?", I'm just saying that I don't see technology delivering us to a religion free utopian destination.
    Prior civilizations had numerous reasons to fail before they could get very far, but let's be serious - there has only been one known sustained period of enlightened technological progress, and it is the age we're in right now.

    The Roman and Greek empires would come in maybe as the second, but the fact is there has not been the great recurring history of rise and fall that people so often seem content to claim explains everything. Civilization as it is today is distinct from any that has existed before us, and there is no reason to think that it will actually collapse, barring some unchecked social issue like religious fanatics (Christian fundamentalists).

    electricitylikesme on
  • tbloxham Registered User regular
    edited July 2008
    Zilla360 wrote: »
    The Wired article implies that for some reason, by virtue of quantity, a correlation should be enough. Why?

    EDIT: Hell, take the article's example of Venter. Comparing DNA sequences on its own gets us absolutely nowhere. The only reason he accomplishes anything worthwhile is because we have the theory of evolution, and specifically a detailed knowledge of how most if not all DNA- and RNA-based organisms on Earth function and produce proteins (as well as a lot of knowledge of how those proteins work - active sites are conserved and often quite independent of the bulk, etc.). What is actually shown is how computers and large data sets can be used within the context of an effective and tested scientific theory to generate useful information rapidly. Absent theory, however, it's worthless.
    Quantum Mechanics, Probability, Chaos Theory (fuzziness, focus), and all the roads not taken, simply for lack of time in which to do so?

    This becomes irrelevant in the Petabyte Age, as 'The Cloud' performs (and becomes ever more capable of) exponentially greater feats of comparative statistical analysis on massive data sets. Not only that, but it gains the ability to iterate and (hopefully) not repeat any past mistakes.

    Humans already iterate on theories based on the size of the available data set; take Newton, and how his theories were improved upon by Einstein. It's like knowing the difference between an apple and our Solar System: both operate at very different scales, and one requires vastly more data to model and describe than the other.
    And I've probably not explained what I meant to say, very well at all... :P
    Actually, you haven't explained anything - you've supported my point. How does the ability to compare large sets of data obviate the need for understanding?

    A hypothesis isn't just "made up" - it is always based on an observation. The point, though, is that it needs to predict a future, different observation - some corollary - so that we can test the hypothesis for its predictive ability. Curve fitting a large data set doesn't do that: it doesn't tell you anything meaningful about what to do to confirm the fit, nor does it give you any reason to believe the curve behaves as modelled outside the data set (in fact, it frequently does not).

    EDIT: Or put it another way - frequently interesting trends are the starting point of a hypothesis. They are not the endpoint of science, nor ever can be.

    Indeed - I actually found this article a rather scary demonstration of the fact that even relatively intelligent people have begun to totally fail to understand what is useful science and what is not. Electricitylikesme makes a hugely important point here regarding curve fitting: with modern computers and a given data set, you can fit a curve to that data set, provided you give it enough degrees of freedom.

    However, unless you fit the curve which actually corresponds to the way the data is produced, and whose coefficients mean something in the physical world, then you have learned nothing. All you have learned is that there is a spectrum (which you already knew, since it is true of all possible things), and that it exists in the phase space of curves that can be fitted if any given function is applied (which you also already knew, since again it is true of all possible spectra).

    For Google, it is enough to find the correlation. If X seems to cause Y, then so be it: if you search for X, here, have some Y as well. And here is the X that people like Z seem to like the most. Google's job is simply to do this, not to understand. Google does not need to know why X goes with Y, only that it does; Google is already doing its job. A baker doesn't need to know why cakes rise when they have yeast in them, or why sugar tastes sweet, simply that they do - he is already doing his job. However, bakers are not chemists or biologists, and Google is not a groundbreaking science experiment.

    The purpose is not to fit curves and report blind correlations. The purpose is to understand the coefficients in the fit, since that is where the science is. We didn't invent semiconductors just by noticing that quantum physics happens; we did it by understanding it.
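    The degrees-of-freedom point above is easy to demonstrate. A minimal sketch (invented data; numpy's `polyfit` supplying the "enough degrees of freedom"): ten noisy samples of a sine curve are fitted almost exactly by a degree-nine polynomial, which then tells you nothing at all one step outside the data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Ten noisy samples of an underlying sine curve - the "data set".
x = np.linspace(0.0, 3.0, 10)
y = np.sin(x) + rng.normal(0.0, 0.05, size=10)

# Nine degrees of freedom for ten points: the polynomial threads
# through every sample almost exactly...
coeffs = np.polyfit(x, y, deg=9)
fit_error = float(np.max(np.abs(np.polyval(coeffs, x) - y)))

# ...but one unit past the data it bears no relation to sin(x).
extrapolated = float(np.polyval(coeffs, 4.0))
true_value = float(np.sin(4.0))

print(f"worst error inside the data: {fit_error:.1e}")
print(f"fit at x=4: {extrapolated:.1f}   actual sin(4): {true_value:.2f}")
```

    The fitted coefficients mean nothing physically; fitting a model of the form A·sin(kx), whose parameters correspond to how the data was actually produced, is where the science would be.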

    tbloxham on
    "That is cool" - Abraham Lincoln
  • Popped CollarPopped Collar __BANNED USERS regular
    edited July 2008
    Mr_Rose wrote: »
    I don't think modern technology is anywhere close to what we are talking about.

    I'm thinking more of technology in a very distant future. Hundreds of years from now perhaps.
    Then go out into the street right now and ask the first ten people you meet how the lights in their home work.
    I suspect that the very best answer you get, unless you live on a university campus and even then only maybe, will be "electricity." Even when pressed, you are extremely unlikely to get any sort of detailed response involving resistive heating filaments, fluorescence, or any of the other principles which underlie modern lighting.

    Essentially, to the man in the street, modern technology might as well be magic already. They won't ever admit it though.

    So? Technology will get a hell of a lot more magical than that. Someday there might be technology where even the inventors themselves don't know how it works, simply because of the nature of how it's made. When technology can feasibly run millions of years of evolution within a single machine and use the results to create the best solutions on its own, that will be amazing.
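    The "millions of years of evolution within a single machine" idea is, in miniature, just a mutation-and-selection loop. A toy sketch, with the 20-bit target and all parameters invented purely for illustration:

```python
import random

random.seed(1)

GENOME_LEN = 20
TARGET = [1] * GENOME_LEN  # the "best solution" nobody hand-designs

def fitness(genome):
    # Count positions that match the target.
    return sum(g == t for g, t in zip(genome, TARGET))

def mutate(genome, rate=0.05):
    # Flip each bit independently with a small probability.
    return [1 - g if random.random() < rate else g for g in genome]

# Random starting population: no solution in sight.
population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
              for _ in range(30)]

for generation in range(200):
    population.sort(key=fitness, reverse=True)
    if fitness(population[0]) == GENOME_LEN:
        break
    # Selection: keep the fittest half, refill with mutated copies.
    survivors = population[:15]
    population = survivors + [mutate(random.choice(survivors))
                              for _ in range(15)]

best = max(population, key=fitness)
print(f"generation {generation}: best fitness {fitness(best)}/{GENOME_LEN}")
```

    Real experiments in this vein - evolved FPGA circuits, for instance - have reportedly produced working designs whose mechanisms even their own experimenters struggled to explain.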

    Popped Collar on
  • Zilla360Zilla360 21st Century. |She/Her| Trans* Woman In Aviators Firing A Bazooka. ⚛️Registered User regular
    edited July 2008
    tbloxham wrote: »
    Indeed, I actually found this article a rather scary illustration of how even relatively intelligent people have begun to totally fail to understand what is useful science and what is not. Electricitylikesme makes a hugely important point here regarding curve fitting: with modern computers and a given data set, you can fit a curve to that data set, provided you give it enough degrees of freedom.
    I can't disagree with this, even though you seem to think in fairly limited time-scales - relevant, most likely, to how the world was when you gained your degree. :P

    Who decides what is useful science and what is not? Consider the laser: a technology initially deemed to have only a limited set of use cases, but which has ended up as one of the most versatile cornerstones of our modern world. :)

    Zilla360 on
    |Ko-Fi Me! ☕😎|NH844lc.png | PSN | chi-logo-only-favicon.png(C.H.I) Ltd. |🏳️⚧️♥️
  • Loren MichaelLoren Michael Registered User regular
    edited July 2008
    I think tech is eventually going to hit an end point, or apex, like anything else.

    Like what else?

    Civilizations in general have a history of expanding technologically with the elements available to them at the time until they hit an apex and then fall inwards. It's happened over and over and over again. Obviously there are other factors involved, I won't just blame technology, but it plays an important part in the process.

    Civilizations compete or cooperate in a number of ways. Some lose and some get absorbed. Successful civilizations last a long time. One civilization's tech fails to another civilization's greater tech(or numbers), and if a tech is deemed worthwhile, it's adopted by whomever comes across it.

    So, maybe we'll be obliterated by alien conquerors or nuclear equipped religious zealots, but beyond that, I don't see how an apex is so obviously in the cards, particularly when one looks at history.

    Loren Michael on
    a7iea7nzewtq.jpg
  • electricitylikesmeelectricitylikesme Registered User regular
    edited July 2008
    Zilla360 wrote: »
    tbloxham wrote: »
    Indeed, I actually found this article a rather scary illustration of how even relatively intelligent people have begun to totally fail to understand what is useful science and what is not. Electricitylikesme makes a hugely important point here regarding curve fitting: with modern computers and a given data set, you can fit a curve to that data set, provided you give it enough degrees of freedom.
    I can't disagree with this, even though you seem to think in fairly limited time-scales - relevant, most likely, to how the world was when you gained your degree. :P

    Who decides what is useful science and what is not? Consider the laser: a technology initially deemed to have only a limited set of use cases, but which has ended up as one of the most versatile cornerstones of our modern world. :)
    Logic and the scientific method haven't changed in the past 4 years, nor the past 50 or 200 for that matter.

    Statistics is not going to replace the scientific method. Computers could, of course, become sentient and able to devise complicated hypotheses on their own but then we'd just be talking about a different type of thing entirely - an AI.

    There have been a lot of interesting advances in this sort of thing, mind you - there is work going on to create tools which let scientists get through literature surveys quickly, or which try to link concepts between different papers in order to help people find otherwise unspotted trends. (I believe one project in particular looked at PubMed, trying to find connections between pieces of medical research that someone would otherwise have had to read the right papers in the right order to spot.)

    These things just need to be looked at in the right context - it's all a tool that assists the basic process.
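    The half-remembered PubMed project sounds like Swanson-style literature-based discovery: if some papers connect concept A with B, and others connect B with C, but no paper connects A with C directly, then A-C is a candidate "unspotted trend". A minimal sketch over invented paper data (the fish-oil/Raynaud's pairing echoes Swanson's classic finding):

```python
from collections import defaultdict
from itertools import combinations

# Each "paper" reduced to the set of concepts it mentions.
# (Invented stand-ins for PubMed abstracts.)
papers = [
    {"fish oil", "blood viscosity"},
    {"blood viscosity", "Raynaud's syndrome"},
    {"fish oil", "platelet aggregation"},
    {"platelet aggregation", "Raynaud's syndrome"},
    {"magnesium", "migraine"},
]

# Direct links: concept pairs that co-occur in at least one paper.
direct = set()
neighbours = defaultdict(set)
for concepts in papers:
    for a, b in combinations(sorted(concepts), 2):
        direct.add(frozenset((a, b)))
        neighbours[a].add(b)
        neighbours[b].add(a)

# Hidden candidates: A and C share an intermediate concept B but
# never co-occur themselves - the "unspotted trend".
hidden = set()
for linked in neighbours.values():
    for a, c in combinations(sorted(linked), 2):
        if frozenset((a, c)) not in direct:
            hidden.add((a, c))

for pair in sorted(hidden):
    print(" <-> ".join(pair))
```

    Exactly as the post says, this is a tool assisting the basic process: it only nominates hypotheses; a human (or an experiment) still has to test them.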

    electricitylikesme on
  • electricitylikesmeelectricitylikesme Registered User regular
    edited July 2008
    I think tech is eventually going to hit an end point, or apex, like anything else.

    Like what else?

    Civilizations in general have a history of expanding technologically with the elements available to them at the time until they hit an apex and then fall inwards. It's happened over and over and over again. Obviously there are other factors involved, I won't just blame technology, but it plays an important part in the process.

    Civilizations compete or cooperate in a number of ways. Some lose and some get absorbed. Successful civilizations last a long time. One civilization's tech fails to another civilization's greater tech(or numbers), and if a tech is deemed worthwhile, it's adopted by whomever comes across it.

    So, maybe we'll be obliterated by alien conquerors or nuclear equipped religious zealots, but beyond that, I don't see how an apex is so obviously in the cards, particularly when one looks at history.
    To be fair if someone went back in time and taught the ancient Greeks coordinate geometry you'd probably advance human science today by thousands of years.

    electricitylikesme on
  • Loren MichaelLoren Michael Registered User regular
    edited July 2008
    I think tech is eventually going to hit an end point, or apex, like anything else.

    Like what else?

    Civilizations in general have a history of expanding technologically with the elements available to them at the time until they hit an apex and then fall inwards. It's happened over and over and over again. Obviously there are other factors involved, I won't just blame technology, but it plays an important part in the process.

    Civilizations compete or cooperate in a number of ways. Some lose and some get absorbed. Successful civilizations last a long time. One civilization's tech fails to another civilization's greater tech(or numbers), and if a tech is deemed worthwhile, it's adopted by whomever comes across it.

    So, maybe we'll be obliterated by alien conquerors or nuclear equipped religious zealots, but beyond that, I don't see how an apex is so obviously in the cards, particularly when one looks at history.
    To be fair if someone went back in time and taught the ancient Greeks coordinate geometry you'd probably advance human science today by thousands of years.

    Sure.

    ...

    Doesn't "to be fair" imply that a sort of weak and possibly conditional "to the contrary" comment is to immediately follow?

    Loren Michael on
    a7iea7nzewtq.jpg
  • electricitylikesmeelectricitylikesme Registered User regular
    edited July 2008
    I think tech is eventually going to hit an end point, or apex, like anything else.

    Like what else?

    Civilizations in general have a history of expanding technologically with the elements available to them at the time until they hit an apex and then fall inwards. It's happened over and over and over again. Obviously there are other factors involved, I won't just blame technology, but it plays an important part in the process.

    Civilizations compete or cooperate in a number of ways. Some lose and some get absorbed. Successful civilizations last a long time. One civilization's tech fails to another civilization's greater tech(or numbers), and if a tech is deemed worthwhile, it's adopted by whomever comes across it.

    So, maybe we'll be obliterated by alien conquerors or nuclear equipped religious zealots, but beyond that, I don't see how an apex is so obviously in the cards, particularly when one looks at history.
    To be fair if someone went back in time and taught the ancient Greeks coordinate geometry you'd probably advance human science today by thousands of years.

    Sure.

    ...

    Doesn't "to be fair" imply that a sort of weak and possibly conditional "to the contrary" comment is to immediately follow?
    I guess I was kind of implying that the whole deal with the dark ages was that they really were basically a failure of civilization for a couple of hundred years. Though I suppose that isn't really fair to the progress in China and the Middle East during this time - still, in terms of mathematics it would've been fantastic if Greece had kept calculus going.

    electricitylikesme on
  • Loren MichaelLoren Michael Registered User regular
    edited July 2008
    I think tech is eventually going to hit an end point, or apex, like anything else.

    Like what else?

    Civilizations in general have a history of expanding technologically with the elements available to them at the time until they hit an apex and then fall inwards. It's happened over and over and over again. Obviously there are other factors involved, I won't just blame technology, but it plays an important part in the process.

    Civilizations compete or cooperate in a number of ways. Some lose and some get absorbed. Successful civilizations last a long time. One civilization's tech fails to another civilization's greater tech(or numbers), and if a tech is deemed worthwhile, it's adopted by whomever comes across it.

    So, maybe we'll be obliterated by alien conquerors or nuclear equipped religious zealots, but beyond that, I don't see how an apex is so obviously in the cards, particularly when one looks at history.
    To be fair if someone went back in time and taught the ancient Greeks coordinate geometry you'd probably advance human science today by thousands of years.

    Sure.

    ...

    Doesn't "to be fair" imply that a sort of weak and possibly conditional "to the contrary" comment is to immediately follow?
    I guess I was kind of implying that the whole deal with the dark ages was that they really were basically a failure of civilization for a couple of hundred years. Though I suppose that isn't really fair to the progress in China and the Middle East during this time - still, in terms of mathematics it would've been fantastic if Greece had kept calculus going.

    The dark ages weren't all that dark anyway. "Dark ages" was a term coined by Renaissance humanists who wanted to leave the impression that they had rescued the world from ignorance... And there are some pretty strong arguments I've heard that dispute the occurrence of a renaissance outside some fairly limited fields.

    Yes, there were setbacks in intellectual progress, but the scope is, I think, overblown.

    Loren Michael on
    a7iea7nzewtq.jpg
  • electricitylikesmeelectricitylikesme Registered User regular
    edited July 2008
    I think tech is eventually going to hit an end point, or apex, like anything else.

    Like what else?

    Civilizations in general have a history of expanding technologically with the elements available to them at the time until they hit an apex and then fall inwards. It's happened over and over and over again. Obviously there are other factors involved, I won't just blame technology, but it plays an important part in the process.

    Civilizations compete or cooperate in a number of ways. Some lose and some get absorbed. Successful civilizations last a long time. One civilization's tech fails to another civilization's greater tech(or numbers), and if a tech is deemed worthwhile, it's adopted by whomever comes across it.

    So, maybe we'll be obliterated by alien conquerors or nuclear equipped religious zealots, but beyond that, I don't see how an apex is so obviously in the cards, particularly when one looks at history.
    To be fair if someone went back in time and taught the ancient Greeks coordinate geometry you'd probably advance human science today by thousands of years.

    Sure.

    ...

    Doesn't "to be fair" imply that a sort of weak and possibly conditional "to the contrary" comment is to immediately follow?
    I guess I was kind of implying that the whole deal with the dark ages was that they really were basically a failure of civilization for a couple of hundred years. Though I suppose that isn't really fair to the progress in China and the Middle East during this time - still, in terms of mathematics it would've been fantastic if Greece had kept calculus going.

    The dark ages weren't all that dark anyway. "Dark ages" was a term coined by Renaissance humanists who wanted to leave the impression that they had rescued the world from ignorance... And there are some pretty strong arguments I've heard that dispute the occurrence of a renaissance outside some fairly limited fields.

    Yes, there were setbacks in intellectual progress, but the scope is, I think, overblown.
    To be fair, my reference is high-school history, and the course probably wasn't great at timelining the different empires, so I'll take your word for it.

    electricitylikesme on
  • tbloxhamtbloxham Registered User regular
    edited July 2008
    Zilla360 wrote: »
    tbloxham wrote: »
    Indeed, I actually found this article a rather scary illustration of how even relatively intelligent people have begun to totally fail to understand what is useful science and what is not. Electricitylikesme makes a hugely important point here regarding curve fitting: with modern computers and a given data set, you can fit a curve to that data set, provided you give it enough degrees of freedom.
    I can't disagree with this, even though you seem to think in fairly limited time-scales - relevant, most likely, to how the world was when you gained your degree. :P

    Who decides what is useful science and what is not? Consider the laser: a technology initially deemed to have only a limited set of use cases, but which has ended up as one of the most versatile cornerstones of our modern world. :)
    Logic and the scientific method haven't changed in the past 4 years, nor the past 50 or 200 for that matter.

    Statistics is not going to replace the scientific method. Computers could, of course, become sentient and able to devise complicated hypotheses on their own but then we'd just be talking about a different type of thing entirely - an AI.

    There have been a lot of interesting advances in this sort of thing, mind you - there is work going on to create tools which let scientists get through literature surveys quickly, or which try to link concepts between different papers in order to help people find otherwise unspotted trends. (I believe one project in particular looked at PubMed, trying to find connections between pieces of medical research that someone would otherwise have had to read the right papers in the right order to spot.)

    These things just need to be looked at in the right context - it's all a tool that assists the basic process.

    Right again, Electricitylikesme. All this progress in data acquisition no more changes the scientific method than did the invention of the calculator, or the abacus. It just gives you more data to sort, and more tools to sort it with. Only an AI system actually capable of innovative thinking would do more than that, and even it would still theorize, analyze, and predict in much the same way as a human scientist does.

    And, to Zilla, I would say that my 'useful' comment is more about how useless, in terms of progress, simply observing trends and correlations is. To Google, the why is unimportant; to a scientist or engineer, the why is all that matters for real progress. Almost every breakthrough in understanding ever made has been shown to be useful; breakthroughs in correlation-spotting, conversely, are almost always useless until they are understood.

    For example, we have seen that at extreme distances a force appears to exist which accelerates the expansion of the universe beyond the deceleration caused by gravity. This is interesting, and very important to move towards understanding, but it is in itself useless and as of right now may as well be caused by magic. Only once we can understand, predict, and observe will we be able to use the knowledge in a useful way.

    tbloxham on
    "That is cool" - Abraham Lincoln
  • Zilla360Zilla360 21st Century. |She/Her| Trans* Woman In Aviators Firing A Bazooka. ⚛️Registered User regular
    edited July 2008
    Statistics is not going to replace the scientific method. Computers could, of course, become sentient and able to devise complicated hypotheses on their own but then we'd just be talking about a different type of thing entirely - an AI.
    And it's beginning right now... going beyond polymorphic code, and creating compilers that compile better compilers... ad infinitum, all on their own! 8-)

    Somebody call John Connor right now. D:

    Zilla360 on
    |Ko-Fi Me! ☕😎|NH844lc.png | PSN | chi-logo-only-favicon.png(C.H.I) Ltd. |🏳️⚧️♥️