Humans love to create, and all creation is essentially new or remixed data entering a closed thermodynamic system...(true?)
This is a kind of follow-on thread from the
god + science thread, since I thought it might be too big of a tangent to post in there.
So:
The Petabyte Age:
Sensors everywhere. Infinite storage. Clouds of processors. Our ability to capture, warehouse, and understand massive amounts of data is changing science, medicine, business, and technology. As our collection of facts and figures grows, so will the opportunity to find answers to fundamental questions. Because in the era of big data, more isn't just more. More is different.
http://www.wired.com/science/discoveries/magazine/16-07/pb_theory
I guess the real question that arises is, just how far away is humanity from a computer like
Multivac? We already have the capability to simulate atomic bombs and protein folding.
When we eventually hypothesize, theorize, model, and test far faster than any human, on top of this new model of statistical elimination,
what role will religion have to play? Will it be that little whisper in our collective ears that tells us to stop, slow down, and admonishes us for deriving our morality from common sense rather than Deistic Dogma?
I always like to think of that
tall tale of a computer supposedly somewhere deep in Tibet, administered by monks and designed only to discover all the possible names of God. 8-)
But I'm getting ahead of myself. Because we're not there.
Yet.
I think it was Asimov that invented that whole technology-as-magic meme.
I don't think modern technology is anywhere close to what we are talking about.
I'm thinking more of technology in a very distant future. Hundreds of years from now perhaps.
I mean, take the soon-to-be-fully-online LHC as an example (seriously, when the fuck are they turning it on? There's either a Higgs boson or there isn't, someone tell me now) - that thing is going to kick out terabytes of data per second. However, none of it is worth anything unless we can put it in the context of a theoretical model to establish what it should mean and what we should do next to select the right model. The Wired article implies that, by sheer virtue of quantity, a correlation should somehow be enough. Why?
EDIT: Hell, take the example in the article of Venter. Comparing DNA sequences gets us absolutely nowhere on its own. The only reason he accomplishes anything worthwhile is because we have the theory of evolution and, specifically, detailed knowledge about how most if not all DNA- and RNA-based organisms on Earth function and produce proteins (as well as a lot of knowledge on how those proteins work - active sites are conserved and often quite independent of the bulk, etc.). What is actually shown is how computers and large data sets can be used within the context of an effective and tested scientific theory to generate useful information rapidly. Absent theory, however, it's worthless.
http://www.multivax.com/last_question.html
Another way to measure the technological advancement of a civilization is the Kardashev Scale that refers to the magnitude of energy consumption of the sum of that civilization's activities.
http://en.wikipedia.org/wiki/Kardashev_scale
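For a rough sense of where we sit on it, Sagan's interpolated version of the scale works straight from total power use in watts; the figure for humanity below is only a ballpark number I'm plugging in for illustration, not anything from the article.

```python
# Sagan's continuous version of the Kardashev scale:
#   K = (log10(P) - 6) / 10, where P is the civilization's power use in watts,
# so Type I sits at ~1e16 W. The figure used for humanity is only approximate.
import math

def kardashev(power_watts: float) -> float:
    return (math.log10(power_watts) - 6.0) / 10.0

print(kardashev(2e13))   # humanity today: roughly 0.7
print(kardashev(1e16))   # Type I on this formula: exactly 1.0
```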
It's hard to imagine us reaching any of these scenarios any time soon, but considering how technology and human advancement seem to grow exponentially, we could be there a lot sooner than you think. Speaking of which, consider this:
http://en.wikipedia.org/wiki/Technological_singularity
None of these things will happen in our lifetime, but if you consider how far we've come since the industrial revolution, and how far we project ourselves as going in the next equivalent timeframe, this kind of stuff starts to seem possible within the next several hundred years (maybe even less).
Just my thoughts
^^^Discussion on the singularity^^^
http://bloggingheads.tv/diavlogs/11935
^^^Also talks about the singularity^^^
It’s sorta like what ELM said: while machines are great at collecting data, somebody’s gotta be around to ask questions and give the machines projects.
This becomes irrelevant in the Petabyte age, as 'The Cloud' performs (and becomes ever more capable at) exponentially greater feats of comparative statistical analysis of massive data sets. Not only that, but it gains the ability to iterate and (hopefully) not repeat any past mistakes.
Humans already iterate on theories as the available data set grows; take Newton and how his theories were improved upon by Einstein. It's like knowing the difference between an apple and our Solar System: both operate at vastly different scales, and one requires far more data to model and describe than the other.
Also the OP talks about 'common sense' as if it's somehow objective, when it very much isn't.
It all reminds me of the Rapture.
I just think one day we'll hit a point where we have the things we need, and there may be upgrades for the sake of upgrading, but as far as innovative technology goes, we will have reached an endgame.
Yeah, it could always just be a really pinched Gaussian.
As long as there’s money to be made and a decent level of education, there will be innovation. Not all of it worthwhile or useful, but it’ll be there.
A hypothesis isn't just "made up" - it is always based on an observation. The point, though, is that it needs to predict a future, different observation - some corollary - so that we can test the hypothesis for its predictive ability. Curve fitting a large data set doesn't do that - it doesn't tell you anything meaningful about what to do to confirm the curve fit, nor does it give you any reason to believe the curve behaves as modelled outside the dataset (in fact, it frequently does not).
EDIT: Or to put it another way - interesting trends are frequently the starting point of a hypothesis. They are not the endpoint of science, nor can they ever be.
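A quick toy illustration of the "behaves as modelled outside the dataset" problem, with numbers I've just made up (assuming numpy is available):

```python
# Fit a flexible curve to noisy data on [0, 1], then evaluate it just outside
# the range of the data: it tracks the points where the data lives and is
# typically nonsense a short step beyond them.
import numpy as np

rng = np.random.default_rng(42)
x = np.linspace(0.0, 1.0, 30)
y = np.sin(2 * np.pi * x) + rng.normal(scale=0.1, size=x.size)   # the "observations"

coeffs = np.polyfit(x, y, deg=9)     # plenty of degrees of freedom

print(np.polyval(coeffs, 0.5))       # inside the data range: close to sin(pi) = 0
print(np.polyval(coeffs, 1.5))       # outside it: usually far from sin(3*pi) = 0
```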
Theory allows us to leverage what can be learned from statistical correlation in the most effective means possible, which in turn allows us to actually express what we have discovered about the world.
Like what else?
This is the silliest article ever. I actually am a nuclear physicist, and as such have an overwhelming quantity of data available on my computer right here from simulations, detectors, and so forth. I'm talking thousands of gigabytes, and there are others in my building working with orders of magnitude more data.
However, what you discover is that although more data is potentially more useful in aggregate, in its raw state it becomes less and less useful for the exact problem you may want to solve. 90% of my actual scientific work is producing statistical models and codes to operate on the data and extract useful information from the noise and background, or to produce useful ways of displaying the data so that a human can get answers from it.
Look at it this way: encoded in the keyboard you are using right now are hundreds of billions of billions of petabytes of data - in atomic spins, nuclear decays, and so forth. All the secrets of the stable physical universe at low temperatures lie at your fingertips, and likely many of the secrets of the high-energy universe. Is there a Higgs boson? What is the mass of the neutrino? What is the relationship between gravity and the electroweak and strong forces? All the answers to these questions are encoded in the matter around you. Unfortunately, just like any data that could be produced from our omni scanner, it is nigh on impossible to simply look at; it requires huge amounts of processing to understand, and quite a lot of brains to then remake into a useful system which answers your questions.
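To give a flavour of what that "extract useful information from the noise and background" work looks like in miniature - everything here is invented toy data, assuming numpy and scipy are around, not anything from a real detector:

```python
# Toy version of pulling a signal out of noise and background: simulate
# detector-like events (flat background plus a small Gaussian peak), histogram
# them, and fit a model to recover the peak position and width.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)
background = rng.uniform(0.0, 10.0, 50_000)            # featureless background events
signal = rng.normal(6.2, 0.3, 2_000)                   # a small peak buried in them
counts, edges = np.histogram(np.concatenate([background, signal]), bins=200)
centers = 0.5 * (edges[:-1] + edges[1:])

def model(x, bkg, amp, mu, sigma):
    return bkg + amp * np.exp(-0.5 * ((x - mu) / sigma) ** 2)

popt, _ = curve_fit(model, centers, counts, p0=[250.0, 100.0, 6.0, 0.5])
print(f"fitted peak at {popt[2]:.2f}, width {popt[3]:.2f}")   # ~6.2 and ~0.3
```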
Civilizations in general have a history of expanding technologically with the elements available to them at the time until they hit an apex and then fall inwards. It's happened over and over and over again. Obviously there are other factors involved, I won't just blame technology, but it plays an important part in the process.
Regardless of how amazing things are getting, there's always a breaking point to something. I'm not trying to strawman or kill the argument or say "lol we're gonna apocalypse ourselves so why bother?", I'm just saying that I don't see technology delivering us to a religion free utopian destination.
It was Clarke
But in the long term...?
Yes, anything is possible. We already have computers that observe and theorise on their own, planning military strategy and playing chess; they're just limited in their functionality.
Do I think the general public will accept a computer that's connected everywhere, and has the ability to watch us and make decisions?
no.
I suspect that the very best answer you get, unless you live on a university campus and even then only maybe, will be "electricity." Even when pressed, you are extremely unlikely to get any sort of detailed response involving resistive heating filaments, fluorescence, or any of the other principles which underlie modern lighting.
Essentially, to the man in the street, modern technology might as well be magic already. They won't ever admit it though.
The Roman and Greek empires would maybe come in second, but the fact is there has not been the great history of rise and fall, explaining everything, which people so often seem content to say there has been. Civilization as it is today is distinct from and unlike any that have existed before us, and there is no reason to think it will actually collapse, barring some unchecked social issue like religious fanatics (Christian fundamentalists).
Indeed, I actually found this article a rather scary observation of the fact that even relatively intelligent people have begun to totally fail to understand what is useful science and what is not. Electricitylikesme makes a hugely important point here regarding curve fitting. With modern computers and a given data set, you can fit a curve to that data set, provided you give it enough degrees of freedom.
However, unless you fit the curve which actually corresponds to the way the data is produced, and whose coefficients mean something in the physical world, then you have learned nothing. All you have learned is that there is a spectrum (which you already knew, since it is true of all possible things), and that it exists in the phase space of curves that can be fitted if any given function is applied (which you also already knew, since again it is true of all possible spectra).
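To make that concrete with a throwaway example (pure toy numbers): fit noise with as many free parameters as data points and the fit is "perfect", yet the coefficients carry no physics at all.

```python
# Six points of pure noise, fitted by a degree-5 polynomial: six coefficients
# for six points, so the curve passes through every point essentially exactly.
# A "perfect" fit, and nothing about the world has been learned from it.
import numpy as np

rng = np.random.default_rng(7)
x = np.linspace(0.0, 1.0, 6)
y = rng.normal(size=6)                          # no underlying law at all

coeffs = np.polyfit(x, y, deg=5)
print(np.abs(np.polyval(coeffs, x) - y).max())  # ~0: the residual of fitting noise
```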
For Google, it is enough to find the correlation. If X seems to cause Y, then so be it. If you search for X, here, have some Y as well. And here is the X that people like Z seem to like the most. Google's job is simply to do this, not to understand. Google does not need to know why X likes Y, only that it does, because Google is already doing its job. A baker doesn't need to know why cakes rise when they have yeast in them, or why sugar tastes sweet, simply that they do. He is already doing his job. However, bakers are not chemists or biologists, and Google is not a groundbreaking science experiment.
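A bare-bones sketch of that "correlation is enough" style of recommendation, with entirely made-up session data, looks something like this:

```python
# Count which items co-occur with a query in past sessions and suggest the
# most frequent one, with no model whatsoever of why they go together.
from collections import Counter

sessions = [
    {"flour", "yeast", "sugar"},
    {"flour", "yeast"},
    {"flour", "butter"},
    {"yeast", "sugar"},
]

def recommend(query, sessions):
    co = Counter()
    for s in sessions:
        if query in s:
            co.update(s - {query})
    return co.most_common(1)[0][0] if co else None

print(recommend("flour", sessions))   # -> 'yeast': correlated; the why is irrelevant here
```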
The purpose is not to fit the curves and give blind correlations. The purpose is to understand the coefficients in the fit, since that is where the science is. We didn't invent semiconductors just by noticing that quantum physics happens; we did it by understanding it.
So? Technology will get a hell of a lot more magical than that. Someday there might be technology where even the inventors themselves won't know how it works, just because of the nature of how it's made. When technology can feasibly run millions of years of evolution within a single machine and use the results to create the best solutions on its own, that will be amazing.
Who decides what is useful science, and what is not? Consider the Laser, a technology that was initially deemed to only have a limited set of use cases, but has ended up being one of the most versatile cornerstones of our modern world.
Civilizations compete or cooperate in a number of ways. Some lose and some get absorbed. Successful civilizations last a long time. One civilization's tech fails to another civilization's greater tech(or numbers), and if a tech is deemed worthwhile, it's adopted by whomever comes across it.
So, maybe we'll be obliterated by alien conquerors or nuclear equipped religious zealots, but beyond that, I don't see how an apex is so obviously in the cards, particularly when one looks at history.
Statistics is not going to replace the scientific method. Computers could, of course, become sentient and able to devise complicated hypotheses on their own but then we'd just be talking about a different type of thing entirely - an AI.
There have been a lot of interesting advances in this sort of thing, mind you - there is work going on in creating tools which allow scientists to quickly get through literature surveys and the like, or which try to link concepts between different papers in order to help people find otherwise unspotted trends (I believe in particular there was a project looking at PubMed to try and find connections in medical research that would otherwise only be found if someone had read the right papers in the right order).
These things just need to be looked at in the right context - it's all a tool that assists the basic process.
Sure.
...
Doesn't "to be fair" imply that a sort of weak and possibly conditional "to the contrary" comment is to immediately follow?
The dark ages weren't all that dark anyway. "Dark ages" was a term coined by Renaissance humanists who wanted to leave the impression that they had rescued the world from ignorance... And there are some pretty strong arguments I've heard that dispute whether a renaissance occurred outside some fairly limited fields.
Yes, there were setbacks in intellectual progress, but the scope is, I think, overblown.
Right again, Electricitylikesme. All this progress in data acquisition no more changes the scientific method than did the invention of the calculator, or the abacus. It just gives you more data to sort, and more tools to sort it with. Only an AI system actually capable of innovative thinking would do more than that, and even it would still theorize, analyze, and predict in a similar way to a human scientist.
And, to Zilla, I would say that my 'useful' comment is more about how useless it is, in terms of progress, to simply observe trends and correlations. To Google the why is unimportant; to a scientist or engineer the why is all that matters for real progress. Almost every breakthrough in understanding ever made has been shown to be useful; breakthroughs in correlation observation, conversely, are almost always useless until they are understood.
For example, we have seen that a force appears to exist which causes the universe's expansion to accelerate, beyond the deceleration caused by gravity, at extreme distances. This is interesting and very important to move towards understanding, but in itself it is useless and, as of right now, may as well be caused by magic. Only once we can understand, predict, and observe will we be able to use the knowledge in a useful way.
Somebody call John Connor right now.