
The Petabyte Age: The End of Theory & the (exponential) advancement of Science


Posts

  • tbloxham Registered User regular
    edited July 2008
    Zilla360 wrote: »
    Statistics is not going to replace the scientific method. Computers could, of course, become sentient and able to devise complicated hypotheses on their own but then we'd just be talking about a different type of thing entirely - an AI.
    And it's beginning right now... Going beyond polymorphic code, and creating compilers that compile better compilers... ad infinitum, all on their own! 8-)

    Somebody call John Connor right now. D:

    I think you're giving that program a bit too much credit; it's just a program that remembers what ran well on previous ASIC hardware and stitches together code segments to perform a task in the optimum time based on those records. It's just another tool for searching a database.
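    In spirit it's the same trick the autotuning numerics libraries use: benchmark a few candidate implementations of the same task, record the winner, and just look it up next time. A minimal sketch (the task name and variants here are made-up stand-ins, not the program from the article):

    import time

    # Hypothetical stand-ins for the code segments being stitched together.
    def variant_a(data):
        return sorted(data)

    def variant_b(data):
        out = list(data)
        out.sort()
        return out

    def benchmark(variant, data, repeats=50):
        """Time one candidate variant on representative input."""
        start = time.perf_counter()
        for _ in range(repeats):
            variant(list(data))
        return time.perf_counter() - start

    # The "what ran well before" database: task name -> fastest known variant.
    timing_db = {}

    def autotune(task, variants, data):
        """Trial-and-error once, then just search the database."""
        if task not in timing_db:
            timings = {v: benchmark(v, data) for v in variants}
            timing_db[task] = min(timings, key=timings.get)
        return timing_db[task]

    best = autotune("sort", [variant_a, variant_b], list(range(1000, 0, -1)))
    print("fastest variant:", best.__name__)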

  • Zilla360 Spaaaace! In Space. Registered User regular
    edited July 2008
    tbloxham wrote: »
    I think you're giving that program a bit too much credit; it's just a program that remembers what ran well on previous ASIC hardware and stitches together code segments to perform a task in the optimum time based on those records. It's just another tool for searching a database.
    Baby steps, baby steps...

  • stilist Registered User
    edited July 2008
    Zilla360 wrote: »
    Baby steps, baby steps...
    There’s something of a difference between optimising compilers and artificial intelligence.

  • Zilla360 Spaaaace! In Space. Registered User regular
    edited July 2008
    stilist wrote: »
    There’s something of a difference between optimising compilers and artificial intelligence.
    It's all executable, and we now have memristors, so...?
    If you fail to see the links between seemingly disparate fields of inquiry, that's OK. But not being open to exploring new possibilities is what creationists want. And you're not one of those, are you? :P

  • stilist Registered User
    edited July 2008
    It doesn’t bother you that you’re talking a load of tripe? You’d be just as accurate saying that being able to count higher than ten brings somebody closer to creating AI.

    Ignoring that this has little to do with the thread’s original topic, there is a vast difference between telling a computer to record the speed at which different code runs and creating a computer that behaves like a human. Emergent data patterns are not that close to emergent intelligence.

  • tbloxham Registered User regular
    edited July 2008
    Zilla360 wrote: »
    It's all executable, and we now have memristors, so...?
    If you fail to see the links between seemingly disparate fields of inquiry, that's OK. But not being open to exploring new possibilities is what creationists want. And you're not one of those, are you? :P

    But these compilers are self-optimising to do tasks that are defined by users, using code which is prepared by users. The compiler can assemble it into a more optimised form by trial and error and by consulting its "how things work" database, but it can't try a genuinely new approach, nor can it conceive of new tasks and operations to carry out. Consulting databases is easy, and better human-written languages which are more object-oriented do allow computers to assemble code together more flexibly, but all it is really doing is consulting a database.

    The steps towards an AI would involve developing a computer which could think in an innovative fashion and decide on new tasks that it found interesting. This program is just like the chess-playing AIs: people thought they were steps towards a real AI, but in fact all they were were steps towards a complete understanding of chess. When the system optimises itself in the same way a human would, then we'll be on our way.
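    Those chess engines were, at bottom, exhaustive game-tree search. A toy minimax over a made-up take-1-or-2 Nim endgame (last stone wins; purely a stand-in for chess) shows the flavour: total enumeration, zero insight.

    def legal_moves(stones):
        """You may take 1 or 2 stones; each move leaves the remainder."""
        return [stones - take for take in (1, 2) if stones - take >= 0]

    def minimax(stones, maximizing):
        """Exhaustively search the game tree; +1 means the maximizer wins."""
        if stones == 0:
            # The previous player took the last stone and won.
            return -1 if maximizing else 1
        results = [minimax(s, not maximizing) for s in legal_moves(stones)]
        return max(results) if maximizing else min(results)

    print(minimax(4, True))  # 1: first player can force a win from 4 stones
    print(minimax(3, True))  # -1: 3 stones is a lost position for the mover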

  • seabass Doctor Massachusetts Registered User regular
    edited July 2008
    tbloxham wrote: »
    The steps towards an AI would involve developing a computer which could think in an innovative fashion and decide on new tasks that it found interesting. This program is just like the chess-playing AIs: people thought they were steps towards a real AI, but in fact all they were were steps towards a complete understanding of chess. When the system optimises itself in the same way a human would, then we'll be on our way.

    "I don't mean to alarm you, but we've made a machine that can think" is probably the relevant quote here, about your chess and your thinking. If you define intelligence as being able to play chess, those machines are certainly smart as hell. I think that people saying things like 'hold on its thinking' of computers is really telling too, but if we want to talk about the Chinese room, maybe that could be its own thread.

    Typically, what the public thinks of as AI and what AI researchers do all day are very different. Most of us aren't trying to create brains, though we wish we could. Lots of people work in search and on optimization problems. Some folks work in planning. Others write theorem provers. Some people like tooling around with robots, but very few (maybe just Minsky?) are trying to make a brain. I guess what I'm saying here is that I take issue with the term "real AI". Real AI is what optimizes UPS routes to include fewer left turns and what decides which research tasks happen first on the Mars rover. That's what's real, or at least what is done.

    Now, I think the problem most people are going to bring up with AI, as a researcher, is the inability to be creative, insofar as it applies to making relevant hypotheses. If we have a language with which we can describe the world, then even if there are an infinite number of statements we could just set a machine loose on it and ask it to prove or disprove everything it can state. So proving things doesn't require creativity or insight... prioritizing your efforts takes both.
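    A minimal sketch of that enumerate-and-check idea, with brute-force truth tables standing in for a real prover (the two-variable formula language is a toy of my own invention):

    import itertools

    VARS = ("p", "q")

    def formulas(depth):
        """Enumerate every formula in the toy language up to a nesting depth."""
        yield from VARS
        if depth == 0:
            return
        for f in formulas(depth - 1):
            yield "(not %s)" % f
            for g in formulas(depth - 1):
                yield "(%s and %s)" % (f, g)
                yield "(%s or %s)" % (f, g)

    def is_tautology(formula):
        """'Prove' a formula by grinding through every truth assignment."""
        return all(
            eval(formula, {"p": p, "q": q})  # eval only sees our own strings
            for p, q in itertools.product([True, False], repeat=2)
        )

    # No creativity required: state everything, then check everything.
    tautologies = [f for f in formulas(2) if is_tautology(f)]
    print(len(tautologies), "tautologies found, e.g.", tautologies[0])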

    That being said, even if machines can't be made to self-evolve into scientific super-minds, they do take care of a lot of the crap we'd otherwise have to do by hand. In that capacity, more computing resources, better tools to run on them, and better models for using both have already been snowballing, and should continue to.

  • Zilla360 Spaaaace! In Space. Registered User regular
    edited July 2008
    Sorry to resurrect this thread, but I thought this was interesting and related to the direction this discussion was going in:
    If you are still not convinced, perhaps it helps to take a longer view. The idea of a self-replicating machine can be traced back to remarks made by the Queen of Sweden to René Descartes, but it was more seriously explored in the 19th century by Samuel Butler, who described machines that could mimic the biological processes of plants in his novel Erewhon.
    The machine that copies itself!
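    Software got there long before hardware, for what it's worth: a quine is a program whose entire output is its own source code. A minimal Python one:

    # Running the two lines below prints exactly those two lines.
    s = 's = %r\nprint(s %% s)'
    print(s % s)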

    Also, perhaps the thread title is a bit too strongly worded, but consider that the scientific method is built on tools of thought and logic, and that as we augment and extend those tools, we may end up changing the whole process beyond what we recognize as 'the method' today; just as relativity and the laws of motion complement each other. :)

  • Loren Michael Registered User regular
    edited July 2008
    To be fair, my reference is high school history, and the breakdown of the course probably wasn't great at timelining the different empires, so I'll take your word for it.
    You and everyone else in this thread should read Nonzero. :P

    Also, take a look at the links I posted on the last page.

  • Zilla360 Spaaaace! In Space. Registered User regular
    edited July 2008
    stilist wrote: »
    Emergent data patterns are not that close to emergent intelligence.
    Don't you yourself operate on the basis of procedural recall, using seemingly unassociated yet linked memories to reconstruct a linear sequence of events?

    Like remembering where your keys are by thinking of the beer you drank last night? :P
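    Taken completely unseriously, that kind of cued recall is just a walk over an association graph. A toy sketch (the "memories" are obviously made up):

    # Toy association graph: each memory links to other memories.
    associations = {
        "beer last night": ["pub", "kitchen table"],
        "kitchen table": ["keys", "unpaid bills"],
        "pub": ["taxi home"],
    }

    def recall(cue, target, seen=None):
        """Follow linked memories depth-first until the target surfaces."""
        if seen is None:
            seen = set()
        if cue == target:
            return [cue]
        seen.add(cue)
        for linked in associations.get(cue, []):
            if linked not in seen:
                path = recall(linked, target, seen)
                if path:
                    return [cue] + path
        return None

    print(recall("beer last night", "keys"))
    # ['beer last night', 'kitchen table', 'keys']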

  • saggio Registered User regular
    edited July 2008
    Zilla360 wrote: »
    Don't you yourself operate on the basis of procedural recall, using seemingly unassociated yet linked memories to reconstruct a linear sequence of events?

    Like remembering where your keys are by thinking of the beer you drank last night? :P

    No.

  • Zilla360 Spaaaace! In Space. Registered User regular
    edited July 2008
    saggio wrote: »
    No.
    So what is your alternative method?

  • saggio Registered User regular
    edited July 2008
    I think you are assuming memory and understanding to be one and the same. They aren't.

  • Echo staring is caring Moderator mod
    edited July 2008
    I guess I was kind of implying that the whole deal with the dark ages was that they really were basically a failure of civilization for a couple hundred years. Though I suppose that isn't really fair to the progress in China and the Middle East during this time - still, in terms of mathematics it would've been fantastic if Greece had kept calculus going.

    For all the credit Aristotle gets, he probably held science back for at least a century.

    Being a scifi geek this thread interests me greatly. Singularity stuff makes for great fiction.

  • Mr_Rose Registered User regular
    edited July 2008
    Echo wrote: »
    For all the credit Aristotle gets, he probably held science back for at least a century.

    Being a scifi geek this thread interests me greatly. Singularity stuff makes for great fiction.
    Have you heard of Dresden Codak?

  • redx Bow Down! Before the power of Santa! Registered User regular
    edited July 2008
    IEEE Spectrum recently did a piece on the Singularity with some neat bits. Not exactly hard science or anything, but still fairly interesting.

  • Zilla360 Spaaaace! In Space. Registered User regular
    edited July 2008
    Mr_Rose wrote: »
    Have you heard of Dresden Codak?
    Religions as PLCs, that's brilliant. Invest in Jebus today! :lol:

  • Zilla360 Spaaaace! In Space. Registered User regular
    edited July 2008
    Alright, now this is awesome: Of course, all skepticism/negativity is welcome and encouraged, since atheism isn't a religion... :P

  • Zilla360 Spaaaace! In Space. Registered User regular
    edited July 2008
    Comment taken from a linked IEEE singularity article:
    Aaah, nooo, Zombie Barack Obama! Run awaay!

    You stay crazy and classy, internets. :lol:

  • Loren Michael Registered User regular
    edited July 2008
    Zilla360 wrote: »
    Alright, now this is awesome: Of course, all skepticism/negativity is welcome and encouraged, since atheism isn't a religion... :P

    Okay, yeah, that's pretty goddamn awesome.

  • Loren Michael Registered User regular
    edited July 2008
    Echo wrote: »
    For all the credit Aristotle gets, he probably held science back for at least a century.

    I thought it was in vogue to shit on Aristotle these days, actually.

    He is inextricable from the march of history. He is also one of the singularly great figures in intellectual history. He got stuff wrong, yes. But there is an enormous wealth of incredibly original work that came from him, dealing with a vast array of subjects.
