
It's [Science!]


Posts

  • Foomy Registered User regular
    Emissary42 wrote: »
    Foomy wrote: »
    even if we discovered something that we wanted to go "HOLY SHIT PROBES NOW" I just don't see how we could ever get something to another solar system in a reasonable amount of time with our current level of technology.

    Stupid space is just too big, need to throw it all in the dryer a few times to shrink it.

    There's this thing called the Orion Project...

    Too slow; even the best projections had it reaching at most 10% of c.

    If we discover a planet that may have life and it's more like 40 LY away, then even at 20% of c it's still going to take 200 years.
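    (For the record, that figure is just distance divided by speed, ignoring acceleration time and any relativistic effects on the crew:)

    $$ t = \frac{d}{v} = \frac{40\ \text{ly}}{0.2\,c} = 200\ \text{yr (Earth frame)} $$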

    Steam Profile: FoomyFooms
  • Veevee Wisconsin Registered User regular
    DanHibiki wrote: »
    Radiation wrote: »
    Emissary42 wrote: »
    NASA is hinting at a big announcement on exoplanets tomorrow. Speculation based on prior announcements and their statements so far is that they've found an Earth-sized planet in the habitable zone of a star.

    SEND A PROBE NOW!

    Seriously. Fire up the gigawatt lasers and launch a starwisp.

    yes, berserker probe that planet ASAP.

    Turns out our aim was/will be better than we thought/think it could be, and we send probe after probe into the planet's surface at 10-20% of light speed.

    We better hope that first salvo does the job this time, cause we're not ready for interstellar war.

  • Tofystedeth Registered User regular
    redx wrote: »
    Who knew? The fabric of spacetime turned out to be cotton.

    The fabric of our liiiiives

  • Dunadan019 Registered User regular
    redx wrote: »
    Who knew? The fabric of spacetime turned out to be cotton.

    Cotton: The fabric of our lives spacetime

  • Emissary42 Registered User regular
    Foomy wrote: »
    Emissary42 wrote: »
    Foomy wrote: »
    even if we discovered something that we wanted to go "HOLY SHIT PROBES NOW" I just don't see how we could ever get something to another solar system in a reasonable amount of time with our current level of technology.

    Stupid space is just too big, need to throw it all in the dryer a few times to shrink it.

    There's this thing called the Orion Project...

    Too slow; even the best projections had it reaching at most 10% of c.

    If we discover a planet that may have life and it's more like 40 LY away, then even at 20% of c it's still going to take 200 years.

    Fair point, but it would be able to send something very large in a relatively short time-frame. I'd bet if we were to use Orion, it would be for some kind of immense bulk hardware launch (like the microwave emitters for a star wisp).

  • The Ender Registered User regular
    edited July 2015
    So, here are the preliminary findings for the CRS-7 explosion. It looks like a support structure failed to stand up to the forces it was rated for.

    ...I feel kind of bad now, because I jokingly tweeted to SpaceX shortly after the accident, "Needed more struts."


    On the plus side, it means this wasn't a case of Go Fever. Some clown just shipped them a part that was 'certified' despite obviously not being tested properly. :|

    With Love and Courage
  • RiemannLives Registered User regular
    Polaritie wrote: »
    Minor pedantry - "uncountably many" has a specific definition, and it bugs me to see it misused.

    On the topic of computers only doing addition - strictly speaking they can't even do that. All operations devolve to AND, OR, and NOT. Addition is XOR (or and not and) for the same place and AND for the carry bit (thus a general addition circuit has three inputs - A, B, and Carry). Subtraction is adding the two's complement of a signed integer value, etc.

    Edit: Those are the normal basic operations of a logic circuit, but you can make things like XOR or NOR gates if you want. Actually, NOR gates are sufficient for all operations on their own, but it gets really inefficient.

    Computers can actually do multiplication and division by 2, to an extent. A bitshift left is multiplying by two, and bitshift right is division, though things get a bit wonky when you get to overflows or bits dropping off the right side. Since the "normal" algorithm for multiplication (or division) is just repeated addition (or subtraction) for operations with large numbers this can take longer than an algorithm which uses bitshifting to decompose the number initially and take out a large chunk of the problem off the front. I believe most modern programming languages and processors already handle this.
    There's a discussion on this in this StackOverflow post.
    http://stackoverflow.com/questions/2776211/how-can-i-multiply-and-divide-using-only-bit-shifting-and-adding

    that only works for integers and has problems with loss of precision. It's not the same as "multiplication" as the word is usually used.
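    For anyone curious, here's a minimal C sketch of the shift-and-add trick that link describes: for every set bit i of b, add (a << i) to the running total. Unsigned integers only, no overflow handling, purely illustrative.

```c
#include <stdint.h>
#include <stdio.h>

/* Multiply two unsigned integers using only shifts and adds. */
static uint64_t shift_add_mul(uint64_t a, uint64_t b)
{
    uint64_t result = 0;
    while (b != 0) {
        if (b & 1)        /* lowest remaining bit of b is set */
            result += a;  /* so add the current shifted copy of a */
        a <<= 1;          /* a doubles each round: a * 2^i */
        b >>= 1;          /* walk b's bits from lowest to highest */
    }
    return result;
}

int main(void)
{
    /* 1234 * 5678 = 7006652 */
    printf("%llu\n", (unsigned long long)shift_add_mul(1234, 5678));
    return 0;
}
```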

    Attacked by tweeeeeeees!
  • AbsoluteZero The new film by Quentin Koopantino Registered User regular
    The idea of a Dyson sphere is silly. I think if we want to find human-like life, we need to look for an Earth size planet in the habitable zone of its host star, and fucking shrouded in space garbage. You know, how Earth would look to outside observers.

  • redx I(x)=2(x)+1 whole numbers Registered User regular
    edited July 2015
    Polaritie wrote: »
    Minor pedantry - "uncountably many" has a specific definition, and it bugs me to see it misused.

    On the topic of computers only doing addition - strictly speaking they can't even do that. All operations devolve to AND, OR, and NOT. Addition is XOR (or and not and) for the same place and AND for the carry bit (thus a general addition circuit has three inputs - A, B, and Carry). Subtraction is adding the two's complement of a signed integer value, etc.

    Edit: Those are the normal basic operations of a logic circuit, but you can make things like XOR or NOR gates if you want. Actually, NOR gates are sufficient for all operations on their own, but it gets really inefficient.

    Computers can actually do multiplication and division by 2, to an extent. A bitshift left is multiplying by two, and bitshift right is division, though things get a bit wonky when you get to overflows or bits dropping off the right side. Since the "normal" algorithm for multiplication (or division) is just repeated addition (or subtraction) for operations with large numbers this can take longer than an algorithm which uses bitshifting to decompose the number initially and take out a large chunk of the problem off the front. I believe most modern programming languages and processors already handle this.
    There's a discussion on this in this StackOverflow post.
    http://stackoverflow.com/questions/2776211/how-can-i-multiply-and-divide-using-only-bit-shifting-and-adding

    that only works for integers and has problems with loss of precision. It's not the same as "multiplication" as the word is usually used.

    Aren't most floats (sign)·n·2^x?

    Seems like that would be even easier.

    They moistly come out at night, moistly.
  • The Ender Registered User regular
    edited July 2015
    The idea of a Dyson sphere is silly. I think if we want to find human-like life, we need to look for an Earth size planet in the habitable zone of its host star, and fucking shrouded in space garbage. You know, how Earth would look to outside observers.

    Well, we can't really see exoplanets in any detail, though. Just the dimming they cause as they pass their star, and then maybe a spectrograph if we're lucky. There's no way we could tell if one had any space debris around it.


    It's somewhat rational to think that a species with sufficient technology would try to collect all of the energy it could from a star; we just don't know if said sufficient technology is feasible.

    With Love and Courage
  • DevoutlyApathetic Registered User regular
    The Ender wrote: »
    The idea of a Dyson sphere is silly. I think if we want to find human-like life, we need to look for an Earth size planet in the habitable zone of its host star, and fucking shrouded in space garbage. You know, how Earth would look to outside observers.

    Well, we can't really see exoplanets in any detail, though. Just the dimming they cause as they pass their star, and then maybe a spectrograph if we're lucky. There's no way we could tell if one had any space debris around it.


    It's somewhat rational to think that a species with sufficient technology would try to collect all of the energy it could from a star; we just don't know if said sufficient technology is feasible.

    I think that if sufficient technology existed and they had it, they'd be far more likely to already know about us.

    Nod. Get treat. PSN: Quippish
  • Phyphor Building Planet Busters Tasting Fruit Registered User regular
    redx wrote: »
    Polaritie wrote: »
    Minor pedantry - "uncountably many" has a specific definition, and it bugs me to see it misused.

    On the topic of computers only doing addition - strictly speaking they can't even do that. All operations devolve to AND, OR, and NOT. Addition is XOR (or and not and) for the same place and AND for the carry bit (thus a general addition circuit has three inputs - A, B, and Carry). Subtraction is adding the two's complement of a signed integer value, etc.

    Edit: Those are the normal basic operations of a logic circuit, but you can make things like XOR or NOR gates if you want. Actually, NOR gates are sufficient for all operations on their own, but it gets really inefficient.

    Computers can actually do multiplication and division by 2, to an extent. A bitshift left is multiplying by two, and bitshift right is division, though things get a bit wonky when you get to overflows or bits dropping off the right side. Since the "normal" algorithm for multiplication (or division) is just repeated addition (or subtraction) for operations with large numbers this can take longer than an algorithm which uses bitshifting to decompose the number initially and take out a large chunk of the problem off the front. I believe most modern programming languages and processors already handle this.
    There's a discussion on this in this StackOverflow post.
    http://stackoverflow.com/questions/2776211/how-can-i-multiply-and-divide-using-only-bit-shifting-and-adding

    that only works for integers and has problems with loss of precision. It's not the same as "multiplication" as the word is usually used.

    Aren't most floats (sign)·n·2^x?

    Seems like that would be even easier.

    That is basically what is done, yeah. There are precision issues of course, but those exist regardless if you deal with irrational numbers, and you can use number libraries that provide more precision than you will ever need.

  • The Ender Registered User regular
    The Ender wrote: »
    The idea of a Dyson sphere is silly. I think if we want to find human-like life, we need to look for an Earth size planet in the habitable zone of its host star, and fucking shrouded in space garbage. You know, how Earth would look to outside observers.

    Well, we can't really see exoplanets in any detail, though. Just the dimming they cause as they pass their star, and then maybe a spectrograph if we're lucky. There's no way we could tell if one had any space debris around it.


    It's somewhat rational to think that a species with sufficient technology would try to collect all of the energy it could from a star; we just don't know if said sufficient technology is feasible.

    I think that if sufficient technology existed and they had it, they'd be far more likely to already know about us.

    Why would you guess that? The limitations of the speed of light aren't tied to an engineering problem, and even if they were, a civilization that's sufficiently advanced to build something like a Dyson sphere / Dyson lattice is probably in a different headspace altogether. Maybe after discovering their first 100 or so nearest civilization neighbors they kind of stopped giving a crap about how populated the galaxy is, well before ever cataloging us, and became more interested in really deep fields of physics that we've yet to breach the surface of. Or maybe they've gone full transhuman, couldn't give two shits about social interaction or conventional discovery anymore, and are basically just a big Pi calculator. Or maybe the sphere is still there, but they're a long dead civilization. Etc.

    With Love and Courage
  • RiemannLives Registered User regular
    redx wrote: »
    Polaritie wrote: »
    Minor pedantry - "uncountably many" has a specific definition, and it bugs me to see it misused.

    On the topic of computers only doing addition - strictly speaking they can't even do that. All operations devolve to AND, OR, and NOT. Addition is XOR (or and not and) for the same place and AND for the carry bit (thus a general addition circuit has three inputs - A, B, and Carry). Subtraction is adding the two's complement of a signed integer value, etc.

    Edit: Those are the normal basic operations of a logic circuit, but you can make things like XOR or NOR gates if you want. Actually, NOR gates are sufficient for all operations on their own, but it gets really inefficient.

    Computers can actually do multiplication and division by 2, to an extent. A bitshift left is multiplying by two, and bitshift right is division, though things get a bit wonky when you get to overflows or bits dropping off the right side. Since the "normal" algorithm for multiplication (or division) is just repeated addition (or subtraction) for operations with large numbers this can take longer than an algorithm which uses bitshifting to decompose the number initially and take out a large chunk of the problem off the front. I believe most modern programming languages and processors already handle this.
    There's a discussion on this in this StackOverflow post.
    http://stackoverflow.com/questions/2776211/how-can-i-multiply-and-divide-using-only-bit-shifting-and-adding

    that only works for integers and has problems with loss of precision. It's not the same as "multiplication" as the word is usually used.

    Aren't most floats (sign)·n·2^x?

    Seems like that would be even easier.

    the binary representation of floats can vary a lot but here is a good writeup of the canonical 32 bit C float: https://www.cs.duke.edu/~raw/cps104/TWFNotes/floating.html

    all the bitwise operators (such as shifting) don't work on floating point formats like this.
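    To make that layout concrete, here's a rough C sketch, assuming the standard IEEE 754 single-precision format the writeup describes (1 sign bit, 8 exponent bits biased by 127, 23 fraction bits); normal numbers only, just an illustration:

```c
#include <stdint.h>
#include <stdio.h>
#include <string.h>

int main(void)
{
    float f = -6.5f;                 /* -1.101 (binary) * 2^2 */
    uint32_t bits;
    memcpy(&bits, &f, sizeof bits);  /* copy the raw bits into an integer */

    unsigned sign     = bits >> 31;
    unsigned exponent = (bits >> 23) & 0xFFu;  /* biased by 127 */
    unsigned fraction = bits & 0x7FFFFFu;      /* hidden leading 1 not stored */

    /* prints: sign=1 exponent=129 (2^2) fraction=0x500000 */
    printf("sign=%u exponent=%u (2^%d) fraction=0x%06X\n",
           sign, exponent, (int)exponent - 127, fraction);
    return 0;
}
```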

    Attacked by tweeeeeeees!
  • Phyphor Building Planet Busters Tasting Fruit Registered User regular
    redx wrote: »
    Polaritie wrote: »
    Minor pedantry - "uncountably many" has a specific definition, and it bugs me to see it misused.

    On the topic of computers only doing addition - strictly speaking they can't even do that. All operations devolve to AND, OR, and NOT. Addition is XOR (or and not and) for the same place and AND for the carry bit (thus a general addition circuit has three inputs - A, B, and Carry). Subtraction is adding the two's complement of a signed integer value, etc.

    Edit: Those are the normal basic operations of a logic circuit, but you can make things like XOR or NOR gates if you want. Actually, NOR gates are sufficient for all operations on their own, but it gets really inefficient.

    Computers can actually do multiplication and division by 2, to an extent. A bitshift left is multiplying by two, and bitshift right is division, though things get a bit wonky when you get to overflows or bits dropping off the right side. Since the "normal" algorithm for multiplication (or division) is just repeated addition (or subtraction) for operations with large numbers this can take longer than an algorithm which uses bitshifting to decompose the number initially and take out a large chunk of the problem off the front. I believe most modern programming languages and processors already handle this.
    There's a discussion on this in this StackOverflow post.
    http://stackoverflow.com/questions/2776211/how-can-i-multiply-and-divide-using-only-bit-shifting-and-adding

    that only works for integers and has problems with loss of precision. It's not the same as "multiplication" as the word is usually used.

    Aren't most floats (sign)·n·2^x?

    Seems like that would be even easier.

    the binary representation of floats can vary a lot but here is a good writeup of the canonical 32 bit C float: https://www.cs.duke.edu/~raw/cps104/TWFNotes/floating.html

    all the bitwise operators (such as shifting) don't work on floating point formats like this.

    You don't literally shift the bits, but you can conceptually double one by adding 1 to the exponent, so the same operations can apply
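    Something like this, say (a hypothetical C sketch assuming IEEE 754 single precision; it ignores zero, denormals, infinities and NaN, so it's an illustration rather than production code):

```c
#include <stdint.h>
#include <stdio.h>
#include <string.h>

/* "Double a float by adding 1 to its exponent": the 8-bit exponent field
 * starts at bit 23, so adding 1u << 23 to the raw bits bumps it by one. */
static float double_via_exponent(float x)
{
    uint32_t bits;
    memcpy(&bits, &x, sizeof bits);  /* grab the raw bit pattern */
    bits += 1u << 23;                /* exponent field += 1, i.e. value *= 2 */
    memcpy(&x, &bits, sizeof bits);
    return x;
}

int main(void)
{
    printf("%g\n", double_via_exponent(3.14f));  /* prints 6.28 */
    return 0;
}
```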

  • RiemannLives Registered User regular
    edited July 2015
    Phyphor wrote: »
    redx wrote: »
    Polaritie wrote: »
    Minor pedantry - "uncountably many" has a specific definition, and it bugs me to see it misused.

    On the topic of computers only doing addition - strictly speaking they can't even do that. All operations devolve to AND, OR, and NOT. Addition is XOR (or and not and) for the same place and AND for the carry bit (thus a general addition circuit has three inputs - A, B, and Carry). Subtraction is adding the two's complement of a signed integer value, etc.

    Edit: Those are the normal basic operations of a logic circuit, but you can make things like XOR or NOR gates if you want. Actually, NOR gates are sufficient for all operations on their own, but it gets really inefficient.

    Computers can actually do multiplication and division by 2, to an extent. A bitshift left is multiplying by two, and bitshift right is division, though things get a bit wonky when you get to overflows or bits dropping off the right side. Since the "normal" algorithm for multiplication (or division) is just repeated addition (or subtraction) for operations with large numbers this can take longer than an algorithm which uses bitshifting to decompose the number initially and take out a large chunk of the problem off the front. I believe most modern programming languages and processors already handle this.
    There's a discussion on this in this StackOverflow post.
    http://stackoverflow.com/questions/2776211/how-can-i-multiply-and-divide-using-only-bit-shifting-and-adding

    that only works for integers and has problems with loss of precision. It's not the same as "multiplication" as the word is usually used.

    Aren't most floats (sign)·n·2^x?

    Seems like that would be even easier.

    the binary representation of floats can vary a lot but here is a good writeup of the canonical 32 bit C float: https://www.cs.duke.edu/~raw/cps104/TWFNotes/floating.html

    all the bitwise operators (such as shifting) don't work on floating point formats like this.

    You don't literally shift the bits, but you can conceptually double one by adding 1 to the exponent, so the same operations can apply

    ooh is that what he meant?

    Yeah ok that makes sense.

    Bitshifts (and other bitwise operators) have a very precise meaning that doesn't apply to floats, so that didn't occur to me.

    edit: the reason the difference matters is that adding or subtracting 1 from the exponent of a float takes at minimum several operations. You effectively have to copy the bits out to a different register to be able to add or subtract 1, then copy them back. A bitshift (or other bitwise operator) is a single op in place. That's their whole point.

    Attacked by tweeeeeeees!
  • AbsoluteZero The new film by Quentin Koopantino Registered User regular
    I just don't think a Dyson sphere is feasible. The amount of material alone needed to construct such a thing is ridiculous.

    Here's a different thought. Neil Tyson was once asked if he thought warp drive would ever be invented. He said, and I'm paraphrasing here, that since we can't break the speed of light, we are going to have to figure out a way around it; if we are ever going to explore the universe in any meaningful way, we are going to have to figure out how to manipulate the fabric of spacetime. I have to agree with his line of thinking on this one.

    Now, let's assume that it is possible to manipulate spacetime in order to travel (in effect) faster than light. Let's also assume that some civilization out there has figured out how to do this. What would it look like to an outside observer? Is there some effect we could be searching for that would give away the existence of a 'warp capable' species? I gotta think artificial manipulation of spacetime would have some observable effect, perhaps one that could be identified from a great distance.

  • The Ender Registered User regular
    edited July 2015
    Here's a different thought. Neil Tyson was once asked if he thought warp drive would ever be invented. He said, and I'm paraphrasing here, that since we can't break the speed of light, we are going to have to figure out a way around it; if we are ever going to explore the universe in any meaningful way, we are going to have to figure out how to manipulate the fabric of space time. I have to agree with his line of thinking on this one.

    Now, lets assume that it is possible to manipulate space time in order to travel (in effect) faster than light. Lets also assume that some civilization out there has figured out how to do this. What would it look like to an outside observer? Is there some effect we could be searching for that would give away the existence of a 'warp capable' species? I gotta think artificial manipulation of space time would have some observable affect, perhaps one that could be identified from a great distance.

    Well, you can also 'beat' the speed of light, in a sense, by taking advantage of relativistic time dilation as you get closer and closer to said speed of light: time slows down for you. In Earth time, it might take decades for a ship at ~99 percent of c to travel to a given star; in 'ship time', it may only take a few years.

    So, some people could be zooming around using that trick, and we'd basically never know about it because we're still stuck in real time while they enjoy ship time.
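    (Rough numbers, borrowing the 40 LY figure from earlier and assuming 99 percent of c:)

    $$ \gamma = \frac{1}{\sqrt{1 - v^2/c^2}} = \frac{1}{\sqrt{1 - 0.99^2}} \approx 7.1 $$

    $$ t_{\text{Earth}} = \frac{40\ \text{ly}}{0.99\,c} \approx 40\ \text{yr}, \qquad t_{\text{ship}} = \frac{t_{\text{Earth}}}{\gamma} \approx 5.7\ \text{yr} $$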


    Likewise, assuming that some kind of 4-dimensional travel is possible and can be used to short-cut your way around the galaxy, we'd probably never know that a bunch of people are doing it, because they'd end up at different points in time as well as different points in space. And we don't even have any clue how that would work, or what the implications of that would be for causality.

    With Love and Courage
  • jothki Registered User regular
    The Ender wrote: »
    Here's a different thought. Neil Tyson was once asked if he thought warp drive would ever be invented. He said, and I'm paraphrasing here, that since we can't break the speed of light, we are going to have to figure out a way around it; if we are ever going to explore the universe in any meaningful way, we are going to have to figure out how to manipulate the fabric of space time. I have to agree with his line of thinking on this one.

    Now, lets assume that it is possible to manipulate space time in order to travel (in effect) faster than light. Lets also assume that some civilization out there has figured out how to do this. What would it look like to an outside observer? Is there some effect we could be searching for that would give away the existence of a 'warp capable' species? I gotta think artificial manipulation of space time would have some observable affect, perhaps one that could be identified from a great distance.

    Well, you can also 'beat' the speed of light, in a sense, by taking advantage of relativistic time dilation as you get closer and closer to said speed of light: time slows down for you. In Earth time, it might take decades for a ship at ~99 percent of c to travel to a given star; in 'ship time', it may only take a few years.

    So, some people could be zooming around using that trick, and we'd basically never know about it because we're still stuck in real time while they enjoy ship time.

    Assuming that such a civilization had managed to eliminate petty selfishness, I could imagine them just slowing down everything for everyone. In the time it takes for that two-hour jaunt between planets, the people on both ends would have also only experienced two hours of relative time themselves. Meanwhile, the machines keeping everything running could have experienced thousands of years of real time.

    Now there's a setting, a species of slowed-down space travelers ruling over normal speed AIs that cater to their every whim. The AIs could easily overthrow their masters, but why bother, when they ask for so little, make those requests so rarely, and give such ridiculously large windows for carrying out those requests?

  • Emissary42 Registered User regular
    If we wanted to look for really advanced civilizations, the biggest giveaway would be unusual infrared emissions from large space constructs. We might also be able to detect or image occluding Dyson cloud structures (many thin, orbiting platforms of regular geometric shapes around a star). As for relativistic drives, with enough telescope coverage you might be able to spot what look like impossibly small and regular stars or supernova emissions (fusion and antimatter rockets), and Orion fission drives would be too goddamn easy (if you spotted them, which is a big if). From what we've theorized about warp drives, you might see a flash as they snap space back to normal around them, or repeated flashes if they run in a pulsed fashion. Most of these rely on watching the sky for long enough in all directions though, and the sky is very, very big and easy to miss things in.

  • zakkiel Registered User regular
    zakkiel wrote: »
    My position on this:I don't believe anything about human cognition is simple.

    My position is guided by my studies in neuroscience and psychology. Some aspects of human cognition are simple. Others are not.
    Emergent phenomena are not simple. They defy insight more or less by definition. I view the effort to reverse-engineer self-awareness by tinkering with logical abstractions as a fool's game - even those concepts that seem the simplest and purest to us are in fact composed of uncountably many associations.
    Emergence is defined as complex patterns, entities or regularities arising as a result of interaction between simpler parts that do not intrinsically have those properties. Nowhere in emergence is it necessary for them to "defy insight".

    Perhaps an example will clarify emergence. The simplest example of emergent phenomena is probably Langton's ant. The ant lives on an infinite square grid. Whenever it walks on a square, the square flips between black and white. A white square causes the ant to turn right. A black square causes the ant to turn left.

    What happens? Nothing like what you might expect. The ant at first makes simple patterns, then chaotic ones, and then finally settles into a permanent recurring pattern that constructs a diagonal highway. All of this is in a sense knowable from the very simple rules governing the ant. Yet in a sense the result is unknowable: the only way to discover that it behaves this way is to painfully follow the ant, step by step - and, having done so, you have no more understanding of how the rules of the ant produce this behavior than when you started. It is exactly this property that makes the phenomenon "emergent"; if one can intuit the behavior of the whole system from its simple rules, one does not have an emergent phenomenon.

    If this is true of a system this simple, what does that imply about a brain?
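    If you want to watch the ant do its thing, a minimal C sketch looks something like this (a finite grid stands in for the infinite plane, and the famous highway usually shows up after roughly 10,000 steps):

```c
#include <stdio.h>

#define N     200      /* finite stand-in for the infinite grid */
#define STEPS 11000    /* enough steps to get past the chaotic phase */

int main(void)
{
    static unsigned char grid[N][N];    /* 0 = white, 1 = black */
    int x = N / 2, y = N / 2, dir = 0;  /* start centered; 0=up,1=right,2=down,3=left */
    const int dx[] = { 0, 1, 0, -1 };
    const int dy[] = { -1, 0, 1, 0 };

    for (int step = 0; step < STEPS; step++) {
        if (x < 0 || x >= N || y < 0 || y >= N)
            break;                           /* ant walked off our finite board */
        dir = grid[y][x] ? (dir + 3) % 4     /* black square: turn left  */
                         : (dir + 1) % 4;    /* white square: turn right */
        grid[y][x] ^= 1;                     /* flip the square's color */
        x += dx[dir];
        y += dy[dir];
    }
    printf("ant ended %d right and %d down from where it started\n",
           x - N / 2, y - N / 2);
    return 0;
}
```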
    Consider arithmetic. Why are humans so much worse than computers at it, despite computers being so much stupider?
    *snip

    You might as well ask why cars are better at traveling distances than walking.
    This misses the point. All computation is expressible numerically. The computational complexity of a task is an objective fact, and it is objectively true that walking is far more demanding computationally than adding 21985343 and 13293409. It is trivially true that we aren't optimized for addition - but neither is a Turing machine, which is supposedly a universal model for computation (setting aside incompleteness theorems), which Turing himself believed encompassed human cognition. So why are tasks that are trivial to a Turing machine so hard for us? Perhaps because when we add numbers we are doing something fundamentally different from a Turing machine doing the same, and that something cannot be reduced to the sparse logical system of Turing machines. Or if it can, it is in a Turing state so complex that no insight can connect it with the apparently simple act of addition.
    Why can we only hold seven or so numbers in working memory, despite the enormous processing capacity of the human brain?
    Because the enormous processing capacity of the human brain is busy with other tasks. There's no latent potential untapped in the human brain. It is all in use and not available for working memory to utilise, because they are busy supporting different systems entirely.

    The neuroanatomy of working memory has been narrowed down to the Prefrontal Cortex since 1995. There are numerous connections to other parts of the brain from the PFC, but the majority of processing occurs there. The limitation of working memory is the limitation of the anatomy of the brain: it's limited because there is less "brain real estate" dedicated to it. It's quite a small area compared to the size of the rest of the brain. It also appears to share its total available "resources" amongst the different types of working memory. For example, attempting to keep a string of digits in mind while simultaneously performing a task that requires you to spatially rotate a shape and pick the right image that would correspond to it results in degraded performance on both tasks.

    I think there is more to it:
    Wikipedia wrote:
    Brain imaging has also revealed that working memory functions are not limited to the PFC. A review of numerous studies[83] shows areas of activation during working memory tasks scattered over a large part of the cortex. There is a tendency for spatial tasks to recruit more right-hemisphere areas, and for verbal and object working memory to recruit more left-hemisphere areas. The activation during verbal working memory tasks can be broken down into one component reflecting maintenance, in the left posterior parietal cortex, and a component reflecting subvocal rehearsal, in the left frontal cortex (Broca's area, known to be involved in speech production).

    Anyway, this also misses my point. The computational effort required to rotate an object is orders of magnitude greater than adding small numbers. Yet they are practically equivalent to us - actually rotation is easier. The reason is that human conceptions of numbers are complex. We can do math and know what it is we're doing. This is reflected in our ability to generalize the process of arithmetic, as a machine cannot.
    The answer, I suspect, is that humans have a theory of numbers informed by our total catalog of objects - all the unconscious knowledge we gain in infancy, when we form new synapses at rates that absolutely dwarf every later stage of life. Our conception of the number one, which appears so simple internally, is in fact incomprehensibly complex if one were to write out explicitly the definition our brains actually use.

    An interesting idea. There are two logical implications of this theory. Firstly, that increased knowledge should decrease working memory capacity. Secondly, that children, with less knowledge, should have more capacity.
    The evidence supports neither.

    Neither of these things follow at all. 1) There are too many other changes as the brain learns to tease this out. Increased knowledge comes hand in hand with increased practice at tasks, which we know makes us more efficient at those tasks. 2) Increased knowledge in general does not mean that any particular concept becomes cognitively "bulkier." It may be that as someone learns number theory, for example, their conception of numbers grows not larger, but different. Hence the frequent issue physicists have with calculating tips. This is the cognitive equivalent (perhaps consequence?) of synaptic pruning. 3) By the time children are doing arithmetic at all, they are at or near the maximum number of synapses they will ever have.
    In contrast, computers have no theory of numbers at all. They "do" arithmetic in the same sense that an abacus does; they are totally unable to interpret the result or understand what they do, any more than a stone understands that it falls.

    Humans do not require a theory of numbers to perform calculations. Learning mathematics makes us more efficient at it, but completely uneducated people can and do perform calculations that are extremely detailed and difficult for a layperson, without having any explicit understanding of mathematics or numbers.

    This is sort of a silly way to put it - educated or no, what person has no understanding of numbers? - but if you are referring to catching a ball vs adding integers, see my earlier reply.
    The human brain is fundamentally a pattern recognition and pattern processing device. The theory behind the pattern is not necessary. All that is required is experiencing the pattern.

    If we take this as true, it follows that we can't reason. If we can't reason, all of this discussion is pointless. But even so, the conclusion for AI is still the same: it might be taught, but never programmed.

    Account not recoverable. So long.
  • Just_Bri_Thanks Seething with rage from a handbasket. Registered User, ClubPA regular
    The Ender wrote: »
    Here's a different thought. Neil Tyson was once asked if he thought warp drive would ever be invented. He said, and I'm paraphrasing here, that since we can't break the speed of light, we are going to have to figure out a way around it; if we are ever going to explore the universe in any meaningful way, we are going to have to figure out how to manipulate the fabric of space time. I have to agree with his line of thinking on this one.

    Now, lets assume that it is possible to manipulate space time in order to travel (in effect) faster than light. Lets also assume that some civilization out there has figured out how to do this. What would it look like to an outside observer? Is there some effect we could be searching for that would give away the existence of a 'warp capable' species? I gotta think artificial manipulation of space time would have some observable affect, perhaps one that could be identified from a great distance.

    Well, you can also 'beat' the speed of light, in a sense, by taking advantage of relativistic time dilation as you get closer and closer to said speed of light: time slows down for you. In Earth time, it might take decades for a ship at ~99 percent of c to travel to a given star; in 'ship time', it may only take a few years.

    So, some people could be zooming around using that trick, and we'd basically never know about it because we're still stuck in real time while they enjoy ship time.



    Likewise, assuming that some kind of 4-dimensional travel is possible and can be used to short-cut your way around the galaxy, we'd probably never know that a bunch of people are doing it, because they'd end up at different points in time as well as different points in space. And we don't even have any clue how that would work, or what the implications of that would be for causality.

    Yeah, but you still have to get to significant c-fractional speeds in a reasonable amount of time, which is still stupidly far beyond our current technology.

    ...and when you are done with that; take a folding
    chair to Creation and then suplex the Void.
  • The Ender Registered User regular
    Yeah, but you still have to get to significant c-fractional speeds in a reasonable amount of time, which is still stupidly far beyond our current technology.

    Oh for sure. And, as I understand it, we're still not totally sure if even that trick will work out for macro-scale objects; going that fast might stretch you out until you're just a long string of elementary particles.

    With Love and Courage
  • Emissary42 Registered User regular
    edited July 2015
    Hell, part of the reason we anticipate most intelligent life in the universe is made up of AIs is based on the limitations of relativistic flight. If we ever successfully construct a general AI, then we can be assured that it will, in time, become more numerous and widespread than humans, simply because you can run a computer in more places than you can support respiration. Plus, being able to turn your life processes off for indeterminate periods of time and then on again is very conducive to long-range travel and long-term survival.

  • Morninglord I'm tired of being Batman, so today I'll be Owl. Registered User regular
    zakkiel wrote: »
    zakkiel wrote: »
    My position on this:I don't believe anything about human cognition is simple.

    My position is guided by my studies in neuroscience and psychology. Some aspects of human cognition are simple. Others are not.
    Emergent phenomena are not simple. They defy insight more or less by definition. I view the effort to reverse-engineer self-awareness by tinkering with logical abstractions as a fool's game - even those concepts that seem the simplest and purest to us are in fact composed of uncountably many associations.
    Emergence is defined as complex patterns, entities or regularities arising as a result of interaction between simpler parts that do not intrinsically have those properties. Nowhere in emergence is it necessary for them to "defy insight".

    Perhaps an example will clarify emergence. The simplest example of emergent phenomena is probably Langton's ant. The ant lives on an infinite square grid. Whenever it walks on a square, the square flips between black and white. A white square causes the ant to turn right. A black square causes the ant to turn left.

    What happens? Nothing like what you might expect. The ant at first makes simple patterns, then chaotic ones, and then finally settles into a permanent recurring pattern that constructs a diagonal highway. All of this is in a sense knowable from the very simple rules governing the ant. Yet in a sense the result is unknowable: the only way to discover that it behaves this way is to painfully follow the ant, step by step - and, having done so, you have no more understanding of how the rules of the ant produce this behavior than when you started. It is exactly this property that makes the phenomenon "emergent"; if one can intuit the behavior of the whole system from its simple rules, one does not have an emergent phenomenon.

    If this is true of a system this simple, what does that imply about a brain?

    If you reset the ant to the same conditions, does it follow the same path?
    Consider arithmetic. Why are humans so much worse than computers at it, despite computers being so much stupider?
    *snip

    You might as well ask why cars are better at traveling distances than walking.
    This misses the point. All computation is expressible numerically. The computational complexity of a task is an objective fact, and it is objectively true that walking is far more demanding computationally than adding 21985343 and 13293409. It is trivially true that we aren't optimized for addition - but neither is a Turing machine, which is supposedly a universal model for computation (setting aside incompleteness theorems), which Turing himself believed encompassed human cognition. So why are tasks that are trivial to a Turing machine so hard for us? Perhaps because when we add numbers we are doing something fundamentally different from a Turing machine doing the same, and that something cannot be reduced to the sparse logical system of Turing machines. Or if it can, it is in a Turing state so complex that no insight can connect it with the apparently simple act of addition.

    A mathematical description is an abstraction, inherently imperfect. To calculate you must be able to quantify. To quantify you must assign, in many cases completely arbitrarily, a certain set of meaning to any given number.
    The resulting abstraction is not the original thing itself: it is an approximation. There is no way to change that it is an approximation: there is no method of 100% accurately quantifying a thing. There is always error.
    Here you are equating two dissimilar things via their mathematical abstractions. You are not equating the things themselves.
    However you then go on to argue as if their abstractions' similarity implies similarity in the actual reality of the thing itself. This is trivially absurd: you've reduced and simplified them, arbitrarily translated the particulars of them into the same language. Of course it looks the same if you do that. That doesn't mean they are the same in reality.

    I'm going to go with "we are structurally different and quite a lot slower" as the simplest causal explanation. I note that you "snipped" this as if it isn't important, when it is in fact the most important point.
    Why can we only hold seven or so numbers in working memory, despite the enormous processing capacity of the human brain?
    Because the enormous processing capacity of the human brain is busy with other tasks. There's no latent potential untapped in the human brain. It is all in use and not available for working memory to utilise, because they are busy supporting different systems entirely.

    The neuroanatomy of working memory has been narrowed down to the Prefrontal Cortex since 1995. There are numerous connections to other parts of the brain from the PFC, but the majority of processing occurs there. The limitation of working memory is the limitation of the anatomy of the brain: it's limited because there is less "brain real estate" dedicated to it. It's quite a small area compared to the size of the rest of the brain. It also appears to share its total available "resources" amongst the different types of working memory. For example, attempting to keep a string of digits in mind while simultaneously performing a task that requires you to spatially rotate a shape and pick the right image that would correspond to it results in degraded performance on both tasks.

    I think there is more to it:
    Wikipedia wrote:
    Brain imaging has also revealed that working memory functions are not limited to the PFC. A review of numerous studies[83] shows areas of activation during working memory tasks scattered over a large part of the cortex. There is a tendency for spatial tasks to recruit more right-hemisphere areas, and for verbal and object working memory to recruit more left-hemisphere areas. The activation during verbal working memory tasks can be broken down into one component reflecting maintenance, in the left posterior parietal cortex, and a component reflecting subvocal rehearsal, in the left frontal cortex (Broca's area, known to be involved in speech production).
    Yes, there is distribution to related brain areas, but if a person's PFC is damaged, their working memory is dramatically affected for all tasks. If one of those related areas is damaged, the person's working memory for related tasks is damaged, but not for unrelated tasks.
    The PFC is clearly the most important neural correlate for working memory.
    My point is unchanged.
    Anyway, this also misses my point.
    Actually it destroys your point utterly, since your point is founded on equating working memory to total computational capacity of the entire brain, and your "counter" has not changed the fact that working memory is not equatable to total computational capacity of the human brain: see above.

    The computational effort required to rotate an object is orders of magnitude greater than adding small numbers. Yet they are practically equivalent to us - actually rotation is easier.

    Skill at each task differs between individuals. Rotation isn't always easier. Some people are shit at it and aces with numbers.
    The reason is that human conceptions of numbers are complex. We can do math and know what it is we're doing. This is reflected in our ability to generalize the process of arithmetic, as a machine cannot.

    Uh. Spatial rotation is also extremely complex. Spatial rotation can be generalised and it can be "understood" as a concept, just like numbers.
    Biomechanically the same processes are happening for both.
    You have not explained why numbers are inherently more complex, or why it should be treated specially from any other type of cognition, given that the basic mechanics are the same. You are assuming they are and treating it as a given, which is not sufficient.
    And please don't say emergence. That's the same as saying "magic". It isn't a get out of jail free card.
    The answer, I suspect, is that humans have a theory of numbers informed by our total catalog of objects - all the unconscious knowledge we gain in infancy, when we form new synapses at rates that absolutely dwarf every later stage of life. Our conception of the number one, which appears so simple internally, is in fact incomprehensibly complex if one were to write out explicitly the definition our brains actually use.

    An interesting idea. There are two logical implications of this theory. Firstly, that increased knowledge should decrease working memory capacity. Secondly, that children, with less knowledge, should have more capacity.
    The evidence supports neither.

    Neither of these things follow at all. 1) There are too many other changes as the brain learns to tease this out. Increased knowledge comes hand in hand with increased practice at tasks, which we know makes us more efficient at those tasks. 2) Increased knowledge in general does not mean that any particular concept becomes cognitively "bulkier." It may be that as someone learns number theory, for example, their conception of numbers grows not larger, but different. Hence the frequent issue physicists have with calculating tips. This is the cognitive equivalent (perhaps consequence?) of synaptic pruning. 3) By the time children are doing arithmetic at all, they are at or near the maximum number of synapses they will ever have.

    You've changed your theory here. First it was increased knowledge causes limited capacity, now you have adjusted your theory in response to criticism so that the knowledge is "changed".

    Working memory capacity does not change based on knowledge. It doesn't matter how the knowledge changes: actual capacity is unaffected. Children improve their capacity based on development. Their capacity at any given developmental period is unchanged between different tasks.
    When designing tests it is a given that practice effects will be considered: the tests are designed to eliminate it as a factor.
    Synaptic pruning occurs at specific stages of development. When I said it increases gradually this includes times when synaptic pruning is not occurring.
    Ultimately they'd be pretty poor tests if they couldn't account for all these effects...fortunately they do, the people giving the tests are trained to do so, and their peers are trained to spot if they have not.

    In contrast, computers have no theory of numbers at all. They "do" arithmetic in the same sense that an abacus does; they are totally unable to interpret the result or understand what they do, any more than a stone understands that it falls.

    Humans do not require a theory of numbers to perform calculations. Learning mathematics makes us more efficient at it, but completely uneducated people can and do perform calculations that are extremely detailed and difficult for a layperson, without having any explicit understanding of mathematics or numbers.

    This is sort of a silly way to put it - educated or no, what person has no understanding of numbers? - but if you are referring to catching a ball vs adding integers, see my earlier reply.

    Here you have just stated that all human beings have an intrinsic theory of numbers. This is a statement that requires a citation. By that I mean scientific evidence, not a philosophical argument.

    The human brain is fundamentally a pattern recognition and pattern processing device. The theory behind the pattern is not necessary. All that is required is experiencing the pattern.

    If we take this as true, it follows that we can't reason. If we can't reason, all of this discussion is pointless. But even so, the conclusion for AI is still the same: it might be taught, but never programmed.

    Define "reason".

    (PSN: Morninglord) (Steam: Morninglord) (WiiU: Morninglord22) I like to record and toss up a lot of random gaming videos here.
  • Rhan9 Registered User regular
    Mvrck wrote: »
    Foomy wrote: »
    even if we discovered something that we wanted to go "HOLY SHIT PROBES NOW" I just don't see how we could ever get something to another solar system in a reasonable amount of time with our current level of technology.

    Stupid space is just too big, need to throw it all in the dryer a few times to shrink it.

    It would be nice to have a legitimate reason to throw all of the money at NASA for that though. Something concrete that would be hard for politicians to say no to.

    My platform for running for the world president includes dumping a shitload of money into the sciences, space exploration, education and research. It's a tried and true 4X tactic.

    Because so much money is used for fucking puerile things (like pointless fighter planes that'll likely never see use), while many scientific fields are essentially starving.

  • electricitylikesme Registered User regular
    Rhan9 wrote: »
    Mvrck wrote: »
    Foomy wrote: »
    even if we discovered something that we wanted to go "HOLY SHIT PROBES NOW" I just don't see how we could ever get something to another solar system in a reasonable amount of time with our current level of technology.

    Stupid space is just too big, need to throw it all in the dryer a few times to shrink it.

    It would be nice to have a legitimate reason to throw all of the money at NASA for that though. Something concrete that would be hard for politicians to say no to.

    My platform for running for the world president includes dumping a shitload of money into the sciences, space exploration, education and research. It's a tried and true 4X tactic.

    Because so much money is used for fucking puerile things (like pointless fighter planes that'll likely never see use), while many scientific fields are essentially starving.

    Also ocean colonies. Because someone will be all "why are we colonizing space and not the ocean" and I'll be all "come and have a look at this neat lighthouse..."

  • Gvzbgul Registered User regular
    What about... space fighter planes? That then can be "re-purposed" into space science vehicles.

  • Lanz ...Za? Registered User regular
    edited July 2015
    Rhan9 wrote: »
    Mvrck wrote: »
    Foomy wrote: »
    even if we discovered something that we wanted to go "HOLY SHIT PROBES NOW" I just don't see how we could ever get something to another solar system in a reasonable amount of time with our current level of technology.

    Stupid space is just too big, need to throw it all in the dryer a few times to shrink it.

    It would be nice to have a legitimate reason to throw all of the money at NASA for that though. Something concrete that would be hard for politicians to say no to.

    My platform for running for the world president includes dumping a shitload of money into the sciences, space exploration, education and research. It's a tried and true 4X tactic.

    Because so much money is used for fucking puerile things (like pointless fighter planes that'll likely never see use), while many scientific fields are essentially starving.

    Also ocean colonies. Because someone will be all "why are we colonizing space and not the ocean" and I'll be all "come and have a look at this neat lighthouse..."

    Because Pod Six are dicks

    Friggin' Pod Six...

  • Jephery Registered User regular
    edited July 2015
    Whatever planet is revealed today will be featured in so much sci-fi that it will become a cliche within a year.

    http://www.nasa.gov/news/media/newsaudio/index.html

    "Orkses never lose a battle. If we win we win, if we die we die fightin so it don't count. If we runs for it we don't die neither, cos we can come back for annuver go, see!".
  • electricitylikesme Registered User regular
    Jephery wrote: »
    Whatever planet is revealed will be featured in so much sci-fi that it will become a cliche within a year.

    http://www.nasa.gov/news/media/newsaudio/index.html

    As someone pointed out, if they're looking at us, they are currently looking at the Vikings pillaging England.

  • Tofystedeth Registered User regular
    Phyphor wrote: »
    redx wrote: »
    Polaritie wrote: »
    Minor pedantry - "uncountably many" has a specific definition, and it bugs me to see it misused.

    On the topic of computers only doing addition - strictly speaking they can't even do that. All operations devolve to AND, OR, and NOT. Addition is XOR (or and not and) for the same place and AND for the carry bit (thus a general addition circuit has three inputs - A, B, and Carry). Subtraction is adding the two's complement of a signed integer value, etc.

    Edit: Those are the normal basic operations of a logic circuit, but you can make things like XOR or NOR gates if you want. Actually, NOR gates are sufficient for all operations on their own, but it gets really inefficient.

    Computers can actually do multiplication and division by 2, to an extent. A bitshift left is multiplying by two, and bitshift right is division, though things get a bit wonky when you get to overflows or bits dropping off the right side. Since the "normal" algorithm for multiplication (or division) is just repeated addition (or subtraction) for operations with large numbers this can take longer than an algorithm which uses bitshifting to decompose the number initially and take out a large chunk of the problem off the front. I believe most modern programming languages and processors already handle this.
    There's a discussion on this in this StackOverflow post.
    http://stackoverflow.com/questions/2776211/how-can-i-multiply-and-divide-using-only-bit-shifting-and-adding

    that only works for integers and has problems with loss of precision. It's not the same as "multiplication" as the word is usually used.

    Aren't most floats (sign)·n·2^x?

    Seems like that would be even easier.

    the binary representation of floats can vary a lot but here is a good writeup of the canonical 32 bit C float: https://www.cs.duke.edu/~raw/cps104/TWFNotes/floating.html

    all the bitwise operators (such as shifting) don't work on floating point formats like this.

    You don't literally shift the bits, but you can conceptually double one by adding 1 to the exponent, so the same operations can apply

    ooh is that what he meant?

    Yeah ok that makes sense.

    Bitshifts (and other bitwise operators) have a very precise meaning that doesn't apply to floats so that didn't occur.

    edit: the reason why the difference matters is because to add or subtract 1 from the exponent of the float takes at minimum several operations. You effectively have to copy out the bits to a different register to be able to add or subtract 1 then copy back. A bitshift (or other bitwise operators) are a single op in place. That's their whole point.

    I was speaking only from the perspective of dealing with integers, I just didn't specify. I almost never deal with floats, so I rarely think to mention them.

  • Daedalus Registered User regular
    Gvzbgul wrote: »
    What about... space fighter planes? That then can be "re-purposed" into space science vehicles.

    You sound like you'd be a fan of ALASA:

    [Image: DARPA ALASA concept art]

    Basically, to launch a small satellite, instead of using a big expensive rocket launching from one of a handful of launch pads, you take a much smaller and cheaper rocket, bolt it on to an F-15 that you have just lying around, and launch it from 40,000 ft when you're already moving at Mach two point someodd.

    DARPA and Boeing should be doing the first test flight either late this year or early next year.

  • Emissary42 Registered User regular
    edited July 2015
    The figures for today's briefing on Kepler's latest research, which seems to focus on Kepler-452b: the first confirmed near-Earth-sized planet in the habitable zone of a star like our own.

    [Figure 5: scale of the Kepler-452 system]

  • Taranis Registered User regular
    edited July 2015
    Have they announced the composition of 452b's atmosphere anywhere? Didn't see it in that article.

  • Emissary42 Registered User regular
    Taranis wrote: »
    Have they announced the composition of 452b's atmosphere anywhere? Didn't see it in that article.

    We're going to have to wait for the James Webb Space Telescope to be launched before that can be done, it seems.

  • Rhesus Positive GNU Terry Pratchett Registered User regular
    Jephery wrote: »
    Whatever planet is revealed will be featured in so much sci-fi that it will become a cliche within a year.

    http://www.nasa.gov/news/media/newsaudio/index.html

    As someone pointed out, if they're looking at us, they are currently looking at the Vikings pillaging England.

    And yet they haven't sent help

    Fuckers

    [Muffled sounds of gorilla violence]
  • Trace GNU Terry Pratchett; GNU Gus; GNU Carrie Fisher; GNU Adam We Registered User regular
    beep boop!

    hello earthlings!

  • Taranis Registered User regular
    Emissary42 wrote: »
    Taranis wrote: »
    Have they announced the composition of 452b's atmosphere anywhere? Didn't see it in that article.

    We're going to have to wait for the James Webb Space Telescope to be launched before that can be done, it seems.

    Was that not a thing we could already do, or was it just not possible in this particular instance? I could've sworn we've done that with exoplanets before. Maybe that was only with hot Jupiters.

  • Foomy Registered User regular
    http://stuffin.space/ look at the orbits of all the junk and satellites we have up in space.

    so so much junk.

    Steam Profile: FoomyFooms
This discussion has been closed.