Hewlett-Packard just announced that they've created the first working memristor, a new fundamental electronic component. Its resistance goes down when charge is driven through it one way and goes up when charge flows the other way, and it remembers that state without power.
This is obviously huge for digital computing. The first story seems to be that it will allow computers that boot up instantly, and eventually superior memory storage that replaces both RAM and the hard disk. I wouldn't be at all surprised if a great deal more than that came out of this, in analog electronics as well as digital.
Particularly cool note: the memristor was theorized to exist in 1971 by electrical engineer Leon Chua, who derived it purely from the mathematics of circuit theory.
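For anyone who wants the one-equation version (this is just my summary of Chua's textbook definition, not anything specific to HP's device):
[code]
% Memristor: the element that ties flux linkage to charge (Chua, 1971)
\mathrm{d}\varphi = M(q)\,\mathrm{d}q
% Divide through by dt, using v = d\varphi/dt and i = dq/dt:
v(t) = M\big(q(t)\big)\,i(t)
% So it acts like a resistor whose value M(q) depends on how much charge
% has already flowed through it -- and since q stops changing when i = 0,
% the state survives with the power off.
[/code]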
Now is a good time to be an inventor in the field of electronics, methinks. Anybody have any world-changing ideas yet?
[kurzweil]Does this mean the advent of the Singularity?[/kurzweil]
Okay, I'm kinda keying into how important this is, but am still a layman. Break this down for me: why is this so important?
Edit: Besides the insta-booting computers I mean. I'm just not really clear on what the technology is.
Well, I just finished my Electronics for Scientists II final today, but I'm certainly no expert. It's kinda tough if you don't know much about electronic circuits, or computing.
Here's an example- do you know how computer memory works right now? In most cases, there's a hard disk (magnetic ones and zeros, lots of space, slow to write/read) and RAM (random access memory; it's what the computer actually uses for what it's currently doing, but as soon as the power turns off, it loses everything).
Memristors can be used to store data within a circuit (so it can be used for what the computer is actually doing) without power.
I feel like a bit of a fraud here, as there's not a whole lot I know that isn't in the articles in the OP, and there are definitely people here who know more than I do. But hopefully it helps a little.
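If a toy model helps, here's a quick numerical sketch (my own illustration using the linear dopant-drift model that usually gets quoted for HP's TiO2 device; the numbers are made up, and it's not code from any of the articles). The point is just that the resistance is set by how much charge has flowed through the device, and it stays put once the current stops:
[code]
# Toy memristor: linear dopant-drift model (illustrative values only).
# The state x = w/D in [0, 1] is how far the doped region extends into the
# film; the resistance interpolates between R_ON (fully doped) and R_OFF.

R_ON, R_OFF = 100.0, 16000.0    # ohms -- made-up but plausible-looking
MU, D = 1e-14, 10e-9            # dopant mobility (m^2/V s), film thickness (m)

def simulate(currents, dt=1e-6, x=0.5):
    """Step the state x for a list of current samples (amps); return final x."""
    for i in currents:
        # dx/dt = (MU * R_ON / D^2) * i(t)  -- charge moves the dopant front
        x += MU * R_ON / D**2 * i * dt
        x = min(max(x, 0.0), 1.0)           # the front can't leave the film
    return x

def resistance(x):
    return R_ON * x + R_OFF * (1.0 - x)

x = simulate([1e-4] * 1000)        # drive current one way: resistance drops
r_written = resistance(x)
x = simulate([0.0] * 100000, x=x)  # "power off": zero current, nothing moves
assert resistance(x) == r_written  # the stored value is still there
[/code]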
It's not doing anything new, just the usual but smaller, cheaper, quieter, and more reliable, and I think faster.
--
Wait. That article suggests that it does something similar to my old "wavelength" idea because binary recording is lame. That would be nice. Also I expect royalties.
Yeah, definitely smaller- it's at about the same size-per-bit as a hard disk now, with each one at 15 nanometers across or so. And they think they can get them down to 4nm or less.
Well, apparently these memristors can do something computers currently cannot: store information in a non-binary format. Instead of just 1 and 0, they can hold 1, 0, and basically everything in between.
That's an interesting aspect- if it's important early on, though, it would represent a very fundamental change in the way computing currently works.
Said simply, we use binary because it's easier. Analog electronics are fidgety- they don't always work quite right, and they're a lot harder to get working "perfectly"- digital electronics you pretty much just have to turn on.
Here's an analogy my professor used: Imagine you and a friend are trying to send information to each other down a long hallway, with lamps. You've each got a lamp with a variable aperture, with ten settings of brightness. You can use that to send numbers- 9 is brightest, 1 is dimmest, 0 is dark.
Now, in perfect conditions, you might be able to use this system to send, say, your phone number. But what if the hallway's filled with fog, or he's really far away? It's going to be very difficult to tell the difference between, say, a 7 and a 6. Mistakes are going to be made.
Now say you decided to just use the brightest lamp setting and "off" to send information. You have to flash a lot more times for the same amount of information, but it's a lot harder to confuse. This way, you can actually get stuff done.
(by the way, my professor is the guy who invented those security tags that beep when you pass through a gate)
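A quick way to see the tradeoff in that analogy (a throwaway simulation I made up, nothing from the articles): send random brightness levels down the foggy hallway with ten settings versus two, and count how often the receiver guesses wrong.
[code]
import random

def error_rate(levels, noise=0.08, trials=10000):
    """Fraction of flashes decoded to the wrong brightness level."""
    errors = 0
    for _ in range(trials):
        sent = random.randrange(levels)
        brightness = sent / (levels - 1)                 # spread levels over 0.0 .. 1.0
        received = brightness + random.gauss(0, noise)   # the "fog" in the hallway
        decoded = min(range(levels),
                      key=lambda s: abs(s / (levels - 1) - received))
        errors += decoded != sent
    return errors / trials

# Ten brightness settings pack more information into each flash, but the fog
# causes constant mistakes; plain on/off flashing is slower but nearly error-free.
print("10-level error rate:", error_rate(levels=10))
print(" 2-level error rate:", error_rate(levels=2))
[/code]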
If they can eventually shrink these memristors to a scale similar to that of neurons, then maybe the variability they have can be put to real use. Hell, it's possible that this is the secret to real artificial intelligence, so I'm not going to make any predictions.
So can I finally run :insert PC game with crazy requirements: in 5 years?
I'm confused by how you can call these things an additional "fundamental" circuit element. I thought the capacitor, resistor and inductor were fundamental because if you take a bunch of wires you end up with some version of these three. Where the fuck does a memristor figure into this?
(this in no way implies I do not find this idea extremely cool, in fact I am highly excited by the possibilities here).
EDIT: I don't think these are a revolution in making computers work "analog," though they certainly give us some interesting new ways to go analog-to-digital.
As I understand it, it's the fourth passive circuit element... ever. It does something, just with current applied, that has never been done before.
[kurzweil]Does this mean the advent of the Singularity?[/kurzweil]
I forget who said it, but it's true:
Here's the big problem with all this singularity horseshit: Hardware may keep getting better and better, but software has been stuck at "fucking awful" for decades, and it ain't getting better anytime soon.
It's gotten worse. As things have gotten more complex, flaws have become much more obvious and are often fatal. We've barely scratched the surface of what we can do with our current tools, either out of expediency, laziness, or ignorance.
This is an interesting effect. I'm not surprised that it hasn't been observed until now due to the nano-scale requirements.
I can definitely see its use in memory applications, but like DRAM, it would need to be refreshed after every read. Unlike DRAM, however, it wouldn't need constant refreshes or power to maintain its state.
It's not going to bring about an analog revolution for "general purpose" computers, though. Mostly due to the fact that you can't get rid of the instabilities/noise in any system. So it is tough to get guaranteed, repeatable results if you have a long, multi-element analog path. Digital avoids this by limiting the possible choices to just two. Analog is great, but digital is easier.
So I'm still not seeing all the revolutionary stuff outside of computers that boot up really fast.
Since capacitors relate voltage to charge, and we want to relate charge to magnetic flux, instead of memristor, couldn't we just call this component a Flux Capacitor?
WHY HASN"T ANYONE ADDRESSED THIS IMPORTANT ISSUE!?
In the end, is this a quantitative or qualitative thing? Or is it too early to say?
This... this has pretty much been the entire history of computing.
Well, yes. But nobody gets terribly excited when Intel busts out their new Pentium 5 that tops out at 4GHz instead of 3.5GHz and contains 10M transistors instead of 9M. Whereas this seems to be generating some hullaballoo.
Basically, I'm trying to find a reason to get all giddy. I mean, it's conceptually cool, but I'm trying to connect the discovery to something really awesome, rather than being the 2,354th claim that we're just around the corner from advanced AI and neural networks.
Yeah I think killing the hard drive is a pretty big innovation. That's going to have way more of an impact than coming up with a 20% faster CPU. Also it lowers power consumption and computer noise. Plus they say they can make them even smaller. This means huge storage on your cell phone.
The thing that makes this the most exciting to me is that it's possible to create circuits where the setting of one memristor predicates the setting of a number of other elements. In effect, this is the necessary invention leading to solid-state data compression.
Shouldn't there be something that satisfies F^x ∝ dV/dΦ, and something that satisfies F^x ∝ dI/dq, where x = 1 or -1? It seems like the four main elements cover four of the pairwise combinations of charge, flux, current, and voltage, so why isn't there something for the other two?
Yeah, I'm pretty sure there are 2 more. Give me a while to figure out what they should be....
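For what it's worth, here's how the bookkeeping looks to me (standard circuit-theory relations, nothing from the articles). The two "missing" pairings aren't extra elements; they're just the definitions of current and voltage:
[code]
% Four variables: charge q, current i, voltage v, flux linkage \varphi.
% Four of the six pairings are the passive elements:
v = R\,i                               % resistor
q = C\,v                               % capacitor
\varphi = L\,i                         % inductor
\mathrm{d}\varphi = M(q)\,\mathrm{d}q  % memristor
% The remaining two pairings are definitions, not devices:
i = \mathrm{d}q/\mathrm{d}t, \qquad v = \mathrm{d}\varphi/\mathrm{d}t
[/code]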
That's pretty much the best instance of wikipedia vandalism I've seen.
:shock:
This is a huge discovery in electronics; I'm absolutely blown away by it.
Even the guy who theorized this component's existence in the seventies is amazed that HP has actually gone and built one. It's incredible, and could have repercussions bigger than the atom bomb or genetics. Seriously. :shock:
So I'm still not seeing all the revolutionary stuff outside of computers that boot up really fast.
It's revolutionary if you're an electrical engineer; flip-flops and a lot of gates will probably be redesigned much more efficiently. No, it won't make your computer fly or bend space. I'm not sure what sort of revolution you imagine you might see as a consumer other than better data processing. Digital hardware is designed to realize a Turing machine. It can do it better, but it's still a Turing machine. Even if quantum computers were introduced you would still only see a faster, more powerful computer performing the same tasks.
Heard about this a week or two ago. It'll be nice in 2020. :P
We're going to need more atoms.
Ooh, new links:
http://en.wikipedia.org/wiki/Memristor
http://www.eetimes.com/news/latest/showArticle.jhtml?articleID=207403521
The second one talks about how part of the reason these took so long to make is that people thought the fundamental relationship in electronics was between voltage and charge, and now they're saying it's actually between flux (the time integral of voltage) and charge.
I forget who said it, but it's true:
Here's the big problem with all this singularity horseshit: Hardware may keep getting better and better, but software has been stuck at "fucking awful" for decades, and it ain't getting better anytime soon.
Truest truth ever.
Can't be. The math actually works.
Perhaps the return of the video game cart. :rotate:
Massive storage available at RAM speeds sounds pretty cool.
It's revolutionary if you're an electrical engineer; flip-flops and a lot of gates will probably be redesigned much more efficiently. No, it won't make your computer fly or bend space. I'm not sure what sort of revolution you imagine you might see as a consumer other than better data processing. Digital hardware is designed to realize a Turing machine. It can do it better, but it's still a Turing machine. Even if quantum computers were introduced you would still only see a faster, more powerful computer performing the same tasks.
Care to explain?
Don't praise the machine!