The endpoint of all technology is unusable garbage.
In the far future, as the heat death of the universe approaches, and mankind fades into history, at least our legacy of AI Shitposting Bots will continue to live on.
Tynnan (seldom correct, never unsure):
it's a solution that's more for high-resolution, high-fidelity VR, so best for seated experiences like flight and racing sims than hand tracking solutions like Beat Saber
Meta put a "the Metaverse is the future" ad on a recent episode of the Vergecast and I thought it was pretty funny, considering how relentlessly they dunk on the metaverse on The Vergecast.
I did a lot of Zoom meetings during the pandemic, and recently we've started doing those meetings in person. The in-person meetings are so much better than the Zoom ones that I'm convinced I'll never prefer a virtual meeting; I just can't see it.
webguy20 (I spend too much time on the Internet):
It must be a work culture thing. Our in-person meetings here are terrible, and we really slimmed down the online meetings. Plus I can still use my computer in a Teams meeting versus just twiddling my thumbs in an in-person one.
VR meetings would make our meetings bad again, I bet. I'll pass.
Butters (A glass of some milks):
I prefer Zoom/Teams when we are being presented something. I'd much rather sit at my own desk yawning at a meeting lead's spreadsheet than in a cold conference room. If we're discussing development on an engineering project then I prefer in person so we can draw shit out on a board if need be.
I think if the meeting is actually valuable, it's better in person.
If it could have been an email, it's not gonna be good no matter what.
I agree on both points and I don't get to see my peers very often (we manage branches of the library so we're each in our own location most of the time) so even if a meeting could have been an e-mail and it sucks, the before and after time when you get to hang out with people and grab lunch and kvetch is extremely valuable.
Yeah, the culture/people stuff can't easily be replicated virtually, but that's why hybrid models are pretty solid - get together periodically to hang out, talk about work and other stuff, then disperse. In the meantime, have functional meetings virtually.
My Reverb G2 is decent enough. I still wouldn't use it for spreadsheets or whatever. But it's never been hard to read text. I think the issue is more mental than visual at this point. I prefer text on a screen as it feels like the text is stationary? But if I'm reading text in VR it feels like the text is moving?
those sorts of baseline usability concerns with VR would be the thing I'd prefer meta to work on instead of making sure zuck's Wiimoji looks normal
they're wasting unspeakable amounts of organizational horsepower on aspects of VR I'm not sure anyone wants
i have a lot of respect for Carmack as an engineer, and he is outwardly critical about Meta on many things but even he's really into that virtual meeting thing and its just... why. i dont get it
There isn't enough money in selling hardware with useful functions that you described earlier. Zuck made his billions building a network and buying up other networks that their users are addicted to so he gets an endless supply of data to sell to advertisers. Now those networks are old hat and he wants a new network that you literally fucking live in because he either thinks the old world of tangible shit is dead or he thinks he can kill it and sell you the nostalgia of it after the fact.
Meetings and one-way presentations are fine via Zoom, collaboration sessions are much better/more effective in person.
never in my life have i dealt with ups losing (honestly probably stealing) a package and then also the replacement to that package.
honestly don’t even want the new phone anymore, it’s been such an unending drama dealing with apple and ups.
apple doesn’t even want to send me another one now and they’re making me wait a week as though it will magically appear in a warehouse and be delivered even though ups said it’s gone
Maddoc (I'm Bobbin Threadbare, are you my mother?):
I once had UPS lose my TV for awhile, and Amazon sent me a replacement, and then UPS delivered both
Actually turns out the Google measurement has a set limit. So I went to the usual speed test website and uh
Goes a teensy bit faster than I thought.
We did this discussion in discord a week or so ago when I first moved and I discovered my ass had been spoiled by FiOS. Further, I'm still spoiled via gigabit cable
Fibre is the way... only my desktop caps out at 2.5 Gbps on the jack
minor incident (expert in a dying field, NJ):
the only silver lining here is Zuckerberg's greed and incompetence could make this biometric surveillance device so unpleasant and expensive that nobody will actually buy it
Dang it, my shipment has been delayed from today to Monday, and I'll be halfway across town doing interviews so I can't even come home on my lunch break. And it's fall break.
Well, I hope the horrible neighborhood children enjoy their new VR rig.
cool I was wondering when we'd get the next Google Glass idiot device that nobody uses, and there it is!
Morninglord (I'm tired of being Batman, so today I'll be Owl), edited October 2022:
Actually, if that does what I think it does, and I just looked it up and I think it does, I'll be using it, and so will many other people with disabled hands.
Or at least, the competitor that creates one that isn't fucking facebook, since as soon as someone invents it others will jump on the bandwagon.
It reads the intentions of hand movements. Literally, the tiniest gesture, completely imperceptible for others. You can basically almost pretend to move your hand, and it will work. For me, with really quite bad RSI, that sounds fucking wonderful. My problem is even the most forgiving buttons are too much, and I'm having trouble controlling both a left stick, and a right hand mouse. Both hands, fucked.
It's just that this is, you know, fucking Zuckerberg.
I am impatiently awaiting the neural headbands that just read my motor cortex so I can sit there, arms crossed, and play action games with my mind. That's a way off, but if this thing can read impulses sent to my hand, that means I can sit there making only tiny movements that are comfortable, in any hand position I want, without having to struggle with a fucking piece of awkward plastic that is shaped for and works for abled people but is totally fucked for me. I don't care if it takes literally months to learn it, I'll fucking learn it.
I want to make it clear: even if all I could use this for is press a button without actively moving my hands in any perceptible way, that would help my quality of life right now immensely. And I'm not even fully disabled, it's just RSI. I can move my hands, it just hurts to do it a lot. Imagine what this would be like for people with genuine limited mobility. This is a game changer.
This tech is very cool. The crime is that Zuckerberg is the one who will control it. Not the tech itself.
A guy who wrote his PhD at the same time I did was looking into robotic prosthetics. (Caveat that this was 10 years ago; things can have changed since.)
He was looking into the use case of people who've lost their hand but not their arm. The idea was to use EMG (reading the electrical signals of the muscles) in the lower arm muscles; the hand is largely controlled by those muscles. (This is non-invasive; the electrodes are placed on the skin.)
Anyways, after four years of research he could discern the following motions and get the prosthetic to do them at least 90% of the time: Open/close fist ("grip"), open/close thumb and index finger ("pinch"), and rotate wrist ("pronate"), all while the arm was horizontal and while the arm was at an angle.
Previous state of the art was doing two of those things, but only with the arm horizontal. Most actual amputees discard their fancy prosthesis for purely cosmetic ones, claws, or ones that can grip actuated by a harness attached to the chest or buttons.
The fancy prosthetics with independently controlled fingers etc that you see of videos online are a) too fragile for everyday use and b) uncontrollable by real humans.
Anyways, long story short, I'll believe the neural wristband thing when I see it.
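[For readers curious what the EMG pipeline described above actually involves, here is a minimal sketch, not the research code: sliding-window RMS features per electrode channel, fed to a nearest-centroid classifier. The function names, window sizes, and synthetic two-channel data are all illustrative.]

```python
import numpy as np

def rms_features(emg, window=200, step=100):
    """Sliding-window RMS per channel; emg has shape (samples, channels)."""
    feats = []
    for start in range(0, len(emg) - window + 1, step):
        seg = emg[start:start + window]
        feats.append(np.sqrt(np.mean(seg ** 2, axis=0)))
    return np.array(feats)

def classify(feat, centroids):
    """Return the index of the nearest gesture centroid."""
    dists = np.linalg.norm(centroids - feat, axis=1)
    return int(np.argmin(dists))
```

In a real system the hard part is everything this sketch omits: electrode placement, per-user training, and signal drift, which is exactly what the 90%-in-the-lab caveat is about.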
That's nice but they've already got proof of concept of neural headbands controlling devices, and that was about five years ago. And literally all I need is one of those modes, a modified form of pinch, and I could suddenly do a ton more stuff pain free. I'd buy a 2000 dollar device that could only press one button with my mind in a heartbeat. Just one! Doesn't even have to actually physically press a button. It just has to tell the computer "hey send the left click impulse" with me barely moving my hand.
So, ten years ago, they were doing what I wanted. It's been a decade since, and this thing is projecting in 6 years time.
It's also not claiming to be a full prosthetic hand; the use case is replacing basic touch interfaces (touch and swipe) while you're walking around using your glasses, but I want to use it to replace left click on a normal computer, pain free. I've already got an eye tracker I'd use for the mouse control, so I wouldn't even need a mouse. The eye tracker works great, but it can't interact once the cursor is on the target without awkward stuff like using your voice (which tires your voice) or blinking (which, again, tires you after a while). I currently have to set up an awkward system with accessible buttons, and even that can still hurt my hands, isn't very elegant or useful, is quite slow, isn't properly supported by Windows, and still isn't fully ergonomic.
This is not as much of a joke as you're claiming, at all. You've basically just confirmed it's possible by saying they can already do what I want with something vastly more complicated than this. Thanks for the good news, I guess?
Also to be clear this is not a full blown prosthetic and any discussion that brings up full blown prosthetics is irrelevant to this device. It's a non physical way to left and right click, touch interact, and swipe. That sort of very, very basic stuff that is hard to do without a bulky physical accessory (a controller, or a phone), which they want people to be able to do with their hands in any position, naturally, just by gently twitching your fingers.
This is not a prosthetic, although it might be able to help someone without a hand use a computer. Which would be nice. But any discussion of how far away it is to have a full on robot limb is starting a different discussion about a different piece of tech.
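[The single-gesture "just send a left click" idea described above boils down to a debounced threshold detector on a smoothed muscle-activation signal. A hypothetical sketch: in practice the envelope would come from the wristband's SDK and the click would go through something like pynput or the OS accessibility APIs, none of which is shown here, and the thresholds are made up.]

```python
class ClickTrigger:
    """Fire a callback once per muscle-activation burst.

    `envelope` is assumed to be a smoothed, rectified EMG amplitude
    streamed in one sample at a time. Hysteresis (two thresholds)
    prevents a single burst from producing repeated clicks.
    """
    def __init__(self, on_click, on_threshold=0.6, off_threshold=0.3):
        self.on_click = on_click
        self.on_threshold = on_threshold    # level that counts as "pressed"
        self.off_threshold = off_threshold  # must drop below this to re-arm
        self.armed = True

    def feed(self, envelope):
        if self.armed and envelope >= self.on_threshold:
            self.armed = False              # one click per burst
            self.on_click()
        elif not self.armed and envelope <= self.off_threshold:
            self.armed = True
```

Paired with an eye tracker doing the pointing, that one callback is the whole accessibility win being described.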
Don't misunderstand me; I'd be super happy if someone gets the tech working properly.
My point was reading muscle activity with electrodes (or other non-invasive means) is imprecise and challenging.
You can have it working perfectly one day and not at all the next. E.g., it's hotter, so you sweat more, and the electrodes pick up stronger signals due to the increased conductivity of sweaty skin. Or you want to have your arm down instead of horizontal or angled up. Or you haven't calibrated it in a month. Or a thousand other seemingly inconsequential things.
What my colleague told me is that there were hundreds of fancy prosthetics that were working perfectly in the lab and discarded after a day in the wild.
And while, yes, this use case is simpler than a full prosthetic, it's still very hard. The 90% success rate was in ideal conditions. How well do you think it will function in the wild, and without constant re-calibration?
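[The day-to-day drift described above (sweat, posture, electrode placement) is why these systems lean so hard on recalibration. One common mitigation, sketched here with made-up numbers, is to re-estimate a per-session baseline from short "rest" and "max effort" recordings and rescale everything against it:]

```python
import numpy as np

def session_normalize(features, rest, max_effort):
    """Rescale raw EMG features into [0, 1] against today's baseline.

    `rest` and `max_effort` are short calibration recordings taken at
    the start of the session (shapes: (samples, channels)). Values and
    the clipping range are illustrative, not from any real device.
    """
    lo = rest.mean(axis=0)
    hi = max_effort.mean(axis=0)
    span = np.where(hi - lo > 0, hi - lo, 1.0)  # guard against divide-by-zero
    return np.clip((features - lo) / span, 0.0, 1.0)
```

This is cheap and helps with gain drift, but it can't fix the harder problems (posture changes mid-session, electrodes shifting), which is why lab accuracy numbers rarely survive contact with the wild.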
Insufficient Pixar. Need to give him Pixar Mom Ass too.
Still looks more human than his skinsuit.
But somehow worse than his previous digital avatar.
i was a pretty early adopter... I had the stupid headset where you had to put your *samsung galaxy phone* into a plastic holder...
i still think that a high quality safe headset would be preferable to my hulking 27 inch monitors for most purposes, including work
maybe one day!
https://www.hp.com/us-en/shop/pdp/hp-reverb-g2-virtual-reality-headset
the HP Reverb G2 headset is still on a pretty big sale right now, $350 for the full kit
I'll try not to forget you meatsacks when I'm jamming with the console cowboys in cyberspace.
I miss those sassy blobs so fucking much
@Naphtali I discovered blob stickers a few minutes ago, thanks to this article
https://www.androidauthority.com/blob-emoji-how-to-3092425/
Welcome to several months ago, has anyone messed with Gboard emoji kitchen?
https://9to5google.com/2022/03/31/how-to-use-gboards-emoji-kitchen-on-android/
yeah they patched out division it was too strong
eat the dogfood
I had that happen once with a lamp, but the original showed up 8 months later. Oh, the stories that lamp probably holds…
In secret they just wanted to make a dick joke by taking the D away from PEMDAS
https://uploadvr.com/zuckerberg-wristband-keyboard/
christ
I can't wait for the porn applications.
If there's one thing that recent history teaches us, it's that sizable technological problems can be solved with time and scads of money.