In what context? Like, you could put them on during video chat or what? I don't want to watch a 20-minute video of some tech executive upselling them, which is why I'm asking.
Fun facial overlay for video chat.
Snapchat has had a similar feature for a while.
It's inherently useless but people enjoy it.
As others have mentioned, it's a good proof of concept for facial recognition technology.
Every person who doesn't like an acquired taste seems to think everyone who likes it is faking it. It should be an official fallacy.
it's keypoint identification on faces and mapped onto an animal face?
oh yeah well my current army program can put googly eyes over your eyes in realtime so basically i'm an iphone too
It tracks 30+ facial features, including gaze estimation, at 60 fps on an embedded system, and it’s very stable. Using it to animate a 3D character is a fun demo, but I'm sure in the next few years we'll see games and uses beyond making a cartoon unicorn sing 2 Live Crew.
oh nice
my system does 4 fps lol, and also it only has 5 face keypoints, cause I'm doing pose estimation not face
Right now, I would use this Apple system (along with a neural network analyzing facial expressions, of course) for pain estimation on patients in an automated medical care environment (ahahaha, how are we going to get the training data for that...)
well maybe someone some day will make a nice open source version to steal
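For the curious, the googly-eyes-over-your-eyes trick described above boils down to fitting a transform from detected face keypoints to matching anchor points on an overlay sprite. Here's a minimal sketch in NumPy; the keypoint coordinates are invented for illustration, and a real system (ARKit, dlib, etc.) would supply far more points per frame:

```python
import numpy as np

def fit_affine(src, dst):
    """Least-squares affine transform mapping src keypoints to dst keypoints.

    src, dst: (N, 2) arrays of (x, y) coordinates, N >= 3 non-collinear points.
    Returns a 2x3 matrix A such that dst ~= A @ [x, y, 1].
    """
    n = src.shape[0]
    # Design matrix of homogeneous coordinates [x, y, 1]
    X = np.hstack([src, np.ones((n, 1))])
    # Solve X @ A.T ~= dst in the least-squares sense
    A, *_ = np.linalg.lstsq(X, dst, rcond=None)
    return A.T

def warp_points(A, pts):
    """Apply the 2x3 affine A to (N, 2) points."""
    X = np.hstack([pts, np.ones((pts.shape[0], 1))])
    return X @ A.T

# Hypothetical detected keypoints: left eye, right eye, nose tip in camera frame
face = np.array([[100.0, 120.0], [160.0, 118.0], [130.0, 160.0]])
# Matching anchor points on the overlay sprite (where the googly eyes go)
sprite = np.array([[20.0, 30.0], [80.0, 30.0], [50.0, 70.0]])

A = fit_affine(sprite, face)   # maps sprite space -> camera space
mapped = warp_points(A, sprite)
```

With only three point pairs the affine fit is exact; with 30+ tracked features the same least-squares solve averages out per-point jitter, which is part of why more keypoints gives you a more stable overlay.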
credeiki on Steam, LoL: credeiki
Mojo_Jojo | We are only now beginning to understand the full power and ramifications of sexual intercourse | Registered User, regular
Maybe scotch will help
Homogeneous distribution of your varieties of amuse-gueule
BeNarwhal | The Work Left Unfinished | Registered User, regular
Hey [chat] friends, Rocket Update
Zuma has now moved to the 7th due to more favorable weather conditions on that date
Same launch time: 5pm Pacific, 8pm Eastern, 0100 on the 8th UTC
All the football should be over by then, and the captive lovely person in my bed should be freed by then, so I should be around to provide launch coverage as the first stage makes a landing attempt back at Landing Zone 1!
It's a DoD launch so they won't track the 2nd stage on stream, which means we'll get uninterrupted footage of the first stage's landing efforts
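Those three quoted times do line up. As a sanity check, assuming this is the January 7, 2018 Zuma window, the conversion from the quoted 5pm Pacific works out like so with Python's zoneinfo:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Launch time as quoted: 5pm Pacific on January 7 (year assumed to be 2018)
launch_pacific = datetime(2018, 1, 7, 17, 0, tzinfo=ZoneInfo("America/Los_Angeles"))

# Convert to the other two quoted zones
launch_eastern = launch_pacific.astimezone(ZoneInfo("America/New_York"))  # 8pm Eastern
launch_utc = launch_pacific.astimezone(ZoneInfo("UTC"))                   # 0100 on the 8th UTC
```

In January both US zones are on standard time (UTC-8 and UTC-5), so Pacific + 3h = Eastern and Pacific + 8h = UTC, rolling over to the next calendar day.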
Atomika | Live fast and get fucked or whatever | Registered User, regular
3 hours
Misgendered by every staff member
Didn’t actually accomplish anything
May be late to seeing friend
Mojo_Jojo | We are only now beginning to understand the full power and ramifications of sexual intercourse | Registered User, regular
Good idea on the pain thing.
Pain measurement is absolute bullshit
in the far forward casualty care environment of The Future, we'd like to replace the human medic with automated systems serving as experts
not just vitals, but everything that a human might notice and respond to. Pain management is certainly something where doctors use not only vital signs but visual and auditory input to figure out what's going on.
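As a toy illustration of that "vitals plus everything a medic would notice" idea, here's a sketch that blends one vital sign with facial-expression evidence. The feature names and weights are entirely invented for illustration; a real system would learn them from labeled clinical data (which is exactly the training-data problem joked about above):

```python
def pain_score(heart_rate_bpm, brow_lower, eye_tighten, grimace):
    """Toy pain-score heuristic combining one vital sign with three
    facial-action-unit activations in [0, 1].

    The 0.4/0.6 blend and the feature set are made up for illustration,
    not taken from any real clinical model.
    """
    # Normalize heart rate roughly into [0, 1] over a 60-180 bpm range
    hr = min(max((heart_rate_bpm - 60) / 120, 0.0), 1.0)
    # Average the facial evidence
    facial = (brow_lower + eye_tighten + grimace) / 3
    # Blend vitals and facial evidence, the way a medic weighs both
    return 0.4 * hr + 0.6 * facial
```

The point of the sketch is the structure, not the numbers: the interesting engineering is in getting reliable facial-feature activations in the first place, which is where a tracker like the one discussed above comes in.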
I used to know a pain researcher who took great pleasure in telling me just how... enthusiastic some of the volunteers for her studies were. Advertise it, and I'm sure they will come.
I'm terrible at recognizing pain in others. When they act the way I would, I recognize it, but otherwise my mind jumps straight to assuming they're faking or exaggerating.
Element Brian | Peanut Butter Shill | Registered User, regular
I suspect this is 99% of human judgement about the feelings of others in general.
But that could just be because I'm autistic, and I suspect non-autistic people are themselves pretty often bad at this, projecting how they'd feel in a specific situation onto others even when those others feel differently.
I don't know how to interact with IRBs
It's a huge problem in my contract because we just want some goddamn 3D models and accompanying 2D images of people, and we can't just use ourselves, so we're going to do simulated data or something. If we get a Phase II I'm definitely going to look into how to actually get permission for human-subject research. Usually agencies don't want you to do any in a Phase I.
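One common way to get that kind of simulated data without human subjects is to render known 3D geometry into 2D images with a pinhole camera model, so every 2D pixel comes with perfect 3D ground truth. A minimal sketch (the geometry and camera parameters here are arbitrary stand-ins, not from any actual pipeline):

```python
import numpy as np

def project_points(points_3d, focal, cx, cy):
    """Project (N, 3) camera-frame points to pixel coordinates with a
    pinhole model. Assumes all points have positive depth (z > 0).

    focal: focal length in pixels; (cx, cy): principal point.
    """
    pts = np.asarray(points_3d, dtype=float)
    # Perspective divide by depth, then shift to the principal point
    x = focal * pts[:, 0] / pts[:, 2] + cx
    y = focal * pts[:, 1] / pts[:, 2] + cy
    return np.stack([x, y], axis=1)

# A made-up "model": a few corner points of a unit cube, two meters out
cube = np.array([[0.0, 0.0, 2.0],
                 [1.0, 0.0, 2.0],
                 [0.0, 1.0, 2.0]])
pixels = project_points(cube, focal=500.0, cx=320.0, cy=240.0)
```

Real synthetic-data pipelines add textured meshes, lighting, and pose variation on top, but the 3D-point-to-2D-label correspondence is exactly this projection.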
I saw "evangelion" and "silicon" and hurriedly closed it just in case.
Too late. You are already an animu.
Nova_C | I have the need, the need for speed | Registered User, regular
I was troubleshooting an e-mail problem a customer had and I wanted to do some testing on my phone. The Pixel 2 has Gmail built in, but no standalone e-mail app, so I downloaded Outlook for Android.
OH MY GOD it is useless for troubleshooting.
I was able to set up the account automatically, but I couldn't access ANY server settings for testing. Or even to make sure it's using the right servers.
If I tried to set up the account manually, it offered no options for port or security settings, so the outgoing settings couldn't complete. To make matters worse, every time the setup failed, it cleared ALL the information so I had to put incoming and outgoing information in every time before I realized it was never going to work.
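For what it's worth, the conventional port/security pairings the app refuses to expose can at least be sanity-checked by hand before blaming the server. A small sketch; this helper and its rules are illustrative conventions only, not an exhaustive or authoritative validator:

```python
def check_smtp_settings(host, port, security):
    """Sanity-check common outgoing-mail settings against conventions.

    security: one of "ssl" (implicit TLS), "starttls", or "none".
    Returns a list of warnings; an empty list means the combination
    is a conventional one. Individual servers can still differ.
    """
    warnings = []
    if port == 465 and security != "ssl":
        warnings.append("port 465 conventionally expects implicit SSL")
    if port == 587 and security != "starttls":
        warnings.append("port 587 conventionally expects STARTTLS")
    if port == 25 and security == "ssl":
        warnings.append("port 25 is plain SMTP relay; implicit SSL is unusual")
    if not host:
        warnings.append("no outgoing server hostname set")
    return warnings
```

A client that silently picks one of these combinations and wipes the form on failure, instead of letting you choose, is exactly why it's useless for troubleshooting.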
Doesn't seem all that different, especially since it's a fun novelty anyway and not a substantial feature. Oh, and I can use Facerig on lots of non-Apple stuff. Of course they're used for different things, but it isn't like Apple's tech is some groundbreaking amazing new face-tracking shit. It's good, but lots of other people are doing similar stuff. Snapchat has tons of filters, and getting the average user to explain the difference between those and this would probably be pretty difficult!
Edit: Actually it doesn't look any different than the rooster on the last page. And the cat has way more detail, too!
Well it isn’t down here under your bed, I can tell you that much
Scheck has plenty of power already let's not give him more
Maybe the butt bounces based on gyroscope info as you wave the phone around
the "no true scotch man" fallacy.
https://www.youtube.com/watch?v=xV-bs5r1qpg
what else is missing from my life aside from purpose
If I could, I would fly down there right now and projectile barf on those jerks! >:(
Buying Narwhal a pizza, probably
:bro:
Delicious, delicious garbage?
They didn't put it out on the Switch for some reason. It's dumb.
i'm not even fully clear on what this is
I have questions.
Ice cube tray!
Arch,
https://www.youtube.com/watch?v=t_goGR39m2k
The issue is that this is nowhere near as good.
Being really good would kind of ruin it.
Android integration for Exchange blows.
b-baka