An Interactive Documentary About Big Tech, AI, and You
October 24, 2019 6:28 AM
Meet the new A.I. that knows you better than you know yourself: https://stealingurfeelin.gs
"An augmented reality film revealing how the most popular apps can use facial emotion recognition technology to make decisions about your life, promote inequalities, and even destabilize democracy makes its worldwide debut on the web [last month]. Using the same AI technology described in corporate patents, 'Stealing Ur Feelings,' by Noah Levenson, learns the viewers’ deepest secrets just by analyzing their faces as they watch the film in real-time."
"The documentary was made possible through a $50,000 Creative Media Award from Mozilla. The Creative Media Awards reflect Mozilla’s commitment to partner with artists to engage the public in exploring and understanding complex technical issues, such as the potential pitfalls of AI in dating apps (Monster Match) and the hiring process (Survival of the Best Fit)."
"An augmented reality film revealing how the most popular apps can use facial emotion recognition technology to make decisions about your life, promote inequalities, and even destabilize democracy makes its worldwide debut on the web [last month]. Using the same AI technology described in corporate patents, 'Stealing Ur Feelings,' by Noah Levenson, learns the viewers’ deepest secrets just by analyzing their faces as they watch the film in real-time."
"The documentary was made possible through a $50,000 Creative Media Award from Mozilla. The Creative Media Awards reflect Mozilla’s commitment to partner with artists to engage the public in exploring and understanding complex technical issues, such as the potential pitfalls of AI in dating apps (Monster Match) and the hiring process (Survival of the Best Fit)."
The scary thing is, they thought I didn't like dogs! It's one thing to be measured, but wow it feels weird to be measured wrong.
posted by mittens at 6:44 AM on October 24, 2019 [1 favorite]
the new A.I. that knows you better than you know yourself
No it doesn't.
learns the viewers’ deepest secrets just by analyzing their faces
No it doesn't.
posted by escabeche at 7:13 AM on October 24, 2019 [3 favorites]
I started this. It took forever for me to position my face properly in the frame (my laptop is on my desk). When I finally got it started, it told me that I like dogs. I don't particularly like dogs.
Does it tell everybody that they like dogs? Am I missing a joke or something?
posted by tallmiddleagedgeek at 7:24 AM on October 24, 2019
learns the viewers’ deepest secrets just by analyzing their faces
No it doesn't.
Does it tell everybody that they like dogs? Am I missing a joke or something?
I'll hop in for a minute to say that I think the broader point here is two-fold:
One, the privacy implications of (say) Snapchat watching your face while you watch stories and then doing things with what it thinks it sees on your face.
Two, you can beat the rap but you can't beat the ride? If your bank is scanning your face every time you cash a check on your phone or whenever you talk to a teller, and cancels your credit card because they think you hate dogs, and "sorry, the AI has spoken, I don't decide this" becomes the new "it's just our policy, I don't decide this", well then it doesn't matter if you like dogs or if the AI got your secret wrong. What matters is who is making decisions with that (incorrect) data. For some further reading on related concepts, check out ProPublica's article on machine-learning bias in criminal sentencing.
posted by Nonsteroidal Anti-Inflammatory Drug at 7:37 AM on October 24, 2019 [8 favorites]
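To make the shape of that worry concrete, a purely hypothetical sketch: the bank, the threshold, and the `dogAffinity` feature below are all invented, but the point is that the rule fires whether or not the inference behind it was right about you.

```typescript
// Hypothetical decision rule consuming an emotion score nobody can audit or appeal.
interface ViewerProfile {
  dogAffinity: number; // e.g. averaged "happy" score while dog content was on screen
}

function reviewCreditCard(profile: ViewerProfile): 'keep' | 'cancel' {
  // "Sorry, the AI has spoken" -- the outcome is the same whether the
  // underlying inference was correct or not.
  return profile.dogAffinity < 0.3 ? 'cancel' : 'keep';
}
```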
Does it tell everybody that they like dogs? Am I missing a joke or something?
I haven't checked it out, but perhaps the gag is giving old-fashioned cold reading techniques a satirical patina of technological precision? I mean, everyone knows that people have biases, but how could a computer possibly have anything but perfect objectivity?
That's nonsense, as many posts on the Blue have covered at length, but there's a lot of money to be made from – and some spectacular injustices to be perpetuated by – the credulousness of the wealthy, the powerful, and the public as a whole.
posted by lumensimus at 7:43 AM on October 24, 2019
No no, you can't arrest me, it was just bad lighting...
posted by sammyo at 7:43 AM on October 24, 2019 [1 favorite]
What matters is who is making decisions with that (incorrect) data.
Starting from the assumption that all these systems are going to end up working much the same way, there is possibly quite a lot of value to be had in practising with this thing until you can reliably game it to get whatever reading you want out of it. It's nice to have totally local tools that make that kind of practice possible.
posted by flabdablet at 9:20 AM on October 24, 2019 [1 favorite]
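A local practice tool of that kind could be as small as a loop that echoes the classifier's live reading back at you while you rehearse an expression. The sketch below again assumes face-api.js as the stack (an assumption, not what the film ships); the `practice` helper and model path are invented for illustration, and everything stays on your machine.

```typescript
// Hypothetical practice loop: show how strongly the model reads the
// expression you are trying to produce, so you can learn to game it.
import * as faceapi from 'face-api.js';

async function practice(videoEl: HTMLVideoElement, target: string): Promise<void> {
  await faceapi.nets.tinyFaceDetector.loadFromUri('/models');
  await faceapi.nets.faceExpressionNet.loadFromUri('/models');
  videoEl.srcObject = await navigator.mediaDevices.getUserMedia({ video: true });
  await videoEl.play();

  setInterval(async () => {
    const result = await faceapi
      .detectSingleFace(videoEl, new faceapi.TinyFaceDetectorOptions())
      .withFaceExpressions();
    if (!result) return;
    // Immediate feedback on the expression you're rehearsing.
    const scores = result.expressions as unknown as Record<string, number>;
    console.log(`${target}: ${((scores[target] ?? 0) * 100).toFixed(0)}%`);
  }, 250);
}

// e.g. practice(document.querySelector('video')!, 'happy');
```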
I participated in what I remember as a human-computer interaction study, something to do with different ways to unlock phones. I get intensely weirded out by the idea that my phone is looking back at me to decide whether to unlock itself or not. Some days I really just want a dumb phone again.
I tend to agree with Nonsteroidal Anti-Inflammatory Drug's take on this: it's not so much whether the algorithms can do what the engineers say they can do, it's how corporations act on whatever information/data is collected.
(For anyone else who was intrigued by the song during the closing credits, which twigged my 2000s J-pop filter, I'm pretty sure it's m-flo's remix of Utada Hikaru's "Distance".)
posted by invokeuse at 9:21 AM on October 24, 2019
It told me I don't like dogs, pizza, or Kanye. It is wrong on two counts.
posted by ephemerista at 9:49 AM on October 24, 2019
I have an IQ of 24?? How did I even turn on my computer??
posted by biddeford at 10:12 AM on October 24, 2019
No. You can't use my camera. I don't even use my camera.
posted by Splunge at 12:32 PM on October 24, 2019 [2 favorites]
Fei-Fei Li's Quest to Make AI Better for Humanity. Artificial intelligence has a problem: The biases of its creators are getting hard-coded into its future. Fei-Fei Li has a plan to fix that—by rebooting the field she helped invent.
posted by homunculus at 1:09 PM on October 30, 2019
This thread has been archived and is closed to new comments