Cats resist attempts at being randomly generated
December 17, 2018 1:53 AM Subscribe
None of these people are real. NVIDIA's latest iteration of its image generator is getting pretty (creepily?) good at creating human faces. This is not rendering technology: these faces are generated from real ones distilled into high-level attributes (pose, glasses, hair length...) and then combined and subjected to random variations. More information is available in the paper A Style-Based Generator Architecture for Generative Adversarial Networks and in the video. The technology also works on cars and bedrooms, less so on cats due to "the high intrinsic variation in poses, zoom levels, and backgrounds".
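For readers wondering what "distilled into high-level attributes … and then combined and subjected to random variations" looks like in practice, here is a rough, hypothetical sketch of the style-mixing idea in PyTorch. It is not NVIDIA's code; the module names and layer sizes are toy values chosen only to show how two latent codes can each control different levels of detail in a single generated image.

# Sketch of "style mixing": two random latent codes are mapped to intermediate
# style vectors, and the coarse styles of one are combined with the fine styles
# of the other. Toy sizes throughout; not NVIDIA's implementation.
import torch
import torch.nn as nn

LATENT_DIM = 64   # toy value; the paper uses a much larger latent dimension
NUM_LAYERS = 8    # number of synthesis layers that each receive a style vector

class MappingNetwork(nn.Module):
    """Maps a random latent z to an intermediate style vector w."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(LATENT_DIM, LATENT_DIM), nn.LeakyReLU(0.2),
            nn.Linear(LATENT_DIM, LATENT_DIM), nn.LeakyReLU(0.2),
        )
    def forward(self, z):
        return self.net(z)

class ToySynthesis(nn.Module):
    """Stand-in for the synthesis network: every layer is modulated by one style vector."""
    def __init__(self):
        super().__init__()
        self.to_feat = nn.Linear(LATENT_DIM, 256)
        self.to_rgb = nn.Linear(256, 3 * 16 * 16)  # tiny 16x16 "image"
    def forward(self, styles):
        # styles: list of NUM_LAYERS vectors; coarse layers come first
        feat = torch.zeros(styles[0].shape[0], 256)
        for w in styles:
            feat = feat + self.to_feat(w)  # crude stand-in for per-layer modulation
        return torch.tanh(self.to_rgb(feat).view(-1, 3, 16, 16))

mapping, synthesis = MappingNetwork(), ToySynthesis()
z_a, z_b = torch.randn(1, LATENT_DIM), torch.randn(1, LATENT_DIM)
w_a, w_b = mapping(z_a), mapping(z_b)

# "Style mixing": source A drives the coarse attributes (pose, face shape),
# source B drives the fine attributes (hair, skin texture).
mixed_styles = [w_a] * 4 + [w_b] * 4
image = synthesis(mixed_styles)
print(image.shape)  # torch.Size([1, 3, 16, 16])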
If you open the image in a new tab, you'll get a larger version (still not great). I spotted a couple of Lovecraftian cat entities in there, but there were also a lot that wouldn't necessarily trigger my whaaaa? reflex – at least as much as I can tell from the annoyingly small images.
posted by stillnocturnal at 2:13 AM on December 17, 2018 [8 favorites]
Maybe if someone had thought of somehow getting their cats into scanners this technology would have been more effective.
posted by taz at 2:31 AM on December 17, 2018 [20 favorites]
For the first time I can imagine a future where virtually all entertainment media feature synthetic performers. The ramifications are deeply chilling.
posted by kinnakeet at 2:55 AM on December 17, 2018 [6 favorites]
I guess all of this neural network shit fits perfectly with our post-fact based hyper reality. Enjoy your subjective computer generated dystopia guys!!!1
posted by Meatbomb at 2:57 AM on December 17, 2018 [3 favorites]
My cat definitely has high variation in zoom levels. I just sometimes wish his zoom levels were lower at night when I'm trying to sleep...
posted by Acheman at 3:26 AM on December 17, 2018 [46 favorites]
For the first time I can imagine a future where virtually all entertainment media feature synthetic performers. The ramifications are deeply chilling.
Gaming is going to be at the forefront of this but I can see things going in the opposite direction, i.e. our thirst for realness leading us to further worship real entertainers, actors and artists.
posted by Foci for Analysis at 3:28 AM on December 17, 2018 [2 favorites]
Impressive. Now imagine the soundtrack being generated by a GAN, instead of a 15 second loop pedal.
posted by Schadenfreude at 3:42 AM on December 17, 2018
They just need a better cat dataset with all the cats posed in the same way. (Good luck with that.)
Meanwhile, the fake bedroom algorithm is really going to help me expand my Nigerian AirBNB scam operation.
posted by Skwirl at 4:15 AM on December 17, 2018 [3 favorites]
Synthetic actors will probably get more use in politically motivated videos than in porn, at least to start. Endless 30 second to 15 minute clips of fake people spouting hate speech or violent racist garbage, all indistinguishable by credulous people from real video, especially at shitty bit rates. Think generated ultra-creepy baby programming except targeting low valence thinkers. It's going to be a thing in the 2020 election for sure.
posted by seanmpuckett at 4:33 AM on December 17, 2018 [5 favorites]
Would 3D masks made from these realistic neural network-generated human faces fool facial recognition systems?
posted by cenoxo at 4:41 AM on December 17, 2018
Why do people want to develop something like this unless they want all news and all photography to have the same relation to reality as a computer game?
Seriously, show me some real progress and some real creativity, 'cause this ain't it.
posted by Termite at 4:44 AM on December 17, 2018 [3 favorites]
Gaming is going to be at the forefront of this but I can see things going in the opposite direction, i.e. our thirst for realness leading us to further worship real entertainers, actors and artists.
Agreed, I think gaming will definitely be the leader of this kind of technology. Also, I look at things like Hatsune Miku and I can definitely see this just becoming more and more normalized in the mainstream.
posted by Fizz at 5:03 AM on December 17, 2018
This will be a real time-saver for catfishers and astroturfers; no longer will they need to scour the web for plausible-looking twitter avatars.
posted by ook at 5:15 AM on December 17, 2018 [4 favorites]
Those cat memes are some Brundlefly shit.
posted by STFUDonnie at 5:18 AM on December 17, 2018 [2 favorites]
It does raise some very serious issues, doesn't it? What happens to a society where individuals, as a simple matter of daily life, assume everything they see in any form of media isn't real? The concept of news, current events, etc. completely vanishes. There is no "news" anymore. Unless one is actually, physically there at the time, the assumption becomes that it didn't happen.
Does true, actual, actionable knowledge of the state of events in the world become a hyper-luxury item?
posted by Thorzdad at 5:22 AM on December 17, 2018 [6 favorites]
Flickr keeps reading my cat's face as human.
posted by The Underpants Monster at 5:24 AM on December 17, 2018 [1 favorite]
Gaming, indeed. Combine a realistic artificial face with a Fake Person Generator and altered videos, and you could make quite an online story for yourself. When your story becomes better than your reality, live your story.
On the Internet, nobody knows you’re not.
posted by cenoxo at 5:38 AM on December 17, 2018 [4 favorites]
GANs seem to generate pretty convincing dogs, but tend to fail at cats. Is it because cats are in some way more ineffable or inscrutable, or were the Google team who gathered the training images mostly dog people and/or motivated to redress the cat-centric bias of the internet?
posted by acb at 5:49 AM on December 17, 2018 [1 favorite]
They all look like they should be next to articles in The Onion.
posted by sexyrobot at 5:53 AM on December 17, 2018 [13 favorites]
The concept of news, current events, etc. completely vanishes. There is no "news" anymore.
Difficult though it may be to imagine, news was invented a few centuries before photography.
posted by sfenders at 6:00 AM on December 17, 2018 [2 favorites]
Difficult though it may be to imagine, news was invented a few centuries before photography.
I read recently somewhere that different forms of journalism existed long before settlers arrived in North America.
posted by ricochet biscuit at 6:09 AM on December 17, 2018 [5 favorites]
I've been trying to think of non-dystopian uses for this technology. So far the best I've got is you could use it at the barbers to try on different hairstyles before committing (feed it a bunch of pictures of your real face, plus a bunch of hairstyles / beards / mustaches, etc.) Or similar for plastic or reconstructive surgery. So that's a worthwhile tradeoff for losing one more toehold on objective reality, right?
Alternatively, just use it for disposal of hats.
posted by ook at 6:30 AM on December 17, 2018
I spotted a couple of Lovecraftian cat entities in there, but there were also a lot that wouldn't necessarily trigger my whaaaa? reflex.
I believe the technical term for that reflex is "uncatty valley".
Sorry.
Not sorry.
posted by The Bellman at 6:33 AM on December 17, 2018 [21 favorites]
How to recognize fake AI generated faces
posted by Mr.Encyclopedia at 6:37 AM on December 17, 2018 [17 favorites]
Once photography reached the film negative and paper print stage (combined with high-speed newspaper printing and mass distribution), it was manipulated for political ends: Long Before Photoshop, the Soviets Mastered the Art of Erasing People from Photographs — and History Too.
Today, all that technology (and influence) is right in your lap. Toss in a digital memory hole, and you’re good to go.
posted by cenoxo at 6:41 AM on December 17, 2018 [2 favorites]
I extracted and uploaded the TechnoKitty image for those who are having a hard time seeing it in the PDF. As mentioned, a couple of Brown Jenkin lookalikes, but in general pretty convincing. I did laugh at the ones that had incomprehensible meme text on them.
posted by Quindar Beep at 6:42 AM on December 17, 2018 [4 favorites]
Thanks, Mr.Encyclopedia--that's fascinating! Worth noting that article has an update addressing the new material from the FPP. Totally worth a read!
posted by The Bellman at 6:42 AM on December 17, 2018
Someday computers will debate whether humans will ever really be able to pass the Turing test.
posted by sjswitzer at 7:02 AM on December 17, 2018
> How to recognize fake AI-generated images
Good advice, thank you, for anyone who cares (or has time) to look, but these are temporary bugs to be fixed in the next iteration. Would such small defects even be noticeable in a news article read on a smart phone?
posted by cenoxo at 7:02 AM on December 17, 2018 [1 favorite]
We'll have to use whatever tools we have at our disposal until the Voight-Kampff device is perfected.
posted by Mr.Encyclopedia at 7:06 AM on December 17, 2018 [1 favorite]
How to recognize fake AI generated faces
Content warning for some surprise transphobia in this article, just because I wasn't expecting it, and you might not either!
posted by ITheCosmos at 7:11 AM on December 17, 2018 [5 favorites]
So, are we ready to start the Butlerian Jihad yet? Or do we have a couple of years?
posted by Defective_Monk at 7:14 AM on December 17, 2018
Also, I'd ask the people afraid of this being used for Fake News: how exactly WILL this be used for Fake News? The only things that spring to mind immediately would be rapid disinfo after a tragedy ("Here is a picture of the black antifa trump-hater that shot up this school etc") or Jacob Wohl-style fake "intelligence firms" filled with fake LinkedIn profile pictures. Both of which are already common and I don't think would be made more convincing with fake AI faces, not with the sheer amount of real faces available.
posted by Mr.Encyclopedia at 7:14 AM on December 17, 2018 [1 favorite]
> ...until the Voight-Kampff device is perfected.
In about another year, 2019 will be ancient history with obsolete technology.
At the current rate of progress, The Dead Past device could show up any day now.
posted by cenoxo at 7:30 AM on December 17, 2018 [2 favorites]
I've been trying to think of non-dystopian uses for this technology.
Once it becomes available in professional creative suites, this tech will make it much easier to create on-demand "stock" photos that don't rely on real people, so advertisers, designers etc. can include hi-res pictures of "people" that are 100% original, endlessly (and randomly) modifiable and copyright-free. You need a picture of Ariane happily eating salad but with blonde hair and a wider face? No problem (what makes Ariane such a popular stock photo model is that she covers a wide range of ethnicities, but her "style" is still limited to "young, female, thin, long dark hair"). You have a perfect group picture of young South Asian managers excitedly pointing at a screen but they all need to be middle-aged and East African? No problem. As noted in the FPP, this is not a rendering technology (there's no underlying 3d model of a "face" or a "cat"), so it's not directly applicable to games or movies, though it can be used (to some extent) to replace the faces of actual people by imaginary ones in short video sequences.
What's actually unsettling here is that those "fake" people have become indistinguishable from real ones, except for near-invisible glitches, so we're looking into the souls of people who don't exist even though they're on our side of the uncanny valley. Still, it doesn't make the tech scarier than what existed before. Image-based fake news don't need to be high tech: all they need are people willing to believe them and willing to dismiss contextual information. It should be interesting to see whether hi-tech photo manipulation - not just cropping out inconvenient people, or tweaking contrast and colors - has been used recently for fake-news purposes. I've seen a lot of recycling - typically images from previous events used to describe current ones - but nothing technologically complex, such as making actual people say or do outrageous things in a way that would be credible to usually non-gullible people.
posted by elgilito at 7:31 AM on December 17, 2018 [6 favorites]
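As a sketch of how the attribute editing elgilito describes is commonly done with GANs (a generic technique, not necessarily how this paper's tool would expose it), you can estimate a direction in latent space that correlates with an attribute and push a face's latent code along it. The helper names and the generator G below are hypothetical:

# Generic sketch of latent-direction editing with some trained generator G.
# The attribute labels would come from a classifier or manual annotation,
# not from the generator itself; everything here is illustrative.
import numpy as np

def attribute_direction(latents, labels):
    """Estimate a 'has attribute' direction as the difference of class means."""
    latents = np.asarray(latents)
    labels = np.asarray(labels, dtype=bool)
    direction = latents[labels].mean(axis=0) - latents[~labels].mean(axis=0)
    return direction / np.linalg.norm(direction)

def edit(latent, direction, strength=1.5):
    """Move a single latent code along the attribute direction."""
    return latent + strength * direction

# Usage (assuming a generator G and a set of labelled latent codes):
#   blonde_dir = attribute_direction(w_codes, is_blonde_labels)
#   new_face = G(edit(w_ariane, blonde_dir))   # "Ariane, but blonder"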
I've had my GeForce RTX 2080 for about three months and the AI stuff on the chip has never been used. It's there for despeckling (which is a new problem) and anti-aliasing (which is a problem solved long ago). There are only a couple of games that use it and I don't have them.
All I'm saying is, I have NVIDIA's consumer-grade AI acceleration product right here and it's doing nothing. People will make cool stuff one day but for the moment, you've got Battlefield and Final Fantasy.
This is pretty cool though. They've applied their image processing grunt to pilot an autonomous submarine. I suppose this is an adaptation of their driverless car stuff.
posted by adept256 at 7:34 AM on December 17, 2018 [1 favorite]
Humanity had a good run.
posted by Johnny Wallflower at 7:59 AM on December 17, 2018 [1 favorite]
You want creepy? I'll give you creepy.
posted by mikeand1 at 8:02 AM on December 17, 2018 [1 favorite]
This could be terrible news for the headshot industry.
Just upload a shitty phone photo of yourself and let AI generate a headshot that looks like a more beautiful, well-lit, slightly unrealistic version of you. Have it auto-update as you age.
posted by Damienmce at 8:17 AM on December 17, 2018 [1 favorite]
> elgilito: It should be interesting to see whether hi-tech photo manipulation - not just cropping out inconvenient people, or tweaking contrast and colors - has been used recently for fake-news purposes.
How Photos Fuel the Spread of Fake News [Wired, 12/21/2016] discusses this issue, but doesn't show any examples of technically manipulated photos.
Perhaps we just don't notice them: People Are Terrible at Identifying Manipulated Photos, Study Shows [Inverse, 7/17/2017].
posted by cenoxo at 8:20 AM on December 17, 2018 [1 favorite]
BBC News The artist transforming your face with a digital mask (YT, 12/26/2014).
Wear a portable version of this [starting about 0:30] and never worry about facial recognition (or cosmetics) again! Only $19.95.
posted by cenoxo at 8:52 AM on December 17, 2018
"What happens to a society where individuals, as a simple matter of daily life, assume everything they see in any form of media isn't real?"
Several people have already addressed this in various ways, but in general "seeing it" seems to be a very naive standard for determining truth. At one extreme, what we think we see isn't actually that reliable. At the other extreme, an enormous amount of information comes to us in forms in which it's much easier to lie than images, particularly language. So if we just take something at, er, face value -- text or image or whatever, we're already fucked.
That's why we use complicated context-dependent heuristics to determine truth. With vision itself, for example, our visual system infers a great deal -- and we do the same when we reason about what we see.
Those with the simplest, superficial, and sensual standards for truth are, in fact, the most easily fooled. That's why we have magicians.
So, in my opinion, we've already long been faced with this essential problem. The same sorts of things we already use to assess credibility, we'll use on this stuff. For example, as we do with the written word, provenance and authority will be very important.
Put differently, the idea that a photograph is "proof" is one that was reasonable only for a limited time and, regardless, has never been as useful a standard as many think. Even without any manipulation of the raw image, photographs lie. They can be staged, they can lie by omission, they can lie by framing. They can and do already lie in a great many different ways.
All that aside, what I find super interesting is the "average face". The degree to which the dataset it's created from is representative of the human population is the degree to which I find it interesting. It's still a long way off: Flickr portraits from "around the world", even 70,000 of them, are far from representative of all people. But I still find it pretty interesting. Provocative, even.
posted by Ivan Fyodorovich at 9:02 AM on December 17, 2018 [4 favorites]
cat-gan.com
posted by biogeo at 9:55 AM on December 17, 2018 [3 favorites]
Ooh cool. I want some of those pictures for my sock-puppet accounts! The trouble with using pictures on a sock puppet account is that someone else somewhere has a very real identification with that face, so the "perfect" portrait that I find is really only a picture of someone else who is a doppelganger of my character. But now I can find pictures that do not have a secret breathing twin of my character who has a greater existential right to the image. Finally!
My sock puppet accounts, far from inciting debate or participating in disinformation, generally post long rambling personal histories that nobody reads (and after a couple of disastrous experiences involuntarily cat-fishing lonely souls who glommed onto me, I make darned sure nobody reads them) and visit my kingdoms in different games to trade and share resources.
Gaming, indeed. Combine a realistic artificial face with a Fake Person Generator and altered videos, and you could make quite an online story for yourself. When your story becomes better than your reality, live your story.
On the Internet, nobody knows you’re not.
posted by cenoxo
Yep, that's why I joined the internet, back in '82 when I found an article explaining about special interest newsgroups and warning that the default assumption is that nobody is who they say they are. The article warned that that person who is claiming to be a young woman is probably really a forty-eight-year-old neckbeard, and I went, wait, what, I can pretend to be a forty-eight-year-old neckbeard??!! And I had a dial-up account and a brand new e-mail within the week.
posted by Jane the Brown at 11:02 AM on December 17, 2018 [4 favorites]
So, in my opinion, we've already long been faced with this essential problem. The same sorts of things we already use to assess credibility, we'll use on this stuff.
But the bullshit keeps increasing in volume and sophistication every year. Is there a hard limit to our coping skills? Is there a point where it's not worth it anymore? I'd almost like to think that when Wikipedia gets completely overwritten by competing armies of bots, Facebook becomes tenanted largely by imaginary faces, and YouTube is conquered completely by incomprehensible videos generated to please an insane algorithm, humanity will finally give a great shrug and return to couriers and paper books. But we'll probably just go mad.
posted by Iridic at 11:04 AM on December 17, 2018 [2 favorites]
Synthetic actors will probably get more use in politically motivated videos than in porn, at least to start.
Wait, have you seen the internet? They've been using synthetic actors in 3D porn for literally decades.
our thirst for realness leading us to further worship real entertainers, actors and artists.
That's a minor plot point in Altered Carbon IIRC.
posted by aspersioncast at 1:17 PM on December 17, 2018
So, in my opinion, we've already long been faced with this essential problem. The same sorts of things we already use to assess credibility, we'll use on this stuff. For example, as we do with the written word, provenance and authority will be very important.
The essential problem you describe in regard to photographs and the threat this sort of technology poses are orders of magnitude apart. It is one thing to realize that a photo doesn't reveal everything, limited as it is in its framing and moment of capture, but another to be presented with a video showing a complete dialogue or continuous-seeming sequence of events which never actually happened. Showing someone speaking words they never spoke isn't just capturing part of a story, it's making one up entirely. A single image is not the same thing as showing someone engaging in a complex set of actions or speaking. While moving images always carried the question of what happened before and after the things captured, and could in some ways be manipulated to show "fantastic" or staged events, there was still some security in the knowledge that the scope of alteration, and the involvement it required, was technologically limited. You couldn't, for example, show someone in clear definition engaging in acts they never actually did, or stage large-scale complex events in real locations in something like real time, without mass involvement of people all in on the hoax.
Now we are heading to a time where there may be no ability to trust any information at all. A video showing something in absolute clarity won't be able to be trusted. If, for example, one sees a video showing someone saying or doing something abhorrent that they deny, do you trust them or the video that clearly shows it? Since you know either can lie, there is nothing left to fall back on to make a truth judgement, leaving "gut" beliefs and values to guide your choice. We can already see how that's working on the fringes; this could spread the problem to almost everyone. News media already suffer from an erosion of trust; how many fakes will it take to destroy that trust completely, either by having fakes carried as news or by having news challenged by fakes? Deep fakes will weaken the already shaky boundaries of whatever shared social values remain, as virtually anything will be open for debate on its validity if it happened anywhere outside of a mass public arena or where multiple information streams are all carrying the event at the same time. Single sources will be suspect automatically, and hostile actors could render even mass events as points of conflict as they sow doubt among the less sophisticated and those whose values align with whatever falsehood is shown.
posted by gusottertrout at 1:41 PM on December 17, 2018 [4 favorites]
Seriously, show me some real progress and some real creativity, 'cause this ain't it.
It's been decades now where most of our time, money, intelligence, etc. have been spent making things that don't really benefit humanity as a whole, and it's getting worse.
posted by bongo_x at 2:07 PM on December 17, 2018 [2 favorites]
There's probably social solutions to a lot of these social problems caused by technologies. Maybe we should burn all comment threads to the ground. Force people to adopt alternate channels within their own communities to discuss them. Abandon social media and let the spambots and celebrity product pages reign there. Curate our own chosen video sites. Self-enforce mandatory daily periods away from interlinked networks.
posted by Apocryphon at 2:22 PM on December 17, 2018
It's been decades now where most of our time, money, intelligence, etc. have been spent making things that don't really benefit humanity as a whole, and it's getting worse.
I dunno, I think this stuff is pretty neat...like most science! Its applications may be suspect, and it's up to us to decide how to use it. However, I'm not sure that investigations into AI don't "benefit humanity as a whole". In fact, investigating AI and the way machines learn can provide a window onto how human intelligence works. Again, like all technology, we have to be careful in how we use it. But I don't think it's a worthless endeavor.
posted by k8bot at 2:23 PM on December 17, 2018 [2 favorites]
Oh. The humanity.
posted by Johnny Wallflower at 4:06 PM on December 17, 2018
Social networks know who your friends are, who you have good interactions with. And they have everyone's picture. Should be straightforward to generate pictures that look very similar to your friends to use in very personalized advertising.
posted by Sophont at 4:53 PM on December 17, 2018
Tired: I Hate Mondays
Wired: Infinite variations of tensor-generated lasagna
posted by RobotVoodooPower at 4:55 PM on December 17, 2018 [4 favorites]
> gusottertrout: Now we are heading to a time where there may be no ability to trust any information at all. A video showing something in absolute clarity won't be able to be trusted. If, for example, one sees a video showing someone saying or doing something abhorrent that they deny do you trust them or the video that clearly shows it? Since you know either can lie there is nothing left to fall back on to make a truth judgement, leaving "gut" beliefs and values to guide your choice.
Ignorance is strength, freedom is slavery, and war is peace. Who are you going to believe — the powers that be, or your lying eyes?
posted by cenoxo at 4:56 PM on December 17, 2018
None of these people are real, either.
posted by davejay at 5:51 PM on December 17, 2018 [1 favorite]
"Since you know either can lie there is nothing left to fall back on to make a truth judgement, leaving 'gut' beliefs and values to guide your choice."
This argument falters when you consider that almost all of human history took place without photography, and the other means by which knowledge was transmitted allowed lies. They still do: I don't know about you, but I'm pretty certain that 99% or more of what I learn about the world isn't in the form of photography or moving images.
This is what I was getting at when I said that photography-as-proof is both a short-lived historical exception and not ever nearly as functionally important as we tend to think.
That said, I think most of this reaction is because we have such a strong response to sufficiently realistic imagery. We really want to say that photographs are "real". (Cue André Bazin's theory of cinema.) But, not to beat a dead horse, photography is much, much less "real" -- much less like actual vision -- when you examine and consider the matter closely.
Here's an example. I had an instructor in college who casually asserted that vision was a 2D projection (against the back of the eye) plus stereoscopy. His statement was that it's pretty much like a movie. This was problematic and mistaken in two different respects: one philosophical, which we'll set aside, and one neurological. It would be difficult to exhaustively list all the ways in which our perceived image is a composite built from a wide variety of characteristics of the sensations, across many different scales, detected regularities in the varying of input, and hard-coded evolutionary inference about the nature of the viewed environment. In short, what we see is a deep and complex synthesis of stimuli with a gigantic dose of statistical inference. It can be easily fooled. My initial point in this is that photography lacks much of this stimuli -- it and video work as well as they do because vision is easy to fool. (Don't get me started on color reproduction.)
Furthermore, the deeper point is that vision itself is very, very far from a faithful reproduction of reality.
In a practical, day-to-day sense, we heavily rely upon vision, we've evolved to weigh it very heavily, and so the benefit and curse of the invention of photography is that it leverages that credulity to convince us it is just like seeing, when it's not, really. Only in a relative sense.
But it's remarkable that the context of this discussion, and the concerns expressed, are all about news and worldly knowledge and such, for which we rely hardly at all on vision. Given the implications of the concerns expressed in this thread, photography should have been the single most important technological leap for human knowledge. Much more so than, say, the printing press. And then, later, the moving image. But this isn't the case. Not even close.
Actually, mass printing and literacy are interesting examples because there were (and still are) many fears about the opportunity to misinform. When books could be written and produced only by a few, they were implicitly authoritative by their mere existence. And then they weren't. Civilization didn't collapse. This is far from a perfect analogy, but it's still instructive.
posted by Ivan Fyodorovich at 6:00 PM on December 17, 2018 [5 favorites]
I had to stop watching the video when the "perfect faces morphing into other perfect faces" scenes kept giving me Prisoner's Cinema flashbacks. Sometimes when falling asleep or at rest in a dark room, I see apparitions of dozens of human faces blend into each other repeatedly as my visual cortex attempts to figure out what to do with a blank visual field. I usually find this comforting and even entertaining.
Watching this happen in a video online was deeply unsettling. Like being wired into someone else's brain and finding out they're some kind of maniac.
posted by Enkidude at 7:16 PM on December 17, 2018
So a lot of the 'why' here seems to be around its possible use instead of stock photography (because hey, why not just put another entire industry out of business with software. What could be the harm?). So here's what I want to see: a bunch of these done from a dataset made up of only "lady laughing with salad". And the more firmly in the uncanny valley the better.
posted by sexyrobot at 8:00 PM on December 17, 2018
GANs seem to generate pretty convincing dogs
thanks for the nightmares
posted by neckro23 at 8:15 PM on December 17, 2018 [2 favorites]
I also found that video super-unsettling.
posted by aspersioncast at 9:02 PM on December 17, 2018
This argument falters when you consider that almost all of human history was absent photography, and the other means with which knowledge was transmitted allowed lies. And still does: I don't know about you, but I'm pretty certain that 99% or more of what I learn about the world isn't in the form of photography or moving images.
The important difference is that in an era of photographic media those things do matter a great deal in how we come to understand the truth. They create a proof of sorts, limited as it may be at the boundaries, of something having actually happened. A video showing someone doing something may demand context, but it could still be held to capture something real due to the nature of the technology. The ability to fake entire sets of events, or individual speech and actions, means that this kind of evidence will soon be able to be falsified completely.
It's easy to say our vision, knowledge, and memories are faulty and to simply distrust everything you see, but people simply do not work that way. They rely on their senses to understand the world, so simply wishing that away isn't going to work. Videos will be presented as showing something true when we will have no idea whether what they show actually occurred, but the "proof" will remain to shape the understanding of the event, whether through belief or doubt. Use of this technology for disruption and discord doesn't have to be wholly accepted as showing actual events by everyone; it just needs to cultivate the attitude of doubt and distrust that makes people fall back on their peer networks and faulty and limited worldviews.
How many people will need to testify to the falseness of a video for that claim to be accepted as greater proof? Who will those people be? Surely not all claims will be held to be of equal weight. If Hoover had had this technology, do we really think he wouldn't have used it to frame people like King? If one government produces video of some event and another challenges it with video of its own showing a different set of events, we won't be able to trust those videos either, save by falling back on our old biases over allies and enemies, which we so often don't agree on among ourselves.
At best this just obscures the possibility for truth in those instances, taking us back to an earlier era of less accountability and more government power, and at worst it creates havoc as different groups shape the beliefs of their followers to only accept their version of events and distrust all others. The technology can not only be used to frame the innocent but to protect the guilty by "showing" them doing something different than they did, raising sufficient doubt to make justice more difficult to gain against those with power.
This is essentially weapons technology and should be treated as such for the danger it poses. It may be fascinating in the abstract, like so much other weapon technology, but it will be put to harmful use in ways that will have mass effect and lead to deaths.
posted by gusottertrout at 10:45 PM on December 17, 2018 [4 favorites]
> thanks for the nightmares
GanBreeder has a future in Hollywood SFX, but considering BrundlePod beta testing [deleted baboon+cat teleportation scene, NSFCatOwners] in The Fly (1986), I'll just take a taxi, thanks.
posted by cenoxo at 11:20 PM on December 17, 2018
I don't mourn stock photography; on the contrary, I would be happy to see it go. Stock photography exists because we don't illustrate magazines and ads by hand any more -- if you haven't done it before, you should open a 1940s or 1950s magazine and marvel at the skill of the illustrators who painted all those images.
But this is about something bigger than the loss of stock photography. It's about losing the best technology so far for representing and documenting reality. It's not just about the present. It's about looking at historical photographs of everyday life and being able to know: this isn't computer generated.
When the Tiger Woods story was in the news (did he or didn't he beat his girlfriend?), I found a website that illustrated its speculations about what could have happened with a computer animation. We see Tiger doing what they thought he did. The animation was very crude, but the only question in my head was: how many years until we are not able to tell the difference between a simulation and actual footage? Once computer-animated people can be made to say and do things they never did, there is no actual footage of anything. I can't understand why it is so desperately important for some engineers to put us in that ugly situation. There are so many more paths they could explore.
And to all those who write variations of "photography never gave us a perfect image of the truth" I would like to answer: the artificial reality technology takes away a very useful tool for documenting reality (photography) but it doesn't replace it with anything better. This means we lose something valuable and we're not getting anything better in return. We are basically thrown back into the age before photography. Soon, every image will be exactly as valuable as a 16th-century propaganda woodcut of the Pope with donkey's ears. Thanks a lot.
posted by Termite at 3:09 AM on December 18, 2018 [2 favorites]
I wonder if this technology would be of any use to police artists, for producing more realistic images from eyewitness descriptions.
posted by The Underpants Monster at 7:17 AM on December 18, 2018 [1 favorite]
This will be a real time-saver for catfishers and astroturfers; no longer will they need to scour the web for plausible-looking twitter avatars.
And easily extensible and adaptable in real time! Impossible to thwart.
SCAMMER: Computer, give me a picture of Oyster smiling, with a banana on his head, holding a sign saying "Hi Reddit I am really here December 19 2018" and a copy of today's New York Times.
posted by Meatbomb at 8:41 AM on December 18, 2018 [1 favorite]
Fascinating seeing it smoothly moving through the different possible faces. I wonder if they could get the end results out in a format that you could put through UMAP to explore that multidimensional volume of humans.
posted by lucidium at 9:24 AM on December 18, 2018
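Something like lucidium's idea is doable today with the umap-learn package, assuming the generator's intermediate latent vectors can be exported as an array; the latent_vectors variable below is a random placeholder standing in for real exported codes:

# Embed a batch of latent vectors into 2D for interactive exploration.
# Assumes `latent_vectors` is an (N, D) NumPy array exported from the generator.
import numpy as np
import umap  # pip install umap-learn
import matplotlib.pyplot as plt

latent_vectors = np.random.randn(1000, 512)  # placeholder for real exported latents

reducer = umap.UMAP(n_components=2, n_neighbors=15, min_dist=0.1)
embedding = reducer.fit_transform(latent_vectors)  # shape (1000, 2)

plt.scatter(embedding[:, 0], embedding[:, 1], s=3)
plt.title("UMAP projection of generated-face latents")
plt.show()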
Boring pitch: the hero falls in love with a procedurally-generated not too-attractive face.
posted by aspersioncast at 4:23 PM on December 18, 2018
This thread has been archived and is closed to new comments