[HN Gopher] South Park creators have new political satire series...
       ___________________________________________________________________
        
       South Park creators have new political satire series with AI-
       generated deepfakes
        
       Author : LaSombra
       Score  : 660 points
       Date   : 2020-11-02 14:03 UTC (8 hours ago)
        
 (HTM) web link (www.theregister.com)
 (TXT) w3m dump (www.theregister.com)
        
       | interestica wrote:
        | With this capability already in hand, it feels like
        | politicians are holding onto a potential video as some sort
        | of 0-day exploit - waiting for the best moment to drop it to
        | really sway opinions.
        
       | iworkfromhome wrote:
       | MIT Technology Review
       | 
       | The creators of South Park have a new weekly deepfake satire
       | show.
       | 
        | The character Fred Sassy, whose face is a deepfake of
        | President Trump's, appears on the new show Sassy Justice.
       | 
       | http://technologyreview.com.via.snip.ly/uxrgw8
        
       | steerablesafe wrote:
       | How will this work with IP laws around likeness? Is this fair
       | use?
        
       | shannifin wrote:
       | Collider did some deepfake vids with celebrity impressionists
       | which I thought were funny:
       | https://www.youtube.com/watch?v=l_6Tumd8EQI
        
       | 01100011 wrote:
       | This is the funniest thing Matt and Trey have done in years. I
       | don't know if they still write the show, but it hasn't been funny
       | to me in a couple years. This clip had me rolling. This feels
       | inspired and I hope it creates more awareness in the general
       | public.
        
         | bobbyi_settv wrote:
          | Until very recently, I thought they wrote all the episodes
          | without outside involvement, but then I came across
         | this clip where Bill Hader talks about being part of a "South
         | Park retreat" where they bring people to brainstorm episodes
         | 
         | https://www.youtube.com/watch?v=XdZnGz9CWpU
        
         | lolcatv wrote:
          | The funniest thing is this is 100% Trump and probably the
          | best thing he's put out yet.
        
         | thrav wrote:
         | I couldn't agree more. It managed to feel like good old
         | fashioned low budget internet fun from YouTube yesteryear.
         | Putting aside Matt and Trey, it's one of the few things that's
         | made me genuinely burst out laughing in years, period.
         | 
         | It's made me stop and think about why that is, and the best I
         | can come up with is that very little is surprising anymore.
         | When you get past 30, everything feels like something you've
          | seen done before. Deep fake comedy is so cutting edge that it
         | couldn't help but be fresh.
         | 
         | Or maybe I'm just a boring old man. The last thing that made me
         | laugh this hard was this Reddit submission, and again, it
         | succeeds because I didn't see it coming:
         | https://i.reddit.com/r/ContagiousLaughter/comments/gzdja1/on...
        
         | ryan-allen wrote:
         | Trey Parker is credited as the writer for the vast majority of
         | episodes of South Park, especially since Season 4. Most of
         | their episodes are topical these days, I recommend checking out
         | the recent Pandemic Special.
         | 
         | Matt, Trey and a guy named Robert Lopez wrote the theatre
         | production Book of Mormon, which was well received and won Tony
         | awards (Trey studied Musical Theatre in College, which explains
         | why the South Park Movie was a musical).
         | 
         | Trey's daughter was cast as Jared Kushner in Sassy Justice,
         | which was hilarious, and the lead in Sassy Justice is played by
         | Peter Serafinowicz [0], who is a British comedian!
         | 
          | I hope they make more of these, not just because I'm a fan,
          | but because it massively defuses the power of deepfakes!
         | 
         | [0] https://www.youtube.com/watch?v=6-7NDP8V-6A
        
           | 01100011 wrote:
           | I saw the pandemic special and, I don't know, it just didn't
           | seem funny to me. I still watch Mr. Hankey's Christmas
           | Classics every year when I decorate the tree, and I still
           | love what they did up until around season 12. After that it
           | gets hit-or-miss and after a few more seasons just didn't
           | seem worth watching anymore. I'd chalk it up to getting older
           | and not smoking pot anymore, but I still love their old
           | stuff.
        
             | ryan-allen wrote:
             | The older episodes are my favourite as well. Ding dong
             | m'kay!
        
           | Schiphol wrote:
           | That Robert Lopez also wrote the awesome Avenue Q musical,
           | and co-wrote the Frozen songs with his wife. He is also the
           | youngest person to achieve an EGOT (https://en.wikipedia.org/
           | wiki/List_of_people_who_have_won_Ac...).
        
         | crankyoldcrank wrote:
         | Would've been funnier if it was Biden having the stroke. Much
         | more believable. But a week before the election I guess only
         | Trump is an acceptable target. Notice Biden didn't get
         | mentioned a single time.. even when making fun of Trump's son-
         | in-law, surely there's a better joke there about the picture of
         | Hunter Biden with a crack pipe in his mouth that's more
         | relevant to the topic. Or I guess, Matt and Trey know that's
         | real ;)
         | 
         | I give it a 4/10 for effort. The first half was pretty funny.
         | After that, it's just Orange Man Bad. Comedy is always so
         | boring when there's a Republican in the White House.. usually
         | Matt and Trey are an exception but you can't win them all.
        
           | 1024core wrote:
           | Just post under your own account; why create a new one for
           | this comment?
        
           | [deleted]
        
         | [deleted]
        
         | spurgu wrote:
         | Others have probably mentioned this, but my first thought is
         | how this will bring deep fakes to people's attention, and make
         | them realize that _nothing_ they see on the internet is
         | necessarily real.
        
         | hhs wrote:
         | Trey Parker had a similar reaction [0]:
         | 
         | "But when Parker got to see himself digitally altered to look
         | like Al Gore, he said, "It was the first time I had laughed at
         | myself in a long time."
         | 
         | Parker added: "I always hate watching myself. Even with 'South
         | Park,' I have a perfect image of what it's going to look like
         | in my head all the time. But on this, there were moments where
         | we felt like kids in our basement again."
         | 
         | To Parker and Stone, the experience also reminded them of "The
         | Spirit of Christmas," their 1995 homemade short film that
         | became a viral sensation in a more primitive age of the
         | internet and paved the way for "South Park.""
         | 
         | [0]: https://www.nytimes.com/2020/10/29/arts/television/sassy-
         | jus...
        
       | davesque wrote:
       | I actually think this will be the best way to immunize people
       | against the effects of deepfakes. We need to make a sort of
       | ubiquitous meme out of it.
        
       | psyc wrote:
       | Welp, Christmas came early. I couldn't have hoped for a better
       | satiric voice to exploit this particular tech.
        
       | fredley wrote:
        | I _think_ that the Michael Caine section is Peter Serafinowicz
        | doing a Michael Caine impression (which has then been
        | deepfaked), but it could be Lyrebird or similar! The fact that
        | I really
       | can't make my mind up which it could be and the weird sense of
       | doubt, to me, means that this has hit the mark perfectly.
       | 
       | I've been thinking about this video a lot since I first saw it.
       | It's genuinely very unsettling.
        
         | pimlottc wrote:
         | NYT did an article about it [0], which has some details on the
         | casting. Peter Serafinowicz is indeed doing his Michael Caine
         | impersonation, and also does Trump. Other characters are done
         | by Trey Parker and various family members.
         | 
         | 0: https://www.nytimes.com/2020/10/29/arts/television/sassy-
         | jus...
        
         | pram wrote:
          | He also did Michael Caine in his show; it's extremely spot on.
         | 
         | https://youtu.be/HdBZ3Pg0-OM
        
           | agency wrote:
            | Who does it best? https://www.youtube.com/watch?v=HFIQIpC5_wY
        
             | asciimo wrote:
             | This is the first thing I thought about when I saw the
              | Sassy Michael Caine. This is the only thing I remember from
             | The Trip.
        
         | cableshaft wrote:
          | Wouldn't be surprised. While searching for him on YouTube I
          | stumbled on Peter Serafinowicz's appearance on Stephen
          | Colbert, where they talk about him doing 'bad lip reading'
          | videos of Trump in different personalities, one of them
          | being 'Sassy Trump'. So he might have had something to do
          | with this show coming about in general.
         | 
         | I don't think he's doing the voice for Sassy Justice, though.
         | That sounds more like Trey Parker to me. But they probably
         | brought him in to do Michael Caine.
         | 
         | https://youtu.be/Oe28UOFNlxk?t=138
        
           | GilbertErik wrote:
            | I think you need to listen. There are little differences
            | between Peter Serafinowicz as Fred Sassy and Trey Parker as
            | Al Gore. There are tiny little mistakes in the voice. I think
           | you need to watch it again, but this time watch with your
           | ears. </s>
        
         | mattnewton wrote:
            | I thought the punchline was going to be a wavenet-style
            | audio deepfake for that segment, haha. I had no idea it was
            | a person doing it. It explains why they repeatedly said it was
         | impossible for a _human_ to do a perfect impersonation.
        
           | jonhohle wrote:
            | That's where my mind went as well. There are some audio
            | artifacts right around the time he says that, which almost
            | hint at the audio being generated. I really enjoyed the
            | ambiguity.
        
         | modeless wrote:
         | Funny, I kind of got the opposite message. The Tom Cruise bits
         | were poking fun at the fact that all of these deepfakes are
         | every bit as obvious as the puppet. And they didn't even try
         | deepfaking the voices because it doesn't work at all. Just
         | listen to the latest state of the art voice mimicking samples.
         | They're terrible! It says a lot that the only actually
         | convincing fake in the entire video was a human impersonator's
         | voice.
         | 
         | That said, this stuff will improve over time. But before you
         | label it the end of the world you really have to think about
         | how much has always been possible with impersonators and
         | makeup/prosthetics.
        
           | nucleardog wrote:
            | The podcast Twenty Thousand Hertz did an episode on deepfake
            | voices[1], which is my only real exposure to the audio side
            | of this. I certainly wouldn't say it didn't work or was
            | terrible.
           | 
           | [1]: https://www.20k.org/episodes/deepfakedallas
        
         | unfunco wrote:
          | Peter Serafinowicz is a co-creator of the show, and I think
          | Sassy Trump was originally his idea; I remember seeing him do
          | it on The Late Show.
        
         | brahweh wrote:
         | At about 3:18 into the video you can see what appears to be
         | Serafinowicz playing Sassy as the camera moves behind him.
         | Makes sense that faking would be hard from strange oblique
         | angles, so looks like they just skipped it.
         | 
         | https://youtu.be/9WfZuNceFDM?t=198
        
       | wodenokoto wrote:
       | At first I thought the sub-headings were summaries of sketches
       | from the show (which I thought was an odd thing to do) so imagine
       | my surprise that the article ends with announcing a new release
       | of PyTorch.
       | 
       | The article consists of 5 different small articles.
       | 
       | Does anybody know where the new show by Trey Parker and Matt
       | Stone called Sassy Justice is airing? Is it a free Youtube
       | series?
        
         | ds206 wrote:
         | https://www.sassyjustice.com/
        
         | techer wrote:
         | https://www.youtube.com/watch?v=9WfZuNceFDM&feature=emb_titl...
        
         | pimlottc wrote:
         | The homepage is sassyjustice.com. Currently it's embedded from
         | YouTube.
        
       | sto_hristo wrote:
        | Impressive. A show that puts those figures in their real-world
        | roles and surroundings while satirizing everything they do
        | will raise hell indeed. Something like House of Cards, but
        | with satire and known figures.
        
       | jascii wrote:
       | Wow, that deep fake of Tom Cruise is just uncanny...
        
       | grawprog wrote:
        | I understand the worries behind deep fakes, but I'm going to be
        | a bit facetious here: hasn't getting the most realistic,
        | lifelike images always been the goal of computer graphics?
        | Wouldn't deepfakes be the ultimate culmination of lifelike
        | CGI, so lifelike it's indistinguishable from the real thing?
       | 
       | Like for most of my life, this is the goal I keep hearing about.
       | Now it's here, suddenly it's too real.
       | 
       | I dunno just seems a bit funny to me. I've never been one of
       | those graphics people myself, but it seems like a case of getting
       | what you asked for, but being upset because it's different to how
       | you expected it was going to be.
        
         | three_seagrass wrote:
          | Deep fake is photoshop for video and audio. Alarmists are
          | going to opine about the many ways it will cause the sky to
          | fall, but life will continue.
        
           | edanm wrote:
           | Life will continue is a pretty low bar...
        
             | three_seagrass wrote:
              | After the sky falls, or as it has continued after the
              | invention of Photoshop? I'm referring to the latter.
        
         | dchichkov wrote:
          | Who knows, maybe out of that more attention will be paid to
          | the message rather than the person. Cults of personality
          | have always been problematic. And, technically, this allows
          | one to grab the personality and dissolve the cult.
        
           | InitialLastName wrote:
           | The problem is that, in a representative democracy, voters
           | are deciding on the person, rather than the message. If you
           | have systems that increase the noise floor in the link
           | between the two, you end up with an electorate that's less
           | well-equipped to make coherent decisions.
        
         | standardUser wrote:
          | I don't think anyone is disputing the idea that technology in
          | this arena can/will/should advance, but rather that this
          | specific set of technologies comes with consequences that
          | are, to many of us, absolutely terrifying.
         | 
         | I also don't think we truly understood how susceptible so many
         | people are to made up nonsense until recent years, so this
         | particular kind of made up nonsense suddenly looks like a
         | serious threat to social stability and cohesion.
        
           | rglover wrote:
           | Just a few more months of these weight loss pills and I'm
           | going to look like David Hasselhoff.
        
           | grawprog wrote:
           | >I also don't think we truly understood how susceptible so
           | many people are to made up nonsense until recent years
           | 
            | That's not true; that's been the basis of marketing and
            | advertising since it existed. It's been fairly well known
            | for a long time that people easily believe nonsense.
        
         | rexpop wrote:
         | The question isn't "what's wrong with the technology," but
         | rather "what harm can it do in the context of our societal
         | development?"
        
       | stanrivers wrote:
       | This feels like the start of something very bad - they are going
       | to be upfront about the deepfakes... others are not.
       | 
        | It's going to get to the point where you can't even say in a
        | courtroom "well, we have video of him doing this". The fact
        | that deepfakes exist will erode confidence in even those
        | things that are true. At the same time it will add additional
        | fake situations to the conversation.
        | 
        | Worst part is that even the AI-powered countermeasures are
        | going to fail eventually. The moment a computer knows what
        | gives away a hint that it is a deepfake and not real, a
        | computer can learn to not present that giveaway. The "good
        | guys" and "bad guys" will iterate with each other until it is
        | perfected.
        
         | usrusr wrote:
         | > This feels like the start of something very bad - they are
         | going to be upfront about the deepfakes... others are not.
         | 
          | Well, something really bad is definitely ahead with fake
          | videos (as if real videos with meaning-changing omissions,
          | made with the plain old cutting process, weren't bad
          | enough), and people with high visibility creating awareness
          | by doing it in the open is the closest thing to a defense
          | that we have. It's a hopelessly weak defense, but better
          | than nothing.
        
         | hugi wrote:
         | It's not bad, it's inevitable and it's nice we have some light
         | shed on it. Stupid people will no doubt try to "ban deepfakes",
         | but we are going to have to learn to live with them.
        
         | m_ke wrote:
          | I recommend watching Adam Curtis' documentary
          | Hypernormalisation to get a sense of how badly this can be
          | abused by political operatives. Especially this clip:
         | https://youtu.be/Y5ubluwNkqg
         | 
         | https://www.youtube.com/watch?v=-fny99f8amM
        
           | mistermann wrote:
           | There are a lot of valid and important ideas in Adam Curtis
           | (and many, many others) documentaries people could benefit
           | from exposing themselves to (Century of Self is another one
           | that I believe is a must watch for citizens of a democracy).
           | 
           | A weird thing about this thread, and others like it, is that
           | there seems to be this broadly shared implicit premise within
           | the thread context that the feared ill effects of new
           | technologies like this have not already been prevalent in our
           | societies and information ecosystems for decades. In a thread
           | about bias, fake news, propaganda, etc, people seem to have
           | no problem realizing and acknowledging that we already have a
           | very serious problem (often only visible in one's personal
           | outgroup, but that's better than nothing) - but when the
           | _specific_ topic of conversation is a new technology, the
            | majority of the comments seem to be written as if we don't
           | really have any significant issues currently. It seems as if
           | there's some sort of a phenomenon whereby the logical
           | methodology for evaluation of the situation changes according
           | to the topic, as opposed to there being a consistent
           | methodology that at all times has an explicit awareness of
           | the ever-present bigger picture.
           | 
           | Here's [1] an 8 minute video on Presidential debates. This
           | fairly well demonstrates how this aspect of our political
           | system is largely pure theatre...and yet, intelligent people
            | often speak (again, _depending on the specific(!) topic of
            | discussion_) as if this charade is a highly legitimate
            | process, within a larger political (and journalistic)
            | process that is also highly legitimate.
           | 
           | The way I view the ecosystem is that the vast majority of
           | things are to a very large extent ~fake (in whole or in
           | part). Cranking up the absurdity to 11 in classic South Park
           | style, making a complete mockery of both the politicians _as
            | well as those who can't consistently(!) conceptualize the
           | true nature of our system_, seems like an excellent response
           | to a situation that has been sorely in need of some good old
           | fashioned satirical mocking for decades. Western society &
           | politics lost the right to be taken seriously ages ago -
           | admitting to ourselves that there's a problem seems to me
           | like a prerequisite first step in fixing it.
           | 
           | [1] Winning the Presidency: Debating
           | 
           | https://www.youtube.com/watch?v=Ayz7zLn6uFU
        
             | m_ke wrote:
             | I'm much more concerned about the inverse, not people
             | getting fooled by fake content, but believing that anything
             | that doesn't align with their views is fake. We can end up
             | in a post truth society where someone like Trump, Duterte,
             | Bolsonaro or Orban can claim that any footage that is not
             | politically advantageous to them is fake and created by an
             | evil opposition to hurt their cause.
             | 
              | ex: Trump Access Hollywood tape -> "fake, a Soros
              | conspiracy to save the pedophiles"
             | 
             | We'll have a lot more people believing conspiracies like
             | the moon landing being fake and end up with a lot of 9/11,
             | JFK, holocaust, etc deniers.
        
               | mistermann wrote:
               | > I'm much more concerned about the inverse, not people
               | getting fooled by fake content, but believing that
               | anything that doesn't align with their views is fake. We
               | can end up in a post truth society where someone like
               | Trump, Duterte, Bolsonaro or Orban can claim that any
               | footage that is not politically advantageous to them is
               | fake and created by an evil opposition to hurt their
               | cause.
               | 
               | I believe the optimum approach is to be concerned with
               | all risks, and weigh the magnitude of each in a state of
               | careful self-monitoring of one's potential biases (and
               | ideally, have your conclusions reviewed by others,
               | preferably from a diversity of ideologies and
               | perspectives in an attempt to minimize the well known
                | effects of groupthink). Noteworthy to me is that a
               | significant number of people (if not the majority,
               | depending on which community you are in) are easily able
                | to see the epistemic errors in their outgroup's
                | thinking, but have more difficulty doing the same
                | within their ingroup.
               | 
               | For example, in your comment it seems that you have
               | noticed shortcomings when it comes to politicians of one
               | general ideology, but I wonder if you are of the belief
               | that this phenomenon _does not_ occur across all
               | ideologies?
        
         | snarfy wrote:
          | It's never perfected; just look to the animal kingdom for
          | examples. Mimicry and camouflage are a never ending
          | evolution.
        
         | javier123454321 wrote:
         | Yes, deepfakes are going to kill plausible deniability in audio
         | and video.
        
           | chrisseaton wrote:
           | > deepfakes are going to kill plausible deniability in audio
           | and video
           | 
           | Huh? Don't deepfakes do the exact opposite - make plausible
           | deniability in audio and video a much stronger argument?
           | 
           | Before if you denied a video of you was genuine that would
           | not be plausible. Now it would be plausible.
        
         | underdeserver wrote:
         | Well, others would be doing it anyway. Deepfakes are something
         | you can do at home, today (if you're willing to spend some time
         | and money on power to train them).
         | 
         | By being so open about it Stone and Parker will just increase
         | awareness of how easy it is to do, so people will know how
         | realistic these things can look and not blindly believe what
         | they see.
         | 
         | So, again, what these guys are doing is a true public service.
        
         | sbmassey wrote:
         | Maybe people will learn to no longer believe partisan
         | information sources, and organisations whose reputation is
         | based solely on the veracity of their content will predominate.
        
         | SpicyLemonZest wrote:
         | But you can say "we have emails where he admits to doing this"
         | in a court room, even though emails can be trivially faked. You
         | just have to provide information about how you got the emails
         | and how you know they're authentic. Is there a reason to expect
         | video will be more problematic as it becomes easier to fake?
        
           | harrisonjackson wrote:
           | Chain of custody is a good argument for protections within a
           | court room but court of public opinion is where the deep
           | fakes will really run wild.
        
         | njharman wrote:
         | > "well we have video of him doing this". The fact that
         | deepfakes will exist will erode confidence in even those things
         | that are true
         | 
          | That's good. Courts should not have high confidence in any
          | single piece of evidence. It can all be manufactured, and
          | could be long before now. It's just (too) hard to
          | manufacture it, avoid detection, and get the larger number
          | of people required to "go along" with it when you have to
          | manufacture multiple corroborating pieces.
        
         | conception wrote:
          | Maybe... photoshop detection hasn't gone down that path as
          | far as I know. I would imagine evading detection in video to
          | be an order of magnitude harder?
        
           | bogwog wrote:
            | True, but photoshop is usually done by hand. An AI-generated
            | deepfake could eventually be trained so that it outputs
            | videos that are indistinguishable from the real thing.
        
             | thebean11 wrote:
             | The detection side also gets the benefits of AI though. I
             | can't see the generation side getting that far ahead.
        
               | AnthonyMouse wrote:
               | > I can't see the generation side getting that far ahead.
               | 
               | How far ahead do they need to be?
               | 
               | Suppose that it's cat and mouse, at least initially.
               | Every six months someone comes up with a new way to
               | detect the best known deepfakes, then six months after
               | that there is a new way to evade that means of detection
               | as well.
               | 
               | Someone drops a deepfake five weeks before an election.
        
               | Gh0stRAT wrote:
               | From what I understand though, generators always have an
               | advantage because the generator is allowed to "see" the
               | discriminator's gradients during training. [0]
               | 
               | >The model for the discriminator is usually more complex
               | than the generator (more filters and more layers) and a
               | good discriminator gives quality information. In many GAN
               | applications, we may run into bottlenecks where
               | increasing generator capacity shows no quality
               | improvement. Until we identify the bottlenecks and
               | resolve them, increasing generator capacity does not seem
                | to be a priority for many practitioners. [1]
               | 
               | Put another way: GAN training ends when the discriminator
               | can no longer meaningfully distinguish real from fake. By
               | definition then, the best generator will have no useful
               | discriminator that can distinguish its output from real
               | data. (conversely, if you did have such a discriminator,
               | you could use it to train a better generator)
               | 
               | [0] https://developers.google.com/machine-
               | learning/gan/generator
               | 
               | [1] https://towardsdatascience.com/gan-ways-to-improve-
               | gan-perfo...
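The "good guys"/"bad guys" iteration described in this thread is exactly how a GAN trains. As a toy sketch of the adversarial idea only (one-dimensional data, hand-derived logistic-regression gradients; nothing resembling a real deepfake model), a one-parameter "generator" can learn to match "real" data by following the gradient signal that flows through the "discriminator":

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def real_batch(n):
    # "Real" data: Gaussian centered at 4.0
    return rng.normal(4.0, 0.5, n)

theta = 0.0      # generator parameter: fake sample = theta + noise
w, b = 0.0, 0.0  # discriminator: D(x) = sigmoid(w*x + b)
lr = 0.05

for step in range(3000):
    n = 64
    x_real = real_batch(n)
    z = rng.normal(0.0, 0.5, n)
    x_fake = theta + z

    # Discriminator ascent step: maximize log D(real) + log(1 - D(fake))
    d_real = sigmoid(w * x_real + b)
    d_fake = sigmoid(w * x_fake + b)
    w += lr * (np.mean((1 - d_real) * x_real) - np.mean(d_fake * x_fake))
    b += lr * (np.mean(1 - d_real) - np.mean(d_fake))

    # Generator ascent step: maximize log D(fake). Crucially, the
    # gradient d/dtheta log D(theta + z) = (1 - D) * w flows through
    # the discriminator -- the generator "sees" what gives it away.
    d_fake = sigmoid(w * (theta + z) + b)
    theta += lr * np.mean((1 - d_fake) * w)

print(f"theta={theta:.2f} w={w:.2f}")  # theta drifts toward the real mean
```

At equilibrium the generator's outputs match the real data distribution, at which point no discriminator setting gains an edge, which is the point the quoted passage makes about training ending when the discriminator can no longer meaningfully distinguish real from fake.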
        
         | cm2187 wrote:
         | How is that different from photoshop? Today no one trusts a
         | photo, as even a teenager can do a decent fake on a smartphone.
         | Now the same for audio and video. Doesn't change our world.
        
           | wcarron wrote:
            | What? There are obvious examples where faked video can have
            | a serious effect. What kind of solipsistic nightmare are we
            | slipping into?
        
             | cm2187 wrote:
              | Because people still have some faith in videos. Deepfakes
              | will remedy that very quickly.
        
               | alcover wrote:
               | I fear they won't. Some crowds only need a spark to
               | explode. They won't wait for counter-evidence. They just
               | need a pretext.
        
         | metalliqaz wrote:
          | The synthetic media apocalypse is even worse than these
          | examples. The ability of liberal democracy to exist at all
          | is highly in doubt. We need serious leadership to be
          | attacking this problem head-on. Instead, I'm pretty sure
          | that our current administration (in the US but also
          | elsewhere) only sees it as a welcome propaganda tool.
        
           | im3w1l wrote:
           | Wouldn't surprise me if society gets completely blindsided.
           | As corona has shown, the people who know lack power, and the
           | people with power don't give a shit.
        
         | DenisM wrote:
         | It's been exactly like that with newspapers and eyewitness
         | accounts for hundreds of years. We survived.
        
         | murzy wrote:
         | > The moment a computer knows what gives away a hint that it
         | is a deepfake and not real, a computer can learn not to
         | present that giveaway. The "good guys" and "bad guys" will
         | iterate with each other until it is perfected.
         | 
         | You just described a GAN (generative adversarial network)!
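         | 
         | A toy sketch of that arms race (nothing like a real GAN's
         | neural networks - just the adversarial loop in miniature):
         | the detector keys on a single giveaway statistic, and the
         | forger iterates until the giveaway is gone. All names here
         | are made up for illustration.

```python
import random

random.seed(0)

def real_sample():
    # "Real" data: values clustered around 10.0.
    return random.gauss(10.0, 1.0)

class Forger:
    """Stand-in for the generator: produces fakes, erases giveaways."""
    def __init__(self):
        self.mean = 0.0  # starts out producing obvious fakes

    def sample(self):
        return random.gauss(self.mean, 1.0)

    def adapt(self, giveaway):
        # Shift output toward whatever the detector says gives fakes away.
        self.mean += 0.5 * giveaway

class Detector:
    """Stand-in for the discriminator: finds one telltale statistic."""
    def giveaway(self, fakes, reals):
        # The "hint": fakes differ from reals in their average value.
        return sum(reals) / len(reals) - sum(fakes) / len(fakes)

forger, detector = Forger(), Detector()
for _ in range(50):
    fakes = [forger.sample() for _ in range(200)]
    reals = [real_sample() for _ in range(200)]
    forger.adapt(detector.giveaway(fakes, reals))

# After iterating, the forger's output matches the real distribution
# and the detector's giveaway statistic no longer separates them.
print(abs(forger.mean - 10.0) < 0.5)  # True
```

         | In a real GAN both sides are neural networks trained by
         | gradient descent, but the loop is the same shape.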
        
         | captainmuon wrote:
         | Actually, I'm a bit more optimistic about this.
         | 
         | One thing that worries me is that everything is becoming
         | immutable, saved forever, and digitally signed. Hard to claim
         | you didn't post something if your account is super secure. You
         | can't go anywhere without cameras taking your picture. And if
         | someone steals your bitcoin or your smart contract has a
         | mistake, your money is gone. You can't argue with an algorithm.
         | And the internet never forgets. (Well, unless it is
         | inconvenient for you, then Murphy's law applies ;-))
         | 
         | There are many such phenomena but I'd say they are all related:
         | They mean you have less wiggle room for mistakes, or social
         | deviance. If you are in a situation where you have to break a
         | law, or if you are just having an affair, chances are someone
         | or something is going to see you. There is no anonymity in the
         | crowd anymore, on the contrary.
         | 
         | But this deep fake technology, if it really evolves to be
         | undetectable, can be liberating, if it erodes the trust in
         | pictures. At least the social control mechanisms based on
         | cameras are going to stop working.
         | 
         | We were living in a very specific phase of human history, where
         | we learned how to produce pictures from real scenes, but
         | haven't learned yet to easily fake them. I just think this is
         | going to come to an end, and we'll have to adapt. (I hope we'll
         | adapt socially, and not just by cramming DRM into our
         | technology like we tend to do; but that is a different topic.)
        
           | jackfoxy wrote:
           | Not everything is immutable, or even becoming immutable.
           | Content is merely what endpoints serve. Unless extraordinary
           | care is taken, like publishing hashes, the response from the
           | endpoints can change over time. Lots of content has changed,
           | for instance on Wikipedia. In theory you could wade through
           | the wiki history. I don't know if that is protected by
           | hashing or not. Lots of content gets shoved down the memory
           | hole.
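           | 
           | The "publishing hashes" idea in miniature: pin a digest of
           | the content you archived, and a later comparison reveals
           | whether the endpoint now serves something different. The
           | payloads below are made up for illustration.

```python
import hashlib

def digest(content: bytes) -> str:
    # SHA-256 of the raw bytes; any change to the content changes this.
    return hashlib.sha256(content).hexdigest()

# At archive time, record what the endpoint returned alongside its hash.
snapshot = b"<html>original article text</html>"
pinned = digest(snapshot)

# Later, fetch the endpoint again (stubbed here) and compare.
later_response = b"<html>quietly edited article text</html>"

print(digest(snapshot) == pinned)        # True: unchanged copy verifies
print(digest(later_response) == pinned)  # False: content was changed
```

           | The hash only helps if it was published somewhere the
           | endpoint's operator can't also quietly rewrite.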
        
           | notsuoh wrote:
           | Indeed. We went from a world where nothing was recorded
           | outside of very fallible memory, to one where everything is
           | recorded, to perhaps one where even though everything is
           | recorded, there's enough deniability via deep fakes that it
           | doesn't matter as much that things were recorded.
        
         | autumn_unlaces wrote:
         | We'll know our disinformation program is complete when
         | everything the American public believes is false.
        
         | acover wrote:
         | Would signing the image in hardware alleviate some of the
         | problem? It would create a high cost to faking.
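         | 
         | A minimal sketch of what camera-side signing could look
         | like, assuming (hypothetically) a per-device key held in a
         | secure element. A real scheme would use an asymmetric
         | signature (e.g. Ed25519) so verifiers never hold the secret;
         | HMAC stands in here only to keep the sketch stdlib-only.

```python
import hashlib
import hmac
import os

# Hypothetical per-device key; in hardware this would never leave the
# camera's secure element.
DEVICE_KEY = os.urandom(32)

def sign_frame(frame: bytes) -> bytes:
    # Hash the frame, then authenticate the hash with the device key.
    frame_digest = hashlib.sha256(frame).digest()
    return hmac.new(DEVICE_KEY, frame_digest, hashlib.sha256).digest()

def verify_frame(frame: bytes, tag: bytes) -> bool:
    # Constant-time comparison to avoid timing side channels.
    return hmac.compare_digest(sign_frame(frame), tag)

original = b"raw sensor data for frame 0001"
tag = sign_frame(original)

print(verify_frame(original, tag))                 # True: untampered
print(verify_frame(b"deepfaked frame 0001", tag))  # False: edited after capture
```

         | As the comment says, this only raises the cost: a leaked
         | device key, or a genuine signing camera pointed at a fake,
         | still defeats it.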
        
           | b0rsuk wrote:
           | Maybe we should just go back to analog video?
        
         | staticautomatic wrote:
         | I've said it here before and I'll say it again:
         | 
         | Audio and video evidence isn't admissible because it's audio or
         | video (and may be inadmissible nonetheless). It's admissible
         | because someone testifies under oath that they have personal
         | knowledge of its provenance. The burden is on the party
         | introducing the evidence to show that it's reliable, and the
         | question of whether or not it is indeed reliable is a factual
         | one for a judge or jury to answer. It's not assumed to be
         | "true" or to accurately reflect reality just because it's a
         | purported photo, video, or audio recording.
        
           | jimkleiber wrote:
           | I appreciate you saying this, I hadn't known that. I think
           | therefore it may be OK in the court of law but what about in
           | the court of public opinion? I could see videos like this
           | easily eroding our trust in the judicial system even further.
        
             | diegoperini wrote:
             | It's probably not going to happen within a single
             | generation, but as has always happened with new technology
             | throughout the centuries, people who are born into this
             | new normal will know how not to be fooled.
        
               | germinalphrase wrote:
               | Will they be able to? Widespread cynicism and distrust
               | feels more likely.
        
           | mrfox321 wrote:
           | But the jury are humans. Should we be confident that a jury
           | of 12 (if in US) can pretend that _really_ convincing
           | deepfakes should not be trusted? Especially when this form of
           | evidence has been trustworthy for a decent chunk of time?
        
             | staticautomatic wrote:
             | If no one can testify about the video's provenance, then it
             | simply won't be admitted into evidence. If someone commits
             | perjury in order to get the video admitted, the jurors will
             | be the least of the problems.
        
               | iso1210 wrote:
               | People commit perjury all the time
        
           | lkxijlewlf wrote:
           | This works in a courtroom. This won't work for a politician
           | who retweets a faked video that then incites real violence.
           | And _that_, at least for now, is where we truly need to be
           | concerned.
        
             | AnthonyMouse wrote:
             | > This works in a courtroom.
             | 
             | Does it though?
             | 
             | Suppose there is a theft at a company. The police go to the
             | company's security team and get the surveillance footage.
             | Presumably admissible.
             | 
             | If it was an inside job, the surveillance footage could be
             | a deepfake showing someone else committing the crime. Or
             | maybe it's real surveillance footage. Without some way to
             | distinguish the two, how do you know?
        
               | ethbr0 wrote:
               | The interaction of our current legal system with SotA
               | deepfakes seems terrifying.
               | 
               | To rip from current headlines:
               | https://www.justice.gov/usao-wdpa/pr/erie-man-charged-
               | arson-...
               | 
               | (Leaving aside the merits of the case / opinions, and
               | just using it as an example)
               | 
                | _"... the [Facebook Live & coffee shop] videos depict a
               | male - with distinctive hair to the middle of his back
               | wearing a white mask, white shirt, light blue jean
               | jacket, black pants with a red and white striped pattern
               | down the side and red shoes - setting a fire inside of
               | Ember + Forge."
               | 
               | "A review of additional Facebook public video footage
               | from the area of State Street near City Hall in Erie on
               | the evening of May 30, 2020, shows the same individual
               | without the mask but wearing identical clothing and
               | shoes. The subject's face is fully visible in this video
               | footage."_
               | 
               | And that's a federal arson charge (min 5 years, max 20
               | years prison).
        
               | Camas wrote:
               | And a witness might lie. How will we manage?
        
               | AnthonyMouse wrote:
               | Frequently by incarcerating innocent people.
        
             | zests wrote:
             | We already have plenty of politicians who misquote or lie
             | to sow division and therefore violence. What's new here?
        
               | scarmig wrote:
               | The argument would be that we have social antibodies to
               | typical politician lies, but we've not developed the same
               | antibodies to deepfakes. Since political deepfakes are
               | inevitable IMO, the question is how can we accelerate the
               | development of these needed social antibodies.
        
             | keiferski wrote:
             | The onus of responsibility is still on the person doing the
             | violence. Contrary to the fearmongering, I'd say deepfakes
             | will just make people not believe things they read on the
             | Internet (again), especially as the technology becomes more
             | widespread.
        
               | scarmig wrote:
               | Where the ultimate onus of responsibility falls doesn't
               | matter much when you've got a bunch of Rohingya hanging
               | from trees because some politician retweeted a fake
               | video.
        
               | keiferski wrote:
               | If you follow it through, that line of argument makes no
               | sense. If the video weren't fake, would that somehow make
               | the murder of Rohingya acceptable? Of course not.
               | 
               | Ultimately you end up back at the beginning: holding
               | people, and not pieces of information, accountable for
               | their actions. That's an issue with civil society, not
               | social media.
        
               | scarmig wrote:
               | Deepfakes have the potential to exacerbate existing
               | issues in civil society. If your husband has just been
               | murdered because of a riot instigated by a deepfake, it's
               | not a great comfort to know that it's not really the
               | "fault" of the deepfake but instead the "fault" of a
               | broken civil society.
               | 
               | Plus, deepfakes have the potential to create much more
               | noxious videos than real life. Real life videos depict
               | real life people, flaws and complexities all included;
               | deepfakes will be constructed to depict representations
               | of the target that're most likely to generate inchoate
               | rage of the mob. It's rare that the former happens to be
               | exactly the latter.
        
               | Nasrudith wrote:
               | Well the world does not operate based upon what is
               | comforting. I thought that it was obvious even before
               | COVID-19 but people keep on missing this.
               | 
               | Why not blame the person who confirmed your husband's
               | death at this point if accepting comfortable illusions of
               | someone easy to punish is what we are doing? Without them
               | you would still have some remote hope of survival!
               | 
               | The first step of solving a problem is recognizing the
               | actual problem - what we find comforting is only a
               | distraction for rationalization purposes. Better to
               | recognize that microscopic organisms caused the crop
               | failure instead of burning an old woman as a witch.
        
               | scarmig wrote:
               | Burning a woman as a witch doesn't build any kind of
               | incentive structure to prevent future crop failures.
               | Punishing politicians who incite violence creates the
               | obvious incentive structure for politicians not to incite
               | violence, which results in less violence. This applies
                | both to deepfakes and to calls to violence in general.
               | 
               | Politicians don't have a particular right to remain in
               | office despite inciting violence, even if it gives them
               | the sads if they face repercussions.
        
               | alcover wrote:
                | > If the video weren't fake, would that somehow make
                | the murder of Rohingya acceptable?
                | 
                | Your answer goes astray. What is at stake is evident:
                | one can now provoke riots and deaths from thin air (and
                | bits).
        
               | mypalmike wrote:
               | Genocides happen because of propaganda. The planners and
                | financiers and propagandists don't tend to kill with their
               | own hands. But they command massive acts of violence.
               | 
               | Just one example: A radio station (RTLM) in Rwanda laid
               | the groundwork of genocide by dehumanizing Tutsis, among
               | other things by referring to them as "cockroaches"
               | repeatedly. The station was deemed instrumental in the
               | resulting murders of at least half a million people.
               | 
               | It's not fear mongering to look at the facts of history
               | and see that propaganda is a primary way that violence
               | scales. Deep fakes are another tool which will certainly
               | be used towards these same ends. What propagandist would
               | _not_ want to use such a tool?
        
               | dragonwriter wrote:
               | > It's not fear mongering to look at the facts of history
               | and see that propaganda is a primary way that violence
               | scales.
               | 
               | Propaganda is a primary way that _human action and
               | organization_ , whether violent or not, for good or ill,
               | scales.
        
               | b0rsuk wrote:
               | So people will believe what they _choose to believe_. A
                | populist's wet dream. Truly nothing to be afraid of!
        
             | munk-a wrote:
             | It doesn't work right now due to the partisan divide in
             | America. Spreading misinformation to help your party's
             | cause isn't currently looked down on - so, honestly, having
             | deepfakes out there might not hurt much and might at least
             | help reduce the weight that blatantly doctored videos carry
             | with the general populace.
        
               | [deleted]
        
         | optymizer wrote:
         | I think it'll be similar to the situation with pictures. We've
         | had the ability to 'photoshop' people's faces in and out of
         | images for a long time now.
         | 
         | I think the first question asked of a video is going to be "is
         | this a deep fake?" just like we currently ask "is this picture
         | photoshopped?".
         | 
         | It doesn't mean that we no longer use pictures, or that we
         | don't alter pictures, just that we are more critical of them.
         | 
         | That said, there is just more realism in a series of moving
         | pictures, but I don't see why the situation for a series of
         | fake pictures has to be wildly different from the situation of
         | a single fake picture.
        
           | sandworm101 wrote:
           | The issue is cost. Proving that an image is or isn't
           | photoshopped isn't all that difficult/expensive. Anyone well
           | versed in digital imagery can use basic tools to expose
           | 99.99% of photoshopped images. But moving pictures are
           | different. Due to compression artifacts and the overall lower
           | quality of each frame there is much more room for opinion. An
           | answer one way or another may be possible but it will be a
           | much greater and more expensive battle of experts.
        
             | Tenoke wrote:
             | If anything it's the opposite and it's much harder to fake
             | videos without leaving traces.
        
           | evan_ wrote:
           | In just the last couple hours someone has tweeted a picture
           | of Biden from June 2019 not wearing a facemask, and claiming
           | it's from more recently and he's being irresponsible by not
           | wearing a facemask. It has tens of thousands of retweets, no
           | deep fake necessary.
        
         | rexpop wrote:
         | > you can't even say in a court room "well we have video of him
         | doing this".
         | 
         | Good. We should dispense with retributive justice, and replace
         | it with restorative, transformative systems which are a noop on
         | the wrongly accused.
        
         | woah wrote:
         | Who cares? Photographic evidence has barely existed for 100
         | years, and it's been frequently doctored the whole time.
        
         | DennisP wrote:
         | So there are two problems: people not trusting true videos, and
         | people trusting fake videos. I think the second problem is
         | probably worse, and this satire will at least help with that.
        
         | at-fates-hands wrote:
         | > It's going to get to the point where you can't even say in a
         | court room "well we have video of him doing this". The fact
         | that deepfakes will exist will erode confidence in even those
         | things that are true. At the same time it will add additional
         | fake situations to the conversation.
         | 
         | You cannot overestimate the global political ramifications of
         | this. If you think the amount of video manipulation was bad
         | enough this cycle, in the 2020 presidential election, imagine
         | a few years from now, when videos are put out with impunity
         | all over social media by political operatives and people with
         | nefarious agendas.
         | 
         | This sort of thing absolutely terrifies me since you can start
         | to twist reality to whatever you want and influence people in
         | ways we never thought possible. I really feel like the genie is
         | out of the bottle and this has the potential to become a very
         | dangerous tool for people with bad intentions.
        
           | m_ke wrote:
           | It's worse than that, politicians will be able to credibly
           | claim that anything that's not advantageous to them is fake.
           | We'll end up in a world where a tin pot dictator like Trump
           | can go out and say that anything he said or did in the past
           | was not real.
        
           | cm2187 wrote:
           | The other way round. Any shocking video trending on twitter
           | will start with a -100 credibility as people will think "just
           | another deepfake".
           | 
           | You are assuming people will have never heard of these
           | deepfakes. Sure there are still a few grannies today who have
           | never heard of photoshop. But your average twitter user?
        
             | mwigdahl wrote:
             | I think that's overoptimistic. A shocking video trending on
             | Twitter that reinforces some population's existing beliefs
             | and biases about the subject will find a ready audience of
             | believers.
             | 
             | Every piece of controversial video content will have a
             | group loudly screaming "TRUTH!" at the same time as another
             | group screaming "FAKE!", and it will just amplify the
             | social polarization we're already experiencing.
        
       | dirtyid wrote:
       | Can't wait for a pipeline from written scripts to procedurally
       | generated deepfake videos.
        
       | hospadar wrote:
       | This is SO GREAT!
       | 
       | I often feel deep existential dread about deepfakes and the lag
       | between when they are viable (now) and when everyone stops
       | trusting video that doesn't have some kind of rock-solid
       | signed/encrypted/testified provenance (years from now or never?).
       | 
       | What better way to educate the public about deepfake technology
       | than with over-the top satire?
        
         | cm2187 wrote:
         | Agreed, video has always been a deeply manipulative medium:
         | editing, camera angles, music. A voice-over tells you
         | something, and somehow, because you are watching the video
         | (and what you watch is real), the voice-over becomes the
         | truth as well.
         | 
         | It will introduce a healthy dose of skepticism to that media.
        
           | Bluestein wrote:
           | We might as well ask Kuleshov about this - the father of
           | the Kuleshov effect, who ran his experiments between 1910
           | and 1920 ...
           | 
           | In the - now famous - film below, Kuleshov edited a shot of
           | the expressionless face of an actor, which was alternated
           | with various other shots (a bowl of soup, a girl in a coffin,
           | a woman on a divan).
           | 
           | - https://www.youtube.com/watch?v=_gGl3LJ7vHc
           | 
           | When the clip was shown to an audience, they believed that
           | the expression on the protagonist's face was different each
           | time he appeared, depending on whether he was "looking at"
           | the bowl of soup, a girl in the coffin, or the woman on the
           | divan, showing an expression of hunger, grief, or desire,
           | respectively.
           | 
           | Of course, the footage of actor Ivan Mosjoukine was actually
           | the same shot each time.
           | 
           | The audience even went on to rave about the actor's
           | "performance" ... "the heavy pensiveness of his mood over the
           | forgotten soup, [they] were touched and moved by the deep
           | sorrow with which he looked on the dead child, and noted the
           | lust with which he observed the woman".
           | 
           | The point is that, given the audiovisual medium, a certain
           | degree of manipulation, or "intent" on the part of those
           | creating the work is always to be expected ...
           | 
           | ... and, also, that audiences bring their own baggage when
           | they view something.
        
         | hutzlibu wrote:
         | "I often feel deep existential dread about deepfakes and the
         | lag between when they are viable (now)"
         | 
         | But they should still be spottable as fakes, if examined.
         | 
         | Related: an MIT project to build an AI to spot deepfakes
         | (which also covers ways to spot deepfakes as a human):
         | 
         | https://www.media.mit.edu/projects/detect-fakes/overview/
        
         | gsich wrote:
         | Digital signatures do exist.
        
         | rossjudson wrote:
         | I couldn't agree with you more. It probably _is_ the best way
         | to popularize what deep fakes can really do.
         | 
         | I also agree on video provenance/signing. Perhaps signed video
         | will be something devices do by default in the future. If there
         | are multiple devices recording, we can probably find a way to
         | cross-check. Actually, that probably also exists. ;)
        
           | seg_lol wrote:
           | Multicamera deep fakes are already a thing. I just saw some
           | patches for injecting the same hardware codec fingerprints
           | into the synthesized video so that one appears to come from a
           | samsung and the other an iphone 11.
        
         | 0xffff2 wrote:
         | >years from now or never?
         | 
         | I really don't think this is true. I hear stories about
         | deepfakes on NPR pretty regularly. It would be very strange if
         | a subject that has managed to gain some mainstream media
         | traction was never internalized by a large percentage of the
         | general public.
        
           | cacois wrote:
           | NPR listeners != large percentage of the general public.
           | 
           | Also, people believe what they see, on an instinctual level.
           | Hell, people believe what they hear, mostly. It will take a
           | lot of education and time to untrain that.
        
             | 0xffff2 wrote:
             | I try to avoid consuming too much news, so I don't have
             | much exposure to other news sources (and I only probably
             | get an hour or two of NPR programming per week). Given the
             | level of coverage I've heard on NPR, I would be pretty
             | shocked if the subject hasn't received _some_ coverage by
             | most major media outlets.
        
           | [deleted]
        
         | echelon wrote:
         | I agree 100%. Deepfake hysteria is absurd.
         | 
         | I'm the author of https://vo.codes, and my goal has been to
         | make deep fakes as accessible as possible so people become
         | familiar with them.
         | 
         | You know what people use vo.codes for? Memes. That's it.
         | 
         | A number of journalists have decided the technology is entirely
         | confusing and deceitful. While there are new risks posed by
         | deep fakes, I believe the potential for good far outweighs the
         | bad. Today I see it being used mostly for artistic purposes and
         | for humor. The real threat is social media and the soapbox
         | amplification and attention algorithm, not deep fakes.
         | 
         | Deep fakes enable creatives to make this kind of stuff (most of
         | these are made with vocodes) :
         | 
         | https://www.youtube.com/watch?v=dSgd4PoQofQ
         | 
         | https://www.youtube.com/watch?v=iBpqJF5LXX4
         | 
         | https://www.youtube.com/watch?v=3qR8I5zlMHs
         | 
         | https://twitter.com/RodanSpeedwagon/status/13172304947156213...
         | 
         | https://twitter.com/twpl_logan/status/1306542488694460416
         | 
         | https://youtu.be/mS0v2zbjtTg
         | 
         | vtubers are using it to give themselves new voices, which is
         | amazing:
         | 
         | https://www.youtube.com/watch?v=LCKcrSrPYcw
         | 
         | As the technology improves, it will become disruptive and
         | enable average people to join the ranks of Hollywood actors,
         | directors, music producers, and vocalists. The future isn't
         | A-list celebrities and grammy winners, it's people like you and
         | me.
         | 
         | The people stoking the anti-deepfake fire are the media. They
         | claim that this is the end of trust and authenticity, despite
         | the fact that classical methods of trickery and deceit are
         | already way more common. Where are their doomsday articles
         | about catfishing, phishing, and social engineering? There
         | aren't any, because it isn't exciting.
         | 
         | This technology doesn't really move the needle for deception
         | when you can simply slow a video of Nancy Pelosi to half
         | speed and claim that she's drunk.
         | 
         | Deepfakes are just the next wave of photoshop. People don't use
         | photoshop to steal elections and win court cases. They use it
         | to make memes. It's the same deal with deepfakes.
         | 
         | The technology is going to improve. And I'm sure the hysteria
         | will increase in volume too.
         | 
         | Real time voice conversion and video generation is going to be
         | cited as "scary", but it'll mostly be a hit with gamers and
         | vtubers. It's not going to be a "Mission Impossible"-style
         | espionage tool. It's going to be used in good humor and to good
         | effect.
         | 
         | These tools are going to bring about a new media renaissance.
         | They'll let the small players compete with the giants. That's
         | not scary - it's exciting.
         | 
         | That's also what I'm working on for my startup: memes today,
         | hollywood / old media disruption tomorrow.
         | 
         | I've already got insane growth (millions of requests a week,
         | and our videos have hundreds of thousands of YouTube views).
         | I'm wondering when to pull the trigger and start hiring people.
         | (My users are already working with me to build more!)
         | 
         | I'm super excited by this field, and you should be too. The
         | media is screaming fire, but I'm running at it with full speed.
         | I see the magic and the amazing opportunity.
        
           | hutzlibu wrote:
           | "You know what people use vo.codes for? Memes. That's it."
           | 
           | Not really. Browsing the dark corners of 4chan and reddit,
           | you'll often find people posting pictures of real people and
           | asking if someone can put them into a deepfake.
        
           | LinuxBender wrote:
           | _As the technology improves, it will become disruptive and
           | enable average people to join the ranks of Hollywood actors,
           | directors, music producers, and vocalists. The future isn't
           | A-list celebrities and grammy winners, it's people like you
           | and me._
           | 
           | Just my two cents, but acting is not as simple as wearing
           | someone's face. If deepfakes make me look like an A-list
           | actor, that alone will not get me a lead role in a big budget
           | movie. I would still require acting skills.
        
             | echelon wrote:
             | Do you know how many skilled, under-appreciated actors
             | there are in the world? The economics of production create
             | limited headroom at the top. You'd be shocked at how many
             | talented actors, directors, and writers there are that go
             | undiscovered and under-utilized.
        
               | seg_lol wrote:
               | Screenwriters will be able to make entire movies using
               | literally only their voice. I tell a story to a computer,
               | it will "imagine" the entire thing.
               | 
               | And after that, the computer will make the stories and we
               | will watch, at which point the Drake Equation takes over.
        
               | MaximumYComb wrote:
               | "Hey Google, make me a 90 minute movie about X where Y
               | also happens but put in some plot twists"
        
               | echelon wrote:
               | More or less bingo. It might be a few decades, but this
               | is where we're headed. And I'd rank this as more certain
               | and profitable than self-driving cars.
               | 
               | I don't think Disney will keep pace. This trend will
               | cannibalize their IP and this level of tech competence
               | isn't in their DNA.
        
           | atharris wrote:
           | As a corollary - how many people already spread inaccurate or
           | misleading video clips, sound bites, etc without this
           | technology, and how many already refuse to believe real ones
           | produced by the 'lying news media?' I think the hysteria is
           | completely misplaced, and the extremely polarized media
           | landscape is itself to blame.
        
           | Igelau wrote:
           | > A number of journalists have decided the technology is
           | entirely confusing and deceitful.
           | 
           | It's not the journalists I'm worried about. It's the
           | advertisers.
           | 
           | > Deepfakes are just the next wave of photoshop. People don't
           | use photoshop to steal elections and win court cases. They
           | use it to make memes. It's the same deal with deepfakes.
           | 
           | You say that now. By 2024 we'll be getting served political
           | ads depicting "Person who looks like my cousin" in a riot,
           | "Child that looks so much like mine" being shoved into the
           | backdoor of a pizza parlor, or "Sad sack that looks like me"
           | standing in an unemployment line. Fairly certain we all
           | signed away the permission to use our likenesses in the
           | various TOS.
           | 
           | It's also going to be a whole new vein for bullying, e.g.
           | "Goofus hates Gallant. Most kids hate Gallant. Goofus posts
           | low-grade deepfakes of Gallant dying and committing acts of
           | self-harm. The bodies and the hair don't match at all since
           | the source GIFs are from movies and tv, but it's definitely
           | Gallant's face. Goofus gets a short term dopamine burst from
           | his fake internet points as his peers pluslike and cross-
           | post. One day Gallant decides, 'maybe they're right'"
           | 
           | I know it sounds like panicky, theatric, Black Mirror script
           | stuff, but there are no missing pieces to keep either of
           | these from being a button click away. It just might not be
           | quite cheap enough _yet_.
        
           | kube-system wrote:
           | > People don't use photoshop to steal elections and win court
           | cases.
           | 
           | People definitely attempt to do both of these things.
           | 
           | While we definitely can't (and certainly shouldn't) stop this
           | tech from existing, there is still an important civic and
           | academic need to address deep-fakes in conversations about
           | media literacy. People should understand how to analyze the
           | reputability of media regardless of the type. It doesn't
           | matter if they're watching a video or reading a book.
        
           | eggsmediumrare wrote:
           | The problem I'm most afraid of isn't people actually using
           | deepfakes for nefarious purposes (although I think you are
           | not nearly concerned enough about that... It would be very
           | easy for Russia or China to alter local elections by making a
           | grainy video of a candidate appearing to smoke crack a la Rob
           | Ford, for example). I'm afraid of people internalizing the
           | idea that you can't trust video at all. I agree with your
           | criticism of the media on this, but from a slightly different
           | angle... They are spreading that exact idea. True or not,
           | it's a dangerous idea.
        
           | rexpop wrote:
           | > It is difficult to get a man to understand something, when
           | his salary depends on his not understanding it.
           | 
           | -- Upton Sinclair
        
             | echelon wrote:
             | I'm not sure if this is directed at me or the media.
             | 
             | If it's at me, I could just as well be building anything
             | else. I find this technology fascinating, and I see an
             | almost magical future ahead where we can tweak sensory
             | input and play it like an instrument.
             | 
             | It's the closest we've come to building our dreams. The
             | possibility of _The Matrix_ made more real, and bent to
             | our own desires.
             | 
             | This stuff is going to sink Hollywood and replace it with
             | an improvement at least an order of magnitude more
             | imaginative.
             | 
             | So it's not that I'm letting personal interest or profit
             | motives cloud my judgment. I think this is truly
             | revolutionary, and I don't understand why others don't see
             | the same glittering and fantastical future.
             | 
             | They're too afraid of the demons to build the cool thing.
        
           | talentedcoin wrote:
           | Sorry, but if the "new media renaissance" is SpongeBob
           | singing WAP, count me out. I don't think it's exciting, I
           | think it's lame.
           | 
           | But you're right, there's always money to be made in making
           | the Internet an even dumber place than it is already. Have
           | fun!
        
             | echelon wrote:
             | That's such a narrow-minded, salt vinegar take.
             | 
             | How many leaps away are we from 10 year olds making their
             | own Star Wars movies? Not many, I posit. And I think that
             | many of them can and will do better than George Lucas.
             | 
             | This technology is going to give so many more people the
             | ability to create. As we begin to automate the tedious jobs
             | and industries, it's important we have something fulfilling
             | and engaging for people to move to. The creative field is
             | rewarding and leads to self-growth and entrepreneurism.
             | 
             | The future is going to be a Cambrian explosion of
             | creativity and expression. Look at YouTube, TikTok, and
             | Patreon. Imagine what more tooling will do for these folks.
             | Brains are teeming with ideas and imagination, but they
             | often don't have the resources to breathe life into things
             | imagined - with this next round of tech, we're going to
             | change that.
             | 
             | Conversely, the concentration of wealth and production
             | value at the top (entities like SpongeBob and Cardi B) will
             | erode once everyone has the ability to generate character
             | designs, animation, music, lyrics. More money will pump
             | into the system, and it'll spread more evenly.
             | 
             | This is the Internet / Smart Phone revolution all over
             | again.
        
               | talentedcoin wrote:
               | We don't have the same idea of the meaning behind the
               | words "create" or "creativity".
               | 
               | TikTok is exactly the kind of thing I'm talking about --
               | you just want to create another vector for even more
               | soulless, time-wasting popularity contests.
               | 
               | What you're describing to me sounds like a technological
               | way for adult children to play brand-promoting dress-up
               | inside the already-worn-out shell of pop culture, a way
               | to infinitely recombine the old without actually creating
               | anything new or interesting.
               | 
               | Call it a narrow-minded take if you like. I'm sure you'll
               | be laughing all the way to the bank, so who cares what I
               | think? It'll be a hit on 4chan.
        
         | phailhaus wrote:
         | > What better way to educate the public about deepfake
         | technology than with over-the top satire?
         | 
         | There is a sizable group of people who do not know that The
         | Colbert Report was satire. The problem with satire is that
         | there will always be a group of people who think it's real,
         | it's like a corollary to Poe's Law.
         | 
         | Source: Research: Conservatives believe Colbert isn't joking
         | https://www.cnet.com/news/research-conservatives-believe-col...
        
           | malandrew wrote:
           | Yup. The same goes the other direction. The Babylon Bee has
           | been fact checked a bunch of times because some liberals
           | think it is fake news not satire. Whether it's a few, some,
           | many, whatever is debatable, but seems like a general rule
           | that people generally don't perceive satire to be satire when
           | it doesn't match their preferred consumer choice of
           | gaslighting mainstream media. It's hardly specific to
           | liberals or conservatives.
        
           | tines wrote:
           | Not a compelling problem imo. You could say the same thing
           | about irony, sarcasm, etc. but nobody calls it a problem.
        
         | chrisseaton wrote:
         | > I often feel deep existential dread about deepfakes and the
         | lag between when they are viable (now) and when everyone stops
         | trusting video that doesn't have some kind of rock-solid
         | signed/encrypted/testified provenance (years from now or
         | never?).
         | 
         | Maybe we'll have to go back to sworn witness statements. Worked
         | for thousands of years.
        
           | davidw wrote:
           | Think how many instances of someone's word vs the police
           | there have been. It did not 'work' very well. Mobile phone
           | video has changed some things for the better.
        
           | AnIdiotOnTheNet wrote:
           | Sadly we are well aware of how inaccurate and untrustworthy
           | eye-witness accounts can be.
        
             | asperous wrote:
             | Eye witnesses are very accurate when it comes to broad
             | ideas like did you hear a loud noise on this day.
             | 
             | Where they fail, especially compared to evidence and video,
             | is details like what did the noise sound like, what did the
             | person look like, what were they wearing, etc
        
             | chrisseaton wrote:
             | But at least there is a human being behind each statement.
             | You can question, you can examine their motives, you can
             | challenge. A video is just a video. An anonymous accuser.
             | You can't ask a video any questions.
        
               | ethbr0 wrote:
               | Slightly OT, but I finally realized this was probably the
               | long-game behind Facebook et al.'s real-identity-only
               | policy.
               | 
               | Sure it has the side (main!) benefit of empowering
               | targeted advertising.
               | 
               | But fundamentally creating traceability between
               | content/comment and a singular identity enables
               | elimination of the most egregious abuses (mass-
               | disinformation/ganging).
               | 
               | I'm as 90s internet-is-for-anonymous as anyone, but I
               | have to admit traceability has merit in the larger social
               | ecosystems.
        
               | intotheabyss wrote:
               | We can create decentralized identity systems, where you
               | decide which information should be public, while also
               | providing accountability.
               | 
               | https://www.microsoft.com/en/security/business/identity/o
               | wn-...
               | 
                | Rather than let mega corps control identities, we
                | should be building open source tools to allow
                | developers to include decentralized identities within
                | their platforms, preferably systems that interoperate
                | with Ethereum.
        
               | pjc50 wrote:
               | Much of the worst content is published by people under
               | their own names and news bylines. And then happily
               | retweeted by people under their own names.
               | 
               | It gets worse when businesses and adverts are included in
               | the mix. Who is Bob's News Agency, _really_? Can you
               | click on every advert to find who paid for it? No.
        
               | ethbr0 wrote:
               | Absolutely, but the difference is those people have
               | fingerprints.
               | 
               | If someone is repeatedly toxic (which should be
               | algorithmically identifiable), you can take steps to
               | balance that.
               | 
               | You cannot do the same if 50% of your userbase lacks a
               | stable / historical identity. At least without attempting
               | to recreate identity on the basis of metadata (IP,
               | patterns, etc).
        
               | chillwaves wrote:
               | A video is data that exists with forensic context.
               | 
               | What we need are institutions that are trusted to verify
               | this data, like journalism. Except it is being allowed to
               | be perverted for profit or ideology.
        
               | chrisseaton wrote:
               | What does a journalist know about verifying video data?
               | 
               | And isn't a journalist going to bring their own side of
               | the story to it? Hardly independent.
        
             | Apes wrote:
             | As inaccurate and untrustworthy as deepfakes are making
             | video?
        
               | kube-system wrote:
               | If not worse.
               | 
                | 1. Physical evidence can at least be analyzed by a third
               | party, and
               | 
               | 2. Misleading deepfakes aren't created accidentally by
               | honest people.
        
           | rowanG077 wrote:
           | I would say it didn't work for thousands of years. It just
           | was the best we had.
        
       | sega_sai wrote:
       | Is there some kind of private/public key signing technique for
       | videos, when the camera creating a video signs it with some key
       | providing a signature that can be verified for authenticity ? so
       | you would be able to at least verify that the video was created
       | at certain date on a certain camera ?
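A minimal sketch of what such a scheme might look like (all names and key handling here are hypothetical, not any real camera's design). A real camera would use an asymmetric keypair, e.g. Ed25519, in a secure element so anyone can verify without being able to forge; HMAC is used below only to keep the sketch dependency-free, at the cost of the verifier knowing the secret:

```python
import hashlib
import hmac
import json

# Hypothetical per-camera key baked in at the factory.
CAMERA_KEY = b"secret-burned-into-camera-at-factory"

def sign_clip(video_bytes: bytes, camera_id: str, timestamp: float) -> dict:
    """Bind the video's hash to a camera identity and capture time."""
    payload = json.dumps({
        "sha256": hashlib.sha256(video_bytes).hexdigest(),
        "camera_id": camera_id,
        "timestamp": timestamp,
    }, sort_keys=True)
    tag = hmac.new(CAMERA_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return {"payload": payload, "tag": tag}

def verify_clip(video_bytes: bytes, record: dict) -> bool:
    """Check the tag, then check the video still matches the signed hash."""
    expected = hmac.new(CAMERA_KEY, record["payload"].encode(),
                        hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, record["tag"]):
        return False  # metadata or tag was altered
    claimed = json.loads(record["payload"])
    return claimed["sha256"] == hashlib.sha256(video_bytes).hexdigest()

clip = b"...raw video frames..."
record = sign_clip(clip, camera_id="CAM-0042", timestamp=1604325780.0)
print(verify_clip(clip, record))                # True: untouched clip
print(verify_clip(clip + b"tampered", record))  # False: content changed
```

Note this only proves which device signed the bytes and when, not that the scene in front of the lens was real, which is exactly the "analog hole" objection raised elsewhere in the thread.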
        
         | blonde_ocean wrote:
         | The toughest issue here, IMO, is being able to sign and
         | authenticate a document without proprietary DRM. Been thinking
         | about this a lot.
        
           | Cthulhu_ wrote:
           | DRM is indeed what it is; you need to verify a signature
           | somewhere to ensure it hasn't been tampered with or altered,
           | which requires an internet connection to verify or download.
        
         | iso1210 wrote:
         | There's the Trusted News Initiative [0] and Project Origin
         | [1,2]. Too little, and not enough support imo.
         | 
         | [0] https://www.bbc.co.uk/mediacentre/latestnews/2020/trusted-
         | ne...
         | 
         | [1]
         | https://www.bbc.co.uk/blogs/aboutthebbc/entries/46f5eb33-b7b...
         | 
         | [2] https://www.originproject.info/
        
         | djsumdog wrote:
         | I think this solves the wrong problem. Most of these deepfakes
         | have obvious places where you can tell they are renders. I
         | think it's more important to have tooling that can tell when a
         | video has been altered.
         | 
         | I'm more concerned with the manipulation of grainy low-res
         | video. Police body cameras are an incredible tool for police
         | departments to fight misinformation (well, only when people
         | watch the whole thing). Is editing these types of videos as
         | obvious as with high-resolution video, or is it more easily
         | manipulated?
        
           | malandrew wrote:
           | > I think it's more important to have tooling that can tell
           | when a video has been altered.
           | 
           | It's probably going to be a cat and mouse game. Think about
           | how far along green screen technology has come. Watch a
           | movie from the 80s or 90s today and the green screening is
           | super obvious to our 2020-trained eyes. Imagine how someone
           | from the 80s or 90s would perceive the green screening of
           | 2020.
           | 
           | I imagine there will always be a cutting edge of deepfakes
           | that will travel halfway around the world before the truth
           | comes out and won't spread to all the people that believed
           | the fake.
           | 
           | I can't help but think that as a society, we're entering
           | the "schizophrenic" phase of popular reality.
           | 
           | Read these quora answers on what it's like to have
           | schizophrenia and think "How would a society collectively
           | afflicted in the same way behave?"
           | 
           | https://www.quora.com/What-is-it-like-to-have-
           | schizophrenia-...
           | 
           | It's going to be weird to see society have to operate in a
           | state of constant disbelief of so many things that were
           | previously accepted as fact.
        
         | throwaway3699 wrote:
         | This seems absolutely inevitable.
         | 
         | More practically, PGP.
        
         | prutschman wrote:
         | So I can cryptographically prove that at a particular date at a
         | particular time with a particular camera that I authentically
         | pointed my camera at a screen projecting faked content?
         | 
         | I think this only works if you're signing something currently
         | much harder to fake, like a full light-field or something, and
         | even then it won't stay hard to fake forever.
        
           | tetraodonpuffer wrote:
           | You could have the camera record the lidar also and
           | crypto-sign that too. Assuming it is all done in camera and
           | the lidar is high resolution enough, it should be possible
           | to completely tamper-proof a video recording, at least at
           | the "talking head" level.
           | 
           | I think Apple could possibly start providing this for
           | iPhone videos if they decided to, as the hardware is all
           | there already: both the lidar sensor and the Secure
           | Enclave, etc.
        
             | malandrew wrote:
             | I think this idea of adding additional layers is on the
             | right track because it confounds the deepfake generation
             | problem by adding a curse of dimensionality. You can't just
             | fake 2D. You need to fake 3D which is much harder.
             | 
             | I'm curious what the equivalent of "3D" vs. "2D" would be
             | for deepfaked voice.
        
           | drdeca wrote:
           | I know they don't currently do this, but if GPS satellites
           | cryptographically signed their timing signals, could that
           | potentially be usable to establish that, in addition to the
           | particular camera capturing the images at a particular time,
           | it also did so in a particular-ish place?
           | 
           | It's possible to record and replay signals from gps
           | satellites with different delays, so maybe that could be used
           | to spoof the location, but if the camera also has an internal
           | clock, perhaps it could detect if there was too much
           | discrepancy there? But I don't know how much that could
           | constrain the location. Also gps only has limited precision,
           | especially indoors? Uh, maybe using cell towers instead of,
           | or in addition to, gps would be better? Hm.
        
         | Accujack wrote:
         | You can sign the data, but it won't help.
         | 
         | Basically, the ability to fake any video means that other data
         | is needed to prove that the person/events in the video was in
         | fact the person/events that that video is claimed to depict.
         | 
         | Other proof, other measurements of the same events, etc.
         | 
         | Deep fakes mean people will know they can't trust their eyes,
         | they need to think critically. That's been true for a while,
         | but now they'll know it.
         | 
         | Which I presume is why Parker/Stone are doing this now, other
         | than the obvious (make money). They're going to force people to
         | recognize this is possible and how perfect the fake can be.
        
           | oehpr wrote:
           | As a side note regarding the money: My understanding is they
           | had formed a studio to make a deep faked movie some time
           | ~2019? I think? But covid rolled around and the studio
           | dissolved. They took advantage of the contacts and assets
           | they had initially made to produce this youtube video. They
           | think it might be the most expensive youtube video ever made.
        
           | tomtomtom777 wrote:
           | Maybe if we start the habit of signing our own appearances,
           | some day this will make unsigned video considered
           | unreliable.
           | 
           | But I guess this is wishful thinking given that we don't even
           | generally sign written comments yet.
        
         | ip26 wrote:
         | It's a good question, crypto can be used for this sort of
         | thing. It does preclude editing, unless you also establish a
         | cryptographic trust chain where each successive editor also has
         | a trusted key.
         | 
         | Probably never going to work with cameraphone journalism, but
         | if we're talking about newsroom & press conference footage you
         | could make it happen.
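One way to sketch such a trust chain (all field names hypothetical): each editing step appends a link committing to both the new content's hash and the previous link's hash, so rewriting any step invalidates everything after it. A real system would additionally sign each link with that editor's private key; bare hashes keep this minimal:

```python
import hashlib
import json

def add_link(chain: list, editor: str, content: bytes) -> list:
    """Append a provenance link binding new content to prior history."""
    prev = chain[-1]["link_hash"] if chain else "genesis"
    link = {
        "editor": editor,
        "content_sha256": hashlib.sha256(content).hexdigest(),
        "prev": prev,
    }
    serialized = json.dumps(link, sort_keys=True).encode()
    link["link_hash"] = hashlib.sha256(serialized).hexdigest()
    return chain + [link]

def verify_chain(chain: list, final_content: bytes) -> bool:
    """Walk the chain, re-deriving every hash; then check the final cut."""
    prev = "genesis"
    for link in chain:
        body = {k: v for k, v in link.items() if k != "link_hash"}
        serialized = json.dumps(body, sort_keys=True).encode()
        if link["prev"] != prev:
            return False
        if hashlib.sha256(serialized).hexdigest() != link["link_hash"]:
            return False
        prev = link["link_hash"]
    expected = hashlib.sha256(final_content).hexdigest()
    return chain[-1]["content_sha256"] == expected

raw, cut = b"raw camera footage", b"edited broadcast cut"
chain = add_link([], "camera:CAM-0042", raw)
chain = add_link(chain, "editor:newsroom", cut)
print(verify_chain(chain, cut))  # True: history and final cut check out
print(verify_chain(chain, raw))  # False: final content isn't the last link
```

The trust question then shifts to who is allowed to append links, which is where the newsroom-vs-cameraphone distinction in the parent comment bites.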
        
           | enchiridion wrote:
           | Why not? Can't there be "TLS for cameras"?
        
           | filleokus wrote:
           | Except for editing, which, as you say, should be solvable
           | somehow (encrypt every frame?), I think the biggest problem
           | is the equivalent of the "analog hole".
           | 
           | How do you know if a signed recording is from the actual
           | event, or from a camera being pointed at a screen in a pitch
           | black room recording the playback of some malicious video
           | trying to show said event?
        
             | ip26 wrote:
             | Trust. If the root signatory is CNN, you decide whether you
             | trust CNN not to do that.
             | 
             | Not really any different from other encryption. Your bank
             | _could_ be defrauding you on the back end despite the
             | verified https session. But you trust they aren't.
        
               | filleokus wrote:
               | Why do we need to sign the video then? Isn't it enough
               | for me to ensure that I have a secure TLS connection to
               | https://cnn.com? Maybe for external platforms? But then
               | the challenge is just proving that
               | https://www.youtube.com/user/CNN/ is controlled by CNN.
               | Much simpler than involving the cameras.
        
               | ip26 wrote:
               | The hope would be that if video clips _were_ verifiable,
               | they would be widely _expected_ to be verified, such that
               | when someone links to  "old clip of Senator says
               | outlandish thing", you habitually check to see if it's
               | verified- no matter who or what is re-sharing it.
        
         | prox wrote:
         | I believe Canon has special cameras that the police use. It's
         | been a while but they use some encoding scheme I think.
        
           | thenickdude wrote:
           | Canon's image signing scheme was half-assed and was
           | comprehensively broken, they didn't even store their signing
           | keys on a secure element:
           | 
           | https://www.digitalartsonline.co.uk/news/creative-
           | lifestyle/...
           | 
           | >In Canon's second version of its ODD system, the HMAC code
           | is 256 bits. The code is the same for all cameras of the same
           | model. Knowing the HMAC code for one particular model allows
           | the ODD to be forged for any camera within that model range,
           | Sklyarov wrote
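A toy reconstruction of the flaw quoted above (key value and names hypothetical). An HMAC tag proves only that the signer knew the key, so a single shared per-model key means extracting it from any one camera lets an attacker mint "authentic" tags for arbitrary images:

```python
import hashlib
import hmac

# One key shipped in every unit of the model, as in the quoted ODD design.
MODEL_KEY = b"same-key-in-every-unit-of-this-model"

def camera_sign(image: bytes) -> str:
    """What the camera firmware does on capture."""
    return hmac.new(MODEL_KEY, image, hashlib.sha256).hexdigest()

def verifier_accepts(image: bytes, tag: str) -> bool:
    """The verification kit recomputes the same HMAC with the same key."""
    return hmac.compare_digest(camera_sign(image), tag)

genuine = b"photo that really came off the sensor"
print(verifier_accepts(genuine, camera_sign(genuine)))  # True, as intended

# Attacker who dumped MODEL_KEY from one camera signs a forgery:
forged = b"doctored image that never touched a sensor"
forged_tag = hmac.new(MODEL_KEY, forged, hashlib.sha256).hexdigest()
print(verifier_accepts(forged, forged_tag))  # True: forgery passes
```

Per-device asymmetric keys would confine the damage of one key extraction to one camera instead of the whole model line.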
        
       | codeulike wrote:
       | I watched a few mins of this because Peter Serafinowicz retweeted
       | it but I completely didn't spot that they'd put Trump's face on
       | the main guy (who I think is Peter?). He looked vaguely familiar.
       | And I figured I was missing something. So that means it's both
       | very convincing and a bit too subtle for me?
        
       | codysan wrote:
       | In terms of using deep fakes to impersonate political figures, I
       | do think we'll ultimately be okay on that front. I can see the
       | entertainment industry being flipped on its head and fabricated
       | conspiracy theories hitting a fever pitch but with initiatives
       | like CAI* cropping up, my hope is that there will be a digital
       | wax seal of sorts to make sure what you're seeing is what the
       | author intended.
       | 
       | * https://contentauthenticity.org
        
       | nathanvanfleet wrote:
       | I am kind of waiting for both of them to become extremely right
       | wing.
        
       | Cthulhu_ wrote:
       | Content aside, this is really interesting and I hope to see them
       | do more attempts at humor using deepfakes.
        
       | jayd16 wrote:
       | The ramifications for misuse are real BUT
       | 
       | I think making a studio that specializes in this type of work is
       | genius. This technology could really change the industry in
       | interesting ways. Faster iteration, less re-shoots, less extras.
       | 
       | I guess you could even "storyboard" with rough deepfakes.
       | 
       | Pretty interesting stuff. I wonder what else people will come up
       | with if this stuff becomes a shrink wrapped industry standard.
        
       | chaoticmass wrote:
       | I don't find the idea very funny or entertaining. If you have a
       | funny parody show or something, it should be funny and
       | entertaining without deep fakery. This seems like a gimmick that
       | won't last long. Maybe it will do some good in bringing the
       | issues of deep fakes into the wider public conversation though.
        
         | toxik wrote:
         | I don't find the idea of poorly drawn cartoons funny by itself
         | either, it's just a technique. It'll probably look pretty
         | crappy most of the time. I imagine it'll be like South Park,
         | which frequently has actual people in it, doing... odd things.
        
         | newhacker2719 wrote:
         | I think it's great, there is no problem with deep fakes as long
         | as it is clear they are fakes. The entire political class
         | including their donors are godawful and deserve to be made fun
         | of.
        
       | samirillian wrote:
       | Mary Poppins saying Fauci telling people to wear masks is a
       | deepfake because Covid is fake...sorry, doesn't do it for me,
       | esp. considering Fauci did, indeed, discourage people from using
       | masks.
       | 
       | Trump wrecked South Park the same way he wrecked the Daily Show.
       | These people are past their time, their satire is no longer
       | relevant; they too are missing the point.
        
         | williamtwild wrote:
         | "Trump wrecked South Park the same way he wrecked the Daily
         | Show. "
         | 
         | Didn't take long for the MAGA brigade to find this post.
        
           | samirillian wrote:
           | Def not MAGA.
        
       | BenGosub wrote:
       | Funny how the image caption is "Parker and Stone at a pre-corona
       | premiere", when the photo is like, 20 years old, while the
       | epidemic has been this year.
        
         | daveslash wrote:
         | I got a tickle out of that as well.
        
       | Pxtl wrote:
       | Let's remember their politics, though - they used to market
       | themselves as "both sides are bad" independents/libertarians, but
       | now they're full-out Trumpers.
       | 
       | I could see clips from this show making the rounds on lower-
       | information parts of social media like Facebook that are already
       | prone to being hoodwinked by conservative fake news like Alex
       | Jones and the like. Professionally-produced deepfakes becoming
       | the exact attack on democracy that other people in this thread
       | are proposing will "inoculate" us.
        
         | dleslie wrote:
         | Sassy Justice is one long gut-shot against Trump. I'm not sure
         | why you think conservatives would widely share this, unless
         | you're supposing that they have a good sense of humour and
         | humility.
         | 
         | And yes, all politics and public figures warrant criticism and
         | satire. Cast aside your sacred cows.
        
         | will4274 wrote:
         | Should be a headline from The Onion - Famous satirist uses
         | 'satire' in public; crowd (including Pxtl) unaware; takes him
         | literally.
        
       | b0rsuk wrote:
       | I think the biggest casualty of deepfakes is going to be
       | citizens. Record a video of police brutality or crime footage and
       | it can all be shrugged off as a deep fake. TV stations and
       | journalists can afford 'signed videos'.
        
         | three_seagrass wrote:
         | The type of person who will claim that has already been
         | claiming 'false flag' or 'fake news' anyways.
         | 
         | If someone has confirmation bias, they're still going to find
         | ways to call it fake with or without deep fake tech.
        
         | epistasis wrote:
         | It will be much more difficult to do deepfakes of that sort.
         | Doing a good deepfake requires lots of training data. We might
         | be able to get away with less training data as the tech
         | progresses, but that is one area that will probably be most
         | difficult to make progress in.
        
         | njharman wrote:
         | Deep fakes typically are putting a face onto an actor (or other
         | footage). Not faking an entire scene with multiple people,
         | that's just called CGI/Animation.
         | 
         | So you could easily take existing footage and change who it
         | appears to be. But,
         | 
         | 1) doing that to frame or unframe someone will be pretty
         | niche, 2) the original correct footage might still be
         | findable, and 3) with access to the video files (which is
         | going to be required in any legal situation), I'm sure there
         | will be algorithms (they exist for photos) that detect the
         | editing the deepfake did to the video file.
        
           | strogonoff wrote:
           | Deepfaking doesn't have to be used on its own. A dedicated
           | entity can shoot entirely new original scenes (unique
           | footage), deepfake actors' faces to look like the people
           | they're targeting, edit it all together, and possibly add CG
           | VFX as a final layer to make the resulting video look very
           | real[0].
           | 
           | Of course, knowing all that is not really necessary for
           | someone to shrug off truthful incriminating footage as "fake
           | news".
           | 
           | [0] One caveat is that deepfaking works better with higher
           | quality footage. To produce a complicated scene with altered
           | actor faces while maintaining realistic "phone video" look,
           | it would make sense to film with good lighting and high-
           | quality gear into log or raw format, and then imitate the
           | look in post-production after deepfaking is applied.
           | 
           | Thus, one method of detecting such fakes could be by checking
           | for traces of VFX, artificial noise, looking for signature
           | lens properties, signature behavior of phone video recording
           | "magic" (such as noise reduction and stabilization), etc.
           | Enough of producer's dedication could make that tricky, but
           | IMO it could be easier than applying automated deepfake
           | detection straight up--it'd be buried early enough in post-
           | production workflow, with a lot of noise introduced by
           | subsequent "phone look" VFX.
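
One of the statistical tells mentioned above (artificial noise, traces of denoising) can be illustrated with toy data. The sketch below is an illustration of the idea only, not a real detector: it flags a pasted-in region whose noise level differs sharply from the surrounding footage, since rendered or denoised face regions often carry less sensor noise than the frames around them.

```python
import random
import statistics

random.seed(0)  # make the toy data reproducible

def noise_level(block):
    """Crude noise estimate: spread of adjacent-pixel differences."""
    diffs = [row[i + 1] - row[i] for row in block for i in range(len(row) - 1)]
    return statistics.pstdev(diffs)

# Toy "frame" regions: the background carries camera-like sensor noise,
# while the pasted-in region has been smoothed, as rendered faces often are.
background = [[128 + random.randint(-8, 8) for _ in range(32)] for _ in range(32)]
pasted = [[128 + random.randint(-1, 1) for _ in range(8)] for _ in range(8)]

bg_noise = noise_level(background)
patch_noise = noise_level(pasted)

# A sharp mismatch between the two noise estimates flags the splice.
print(bg_noise > 2 * patch_noise)  # True
```

A real detector would compare noise statistics per region across every frame and account for compression artifacts, but the principle is the same.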
        
         | Forbo wrote:
         | Seems like it would be pretty easy to add a way to sign a
         | video using keys stored in a phone's secure enclave; surely
         | Apple and Google are capable of doing so.
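
The signing idea can be sketched in a few lines. This is a toy illustration only: an HMAC over the file's SHA-256 digest stands in for the asymmetric, hardware-backed signature a real secure enclave would produce (where the private key never leaves the device), and the key below is invented for the example.

```python
import hashlib
import hmac

# Stand-in for a per-device key; in a real enclave, signing happens
# inside the hardware and the private key is never exposed.
DEVICE_KEY = b"hypothetical-device-key"

def sign_video(video_bytes: bytes) -> str:
    """Digest the footage, then 'sign' the digest with the device key."""
    digest = hashlib.sha256(video_bytes).digest()
    return hmac.new(DEVICE_KEY, digest, hashlib.sha256).hexdigest()

def verify_video(video_bytes: bytes, signature: str) -> bool:
    """Recompute the signature and compare in constant time."""
    return hmac.compare_digest(sign_video(video_bytes), signature)

footage = b"raw camera frames..."
sig = sign_video(footage)
print(verify_video(footage, sig))         # True: untouched footage verifies
print(verify_video(footage + b"x", sig))  # False: any edit breaks the signature
```

With an asymmetric scheme, anyone could verify footage against the maker's public key without being able to forge signatures; the hard parts are key management and proving the signed pixels came from a real sensor rather than a screen.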
        
         | heavyset_go wrote:
         | Maybe in the court of public opinion, but I doubt it would
         | matter much in actual court. Chain of custody for video
         | evidence has always mattered there.
        
         | serial_dev wrote:
         | I also don't think that politicians will suffer the most. Some
         | people already know that social websites are full of fake
         | videos, pictures and quotes from politicians. The others who
         | don't know that already, could be more economically tricked,
         | for example use a picture of a politician, place some
         | outrageous quote next to him/her with Comic Sans, and some will
         | believe that, too.
        
         | bonoboTP wrote:
         | People weren't walking around with cameras all the time 10-15
         | years ago, and the world still functioned.
        
         | racl101 wrote:
         | True. I've already noticed people who weren't cynical become
         | cynical after seeing some deepfake videos. This, coupled with
         | confirmation bias, will not help real victims who are regular
         | citizens.
        
       | feralimal wrote:
       | Well, personally I think deep fakes are a good thing.
       | 
       | The reality is that 'we' were too trusting of what is presented
       | on screens as true. It's the main piece that is used to manage and
       | govern us.
       | 
       | So, I welcome distrust on what we see on screens - that trust was
       | always misplaced, and all about manipulation rather than
       | information.
        
       | malandrew wrote:
       | Who was Lou Chang? I didn't recognize who that was a deepfake of.
        
         | dole wrote:
         | Julie Andrews
        
       | chrisjc wrote:
       | I'm not entirely sure why it reminds me of the old UK show
       | Spitting Image.
       | 
       | https://www.imdb.com/title/tt0086807/
        
         | crtasm wrote:
         | Spitting Image just started up again after 14 years. It's not
         | amazing but worth a look:
         | https://en.wikipedia.org/wiki/Spitting_Image_(2020_TV_series...
        
       | juskrey wrote:
       | Talking TV heads are already deep fakes. I'm glad computer fakes
       | will finally erode the public's confidence in those people,
       | from news anchors to presidents. It took almost a century to
       | arrive here.
        
         | pengaru wrote:
         | While I agree on your first point, the rest completely misses
         | that the public will continue to be tribal and instead _only_
         | believe the talking heads of their given tribe as authentic in
         | a world rife with deep fakes.
         | 
         | I fully expect the outcome to be increased tribalism, when
         | faith in your tribe's leadership and information sources is the
         | only source of comfort, confidence, and "truth".
         | 
         | I don't think it will be the opposite where everything is
         | questioned and critical thinking suddenly becomes prolific as
         | you seem to imply.
        
           | juskrey wrote:
            | Local talking heads can, by definition, be punched in the
            | face, seen in the local coffeeshop, etc. They have skin in
            | the game. Presidents don't.
        
             | pengaru wrote:
             | What does that have to do with deep fakes pushing consumers
             | of Fox News talking heads further into the arms of Fox
             | News?
             | 
             | Are you imagining if Tucker Carlson were punched in the
             | face at his local coffee shop, he'd shut up? :rolleyes: His
             | following would only grow larger and more supportive as he
              | milked the controversy for all it's worth.
        
       | nautilus12 wrote:
       | This feels like its stolen from Kyle Dunnigan
        
       | nemo44x wrote:
       | Dark days ahead for Hollywood. How long until Silicon Valley eats
       | that industry?
       | 
       | I can imagine lifelike movies that render characters to the users
       | preference in real-time.
       | 
       | Talk about "representation"!
       | 
       | I can imagine as well dialects, language, etc being rendered in
       | real time to adapt to what the user prefers. You and another
       | person could watch the same film and talk about the same story
       | but have totally different experiences on what the characters
       | looked like, talked like, and even said within some parameters.
       | 
       | Making a movie with humans will be a prestige event like riding
       | horses today or driving an ICE car in the future. They won't be
       | able to compete with rendered film at a mass scale due to cost.
       | Rendered films could be built and distributed cheaply and cost a
       | fraction of a real movie to watch.
        
         | Nasrudith wrote:
         | Really that would be a good thing for "character actors" as it
         | goes less on image and more on as what they need to contribute
         | is creative to be a collaborator instead of a "living prop".
         | 
         | What would be interesting would be the techniques used. Are
         | they like animators or roleplayers focused on a single
         | character to give them emotional touches in added details,
         | quirks, and improved line changes or "greenscreened" such that
         | what they actually look like is utterly irrelevant to their
         | job?
        
         | skocznymroczny wrote:
         | They would make their living by licensing the likeness of
         | actors. Because you might have a great rendered movie, but most
         | people will want a great rendered movie with Tom Cruise and
           | Samuel L Jackson, not with no-name actors.
        
           | utexaspunk wrote:
           | Until you make a completely artificial persona that gets
           | famous.
        
             | marwatk wrote:
             | https://en.wikipedia.org/wiki/Simone_(2002_film)
        
       | tobr wrote:
       | YouTube is trying to autoplay a Fox Business live MAGA event
       | after this. I suppose they're in the same category - misleading
       | videos featuring Trump?
        
       | sacredcows wrote:
       | Awesome, maybe they'll add more climate change denialism into the
       | mainstream /s
        
       | craftinator wrote:
       | I'd be very interested in a show where they have deepfakes of
       | politicians reading FOIA obtained emails that they wrote. It
       | would be a positive use for a technology that has some pretty
       | negative use cases =)
        
       | koiz wrote:
       | Amazing.
        
       | tremon wrote:
       | Regardless of the actual content or quality of the show, I hope
       | this reaches a wide audience. All people should know about the
       | technical capabilities of deepfakes.
        
         | zeroping wrote:
         | Is this a method to inoculate the viewer against deepfakes? I
         | wonder if part of their goal is to specifically make people
         | more aware?
        
           | dleslie wrote:
           | Matt and Trey usually have an interesting side project on the
           | go; where South Park pays the bills, shows like The Book of
           | Mormon and Sassy Justice are their creative expression.
        
       | yalogin wrote:
       | Deepfakes have the potential to turn the world upside down and
       | cause utter chaos. I don't know how we are going to
       | deal with it as a society; I think we are not equipped for it. In
       | fact I fully expected Trump to use it for the October surprise,
       | but I guess he didn't go there for whatever reason.
        
         | tsomctl wrote:
         | Create short deepfakes that can go viral and be so ridiculous
         | that they're both entertaining and obviously fake. Obama
         | advocating for a border wall. Al Gore lobbying for the oil
         | industry. Steve Jobs advising against staring at a screen all
         | day.
        
           | iso1210 wrote:
           | > Obama advocating for a border wall
           | 
           | Would be used as evidence of how untrustworthy the Dems are.
           | 
           | This video of two opponents endorsing each other cropped up
           | last year.
           | 
           | https://www.bbc.co.uk/news/av/technology-50381728
        
         | MadSudaca wrote:
         | I hold a similar view. In my opinion, deepfakes will help
         | accelerate the end of the democracy experiment, and you know
         | what? Good riddance.
         | 
         | Only people who know you on at least a last-name basis should
         | have political power over you.
        
           | Robotbeat wrote:
           | If you think the end of democracy will mean the end of people
           | having power over you who don't know your name, I have some
           | really bad news.
        
             | MadSudaca wrote:
             | At least under this scheme I'll have an easier time telling
             | my friends from my enemies.
        
             | LanceH wrote:
             | The governments will get it right this time. They just need
             | absolute authority to ensure it goes smoothly for
             | everyone.
        
               | nemo44x wrote:
               | The problem has always been accountability. If we can
               | devise a system that includes accountability in the
               | authority we could be on to something.
               | 
               | Democracy has accountability by virtue of everyone having
               | a vote. It's a small power that everyone can use to hold
               | their leaders to account. But it still allows people to
               | hold arbitrary power over each other. If you can convince
               | enough people, you can apply your morals and beliefs on
               | others. For instance, lots of discrimination is a
               | function of democracy and codifying oppression of certain
               | people.
               | 
               | We sit here furiously debating who can use a bathroom,
               | who gets preferential treatment, who you have to interact
               | with and a million other things. "Both sides" are bent on
               | forcing people to behave a certain way and they use the
               | power of the vote everyone has to accumulate power and
               | make things "how they ought to be".
               | 
               | I think this could be done democratically but everyone
               | has different interests. How do we find a common,
               | singular interest and then optimize around that?
               | 
               | Or maybe democracy is the "best bad system" and we just
               | have to make do. I do believe with the hyper connected
               | world we have today, cryptography, and resource abundance
               | that we could transcend the modern system and discover
               | liberation from each other to be ourselves and pursue
               | truly enriching lives at a mass scale within local
               | communities. And this means a different thing to
               | different people. But just about everything in our modern
               | system would need to be disposed of and recast.
        
               | utexaspunk wrote:
               | I think it's all fucked until we can get humans out of
               | the loop entirely. The sooner we can develop benevolent
               | AI overlords the better.
        
           | Veen wrote:
           | That view makes sense if the only alternatives are democracy,
           | extreme localism, and benign anarchy. A brief historical
           | survey should suffice to demonstrate that there are, in fact,
           | other less pleasant alternatives.
        
             | Miner49er wrote:
             | True, but maybe deepfakes can help challenge or prevent the
             | "alternatives"? They are fairly cheap to make. They seem
             | like they could be a great tool for challenging
             | authoritarianism.
        
         | aiyodev wrote:
         | I think you're right. It sounds like their plans fell through:
         | 
         | https://outline.com/FuBxnT
        
         | notanotherycom1 wrote:
         | If an _obvious_ lie can consistently fool 80% of the players in
         | among us (details below) then I can imagine deepfakes will fool
         | just as many people despite how obvious it is. By obvious I
         | mean almost 100% of the time it's a lie and people fall for it.
         | 
         | -------
         | 
         | I played maybe a hundred games of among us (they can be very
         | short). The game is about one or more imposters trying to
         | murder the rest of the crew but you'll have to be discreet and
         | get/find people alone so you don't get voted off. When a body
         | is found a meeting happens and you can lie (text chat)
         | 
         | One problem is you don't want to accuse someone when you're an
         | imposter because you immediately become suspicious. Most games
         | will tell you if you voted off an imposter or not, so they'll
         | know you're lying right away once the game tells them they
         | voted off a non-imposter. Most of the time you want to accuse
         | no one, play dumb, and act like everyone else who saw nothing.
         | 
         | I lost count of the times a guy doesn't accuse anyone for 20+
         | seconds, gets accused, then claims the guy who found the body
         | is an imposter and lists all these suspicious things he did
         | (why didn't you say it right away?!). Like 90+% of the time
         | the guy being accused is the imposter who waited so he could
         | feel the situation out. It's _extremely obvious_, but maybe
         | 70% of the time literally _every player_ but me and the guy
         | reporting the body is fooled. That's far too many players at
         | far too high a fool rate. It's so painful because it's so
         | obvious. 90+% of the time in that scenario the reporting guy
         | is telling the truth.
        
         | csmattryder wrote:
         | In the early days, you could spot one by the lack of blinking
         | because the model was trained on open-eyed images. Not sure if
         | that's still the case, I'd be surprised if it was.
         | 
         | Any time I see a "this is peak technology" comment, I'm always
         | reminded of the PC gaming magazine cover showing the first
         | Unreal game's graphics ("Yes, that's an actual PC
         | screenshot!").
         | 
         | It looks awful now, but in the nineties, it blew us all away.
         | 
         | https://i.pinimg.com/originals/d1/5b/d9/d15bd9f544e6fd4bf740...
        
           | lagadu wrote:
            | These do blink, so it's no longer an issue. It's still not
            | perfect because, other than blinking, the eyes have no
            | expression to them; they're mostly still, which is very
            | unnerving once you start paying attention to it (it's also
            | how you can tell a fake smile from a genuine one). But we're at
           | the point where unless you're looking for it, it's easy to
           | believe.
        
           | ip26 wrote:
           | People didn't think it looked like reality. Back in that
           | time, marketing used a lot of illustration to promote the
           | game that had little connection to the in-game
           | visualizations. So that cover is more saying, "This wasn't
           | drawn by an artist".
        
         | whatnojeez wrote:
         | > In fact I fully expected trump to use it for the October
         | surprise, but I guess he didn't go there for whatever reason.
         | 
         | Because he's not the cartoon supervillain you think he is?
        
           | pessimizer wrote:
           | He has to be a supervillain, or else how could he have
           | defeated the perfect team that ran the perfect campaign for
           | the perfect candidate in 2016?
        
         | codysan wrote:
         | People have had the opportunity to impersonate anyone they'd
         | like over the radio for over 100 years and yet, here we still
         | stand.
        
         | opwieurposiu wrote:
         | He has Hunter's laptop, no reason to lie.
        
         | throwaway0a5e wrote:
         | The Catholic Church survived the printing press. We'll be fine.
         | Maybe not all of us will live to see it but society and its
         | institutions will go on.
        
       | readams wrote:
       | People are going crazy over deepfakes but we've had Photoshop for
       | a long time and the world hasn't ended. We find ways to verify
       | the content of photos through other means, such as their
       | provenance.
        
         | justEgan wrote:
         | Something to consider is that we already had video when
         | Photoshop arrived, and video carries at least more
         | authenticity than images do. Is there another medium able to
         | take up that baton of authenticity now? I don't think so.
        
         | patcon wrote:
         | I dunno, it seems pretty significant to me. It used to be that
         | we only had to be skeptical of blurry newspaper photos, then
         | detailed photos, then videos (without people as the giveaway);
         | now highly detailed videos and audio aren't trustworthy. You
         | don't think it's significant that it's getting to a point
         | where only "seeing with your own eyes" is believing? All our
         | systems of spreading information beyond our senses start to
         | fail when we can't trust anything outside our perceived
         | experience (or things brought to us by our trusted friends).
        
           | paulryanrogers wrote:
           | Blurring around the face edges and mismatched lighting still
           | give them away. Time will tell if the machines can overcome a
           | trained eye.
        
             | godelski wrote:
             | I think that is a bit of selection bias. Outside of what
             | I've seen in papers, Ctrl Shift Face[0] has some of the best
             | "in practice" deep fakes I've seen. The video in the main
             | post is meant to be a joke. But these still aren't state of
             | the art and really are just "some dude" that is doing this
             | on their own, not a production studio. What I think is
             | different here is that Photoshop-style manipulation has
             | gotten so good that your average person can create
             | convincingly fake content. At the end of the day, deep
             | fakes are part of Photoshop (I mean, it has neural filters
             | now...). The question is more about ease of use: "requires
             | a team of highly trained special effects artists" vs.
             | "some dude in his basement with a shiny computer."
             | 
             | And a big part is that you have prior knowledge. I'll be
             | honest, I didn't realize it was Trump at first. Nor did a
             | friend that I sent the video to that didn't have the prior
             | that all characters were fake. Took him a good minute.
             | That's a meaningful difference.
             | 
             | [0] https://www.youtube.com/watch?v=H3pV-_iyT4U
        
         | stevofolife wrote:
         | The problem isn't about the existence though. Accessibility and
         | convenience to use this type of technology are the issues here.
        
         | ACow_Adonis wrote:
         | except (in my experience as a photographer), most/a significant
         | chunk of the general public not only don't understand
         | photoshop, but will actively disbelieve you about the extent,
         | purpose and outcome of manipulations.
         | 
         | Also, it's one thing to say the world hasn't ended, but that
         | potentially downplays the widespread effects that commercial
         | use of Photoshop has had on body and self-image, creating and
         | interacting with arguably culture-bound psychological issues
         | such as anorexia, bulimia, unnecessary surgery, self-harm,
         | suicide, etc. Or, to take examples from non-Anglo cultures,
         | eyelid surgery, skin whitening, nose surgery, etc.
         | 
         | it's true that the world hasn't ended, but that's a thought
         | terminating cliche. there's a lot of evidence it's creating and
         | created significant harm and significant effects.
        
           | ufmace wrote:
           | It helps that a lot of the photoshops going around the
           | internet are either super-sloppy or depict things that are
           | obviously ridiculous. I actually can't think of any time
           | somebody tried to alter a photo in a way that would change
           | the meaning in a believable way and distributed it in such a
           | way that they were trying to convince people it was real.
           | Have I missed something?
        
         | spurgu wrote:
         | Are you blind to how bad people are at verifying even the most
         | basic things?
        
         | watwut wrote:
         | We as a culture are not able to verify content of deceptively
         | cut videos.
        
         | 6gvONxR4sf7o wrote:
         | Verifiable content is definitely not the norm through history.
         | I'd love to see an academic analysis of whether misinformation
         | really went down by much (if at all) when photography became a
         | thing. Or when camera phones and social media came along.
         | 
         | Yesterday there was a fake Trump tweet with a zillion upvotes
         | on Reddit, and that's just text. Text is trivial to fake, so
         | maybe the chain of trust we see in text is what we'll see with
         | everything else moving forward. "An anonymous source within the
         | White House provided this footage," being countered with, "you
         | can't trust anonymous sources!" It will come down to who you
         | trust, just like it always has.
        
         | fullshark wrote:
         | Looking at the video here, the two best by far are the Michael
         | Caine and Julie Andrews deep fakes, and in both cases the voice
           | is doing most of the heavy lifting. Deepfake audio is
           | somewhat scarier to me in terms of political/legal chaos
           | than video; it's much easier to trick someone with a
           | "secret" audio recording than a video recording if we
           | someday get to the point of near-identical audio mimicry.
        
           | DerDangDerDang wrote:
           | Not sure there's any deepfakery on the Michael Caine audio.
           | Peter Serafinowicz (the other collaborator and originator of
           | 'sassy Trump') is well known for his Michael Caine
           | impersonation.
        
             | filoleg wrote:
             | >Not sure there's any deepfakery on the Michael Caine audio
             | 
             | Most likely not, and the parent comment seems to agree with
             | you on that. I think they were just trying to point out, in
             | general, that the arrival of commonplace audio deepfakes
             | might be way more disruptive than video deepfakes, despite
             | a lot of people (including myself) who used to
             | counterintuitively think that video deepfakes would be more
             | disruptive.
        
               | DerDangDerDang wrote:
               | Fair point, and I agree audio is potentially more
               | disruptive - especially if it can get to good real-time
               | performance.
        
           | lillesvin wrote:
           | Audio deepfakes are getting better but they have a
           | surprisingly long way to go still.
           | 
           | Here's an exploration of a deepfaked Jay-Z reading/rapping
           | the Navy Seal copypasta:
           | https://www.youtube.com/watch?v=UZzYoOdIXoQ
        
       ___________________________________________________________________
       (page generated 2020-11-02 23:01 UTC)