We may soon be watching hyper-realistic dubbed movies and scrolling apps with a toe-run thumb while we think DMs into existence. Yes. Science is Slick.
Hosts: Matt Armitage & Richard Bradbury
Produced: Richard Bradbury for BFM89.9
Follow us:
https://www.instagram.com/kulturpop/
https://twitter.com/kulturmatt
Richard Bradbury: Science, the final frontier. These are the voyages of the Mattsplained enterprise. To seek out new life and the future of civilizations. To cheesily go where many people have repeated similar words before. That nonsense signals that Matt Armitage is back for Science is Slick and has spent all his free time streaming reruns.
Richard Bradbury: I assume that the awful intro signals that the first story is about some kind of entertainment?
Matt Armitage:
• Yes, to that point about streaming reruns.
• I do seem to be watching a lot of things that I've watched before.
• Partly, I think that's because we all feel that we're trapped in this Groundhog Day of a pandemic, but sometimes it's easier to watch something familiar than to tax your brain with anything else.
• But, in my case anyway, part of it is to do with the popularity of non-English language content.
• As I've mentioned on the show before, I have a bit of an issue with reading text and vertigo, which makes foreign language content problematic.
• And there's been a raft of fantastic non-English language shows over the past few years.
Richard Bradbury: You could always watch the dubbed versions...
Matt Armitage:
• Sometimes I do. But very often I only get a few minutes into an episode before it stops being believable.
• It's not just that the movements of the actors' mouths and the words don't match.
• More than that, the voices themselves don't match.
• Or at least they don't meet your expectation of the character that you're seeing on the screen.
• And it's not a criticism of the skill of the voice actors; generally they do a really good job,
• but for some reason you don't get that same sense of immersion that you get when you're watching an animation or something that is mostly in CGI
• Even though those shows are propelled by voice actors.
Richard Bradbury: I can feel a scary AI story coming on...
Matt Armitage:
• So most of us are aware of the existence of deepfake technology.
• Technology that manipulates video and essentially puts words into someone’s mouth.
• One of the most famous, or rather Internet famous, examples of this technology is Ctrl Shift Face,
• The YouTube channel that puts out edits from movies where lead actors are substituted for one another.
• Jim Carrey becomes Jack Nicholson, for example.
• And in the political sphere we're starting to see manipulated video - usually for demonstration purposes - putting words into politicians' mouths.
• So far, I don't think we've seen too much successful disinformation using this approach.
Richard Bradbury: Is that because it's still mostly an emerging technology?
Matt Armitage:
• Yes, what we can do is absolutely amazing, but it's not sophisticated enough to fool us yet.
• It's still a bit like a human-looking robot:
• Our attention is drawn to the small percentage of things that aren't right, rather than the large percentage of things that are.
• So, it's probably not a surprise that a film production company is trying to bring this technology to the industry, not in that role of being a deepfake,
• but to actually improve the experience of audiences watching dubbed entertainment content.
• I fully acknowledge how spoiled I am. I'm a native English speaker, so the content industry, Hollywood, the world of blockbusters and entertainment operates to my benefit.
• But that bias towards English in the production of content means that hundreds of millions, maybe billions, of people
• are left watching subtitles or the best efforts of local voice actors and translators to create an experience they can enjoy in their own language.
Richard Bradbury: This is the company, Flawless, that released a video of Robert De Niro speaking German and Tom Hanks speaking Japanese, amongst other languages?
Matt Armitage:
• Yes, if you haven't seen it do search for it.
• What the company has done is partner with researchers in Germany to create a neural net
• that analyzes movie footage to capture the facial expressions and mouth movements of the actors.
• It does the same with the voice actor creating the foreign language track.
• And then it merges the two, so the face of De Niro appears to deliver the lines naturally,
• As though he's a native German or Japanese or Malay speaker.
• And the new version of the movie is then digitally stitched together. There's a rough sketch of that pipeline after this list.
• The results, from the trailer the company has put out, are a little bit varied.
• Robert De Niro is great in German.
• Watching Tom Hanks speaking Japanese is weird.
• Jack Nicholson's courtroom scene from A Few Good Men in French is slightly less believable to me.
• But that could simply be because I'm a little more competent in French than I am in either German or Japanese.
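To make those steps concrete, here is a minimal Python sketch of the visual dubbing pipeline as described above. Every function, type and file name in it is a hypothetical placeholder standing in for a neural-net stage; none of this is Flawless's actual API or code.

```python
# Hypothetical sketch of a visual dubbing pipeline; the function
# names and stages are placeholders, not Flawless's real system.
from dataclasses import dataclass

@dataclass
class FacePerformance:
    """Per-frame facial expression and mouth-shape parameters."""
    frames: list

def extract_face_motion(footage_path: str) -> FacePerformance:
    # Stage 1: a neural net analyzes the original footage and
    # captures the actor's expressions and mouth movements.
    return FacePerformance(frames=[])  # placeholder

def extract_dub_motion(dub_audio_path: str) -> FacePerformance:
    # Stage 2: the same analysis for the foreign-language track,
    # deriving the mouth shapes that match the dubbed dialogue.
    return FacePerformance(frames=[])  # placeholder

def merge(original: FacePerformance, dub: FacePerformance) -> FacePerformance:
    # Stage 3: keep the actor's overall performance but swap in
    # the dub's mouth movements, frame by frame.
    return FacePerformance(frames=dub.frames or original.frames)

def render_dub(footage_path: str, dub_audio_path: str, out_path: str) -> None:
    # Stage 4: digitally stitch the merged face back into the movie.
    merged = merge(extract_face_motion(footage_path),
                   extract_dub_motion(dub_audio_path))
    print(f"would render {len(merged.frames)} merged frames to {out_path}")

render_dub("movie.mp4", "dub_german.wav", "movie_de.mp4")
```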
Richard Bradbury: Do you have any level of competence in Japanese?
Matt Armitage:
• If this technology takes off, then it's quite possible that I will.
• One thing that the company Flawless hasn't spoken about, at least as far as I'm aware,
• is framing those foreign language lines in the original actor's voice.
• What we've seen with existing deepfake technology is neural nets that have analyzed a database of, say, Barack Obama or Donald Trump's speech.
• Typically those rely on someone who has a large body of work that can be analyzed to recreate their speech patterns, tone etc.
• People like politicians and actors, but also, I would imagine, people in any broadcast environment.
• So that would include people like you and me.
• And of course, we put out that episode last year that contained AI versions of our own voices speaking words that we typed.
• So I do wonder how long it will be before we can square that circle.
• Taking the performance of those foreign language voice actors but processing it into the tone and speech mannerisms of the original actor.
Richard Bradbury: Are you worried about the ethical concerns?
Matt Armitage:
• With AI we always have to be worried about the ethical concerns.
• Not so much with the technology that Flawless is marketing.
• That has a specific purpose.
• The issue comes when the technology is used beyond that specific purpose.
• It's virtually impossible to limit technology to a specific use case.
• Even with all the intellectual property protection in the world, other AI researchers are going to come up with competing ways of doing this.
• And I think this is where we get into that odd world that we've partially explored before.
• Which concerns the rights actors have over the use of their image and their words.
• And we used the example of Will Smith in Gemini Man, where the studio owns the rights to the digital Will Smith.
• There’s a version of himself that he doesn’t own.
• Now, that's not to say that there aren't limits on how the studio or the production companies can exploit his digital voice and image.
• We've spoken before about Roy Orbison, Tupac Shakur, Montserrat Caballé, Amy Winehouse and more being resurrected in holographic form for afterlife concerts.
• There was news back in 2019 that a digitally resurrected James Dean would star in a Vietnam War movie.
• Of course, James Dean died years before the Vietnam War became a US conflict.
• And that movie doesn’t seem to have materialized.
• Then there’s the immortal Carrie Fisher returning to the Star Wars franchise.
Richard Bradbury: you used the example of the two of us. And, of course, this relates equally to anyone who regularly speaks to camera and puts that footage on their social feeds. Could any one of us become one of these digital avatars?
Matt Armitage:
• Theoretically, but I don't think it's anything we have to get too worried or worked up over at this point in time.
• You can imagine this kind of technology being used by more repressive governments.
• Let's say you arrest or disappear a dissident,
• this kind of technology could make it appear that that person had simply gone on holiday, or on a retreat.
• But I think these would be extreme cases.
• Because there's still an awful lot of work involved in creating any kind of deepfake.
• Of course, technology advances and those difficulties erode.
• but I don't think it's a big concern at this point.
Richard Bradbury: Can we have an AI story that doesn't make me want to lock myself in a cupboard?
Matt Armitage:
• Well, the last story was mostly good.
• As I think the Flawless trailer says, that technology means we get to watch all these movies instead of reading them.
• That's a good thing.
• So this next story is also on the subject of reading.
• This is about AI that is helping people to write.
• I dictated the notes for today's story using one of those AI systems that processes my words and turns them into text.
Richard Bradbury: I can always tell. Because there are chunks of it you forget to edit that make no sense.
Matt Armitage:
• Which is the point of this next story.
• Those transcription services not only have to recognize my words as I speak them but have to try and put some context into them as well.
• Those differences between the number 2, going to somewhere or something that is too much.
• 2/2 and two.
• And Richard can see how my computer has mangled that last statement.
Richard Bradbury: [Replies]
Matt Armitage:
• How much easier would it be if the AI could just read our thoughts?
Richard Bradbury: [Replies – something along the lines that this story was supposed to be less scary]
Matt Armitage:
• But not everyone has the ability to write with a pen, to type or even to speak.
• That's what Jaimie Henderson and his colleagues at Stanford University are working on.
• A neural network that recognizes signals from the brain of a person who imagines that they're writing with a pen, and then converts those signals, those thoughts, into text.
• The team has been working with a 65-year-old man with a spinal cord injury that has left him paralyzed below the neck.
• They implanted sensors into his brain that allowed them to monitor the signals from around 200 neurons.
• Of course, there are close to 100 billion neurons in the human brain.
• Way more than that in mine, but then I'm barely human.
• But those 200 are enough for the neural network to detect clues as to what letters the man imagines that he's writing.
Richard Bradbury: To be very clear here: it isn't reading his thoughts?
Matt Armitage:
• No, think of it more as interpreting the electronic pulses that relate to certain actions.
• In this case, the team had to build a synthetic data set because there was no available data set of that man imagining writing letters.
• And his paralysis prevents that from being done.
• But despite these limitations, the early versions of the system are remarkably accurate.
• The subject could type roughly 90 characters per minute with an accuracy rate of around 94%, which is further enhanced by the kind of autocorrect functions found on most of our smartphones.
• That's about as accurate as most of us manage, and only slightly slower.
• That said, I don't expect that I'll be using my brain to create scripts anytime soon.
• Although people have described these shows as brainless.
• But as a communication aid for people with physical or neurological conditions that make it hard for them to write or communicate,
• this tool has enormous potential. (There's a toy sketch of the decoding idea after this list.)
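As a rough illustration of the decoding idea only, and not the Stanford team's actual model (which was a recurrent neural network trained on real recordings), here is a toy Python sketch: each imagined letter produces a characteristic pattern across 200 simulated neurons, and a simple nearest-centroid classifier maps noisy readings back to characters. All numbers and data here are invented for the example; it needs only numpy.

```python
# Toy sketch: decode imagined handwriting from simulated neural
# activity. Not the Stanford model; this only illustrates the
# principle that ~200 neurons carry enough signal to tell letters apart.
import numpy as np

rng = np.random.default_rng(0)
letters = list("abcdefghijklmnopqrstuvwxyz")
n_neurons = 200

# Each letter gets a characteristic (synthetic) firing pattern,
# echoing the synthetic data set mentioned above.
templates = {c: rng.normal(size=n_neurons) for c in letters}

def observe(letter: str) -> np.ndarray:
    """One noisy 200-channel reading while a letter is imagined."""
    return templates[letter] + rng.normal(scale=0.8, size=n_neurons)

# "Training": average several noisy readings per letter into centroids.
centroids = {c: np.mean([observe(c) for _ in range(20)], axis=0)
             for c in letters}

def decode(reading: np.ndarray) -> str:
    """Return the letter whose centroid is closest to the reading."""
    return min(centroids, key=lambda c: np.linalg.norm(reading - centroids[c]))

# Decode a short message and report character accuracy.
message = "brain to text"
decoded = "".join(decode(observe(c)) if c != " " else " " for c in message)
hits = [a == b for a, b in zip(message, decoded) if a != " "]
print(decoded, f"({np.mean(hits):.0%} of letters correct)")
```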
Richard Bradbury: Unfortunately, Matt isn’t done with the body hacking. More on AI, implants and enhancements after the break, here on Mattsplained.
BREAK
Richard Bradbury: Before the break we were talking about developments in brain-computer interfaces that are enhancing our communication skills. What other body mods have you got for us?
Matt Armitage:
• This is a story I got from Ars Technica, in which researchers are developing prosthetic limbs that include the sensation of touch.
• Prosthetic limbs have come a long way over the last few decades, as we've moved from systems that are activated by muscles to those that are activated by electrical impulses from our nerve endings,
• and, of course, by signals in the brain.
• But one thing that they've all had in common is that they rely on our visual senses.
• Someone fitted with a robotic arm or hand has to be watching that limb to know that they've grasped something or picked something up.
• They don't have that sensory feedback that we take for granted.
• And with granular tasks, the example we usually use is picking up an egg, there's a lot of guesswork involved in terms of how much pressure should be applied.
• More modern systems have sensors that can provide some of that feedback.
Richard Bradbury: But it's not the same kind of feedback that we get from the sensation of touch?
Matt Armitage:
• No, typically those “sensations” in inverted commas were applied to a patch of skin and the user would have to learn what that feedback related to.
• And similar to that writing technology we mentioned before the break,
• we've moved on to prosthetic limbs that are controlled by sensors in the brain interpreting signals.
• You think about picking up the egg and the prosthetic picks up the egg.
• Researchers at the University of Pittsburgh have come up with a new system that produces the sensation of something touching the palm and the fingers. (There's a toy sketch of that pressure-to-feedback idea after this list.)
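As a toy illustration only: the Pittsburgh system evokes touch by stimulating the brain directly, which is far beyond anything sketched here, but the basic idea of turning a fingertip pressure reading into a graded feedback signal fits in a few lines of Python. The threshold and saturation values below are invented for the example.

```python
# Toy pressure-to-feedback mapping; the real system stimulates
# touch areas of the brain, which this does not attempt to model.
def feedback_intensity(pressure_n: float,
                       threshold: float = 0.1,
                       saturation: float = 5.0) -> float:
    """Clamp and scale a grip-pressure reading (newtons) to 0..1."""
    if pressure_n <= threshold:    # too light to report
        return 0.0
    if pressure_n >= saturation:   # cap the signal
        return 1.0
    return (pressure_n - threshold) / (saturation - threshold)

# Simulate closing the hand around an egg: feedback rises with grip force.
for force in (0.05, 0.5, 1.0, 2.5, 6.0):
    print(f"{force:4.2f} N -> feedback {feedback_intensity(force):.2f}")
```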
Richard Bradbury: How did they determine how effective the new arm is?
Matt Armitage:
• The test subject, again a man who has been paralyzed for some time, had been using a robotic arm for the last couple of years.
• He was quite skilled already, doing all the basic tasks like manipulating the arm and picking up items, that kind of thing.
• They concentrated on tasks with the sensation system switched on and switched off.
• The participant was able to complete around 12 tasks with the sensation system switched on,
• versus 9 using sight alone with the system switched off.
• The biggest improvement was in that mechanism of gripping.
• The feedback made it much simpler to grasp an object securely, transport it and put it back down.
• At the moment this is a prototype with a single test subject.
• It's not ready to be released as a medical device.
• But even at these early stages, with a single test subject,
• you can see how much impact a system like this could have in the development of the next generation of artificial and prosthetic limbs.
Richard Bradbury: nicely done. That wasn't threatening at all. I think you have one last prosthetic related story.
Matt Armitage:
• I do. How would you like an extra thumb?
Richard Bradbury: [Replies]
Matt Armitage:
• Danielle Clode and her fellow researchers at University College London gave 32 people a prosthetic thumb that was strapped to their wrist and hand and sat underneath the little finger of their right hand.
• If that makes sense. They had a thumb on either side of the hand.
• They wore the thumb for five days with an average of three hours per day.
• And they were encouraged to use the thumb in everyday life as well as while doing lab-style experiments.
• For example, you could clench the thumb to hold a cup.
• Apparently, some of the subjects used the thumb for flipping the pages of a book.
Richard Bradbury: You haven't mentioned how the thumb is controlled…
Matt Armitage:
• We've mostly been talking about brain-computer interfaces on today's show.
• Obviously, putting electrodes in the brains of healthy people just to get them to control a thumb they don't actually need is probably not ethical.
• I’m not allowed to give children cat paws, for example.
• The thumb is actually controlled by the movement of the big toe.
• Which I know some people will probably find worse and more gross than the idea of electrodes in the brain.
• Using quite standard wireless technology, users were able to move their toe and ankle, and the thumb would make corresponding gestures. (There's a toy sketch of that toe-to-thumb mapping after this list.)
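A hedged sketch of the control idea in Python, assuming a single normalised pressure sensor under the big toe; the real device's sensor layout and calibration belong to Clode's team and are not reproduced here, and the rest/full values below are invented for the illustration.

```python
# Toy toe-to-thumb mapping: a wireless pressure reading from the
# big toe drives how far the extra thumb flexes. The rest/full
# calibration values are invented for this illustration.
def thumb_flexion(toe_pressure: float,
                  rest: float = 0.2,
                  full: float = 0.9) -> float:
    """Map a normalised toe-pressure reading (0..1) to thumb flexion (0..1)."""
    level = (toe_pressure - rest) / (full - rest)
    return max(0.0, min(1.0, level))  # clamp to the valid range

# Pressing down harder on the toe closes the thumb further.
for reading in (0.1, 0.3, 0.55, 0.95):
    print(f"toe={reading:.2f} -> thumb flexion {thumb_flexion(reading):.2f}")
```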
Richard Bradbury: We've covered the how. The why is still a very big question to me. Why would anyone need a toe-run thumb?
Matt Armitage:
• I think there are a number of reasons at play here.
• Firstly, we're starting to move into that era of technological body modification or enhancement.
• So one issue was simply the researchers wanting to find out what the study participants would use it for and how useful it could become.
• More importantly: how would they train their own brains to integrate these new abilities?
• And what kind of long-term implications might this have?
• We're talking about messing with evolution.
• Adding functionality that nature hasn’t designed or perhaps anticipated.
• So the participants were given MRI scans before and after the experiment.
• And the results of the post-experiment MRI showed that the participants had changed the way they perceived the fingers of their right hand.
• They saw them less as individual digits and more as a unit.
Richard Bradbury: And were there any long term implications?
Matt Armitage:
• A week later, 12 of the participants came back for a third MRI.
• And this one showed that the changes were beginning to fade,
• suggesting that the brain both accommodated the new digit and similarly adapted when it was removed.
• It’s a positive result.
• Maybe in a few years’ time we’ll all be wiggling our feet to turn the page of a book or grab that macchiato with our extra thumb.
• There may even be some tweens with cat paws if I get my license back.
Richard Bradbury: Can we not talk about AI and creepy stuff anymore?
Matt Armitage:
• OK. I guess that means I'll junk the story about microscopic submarines powered by sunlight that can break down pollutants in water.
• You can Google that one.
• If you do, there’s an outside chance that you may be Googling it on Internet Explorer.
• Explorer's star has faded in recent years. Chrome, Firefox and Safari have become the go-to web tools for most of us.
• IE is how many of us of a certain age first browsed the Internet.
• Unless you were on AOL. In which case you weren’t really on the Internet in the first place.
• IE has been with us for some 25 years, since the mid-90s.
• The kind of legal case we're currently seeing between Epic Games and Apple over access to the App Store
• is the kind of battle we saw Microsoft fighting against other browser manufacturers, who alleged that by bundling IE with the MS Windows OS,
• the company was engaging in anti-competitive practices.
• When was the last time you used IE?
Richard Bradbury: [Replies]
Matt Armitage:
• I honestly can’t remember. It’s probably a decade. Maybe more.
• Microsoft replaced IE with Edge back in 2015.
• I don't think I've even seen, let alone used, Edge.
• They continued to support IE but stopped introducing new features.
Richard Bradbury: Why are we talking about Internet Explorer?
Matt Armitage:
• Because Microsoft is retiring it as of June 15.
• It will disappear from most standard Windows 10 editions, although it will live on through Windows Server and some other platforms for some time.
• If you're wondering what will happen to all those IE-configured sites and apps,
• Edge has an Internet Explorer Mode that allows you to access those legacy sites.
• And Edge is much more secure than IE, so there’s really no reason for not moving on and up.
• Or using Firefox, Chrome or any one of those half a dozen alternative browsers.
• But it is quite a sad day – like the day Adobe switched off Flash.
• Like most people, I hated Flash. And I hated IE.
• But they summed up an era of digital technology.
Richard Bradbury: [Comments]
Matt Armitage:
• In a way it’s quite sad.
• 25 years of history switched off.
• IE was how I first explored the Web.
• Hotmail was my first webmail.
• We have all these relics and mementoes of our past.
• Physical photos. Cassette tapes, or vinyl, or CDs.
• VHS, DVD, laserdisc, VCD.
• I even have the CD ROMs of Carmageddon and DOOM hanging around somewhere.
• But what physical reminders do I have of IE?
• All of those endless hours spent on dial up, waiting for a webpage to load.
• JPEGs revealing themselves line by line.
• Photos of cat paws, in case you were thinking anything else.
• It does make me wonder: what mementoes the truly digital generations will keep from their lives.
• What happens if Insta or TikTok are switched off?
• What will they choose as their physical treasures of a digital world that is constantly evolving and deleting their history?