Weird Science: Lip-Reading ‘Phones And Glasses For The Deaf
Episode 218 • 22nd August 2022 • MSP [] MATTSPLAINED [] MSPx • KULTURPOP
Duration: 00:34:07


Transcripts

Richard Bradbury: Headphones that lip read, mouse-rat chimeras, soap-based TV screens and a robot with feelings. With a list like that it must be weird science.

Matt Armitage:

• This is a story that’s been all over the news and social media the last couple of weeks.

• Apologies to anyone who has ‘heard it before’.

• I wanted to include it because it is genuinely weird.

• And it’s one of those examples of a genuinely useful breakthrough tech.

• The kind you only find once in a very rare while.

• As an idea, it’s incredibly simple.

• It’s a set of glasses equipped with a head-up display that shows text.

• Nothing amazing in that. It’s the same tech we see in HUDs in cars.

• In lots of the smart glasses that are already out there.

• What makes it special is the app that comes with the glasses.

Richard Bradbury: Last week you were arguing that it isn’t the app that’s important.

Matt Armitage:

• That was a bicycle. These are smart glasses. So the app is the bit that makes them smart.

• The smart part is that they instantly translate spoken language into text.

• Why is that useful? Well, it’s very useful for people who have hearing impairments.

• Because it allows them to see those conversations in real time.

• The XRAI Glass smart glasses are currently in testing mode. They’re inviting members of the public to join their programme.

• In a publicity video for the product, they show a deaf woman wearing them for the first time.

• And you see this incredulous reaction on her face as she realises she can have a conversation without having to watch the person she’s talking to.

• And she breaks into this enormous smile.

• That’s something that people who can hear take for granted.

• That ability to have indirect conversations.

• If you’re hearing impaired you rely on reading the lips or the sign language of the person you’re talking to.

Richard Bradbury: What sparked the idea?

Matt Armitage:

• I saw a couple of videos with company founder Dan Scarfe.

• He shared the story of an ageing relative who was losing his hearing.

• And he was surprised by the solutions available. Hearing aids, that kind of thing.

• Or rather, by the shortcomings of a lot of those solutions.

• But then he had that light bulb moment of ‘well we have HUDs and we have speech to text’.

• Why don’t we have smart speech-to-text glasses for the deaf community?

• And he was off.

• Scarfe has partnered with an augmented reality company called Nreal to produce the glasses.

• They hope to reach about 70,000 hearing impaired people by the end of next year.

• They have two pricing models. You can buy them outright for UKP399.

• For those on limited incomes, they are also making them available on an instalment plan for UKP35 a month for a year.

• So at little additional cost.

• Something as simple as that – being flexible in the way you get paid can hugely extend the reach of your product.

• You don’t have to be a person with both a disability and easy access to money to benefit from this development.

Richard Bradbury: Just to sidenote for a moment: Are we seeing more companies in the assistive technology space adopting more flexible purchasing models?

Matt Armitage:

• We do seem to be. Not just in the assistive space but across the board.

• Often with those intermediary companies that pop up when you buy online and you can spread your payments across 3 or 6 months.

• It’s usually not as generous as XRAI’s model – paying over a year at no penalty.

• We’re in a cost of living crunch. Tech is really expensive.

• And especially when it comes to assistive devices like this – we’re used to health and medical tech being really expensive and either benefitting only those with more money.

• Or seeing families make sacrifices to provide the care for that family member.

• So this is a reminder that devices can be priced fairly.

• Where high prices can’t be avoided, they can be priced flexibly and the company is still operating commercially.

• In the scheme of things, UKP400 – about RM2.2k – isn’t a lot of money for life-changing technology.

• At the same time, it’s a not inconsiderable sum.

• More so in a country like Malaysia.


Richard Bradbury: Is this an issue of technology becoming increasingly exclusionary?

Matt Armitage:

• It is. And not just in terms of price – a lot of new technologies assume a specific degree of able-bodiedness.

• In another of the XRAI’s promo clips they show someone conversing with an Alexa device.

• With the machine’s voice responses translated into text for the deaf user.

• I talk a great deal about screenless devices and natural language on this show.

• But I take it for granted that this technology is only useful if you can hear.

• So this inclusive, less intrusive technology is automatically exclusionary for deaf people in the way that a screen-filled world excludes people with vision impairments.

• Developments like this help to level that technology gap, as well as having all of those positive IRL uses.

Richard Bradbury: What are some of the other potential uses? Could this be used as an instant translation device, for example?

Matt Armitage:

• There are all sorts of ways that you can imagine this being used. All kinds of AR applications.

• Scarfe mentioned that he hoped the device would soon be available in other languages.

• I think that was meant more in terms of the non-English speaking hearing impaired.

• But I can imagine it being used as an instant translator as well.

• Allowing people to have live conversations even if they don’t share a language.

• A lot of the instant translation devices we’ve seen translate into speech.

• Which is great but weird. It means everything is said twice, which kind of impedes the conversation.

• With this you would have a real-time conversation in your own language with the recipient seeing the text of your words in their HUD.

• And expanding that pool of users could enable the company to scale and, in the process, reduce the costs of the device and make them more affordable for those that really need them.

• For the next story I want to stay in a similar space – assistive technology.

• We’ve just spoken about conversation-enhancing glasses.

• Now I want to talk about lip-reading earphones.

Richard Bradbury: Do you realise that sentence makes no sense?

Matt Armitage:

• I do. But it’s also true. Which is why this is weird science.

• This is a device developed by researchers at the University at Buffalo.

• And it also relates to voice-controlled devices.

• Giving audible commands to a machine is, by any definition, weird.

• People break off in the middle of conversations with friends to tell Alexa to do something.

• It’s another one of those weird interruptions that technology has introduced into our social lives.

• This third spoke in the wheel of our conversations.

• So the Buffalo earphones…

Richard Bradbury: Sorry. Buffalo earphones?

Matt Armitage:

• The earphones developed at the University at Buffalo.

• Not headphones for buffalo. Or worse, headphones made of buffalo.

• Although meat-based phones might work for Marilyn Manson or Lady Gaga fans.

• As I said, this is a set of earbuds that read your lips.

• Not with weird camera trickery.

• It turns out that our facial muscles change the shape of the ear canal when we speak.

• And AI-equipped headphones can read those changes and match them to the words we’re mouthing.

• Why am I wearing headphones that can read my lips, you might be asking yourself.

• That same voice tech we were talking about.

• It’s bad enough the way we interrupt conversations – it’s even odder when you’re walking along the street talking to an invisible machine.

• Talking to yourself in public is still something that’s seen as a little out there.

• Take it from someone who does it.

• With Buffalo’s EarCommand system you can operate your Alexa or Google Home device simply by silently mouthing the commands.

Richard Bradbury: How extensive is the recognition system?

Matt Armitage:

• Like any AI based system it’s only as good as the data fed into it.

• So far the team has it recognising 32 single-word commands. Like TikTok or Instagram.

• To open those apps, obviously. And around 25 simple sentences. And it makes mistakes around 10% of the time.

• Which is about double the rate of most current commercial voice recognition systems.

• The team hopes to bring that down to the market average with more training and to expand the usable vocabulary of the device.

• There is also likely to be a learning curve for users as the device tunes itself to their particular speech patterns and dials itself in.

• So, it’s a long way from being a market ready product.

• But it’s another interesting concept that I would love to see being made available.

• At least until we get Alexa beamed direct to our brains.

• Imagine Siri and Alexa having an argument in your head over the relative merits of Google and Bing.

• Fascinating.

Richard Bradbury: What do you want to leave us with before the break?

Matt Armitage:

• Rat sperm! Last year, Swiss researchers were able to grow mouse sperm cells inside sterile rats.

• This year they decided to try to create the reverse, rat sperm cells inside sterile mice.

• Before you say anything – yes, I know you want to know why anyone would want to do such an unpleasant-sounding thing.

• This is an attempt to create chimeras: organisms that contain cells from different individuals, as New Scientist so pleasantly puts it.

• Pluripotent stem cells, PSCs, are taken from one animal and injected into what’s called a blastocyst.

• This is an embryo of around 60 cells. If the blastocyst is genetically modified to lack specific genes, then the pluripotent stem cells from the donor animal take their place.

• In this case, the mouse blastocysts lacked the gene needed to produce their own sperm.

• Instead, they started to produce rat-compatible sperm, based on the PSCs the blastocysts contained.

Richard Bradbury: Does that mean they were able to fertilise rat eggs?

Matt Armitage:

• Yes, but not with the same effectiveness. That doesn’t mean mating rats with mice by the way.

• These are lab tests. And the fertilised eggs always failed. So there is clearly a lot more to do.

• Why would we want to create chimeras?

• Firstly, you might want to introduce human DNA into lab rats and mice for medical testing, drug development, that type of thing.

• There is also the possibility of recreating extinct species.

• You could potentially use the same process to create germ cells – sperm and eggs – of extinct species if you have the requisite DNA samples.

• It could also be used to boost the survival hopes of endangered species.

• By kind of engineering new members of the species.

• I’m looking forward to the rat-based Jurassic Park of tomorrow.

Richard Bradbury: On that horrifying note, we should take a break. And hope there’s nothing like this to follow.

BREAK

Richard Bradbury: We’ve had some assistive devices on today’s show. As well as something I hope to never think about again.

Matt Armitage:

• Yes, I gave myself quite a stern talking to during the break.

• Unfortunately, I won. Or I lost. I don’t know which.

• Whatever the case, it’s unlikely anything will change.

• I will try and keep it clean, though,

• By kicking off this half with a story about soap molecules.

• Scientists have been looking at alternative methods to manufacture conventional LEDs for many years.

• Ones that are easier to manufacture, are less energy intensive in that production, and include performance upgrades to the device itself.

• Better, cheaper displays and devices. Something we all want.

• And the most promising alternative has long seemed to be a material called perovskite,

• Which is a crystal composed of calcium, titanium and oxygen.

Richard Bradbury: I’m not sure there are many people who use titanium soap…

Matt Armitage:

• I’ll get there. It’s not the perovskite that’s found in soap.

• As I said, perovskite has held promise for many years, but scientists struggled to make the diodes stable.

• As a result, LEDs made with the material – even in best case scenarios – would only last a few hundred hours.

• Orders of magnitude less than the average conventional LED.

• However, researchers at Zhejiang University in China have created a prototype infrared diode from perovskite that lasts in excess of 10,000 hours.

• Which puts it in the same ballpark as conventional LEDs.

• This is where the soap comes in.

• The researchers added a detergent molecule called sulfobetaine 10 to the diode.

• According to New Scientist, the molecule acts as a stabiliser.

• It attracts positive and negative ions which would otherwise be moving freely within the crystal.

• That free movement creates instability and shortens the effective life of the device.

• Adding the soap stabiliser boosted the diode from those hundreds of hours to more than 10,000.

Richard Bradbury: When do you think that soap-based screens will be hitting the market?

Matt Armitage:

• The team has only tested them with infrared diodes at the moment.

• Though there isn’t any theoretical reason why they shouldn’t work for LEDs in the visible light spectrum.

• The team does caution that it may be more difficult to produce blue light diodes with this process, but green and red should be possible.

• I should have taken the time to figure out why making them into blue diodes is harder, but I can’t do everything.

• In any case, we may have to wait a while for clean screens to hit the market.

• Wouldn’t it be cool if the detergent molecule made the screens self-cleaning?

• Obviously, that isn’t how chemistry works. But other researchers are saying how promising the concept looks.

• So clean screens may be a thing in the none-too-distant future.

• What should we do now?

Richard Bradbury: Part two is where we typically head off into the frightening world of AI…

Matt Armitage:

• Yes. Not so frightening with this first story.

• One of the great successes of the battle against COVID was the rapid development of home test kits.

• Again, pricing: making them affordable enough – in most countries at least – that people could use them as a matter of routine.

• When you consider what a lot of medical tests involve – in terms of time, discomfort and cost.

• You wonder why we don’t have a lot more reliable at-home type testing solutions.

• A US biotech company called Viome recently started retailing a home testing kit for oral and throat cancers.

• It’s more like those DNA type test kits rather than a COVID test.

• You purchase the kit and then send the sample back to Viome, and they give you the results in around two weeks.

• The kit may actually be more accurate than the tests that are typically done by specialists.

Richard Bradbury: Is that because oral cancers are notoriously difficult to detect?

Matt Armitage:

• Doctors typically rely on visual inspection to spot the signs.

• And these are cancers where survival rates are pretty high but only when they’re detected early.

• Visual inspections can be unreliable – the lesions or other indicators have to be a certain size before the doctor can see them.

• Viome wanted to find a more reliable method. One that could be tested for objectively.

• So they looked at changes in the microbiome of the mouth that might serve as indicators.

• They used nearly 1,000 genetic samples. Around 90% of those samples were cancer free.

• Around 8 per cent of the samples were from people with oral cancer, and a further 1.2% with throat cancer.

• They plugged that information into a machine learning tool which identified 88 changes or markers that signify the presence of cancerous cells.

• And performed the tests with an accuracy rate of above 90%.

Richard Bradbury: Is this a game changer in diagnosis and detection?

Matt Armitage:

• There’s always a note of caution.

• The company is currently pursuing FDA approval in the US, which would enable the product to be more widely available and to be covered by health insurers.

• At the moment, you can only buy it online at their website.

• Some oral cancer specialists have made the point that the test isn’t definitive.

• It only tests for the markers associated with the cancers. So a negative test doesn’t mean you’re in the clear.

• In the same way a positive test isn’t a definite diagnosis.

• In either instance, if you are someone who is at risk of these diseases or suspect you might have one.

• Then you still need to see a health professional, where biopsy material can be taken if required.

• For its part, the company claims that the test will become more accurate as the number of people using it increases.

• Because the neural net will continue to learn and refine its results.

• It’s certainly a welcome addition. It’s not a replacement for specialists and healthcare professionals.

• But it would certainly be useful – from a consumer perspective – to see either at-home or easy testing for a range of health conditions.

• Let me ask you a question: how do you like your fingers?

Richard Bradbury: [reply]

Matt Armitage:

• One of those things we take for granted when we use our hands is the sensory feedback we get from our fingertips.

• Is the cup hot? Is the surface smooth?

• And in fact, we can often determine what an object is made of solely from touch.

• A few weeks ago I spoke to Freda about a robot that was able to figure out its own position in the world.

• And plan its movements accordingly. While that’s a great step towards flexible and semi-autonomous machines.

• Even with a battery of current sensors to tell them how hard they’re gripping an object.

• Machines have lacked what we could call a sense of touch. Until now!

• A team at the Beijing Institute of Nanoenergy and Nanosystems has created a robot finger that can identify what an object is made from as well as what its surface texture is like: rough, smooth, jagged etc.

• It has been tested on a variety of materials, ranging from silicon to plastic to wood, and can detect them with an accuracy of almost 97%.

• The finger uses a clever array of four sensors made from polymers with different electrical properties.

• Electrons from each sensor interact with the surface of the object in a different way.

• Once again using a machine learning tool, those measurements can be translated into what we would call sensory data.

Richard Bradbury: Are they planning massage robots?

Matt Armitage:

• I don’t think anyone would want that. For lots and lots of reasons.

• And I promised to keep this half of the show on the up and up.

• Their primary application is likely to be in quality control.

• Fingers or hands equipped with the sensors would be able to tell if an object has been finished correctly.

• They could also be used for grading and sorting the raw materials for production as well.

• The research team has also pointed out their potential usefulness in artificial limbs.

• But other researchers have pointed out that we don’t necessarily need that functionality.

• We still have our own senses to rely on, so their assistance might be limited.

• But yeah, robot hands that know what you feel like could be on their way.

Richard Bradbury: How are we going to close out today?

Matt Armitage:

• With some fun news about Whatsapp!

• What could be more delightful?

• The service is due to launch a raft of new functions in updates to the app over the next few weeks.

• One of the key ones is the ability to ghost yourself from group chats without sending one of those horrible ‘Elvis has left the building’ notifications to everyone you have decided to no longer be associated with.

• We’ve all been there. You’re happily chatting to a group of old school friends when a couple of them decide to hijack the group with posts for miracle cure horse dewormer and magic crystals.

• And then when you leave, everyone reacts as though you’re the one with the issues.

• In the next update you’ll have the option to silently leave groups. Elvis can tiptoe out of the building and disown his former friends without confrontation.

• Other updates see the adoption of functionality we’ve previously seen in Snap!

• You can already send destructive images and videos over WA – not in the emotionally weaponised sense.

• Although I suppose you’ve always been able to do that.

• I mean Messages that self-destruct once the user has seen them.

• Which I think was also borrowed from Snap!

• Now you will have the option to disable screenshotting of those messages.

• Though that doesn’t stop someone taking a photo of the screen and capturing an image that way.

• I’ll point you to the Netflix documentary The Most Hated Man On The Internet for follow ups on that one.

Richard Bradbury: Anything else in there?

Matt Armitage:

• Yes. You’ll get more granular control over one of my favourite WA spying features.

• That online status it shows for people, telling you when they last looked at their messages.

• We’ve all done it. Sent a message. The recipient doesn’t respond till much much later and tells you that, no, they didn’t see your message.

• But you saw the blue ticks. And the notification that the person is online and using WA.

• Haven’t seen it? Couldn’t be bothered more like.

• Well, now the app will have the ability to stop annoying people like me from identifying your lies.

• You’ll be able to choose who can see your current status on the app and whom it’s hidden from.

• I can’t remember if I have that feature turned on or off at the moment.

• It might not be weird but I certainly am.

• I guess blocking me counts as weird science.
