An AI that creates ghost images from our subconscious, a shotgun-wielding man in hot pants, penguins with accents, and photosynthesis in the dark. That’s some very weird science.
Hosted by Matt Armitage & Freda Liu
Produced by BFM89.9
Episode Sources:
https://gizmodo.com/elon-musk-twitter-sec-bots-lawsuit-1849183376
https://www.nytimes.com/article/nasa-webb-telescope-images-galaxies.html
https://www.sciencefocus.com/space/whats-so-special-about-james-webb-space-telescope-images/
https://www.jwst.nasa.gov/content/about/faqs/whoIsJamesWebb.html
https://www.iflscience.com/artificial-photosynthesis-can-now-happen-without-light-64363
Photo by Willian Justen de Vasconcellos on Unsplash
Subscribe to the Substack newsletter: https://kulturpop.substack.com
Follow us:
Tw: @kulturmatt
In: @kulturpop & @kulturmatt
Freda Liu: There’s a ghost in my house according to Matt Armitage. A self-conscious robot, a shotgun-wielding man in hot pants and penguins with accents. In other words, a normal week chez Matt.
Freda Liu: As much as I want to talk about the Penguins, are there any Twitter updates this week?
Matt Armitage:
• We’re recording this a bit earlier than normal – so there may be more news after the recording. We’re going weird science again this week.
• So I won’t go into detail save to say that at the time of recording Twitter had put in a motion to expedite its trial against Musk and proceed in the next few weeks.
• With Musk counter-petitioning for a delay until next year.
• In the meantime, the SEC is also looking into the takeover and has requested a clear statement from Mr Musk to demonstrate that his intentions to buy the company were real.
• We can expect this story to run and run and we’re already at the point of not caring.
• And on the subject of not caring. This year Netflix reported its first drop in subscribers.
• And the company has lost – I think I’ve got this figure right – more than $60bn on its market value.
Freda Liu: Is that because we’ve reached peak Netflix or that people are feeling a cost of living squeeze?
Matt Armitage:
• it’s a mixture of both. I think we all Netflix and chilled ourselves to death during the early part of the pandemic.
• streaming services were a genuine lifeline during those first few months when millions of people were physically confined to their homes.
• So, there is that element of enough already. And yes, those cost-of-living decisions.
• Especially given the amount of competition now in the streaming sector, further muddied by the launch of Paramount+ in the US with a really strong opening line up…
• Including Star Trek: Discovery, which was pulled back from Netflix, and Picard, as well as the new addition to the franchise: Strange New Worlds.
• Other original shows on the network being lauded include Yellowstone, Mayor of Kingstown, the game tie-in Halo, and The Offer.
• That’s on top of HBO, Disney, Amazon. You get the idea.
• At the same time – the pandemic disrupted the production of a lot of shows, so subscribers have been waiting longer to see the return of their favourites.
• Are you a Stranger Things fan?
Freda Liu: [replies]
Matt Armitage:
• We’ve had to wait so long for season 4 that Jonathan Byers, supposedly a final-year high school student, looks more like one of the parents.
• Which isn’t surprising: Charlie Heaton, the 28-year-old who plays Byers on the show, has a son who’s roughly eight years old.
• Not far off the age the kids were supposed to be when the show first aired.
• That’s my rant over – oh, yes. Netflix and subscribers.
• To address its declining audience share and slowing revenue growth, Netflix has been developing an ad-funded tier with a lower monthly subscription.
• And recently it announced it would be partnering with Microsoft to deliver those targeted ads.
• Now, I know most people will be saying: ‘why Microsoft?’
• And I get that: a lot of people are still stuck in the Microsoft equals Windows and Office mindset.
• And as we’ve talked about on the show many times, Microsoft is amongst the leaders in so many other fields:
• Cloud computing, data management, gaming, virtual reality.
• Yes, we laugh about the people Binging their searches.
• And we still talk about Internet Explorer despite it having been replaced years ago.
• Microsoft is essentially an entirely different company from the one we remember from the early days of the Internet.
Freda Liu: Was Google in the running?
Matt Armitage:
• Apparently, along with NBCUniversal.
• Understandably, as Microsoft isn’t well known for its video ad capabilities.
• What seems to have tipped the balance is Microsoft’s purchase of Xandr, the programmatic advertising system it acquired from AT&T.
• There are no details yet on when we’re likely to see that ad-supported tier or how much it will cost, but given some of the forecasts on its revenue growth,
• I’m guessing Netflix will be pushing to roll it out as soon as possible.
• So, who knows, in the future maybe we’ll all be Bing watching our favourite shows.
Freda Liu: Was that story an elaborate set-up for a very weak joke?
Matt Armitage:
• It’s a genuine story but it’s true that I couldn’t resist that very poor pun.
• So far the weird stories haven’t been very weird.
• I mean, the Elon Musk one is weird. Netflix partnering with Microsoft is unexpected rather than bizarre.
• If you want a really weird story – search for Delta Dental.
• There’s a 13 second video clip of
• A man celebrating Independence Day in the US by firing a shotgun into the air, standing on the back of his truck while wearing cowboy boots and extremely short shorts.
• After shouting about how happy he is that his country won its independence from the UK.
• And who wouldn’t be happy about that?
• He then shouts Delta Dental at the end of the clip for no apparent reason.
• Has he won independence from dentistry? Is Delta Dental his cynical overlord?
• No one seems to know.
Freda Liu: Let’s zoom out from that image and into the James Webb telescope…
Matt Armitage:
• Yes, I’m sure a lot of listeners will have seen some of those first images from the James Webb telescope.
• Beaming back shots of previously unseen galaxies.
• One of the first images showed a cluster of galaxies, including light that has travelled around 13 billion years to reach us.
• From a period in the universe’s history when, it’s speculated, stars formed differently, composed almost entirely of hydrogen and helium.
• Meaning they would have been much larger and more volatile than our own sun, with a tendency to collapse quickly into black holes, possibly the seeds of today’s supermassive ones.
• Not everything in the past was the good old days.
• One of the things I found amazing about the images were the vivid colours.
• So I started to do the nerd thing and look up how the images were taken.
• For example, radio telescopes capture radio waves, and algorithms turn that data into something resembling a photo.
• So I wanted to see how the Webb telescope worked.
• To my shame, I knew that the Hubble was a space telescope but never bothered to check how it worked.
• The Webb telescope operates in the near- and mid-infrared spectrum.
• The images it captures are actually black and white.
• So its instruments capture exposures through a series of filters, and each filter is assigned a visible colour to create those images you can view on NASA’s website.
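The colour step can be sketched in a few lines of Python: each filter yields a grayscale exposure, and a visible colour channel is assigned by wavelength, shortest to bluest. This is a minimal illustration of false-colour compositing in general; the filter wavelengths and function name are my own stand-ins, not NASA's actual processing pipeline.

```python
import numpy as np

def false_colour_composite(filter_images):
    """Combine grayscale exposures taken through different filters into
    one RGB image. `filter_images` maps wavelength (microns) to a 2-D
    array; shorter wavelengths get bluer channels."""
    ordered = [img for _, img in sorted(filter_images.items())]
    if len(ordered) != 3:
        raise ValueError("expected exactly three filter exposures")
    blue, green, red = ordered  # shortest -> blue, longest -> red
    rgb = np.stack([red, green, blue], axis=-1).astype(float)
    # Normalise each channel to 0..1 so no single filter dominates.
    for c in range(3):
        channel = rgb[..., c]
        lo, hi = channel.min(), channel.max()
        span = hi - lo
        rgb[..., c] = (channel - lo) / span if span else 0.0
    return rgb

# Three fake 4x4 "exposures" through hypothetical infrared filters
# at 0.9, 2.0 and 4.4 microns (illustrative values only).
rng = np.random.default_rng(0)
exposures = {
    0.9: rng.random((4, 4)),
    2.0: rng.random((4, 4)),
    4.4: rng.random((4, 4)),
}
composite = false_colour_composite(exposures)
print(composite.shape)  # one RGB pixel per detector pixel
```

The same chromatic-ordering idea — longest wavelength mapped to red — is how published composites keep the colours physically meaningful rather than arbitrary.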
Freda Liu: Shall we back up a little and explain the project to people? Where the camera is? How it got there, what it’s for and who James Webb is?
Matt Armitage:
• Of course, James Webb is the guy who was chosen to sit on the moon with the camera, taking the photos.
• I’m kidding of course.
• James Webb ran NASA throughout most of the 1960s.
• That critical period of the space race between the US and the Soviet Union which culminated in the small step for man in July 1969.
• Or the week that Stanley Kubrick shot a sci fi movie in a warehouse, if you believe some rumours.
• NASA named the Hubble replacement in his honour.
• It was launched in December 2021.
• It sits in a halo orbit around the second Lagrange point, roughly 1.5 million km from Earth.
• Which is far enough to keep it out of both the Moon’s and the Earth’s shadows.
• And what’s really remarkable is how well it works.
• It reached its orbit, it deployed its shields and solar array.
• The Hubble needed a few trips from astronauts to fix it.
• The Webb is too far for fixes, so it had to work from the start.
• It has already been struck by micrometeoroids a number of times.
• And collisions with space debris remain one of the biggest ongoing challenges for the telescope.
• But so far, the strikes haven’t created any operational damage.
Freda Liu: What are some of the discoveries that scientists are hoping for?
Matt Armitage:
• As well as seeing further into the universe than ever before, we will also get a better idea of the atmospheres of planets that are a little closer to us.
• One of the first releases was a spectrum of a gas giant exoplanet, WASP-96 b, which revealed water vapour and evidence of clouds in its atmosphere.
• The planet had previously been surveyed by the Hubble telescope, but the spectrum the James Webb took confirmed the presence of water.
• And that presence of water, will give us a better indication of which planets are capable of hosting life.
• Not that the Elon Musks of this world should suddenly start planning colonisation missions.
• But it will give us a better sense of the odds of there being life-bearing planets out there.
• Scientists think the most exciting thing it will uncover is new mysteries.
• There are already things in the current batch of images that were unexpected or we can’t yet explain.
• We saw the Carina Nebula, a swirling dust cloud that acts as a stellar nursery.
• And there were curving structures within those images that scientists couldn’t explain.
• That’s what makes it exciting – all the things out there we haven’t even imagined yet but that have existed for billions of years.
• I haven’t gone into too much detail about the launch and the structure of the telescope – but it’s worth reading about.
• Details like the shields that protect the telescope from the sun’s energy; without them, its instruments would be flooded with radiation and unable to pick out those faint infrared signals.
• And we’ve parked it in space more than a million km away.
• Amazingly ingenious.
Freda Liu: It’s tradition to have something wacky to round out the weird on these shows. Penguins?
Matt Armitage:
• Penguins! Did you know they have regional accents?
• We’ve long known that whales, bats and some species of birds have local dialects.
• New research from the University of Turin suggests that even within the same areas,
• penguins in neighbouring colonies may have linguistic differences from one another.
• The team recorded calls from three different African penguin colonies over three years.
• They then matched certain calls to birds who were mates, and who were friendly with one another.
• They discovered that amongst family and friends, the frequency and amplitude of the calls tended to become more similar.
• This might make it easier for the penguins to find family members in a crowded colony.
• Maybe penguins think all penguins look the same?
• This kind of vocal accommodation is thought to be highly unusual.
• The discovery could mean that it’s present in more species than previously thought.
• And understanding it could help us understand how our own languages evolved, and on what kind of timescale.
• So maybe those accents in Penguins of Madagascar aren’t as silly as we thought.
Freda Liu: When we come back – Matt Armitage is back on the AI trail.
BREAK
Freda Liu: Are you back to tell us that the machines are self-aware?
Matt Armitage:
• Actually, yes. Listeners may remember a few weeks ago we covered the story of Blake Lemoine.
• A former software engineer at Google who believes that one of the company’s experimental artificial intelligence machines.
• …A conversational network called LaMDA…
• Had developed self-awareness.
• We spent most of that episode pointing out the fairly substantial reasons why LaMDA probably isn’t self-aware.
• But we also covered comments from some scholars who maintain that general AI is trending in that direction.
• Last week I saw a story in NS about a robot that its builders are claiming has created its own self-awareness.
• The developers are a team of engineers from Duke University and Columbia.
• They have created a machine that can plan how to move to reach a goal.
Freda Liu: Isn’t that essentially what robots do?
Matt Armitage:
• Yes, but usually we program them to do those tasks.
• There’s very little room for autonomy or deviation in the system.
• That’s why it’s so dangerous for humans to be on the production floor with industrial robots.
• Unless they’re fitted with sensors, they generally aren’t looking over their shoulder for a fleshy, beady-eyed manager to approach them.
• Something that is programmed into humans by default.
• Meat suit versus machine. You get the idea.
• Their machine creates the equivalent of a mental image of itself, and figures out how it should complete its task.
• It’s this ability to imagine itself and map its own movements that has given its creators cause to claim it’s self aware.
Freda Liu: When you say it imagines itself, do you mean in a philosophical sense?
Matt Armitage:
• I don’t want to go there. That’s real Blade Runner, Do Androids Dream of Electric Sheep? territory.
• No. The machine is a robotic arm. It can view itself on a number of cameras, from different angles.
• The cameras feed into a neural network that controls the arm.
• The arm moved around randomly for three hours, feeding information into the neural network about where it was and how it behaved in physical space.
• That generated over 7,000 data points which the team augmented with around 10,000 more that had been generated in a virtual simulation.
• When they tracked the neural network’s perception of where it thought the arm should be and where it actually was, they found it was accurate to around 1%.
• Which is probably more accurate than my own estimation of where my limbs are at any given point in time.
• As self-aware goes, that doesn’t sound very threatening, and the researchers acknowledge that we’re probably
• at least two decades away from general AI being self-aware in a more recognisably human sense.
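The babbling-then-self-modelling loop Matt describes can be caricatured in a few lines. This is a toy sketch, not the team's method: the cameras are replaced by simulated forward kinematics of a two-link arm, and the neural network by a naive nearest-neighbour lookup over the recorded poses. It still shows the core idea — random motion generates (pose, observed position) data, and the learned model's predictions can then be checked against reality.

```python
import math
import random

# Ground-truth "physics": a planar 2-link arm, each link of length 1.0.
def fingertip(t1, t2):
    x = math.cos(t1) + math.cos(t1 + t2)
    y = math.sin(t1) + math.sin(t1 + t2)
    return x, y

random.seed(0)

# "Babbling" phase: move randomly, record (joint angles, observed position).
data = []
for _ in range(7000):  # mirrors the ~7,000 real data points in the story
    t1 = random.uniform(-math.pi, math.pi)
    t2 = random.uniform(-math.pi, math.pi)
    data.append(((t1, t2), fingertip(t1, t2)))

# Crude self-model: predict the fingertip via the nearest recorded pose.
def predict(t1, t2):
    _, pos = min(data, key=lambda d: (d[0][0] - t1) ** 2 + (d[0][1] - t2) ** 2)
    return pos

# Evaluate: average error as a fraction of the total arm length (2.0).
errs = []
for _ in range(200):
    t1 = random.uniform(-math.pi, math.pi)
    t2 = random.uniform(-math.pi, math.pi)
    px, py = predict(t1, t2)
    ax, ay = fingertip(t1, t2)
    errs.append(math.hypot(px - ax, py - ay) / 2.0)

print(f"mean self-model error: {sum(errs) / len(errs):.1%} of arm length")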
Freda Liu: Is there any scientific consensus over the claims?
Matt Armitage:
• The paper is still new, and it is quite a bold claim.
• Some detractors have claimed that it’s not really aware of its own shape.
• It has simply modelled its movements using the shape the cameras have shown it.
• It doesn’t have that idea of self. It’s still a programmatic execution.
• Others have questioned whether the machine would be capable of completing the same tasks in a different environment.
• Whether it could successfully overcome new parameters and obstacles.
• And able to do that modelling in real time as those parameters change.
• That would equate to something more closely resembling self-perception.
• But let’s switch from robots that are aware of themselves, to machines that can read our minds.
• Would you be happy with a robot that could read your thoughts?
Freda Liu: [replies]
Matt Armitage:
• You’ll be glad to hear that this story isn’t about a machine that can do that.
• In fact, it does something that’s way stranger and spookier.
• It recreates images from our brainwaves that we can’t see.
• So it isn’t reading our minds to find out what we know. It’s reading what we don’t know we know.
• This is research from the University of Glasgow using a principle called ghost imaging.
• Genuinely I think this is amazing.
• In their experiment, a test subject wearing one of those weird EEG brainwave hats was placed in front of a white wall, and next to a wall painted grey.
• The grey wall was there to obscure the subject’s view of an object and a projector behind it.
• The projector, using a special computer-controlled pattern, shines light on the object, and some of the reflected light appears on the white wall or is diffused around the room.
• But there isn’t enough visual information for the subject to consciously identify what the object is.
• However, the neural net processing the EEG data can create a 16 by 16 pixel image of the object.
• It creates an image from all that ghosted information in our mind.
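The "ghost imaging" principle behind this is worth a sketch. In the classical computational version, a sequence of known patterns illuminates a hidden object, a detector records only a single intensity number per pattern (here the stand-in for the EEG response), and correlating the patterns with those numbers recovers the object. This is a minimal illustration of that correlation trick, not the Glasgow team's actual setup; the cross-shaped object and pattern count are my own choices.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 16  # reconstruct a 16x16 image, as in the experiment

# Hidden object: a simple cross shape the "subject" never sees directly.
obj = np.zeros((N, N))
obj[N // 2, :] = 1.0
obj[:, N // 2] = 1.0

# Project M known random patterns; each yields ONE scalar measurement
# (total reflected light) -- the analogue of the brain's response.
M = 4000
patterns = rng.integers(0, 2, size=(M, N, N)).astype(float)
signals = (patterns * obj).sum(axis=(1, 2))

# Ghost-imaging reconstruction: weight each pattern by how far its
# scalar signal deviates from the mean, then average.
recon = ((signals - signals.mean())[:, None, None] * patterns).mean(axis=0)

# The brightest reconstructed pixels should line up with the cross.
top = np.argsort(recon.ravel())[-31:]      # the cross covers 31 pixels
true = np.flatnonzero(obj.ravel())
overlap = len(set(int(i) for i in top) & set(int(i) for i in true)) / len(true)
print(f"overlap with the true shape: {overlap:.0%}")
```

No single measurement contains the image; it only emerges from the correlation across many patterns, which is why reading a fuzzy, indirect signal like EEG can still yield a recognisable 16 by 16 picture.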
Freda Liu: How can an AI access information that our own consciousness is either overriding or ignoring?
Matt Armitage:
• Viewed as machines, our brains don’t have the processing power to instantly and consciously translate all the information our eyes and other senses are feeding us.
• We’ve talked about it on the show before in relation to self-driving cars.
• It’s our ability to both process information and act on it in milliseconds that makes our brains so powerful.
• But that means that the brain filters those inputs. It’s one of the reasons that our recollections about the same thing can differ.
• One person might more vividly remember the smells or the colours. Someone else the sounds.
• The act we witness might be the same but our experiences of it aren’t.
• Because our brains are all processing the information in different ways.
• In the same way one person experiences a rollercoaster as a thrill and another is terrified.
• So a lot of the information we see gets filtered to our subconscious.
• It isn’t discarded, but it’s used to support the images we perceive rather than overwhelm them.
• This ghost imaging technique allows the neural network to piece together some of those subconscious stimuli and translate them into a useful image.
Freda Liu: What types of practical applications are likely to come out of this?
Matt Armitage:
• Well, a lot of the stories we cover relating to AI are about their assistive potential rather than that ability to replace us.
• And it seems that the intentions for this would be similar.
• Obviously, those EEG caps are not the most practical real world accessory.
• Although now that Balenciaga has those face disguising airflow masks out, who knows.
• Maybe we’ll see a Kardashian rocking a brain interface at next year’s Met Gala.
• So, there could be applications where we have to react to visual stimuli very quickly and precisely.
• I think the military and law enforcement potential is obvious. But I’m hoping that isn’t the researchers’ first thought.
• Perhaps it could even be used to prevent those itchy trigger fingers shooting the wrong person, though I imagine it would be used to hit the right person faster.
• It’s really fascinating. I wonder if the same technique can be used to enhance memories. Whether we store all that additional information.
• The researchers want to see if the system can handle inputs from multiple people at the same time, and see if that speeds up the identification of the object.
• But yeah, really cool. The ghost in the machine in our heads.
• Shall we end with some photosynthesis?
Freda Liu: replies
Matt Armitage:
• In some ways this might be the weirdest story of the week, from IFL Science
• That we can now achieve photosynthesis without light.
• This is a project from the University of California, Riverside.
• They’ve created a more efficient process for turning solar energy into food than traditional photosynthesis.
• They used an electrocatalytic process to create acetate – the basis of vinegar – from carbon dioxide, water and electricity.
• The acetate can be fed to food producing organisms which can grow in the dark.
• Replacing the biological photosynthesis usually required.
• As well as being more energy efficient, it also opens other potential avenues for growing food.
• Those indoor vertical urban farms we’re starting to see are quite energy intensive.
• This could represent an alternative. Even potentially allowing us to grow food in space.
• The team started with yeast and fungi. So far, rice, canola, tomatoes and several other varieties have all demonstrated an ability to be grown like this.
• Another avenue is to breed crops that can accept acetate as a kind of booster to increase crop yields.
• Or even to increase carbon sequestration. Who says we should be afraid of the dark?