Quantum is Coming
Episode 1 • 14th August 2020 • Impact Quantum: A Podcast for Engineers • Data Driven Media
Duration: 00:52:14


Shownotes

The very first episode of Impact Quantum.

Watch the livestream

https://www.linkedin.com/posts/frank-lavigne_quantum-compute-is-coming-activity-6700060953964232704-8JwD

Transcript (AI Generated):

This episode of the Impact Quantum podcast is rated one Schrödinger and is entitled "Quantum is Coming."

Speaker 1

I'm Frank La Vigne, and this is the very first episode of Impact Quantum. Quantum computing is about to radically change how we live and work, so let's roll the intro.

Speaker 1 

Alright, with me, to my left on the screen I think, is my cohost from Data Driven, Andy Leonard. How's it going, Andy?

Speaker 2

Hey Frank, it's going awesome. How are you?

Speaker 1

I'm doing well, doing well. I've got the fever. It's not COVID, it's not Corona. Quantum fever, that's what I've got.

Speaker 1 

So Andy and I have another podcast, DataDriven.tv, where we explore the emerging fields of data science, machine learning, and artificial intelligence. You're probably wondering: what the heck are you data science and data engineering guys doing here talking about quantum computing?

Speaker 1

Flash back with me; let me tell you a story. In November of 2019, back when people would actually travel places and congregate in large groups without thinking twice, before the words "Wuhan wet market" were commonplace, when Corona was just a beer, I attended a conference called MLADS, Microsoft's machine learning and data science summit. It's an internal-only conference for Microsoft employees, thrown by Microsoft Research, where they come out of the lab and talk to us about what's coming down the pike.

Speaker 1

This is the conference I went to in 2016 where I had my aha moment about getting into this field. I was into data visualization and Power BI, munging data with Excel. But the engineer in me was not satisfied. Then I discovered machine learning and artificial intelligence, and where that was headed.

Speaker 1 

And when I came back from that conference, I was completely switched on: this is the future, this is coming, it's on its way.

Speaker 1 

Most people thought I was crazy. 

Speaker 1 

I mean, even I had my doubts about it, so yeah. 

Speaker 1

Through a number of coincidences, or other reasons, it took me about four and a half, or three and a half, years to get back there.

Speaker 1 

And the first day came and went. I didn't have that aha moment, so I'm like, well, maybe that's just a once-or-twice-in-a-lifetime type of aha moment.

Speaker 1 

It happens, sure, sure.

Speaker 1

And then on the second day there was a hardware presentation. Now, I can't talk about everything I saw, but that's when quantum computing was explained to me.

Speaker 1

And why it is the next big thing.

Speaker 1 

And I knew I was on the right track, because when I came back I started telling other people: quantum is coming. Quantum, quantum, quantum.

Speaker 1 

I said it again like a ranting lunatic.

Speaker 1

And that kind of told me that maybe I'm on to something, because the next wave always looks crazy before it happens.

Speaker 2 

Yeah. 

Speaker 2 

So. 

Speaker 1 

With that in mind, I started studying quantum computing articles and whatnot, and I even installed Q# that night in the hotel.

Speaker 1

Then I opened up Visual Studio and was like, OK, now what?
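For anyone hitting that same "OK, now what?" moment: the episode mentions installing Q#, but the first idea you meet in any quantum tutorial, superposition via the Hadamard gate, can be sketched in plain Python. This is our own toy model, not Q# or the Quantum Development Kit; here a qubit is simply a pair of amplitudes.

```python
import math

# A qubit's state is a pair of amplitudes (a, b) for |0> and |1>,
# normalized so that |a|^2 + |b|^2 = 1.
def hadamard(state):
    """Apply the Hadamard gate: sends |0> to an equal superposition."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def probabilities(state):
    """Born rule: measurement probabilities are the squared magnitudes."""
    a, b = state
    return (abs(a) ** 2, abs(b) ** 2)

qubit = (1.0, 0.0)             # start in the definite state |0>
qubit = hadamard(qubit)        # now an even superposition of |0> and |1>
p0, p1 = probabilities(qubit)  # both 0.5: a fair quantum coin
```

Applying `hadamard` a second time returns the qubit to |0>, a quick sanity check that the gate is its own inverse.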

Yeah. 

Speaker 1

And once again, I'm trying to keep a growth mindset here. When I first read a book on statistics, my first reaction was, uh, what did I get myself into? Right?

Speaker 2 

But you know, that's not uncommon, Frank. A lot of people come into this, and when you're starting out, who knows everything you're going to learn and everything you're going to need to learn?

Speaker 1

Exactly right, and I think there are going to be a lot of people that are going to be in a similar boat. On our main show we kind of discussed this.

Speaker 1

It's hard for people to think back honestly about what life was like before COVID, you know, getting into elevators like, whoa, other people, all at the same time. Things like that. But it's hard to go back even further, to think that there was a time when only PhDs did data science.

Speaker 1

Think about it: well, you'd have to go back to school. When I asked folks for advice on what to do at that summit in 2016, one guy was like, just go back to school and get a PhD in statistics. And this was before data science programs existed at universities.

Speaker 1 

We forget how quickly this has moved. I mean, now everybody and their cousin has a data science course, and I don't mean that disparagingly.

Speaker 2 

I know, there's just...

Speaker 1

The Germans have a phrase, and it's been a while since I spoke German on a daily basis: "die Qual der Wahl," something like that, the agony of choice. And we kind of have that now, which four years ago we didn't. But around three years or so ago, a guy named Siraj Raval, and I'll just drop his name, I won't say anything else opinion-wise, had the ability, not so much to conduct unique and custom research on his own, but he did have the gift, the true talent, of explaining complicated data science, machine learning, and artificial intelligence concepts in simple ways.

Yeah. 

Speaker 1

And it is a rare gift, that if you can understand something, you can explain it. Now, Richard Feynman, the noted physicist, said something like: if you can't explain it simply, you don't understand it. I totally butchered that.

Speaker 2 

No, you got it, yeah.

Speaker 1

So the key here, and the goal for this show, is going to be just that: the ability to explain this in a clear and simple way that engineers can understand. Maybe not your grandma, unless your grandma happens to be a particle physicist.

Speaker 1 

It happens. I had a great aunt, sister to my grandmother, who was an engineer, which back in the 40s and 50s, for a woman, was quite an accomplishment. In the military, no less.

Speaker 1

And no, it was not Grace Hopper, though that would have been cool.

Speaker 1

In any case, the key here is that we want to explain this in ways that software engineers and data engineers will understand. And that's our goal for this show.

Speaker 1 

Now, in the last four years or so I have made learning a priority in my life, and I have the numbers to prove it: 266 certifications since December 2016.

Speaker 1

I don't say that to show off, I just say it because it's become who I am now. It's become a habit. I don't know where I was going with that, but...

Speaker 1

The best way to set up the learning here is to set up a reason why. If you want to learn something for a reason, you're going to be far more motivated and far more likely to capture and retain that information.

Speaker 1

So let's start there. Why should you learn quantum computing? The data science world is rocking, right? AI is rocking. However, winter may be coming, to continue that Game of Thrones theme. And I'm going to try to share my screen.

Speaker 1

And if you're listening to just the audio, don't worry, I will explain this in a way that makes sense. Hopefully. Let's see if I can get that screen up.

Speaker 1

Alright. So what you're seeing here is an article from VentureBeat: researchers at MIT warn that deep learning is approaching its computational limits. Now, what does that mean? I thought the cloud had infinite scalability. Well, it kind of does, but there is a practical upper limit that we are starting to bump up against.

Speaker 1

And this is the thing that got my attention. This article is from July, but I heard about this late last year, that this is becoming a problem. Let me explain why it's a problem.

Speaker 1

Let's take GPT-3, for example. GPT-3 is a natural language processing model with about 175 billion parameters. It took a certain amount of time to train, and it cost somewhere between 5 and 12 million dollars, depending on who's answering the question.

Right, yeah? 

Speaker 1

GPT-2 only had one and a half billion parameters. That's about two orders of magnitude greater, 175 billion up from 1.5 billion.
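The scale jump being described here is easy to check with back-of-the-envelope arithmetic, using the parameter counts quoted in the conversation:

```python
import math

gpt2_params = 1.5e9   # GPT-2: 1.5 billion parameters
gpt3_params = 175e9   # GPT-3: 175 billion parameters

# GPT-3 is roughly 117x the size of GPT-2 -- about two orders of magnitude.
scale_factor = gpt3_params / gpt2_params
orders_of_magnitude = math.log10(scale_factor)  # about 2.07
```
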

Speaker 1

If there were ever going to be a GPT-4, we're now looking at... we're now looking at... I totally lost my train of thought.

Speaker 1 

OK, alright. Thank you for handling the comments.

Speaker 2 

Yeah, that's what I'm...

Speaker 1

We're streaming this live. I might actually edit this part out of the RSS feed, but I announced the show on Roopesh's show on Wednesday. He has an Instagram Live thing going on, so I announced the show there, and that gave me a little bit of extra leverage to do this.

Speaker 1

On time today, that is. But the short of it is that we are hitting these limits, and you can imagine that a GPT-4 would have trillions of parameters.

Speaker 1

And I don't know what the cost is going to be. Is it going to be an exponential rise? But at some point we are going to run out of GPUs. And certainly doing this at scale, we are going to run out.

Speaker 1

What this means is somewhat concerning, say three to five years out: the innovation in research is going to stop.

Speaker 1 

Right, yeah and. 

Speaker 1 

Then, once the innovation research stops, the pipeline of new products that come out, get venture funding, and ultimately get bought by some of the big tech firms is going to start slowing down.

Speaker 2 

That is going to, oh, that's what we call winter, right? 

Speaker 1 

Right, that's AI winter. And if you think this is implausible, I urge you to take a look at history, because this has happened before.

Speaker 1 

And we are at the early stage of this, potentially happening again. Predicting the future is hard by the way. In case you didn't know. 

Speaker 1

But look at the advancements that have been made in artificial intelligence. I mean, Alan Turing came up with the notion of the Turing test and defined the core principles of what we call the field of artificial intelligence today.

Speaker 1 

This is a 70-year-old field; we're in the seventh decade of AI.

Speaker 1 

To read some of the breathless press releases from various companies, you would think that AI started around 2009 or 2010.

Speaker 1 

Not true. 

Speaker 1

But there have been advancements in AI at every stage, and I'm talking going back to the 60s, and even the 80s, when the concept of neural networks was really pioneered and put to use.

Speaker 1

Every time, there was a burst of innovation. And every time, the innovation outpaced compute's ability to keep up.

Speaker 1 

So that led to AI winters, including the worst AI winter so far, which ran from the mid-80s until about 10, maybe 12, years ago.

Speaker 1

Now, that doesn't mean that research didn't happen in the field in the meantime. I actually took my first course in artificial intelligence in 1994. Or it was the second semester, so it was 1995 or 1994.

Wow. 

Speaker 1 

Yes, back when years began with a one.

Speaker 1

Right? Crazy. There are people probably watching this that weren't even alive then. And my professor at the time was a noted researcher in the field, ex-IBM, doctorate, distinguished engineer. I mean, a smart guy.

Speaker 1

But he was like, oh, this is going to be the next paradigm. And basically, it was a Prolog programming course.

Speaker 2 

Yeah. 

Speaker 1

And I kept waiting, like, OK, where's the AI coming from? That aha moment never came. It was basically just a hybrid of inference and recursion, and I was like, that's it?

Speaker 1 

And a lot of these expert systems, and he was an expert in expert systems, so that's very meta, right, were essentially, if you peel the covers back, just a ton of if-then statements.
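That "ton of if-then statements" point can be made concrete. Below is a toy forward-chaining rule engine, our own invention for illustration rather than any particular historical system; the rules and facts are made up.

```python
# Rules are (premises, conclusion) pairs. "Inference" is nothing more
# than looping over if-then rules until no new fact can be derived.
RULES = [
    ({"has_fur", "says_meow"}, "is_cat"),
    ({"is_cat"}, "is_mammal"),
]

def infer(facts):
    """Repeatedly fire any rule whose premises are all known facts."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in RULES:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

derived = infer({"has_fur", "says_meow"})
# derived now also contains "is_cat" and, by chaining, "is_mammal"
```

The "intelligence" here is just rule application, which is exactly the point being made: where's the AI in that?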

Speaker 1 

Yeah. 

Speaker 1 

Again, where's the AI in that, right? 

Speaker 1 

Right. So that really set me up to be a nonbeliever, so much so that when I saw some of the early work, before it was called Cognitive Services, when it was still in the research lab...

Speaker 1 

They had a DC Tech Fair, which is essentially researchers coming out to the Microsoft office in DC to show off what they're working on, and a number of the demos were computer vision and artificial intelligence.

Speaker 1

And I saw this thing where a researcher uploaded a picture to this program. It was a picture of a cat. He uploaded it, and the description came back saying, hey, this is a picture of a cat.

Speaker 1 

I looked at that, and my first reaction was, yeah, that's cool. Then my cynic came out and I was like, yeah, but it's probably some kind of weird inference-recursion thing going on.

Speaker 1

So I kind of walked away and continued filming the rest of the event. I actually filmed the event; you can check out the video on YouTube, which we will link in the show notes.

So. 

Speaker 1 

That left me very skeptical of what the state of the art was. It wasn't until I saw what the real state of the art was, at a follow-on event a year later, that I had the aha moment, the Blues Brothers moment as we've often referred to it on the DataDriven.tv show. So that takes us to today, well, 2019.

Speaker 1 

That is when I was there; I actually recorded a Data Point right in front of the...
