Artificial Intelligence (A.I.) Safety Expert Dr. Roman Yampolskiy

Artificial Intelligence (A.I.) is building the future. But will it be a paradise or our doom? Computer Scientist Dr. Roman Yampolskiy studies safety issues related to artificial intelligence. We talk ChatGPT, the next wave of A.I. technology, and the biggest A.I. threats. Then, we take a look at “society” for a special Top 5.

Dr. Roman Yampolskiy: 01:50

Pointless: 31:14

Top 5: 57:03

Contact the Show

Dr. Roman Yampolskiy Twitter

Dr. Roman Yampolskiy Website

Interview with Artificial Intelligence Expert Dr. Roman Yampolskiy

Nick VinZant: 0:11

Welcome to Profoundly Pointless. My name is Nick. Coming up in this episode: AI and society.

Dr. Roman Yampolskiy 0:20

...that, but really, really good at it. It knows everything about you, everything you ever wrote or read. It's just amazing at predicting the next token. And if it keeps scaling, it becomes smarter than us. We don't know how to control it. So you can look at every single aspect of society today and basically see how it's not in any way prepared or preparing for this technology. When we think about someone super smart, we think of Einstein, who was, you know, half a standard deviation away from smart people. We're not thinking of something 1,000 standard deviations above us.

Nick VinZant 0:56

I want to thank you so much for joining us. If you get a chance, subscribe, leave us a rating or review. We really appreciate it; it really helps us out. If you're a new listener, welcome to the show. If you're a longtime listener, thank you so much for all of your support. So our first guest is a computer scientist who specializes in artificial intelligence, specifically the safety around artificial intelligence. I have to admit, when it comes to AI, I am really lost. But he does a great job of really explaining what's going on with AI, what the risks are, and where we could be going in the very near future. This is computer scientist Dr. Roman Yampolskiy. For me, from an outsider's perspective — like, I've followed AI, I don't really understand what it is — it seems like in the last six months, everything has changed. Has there been some kind of development that has just catapulted it forward? What's happening in AI right now?

Dr. Roman Yampolskiy 2:10

Kind of, yeah. In the last six months, a new product was released, which is the biggest model ever trained. Unlike the previous 50 years of AI, it actually works. Everything we did so far was kind of like, oh, in this special case it can play tic-tac-toe. This thing can do pretty much everything in different domains. It can translate, it can write essays, it can program — it can do all sorts of things. So we're finally dealing with something like the AI from science fiction, from movies — the AI people thought we were going to have when we talked about creating AI.

Nick VinZant 2:47

Is this a big development in the grand scheme of things for AI? Was this a hard thing to do? Or was this like, oh, this is pretty easy, and the complicated stuff is next?

Dr. Roman Yampolskiy 2:58

It's not easy; it took a lot of compute. We never had hardware this powerful — we needed special computers to be powerful enough, and we needed enough data to train them. Before the internet, before everyone started posting everything online, we just didn't have datasets big enough. And same with compute: the idea of neural networks — artificial neural networks — dates back to the 1940s, but back then we just didn't have computers powerful enough to run those simulations.

Nick VinZant 3:27

So now we really kind of just have the technology in place to put this technology in place. Yep. And when we're referring to chat — is ChatGPT the name of a company, the name of the program? I'm a little bit unclear on that.

Dr. Roman Yampolskiy 3:42

So the company is called OpenAI. They use models loosely based on the human brain, which are large collections of artificial neural networks — a large neural network simulating neurons in the human brain. The specific architecture that really makes it work well is the transformer, and the latest model they released is GPT-4. There were previous ones, which were not as powerful; this is the latest one. ChatGPT is kind of the public interface to get access to this model.
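
To make the "public interface" point concrete, here is a minimal sketch of how a developer would reach the same family of models programmatically rather than through the ChatGPT web page. It assumes the OpenAI Python client (the 1.x series) and an API key in the environment; exact call names can differ between client versions, so treat this as illustrative rather than authoritative.

```python
# Minimal sketch: ChatGPT is a chat front end; the same models are reachable
# through OpenAI's API. Assumes `pip install openai` (1.x) and that the
# OPENAI_API_KEY environment variable is set; details may differ by version.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4",  # illustrative choice; any available chat model works
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Explain what a transformer is in one sentence."},
    ],
)

print(response.choices[0].message.content)
```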

Nick VinZant 4:22

I kind of understand what it's doing, right? Like, I put in, "I want you to write me this paper," and it writes me this paper. But from kind of a computer scientist's perspective, what is it doing?

Dr. Roman Yampolskiy 4:33

You know how autocomplete on your phone tries to guess the next word you're going to write and just helps you out by doing it? It's that, but really, really good at it. It knows everything about you, everything you ever wrote or read. It's just amazing at predicting the next token. It's so good it had to create models for, like, how computers work, how chess works, so it can be really good at guessing what the next statement will be in a program or the next move in a chess game. But it's really just a very, very complex predictor of the next word.
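
For readers who want to see what "a very complex predictor of the next word" means mechanically, here is a deliberately tiny sketch in Python: it counts which word follows which in a toy corpus and always guesses the most frequent follower. A real model like GPT-4 does the same job over tokens with billions of learned parameters instead of raw counts, so this only illustrates the idea, not the actual system.

```python
# A deliberately tiny "next word predictor": count which word follows which
# in a toy corpus, then always guess the most frequent follower.
from collections import Counter, defaultdict

corpus = (
    "the cat sat on the mat . the cat ate the fish . "
    "the dog sat on the rug ."
).split()

# follows[w] counts every word observed immediately after w.
follows = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    follows[current][nxt] += 1

def predict_next(word: str) -> str:
    """Return the most likely next word seen after `word` in the corpus."""
    candidates = follows.get(word)
    return candidates.most_common(1)[0][0] if candidates else "<unknown>"

print(predict_next("the"))  # -> cat (the most frequent follower of "the")
print(predict_next("sat"))  # -> on
```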

Nick VinZant 5:09

Does it understand? Or is it just really good at predicting?

Dr. Roman Yampolskiy 5:13

It needs to understand to predict at this level. People argue about this word — what does it mean to understand? In terms of functionality, it's good enough to do things which, if a human did them, we would have no doubt the human understands.

Nick VinZant 5:28

Is this a good thing?

Dr. Roman Yampolskiy 5:29

If you want capabilities, yes.

Nick VinZant 5:32

Where's the... what's kind of the bad part about it, potentially?

Dr. Roman Yampolskiy 5:35

We don't really know what it's thinking or planning on doing, or how it really works. And if we keep scaling, it becomes smarter than us. We don't know how to control it.

Nick VinZant 5:48

Obviously, you know, my mind immediately goes to movies, right? That's the first thing that my mind goes to. Are those realistic scenarios? Like, if this goes bad, how do you think that it actually goes bad?

Dr. Roman Yampolskiy 6:00

So movies emphasize the visual aspects of how bad it will get — you have a physical embodiment, a Terminator killing people. This is just like that, but no robot bodies: just the intelligence, just the power of words and the Internet to manipulate the world, to get someone — bribed, blackmailed — to do things, maybe using nanotech, maybe using computer viruses, maybe using real viruses, to do whatever it wants to accomplish.

Nick VinZant 6:31

Let's just say, for right now, when we look at ChatGPT, those kinds of technology — what do you think the impact of that is a year from now? Five years, 10 years from now?

Dr. Roman Yampolskiy 6:42

So if we freeze it in time — we're not going to release the next generation — obviously, this is very impactful for the economy; you can automate so many jobs. I mean, whole industries pretty much don't need to be there; we can just use ChatGPT for this type of work. Long term, we don't really know, again, because there are not just kind of incremental improvements, but exponential, hyper-exponential rates to get those systems to be as powerful as possible. So in a week, we may see more progress than we've seen in the year before.

Nick VinZant 7:17

Do you think that we're kind of ready for this as a society? No. That's what I thought the answer was going to be. Following up on that: do you think that the people who are "in charge" — and I'm putting air quotes up there — the people who are making the decisions, so to speak, have really thought out the consequences of this, what's ultimately going to happen? Or are we kind of just like, we made this new toy, let's use it?

Dr. Roman Yampolskiy 7:46

Well, first of all, no one's in charge, really. Politicians have no idea what's going on in the industry. Industry doesn't fully understand the social aspects. So it's all kind of separated interests, and they all care about their own subdomain. The people who are making those systems — not, like, explicitly designing them, but making them — don't really fully understand how they work either. They consider possibilities; they think about things like superintelligence and general intelligence, but usually they go, let's get a little closer to it and then we'll figure out what to do — we still want to monetize this next level of improvement. So certainly, there is very little thinking being done about how it's going to impact everything.

Nick VinZant 8:41

What part — like, if we talk about, okay, we're not ready for this as a society — where do you think that we're kind of not ready for it?

Dr. Roman Yampolskiy 8:50

We have no legal framework for dealing with something like that: in terms of copyright for creative outputs — art, songs, movies, text — we have no framework for responsibility for crimes committed or for patents granted, and we don't really have an economic safety net if this thing automates the majority of jobs. It's not obvious how we're going to pay for people who lost their jobs. So you can look at every single aspect of society today and basically see how it's not in any way prepared or preparing for this technology.

Nick VinZant 9:28

Do you think, though, that this is the kind of stuff where, okay, well, we'll figure this out? Right — like, we always seem, as a society, to have this attitude of, we're gonna figure it out; whatever problem is coming, we're gonna figure it out. Is this, though, something that could fall into that realm — like, all right, there are challenges, but we'll get it?

Dr. Roman Yampolskiy 9:46

Some of those, definitely. I mean, if you want financial security, you can tax robot labor and redistribute the funds. That's something government is good at — redistributing money. The problem of controlling the system itself is something we don't have any technical solutions for. We don't know how to control more intelligent agents. We don't even fully know if it's possible or not; it seems like it may not be possible. So that's the one where I would say we're unlikely to easily solve it.

Nick VinZant 10:20

I'm a big numbers person, in the sense that if you look at the AI technology that we have right now, and let's say one is "this isn't a real threat" and 10 is "man, we better watch out," where do you think that we are on that scale?

Dr. Roman Yampolskiy 10:36

Right now? Or what's coming next?

Nick VinZant 10:39

Let's do both.

Dr. Roman Yampolskiy 10:40

Right now, we're pretty safe. I mean, clearly, we have those systems, and we're all here; nobody died. So it's not a big deal. It will change the social and economic situation, but it's not going to kill everyone. The next generation, we don't really know. It could be a slight improvement. It could be smarter than all of us combined. If it is, and we don't control it, then it's a 10, 11, 12.

Nick VinZant 11:01

When we look at AI in the general sense — so we have ChatGPT right now — what are kind of the next couple of things that you see coming down the pipe?

Dr. Roman Yampolskiy 11:12

Oh, they just scale their systems. They add more compute, they add more data, train it longer, and it gets more and more capable. You probably never heard about GPT-2, because it wasn't that capable. GPT-3 people started writing about; you had Copilot for programming, assisting programmers. Now GPT-4 is very general — lots of people found it useful for what they do. So if the same level of progress continues forward, in two to three years we could have human-level or more capable systems.

Nick VinZant 11:44

The thing that I kind of don't really understand, right? It's like, okay, how do we get from the AI computer that's doing my homework, so to speak, to Skynet taking over the world, right? Like, how does that? How does that play out? Like, how does one become the other?

Dr. Roman Yampolskiy 12:00

So specifically with the military: the military is very interested in having their work automated. They want systems for detecting attacks, for automated response. So they would place AI in charge of, let's say, our nuclear response. Then all it has to do is decide that this is the right decision — we're under attack, or maybe we need to be the first to attack to win the war — and you have nuclear war generated by AI decisions. But of course, this is just what we have today: the infrastructure today plus the systems we have today. If you have a system which is actually smarter than us, a superintelligent system, I have no idea how it would go about killing everyone. I'm not that intelligent. That's the point.

Nick VinZant 12:42

Do you feel like the powers that be are listening to people like yourself? Or are you kind of falling on deaf ears?

Dr. Roman Yampolskiy 12:49

So there is definitely more happening now — there are conferences, there are panels, they're trying to listen. I'm not sure they have the background to fully understand. Some of our leaders are 80 years old plus; I don't know if they actually use computers for anything. So it may be more up to their advisors to decide what's happening.

Nick VinZant 13:11

That's the kind of thing that worries me too, right? It's like, by the time that everybody understands the problem, is it too late to solve the problem?

Dr. Roman Yampolskiy 13:18

We see it with similar, simpler problems, like cryptocurrency and the governance of Bitcoin and things of that nature. It's been a decade, and they haven't really produced any useful legislation in this space.

Nick VinZant 13:30

Is there any chance, though, with stuff like this — because there always seems to be the thing in society that is going to be the next big thing, right? Y2K was going to wreck everything. Cryptocurrency was going to change the world. And then it kind of just seemed to fade away. Is there any chance that this is a flash in the pan?

Dr. Roman Yampolskiy 13:49

So AI has a history of kind of those boom-and-bust cycles; you have AI winters where everything dies out, no funding. It seems unlikely at this point, just because of the level we're already at — you can monetize so much of it for new companies, automation of labor — and the progress is not slowing down. But it's always possible. If we stop right now and for the next 10 years we don't have any progress, it doesn't really change anything; we have the same problem, just in 10 years. We still don't know how to control it, we still don't know how to deal with it — it buys us a little time. So that would be wonderful.

Nick VinZant 14:24

A lot of the things that I wanted to kind of talk about are pretty much summed up with our listener-submitted questions. So are you ready for some harder slash listener-submitted questions?

Dr. Roman Yampolskiy 14:33

Always. What's the ultimate risk? Everyone dies.

Nick VinZant 14:37

But what would it be that would cause that?

Dr. Roman Yampolskiy 14:39

A mistake in programming, a bad decision, malevolent actors doing it on purpose. I have a paper with, like, a taxonomy of different ways to get to dangerous AI.

Nick VinZant 14:52

Could you just real quick, like sum up some of those categories?

Dr. Roman Yampolskiy 14:55

So most people understand malevolent actors really well: bad guys, ISIS, terrorists, crazies, cults decide to destroy the world on purpose. They use this advanced intelligence to help them achieve their goals.

Nick VinZant 15:08

What areas of society do you think are most susceptible to change from AI?

Dr. Roman Yampolskiy 15:13

Like we have a lot of kind of boring, bullshit jobs, we can easily automate, and nobody would complain.

Nick VinZant 15:19

I think that — but we would have to have this... you know, the difficulty, when I look at it from that side of it, is like, right, but people have to have jobs. So what happens if it replaces everybody's job, right? Like, that's kind of, I think, the worry that a lot of people have.

Dr. Roman Yampolskiy 15:35

So that's the idea with unconditional basic income, where you tax this technology, tax those corporations making mega profits, and then you provide some sort of basic income for everyone — enough to just exist, but maybe not making you rich — and then you can do extra work if you're interested in making more.

Nick VinZant 15:55

What do you think will happen in the next five to 10 years?

Dr. Roman Yampolskiy 15:58

So I don't know specifically — it's very hard to predict specific dates. It seems like we're going to get to human level, and quickly after that, superintelligence. But how soon? It could be as soon as five years; it could take 10, 20. We don't really know.

Nick VinZant 16:13

Have — I mean, has this kind of always been moving forward in the background, though?

Dr. Roman Yampolskiy 16:18

It was, but much slower. I mean, it was kind of linear progress, not exponential, hyper-exponential. People used to say it would be 50 years, 100 years before we see something like that. Now, not-crazy people are saying it could be a year or two.

Nick VinZant 16:36

What was your reaction when you started using it, the ChatGPT?

Dr. Roman Yampolskiy 16:39

I was very impressed. Especially — it's amazing that you can get the basic model for free, which is, like, incredible cutting-edge technology for everyone. And you can get, like, the very latest, the best, with internet access, with plugins, for 20 bucks a month.

Nick VinZant 16:54

Did anything immediately jump out at you as kind of the red flags about this kind of technology?

Dr. Roman Yampolskiy 16:59

Well, it does make up stuff. Like, when you ask questions which you really know the answers to — like, tell me about Dr. Yampolskiy — it just makes up complete nonsense. You notice that.

Nick VinZant 17:10

Why would it? Why would it do that? Why would it just make something up?

Dr. Roman Yampolskiy 17:14

It doesn't really know the difference between what is true and what is false. It just goes with what is the most likely token to complete the sentence. So if it says, "Dr. Yampolskiy has won an award for..." — if the most common thing to win an award for is chemistry, then it will say chemistry.
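
To make that failure mode concrete, here is a deliberately tiny sketch in Python with an invented mini-corpus: a pure frequency-based completer finishes the sentence with whatever followed that pattern most often in its data, with no notion of whether the resulting statement is true. Real models do this statistically at vast scale, so treat it as an analogy rather than a description of GPT-4 internals.

```python
# Toy illustration of why a pure next-word predictor "makes things up":
# it finishes a sentence with whatever followed that pattern most often in
# its data, regardless of truth. The names and corpus are invented.
from collections import Counter

training_sentences = [
    "dr smith won an award for chemistry",
    "dr jones won an award for chemistry",
    "dr lee won an award for chemistry",
    "dr patel won an award for physics",
]

# Count which field most often follows the phrase "won an award for".
completions = Counter(
    sentence.split("won an award for ")[1] for sentence in training_sentences
)

def complete(prompt: str) -> str:
    """Finish '<name> won an award for' with the most frequent continuation."""
    return prompt + " " + completions.most_common(1)[0][0]

# The "model" has never seen this person, but it still answers confidently.
print(complete("dr yampolskiy won an award for"))
# -> dr yampolskiy won an award for chemistry  (plausible-sounding, not verified)
```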

Nick VinZant 17:30

Have you seen any examples come out of it that would make you say, oh, that was a dangerous example, or, this is a problem?

Dr. Roman Yampolskiy 17:40

Oh, you can think about all sorts of situations — people asking for medical treatment, for example: what medication should I take for this or that? And it can definitely give you bad advice. They're trying to filter out things like that, but there are so many ways to ask the same question in a different way to get it to answer that it still happens.

Nick VinZant 17:59

Do you think that, however it changes society, it will change it uniformly? Or will there be big winners and big losers?

Dr. Roman Yampolskiy 18:06

Well, there are people who use it, who rely on it, and people who've never heard of it. So obviously, it's not going to be uniform for everyone. There is an advantage to being early to anything,

Nick VinZant 18:17

Will there be certain demographics or certain areas of the world where you think, oh, they're going to really benefit from this, and this area is going to get hit?

Dr. Roman Yampolskiy 18:29

Again, it depends on what level. If we're talking about uncontrolled superintelligence, we're all going to be killed uniformly — it's going to be very equal and diverse. If we're talking about just the economic benefit from the technology we have right now, obviously places with advanced computing infrastructure and advanced education will benefit more than, I don't know, an Amish community, for example.

Nick VinZant 18:50

When you look at kind of the future of AI, what is a movie or TV show that you feel like, oh, that's probably what it would look like? And a movie or TV show where you're like, that's not the way this would work at all?

Dr. Roman Yampolskiy 19:03

So for the short term, Black Mirror does a really good job of capturing a lot of the side effects of technology — we're going to have virtual reality, mind reading, things of that nature — and it's really quality. They have a lot of episodes, and each one is as good as a standalone movie. So that's a good example. As far as something not very realistic — I mean, there are aspects of all of those — The Matrix talks about humans being used as batteries for energy. That's not the best way to get your energy.

Nick VinZant 19:39

So how do we kind of keep the bad stuff from happening and embrace the good stuff?

Dr. Roman Yampolskiy 19:43

That's a wonderful question. I've been trying to solve it for decades. I don't know. I bring problems; you'll give me solutions.

Nick VinZant 19:49

If you were to — like, say you're out with your buddies, though, talking about, well, this might work — is there anything that you could say that might work?

Dr. Roman Yampolskiy 20:00

So I always believe in the self-interest of people. If people truly believed that this is going to kill them, they would not push the button, they would not release it, they would not develop it, and they'd just be very happy enjoying the billions of dollars they made with GPT.

Nick VinZant 20:14

For the companies that kind of make this — are they going to end up controlling the world?

Dr. Roman Yampolskiy 20:19

I mean, until they create this product, and then the product is out of control, right? You don't control uncontrolled superintelligence — that's the important point. Initially, you think you're going to be the guy controlling the light cone of the universe forever; you're godlike. But the reality is, you're the first victim of this technology — the closest thing to it.

Nick VinZant 20:39

Do we understand what we have done? No. What part of it do you think is, oh, that's the part that we're failing to grasp?

Dr. Roman Yampolskiy 20:48

The difference in capability between what we are and what our systems will be. When we think about someone super smart, we think of Einstein, who was, you know, half a standard deviation away from smart people. We're not thinking of something 1,000 standard deviations above us.

Nick VinZant 21:06

It's a level of intelligence that we wouldn't even be able to comprehend.

Dr. Roman Yampolskiy 21:11

Right. It would look kind of random to us. We won't understand what it's doing, or why.

Nick VinZant 21:15

But I mean — when you look at the current technology, are there fail-safes built into the system now? No. Why would anybody design something and not build fail-safes into it, though? That seems like —

Dr. Roman Yampolskiy 21:27

They don't know how to do it. There are basic security things — they protect the files from hackers, they have passwords — but it's not that. The model itself could be very dangerous. What they do is put filters on top of it: don't say this word, don't ever talk about this group of people. It's after-the-fact kind of filtering; it's not really changing the model to be safer.

Nick VinZant 21:51

But why? Okay, I'm trying to understand as much as I can, right — but why would we not be able to build fail-safes into the system that would protect us no matter how smart this thing is?

Dr. Roman Yampolskiy 22:05

Because you can't outsmart something smarter than you. Think about having a child, right? You can definitely control a small child — you can lock the door, you can do all sorts of things before they can figure out how to unlock it. But it's not the other way around: dumber things cannot control much smarter things.

Nick VinZant 22:25

The one thing that I kind of feel like — maybe this is more a societal thing — is there any concern in your mind that it will kind of take away our ability to think, our creativity? That we'll, in some ways, become like WALL-E — the movie WALL-E, where there's just people sitting on a boat? Are there any concerns in your mind, or kind of in the intelligentsia's mind, about that — like, what is this going to do to us as a society, even if it works perfectly for everything else?

Dr. Roman Yampolskiy 22:57

Absolutely, it's a huge concern. We see it with attention spans getting shorter and shorter; we see it with the inability to read a map or just kind of participate in something complex. It's definitely happening.

Nick VinZant 23:10

Um, that's pretty much really all the questions. I mean, you are efficient — you are an efficient man.

Speaker 3 23:17

I was able to solve that. Oh, sorry about that.

Nick VinZant 23:21

But is there anything else that you think we should be talking about in regards to this?

Dr. Roman Yampolskiy 23:25

So your audience is very general, I understand — it's not other experts. With experts, you can go deeper on certain aspects of it. For a general audience, I just want everyone to understand what is happening, and that the so-called experts have no idea what's going on. I think most of the general populace thinks that the experts really get the situation, that they understand what to do. But it's not the case at all.

Nick VinZant 23:54

It just seems like such a bad idea to me. I can't get over that — that we are potentially unleashing something that we have no idea what it's going to do. But to be fair, how have you been viewed amongst your colleagues? Like, are you kind of — I don't know what word to use, not alarmist, but are you the guy who worries more? Like, do most people in your position share your opinions?

Dr. Roman Yampolskiy 24:27

So it changed. Ten years ago, what I was doing was considered kind of crazy science fiction, and very few people took it seriously, especially in academia. Now we see thousands of top researchers all coming to embrace this concern. We just had a letter signed by all the top labs and top scholars saying, yes, AI is an existential risk. So they accept what I'm saying — they're just 10 years behind.

Nick VinZant 24:53

Do you think that people are starting to come around to it or are we just embracing this?

Dr. Roman Yampolskiy 24:58

There is more understanding of it, for sure. There are still people who disagree — they think there is absolutely nothing to worry about, or that we can easily solve it. But so far we haven't produced any type of proof, or even rigorous argumentation, for why this product or service will be safe no matter what level of capability.

Nick VinZant 25:18

What are you researching now?

Dr. Roman Yampolskiy 25:20

I'm looking at different limits of what can be done in this space, in terms of understanding these systems, predicting their behaviors, communicating with them without ambiguity, monitoring the next training run, and so on.

Nick VinZant 25:33

Couldn't we just — this is one of our listeners' basic questions — couldn't we just unplug it?

Dr. Roman Yampolskiy 25:39

That's the best question, I love it. We can also pour water on it; that completely solves the whole problem. No — if a system is smarter than you, it will kind of anticipate you doing things like that and stop you from doing it.

Nick VinZant 25:52

It's the "how" thing, right? The way to think of it is something like a —

Dr. Roman Yampolskiy 25:55

A computer virus. Sometimes people get those. Can you turn it off? Can you just say, I don't like having computer viruses, let me turn it off? How would that work?

Nick VinZant 26:06

As someone who's had a computer virus on his computer, like it doesn't work very well,

it doesn't do it right.

Dr. Roman Yampolskiy 26:12

And this is a dumb thing with no intelligence, this would be smarter than you.

Nick VinZant 26:16

Um, okay, feet to the fire: 10 years?

Where are we?

Dr. Roman Yampolskiy 26:23

Not sure. Maybe it's some sort of virtual environment. But could it be virtual heaven? Could it be virtual hell?

Nick VinZant 26:30

If it was going to take over the world, when do you think it would do it? Like, we would hit the technology by this year where I'd think, okay, maybe it could take over the world.

Dr. Roman Yampolskiy 26:42

So I suspect, when the next version of this large model, GPT-5, comes out, something will change. I don't know how soon that is going to be. They're claiming they're not training it right now. It probably takes about six months to train it, and probably about a year to test it. So most likely not in the next two years. But we don't know for sure.

Nick VinZant 27:04

What does that mean, "train it"? Like, I don't understand what that means.

Dr. Roman Yampolskiy 27:08

So we take this very large computer and tell it to go read everything it can on the internet — every book, every paper, every forum post. It reads everything and kind of tries to process that information. And that's how it gets smart.
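
For readers who want a picture of what "training" means mechanically, here is a deliberately tiny sketch (it assumes PyTorch is installed): show the model some text, have it predict each next character, measure how wrong it was, and nudge the weights. Real training runs do this over trillions of tokens on huge clusters with vastly larger models; the one-sentence "dataset" and tiny model below are made up purely for illustration.

```python
# Minimal sketch of what "training" means mechanically: show the model text,
# have it predict each next character, measure how wrong it was, and nudge
# the weights. This toy uses one sentence and a model far too small to be
# called GPT.
import torch
import torch.nn as nn

text = "the quick brown fox jumps over the lazy dog"
vocab = sorted(set(text))
stoi = {ch: i for i, ch in enumerate(vocab)}          # character -> integer id
ids = torch.tensor([stoi[ch] for ch in text])

inputs, targets = ids[:-1], ids[1:]                   # predict the next character

model = nn.Sequential(                                # tiny stand-in for a language model
    nn.Embedding(len(vocab), 32),
    nn.Linear(32, len(vocab)),
)
optimizer = torch.optim.Adam(model.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()

for step in range(200):
    logits = model(inputs)                            # scores for every possible next character
    loss = loss_fn(logits, targets)                   # how wrong were the guesses?
    optimizer.zero_grad()
    loss.backward()                                   # work out how to adjust each weight
    optimizer.step()                                  # nudge the weights

print(f"final loss: {loss.item():.3f}")               # falls as it learns the patterns
```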

Nick VinZant 27:24

Oh, so essentially, then, it's able to predict what we're doing based on all of those other models. Let me just touch on this again: is that a difficult thing for AI to do? Like, have we reached high-end AI at that point, or is this just the easy stuff for AI?

Dr. Roman Yampolskiy 27:41

Well, it's the kind of self-learning it's capable of. If I tell it, go read this book and come back and tell me about this book, and let's have a discussion, and let's write a short story inspired by this book — it can do those things. It can learn without me explicitly programming things. And that's pretty cool.

Nick VinZant 28:01

Okay, now I am kind of curious — when you get kind of into the weeds about this, what are you and your colleagues talking about in regards to it?

Dr. Roman Yampolskiy 28:09

So it's not obvious that everyone being killed is actually the worst scenario. It could get much worse — when we're talking about malevolent actors, you can have suffering risks, torture, things like that.

Nick VinZant 28:21

Is there any good news? Do you have any good news for us?

Dr. Roman Yampolskiy 28:26

I mean, right now GPT-4 is only 20 bucks a month, so you can help them train the next generation with a payment.

Nick VinZant 28:33

So we can accelerate our own doom. That's really all the questions that I had, man.

Dr. Roman Yampolskiy 28:41

It's interesting that normal people have a lot more common sense about those things. When I talk to average people, they immediately understand it. Just like you, they say, well, this is stupid, why are we doing it? Then you talk to the brilliant experts, and a lot of them don't have that same common sense.

Nick VinZant 28:57

That's the thing, right? Like, how does that happen, though? That people — the common people, so to speak — can immediately identify, like, well, maybe we shouldn't do this, but then it just keeps going? How does that happen?

Dr. Roman Yampolskiy 29:12

We train scholars to think independently, to come up with contrarian solutions. So they do.

Nick VinZant 29:19

And then there's the money, right? But, like, how do they ultimately make their money off of this?

Dr. Roman Yampolskiy 29:25

Well, it's very lucrative. I mean, every corporation wants to automate their call centers and make their video games more entertaining. I mean, if you can have this automated human like intelligence for $20 a month. That's pretty good for your business.

Nick VinZant 29:42

In the things, though, that it has done, can people tell the difference?

Dr. Roman Yampolskiy 29:49

Oh, experts claim they can. Some studies show they can't. It's not obvious; it depends on the domain. In some domains — it's like, humans are funny. We have stand-up comedians; there is no stand-up comedian ChatGPT. It's not funny. So it's obvious that it's, like, horrible at those jobs. In other domains, like art — I'm not an expert in art. To me, it looks better than all the modern art I've seen. But I'm not an expert.

Nick VinZant 30:22

Can it be creative in the same way that we can be creative?

Dr. Roman Yampolskiy 30:27

I think it can. It's just that people tend to discount how well it does, especially when they decide to compare it to the very best of humanity instead of to average people.

Nick VinZant 30:38

Yeah, I can see that

right? Like, okay, this isn't Shakespeare, but it does Bill from accounting pretty well, right? Now, I want to thank Dr. Yampolskiy so much for joining us. If you want to connect with him, we have a link to him on our social media accounts. We're Profoundly Pointless on Twitter, TikTok, Instagram, and YouTube. And we've also included his information in the episode description. The YouTube version of this interview will go live on July 29 at 4:30 p.m. Pacific. Okay, now let's bring in John Shull and get to the pointless part of the show. I want to mention, though, that after going back and listening to this conversation, I completely understand that we sound like grumpy old men. But I actually feel pretty hopeful about the future. I think that there's always hope. Do you know what's going on with AI? Do you understand this whole ChatGPT thing?

John Shull 31:45

Oh, I hate it. Well, let me rephrase that: I think it's okay for certain things. But I mean, if I was a college student, that's all I'd be doing.

Nick VinZant 31:52

I feel like it could be a really good thing in the sense that it could kind of free us up from doing menial tasks, right? Like, how often do you do the same thing, write the same email, do the same thing like that? The only thing that I worry about is if it's going to reduce people's ability to think, like, Oh, you don't have to think anymore.

John Shull 32:11

I mean, I think it's hard to argue against the idea that social media has at least made the world lazy in terms of thought processes. And I feel like that's kind of what ChatGPT is doing — it's just another tool to make you lazier. You don't really have to think about it, and it does most of the work for you.

Nick VinZant 32:28

I think, though, that this is the difficulty that we face with basically any kind of technology, we are going to go to the least common denominator, no matter what it is, we are going to use it for probably the worst possible purpose of it.

John Shull 32:42

We always seem to go to the extreme, but then immediately go to the remedial bottom. I'm not even a fan of, like, what you said in terms of it eliminating some of the remedial tasks that people do. Because what does that do? Even having to, say, write the same press release every month — if you're writing it from scratch, or even if you're using a template, there are still some kind of brain connections there.

Nick VinZant 33:06

I just think that this is part of a symptom of why people are so unhappy in their lives is that we are doing things that we are not designed to be really doing. Like we are hunter gatherers. We're not supposed to be doing accounting, I think it's against our nature. And we can't continue on this forever. We're just going to be more and more and more unhappy. Something has to kind of alleviate this from this spiral of just brain numbing.

John Shull 33:30

But you're not saying that ChatGPT is that tool, I hope.

Nick VinZant 33:34

I think that something that will just free us up to be creative again, to do those kinds of things, is something that we have to have. I think that we need to kind of get back to who we are.

John Shull 33:47

I would argue that besides ChatGPT, there are other AI-driven technologies and programs. And if you keep picking your teeth for popcorn, I'm just gonna be distracted.

Nick VinZant 33:58

It's impossible. Once you get something in your teeth, you have to get that thing out of your teeth. You can't just leave it in there.

John Shull 34:04

Like, now — I haven't even had anything that could get stuck in your teeth today, and now I'm like, man, there's something stuck back in the molar back there, because it just feels a little weird. Anyways, I would argue that, with the different AI technologies that we already have working on us as a society, we'll never get back to that simple way of life.

Nick VinZant 34:24

No, I'm not talking about that simple way of life. I'm talking about the idea that we're not designed to do this. Like, I don't think that we, as a civilization, were like, you know what the human body and brain really needs? Spreadsheets.

John Shull 34:40

I don't disagree with that. I just feel that we're way too far gone already.

Nick VinZant 34:43

I actually kind of think that in the next 50 years, we're either going, like, Book of Eli apocalypse, or we're going to paradise, and there's not going to be any in-between. But we also don't really like change that much. With any technology that we've had, we really don't like that much change. We like a little bit, but not that much.

John Shull 35:03

If you had to stick a knife to my throat right now and say, when do you think things are going to culminate in the world in terms of, like, the AI millennium or whatever taking over — I mean, I would probably say in the next 20 to 30 years we're gonna see some kind of gravitational shift.

Nick VinZant 35:19

I think we need something. People are really unhappy; society is getting really frustrating. I think there has to be some big change that comes about. Do you want to talk about what happened in your basement? Um, this is the funniest thing — I truly find that the universe has a way of always just reminding you who's in charge.

John Shull 35:44

Sure, I can dive into this first.

Nick VinZant 35:46

Set it up — let me set it up. Okay. So for people who are new to this podcast, John has been working on his basement for years, and he talks about his basement pretty much every day in some form or fashion. If a conversation with John lasts more than five minutes, he's going to bring up his basement. And it just got finished. Tell people what happened to your basement.

John Shull 36:10

First off, let me preface this by saying the person that did it listens to this podcast, and I want him to know that I love him, I appreciate him, there's no ill will. And I've said that over and over. But —

Nick VinZant 36:24

I'd like to thank them — I would like to thank them. Because if I would have been there, I would have laughed hysterically and immediately taken pictures of your face to see how enraged you were.

John Shull 36:35

So I've been hosting ping pong nights the last few weekends — or few weeks, rather. And we were playing this past Friday, and it gets heated — I mean, you know, competitive people. And he was the last person you'd think this would have happened to, but I'm not even really sure what happened, to be honest. He was going for a ball, maybe lost his balance, and his elbow and half his body went right through one of my basement walls.

Nick VinZant 37:08

And ruined your brand new basement.

John Shull 37:12

It's gonna get repaired. They're a great group of guys; they're gonna make sure that, you know, it gets back in fighting shape. But what's funny was — so, I wasn't angry. You know, I was in shock, because I thought I'd

Nick VinZant 37:25

be angry? He didn't do it on purpose — no need to be angry.

John Shull 37:28

No, I wasn't mad. Like, you know, I was just like, oh, man, like, fuck — of course. It's kind of what you just said earlier: of course. What I thought was funny, or at least interesting, looking back on it, was — I think it was five people here, and every one of them was like, oh, what is your wife gonna say? Did you tell your wife? What is she gonna do? I just thought that was funny.

Nick VinZant 37:54

Because it's a great opportunity for "I told you so," right? You've been spending all this time, all this money on the basement, and then this happens. You should have known better.

John Shull 38:03

Well, I gotta say this much: I married a great one. Because I told her, and her first reaction was, we'll get it fixed, don't let it ruin the night.

Nick VinZant 38:14

Yeah, that's the appropriate reaction. But I think that basically what happened with your ping pong table and basement is a little bit synonymous with what happens with technology: no matter how nice of an idea you have, no matter how good the ideal is, people are people at the end of the day, and we're gonna fuck it up. We're gonna fuck it up, and we're going to do the one thing you didn't intend to happen.

John Shull 38:37

I mean, like I said, I never thought it would have been him. No offense, but I've run into the wall, other guys have run into the wall — but never through the wall. And he just happened to hit it right on. And he's a great guy, and he's a loyal listener of the podcast. I know he feels terrible about it, but it happens, and we'll get it fixed and we'll move on.

Nick VinZant 38:58

Well, the next time you see him shake his hand for me.

I just I love it. I love it. I love it.

I love it when something like that happens to the guy who's like, "I really want to keep this thing nice."

John Shull 39:09

I've got to tell you my wife's reaction, though. I knew she wasn't going to be all up in arms, you know? I mean, let's be honest, some women would have been down in the basement, like, screaming, I think, right?

Nick VinZant 39:20

I don't think so. I think that most people have a pretty good reaction about things. See, it's never the reaction you think it is, right? See?

John Shull 39:27

I don't think so. Like, I think you say that because of the way you react to things like that. And I found out that my wife — this is the first time in a decade of knowing her, after something like this — reacts the same way that you react: just, alright, whatever, we're not going to let it ruin the night, we'll move on, and, you know, it's not life or death. We'll get it fixed.

Nick VinZant 39:47

I really disagree with this. I think that most people have completely reasonable reactions. We're all pretty much wired the same in that regard. I think that most people are reasonable; people just seem unreasonable because a couple of people ruin the whole thing. I think most people are pretty reasonable.

John Shull 40:06

Yeah, no matter what — I mean, that's not what I was really... I wasn't saying that people are unreasonable.

Nick VinZant 40:10

I think that you're trying to attack everybody else. I think you're trying to put down everybody.

John Shull 40:16

No, no, not at all. Nope.

Nick VinZant 40:21

Delusions, I feel. So you're not gonna bail out of it? Do you want to do your shout-outs?

John Shull 40:26

Yeah, let's get to those where we can —

Nick VinZant 40:28

I hate people... since you hate everyone.

John Shull 40:31

I don't hate everyone. Actually, you know, I would say with confidence: I don't hate one person. And I don't think there's one person out there that hates me.

Nick VinZant 40:42

Oh, then you're not living your life, right, man.

John Shull 40:45

I mean, I treat people with respect and kindness. Why would you hate somebody like that?

Nick VinZant 40:50

I think you need to have at least one or two people who really dislike you.

John Shull 40:53

Maybe an ex-girlfriend or something. But not, like, somebody that I've met or dealt with in the last decade, probably.

Nick VinZant 41:01

I can think of a couple of people who I'm pretty sure hate me. I feel good about it. I really think that my life has improved, the more people who hate me.

But you're standing up for yourself.

John Shull 41:15

I mean, I don't want to get — I don't disagree with you. But I feel like, you know, there are ways of sticking up for yourself, and that's one of the ways. The other way is through dialogue and communication and things like that.

Nick VinZant 41:28

You're too much of a people pleaser, man. That's why your basement got fucked up. You've got to set some ground rules, like, hey, look, this place is brand new; I understand that everybody wants to win, but there'll be no diving for balls into the wall if you're 290 pounds. You're not setting... see, you've got to have some... you need to, by the end of 2023, have at least one person who hates you.

Unknown Speaker 41:47

I mean,

Nick VinZant 41:49

I think that that would change your life.

John Shull 41:51

Well, actually, there might be a couple of people that — now that I think about it — but...

Nick VinZant 41:53

Get a couple more, man. Okay. All right.

John Shull 41:57

Let's just give some shout-outs here, right? So —

Nick VinZant 41:59

I'm a firm believer you've got to have some, because, like, if nobody's hating you, then you're not doing anything.

John Shull 42:03

Here are some ChatGPT shout-outs for you. Okay, just kidding — I didn't do that, nor would I ever. Say we'll start with Jen Thompson, Vince Law, Aidan Lane, Preston Cosmos — not sure that's a real name, but it sounds pretty awesome. Good, I like it, Preston Cosmos. Ryan D. Thomas — how do you feel about people that make sure to include their middle initial in their names?

Nick VinZant 42:32

It's really a case-by-case basis. It can be hit or miss. But generally, my interest is piqued a little bit, right? Like, that's a lot. I wouldn't say that's a red flag, but, like, well, you'd better pay a little attention here for a second.

John Shull 42:49

Let's see: Chris Monetti, Ernesto Cardenas — I like that name, Ernesto. Ernesto, that's a good one. Chris Go, Gary Corbin Mae, and Ethan Kaiser.

Nick VinZant 43:05

Yeah, I don't think that you can have really traditional names anymore. Like, you can't name somebody Carl now.

Unknown Speaker 43:13

George.

John Shull 43:15

Once again — I feel worn out — once again, I feel that names are kind of like fashion trends: they go, and they come back a decade later. Like, is John dead? I think John's a dead name. Nick will never be dead. It might not be popular right now — and I don't even know if it is; I think it is — but...

Nick VinZant 43:37

It'll stay at about the same level that I think it's always been. There's always going to be a few, but not a lot.

John Shull 43:42

You know, I feel like Michaels, Davids, though — they'll always be there.

Nick VinZant 43:49

But you can name somebody... like, I could look at a baby and be like, Michael. David. Could you look at a baby and be like, that's Carl?

John Shull 43:58

Right. I have some kind of issue with Carlos right now.

Nick VinZant 44:02

It's the first name that comes to my mind. Like, what are we going to name him?

Unknown Speaker 44:07

Carl?

Nick VinZant 44:10

What? No. Naming him Carl... I mean — or, like, maybe the female equivalent would be, like, Kim.

Kim, she looks like a Kim. Yeah,

John Shull 44:23

I actually think I work with at least four Kims, so there's a lot of Kims in my office.

Nick VinZant 44:28

I don't know if I know any Kims. Well, I don't know a Kim — I can't think of a single Kim right now.

John Shull 44:35

I mean, you know one, I think he runs North Korea.

Nick VinZant 44:39

That's Bill, I believe. What was his name? Kim Jong Moon? That's his account.

John Shull 44:44

Had to throw it out there. All right. Let's see. Let's move on Turner.

Nick VinZant 44:47

I knew a Kim Turner. That's it. Kimberly.

I feel different about Kimberly. I feel different about Kim.

John Shull 44:53

I mean, it's the same name, Kimberly, Kim. No? It's not? Kimbo? Because —

Nick VinZant 45:00

It doesn't have the same number of letters in it. No, it's not the same name.

John Shull 45:05

Kimbo Slice, RIP. His parents meant to name him Kimberly or Kim, and they just ended up with —

Nick VinZant 45:11

I don't think... I think that's a nickname. That brings up: do you know anyone solely by their nickname, like you don't know what their real name is?

John Shull 45:23

Kay Beausoleil? No — I don't know anybody personally, I guess. I am not hip enough to have friends with only nicknames.

Nick VinZant 45:34

I knew one guy — one guy whose nickname was Boodle, and I wasn't sure what his real name was. Turns out his real name was Nick. Boodle was, like, a family thing.

John Shull 45:49

Kimbo Slice — his real name was Kevin Ferguson, just FYI. For my firstborn, we couldn't come up with a name, so we put a March Madness bracket up in our hospital room and let the nurses kind of pick.

Nick VinZant 46:03

That was... wow. You can't even make a decision with your own children? That's ridiculous.

John Shull 46:08

It was... you'd have to admit, before you're —

Nick VinZant 46:10

Too much of a people pleaser. For your first child, you should damn well have that name picked out. That's the progeny of your family lineage.

John Shull 46:19

It wasn't even me — wasn't even me. I'm not even gonna take that one; it was all my wife. All right, I have your —

Nick VinZant 46:24

You're not putting your foot down enough. That's why you need to get more people that hate you. You're too much of a people pleaser. What does everybody want to name my child? What do you want to name your child, John?

John Shull 46:37

I mean, we took it into consideration — it's not like we just went with what they chose. But, you know, we took it into consideration and chose the most popular. No, no, no, no —

Nick VinZant 46:50

You should be making the decisions that regard your life. Anyway, that's why — that's why your basement got fucked up.

John Shull 46:58

No, my basement got fucked up because I'm — I'm not afraid to host.

Nick VinZant 47:01

You didn't set priorities.

John Shull 47:03

What was I going to do? Stand against the wall and make sure he didn't run into it?

Nick VinZant 47:07

You know what you should have said? Like, look, I just got this basement finished — let's not fuck it up.

John Shull 47:12

Oh, I mean, that was probably said and it still happened.

Nick VinZant 47:15

You either fail to plan or plan to fail.

John Shull 47:19

It is. I don't even know why I'm going back and forth; it's always gonna come back to being my fault.

Nick VinZant 47:25

Yeah, that's mostly it — that is the truth of life. No matter where you are, you're the one who got you there.

John Shull 47:32

Let's see. Are you a bath or a shower guy?

Nick VinZant 47:36

I don't know if I've taken a bath in 20 years. If I have, I wanted out of there — I want to get out of there. I don't enjoy being hot. Hot tubs, like, "let's go to the hot tub" — that sounds awful to me. I'm not wasting my time in a bath. Are you a soaker? That's just not generally a thing that men do — like, you know what I really need to do? Just sit in the bath.

John Shull 48:03

No, I'm not a soaker. But, you know, back in the day — and I would do it every day if I had access to one — like a steam room. It's awesome. I would actually put that in maybe the top 10 things that I enjoy the most: a good steam room, a good steam, where you can just feel the fat sweating out of you.

Nick VinZant 48:27

Well, you're not actually losing any weight. So what you really seem to want to do is pretend like you're losing weight without actually doing anything: how can I feel like I'm losing weight while literally just sitting here?

John Shull 48:37

I mean, isn't that kind of the theme of the show? The dream? That's the dream of AI.

Nick VinZant 48:44

Right. So wait a minute, are you taking baths? When's the last time you took a bath?

John Shull 48:50

Um, probably — I mean, within the last five years. It's been a couple of years. But, you know, every now and again it's nice: you put in some bath bombs, you get the water nice and hot. It's just nice to soak sometimes.

Nick VinZant 49:05

I just can't. I just can't even imagine it. Like, I like getting in the water, but I don't want to get in my bathtub and just be laying there, like, sitting in there.

John Shull 49:15

I mean, now that I have children, who are really weird to me — four and three — I can barely find more than 10 minutes for a shower when it's not 2 a.m. There we go.

Nick VinZant 49:26

God, I just — I know that our top five is coming up, and this has been what John has been waiting for: just to be a cranky old man and complain about —

John Shull 49:33

Budgeting. I can't —

Nick VinZant 49:35

— sit in my bath with nobody bothering me.

John Shull 49:40

Anyways, I mean,

Nick VinZant 49:42

Kids today... one kid walked on my lawn.

John Shull 49:46

Alright, well, let me ask you the second question so we can —

Nick VinZant 49:48

I just want to read my book about submarines.

John Shull 49:52

And speaking of, we're going to talk about that in a minute. Oh, yeah. All right. Well, rank these melk flavors for me, from —

Nick VinZant 50:01

What is it? Wait a minute.

Let's hear it. Let's hear how you say it. You've already done it. Melk. Melk.

Melk. I feel like you're slipping an extra letter in there. Melk.

John Shull 50:16

Well, that's you right now. You rank these milk flavors. Bail me —

Nick VinZant 50:22

Out. You're washing me out. Milk. And I am a —

John Shull 50:27

Chocolate, vanilla, regular, or strawberry?

Nick VinZant 50:33

Well, strawberry's last — I don't trust people with strawberry tendencies. Chocolate is obviously number one. And then, I don't know if I've had, like, actual milk in years — like, oh, just milk.

John Shull 50:50

What are you, one of those health freaks? You drinking almond milk, soy milk?

Nick VinZant 50:54

I drink what my wife buys.

Unknown Speaker 50:55

Oh, that's yeah, you are?

Nick VinZant 50:58

No — oh, that's what she buys. It's probably what I would drink anyway. But I'm just not somebody that's like, I'm thirsty, give me some milk. Like, that's not thirst-quenching, and the taste — milk would be my last drink choice, honestly. I mean, if we're looking at, like, water, pop, alcohol, soda, juice — milk is last on that list.

John Shull 51:22

I mean, I would probably put it above pop — pop is more enjoyable to drink, but I drink more milk than pop on a regular basis.

Nick VinZant 51:33

Melk. M-E-L-K.

I feel like that's just how you say it. M-I-L-K? Okay, no —

John Shull 51:40

Just made me think of that Super Troopers bit. You know? Yes. So —

Nick VinZant 51:47

That's such a good — that's such an amazing thing, where somebody can make that kind of movie and then all of their other movies are, like, not any good. Like, how could you do this one thing really well? To me, that would be like dunking a basketball and then never, ever in your entire life doing it again. Like, I just did it once, and I never was able to do it again. That would be really strange to me.

John Shull 52:11

All right. Well, speaking of submarines: we talked about this on the last episode, as they were searching for these poor people — come to find out that they were literally dead within, like, hours of going missing, because their submersible imploded. For those of you who may not have listened or don't know what I'm referring to, there was a billionaire diving expedition — there's a company that owns a submersible that basically will take you to the ruins of the Titanic. And I don't know if this is going to come out on Wednesday — what, the 28th? But by now it would have been two weeks or so since the submersible went missing. They were looking for it; come to find out, within a couple of days, basically, they heard a large implosion — the US Navy did — at the bottom of the ocean. And basically, it's like a pressure cooker, and without getting too graphic, you can imagine what happened, I'm sure. And yeah, all five people are dead. So in saying that, I have a few questions for you. One — and I don't really think we talked about this last episode, because it was still kind of breaking and we didn't really know what was happening — would you ever go in a submersible like that? No? To anywhere in the world, even at, let's just say, 1,000 feet down? Would you do it?

Nick VinZant 53:31

If I had a pool that was 10 feet deep, I would not get in a submarine to go to the bottom of my pool. All right. Nope, not doing it.

John Shull 53:41

I found it kind of fascinating that there were no windows in the submersible, and it was controlled by, basically, like, a Logitech PlayStation-style controller.

Nick VinZant 53:51

That's what it was, right? Like, that's the kind of thing... I've got to think that a lot of stuff is still coming out, but this doesn't seem to have been a very well designed or maintained thing. Like, maybe somebody should have been checking on this a little bit. But that's not the kind of thing where there's government regulations

John Shull 54:08

now, because, right, it was something that was owned by a billionaire, and it was a private company. And, you know, it was marketed as, we can take you there. Other than having a business license and all that, I don't think you had to be regulated, which is insanity to me. I wouldn't get in one of those things now. But before it was brought to light how dangerous some of these submersibles are, I probably would have entertained it.

Nick VinZant 54:35

How many books on submarines do you have?

John Shull 54:37

Wait, different kinds of submarine books? How many... or what kind of submarines?

Nick VinZant 54:41

Okay, but how many? You're dodging the question.

John Shull 54:44

I mean, I don't know, probably 10 to 20. That's a lot of different stories, though. Okay, it's not like I just have 20 books about the submarines. There's stories and things from, whatever. It doesn't... I think one

Nick VinZant 54:56

book on submarines is enough. You only need one book on submarines, right? Like, if you've got more than one self-help book, it's not working for you. Listen, you should never buy a second self-help book.

John Shull 55:09

Let me put it this way. Some people are enthralled with airplanes. Some are enthralled with cars. I mean, I like cars, I like fast cars, but I've just always been enthralled with submarines. I don't know why. But the thought of going a mile under the ocean in a tin can, like, it's just nuts to me. Yeah,

Nick VinZant 55:32

it is to me, too.

I have no desire to do that whatsoever.

I did see a scientist talking about it. Like, I didn't understand what an implosion really was. I thought that they would have, like, drowned. It's like, no, you're basically incinerated. Like, there would be nothing left of them but the cells of their body. There's nothing left at all.

John Shull 55:51

Yeah, because, right, if I'm not mistaken, the pressure just literally implodes you, like, from the inside out. But at least it was quick. And I'm not trying to make light of this, but at least it was quick, right? Because the alternative, I guess, would have been maybe a slow leak or something, and then maybe they suffocate or even drown. So I guess, at least... I mean, I don't even know if they would have known what was happening, right? It probably was just an instant, you know, then it's over. I would

Nick VinZant 56:22

think that, like, no matter what, whenever something happens, you always see signs of it kind of happening. It's like that old Samuel L. Jackson thing: shit just doesn't happen, it takes time, it takes effort. Maybe it tipped over the line suddenly, but there's always things leading up to it, right? Like, there's a lot of people who maybe worked on this or have come forward and been like, this thing shouldn't have been carrying people.

John Shull 56:49

Right, but nobody was gonna say that beforehand, right? It's always when tragedy strikes that they're like, oh, maybe we shouldn't have, you know, maybe we should have inspected that bridge. Let's move on to some things that we can complain about now.

Nick VinZant 57:02

Okay, so this is a list that John has probably been waiting for his entire life, which gives him a chance to complain about kids today and all the modern ills of society: top five things that are ruining society. And we're not talking about big things, like major issues, just little things. So I guess it's really a top five little things that are ruining society.

John Shull 57:22

Yeah, like, we're not talking about politics or things like that, you know, or climate change. This is clearly a fun but real top five. So in saying that, I'm going to start my number five off with... let's go fast food as my number five.

Nick VinZant 57:43

Yeah, I mean, that's probably one thing where you could say, like, that's really probably bad for all of us. I mean,

John Shull 57:49

right. And listen, I'll give credit to the companies, right, like McDonald's, Burger King. They tried coming out with salads, till you realize the salads are just as bad as having a double cheeseburger.

Nick VinZant 57:59

My number five is abbreviating words. "Guac," "doc"... that just drives me insane. I think that that's ruining society.

John Shull 58:10

I have something like that a little further up on the list. So I'm going to

Nick VinZant 58:13

have that higher.

John Shull 58:14

I do. Yes.

Unknown Speaker 58:15

That's bull. That's bold to put it

Nick VinZant 58:18

higher. Okay. All right. Number four.

John Shull 58:23

This may piss off a lot of people, but I'm going to say televangelists.

Nick VinZant 58:28

Oh, yeah, I think everybody pretty much agrees that that's a bad idea.

John Shull 58:31

Right? You know, and I'm only picking this person out because I think he's the most famous: Joel Osteen, who clearly probably is not a very good person and has made billions of dollars. But yet he's on TV every Sunday morning. And I feel like, I don't know, just all the people that flock to that message and watch it, it's like, what are you doing?

Nick VinZant 58:55

I think that you can kind of see that in a larger sense: there seems to be a rise of the charlatan, the people who are doing something and saying something only for the purpose of making money, regardless of what it does to other people. There's a rise of the charlatans in our society now, I think.

John Shull 59:14

I 100% agree with you on that.

Nick VinZant 59:17

My number four is celebrities. I'm sick of it.

I'm sick of it. Sick of all of it. I

don't want to hear about them. I don't want to see them. I'm tired of all celebrities.

Tired of it.

John Shull 59:27

Once again, I have that a little higher up on my list. So my number three is lazy language, which is kind of what you were alluding to. But, you know, for me, it's not only in texting, it's talking to somebody, where, you know... I mean, I've had some people say things to me that I've never even heard of, like ROTFL and things like that. Oh, I'm not sure what that one means. Rolling on the floor laughing.

Nick VinZant 59:57

Oh, that makes sense. Yeah, I'm okay with that kind of stuff

John Shull 59:59

Have you ever had somebody say, "you're making me LOL," in person?

Nick VinZant 1:00:06

You can't say that stuff. You can't say it. Like,

John Shull 1:00:08

you deserve to be hit with a steel chair if you say that. But I didn't, I didn't

Nick VinZant 1:00:13

see. Because you're a people pleaser, you got to put your foot down a

little bit more, you got to let the people know, I'm

John Shull 1:00:21

pretty sure I would have been fired from my job if I hit people with steel chairs. Or maybe I would get promoted, I have no idea.

Nick VinZant 1:00:28

Maybe you're in the wrong job. Hitting people with steel chairs in a different line of work is completely acceptable.

John Shull 1:00:33

I will say that the laziest language, the one that just kills me, is this text message language where, you know, "and" is just an N, "you are" is U R, numbers instead of letters. It just kills me. Texting, even though I do it, you know, is the majority of what I do. But lazy texting is something that I think has ruined society more than most things. Okay. All right, I'm gonna sit back in my chair now.

Nick VinZant 1:01:08

Old guy, why don't you go yell at some clouds. Those damn clouds covered up my... My number three is choices. I think that we have way too many choices. I think that we need to go back to having fewer choices for things.

John Shull 1:01:24

That's a good one, that's for sure. But yeah, I mean, I think it's part of the way we are as a society now. Choices were a good thing because they gave everyone an option to get something they liked. But I feel like we're circling now, right? We're on the other side of it, to where there's so many choices for some things that now it's just a convoluted market. Simple people can't even get simple things, or vice versa, complex people can't even get complex things, because it's all just a mishmash of shit now.

Nick VinZant 1:02:01

I think that there's too many choices, in that when we were growing up, you just had to suffer through some stuff. Yeah, I mean, I think that's an ability that you need to learn in life: like, no, no, no, you can't just go do the next thing. You got to eat some shit. Yeah, you got to eat it, and you got to like it. This is what you get. This is what you've got. Okay. What's your number two?

John Shull 1:02:21

So my number two, it's kind of three things, but I'm trying to combine it into one, because it's all along the same lines. I have celebrity bullshit, terrible reality TV, and billionaires.

Nick VinZant 1:02:36

Like, I agree with all of that. All of it kind of leaves things ruined.

John Shull 1:02:39

Now I'm gonna go through all three and why they all piss me off. One, reality TV is garbage. They're all either has-beens or never-will-bes on these reality TV shows, especially the celebrity ones. Billionaires: there's no room for a billionaire class in an economy, and there's hundreds of thousands of them. And then celebrities, and I'm talking about the celebrities that have all come about because of social media in the last decade. People that just, you know... I mean, there's some that, like, I don't want to call names, because I don't want to get us in trouble, but there are some viral sensations, we'll call them, and I have no idea why they're famous. Because they posted one video of them lighting themselves on fire, and that's okay? I mean, it just pisses me off. I'm just gonna stop there. But yes, that's my number two in a nutshell.

Nick VinZant 1:03:36

I really wasn't paying attention. I was just kind of waiting for that whole thing to end. That's fine. Like, oh, he's gonna go on a little roll here, isn't he? No, no, it's fine. I started thinking about what I was going to have for dinner.

John Shull 1:03:47

Um, I hope it's a big pile of shit.

Nick VinZant 1:03:51

My number two is comparison. I think that comparison, in some ways, is ruining our society. In the past, you could be pretty good at something and you could take pride in that. Now you're automatically exposed to everybody else who is better at something than you are, and you didn't have to notice that before. I think comparison is kind of ruining the joy that we used to feel in our lives about being kind of good at something. Now we just see how much we suck compared to other people.

John Shull 1:04:19

Well, I mean, you know, get off social media, you won't have to worry about that.

Nick VinZant 1:04:24

There's no way to get off social media

John Shull 1:04:26

is life. Speaking of, that's my number one: social effing media.

Nick VinZant 1:04:35

Yeah, I don't really have that much of a problem with social media. My number one is the algorithm. I think algorithms are ruining our lives because they dictate them entirely. I don't think that social media is the problem. I think the algorithm is the problem.

John Shull 1:04:47

Kinda what you said earlier about choices. You know, when we were growing up, we didn't have social media, and we still had friends. We were still able to get things, do things, seem worldly, even if we weren't. The instant gratification of social media has ruined everything. I mean, how excited were you when you were a little Nicky? Well, you probably didn't do this because you were a weird kid, but you know, you would send away for a toy or something, or a magazine, and you had to wait two weeks to get it. And you just waited every day for the postman to come by, smoking a cigarette, to put it in your mailbox. And he never did. Until that one day.

Nick VinZant 1:05:29

I didn't really grow up like that. My parents were always just like, you're not getting it. Oh well, I didn't get... like, you're not getting anything. Like, hey, you want this thing? Well, you're not getting it. Go get a job at the age of six. And that's why I started mowing lawns at eight, man. I wanted a new bike. My dad was like, I don't care if your bike doesn't work. You want a new one? Go get a job. So I started mowing lawns.

John Shull 1:05:54

I'm just saying, you know, this all goes to a bigger argument for me, but we need to slow down. I think that's one thing that our forefathers and other generations knew, even if they didn't know it at the time: slowing down is okay.

Nick VinZant 1:06:11

It's a weird thing that the older you get, the more you realize your grandparents were right. Yeah. Oh yeah, they kind of had the basics that really govern life. Um, I'm gonna start my honorable mentions. I really thought about putting this in my top five, but one of my honorable mentions is animals. I think we've traded pets for people. I think we've gone a little bit too far. Pets are great. I have a pet, too, a great dog. But it is a pet, not a person. I think we've gone too far with that.

John Shull 1:06:38

But see, I'm gonna call you out, because earlier you talked about hunters and gatherers and going back to that time. And pets were a large part of that

Nick VinZant 1:06:48

gathering. But you're taking that, like, literally. I'm talking about it in the figurative sense, that we're designed a certain way, and we are not necessarily built for the modern society that we have created.

John Shull 1:07:02

I think pets have gone too far, but not... like, I'm okay with the normal pets. And when I say normal, I mean like the dogs and cats. It's these people that are glorifying having a snake that they think knows its name, because they stick a rat on a counter and it eats it and they're like, good job, little Bobby, little Carl. Like, no, it just wants the fucking food.

Nick VinZant 1:07:25

What do you... well, I mean, that's any animal, dude. That's what they're like, too.

John Shull 1:07:28

We can train some animals to sit, like, you know,

Nick VinZant 1:07:32

they just want the food, too. They're not doing it. And

John Shull 1:07:35

you can play devil's advocate on this one. I'm right. I

Nick VinZant 1:07:38

think that you're attacking people's choices in pets. I'm fine with people having any kind of pet that they want, within reason, right? But I don't think that you should treat that pet as if it's a human being. It's not. I think you've got to put people first.

John Shull 1:07:51

and I'm okay with treating some animals like, you know, humans, but not most.

Nick VinZant 1:07:58

What do you name, like, a snake? Like... I don't know anybody who has a snake. What do you name it? What, like, Frank? Frank the snake? Like, what do you name an animal like that?

John Shull 1:08:11

Slither? Snake Plissken

Nick VinZant 1:08:17

That's a video game, right?

John Shull 1:08:19

Oh, man, it's Kurt Russell's character in, oh God, not Escape from LA...

Nick VinZant 1:08:25

from New York.

John Shull 1:08:26

Yes. Escape from New York.

Nick VinZant 1:08:27

Never seen that movie. That's one of those movies where, like, now the time has passed. I'll never see that movie. I don't care what's happening. What else is in your honorable mentions?

John Shull 1:08:38

I mean, I think my top five was pretty conclusive. And, you know, keeping it kind of fun-based, I don't have too many more. I put opinion-based journalists, but that's kind of too serious, like a serious tone.

Nick VinZant 1:08:54

Like, that's really ruining society. We don't need to get into it... like, that's really fucking people over.

John Shull 1:08:59

The only other one I had was tattoos. But, like, I'm fine with tattoos, I don't care what people do. But I do feel like some people are overdoing it with tattoos.

Nick VinZant 1:09:11

Oh, everybody's always got to take something too far. But

John Shull 1:09:15

But do I really care? Do I think it's ruining society? No, but I do think people are getting a little carried away with it. But who am I? Who am I to say?

Nick VinZant 1:09:23

Well, I think any kind of attention-seeking behavior is kind of bad for society. If you're doing something for the attention of it rather than because you want to be doing it, I think that's always kind of going to have a bad outcome.

John Shull 1:09:34

You know what, now that I'm thinking about it a little more, one more: vapers, too. What's wrong with... well, I mean, just smoke a cigarette. Watch, mark my words, we may not be doing this podcast in 10 years, but they're gonna come to find out that vaping was more dangerous than actually smoking a cigarette. I'm telling you, it's gonna happen.

Nick VinZant 1:09:59

My grandpa used to say, everything in moderation, including moderation.

Just how I like to live my life.

Unknown Speaker 1:10:06

Oh, okay,

Nick VinZant 1:10:07

that's gonna go ahead and do it for this episode of Profoundly Pointless. I want to thank you so much for joining us. If you get a chance, leave us a rating or review. Just a couple of quick words really helps us out. And let us know what you think are some of the tiny things that just seem to be ruining society for you.