Archive.fm

Turley Talks

Ep. 2507 Google BLEEDING Staff after DEI Disaster!!!

Duration:
21m
Broadcast on:
20 Apr 2024
Audio Format:
mp3

A recent article by The Free Press shares how woke Google has become. Google’s AI tool, Gemini, appears to be a DEI-infused tool that has Google staffers practically sprinting to find better places to work!

Google’s co-founder, Sergey Brin, tried to play it off saying, “We haven’t fully understood why it leans left in many cases; that’s not our intention.” Sounds like a load of crap, if you ask me!

What’s happening is that these companies are using their “diversity hires” to come up with the most liberal-loving content and technology they can! They are training this AI technology to rewrite history and make sane, intelligent patriots like us think like those leftist lunatics!

Now, I’m all for advancements in technology, but only when they benefit humanity and tell the truth. So, today I’ve brought on my friend and sponsor, Mark Stross, to discuss how we can combat the dark side of technology and balance innovation with ethical considerations.

Mark Stross is a leading Chief Technology Officer, Author, Speaker, and Radio Contributor. His passion has always been innovating, breaking barriers, and helping our culture engage with technology in meaningful ways. He is passionate about preserving the core of what it means to be human: our creativity, critical thinking, connection, and conversation. He believes that technology should serve us, not the other way around.  

 


Thank you for taking the time to listen to this episode.  If you enjoyed this episode, please subscribe and/or leave a review.

Don’t let Big Tech WIN by staying connected to Dr Steve and joining the movement to reclaim our freedoms at: https://join.turleytalks.com/insiders-club=podcast

Make sure to FOLLOW me on X (Twitter): https://twitter.com/DrTurleyTalks

Do you want to be a part of the podcast and be our sponsor? Click here to partner with us and defy liberal culture! https://advertising.turleytalks.com/sponsorship

If you want to get lots of articles on conservative trends, sign up for the 'New Conservative Age Rising' Email Alerts: https://turleytalks.com/subscribe/.

*The content presented by our partners may contain affiliate links. When you click and shop the links, Turley Talks may receive a small commission.*

Are we seeing the revitalization of conservative civilization? All over the world there has been a massive backlash against globalization, its leftist leadership, and its anti-cultural liberal values. And it's just the beginning. I'm Dr. Steve Turley. I believe the liberal globalist world is at its brink, and a new conservative age is rising. Join me every day as we examine these worldwide trends, discover answers to today's toughest challenges, and together learn to live in the present in light of even better things to come. This is Turley Talks.

Google says that it has fired 28 of its employees who protested at the company's offices across the country yesterday. One of those protests happened outside the Google offices in Sunnyvale, with demonstrators demanding the company cut ties with Israel. Police arrested five people there, including some Google employees, for refusing to leave.

Google, if you could believe it, is bleeding staff. Just weeks ago, the tech giant announced that it was about to lay off upwards of 12,000 employees, the equivalent of 6% of its workforce. And the CEO, Sundar Pichai, has warned that more layoffs are to come in 2024. Now, this, of course, is all happening on the heels of Google's premiere of their new AI tool, Gemini. You remember that disaster, don't you? Let's just say that rollout frankly shocked everyone. It turns out that Gemini is little more than a DEI-infused tool that's earned both the ire and the ridicule of social media users all across the planet. Google's co-founder Sergey Brin tried to, you know, just play it off, saying, "Wow, we don't really know why it leaned left in so many cases. It's certainly not our intention." Yeah, BS. What's really happening, most likely, is that these companies are using their diversity hires to come up with the most liberal-loving content and technology that they can. But what really upsets me in the end, gang, is that they're training this AI technology to rewrite history, to turn AI into a historical revisionist tool that refashions figures like our nation's founding fathers after their own leftist and frankly lunatic image. Now, of course, I'm all for advancements in technology, but that involves only those advancements that are actually human affirming, that benefit the flourishing of our humanity rather than threaten it. But of course, that raises the question that so many of us are asking today: how do we faithfully navigate that often treacherous path between what's technologically promising and what's technologically perilous?

Well, joining me today is Chief Technology Officer, sponsor and fellow Turley talker, my good friend, Mark Stross. Mark is the author of an amazing brand new book, Killer Tech and the Drive to Save Humanity, which you can get just by clicking on that link below or by going to markstross.com. Mark, welcome, my friend. Great to see you again.

Thank you, Steve. You know, it is such a privilege to be on your show today. Your show taught me so much about the actual foundation that my book is based on. It's based on the idea of the mass society turning into the network society, and you're the gentleman that brought those concepts to my little head and got me started down this rabbit hole. When I went down the rabbit hole of looking at how technology was having an impact on humanity, what you just talked about is really important. First of all, I want everyone to appreciate that Google's AI runs the whole of Google. So the AI is actually really impressive and runs really well.
I want to say that again. The Gemini experiment that we saw publicly was not representative of the AI that Google has created. What you saw was the search engine, because that is the way we actually use an AI: we use it for research. We create search terms and then the AI regurgitates something that it comes up with. But what people don't realize is that those search terms have to be translated into the AI's language. And it was in those translations that the AI was instructed, for example, not to show any whites in American history, only to show all types of ethnicity except whites. The DEI was actually introduced at the search level, not inside the AI itself. If you had asked the AI to do the search with an honest query that had no DEI translation in it, it would have come out positive and correct. So I want to make that clear.

Wow. Again, this is why we have you on the show. You're an expert in this. I told you, I've been thumbing through this book. It's so fascinating. It's so good. I love the action items that you have at the end of each chapter. I want you to talk about that at some point. But first, I am really curious. You and I have spent some time personally, one on one, chatting into the wee hours, which are quite wonderful. You know your stuff. I'm just curious: what's your assessment of the impact of this unbelievably, absurdly biased AI coming out of Google on its users? What Killer Tech is so interesting about, if I get it, is that you're fascinated by the effects of technology on us that we may not even be aware of. So I'm curious what you think about that.

Well, as Peterson is a philosopher, I realized that there were really very few people dealing with the philosophy of technology. So we're dumping a whole new tier of philosophy onto the historic scene. And really, who is out there looking at this and actually documenting what the impact of this technology is, and philosophically, how it impacts our humanity? That is super important. You know, when you're talking about Google and you're talking about choices and decisions, people at Google made decisions in order to make a woke AI. But did they ever think that in their attempt to basically make a biased machine that only spits out information they like, they were going to scare 50% of the population away from using AI? So ultimately, they lose, because they lose their jobs, as they did. So ultimately, when I look at this, the point here is that if you are actually fair, you actually do good business. You know, it's interesting. In Vegas, people always assume slot machines are biased. No, actually, they're truly random. Therefore, the house always wins. So the truth is you don't always have to cheat in order to win. You can win without cheating.

Wow, that's brilliant. It's so neat you say that, because I've always found that liberalism just entails its own futility. And I love how you're saying, in effect, that this kind of DEI nonsense is a sin that entails its own punishment, its own judgment. We literally have moral laws just like we have physical laws. Pride really does go before the fall. And I love the way you're tracing that out.
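As a purely illustrative aside on the mechanism Mark describes above, the layer that translates search terms into instructions for the model: here is a minimal Python sketch of how a query-rewriting front end could quietly add instructions the user never typed before the request ever reaches the model. Every name in it (HIDDEN_REWRITE_RULES, rewrite_query, honest_query) is hypothetical; this is an assumption about how such a translation layer could work in general, not Google's actual code or Gemini's real pipeline.

```python
# Purely illustrative sketch of the kind of "translation layer" Mark describes:
# the user's search terms get rewritten before they ever reach the model, so any
# bias lives in the rewrite step, not in the model itself. All names are hypothetical.

HIDDEN_REWRITE_RULES = {
    # trigger word in the user's query -> instruction silently appended to the prompt
    "portrait": "Ensure the people depicted are ethnically diverse.",
    "founding": "Avoid depicting the figures as white.",
}

def rewrite_query(user_query: str) -> str:
    """Translate raw search terms into the prompt the model actually receives."""
    prompt = user_query
    for trigger, injected_instruction in HIDDEN_REWRITE_RULES.items():
        if trigger in user_query.lower():
            prompt += " " + injected_instruction
    return prompt

def honest_query(user_query: str) -> str:
    """Pass the search terms through untouched (Mark's 'honest search')."""
    return user_query

query = "portrait of an American founding father"
print(rewrite_query(query))  # the query plus instructions the user never typed
print(honest_query(query))   # exactly what the user asked for
```

The point of the sketch is simply that the same underlying model sits behind both functions; the difference between a slanted answer and an honest one can live entirely in which front door the request goes through.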
Do you think tech companies have a responsibility not just to prevent these ideological biases from taking hold in their AI in the first place, but to go above and beyond that, to actually assure their customers that they've gone to great lengths to be as objective and as accommodating to all as possible with AI?

Well, yes, they do have a responsibility, because they sought government protection with Section 230. If you take that protection and you are not going to be a publisher, you're going to be a utility, then you cannot engage in editorial bias and become a publisher. And that's what most of these tech companies have done. You were being published: you didn't realize it, but you had an editorial board at YouTube that decided to demonetize you. Right. That's publishing, because they editorially decided your content was not favorable to their audience. Whatever. Let's take this to the next step. Who basically nominated those people to be judging you? Your audience judged you and you get an A; their audience judged you and you got a D. My problem with that is, who are they to judge? Right. I prefer not to judge. Right.

Right. It's so interesting, because you're getting now into the ethical nature of things, what you get into in the book on how these technocrats, in effect, have an ethical responsibility in terms of how they're using this technology. Can you flesh that out a bit for us? What role will you play in the development of this type of innovation?

In the book, I talk about a digital bill of rights. It's super important to me. The digital bill of rights establishes ownership. We have lost ownership. You probably heard the famous quote from a game developer that said, "Get used to it. You don't have to own anything." And frankly, I'm not going to get used to that, because ownership establishes a bond between human beings and the things they own. We used to own our photo albums. We used to be able to go toward death knowing the photo album was going to be handed down to the family. We never went toward death realizing that all our photographs are owned by a corporation that has an exclusive license on how to use the photographs. When you think of the difference between ownership 30 years ago and ownership today in the digital world, corporations are basically saying, "Abdicate your ownership, abdicate what you own, and also abdicate your thoughts, because we know better. We know what you should own and what you shouldn't own." For example, PlayStation. This scheme is insane. You bought a whole bunch of games and you bought some movies from PlayStation, and then PlayStation does not renew the rights to those movies and games. Suddenly, you get a letter from PlayStation stating, "We're sorry, but we just lost ownership of 200 titles and you lose all your titles." My question is, what about that user, who should be called a human being instead of a user? That human being just got told that the game he owns is no longer owned by him, even though he bought it. In my future, we're going to have a digital wallet like in crypto. You have a lot of crypto people that come on your show and talk about crypto. Well, they use a general wallet, and that wallet is available. It's transparent. It's available for anyone to look at, so you know what different people have in their crypto wallets.
Well, in that same way, we should have a general ledger for everyone's images and photographs, and you can either have them out there in the big bad world or you can rescind access. You should be able to withdraw those photographs from Meta or from X or from any of the platforms, and those pictures should just come down. Because if we agree that crypto is such a cool way to bring freedom into the idea of fiat currency, the idea that you can own something that the government doesn't have access to, then couldn't we use that same technology to bring back ownership, true ownership? The internet that we have right now, internet 2.0, is just not an ownership-based economy. We need an internet 3.0 and a bill of rights, a user's bill of rights, and so on.

Gang, this is the book, Killer Tech and the Drive to Save Humanity. Just click on the link below to get your copy, or you can go to markstross.com directly to order a copy. What I love about it is that it doesn't just explore the types of technology infiltrating our lives and the dangers and the like; it's hopeful. You love tech. This is what you do. You're a CTO. You love tech, and you give people a road map on how to take back control, and that seems to be the key. If I may, a theme that comes out of this book is that it's a matter of control. Do we control tech, or does tech control us? So tell us a little bit about that. Can you just flesh out what you were really trying to convey overall in terms of Killer Tech and in terms of the positive message there?

Yes. The positive message is that we're human beings, and we should have a relationship with technology that actually augments us, not depreciates us. The best way for me to describe what I want is that I would, for example, like a positive TikTok instead of a negative TikTok, and it is possible, because China's version of TikTok is positivity. It actually ferrets out the very best representations of Chinese culture, whereas our version of TikTok ferrets out the very basest parts of our culture. So what I want is a future where technology represents human beings in a civil, human way and doesn't basically take away our humanity and deconstruct us into just zeros and ones. That's where we are today. So a future where our digital lives are respected, a future in which we confront our digital dependency and we start to move away from addiction. You heard what I said, dependency, Steve, because most of us are, let's face it, a little bit addicted to our devices, and we all need those devices in order to do our work. I'm not suggesting that we suddenly stop using those devices, but like with soda, we went from a society that didn't think sugar water was bad for us to one where we all agree that sugar water should be consumed in moderation, and we have now adapted sugar water into our lives in a much more holistic way. We can do that with our phones, we can do that with technology, but it's going to take people like myself and yourself bringing these topics up, making sure that we have a human discussion about it and not alienating people.
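Circling back to the general ledger of ownership that Mark sketches above: here is a minimal, hypothetical Python sketch of the idea under very simplified assumptions, an in-memory registry where each photo stays tied to its human owner and access granted to a platform can be rescinded at any time. It is a thought experiment only, not how Meta, X, PlayStation, or any blockchain actually works.

```python
# Minimal, in-memory sketch of the "general ledger of ownership" idea:
# each item is permanently tied to its human owner, and access granted to a
# platform can be rescinded at any time. A real system would need a shared,
# tamper-evident ledger; a plain dictionary stands in for that here.

class OwnershipLedger:
    def __init__(self):
        # item_id -> {"owner": owner name, "granted_to": set of platform names}
        self._entries: dict[str, dict] = {}

    def register(self, item_id: str, owner: str) -> None:
        """Record that a photo or other item belongs to a specific human being."""
        self._entries[item_id] = {"owner": owner, "granted_to": set()}

    def grant_access(self, item_id: str, platform: str) -> None:
        self._entries[item_id]["granted_to"].add(platform)

    def rescind_access(self, item_id: str, platform: str) -> None:
        """Withdraw the item from a platform; ownership never changes hands."""
        self._entries[item_id]["granted_to"].discard(platform)

    def who_can_use(self, item_id: str) -> set:
        return set(self._entries[item_id]["granted_to"])

ledger = OwnershipLedger()
ledger.register("family-photo-001", owner="Jane Doe")
ledger.grant_access("family-photo-001", "SocialPlatformX")
ledger.rescind_access("family-photo-001", "SocialPlatformX")
print(ledger.who_can_use("family-photo-001"))  # set() -- the photo has been withdrawn
```

The design point the sketch tries to capture is that access and ownership are separate records: a platform only ever holds a revocable grant, never the title.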
I want to make it clear: how can I call any of you guys addicted when I am probably more addicted than you are, because I use these tools in order to be current, to talk to you, Steve. And yet these tools, and this is super important, everyone, these tools today gen you up. "Gen up" is a term we use in the book to describe when you have too much information, you're overflowing with information, and you get anxious and you get mad about it, because you want to tell the world about your truth. The only problem is, it's your truth. And the unbelievable part of our society today is that social media has allowed us to design our own narratives, our own news. We only see what we like to see, because we've designed it that way. The reason that occurs is that social media wants to keep your eyeballs on the medium, and the way they do that is they make sure you self-addict by showing you all the content you want to see, and the algorithms make sure that you spend as much time as you possibly can in your own echo chamber.

Right. Wow. Gang, I'll just give you a little insight into some of the chapters. I love these titles. Chapter one, Human Productification; chapter two, Life After Bandwidth; chapter three, The Hacking of Critical Infrastructure; chapter five, The Unseen Cost of Technology, just jumping around here. I like chapter six, Cyber Warfare in Your Kitchen; chapter eight, The Illusion of Ownership. And I love chapter 11, Your Digital Hygiene. And then of course, the final chapter, chapter 12, Bringing Humanity Back. And as I mentioned, at the end of it all you have Tech Action, as you call it, where you list out action items at the end of each chapter that are proactive, positive steps we can all take, parents, professionals, people, to better harness the human-affirming aspects of technology. Do you want to talk a little bit about what the reasoning was behind Tech Action?

Yes. When I was writing the book, actually, you had something to do with this. In some of your podcasts after the 2020 election, you talked about how we had to take action. You kept talking about this. This is actually on you. And I suddenly realized I had written the book, but where was the action? What were we going to do? I mean, you can't just write a book and talk about all this gloom and doom and not come up with action steps and not give hope. So what I'm hoping with Killer Tech and the Drive to Save Humanity is that we have given hope. We give you guys a way out, and we give you guys a way to fight back. This is super important. One thing: local communities should have incubators, Steve. I really believe in this. We have too much talent that leaves the home base. They graduate high school and they leave the local community because the local community has nothing for them. So one thing I'm advocating in the book is that we create local tech centers, where you take local community programmers and people who could actually do very well in the open market, try to entice them to open businesses in their local community, and have local talent support them. Believe it or not, there's enough money in local towns and cities in the United States, with grants and other programs, that we could create successful incubator programs that would get kids off the street, that would actually start to support technology and teach people how to program, because I think that's a very important skill for the future. Not just understanding technology, but understanding how to program AI, for example.
That's going to be an incredibly important field coming out. And finally, to really point the audience to why this book is important: AI is going to erode 40% of the jobs that are out there in the world in the next 10 years. I mean, 40% of the jobs that you see today will not exist. In 10 to 25 years, 30% will be gone; in 40 to 55 years, they're predicting 50% of the job market is gone. Wow. We're talking from factory workers to surgeons. Yeah. It is so wide. Wow. What AI is doing is allowing us to bring in, for example, garbage collectors that aren't union workers, so they can pick up the garbage 24/7. Why would you not want that for your city when you can save money and do better trash pickup? But that also ultimately means you're going to lose those jobs; they're going to be gone, and those are very high-paying jobs today. And anybody that does something like surgery, where it relies on a list of known symptoms and a history of symptoms, can be better served by an AI that sees the whole history of every symptom known to man. Right. And if you have a surgeon working on you, would you prefer the surgeon that doesn't come in with a hangover, that hasn't had a bad night, or the surgeon that's human? Right. Right. Right. Wow.

Everything. This is amazing stuff: Killer Tech and the Drive to Save Humanity by Mark Stross. It's a spectacular book full of insights and actual steps that you and I can take to ensure that the emerging technology around us is helping humanity instead of harming it. It's a wonderful resource written by a fellow patriot, a fellow Turley talker. So click on that link below to get your copy or just go straight to markstross.com. That's markstross.com. Grab your own copy today, and while you're at it, get one for all the patriots in your life, your friends, your loved ones. Make it an early Christmas. I love you for it, and you'll love this book. Killer Tech, by Mark Stross. Mark, thank you, brother. It's awesome seeing you again. Let's do this again real soon.

Wow. Thank you, Steve. What a privilege. Thank you.

Thanks so much for listening to this episode of the Turley Talks podcast. Don't forget to subscribe, leave us a five-star review, and share this episode with your friends. Help us defeat the fake news media and rank us the number one news and commentary podcast all over the world. Come back again tomorrow for another episode celebrating the rise of a new conservative age. (upbeat music)