Archive.fm

Ozone Nightmare

Intent Matters

Duration:
5m
Broadcast on:
01 Jul 2024
Audio Format:
mp3

Today on the 5: Last week I talked about the real issue with generative AI in terms of one specific person and their views on creativity. I hate to sound like a broken record, but when story after story after story after story underlines what I was talking about, I can't keep my mouth shut.

Welcome to Your Daily Five for Monday, July 1st, 2024. Now, this five may seem like a bit of a repetition of the one I did last week about Mira Murati and her comments on creativity. And it kind of is, I will admit, but it's also important, because in just the time since that last five, there have been four news items about different aspects of these generative tools that underscore what I was talking about: it's not about the tools, it's about the people and the way the tools are used. One of the stories was the Perplexity answer engine being deceptive, or, depending on your point of view, outright lying about how it came up with its answers. There was another about a chatbot, whose name I'm losing, that is bypassing paywalls to suck in and ingest information. There was the move by Reddit to tell scrapers from these different companies that they could not use the information on Reddit without an acknowledgement or agreement with Reddit to do so, even though most of its content is posted in the clear. And there were comments by one of the higher-ups in Microsoft's AI division, who was essentially saying, oh, back in the '90s everybody agreed that if you post something on the web, it's fair game for anybody, which a lot of people have a problem with. All of these things, again, reinforce that the tools themselves are just tools. They're just things that people can use to do both good and bad things.
What really matters, and where the focus, the scrutiny, and the criticism should be, is on the people behind these things, because what they continually do, and now they're just outright saying it, as in the case of that Microsoft individual, and Mira Murati, frankly, is admit that they don't care where the information they're using to build these tools comes from. And here's the thing about it; this is what really makes it despicable. If any of these companies, these individuals, were working completely on their own to advance the benevolence of mankind, if they were distributing these tools for free and committed to them always being free, and you could believe that they actually had noble intentions, you might extend some benefit of the doubt to them. You might be able to say, well, this isn't necessarily ideal, but it's the only way they're going to be able to make it work. But that's not what's going on here. These are multimillion- and billion-dollar companies building products that they are going to sell to other multimillion- and billion-dollar companies and to individual users. They're not doing this to somehow elevate humanity. That is not the goal. The goal is to make money. And knowing that that is the goal, the actual intent, the driving force behind these things, you have to question all of their motives and why they're doing the things that they're doing. So will there be positive benefits to these tools? There absolutely will. No reasonable person is against these tools helping people diagnose cancer, identify illness, or come up with new solutions to complex problems. Of course we all want that. But let's be honest with ourselves: it's not that the people making these tools really care about that. That's not their actual goal.
Their goal is to boost the profit margin. Their goal is to make sure the stock keeps going up and that the market rewards them. And in that respect, they don't care whether they're curing cancer or, frankly, inducing it. They don't care. That's a bit of an exaggeration; I don't think anybody in any of these companies is actually trying to create cancer. But the reality is, they will be just as happy with a non-benevolent application of their tools as they will be with something that is actually beneficial to people, the world, whatever. That's not what they actually care about. What they care about is what the market demands they care about, and that is growth and profit. And as long as those are the two driving forces behind the people making the big decisions here, you have to look at all of these tools, not because of the tools themselves, but because of the people behind them, with doubt and with suspicion. And frankly, we have to keep an eye on what's going on here. So when you see somebody say, oh, we're just going to take everything in, because who cares, it's all on the open web anyway, or when somebody says, oh, some creatives don't need to exist, what good are they anyway, they weren't doing that great of work, you have to question what they're actually saying behind that. What are they actually thinking? Where are their values? And ultimately, the values are: how can we make our product the best one around, kill everything else that's trying to compete with us, and dominate the market? Again, that's not a benevolent position to start from. And so, like so many other things, if you start from a place of dubious intent, to keep it somewhat neutral, or questionable intent, if you want to stay in a really gray area on it, then you have to look at the output the same way and ask: is the benevolence really benevolent? Or is it a byproduct and a happy accident for some people?
And that's what I'm saying: it's not the tools. Once again, it's the people. Later.