Boys will be boys? With the use of AI, now boys may become bots. Hold onto your virtual hats, because we're diving into AI and how it can become a nightmare when used against women. Join us as we unpack how AI is reshaping the landscape with manipulated videos and pictures. Spoiler alert: it’s not all glitter and girl power. Tune in as we turn up the heat on the gendered use of AI, deepfakes, and manipulation. This is an important one!
We Are More: Sisters Talk Faith & Feminism
Ep. 12: Boys Will Be Bots
(upbeat music) - Welcome to the We Are More podcast. My name is Alyssa. - And my name is Bree. We're two sisters passionate about all things faith and feminism. - We believe that Jesus trusted, respected, and encouraged women to teach and preach his word, and apparently that's controversial. - Get comfy. (laughs) (upbeat music) - Hello. - Hello. (laughs) - It is hot. - It is hot. (laughs) It has been in the 90s in Michigan this week. - This will be in your past. - Yeah. - Have you survived? (laughs) - Have we? - May not. (laughs) - Future selves, are you alive? - Also, make better life choices. - Yeah. Move away. (laughs) Where? Where's colder than this, Brianna? - I don't know. - You guys wanna move to Florida? - Yeah, but at least you're in Florida. Like, you have a pool and an ocean and Disney. Here, it's just hot and miserable. (laughs) - The Michigan people are gonna come for us. - Pure Michigan, pure Michigan. Thanks, Tim, for that one. - Yeah. - You really put us on the map there. (laughs) Also, all of the lakes that we live in between. - I don't recall those. (laughs) - Where are we? But we are sitting in here. This is how committed we are to you guys. We're sitting in this room with no fans going. We don't have air conditioning in the house. We have those sad, like, cooling towels that you get at Disney around our necks. - Don't, don't out us. (laughs) - I'm just saying, we're very committed to our craft. - We might as well be recording outside. - Yeah. - I mean, really, except the birds would kind of defeat the point. - 'Cause you got pooped on today. - I did. A bird pooped on me today. - And not like a graze, like an actual poop. - What? (laughs) - I just recently watched that commercial. - They have a commercial? - Yeah, have you seen it? They're like, I poop on my barbecue. I poop on my chicken. - Is that one of the Super Bowl commercials? - It may have been. I think it's old. - Oh, I feel like that's striking a memory. - Anyway, I could sing another song related to The Sound of Music. - Oh, good. - But I won't. - Good, 'cause that's a terrible song. - It is terrible. - And I'm really hot. - Yeah. - It's wrong. - In both ways. (laughs) So in this mini episode, this is our third mini episode, right? - Yes. - So we did one on Harrison Butker and his stupid speech. - Butt-ker. (laughs) - You've made that joke every time. We had one come out a couple weeks ago on Bridgerton, which was a super fun one. - That was super fun. - Yeah. We got some really good quotes out of that one. - We did. (laughs) - And today we're gonna be talking a little bit about AI and its impact on women. - AI! - AI! - AI is fricking everywhere. It's in my workplace now. We're using it for monitoring patients. It's crazy. - Well, and I'm in, like, the marketing, editing sphere, and it's just, it's everywhere. You can't get away from it. We use it to edit the stinkin' podcast. Sometimes, is it my voice? Or is it an AI voice? - You'll never know. (laughs) - If it says something stupid, it's the AI! (laughs) - So it's me all the time. - Yeah, it's pretty nice. (laughs) - I haven't been able to figure that feature out. We use an editing software that has the option, like, if we talk over each other, to fix it with AI, or if we mispronounce a word or whatever. Although we've confused it a lot. Kephale, you think it knows how to say that? - No, it's not biblical. This is not a biblical editing software. (laughs) - They'll come out with that eventually, yes. - Maybe we will.
- Maybe, but we will not pronounce it. (laughs) We'll find somebody else. - Yeah. - Anyway, so we've seen that explode, I mean, everywhere. You can't get away from it, I know. And so we wanted to talk a little bit about that, because Bree saw this crazy TikTok video and sent it to me the other day. - I got swindled. - You did? - I just got swindled by AI. Okay, I'm addicted to TikTok. I will confess. I spend many hours a day on the ticks and the toks. And I saw this video come through that was Kim Kardashian, and she was announcing that she was transitioning. She was gonna be-- - That she was-- - Mr. Kim Kardashian. - Yeah, transgender, right? - And I was like, holy cow. And they had these images of her essentially in drag, which I don't know if that's the appropriate term or not. So I sent it to you. But it wasn't true. It was all AI. - Mm-hmm. - They're able to use AI on people's faces, manipulate their voices, make it look like it's a legit video. It looked like an interview. And it was based off of an interview that she did, but that is not what she said. - Yeah, I looked into it, and it was an interview where she said she was transitioning into acting. - Oh, good for her. - I mean, no, the world does not need that. (laughing) But instead, they made it say that she was transitioning into a man. And the only way that I caught onto it at all is I was reading through the comments, and people were like, I can't find this anywhere else. Like, why can't I find this interview? And then somebody did find the interview and posted what it actually said. - Leave it to people on the internet. They are like little detectives out there. - Yeah, it's crazy to me. And in this particular case, yes, you have people that can kind of seek it out and whatever, because she's famous enough. - Yeah. - That if you fake something, there's enough of a record that people can go back. - Yeah. - But how can something like this, where you can edit a voice, you can edit a picture, you can edit a video, affect someone who doesn't have the resources to prove that it's not true, right? - Well, there's a term for it now. It's called deepfakes. We looked it up. It's essentially what we're describing right now: using AI to manipulate videos, photos, voices, to do or say whatever you want. And it's super dangerous, not just to women, but we'll be talking about it relating to women. 'Cause that's what we do. I am a woman. Hear me. Sorry. (laughing) - Well, there's a lot going on today. (laughing) - Yeah, I've lost my train of thought now. It's a lot. (laughing) Have you seen that video of Taylor Swift where she was performing in Brazil or somewhere extremely hot? And in between songs, she's just, like, (breathing heavily) gasping for breath, and all of a sudden the music hits and she just snaps back into it. - I don't know how she does that. I don't know, 'cause I would pass away. - I am not built for heat. - No. - When I go and buy makeup, I buy the lightest shade of foundation they make. I was not built for the sun. - You are translucent. - I am. - You're a blonde translucent. - You're a citizen of the moon. - Oh, now we've gone weird. - That's from Parks and Recreation. (laughing) Anyway, back on topic. - So honestly, when we were looking up what we should talk about for this podcast, the reason that this topic came up was because I was thinking about AI. I deal with AI every day at work. And all I did was look up, how does AI affect women?
And I was thinking not about this necessarily, but about the workplace, because you see a lot of women in marketing, things like that, and whether it would kind of displace workers. But to think that this is gonna disproportionately affect women by being manipulated with pictures and videos. They say it's a new form of sexual harassment or assault against women, or men too. - Absolutely, yeah. I mean, for sure, it's used against men. I think we just see it as more likely to happen to women, because pictures, videos, things like that have already been so disproportionately used against women. - I was listening to a podcast before we started recording this that said that that happened to Taylor Swift. A bunch of AI-generated nudes of her came out. Yeah, like super explicit, nasty stuff. And if that's happening to Taylor Swift, think about what's happening to kids in school, right? To minors, to, like, all kinds of people. It's super disturbing. - Well, and I mean, think about what that can be used to do. All you have to get is a picture of someone's face, and how many pictures of our faces are out there? - Well, you have to scan your face every time to open your phone, right? - And we post things on social media, because you think, well, it's Facebook and I've got it all locked down. - Yeah, it's private. - Right, but your profile picture's up there, your cover photo's up there. Do you really know every friend that you have? - I have a lot in purgatory. - You do. - Several old men in purgatory, actually. - Yeah, well, think of why they've friended you. - Because I'm beautiful. - Yeah, that's it. But it's just, it's one of those things that is so scary about the world today. - And how do you even protect yourself from it? - Like, you can't be totally not online at all. That's not really an option in today's world. Everything's online. You have to have a Google account to walk into Meijer, right? - Which is a store by us, but it's like... - I know, it always throws me off that other people don't know what Meijer is. - A grocery. - A grocery. - I wish that we had some big message here, like, here's how to protect yourself from the use of AI. I don't know that that's a thing yet. I don't think it is. I think AI is still so experimental, and people are still figuring it out, but people need to be really wary of it, I would say. And for the women out there, as sad as it is, and as much as this is always the advice, you have to keep your guard up. - Yep. - You have to be even more careful. And that shouldn't be, that shouldn't be how it is. No. You shouldn't be worried that the kid next door to you is gonna sneak a creepy picture of you and throw it all over the internet. - Well, one of the things I wanna talk about too is how pictures are generally used against women. So, an example of this even before AI. So again, this is an even bigger problem with AI, but it's not a new problem because of AI. I'm not sure if you guys have heard this story, but there was a woman, her name was Mica Miller. And she unalived herself, yeah. She unalived herself, unfortunately, recently. A few months ago. And her story goes that she was married to this pastor. And he was incredibly abusive. She was very vocal about him being abusive as she was trying to leave. Other people were aware of the situation. And she had told people close to her that if she ever died, it was him. He did it. Now, part of the abuse from him was that he had convinced her to send nude pictures.
And I don't know the nature of the nude pictures. But they were married. You know, there's nothing inherently wrong with that, except that he had kind of forced her to send them. It wasn't like she wanted to. It was, you know, like, just part of their relationship; he had kind of forced that out of her. And then he used those pictures to blackmail her. - To blackmail her. That's horrible. - Yeah. Now, the rest of the story, I would highly recommend looking it up. It's very important. It's a very important story, I think. Because, as Bree said, she... I don't know how to say that without it being... - That's what they say on TikTok, because if you say anything, they... - Right. They take it down. - And it was ruled that that's what happened. Now, you can decide for yourself whether you think that's actually what happened or not. A lot of people do not. A lot of people think her husband was at fault. But either way, from the abuse, I think regardless, he was at fault. - Right. She was driven. - Even if he didn't do it, he drove her to it with a lot of manipulation, a lot of hatefulness. And this man was a pastor. Like, it doesn't get more disgusting than that. And to look at that situation and know that was without the use of AI, that was just manipulation. And that was a man who was married to this woman. He's supposed to love her. He's supposed to care for her. And he pulled this crap. So what happens now? Like, now that this whole avenue is open? - One of the podcasts, I don't know that this necessarily relates to what we're talking about, but one of the podcasts was also saying that it's not just making images of people or exploiting them in that way. It's also historical. So you can take voices of old presidents, you can take voices of people of importance, and make them say what you want them to say. So there's, like, a video going around of Nixon's voice, essentially saying that the moon landing never happened. But it was all AI-generated. - But people don't know that, right? And unless you do some serious digging, you think, well, I found my evidence. But how much more can you do with anything else? - One of the podcasts that I was listening to was saying so many people are arguing, like, well, it's not as offensive, because these AI images of women that they're stealing or creating aren't real. But they are real in very real ways. They're really going to affect someone. And if you really want a picture of someone that's just not real, don't use someone's face. - Yeah. Why do you need her face in there if you're just computer-generating something? That doesn't affect her in any way? - And let's talk about the issue of consent. Like, she never consented to be in your pictures. At the very least, get her consent to do that. - Right. Exactly. - Yeah. I struggle with a lot of this stuff, but consent matters. It absolutely matters, in more than just the ways that you're thinking. Even if you choose to keep these AI-generated images or videos or whatever to yourself, if she didn't consent, that is still assault. And the law will have to catch up to these things. However, there are laws in place for, essentially, sexual manipulation. - I remember when we were younger, in our area, at one of the schools, a student sent around some pictures of another student, and they were both minors. She was a minor. He was a minor. And he sent it to one person, who sent it to the next person, who sent it to the next person, and eventually the whole school had access to this picture.
And a lot of people faulted her for having taken the picture in the first place, that it was somehow something that she had done wrong. Now, that's insane, because no matter what she's chosen to do, that boy had absolutely no right, ever, to share it. - No. - That was, again, consent. But with the advent of AI, you now no longer even have that excuse, because she never made the picture. She never sent you the picture. Maybe you pulled it offline. Maybe you snuck a creepy picture while you were at work, whatever it is. She didn't send you a nude photo. She didn't send you an inappropriate photo. - I feel like, on some level, we're making this sound like, here's the scary world, filled with it. But I think the purpose of this podcast, at least for me, is just awareness of the situation. For women to be aware that it could happen, and to be cautious, and to kind of steel yourself in case that happens to you. What would you do? How would you handle it? I don't know what to tell you in that scenario. It's horrific, and I hope that you would be able to press charges. But also, for men, to make them aware that this is not okay. No. Not in any world is it okay. - I think it's easy to convince yourself that anything you do in the walls of your home is your business, and it's fine. The devil is great at justifying terrible behavior. And at the end of the day, just because you're doing something in private doesn't make it okay. Think about who else this is affecting. It's not just you, if you're stealing those pictures offline. Even if, again, even if it's someone you don't know, celebrities, strangers, it's not okay. - Well, you think, because I never have to interact with that person, they're never going to know me, that this makes it fine. But it doesn't. It doesn't. - People think that celebrities, because they live their lives so out in the open and publicly, they have some kind of right to them, and you don't. You never have rights to someone else's body, even if it's a picture. - I'm interested to see how this situation progresses, because at least from where I'm sitting, we at work have AI detectors, so I'll have a writer send me in an article, and I can run it through an AI detector to tell me, essentially, did ChatGPT write this article? So it'll be interesting to see, do those things pop up for videos or pictures, and how does that function? The future has some great technology ahead of us. - For us, we use it to track people's dental hygiene and how their teeth are tracking with Invisalign. It's a very cool system. It tracks, like, a hundred different, I don't know, it confuses me. Like, it is cool, and using it in that way, where you can make someone's life better, awesome. When you're using it maliciously, well, it's such a dangerous tool. - Like, for the podcast and our editing software, it saves me so much time. Like, the reason that we're able to do these mini episodes is because we use an editing software that automatically filters out all the times we say, which is a lot. And I'm going to have to tell it not to filter that one out. Leave that one alone. And honestly, you guys don't hear it as much, but we say it a lot. We say a lot of ums, and we edit out a lot of crap for you guys. You're welcome. And that software allows me to do it quickly and simply, without spending four years in this un-air-conditioned room trying to deal with it. To be honest, this may be the end for me. I may pass away. We have an hour-long episode to record after this. And now, a song.
- Don't trust the whole universe, I think that's the wrong message. That's not what we want. I think something that's important to bring up is, like, if you see something, and this is like the oldest, well, maybe not the oldest phrase in the book: always be cautious of what you see on the internet. Don't assume that what you see is real. Don't assume Kim Kardashian is going to be a man. - Yeah. - Because you'd be wrong. You'd be dead wrong. And don't share it with people if you don't know that it's real. On the plus side, I think TikTok filters a lot of these things. And they're trying to filter the use of AI in a big way. We'll see how that works. And if you see another person that you know using this in a bad way, it's okay to say, right in front of them, that's not okay. - Right. Well, I think in Christian circles, they say that p*rn is just massively prevalent. It's just everywhere in Christian circles. And I think it's because of what I just said: you think God doesn't care about what you do in private. Or maybe not that he doesn't care, but that it's not as bad. - Right. - God cares. God cares. God cares how you treat people. He cares about the woman that you are assaulting. That you don't think you're assaulting, because you've justified it in your head. But this is all outlined for us in the Bible, where God says love each other, love your neighbor, love your brothers and sisters. If you truly love someone, you don't treat them like this. Would you want someone doing that to you or your child or anyone you loved? - Yeah. - You wouldn't, if you respect yourself in any way, or anybody you love. You wouldn't. - I think it's easy to lie to yourself. Like, people will say, well, I wouldn't care. I wouldn't care. And that justifies it for them. And I understand that that is probably going to be the justification for a lot of people: why would I care if someone did this to me? It doesn't really matter if you wouldn't care. Would they care? Did you ask them? - Yeah. Did you ask them? - I'm going to guess you didn't. I'm going to guess you didn't. And if they found out what you were doing, would you feel ashamed of yourself? I would. Would you try to hide what you're doing? - Yeah. - Because if you're trying to hide it, you probably know that it's wrong. - Right. And my concern, too, is, like, you know, we think about this as, like, the gross people that are kind of at the bottom of society and whatever. It's not. It's not. It's not. It's pastors, it's your neighbor, it's your boss. Like, I don't know. - Right. Well, that's why, it just, I hate saying to women, be careful, watch yourself, be more cautious around men, because we already, I mean, we didn't do an episode on the bear analogy, but yeah. If you haven't heard it, it's essentially asking women, would you rather run into a bear or a man if you were walking alone in the forest? And most women, I would say a huge percentage of women, said that they would rather run into the bear. And that's been very controversial, because men have said, well, not all men are bad, not all men are bad. And no one is saying that all men are bad. I don't think anybody is saying that. Maybe, like, two people, but I don't think that's the prevalent message here. The message is that we have to be careful of every single man. Because even if it's one in a hundred, one in a thousand, if I'm not careful, I'm likely to be the next victim, and I'm not willing for that to happen.
- So I have to be careful of all 100 men instead of just the one. Because we've been conditioned to be that way, too, because most women have run into situations in their life where they've run into men and been treated inappropriately. Whether that be, like, physically or verbally. I think that's probably universal. Like, I don't think there's... - Well, think back to, like, the Me Too movement. People, especially in Christian circles, kind of villainized it. They villainized it because, well, I don't know exactly what they were saying, but that should show you how big of a problem this is. - Right. Well, there's that Dua Lipa song, "Boys Will Be Boys." - Yeah. - Yeah. So one of the lines in that is, "sick intuition that they taught us so we won't freak out." And that is so, I listened to it today, that is so impactful. Because you think back, as a little girl. - Yeah. - If there's a man in the aisle at the grocery store, don't go in that aisle. If you're walking in the parking lot, have your keys between your fingers. If you're walking back from class, have your pepper spray out. When you walk in your house after you get home from work, don't turn all the lights on right away, because you don't want people to know when you get home, right? And I don't think either of us are saying that this can't impact men also. - No, it's not that. It's just from our perspective. - Of what women were taught. And unfortunately, our mothers and our grandmothers had to teach this to us. - Mm-hmm. - There's no fault there, in them teaching it to us. They had to. But why did they have to? Because we haven't taught our boys to respect women from the jump. - Right. Right. - Is that a bad way to say that? From the beginning. [laughter] - Exactly, that's exactly it. And you can see that move through society. And so it needs to be addressed, it needs to be called out. People need to say to men, "We expect more of you." - Mm-hmm. - So I guess maybe that's the purpose of this podcast. Not to scare women, not to say, like, "You need to have an exit plan," whatever. But to tell men, I don't know how many men are listening to our feminism podcast. - Hello, men. [laughter] - Nathan's listening. - Hello, man. [laughter] - Hey, maybe your boyfriend, Curtis. - Curtis? [laughter] Are you still in want of a wife? - Anyway. But for the men out there, if you hear this, if someone sends it to you, do better. Not just for you. Maybe you're doing great. I don't know. - Yeah. - I'm assuming you are. Do amazing things. Tell your sons to do better. Tell your friends to do better. Check in on them and make sure they're not behaving this way. Because it's like, remember the song from when we were kids, "This Little Light of Mine"? - I'm gonna let it shine. - Who's going to sing now, Alyssa? - I'm not going to sing. You're going to sing. - I just did. - You did. But it's like, if you, as a man, because you have more influence in the circles with the other men, if you decide that you're going to be a better person, that you're going to be above this kind of behavior, that you're going to fight for the rights of women, let that light shine, and then maybe it affects just one other person. And then they take that, and it affects two other people. It's like, what was that movie where the kid did one good act and then... - I know what you're talking about. - Yeah. - All I can think of is The Giving Tree. - Nope. [laughter] That has fully escaped my brain. But yeah. - The Perfect Gift? - No. That's not it. - Anyway. - Eat Pray Love? - That is definitely not it.
- So let your light shine. [laughter] That's the message. - "I got all I need when I got you and I." - Flashlight. [laughter] - All right, guys, we're going to wind this down, because I am going to die of heat stroke at any moment. - I've already gone. I'm just talking from the grave. The beyond. Ah. We listened to the Haunted Mansion theme song today. - You have a problem. - I can't wait to go. - I know. Very excited. Sarah mentioned it to me today. She goes, "Bree! It's coming up so soon. Are you excited?" And I was like... [laughter] - It's like, are you more excited than me? - Maybe. - All right. We'll see you guys next time. Actually, we'll see you guys in about five minutes, when we record the next episode. So if you notice some consistency, like, "Oh, they sound like they're going to die," we recorded these two episodes at the same time. Don't worry. - Back to back. - All right. We'll see you guys soon. Bye-bye. (upbeat music)