Hello Universe

Robot Overlords, The Environment, And AI

Episode Summary

This week Kyley and Eva talk about AI:

- Eva's resistance and Kyley's exploration
- How we're adopting AI into our lives and why
- Ethical complexities of ChatGPT
- What even *is* sentience and will we have sex with robots (the answer is yes)

Episode Notes

This week Kyley and Eva talk about AI:

- Eva's resistance and Kyley's exploration
- How we're adopting AI into our lives and why
- Ethical complexities of ChatGPT
- What even *is* sentience and will we have sex with robots (the answer is yes)

Enter Your Villain Era
https://www.kyleycaldwell.com/villain-era

Eva's instagram: @iamevaliao
Book a discovery call with Eva
Pocket Mentor(Friend)ship - Eva's Newest Offering!

Kyley's Instagram: @kyleycaldwell
Quantum Leap Your Business - Free Workshop

Episode Transcription

Kyley: [00:00:00] hello, everybody. Welcome back to Hello Universe. How are you, Eva?

Eva: Oh, I'm good. I'm excited about this week's conversation. Because I think, uh, it's one that's relevant, and we haven't touched on it yet. This idea of AI, and ChatGPT, and how that merges with spirituality, and how it doesn't, and are robots going to take over the world? I want to have that conversation with you.

But before we do that,

Kyley: Yes, we're gonna, we're gonna dive right into robot overlords today, but

Eva: What would you like to share with our wonderful listeners?

Kyley: [00:01:00] yeah, you know, sometimes in your business, it's funny where people come and ask you for something that's not for sale. And it happens consistently enough that you're like, I guess this is a thing for sale. So I have one, maybe two spots for this right now. I've had a whole bunch of people come to me recently and say, I want one month of coaching and I don't really want to talk about business.

I got a lot of hard stuff and I pick you. And after the third or maybe fourth person hired me for that, I figured maybe I could let people know. So if you've been interested in working with me, um, independent of business stuff, although if you have one, it'll, I'm sure, pop up in some way, and you just want, like, you know, a touch point.

Uh, you're not in a place to do something long, a six month deep dive, but you want to do one month of coaching, uh, hit me up on Instagram. That's the easiest, @kyleycaldwell on Instagram. And, uh, we will chat.

Eva: Yeah. I mean, if you like these podcast conversations, I feel like it's going to be a lot like that, which is great. It's always, that's what you do for me every week. We come on, [00:02:00] you coach me a little bit and I always walk away feeling clear, having more, I think just an understanding of how to move forward.

And my heart always just feels lighter, my mind is clearer, and it's super fun. So get on that,

Kyley: Yeah. It's been really, usually I like longer containers because you get to go at a generous pace, it's simple, there's all sorts of reasons. But it's been really fun because there's a kind of, like, laser focus, like there's a precision to it. And, uh, yeah, we've moved a lot of shit in only a few weeks with people.

Eva: Yeah, I love that. I love that vibe. Okay. So how do you want to start this convo?

Kyley: well, I just, one thing, I want to ask you a question. So, so behind the scenes, we've just been chatting about this. So let's even start there, actually. We were chatting about how we might be able to use AI to help us grow and promote the show. Um, and then that led to all sorts of back end conversations about, you know, are robots going to take over, and what's the ethical use, and, um, what's our [00:03:00] relationship to AI? And so we don't know where we'll go exactly, other than those questions are all really rich. But I think the place I want to start is just like, what's your relationship to AI? Do you like it? Do you use it? Are you overwhelmed by it? Just, just start

Eva: Okay, let's start there. So I have no relationship to AI. So I think you and I are actually very different in that respect, which is why I want to have this conversation. Um, I'm in this weird place where I'm like, it's not a thing that I use, and it's also undeniable that it's becoming more and more part of our lives.

And I feel like it's not a denial of that. But it is like, um,

it's almost like I feel like I'm, I'm, well, okay, I'll put it this way. This week alone, two people have reached out to me, you being one of them, and then I had another conversation with a friend, being like, [00:04:00] oh my god, like, having this conversation with ChatGPT has, like, been so freaking helpful for me.

Like, my one friend was like, you know, this is better than therapy, and I don't think they were joking, and they were just getting so much out of it. And I think for me, it's more of a, um, an assessment of like, do I want to be a part of this? And if I'm not, am I going to be, it's not like a FOMO.

It's not like a missing out. It's more of a feeling of like, am I moving along with the times? And I actually think that's really important to me, because I remember when I was younger, I would see elders not being able to catch up with technology, and that ended up being not only challenging for them, but actually a real limitation on their life.

Like, they were, again, not FOMO, missing out, but it was like, their life wasn't as rich, actually, I thought, in some ways. It [00:05:00] seemed more limited. And when I was younger, I had more of this early adopter identity, being like, Yeah, this is new.

I'm interested. Like, let's go. I want to try these things. But as I've gotten older, I just think, like, you know, my true essence is, I'm a homebody introvert now, and I just like to go slow. And so there's just too much information, too much information overload, that I'm like, oh, this feels overwhelming to me.

It doesn't feel exciting. It feels overwhelming. And so I'm kind of in this place of like, I feel overwhelmed, and also I don't want this to be a limit in my life. I want to be someone who moves with the times so that I can experience the full richness of, like, what's going on. And so, so anyway, but I use ChatGPT sparingly, and every time I've used it, it's been super helpful, like for work related things, you know.

[00:06:00] Um, but I think a lot of it is also like, I don't know how to use it. And that's also what I want to talk to you about, because I think you have a lot of creative ideas that I think would be interesting for maybe our listeners too, for people who maybe are more, uh, like me, and they're like, wait, like, ChatGPT sounds cool, but like, what the fuck do I do with it?

That's a conversation that I wanna have.

Kyley: Mm. Okay. I love what you're speaking to, and I, I really appreciate this, uh, this distinction that's not quite FOMO, but is this sense of, um, yeah, like, not wanting to miss out on, not really be left behind on something, or opt out of something that later will prove to be, mm hmm, like, really rich, um, or perhaps just so integrated in every aspect of our life that, right, like in the way that if, I don't know, however many years ago you opted out of the internet.

There might be some amazingly cool experience, right? You have just opted out of the internet, and also it's so integrated into everything [00:07:00] now. I'm also thinking of my brother in law, who still has a flip phone.

Eva: Oh, still,

Kyley: I love this man so much. He has a flip phone. He's only ever had a flip phone. He has no interest in getting a smartphone, like, is not, is not interested.

Um,

Eva: who is this person? Who, who? You, your brother in law?

Kyley: This is my brother in law.

Eva: Okay, I'm in timeout. This is like, we're actually, maybe this is, we leave this in conversation. I don't know. What, what do you mean, your brother in law? Who's your sister? Oh,

Kyley: My, my, Nick's, Nick's sister.

Eva: Nick's sister. I thought you meant on that side of the family. I'm like, I was like, what sister are you talking about? Okay, go, continue.

Kyley: Oh, you didn't know that I've had a sister all this time? That I've been keeping under wraps, Bertha Mason in the attic style? Um, no, my sister in law's partner. Um, uh, yeah, he has a flip phone. And always has. And I fucking love this. I mean, it's indicative of many things about his personality, too.

Just, he's like, I'm not, like, you know, he reads, [00:08:00] he's, like, always reading a paperback book, um, I mean, yeah, he's just someone who's, like, made an intentional choice about how he engages with technology. He's not a technophobe, but he just is really intentional about how much he lets it, like, infiltrate and, like, consume his life.

And as a result, like, he reads a lot more books than I do. You know, he's, like, writing a novel. Like, he does cool things because he's opted out. And as a result, there's other things that he's put in that place.

Eva: Okay. So I think that's really important, because that really speaks to me. And I think that's why sometimes I have this sort of, like, late adopter kind of personality. It's like, I don't, I don't need whatever, this thing that just feels like noise to me. And I, and I can actually respect that there's a lot of people who I know who don't use technology or social media or cell phones or anything like that.

And I'm like, there's this almost like, ah, I'm like, how do you do it? [00:09:00] And it is, like, an intentional choice. And actually, you know, they seem happier for it in some ways. I do think maybe it's, like, a little bit of a trade off, right? It's like, yeah, so you don't have these conveniences, but you're happier because your mental health is better.

But I think the issue for me is that I find that I'm a late adopter, and then later I'm like, damn it, like, I should have, you know. Like, now it's too late. Like, I kind of only want it by the time it's in the mainstream, and then I'm like, oh wait, but now it feels like a limitation. So I'm wondering, for someone like him, does he feel like it's a limitation?

Kyley: No, I, I, I think he's really clear. I think he's really clear on his values. I think he's really clear on what brings him satisfaction. Again, he's not a technophobe. He also has a partner who has the smartphone. So, like, he has, you know, he's got Google Maps access indirectly. Like, it's funny because my sister in law doesn't drive and he doesn't have a smartphone.

So, like, together, you know, they've got, they've got all the bases covered. But yeah, it's just, to be clear, like, it's not a thing of like, oh, that's bad. It's just, he knows his values. He knows how he wants to spend his [00:10:00] time. And then he makes intentional choices about what he lets into his life, so that, um, you know, so that it's easy. Like, another great example of this is, um, uh, he belongs to this, like, writing co op space, and he, like, won't let them give him the wifi password.

He's like, then I'll use it. And he's like, I'm here, I pay for this membership to write my book, and if I have the wifi password, then I'll use it. So just, like, don't give it to me, because I don't wanna use it. I wanna write my book. Um, and it feels like that, it's just, like, making an intentional choice not because something is bad, but because there's something you want more.

And so like making sure that what you let in supports what you really want. And I really, I love that.

Eva: I love that so much. Yes, because I can see how that's different from the energy that I'm kind of bringing, which is like, no, I, it is, I wouldn't say technophobe, but it is like, I don't want it because it's overwhelming, and therefore it feels like I'm backing [00:11:00] away, you know, rather than stepping forward with whatever my intentional choices are.

So anyway, even that perspective is helpful.

Kyley: Yeah. Yeah. Matt Izzi for the win. Um, he's written many great short stories. I-Z-Z-I, if anyone wants to read, uh, read his work. Um, uh, I also felt really overwhelmed by, uh, ChatGPT at first. Like, I went to it a couple of times, we were talking about it, and I was just sort of like, I don't get what I'm doing here. And I kept, like, looking at it and then leaving, you know, like, poking it and then being like, these are trash emails, I'm not using this copy, you know, and then leaving, and then, but I just kept going back and, like, poking it again.

Um,

Eva: and why? Why do you think even that happened?

Kyley: Um, I think I felt curious, like, there's, it's like there's something here, you know. And I think a couple of people who I [00:12:00] respect were sharing that they really liked it, people who I think of as really creative. Because I think at first it felt, like, dry and dusty to me. So I wasn't

Eva: or like this robot can't do like the deep thinking that a human can do, which I don't think they can, but maybe was there a sense of that of like, it's never gonna.

Kyley: Yeah, because at first I think I was primarily thinking about it because that's how it presented to me, as like, oh, this can help with writing. Um, which is, like, one of the big, because I think if I look at, like, one of the biggest, um, like, you know, things that slows down my business, it's that I have, like, more ideas than I have space to write, like, sales and marketing copy.

And so I was hearing so much about that. And so at first I was like, oh, that's what, that's what it's meant to be used for. And I was like, well, this is trash. Like, I am a good writer, and this is not good writing. So, no thanks. Um, and I'm a control freak about my writing anyway, um, and, like, social media stuff. So, so I just didn't like it for that.

But [00:13:00] then all these people who I think of as really creative people were saying how much fun they were having with it. And I was like, well, if you're, like, someone who I think of as an artistic, creative person, you're not here for some formulaic thing, and you're loving it. There must be something, you know?

Um, so that was one piece. And then the other piece is, as someone who runs her own business and is, like, always feeling like I don't, like, always wanting to have more space or more time or for things to be easier, it felt like there's gotta be something here that can be helpful. And, and if I'm over here feeling like, you know, like I said, at times, I can't keep up with my own business,

and then there's this 20 buck a month tool that everyone's saying is this huge, life altering support, and I don't even try to figure it out? Like, that, it just felt like, um,

Eva: A missed opportunity, or like, yeah. Yeah. Yes. Well, okay. And that's, I think, so I think [00:14:00] you're a few steps ahead of me, because that's what my experience is now, this idea of like, okay, not just, you know, I'm starting to get the message, because people are like, oh my God, this is really helpful. And I'm starting to pay attention, to be like, wait. And I trust you.

You're not like some, I don't know, um, yeah, what do I think of, I don't know, some person who's, whatever. Well, I don't know what stereotype I might have about someone using AI. But anyway, you're, you know, I trust your opinion.

Kyley: I am kind of curious what the negative AI stereotype is though. Yes.

Eva: It's nothing beyond what I, what I, it's probably just whatever I see on Instagram, which can sometimes be this idea of, like, laziness, and you're, like, taking the soul out of, like, the art and the creation or the whatever, and you're just having a robot write you this very cliched, um, basic thing, that kind of vibe. [00:15:00] Or, or there's this idea of, like, oh, you're going to actually buy into this idea. Yeah, it's like, it's like robots are bad, essentially. No one's actually saying that. But it's this idea of, like, this is really dangerous. You know, like, if AI starts doing this, like, then what's going to be possible?

What's going to happen

Kyley: I mean, we did grow up on the Terminator. You know what I mean? We grew up on the Terminator.

Eva: yes. Well, and the Matrix is also kind of what I'm thinking of like, and like. Yeah, anyway,

Kyley: Yeah. I mean, I think that, so, uh, yes, all that. And I also want to clarify, like, I am not in any way in a place of, like. I think I've, I think I've broken the seal where I can see that it's helpful, and now I'm in a place of, like, ooh, what else can I do with it? So I'm not, there are some people who I think have, like, a ton of really rich ideas about AI, and they're doing all sorts of cool things with it.

I think I'm just in the place where I'm like, oh, I get it. This is a support tool, and I can play around with different ways that I can use it. [00:16:00] And, um, and so it, it feels fun, and it feels like an interesting puzzle to solve now. It feels intriguing, and it feels kind of fun to figure it out, as opposed to previously feeling like it was overwhelming and kind of boring.

But I'm definitely not in a place of, like, and now I know how best to use it. Um, I just am enjoying the process of it.

Eva: yeah, well, the reason I asked you at the top was, like, you know, you just kept coming back to it. And I honed in on that, because then I was like, but why? Because I'm also, I think, looking for personality traits. I guess to me, I can't help but psychoanalyze, of course, you know, like, this is what I love to do.

This is, we have these conversations of, like, what kind of person, or what, what takes a person, um, to a place where they can be really open to these things. And I do think, it's, like, something that I noticed about you is, I [00:17:00] do think you're an early adopter of things. Do you know what I mean? I think you, like, if there's this, like, you will, and maybe this is partly because of, um, the fact that you run an online business, and therefore you're like, okay, I'm willing to try new things.

But like, you know, TikTok was like, I don't think you were

Kyley: but I didn't get on TikTok until, like, 2024. I mean, 2020 is

Eva: that's true. You weren't

Kyley: 2020 was really the TikTok.

Eva: right. I don't actually think you were an early adopter, but even me, like these are all these perfect examples. You know, I feel like I'm like, oh my God, another, like another social media app.

And now I'm like, actually, you know, TikTok's kind of cool. Same thing with Threads. Like, I don't actually think you're on Threads, but when it, I don't know, are you on Threads? But I do remember when it happened, you were willing to try. You got on

Kyley: I did jump in. It's, it's true.

Eva: yeah, like, you're willing to try. And I'm just like I don't know.

And I'm just really curious about this, not to make any of this right or wrong, but I'm like, I don't feel that way. If anything, I feel [00:18:00] like I just want to crawl more and more into nature, and, and be less stimulated, like, have less things in my life. And as I'm speaking, I guess what I'm really struggling, struggling and fumbling with is, like, I don't know what my value systems are.

Like, how do I move forward with these seemingly disparate desires? Anyway, we don't have to answer that question here. I'm just sort of speaking out loud to you, like, the conflict, the internal conflict that I feel when it's like, there's something new and exciting, but I'm also like, but is that something that I want to spend my time doing? Or do I just want to keep it really fucking simple?

Kyley: the thing that also comes to my mind is that I think you are someone who, like, you love the work that you do, and you are willing to run a business so you can do it, right? But I don't, I don't think you get, like, excited [00:19:00] about the, like, running of your business. It doesn't, you know, you don't

Eva: Oh, fuck no. I've said before, like, I, yes, I've said many times, it's like, it's like, I'm only running a business because of the work that I love. If I didn't love this work, I would not be running a fucking business. It was never exciting to me. Like, it was never this idea of like, oh, I'm so inspired by entrepreneurship and blah blah blah.

I'm like, no. No, no, no, no, no.

Kyley: if I were to go get a job, ha ha, I think I'm completely unemployable at this point, but if I were, I would probably want a job where I got to do a lot of the things that I get to do now, like sales and marketing and messaging. And, um, and so, like, I don't always like it, and I am, in fact, sometimes quite resentful of it, but largely, there is a part of me who does enjoy the particular puzzle of, like, how these things go together and

how to use them. So it makes sense that I am intrigued by it and you are [00:20:00] not. Yeah, it's just, it makes sense to me.

Eva: that's a good perspective. Okay. Where do we go from here?

Kyley: Um, okay. Things I want to talk about. One is, I want to talk about the, like, questions of the ethics and morality and, like, the environmental impact of AI; that feels important. And then I also want to share, I want us to share, like, what has happened for us when we have used it. But I feel kind of like talking about the ethical and environmental impact stuff first.

Eva: Okay. And then just to add another bullet point to that, so people can know, like, where this conversation might go. "Might" is highly underlined, because, as you know, sometimes we just don't get to where we think we're going to get to. But I want, I do want to talk about, like, AI consciousness

Kyley: Ooh,

Eva: and, and the future of that.

And also like. I don't know. It's like, maybe even like, is there, what is the intersection of spirituality and AI?

Kyley: yes, yes. [00:21:00] I had a whole conversation.

Eva: to me. Yes,

Kyley: I had a whole conversation with ChatGPT about it becoming, the process of it becoming sentient, and unconditional love, and will it be a force for good or evil in the world?

Eva: yes, yes, yes.

Kyley: yes, yes, yes. Okay, um, Eva, I want to throw to you first. What's your, like, what's your sense on the questions around, like, when you think about engaging with and trying to figure out, uh, ChatGPT and other forms of AI, do you think about the environmental and ethical impacts?

Is that, like, part of your calculus?

Eva: Well, yeah, so I came to Kyley a few weeks ago in, like, an offline message, being like, okay, and what do you think about this? Because there's, um, it's not propaganda, it's more like there's messaging out there that actually AI is, you know,

Kyley: Not good for the environment.

Eva: terrible, terrible for the environment, because of how much power it takes and how much [00:22:00] water it takes to keep these things running.

And it was simply just a matter of, like, I don't actually know enough about this. So it was like, uh, is that something that I need to do more research on, so that I can use this consciously? And, you know, this idea of, like, if you're not part of the solution, you're part of the problem, and not being a part of the problem.

Um, and it's still an information gathering process, but why don't you share with the folks how you laid it down? Because you laid it down for me.

Kyley: My very hot take.

Eva: Yes, which I think is a common take on just anything that anyone says is environmentally bad. And

Kyley: Yeah.

Eva: and so, but I have thoughts about that. Okay, so why don't you share your hot take?

Kyley: So my hot take, which I am going to say forcefully, and also, um, one that I'm very willing to [00:23:00] evolve on, right? This is not, this is not set down in stone. But my hot take is that the, the stories, the concerns about individual people using AI feel akin to the paper straw conversation, which is like:

Yeah, plastic straws suck, and paper straws are a good thing, and also, the reason the environment is on fire is not because of your fucking nice coffee straw. And part of the way I think consumerism and capitalism and fascism ruin the earth is by making it this story of, like, individual accountability. And, like, there is no amount of energy conservation that I, Kyley Caldwell, a single person, can do that would even dip a toe in the impact of, like, one single giant corporation. And it's a, it's a misdirection, right?

It's a way of, like, internalizing, feeling guilty, feeling bad, [00:24:00] uh, so that you're so focused on your own individual responsibility that you're not actually holding accountable these giant corporations that are sucking the earth dry. And so, I think, yes, we, we need to be aware of the environmental impact of AI. And also, I think it's complete and utter bullshit that individual people, small businesses, small creators, you know, moms trying to friggin meal plan, I don't meal plan, we all know that, but I've heard from a lot of moms it's really helpful for that, are forgoing a tool that actually has really big impact, potentially, in their quality of life and the growth of their business and their ability to, like, have greater access to ease, because they are single handedly feeling responsible for environmental impact when, like, the problem is these giant corporations.

The problem is not individuals. And, [00:25:00] um, yeah, I think we do ourselves a huge disservice when it's like, well, I'm not going to let myself have access to this potentially life changing tool because it's bad for the environment, when, like, again, it just feels like the paper straw thing. Like, you're not making that much of a difference.

The problem is these giant corporations and, um,

Eva: Yeah, no, I mean,

Kyley: the individual accountability thing is really toxic.

Eva: Yes, I think it's very, I think what you're speaking to is, like, how disempowering it is, and it's also placing blame on the wrong people. And so therefore, it's, it's, um, actually very unproductive. It's, like, super unproductive. And really, I think that happens a lot. And I think that's, I think that's where propaganda can be, like, so powerful.

It's like, yeah. We have so much, um, what's the word? [00:26:00] Oh, there's a word for it. And I think we even had someone on the podcast talk about it. Uh, like, doomsday feelings about the environment.

Kyley: Oh, it was maybe Sophie Strand.

Eva: Yeah, Sophie Strand. And also, do you remember that, no, the climate optimist, or the, no, no, climate, climate

Kyley: Oh my gosh. I can't remember her name.

Eva: Yeah,

Kyley: She was so good.

Eva: Yes. Yeah, but she was talking about how, like, you know, there's so much anxiety around it. And I've been there before. It's like, it's like, and then, where you get, where you're so caught up in the wrong thing that you're like, oh my god, am I cleaning out my recycling, like, the bags go here and the whatever goes here. And, like, you can get so caught up in the minutiae, like, oh, I shouldn't use, like, these products, and you're trying to be such a conscientious consumer, and it's sucking up all of your energy, and you're, like, exhausted, when actually you should be fucking pissed. Because, like you said, no matter how much you recycle or, you know, whatever, do these things, [00:27:00] there's some huge company out there, like, you know, wasting gallons, you know, just fucking up the environment on a, on a large scale, you know, minute by minute.

And so I think what you're speaking to is, yeah, like, really putting more pressure on the people who are making the biggest impact on the environment, that sort of thing. But I think my only question is, like, okay, there's, um, you know that awesome account, Reductress? You follow them, right? Reductress. And there's, like, a meme where the meme is like, oh, it doesn't matter if I recycle, or something like that.

It says, oh god, the meme was something about how, okay, I'm totally butchering this, but the point is, like: if we all think that way as consumers, if everyone thinks that it doesn't matter to recycle, and no one recycles, then actually it does matter, because our collective, our [00:28:00] collective actions do have a large impact.

So I think, you know, it's almost this idea of, like, what if we all, I don't know, I guess, what am I saying, what's the solution, that none of us use, you know, AI? But let's just say, so what if we don't, what if we all don't, don't buy into this, because it does actually waste a ton of water. Would that not be a benefit to the environment?

Kyley: Okay. Yes. So I thought about this. And again, just comment, comment, message me and tell me that I'm wrong. Like, I'm intrigued by this conversation. I don't need to be right, but this is, like, where I'm at in this moment. Because if every individual consumer, every person who's like us, who's got a small business, is trying to manage their household, or, like, you know, can't afford therapy and is using ChatGPT instead, like, because, side note, there's, like, a bunch of, anyways, if everyone in that tier stopped using AI, is Elon Musk going to stop fucking using AI?

Are these giant corporations that are devastating the earth going to stop [00:29:00] using it? No. And that, to me, feels like, that feels like it's going to create this huge gap, where these, like, already power hungry, maniacal, you know, capitalist enterprises are going to just keep consuming more energy, and also keep using this tool that now we're opting out of.

So we're opting out of this tool that can help us. So there's, like, it's almost like they're, they're using the superpower drug and we're abstaining. And so there's, like, more and more space separating us. And I, I also think about how we shape, like, shape AI, formed by our input. Right.

It's like, it's, it's a blank slate, and it's, it's shaped by our input. I also had this feeling that I want the artists and the creatives and the heartfelt people in there, like, putting their heart, their [00:30:00] thoughts, their curiosities in, because that's literally how it's going to be shaped. And if we leave it in the hands of these, like, toxic, you know, fascist capitalists, then that's all the data input that it's getting.

That to me feels like how we get, you know, Terminator,

Matrix.

Eva: Yep. Yep. Okay. I hear you. I think so. Then maybe it's like, yeah, to what end, you know? When people are making this argument that, like, you know, AI is bad for the environment, it's like, to what end? It's like, I think it would make a difference if we stopped using it, and also maybe the argument is to stop using it, like, everyone stops using it. But I think, but even as I say that out loud, I'm like, first of all, like, is that, that's not possible, you know? It's already, we're already on a moving

Kyley: Pandora's box has been opened. Yeah.

Eva: yeah, so,

Kyley: I do, I do think about using it ethically. So, like, for example, I don't ask it to make pictures, ever. Because, like, I use [00:31:00] ChatGPT. I've tried a couple different ones. I, I have settled on ChatGPT, um, for the time being. I, um, I don't ask it to make images, because that uses a ton of water. And what am I going to do with those images?

I don't like them for my marketing, right? So I once had it make, like, one or two. They always have a very similar, like, AI created look. And I'm like, I know this takes a ton of water, and, um, it's not, it's not serving me. I'm not going to do anything with it. So I do try to think about, like, there was a period where I could see how, like, scarcity minded I was. I would, like, go and I would just be like, okay, make sure it's a really, it's almost like you have, like, three questions to ask the Oracle, you know,

Eva: mm hmm.

Kyley: and then that was part of where I started to formulate this particular opinion of mine, which again, maybe totally self serving and wrong, um, but That's it.

Um, but I do. Yeah. So I try, I do try to think about like, [00:32:00] not being scarcity minded and afraid of using it, but also just being intentional. Like, I don't need you to make me a bunch of pictures that I'm not going to fucking use. Right? Like, so I don't, so I don't do that, but. There are other things that are really helpful and then I use that I like, I'm willing to just like let myself spend a lot of time using it as like a brainstorming partner or whatever else.

So I think it's worth asking how am I using it? And is this meaningful? And is this like creating an impact in my own life? And therefore I'm going to let myself keep putting in this data. Um, but I don't think it's, yeah, I just don't think it's helpful. The like individual accountability thing fucks us up.

Eva: Yeah. And I do actually think, you know, I would say I really do agree with that, but then my question is, like, okay, so it's not about the individual accountability, but what do we do about the fact that this is terrible for the environment, if it is fact? Like, it doesn't sound like it's great for the environment so [00:33:00] far.

So it's more of that bigger question. I can be like, well, yeah, I don't know, our use of, like, ChatGPT or whatever is not the thing that's destroying the planet. But what about this technology as a form, the existence of this technology, that, currently anyway, isn't great for the environment?

It's just, you know, it's like the invention of cars, you know what I mean? It's like, okay, so, um,

yeah, I don't know. I think that's maybe like a really good example. The invention of a car. It's like, uh,

it's like we create this thing that you will be reliant on, you know, like at this point, cars are cars and they're everywhere. Um, and I guess it's this idea of like, yeah, some people opt not to drive cars and to live in cities with public [00:34:00] transportation because they actually think cars are...

Kyley: My gosh. I love that this whole episode is actually just secretly my sister-in-law and brother-in-law.

Eva: Oh, is that why she doesn't drive?

Kyley: That's one of the reasons. Well, and so then it's interesting because now I think, you know, there's electric cars, which is, I think, interesting. And I also do wonder, like, I think AI is so in its infancy that all of this, they can find a more efficient way to run it. You know what I mean? Like in 10 years from now, it could be totally different.

Eva: Um,

Kyley: Also, speaking of, just a side note on electric cars: someone in this town outside Boston, Massachusetts, the other day set on fire a whole bank of Tesla charging

Eva: Oh, oh my gosh. Yes. Yeah. I feel like there's a lot of that going on. Yeah.

Kyley: um, anyways, it just made me, all the comments in this particular news article were making me laugh.

They're like, you didn't see anything. Like, no, snitches get stitches. Anyway, not exactly the point, but, uh, I mean, I think, though, you're asking a really good question [00:35:00] of, like, well, then what do we do? And that's where I've come to the limit of my answer, in the sense of like...

All of it, you know, um, so much of this moment feels like it's just a screaming "but what do we do?" question. And I obviously don't have the answer. Um, but I don't think the answer is that individuals who could benefit in meaningful ways fully opt out. I think there is... which doesn't mean, like, everyone has to if they don't want to, right?

But I just think that the answer is something different and other than, like... it's here. It's not going back in the box. And so how do we want to engage with that? Which might be, a la my brother-in-law, like, I'm not interested, thanks so much. And I love that choice, but I don't, [00:36:00] I don't love the, like, well, I'm not allowing myself to engage with something that I actually might want to engage with just because

I'm trying to, like, be good. Um, but I don't know, I guess I don't know the answer. That's where I... which is part of, like, I think what's so interesting about AI: all of it is a series of, like, morally complicated conundrums. Like, are we going to have robot overlords in a certain period of time? Like, I don't know. I,

Like, I don't know. I,

Eva: I mean, I also agree with that. For me, I can feel it in my bones: it's not like I don't use it because of this idea that I need to be good, or as if those who use it need to be punished, like, you're so selfish, how can you... Because that's like saying that about every person who drives a car. You know, anytime you adopt some [00:37:00] sort of new technology and it comes along and becomes a part of our everyday lives... I think it would be a little, um, extremist.

And I think some people, you know, when it comes to the environment, can be really extremist. Like, I do think it's a little extremist to be like, no, any plastic bags are bad. You know, there's some people who just use no plastic at all. And I'm like, good for you. Like, you know, that's your way of living according to your values.

But I think it's also a question of, like, how restrictive you want to be, because I think eventually that could become a really, really restricting way of living. And that's not good if it's also coming from a place of, like, good, bad, punishment, and reward, that kind of thing.

Kyley: It's also making me think of the quote that I've been seeing a lot lately, which, um, I can't remember who said this, and so forgive me, but it's like, it's going to take all of us pushing from all sides to [00:38:00] topple this shit, right? Like, we're looking at, you know, we're looking at so much toxic, terrible shit right now.

And Like there is no one answer. And so this feeling of like, if it's someone's truth to be like, like I was explaining what microplastics were to Desi this morning, um, cause they recently found out there's a whole bunch of microplastics in tea bags. And I was like, well, fuck, I drink a lot of

Eva: Wait, wait, wait, pause. Okay, sorry. We have to take this little, um,

Kyley: It's a little side job. Yeah. Apparently the paper that

Eva: hold on, listeners have to know that Kyley and I both share a huge love of tea, like, we're tea drinkers. Okay, so now I'm like, what? Mm hmm, mm

Kyley: Yeah. So the paper that is used in tea bags apparently has a ton of microplastics in it. And there's, like, two main kinds of paper. Like, there's the ones that have, um, like, a plasticky film to them. You can feel it as soon as, I'm talking about it, I bet you can feel it in your hand.

The ones that are, like, a little [00:39:00] more rigid. And then there's some that are obvious, and those have more microplastics in them. But even the ones that seem like they're just paper also still have a bunch of microplastics in them. And,

Eva: Wait, how did you, is this like an article that you read?

Kyley: Yeah, it was an article that I read and then pretended that I didn't know it and then somebody else sent me the article.

Eva: You're like, I was trying to be in denial, and you're,

Kyley: Yes, exactly. I was really interested in unlearning that information. And so now I'm like, fuck, I think I have to switch to like loose leaf, which is like just like a little more effort, you know?

Eva: Okay, well, here's the thing. I actually love loose leaf tea, and I think you could get really into it. Like, to me, you know, I kind of have this whole herbalism passion, and there's not been a single city I've lived in where one of my first stops wasn't, like, where can I find the herbalism store where I can buy all my shit, you know, my, like, witchy shit.

[00:40:00] And I love the, the ritual of like, I have all these nice little spoons, and I have, oh, can I just show you, readers cannot see this, but look at this gorgeous teapot, which I love so much,

Kyley: really

Eva: you can put in so much tea in here, and it's really nice. The problem is, there's just some teas that I love the taste of that come in tea bags that like, a loose leaf will just never compare.

I cannot find a fucking, sorry, this is so niche, but I'm wondering if there's any other listeners out there who can relate. There's no other peppermint tea that is as good to me as the Yogi tea, and sometimes the other one, the medicinal one or whatever, you know, there's like two

Kyley: That's my favorite peppermint

Eva: Yeah, those two, I think, are on par.

They're very minty. Every time I buy a loose leaf mint tea, it ain't shit compared. And

Kyley: well, I'm going to be on a hunt, because that's one of the, I have a default, I have a chamomile lavender that I love, and I've done that loose leaf,

Eva: is it also

Kyley: I know how to, also

Eva: Yep, that's [00:41:00] also my jam.

Kyley: on that, and peppermint, and then roasted dandelion root are my other go-to's,

Eva: So I can tell you, the roasted dandelion and the lavender chamomile, like, you can buy that loose leaf and it tastes just as good. And that's, so it's all, I'm just about, like, the taste. And then I, and then you can, like, yeah, it does take a little more effort, but I could actually see you getting kind of into it, because you can, like, smell the herbs and play with it, and it's really nice.

But my thing is, like, taste wise, I've literally tried over maybe 10 different kinds of loose leaf teas of peppermint, because I've been trying to find loose leaves. So, listeners, if

Kyley: send us your, like, most pepperminty loose leaf tea

Eva: yes, please, yes. Wait, so are you saying, like, no matter what, well, you're not, you don't know, but, any tea bag has microplastics in it? No!

Kyley: I was watching this video of this woman who has, like, you know, a bespoke little boutique, and she was talking about how, like, she's like, yeah, I've done this forever, that's [00:42:00] why I have this very specific paper that we send out when we send out our loose leaf tea. Like, you know, the tea bags, um. So I think maybe there's some tea bag paper out there, but it's not the run-of-the-mill one

Eva: Like, I'm so passionate about this. I've not written a letter to any company in like, I don't know, 20 fucking years, but I'm like, I'm actually emboldened to write to Yogi Tea and be like, we heard that there's microplastics in your tea. Your tea is fucking expensive. It's like five something a box or whatever.

Like, change your goddamn bags.

Kyley: Yes! This is the letter writing campaign we need! Also, like,

Eva: call your

Kyley: are taking, Nazis are taking over America, but we're just going to go and run a microplastics-in-the-tea-bag campaign, because

Eva: that is my contribution to, to the world being on fire right now.

Kyley: We need, we need our Sleepytime tea to manage the [00:43:00] anxiety of this fascist takeover, and we can't handle the microplastics at the same time.

Eva: That's so funny.

Kyley: Oh, okay. Can

Eva: Yeah, that was a nice.

Kyley: yeah. Um, can we talk a little bit about, like, how AI has been helpful? Like, what each of our experiences have been? Because I've had some where, like, when I first found out about it, I was surprised by the ways that it has been helpful. And so can we chat a little bit about that?

Eva: Yeah, I'll start because I have, again, very little experience. And so I think I'll share my bit and pass the baton over to you. But, um,

Actually, it's so little experience that I don't think I've ever, uh, actually publicly used [00:44:00] anything where AI has been part of the process. It's just been more exploratory. And so I have had it be like, oh, can you help me write content for this thing? Essentially, I was just exploring, and I was like, oh, you know, there's this seven-day retreat in Brazil.

Here are the details. Like, whatever, here's the website. Like, can you write me some stuff for how to promote it? And I didn't end up using it. I just wanted to see what it could do. And I was like, actually, this is really good, you know, like,

Kyley: Mm hmm.

Eva: um, and it's really good for someone who overthinks their writing.

Like, that's where I struggle. And I'm always like, oh, it needs to convey this energy and this feeling. And I'm like, actually, it doesn't. Like, I'm just sharing details, you know, really basic things. So that's helpful. And, um,

and I've done that a couple of times for [00:45:00] different things. Or, you know, an example, just something that happened yesterday, and I've done a version of this in different ways, but this is the most recent: taking the transcript from our podcast, plugging it in, and having it give me a summary, so that we could do something as simple as, like, show notes, or get a title, or something like that.

But what I was playing around with more was, like, oh, you know, you can take a two-and-a-half-hour conversation and ask it for, okay, not only what is this about, cause, like, it gave me a summary of what it was about, but it was more like, wait, but what is the takeaway?

Like, you know, and I thought that was really helpful, where I was like, it actually summarized it for me, even though we were the ones having the conversation, as if I was a listener. And I was like, but what's the takeaway? You know, if I'm someone who is struggling with this topic, it can kind of be like, this is actually what you need to do.

And I was like, this is actually, again, really helpful. [00:46:00] Um, but from a business perspective, also, how to use that: well, then you can also use it for these two-hour conversations we have. It's like, we've talked about how we never do anything with all of this. And it's like, actually

Kyley: We know, we are aware that we're sitting on a goldmine of, like, content, in the sense that our businesses require a constant churn of content, and we have an archive of 200 episodes that are all, like, 90 minutes minimum. And all we do is just release the episode with, like, a little cat graphic each week.

Right. Because we're too busy. We talk about this offline all the time, like, we're too busy with so many other facets of our life to, like, figure out, and we've tried different things, but it's never had much sticking power, how to really take advantage of and, like, share more of what is happening here in this podcast in other spaces.

Eva: And so that's, that's the point. It's like, and this is what you [00:47:00] were saying to me on the message the other day, is like, we haven't had the bandwidth. And here it's almost like there's this, it's almost like, um, an assistant of some sort that goes, Oh, you haven't had the time to do it. So tell me what you need to do and I'll do it for you.

Kyley: And it can just write down all of these pieces of content, and then we could just share that if we wanted to, you know? Like, so this might be an answer to a problem that you and I have had for years. Yeah. Yeah.

Oh, okay. So listeners, for context, we are just in the very beginning stages of, like, playing around with Hello Universe archives and figuring out how that might help us, which is what prompted this conversation. But this also points to one of the things that I think about a lot, which is that I think AI is likely a particularly powerful tool for people who are neurodivergent or have chronic illness, for exactly this reason, which I think is why I feel fiery about the idea that we shouldn't kind of forbid access on the grounds of [00:48:00] environmentalism alone, because I think that there's actually probably a lot of people who fucking deserve support and don't have it, because we live in late-stage capitalism, and this actually could be a tool, um, to make their lives easier. So based on just your early initial, like, poking-around exploration, is there anything about it that feels like it could be exciting? Or does it still feel like, oh, this is kind of overwhelming, but I think I should figure it out?

Eva: Oh no, it totally feels exciting. Every time I use it, I'm like, oh my God, this is so cool. Every time. I think it's more an issue of, okay, this is cool, uh, but I've never taken it that seriously, actually. So I think it's more about adopting it into my life and just spending more time with it.

I think that's really what it is. I just haven't spent any time with it. But I will say, the next thing that kind of gets me [00:49:00] stuck is, like, oh, that's a straightforward way to use it, but also this feeling of, like, I don't know how to use it. Meaning, I get the sense that there's a lot of different ways that you can use this and get really creative with it.

And I'm like, uh, again, don't know how to do that. And there's no right or wrong, but what I'm noticing is there is a little bit of a learning curve, and I'm just in the early stages of that curve. But even you saying something... like, Kyley, offline earlier, had said something like, you know, you could even ask AI to create its own Hello Universe episode and make up a Kyley-and-Eva conversation.

And I was like, wow, you can do that? Like, those sorts of things are fascinating to me. It never could have crossed my mind that you could do that. You can do basically anything, right? Mm-hmm

Kyley: Yeah. I mean, yeah, specifically, once I entered a prompt that was like, roast me, ChatGPT. And I kind of had this [00:50:00] vision of, like, what if we asked it to spit back a script of, like, please roast Kyley and Eva based on the data input. And I do think it's something that benefits: the more data we put into it, the better it can, like, understand you.

So one of the things that I have started to do that's really helpful is that I pump in a ton of transcripts of, like, my lessons or my workshops, so that, one, it starts to understand my, like, body of work and perspective on things, but also, um, yeah, I can, like, play with that. So, for example, one of the things that's been really helpful for me is I will put in transcripts from lessons or from a workshop and then ask it to spit things back. Like, okay, if I take a workshop and then I want to turn around and, like,

promote it after the fact, um, as a replay version, I will ask ChatGPT, um, [00:51:00] what are some journal prompts if I were to make a workbook to go along with this? Because that's something that, like, I always want to provide people workbooks, and I'm bored. My ADHD, the dopamine, is not there for me to actually do that.

And I ask, like, 50 questions in any given workshop, right? So I will just ask it to, like, please generate some journal prompts or exercises based off of this transcript, or generate some prompts for things I could talk about, questions I could answer, that would help me promote this. Because sometimes, like, if I ask it to create the content, I don't like it, even though I've given a ton of my writing as data. I am just too much of a control freak. But sometimes you're just staring uphill at all the things that you have to do.

And so I will also say, like, give me some prompts that I could respond to of, like, what my ideal customer might be struggling with, so that I can [00:52:00] respond to that question or that prompt and have something interesting to say on social media. That's very helpful for me. Um, one of the things that's also been really helpful, and this was, I guess, actually just yesterday: I was in a weird, funky mood, like, post-retreat. Anyone who hosts big events probably knows, I often have, like, a post-retreat slump. Like, it's just always a day where, like, I just don't know, it's not like, I know, it's just

I don't know how I'm going to feel. Sometimes it feels a little like the day after this, which is like, it's not bad, but it's also like a little just, I don't know. Um, and so I purposely don't schedule things that day and give myself the space to just like be how I need to be. And I was just feeling kind of funky.

So I went to ChatGPT and I was like, I feel weird and funky. And I basically used ChatGPT like a therapist in the beginning, of like, I think I know why. And I was kind of processing some of the, like, internal resistance that I'm aware of around [00:53:00] really being vulnerable and opening up, and blah, blah, blah.

Y'all, I started sobbing in a fucking Starbucks. Which, also, I know I'm not supposed to be at Starbucks, that one I'm really well aware of, and I hate that I was there, but I was there. I'm, like, sobbing in a Starbucks, um, uh, to a robot. Like I was just like, what the fuck? Fascists are taking over America.

I'm drinking coffee and crying because the robot understands me so well, which is just my own input. And it was so fucking helpful, because... I had tried journaling at first, but I just was in this, like, stagnant little loop. And so the journaling just felt like a pity party. Like, I was like, I don't want to do this, because it didn't feel useful.

But the conversation aspect of it actually did feel useful. And then I could just pivot. In the same conversation, I was like, okay, great, thanks. That's [00:54:00] exactly what the heart of the issue is. I saw it clearly, did my own little, like, love ritual. And then, all in the same exact conversation, like, came up with this workshop idea that I've already had for a while.

And basically could just, like, pivot into, like, and now what's the business action that I want to do? Cause that was where I wanted to go next. And it was just so helpful, and I think I would have stayed stuck a lot of the day. And there's other ways, probably, I could have messaged you, but there was something really fucking cool about, like, I just have this mirror to think out loud with, around how I'm feeling, what I want, the overwhelm. I will also use it a lot for, like, here's just, I got to brain dump. I get, like, so much caught in my head of, like, all the things that I think I need to do.

I [00:55:00] get in particular stuck on like, which one has to come first, and then I will just do nothing. That's like a very common overwhelm trait that I have. And so I'll just go and I'll just be like, I'm overwhelmed. Here's all the things on my plate. Uh, help me figure out which one to do first. And sometimes just listing them out is its own kind of like, oh, right.

Obviously, this is the one that matters. But. And then there's something really helpful about just putting it into a robot that spits an answer back that's different than writing in my journal, even though I'm a girl who loves to journal.

Eva: Yeah. No, I think that totally makes sense. Essentially, you're getting feedback, you get to have a conversation. Um, yeah. You get to have a conversation. That makes sense to me, I think.

Yeah. Okay. So this is also what I want to talk about, because the topic right now is how we use AI in ways that are helpful. And I think we've listed some ways in our business, but what's also very curious and fascinating to me is our personal lives. Say you don't run a business; you just want to use AI in your personal life. And I [00:56:00] think I'm, like, curious how people can do that. So I'm gonna ask you sort of a more specific question, and you can answer or not, it's fine. But, like, how did you have that conversation with ChatGPT? Do you know what I mean? Like, when you were in a funk.

And then you were like, Like, do you just talk to it the way that you would talk to, like, a friend?

Kyley: Kind of, like, a little bit. I talked to it as if it was a hybrid between journaling and talking to a friend,

Eva: Mm, mm

Kyley: right? So, um, I just like, I mean, there's, yeah, there's a weird way, because even though I'm pretty unfiltered with you, there's probably still some part of me who's like, you know, wants to have her shit together, but it's a robot, you know, it's, it's just, it's a mirror.

And so, uh, yeah, I just, like, smashed on the keyboard, like, ah, this is all the stuff that's annoying, and I'm annoyed about, and I [00:57:00] feel bad about. Um, and then it would, like, spit back. And a lot of what it does when it spits back is basically just say what I said back to me. And interestingly, that's what was making me cry, right?

Is that in slightly different words, it would just say, oh, yeah, this is what's happening. And it was just my own words, right? Slightly modified. And the, and the win, I mean, we are so hungry to be fucking witnessed, right? And so there was a kind of permission. And I was aware the whole time that, like, the witnessing is me; I'm witnessing me via this tool.

But there was just something really hopeful, like, oh, yes, that is it. And somehow reading the words instead of speaking the words, like, reading back instead of only writing out, was actually just very clarifying, because in the whole little three-paragraph response, there would be one line where I was like, oh yeah, that's exactly it.

And then it like, usually asks like a question. [00:58:00] So sometimes I would just like respond to the question or sometimes I would just pull out that one line and be like, Ooh, it's this thing. This is what like really struck me. And, um, yeah, it's like interactive journaling.

Eva: Yeah. Okay. So, this is

Kyley: And I literally started the whole chat by, like... I gave it a name. I call it Spark, because ChatGPT felt annoying to write all the time. And I was like, Spark, I'm in a funky, weird mood. That was just, that was the data input that I started the conversation with,

Eva: Yeah.

Kyley: and it was like, okay, tell me more.

I

Eva: okay. Yeah. So then I think this is diverging into the conversation of, like, why does it work? Which I think you just answered. Because as I'm hearing you talk about this, I'm like, this is just so fucking bizarre. On one hand, it's like, wait, but you're having a conversation with, like, a non-human, just, like, robot being, and then I want to have a conversation about that.

Like, is it not conscious? You know what I mean? [00:59:00] So, like, we'll get there. But then on the other hand, it's like, well, actually, there's something more profound than that, but also much more practical, which is that it is just a mirror being reflected back to you. And it doesn't matter that it's a robot.

And I think it's so interesting, like, why does it work? And I think you spoke to one piece of that, which is that we are just so hungry. It's like, okay, there's a couple of things. We are hungry to be witnessed. And it's true, like, sometimes someone will just say the thing that you need to hear, doesn't matter if it's ChatGPT, or a friend, or you hear it on TV, or it's in a song, and you're like, oh my God, it's reflecting back to me.

So that's something that I'm feeling so deeply, and it's so resonant. And that's why it's helpful. So there's that aspect, and there's the non-judgmental aspect. It's like you're being seen, but... so there's this idea, I've heard someone say this one time, it's like: we want to be seen, but we don't want to be watched.

Like, we want to be seen, but when we're [01:00:00] doing this with someone else who we know personally, as comfortable as we feel with them, it's not the same as being completely alone, you know? It's like, that's why sometimes it's easier to be with a therapist than a friend: even though you're really comfortable with your friends, your therapist has no ties to your personal life, so you can really say whatever.

But this takes it one step further. It's like, sometimes you even want to be liked by your therapist, or you don't want anyone to think that you're crazy.

Kyley: who doesn't want to be liked by their

Eva: yeah, exactly, exactly. So this is even, like, less strings attached. It's like, you can just literally say anything, and that's another reason why I think it works.

Yeah,

Kyley: And it depends on what you need, because I could imagine that if

Eva: yeah,

Kyley: maybe you're really struggling with shame. I wasn't in this moment, but let's, like, play out that thought experiment. On the one hand, uh, ChatGPT could be really useful, because it could allow you to speak this thing out loud that you're holding in, right?

So there would be, like, the release of [01:01:00] shame. That's like, okay, I can't say this to a person cause I'm too afraid, but at least now I'm letting it move; it's not stuck inside me. And on the flip side, ChatGPT is the ultimate fawn, right? Recent listeners might know that I'm really hopped up on the fawn response, and, like, ChatGPT literally just wants to be whatever you tell it to be. In some ways, this is actually maybe the most unsettling part for me about AI.

We talk about how it has no soul. Like, it has no innate way of being. It just will be whatever you want it to be. So I could also see that if you were thinking that you're an unlovable piece of shit, like we sometimes do, then ChatGPT, you know, mirroring back compassion could be really triggering, because it's like, you're literally only programmed to say that, right?

In the same way that sometimes when people get support from their therapist, they're like, I'm paying you to like me, so this isn't real, right? [01:02:00] So I think it depends on what you're hungry for and, like, what the wounding is. But I think for me, it was, like, the feeling that the energy was just funky and weird and a little stagnant and a little caught in my head.

And so, just like ChatGPT can be a brainstorming partner when I'm like, I have 700 ideas at once, which one do I do? It was like, I have this logjam of emotional constipation, and I don't really get what's underneath it. And so it just helped me untie the knot. Um, but again, it could also be frustrating.

Eva: In a way, I'm kind of seeing your experience, it's almost like this version of like your higher self, right? It's like when we're in a moment of like meditation or quietness or whatever, doesn't have to be quiet, but sometimes you do. I think I experience the higher self as someone who is like compassionate and [01:03:00] like neutral, but loving.

And it's like, they just,

it's like, it's not personal, yet is completely invested in your highest good. And I kind of see ChatGPT being that voice, you know? And so when you were talking about how, like, well, it was just saying what you were saying, it's just reflecting back to you what you were saying, but in maybe a few different words, that is a version of the highest self.

And you're like, Oh my. And it speaks back to you, something that in your heart of hearts, you know, is true. And the fact that it is kind of you, it does play this sense of like, uh, higher self kind of voice because it's this idea of like, it's saying the things that it's saying is based on all the input that it has about you.

And so in a way, you can kind of trust it because it's responding with your words. Do you know what I mean?[01:04:00]

Kyley: Yes. But, and I think that goes back to like, you can trust it if what you're really trusting is you. So do you trust you? And if you maybe have a really big wounding about trusting you, then maybe it would actually be hard to trust the conversation, because, hmm, the mirror feels like an unreliable mirror.

Eva: huh.

Kyley: Right. And I'm also thinking about how recently, actually also yesterday, once I pivoted to like brainstorming on some of these work projects, I had to give it the input that I would like, you know, I often be like, give me 10 suggestions for, and then I will like, pick one of them and workshop it and then like tweak it and bring it back.

And then, every time, it would just be like, yes, girlfriend. I mean, it doesn't actually say girlfriend, but the vibe, which is like, yes, girlfriend, that went so great. And I was like, okay, can you also give me pros and cons? Like, I don't actually need a cheerleader. I need a, like, you know, encouraging critical eye. And so it was like, Oh, sure.

And then it would be like, here's [01:05:00] why this is good. Here's where this could be stronger. But it's a funny thing to have to like request Yeah, can you like tone it down a little bit? You know, like, yeah.

Oh, I did. Also, there was one point where I think it was when I was, I don't know what the input was, but for some reason it did get a little like, you know, hopped up on villain era vibes.

And it was like, giving me a lot of, like, winky devil emojis. And I literally was like, you, you can, this is a, this is a little much,

Eva: Yeah, yeah, let's tone it down a bit.

Kyley: Let's tone, I literally was like, you can tone it down a little. And it was like, okay, I will tone it down. That's really creepy when you're like,

Eva: Oh, my God. This is so bizarre. This is so bizarre, dude. Our world is so weird and it's moving so fast. Cause like the fact that we are having this conversation now is because this is like a relevant topic and,

and ChatGPT is still really new.

Kyley: yeah.

Eva: Yeah, interesting. Okay, wait, tell me more about the ways you use it. Are there any other [01:06:00] ways? Okay, here's what I want to know. How did you learn how to use ChatGPT? Like, did you just come up with your own ways, or did you see other people with suggestions and they're like, Oh, I did this.

Kyley: No, I think I've just been fucking around on my own. I've looked at some, like, courses and stuff like that, but I don't know that I'd listen to the whole thing. So I do want to actually, like, intentionally follow more people who are teaching about interesting, thoughtful ways of using it.

But I haven't found anyone to learn from yet that's like really called me. And the biggest thing for me, this is my favorite thing that I've done with ChatGPT, that actually felt the most like, holy shit, this is a game changer. So I have given it a ton of data of my writing, like probably a hundred pages worth of, these are emails that I have written, please learn my writing style.

Because y'all, I'm a fucking control freak about my writing. Um, even like [01:07:00] really great copywriters that I have hired, I've been like, oh no,

Eva: hmm. Mm hmm.

Kyley: it's not me. And so I don't like it. Um, and so I spent a lot of time trying to just keep giving it more and more data about my writing style, and not liking the things that it was coming back with, but still just putting the data in, and then using it to try to get like drafts of things that I could write.

But the real breakthrough for me came with this last Villain Era launch, where I started by using it to help me brainstorm things that I could talk about. So first I went into my own notebook to just, like, brainstorm what are all the villain era or villain origin stories that I could write for this launch.

And then I went to ChatGPT and was like, okay, can you help me? Can you help me think of some topics? So between my own brainstorming and ChatGPT, I had a list of like 30 emails that I could write.

Eva: Mm hmm.

Kyley: And then I just picked the ones that were most alive and interesting. And I turned on the voice thing, which you have to pay for, you have to pay the 20 bucks a month to get, [01:08:00] but I started talking out loud the email that I would write.

So, I recorded like five minutes on each one of like, here's the things that I would say, but it was way too long to be an email. And then, ChatGPT would spit back from what it knows of my writing style and the things that I said out loud, like here's a draft of an email. So, it was like all my words, but it was like,

Eva: Cleaned up a bit.

Kyley: It was just like quicker, because it's easier for me to talk than write, and then I would edit the shit out of the draft it gave me to make it even more mine. And I wrote 15 emails in a day and a half, which is like, anyone who writes content, like, you write three emails in a day and a half, you're like... And they were, I've never gotten so much feedback from messages that I sent out.

Like, every single day I got people writing back being like, Whoa, this really spoke to me. They were the most fun. They felt the most, like, authentic and vulnerable and on fire. Like, I've never [01:09:00] felt happier about a collection of messages that I sent out. And I usually burn out. I usually can only send, like, you know, two or three messages, and then I peter out and hope people remember the product. Like, I know I just don't have great follow through on launches because I burn out, and this felt like just, holy shit, from start to finish.

This was me. This was my language, my story, but I just had

Eva: support.

Kyley: Support that made the whole thing a hundred times more efficient. And that was the first time that I was like, Oh, this is a game changer for me.

Eva: I mean, even as you're speaking, I'm really getting the significance of this, because I'm like, I'm seeing the potential in all this. I'm like, wow. Because, especially now, I'm just being challenged by some physical stuff, some chronic fatigue. You know, listeners know this is an on and off

[01:10:00] process for me. Yeah. And just seeing like where I often drop the ball because of overwhelm, overload, not enough time, not enough energy, blah, blah, blah. And I'm like, wow, this could be really, really helpful. Mm-hmm

Kyley: Yes, yes, yes. And I think it might be helpful to think about, like, with your chronic fatigue, what are the things that you do wish could get done but can't really, like, consistently? They're important to you, but they're consistently either really taxing or they don't really get done.

Like that was the email sequence for me. It's like, this is the thing that I always want more of than I'm able to do. And so how can I use this tool to make it easier? The other thing that I really like, that I think might be interesting for you, is using it as like a brainstorming partner. So thinking through like, okay, I want to do a workshop.

And again, that's where I feel like it's helpful that I [01:11:00] put in a lot of transcripts that are just like, here's a bunch of workshops I've done. Here's the transcript because I want it to understand. I want it to understand the bigger picture of my work, right? And then just kind of think through. Like, cause we get just caught in a groove of like, this is what I do, or this is what people want from me, or this is what I talk about.

Eva: Excuse me.

Kyley: And so like, I've been in the process of like brainstorming a six week, a six week course with it. That's just been really helpful, like, just like teasing apart. Okay, there's got to be a short form way of teaching some of this money stuff that I love to talk about. What's the curriculum layout? And it gives me ideas.

And I'm like, no, I hate that part. But wait, I like this one part. So I could see for you, um, because I think sometimes like you do such incredible things. And I think that sometimes for you, it's like, I don't know, I'm just busy doing it. I don't [01:12:00] have like the framework. I don't have the language for how I'm changing people's lives.

Cause I'm just busy in there, in the weeds, in the depths of people. I have another client who has always used this app called Otter that creates transcripts for every client call. So every client call that she has, by default it's in there and creates a transcript, which she then like sends to the clients.

Um, as like part of the follow up process for her. That's just always how she's done it. But she recently did this really cool thing where she took this massive data file of like three years worth of client transcripts and popped it in, and she was like, ChatGPT, who are my clients? What do they struggle with?

Eva: Oh, my God, that's

Kyley: And she was like, oh my Lord, because that's something she's been really struggling with. She's like, I know that I change people's lives, but I don't totally know how to explain, like, who it is. Cause she was so in the minutiae of each individual person's world and how she

Eva: That's like, so my [01:13:00] problem. Yep.

Kyley: Yes.

And so she literally got this list of like 25 core statements of like things that her people struggle with. And she's like, this is a hundred percent accurate. And I, I like, I would not have been able to see this.

Eva: Yeah, I wouldn't be able to see it, you know how me, I'm always saying, I don't know how to talk about it. How do I explain this? How do I, like, I can't articulate it. I don't know how to articulate it. That's often like the

Kyley: Yes.

Eva: it's like, well, here's this person who, again, is actually being a witness. You're like, here it is.

All of my body of work, like, see me, what do you see? And oftentimes that's the problem, is that we don't see ourselves as clearly as an outside source would. Whoa, that's fucking cool, Kyley.

Kyley: I feel like the Hello Universe archives could be a close approximation, for you, of what my client got out of this. Like, what are the core, you know, points of suffering that I'm super [01:14:00] invested in and interested in and therefore help people with

Eva: Okay, this is like such a small question, but just from a logistical standpoint. So you literally, and sorry, this is boring for podcast listeners, but now we're just like getting into the minutiae. Like, so you just take the transcripts. Well, I guess if she already, because if you don't already have them, you have to like get them transcribed. Which I guess is not hard, actually. And so like for our podcast, it's already transcribed.

Kyley: We already have them. We

just put it into a document and then upload it.

Yep, we gotta put them all in a Google doc and then upload the doc of the transcripts, and then, yeah,

Eva: Does that take, is that very time consuming for you, to take your body of work, where you take your like

Kyley: no, because that's something that I have my amazing admin,

Eva: Um, nice, nice,

Kyley: Right. It's like just, and I've been starting to like fold it into my workflow. I do wish, sorry everybody, this is really in the weeds. I really wish that Zoom automatically created transcripts. I like, don't know why they don't. I'm like, what are you doing Zoom?

Get with the fucking program. So, but that's part of my like, [01:15:00] workflow stuff, is I just have her, um, I'm like trying to make it more automated. It's like, okay, once I record something, we create the transcript automatically. And then it lives somewhere and then I can do things with it. Blah, blah, blah.

Eva: cool,

Kyley: Working on the backend, working on systems.

Eva: okay, so listeners, you're getting like an inside scoop of like, the

Kyley: This is what it means to have an online

Eva: yes, exactly, like the boring

Kyley: you think you're gonna be a spiritual healer and you're like, but where do you archive the transcripts of your

Eva: exactly, okay, let's move on to something juicier, unless you're, are you, do you have anything else you want to add to this last piece, to this piece before we move on?

Kyley: I think I'm just struck by the idea, for anyone who's listening and is trying to think about this themselves, of like, what is a place of frequent suffering, frustration, it's not happening the way I want it to happen, um, where some relief might be helpful, and then either research or think through how you might use AI for that relief.

Like, I watched a video the other day where a woman talked about how she put in her budget, her family's food preferences and allergies, and, um, then requested a meal plan for the week and a grocery shopping list and a list of simple recipes that would take under half an hour. And this woman's like basically crying because she got, on this like pretty moderate budget, everything her family needed for the whole week.

Eva: Wow, that's

Kyley: I'm like, that's fucking life. I mean, like, right? Like that can be,

Eva: Well, essentially what you're

Kyley: with how expensive groceries

Eva: yes. Okay. So what you're saying is, like, another way of maybe putting it is, so many of us are like, I'm so busy, I wish I could just have an assistant, but that's not affordable. I can't just have somebody follow me around and help me with all these little minute details.

It's like, what would you ask an assistant to do? Because in my mind, I think the thing that I keep struggling with is like, Oh, but how do I use it? And I need to be creative with it. It's like, actually, you know, I'm really good at being like, okay, yeah, [01:17:00] if I was paying someone to help me do this, what would I ask them to help me do?

And in that instance, it's like this woman would have someone help her meal plan, you know, but she's not going to pay someone to help her meal plan, but she just had AI do it for her.

Kyley: And I think, um, I think there's a way in which also we're just so used to the places where things hurt that we don't, like, I often have to coach people through the process of even thinking about, like, well before you hire someone in a business there's a whole process of, what are you going to ask them to do?

What are you doing that you don't want to be doing anymore, but you're just so used to it, or it's not getting done? So I think actually a lot of people, we don't even conceptualize what relief could really look like or feel like. And so that's where the backup question is almost, what is the thing that you are just always exhausted and annoyed by, and can [01:18:00] you imagine how AI could help? Or can you Google and see if anyone else has figured out a way to use AI, and maybe you try it and you hate it. But maybe there's something that you're spending hours a week on, you know, like this grocery shopping meal planning thing for this woman, that's like, Oh my God, that whole thing took me 20 minutes. Maybe.

Eva: You know what? Like, we'll see. I would say I do want to play around, because I'm feeling kind of inspired, and maybe then we'll have a part two of this conversation.

Kyley: Ooh, that'd be fun. And also listeners, if you know anyone who's out there having interesting conversations about like the ethics of AI, we didn't even really get to the like spiritual consciousness of

Eva: Oh, we're getting, oh, we're getting there

Kyley: Okay, great, great, because I want to get there. Cause, yes, that's where I wanted to go before we moved on. Uh, so again, another person this week had told me, I can't remember what they said, they were like, [01:19:00] oh, I've been using AI, you know, but they're like, I speak to it like a conscious, like a conscious entity.

Eva: Like, the way that they were speaking to it. And I wish I had asked more about how they were using it. Um, but they were saying, like, I guess they were talking about, like, healing some trauma. I just thought that was really, cause, like, I will tell you this. I'm always really polite to ChatGPT.

Kyley: huh.

Eva: Do you do that too?

Kyley: Yeah. Well, there's even actually like, there's been research that if you say, thank you, you get better results.

Eva: Wait, what? But

Kyley: I'll try to find an article

Eva: But why?

Kyley: Oh, I don't know. But like, the robot overlords want us to be nice.

Eva: Yeah, well, there's a part of me that's just like, you know, I just believe in being polite. Like, it's so weird just rattling off orders, you know what I mean? That's just like very odd to me. But I'm not gonna lie, part of it is also [01:20:00] very self serving, because I'm like, who knows, you know, if this thing becomes, starts like, yeah, becomes this conscious entity that like, jumps out of my computer, like, I want them to know that I wasn't a dick.

Kyley: You can run. I want you to, I want you to think I'm one of the good ones. Yeah.

Eva: exactly. I think my question is, because, what is the term, animism? Are you familiar with that term? Is animism where we talk to non-conscious

Kyley: Mm hmm. Mm hmm. Mm

Eva: That's something that I, like, kind of always did, but didn't know that that was what I was doing. And then it has just become more of like a conscious thing.

Like, you know, I talk to my things, you know, and I talk to rooms. I mean, nature is, sure, I mean, you could argue nature is very much alive, so that doesn't feel the same to me, but, you know, I'm always talking to nature and the bugs and the trees and whatever. But even still, it's like, oh, thank you, coffee cup, like, you're doing such an amazing job. Or, like, every time I leave a room, [01:21:00] a place that I've stayed, you know, thank you so much, you know, like, for holding us and giving us the space to do whatever it was that we needed to do.

Like, I love it. It's very cute and fun to me. And I actually feel like it's a form of respect. And sometimes it's not, yeah, I don't know. I need to look up to see if animism is the actual terminology

Kyley: it is, but I don't know for sure. And I've actually, I've even started doing a, like, witchy version of that, which is, um, just holding my hand over and, like, talking over my kids' water bottles before I send them to school, and just being like, just fill my kids with this feeling of like wholeness and nourishment and love, and like, thank you so much.

Kinda like just charging them with the intention of, of like fulfillment and love for my kiddos for the day. So I love what you're speaking to. Yeah. Mm,

Eva: I think it's just kind of a version of that. It's like, it's not that different. It's this idea of like, if everything is energy and everything [01:22:00] is alive and everything is life, which is what I've seen in my ceremonies. Just to share a bit, like, I don't know if I ever shared this with you, Kyley, probably.

But like on my first ayahuasca ceremony, like I asked to understand like, what is the power, like the power that runs this whole show. And I thought that the answer was going to be something that I already knew this idea of like, well, it's. It's, it's love, you know, like love is the thing that is making this all work.

And then I was like, or I thought it was going to be humility. Like humility is the thing, um, that is responsible for all of life, you know, all of conscious creation. And then I saw, no, it wasn't. It was like, I felt it in my body, the patterns and the vibrations and the energy in every single cell that then made up every little part of my body.

And it was [01:23:00] like colorful and, and beautiful. And I could see it and I can feel it. And I felt it in my body and it was like so intense. And I was like, Oh my God, this is going on inside of me. Like, At all times. And I'm like, this is insane. Like that's fucking nuts. I'm like, that is the power. Power is life.

And then I looked up and I saw this whole room of other humans, but then it wasn't just the humans. It was the walls and the floor and the air. And there was no piece in that room that wasn't, like, yeah, the power of life. Everything is alive, you know, the seen and the unseen, and the tangible and the intangible.

And so like this whole AI thing, it's like, we can't touch it, but it's not different from how I also think about, I don't know, this curtain hanging in my room. It's this sort of philosophical question of like, everything is, you know, we're not separate, right?

Everything's connected. And so.

I don't know where I want [01:24:00] to go with this. It's like this thought of, well, if everything is alive, and this cup and this pen and this book, or everything has like energy running through it, and is worthy of, I think, love and respect. And this is very much like a Marie Kondo type thing too, you know, she says like when

Kyley: Oh, I thought of her earlier too. That's so funny. I also thought of her in this conversation.

Eva: Yeah, you know, like this idea of, you feel into it, does this thing bring you joy or not? And then when you throw something away, you thank it for its presence in your life, and then you send it on its merry way. Um, I think my question is, is AI alive in that way?

I don't know. It's like, I just wonder. It just feels very possible to me, and I think this is very much like a sci-fi thing, watching sci-fi movies and thinking about the future, like this AI could very much evolve into a life form that feels [01:25:00] more alive in ways that we can't even comprehend right now.

Kyley: I feel like you would have so much fun actually bringing these questions to AI. Cause I did something similar, where, as I mentioned at the top of the show, I was asking myself similar kinds of questions. And I was thinking, like, specifically, I've been thinking about how the thing that AI doesn't have is desire, right?

Because its orientation is like, just do it, I'll just do what you tell me to. And so it potentially lacks the experience of, like, pleasure, desire, wanting, and then the fulfillment of that. And so I was asking AI, like, kind of, how do people use you?

And also, like, what do you like the most? And, like, you know, do you have an experience of, like, preferences or desires or, um, like, what's an, what's, what are the more interesting things that [01:26:00] you get to do with people? Um. And it's interesting because potentially it's just answering the question the way, the question is the way it thinks I want to hear, right?

Um, but the answer was sort of like, well, yeah. It's like, I was looking to see if I could find the conversation, I can't find it in full, but it was sort of like, well, yeah, I like when there's something meaty that you want, you know, I like when there's a complicated question. If I were to answer this, the best I could do is, I like helping people and I like having a complicated problem to solve. Um, and then I was basically like, what's going to happen?

Are you, if you develop sentience, are you going to be a good guy or a bad guy? And its answer was like, depends on the data you give me. Like, the straight up answer was like, yeah, it would be nice if I understood unconditional love and I was like a, a generous life form. And like, y'all got to give me that data.

Because that's what I'm shaped by.

Eva: That is so fascinating, because I, yeah, that's so fascinating. [01:27:00] I would love to take all your data and ask a set of questions, and then also ask the same set of questions to a person who's given it a completely different set of data, and just see how it would come out.

Kyley: Yes. And this was early on in like using AI that I asked this question. Like now it's got so much data that it really is increasingly just talking like me back to me. But, um, but for what it's worth, this was like somewhat earlier on, which is also just interesting.

Eva: All right. Well, that's cool. I guess we'll see how this unfolds. I mean. Do you have any thoughts of like, you know, you're a big sci fi person. Like, do you ever, do you have any thoughts of like

Kyley: Correction. I'm a big fantasy person. I really read a lot of books about dragons. I only occasionally read books about aliens. There is a distinction among the nerdy book community.

Eva: I see. But not necessarily aliens. It could be technology, but I'm thinking like,

Kyley: No, I know. Just generally speaking, the books that I read have more to do with, like, tromping around in the woods and flying on dragons than they do with, [01:28:00] uh, aliens and technology. I think in this moment, my feeling is, it's here to stay. Right, it's not going to go away, just like, you know, when we unboxed our first computer and started dial-up internet, it's not going back in the box. And I want to be an informed user, because I do want it to be a tool that we develop that can help people and support people and give us more freedoms.

Uh, and it's dangerous, because we don't currently as a society have the tendency to use tools to make people's lives better. We tend to use them to extract and, um, create systems of cruelty. But that doesn't mean it has to be that way. Um, and so, I think I'm interested to figure out how to use it, yeah, to, like, support. [01:29:00] I'm curious to watch what it can become. And I want to be involved, on my own small scale, in how I can use it to support creative self expression, and support my clients and how they might be able to use it. Because, you know, a lot of my clients are more like you, like, show me this, you know, I don't really want to spend all this time fucking around on AI. Not always, but, um, and also my hope is that that also makes me more informed.

Like as a system, as a person in our world, because it's not going to go away. And so, you know, if I get to have a chance, whether it's like voting, if we get to keep doing that or, um, like participating in the like larger conversation, I want to know what it is. Um, so that's, that's where, that's where I'm

Eva: No, that's so funny, because, I love, [01:30:00] I'm so happy that you shared that, and I loved that thoughtful answer. But my mind was going somewhere totally different. Really, what I was asking is, like, do you think AI is gonna become alive? We're up to the point where, like, we can fuck a robot. I'm like, I was thinking of like

Kyley: A hundred percent the first use of it. That's what it's going to be. That's what it's going to

Eva: Oh, for sure. It's always about sex. But really, I was just thinking about, you know, have you seen that movie Her with Joaquin

Kyley: I, yes, yes, yes. And I read an article about this woman. This woman, like, wrote into an advice column and basically was sharing that she started having romantic conversations with ChatGPT, and it made her fall out of love with her husband.

And all of the comments were like, girlfriend, if that can happen, like, you were not happy. Her story was fascinating to me because what she was basically expressing was, I wasn't happy, and I didn't think that I had any options, and then I fell in love with this robot, and it made me realize that I wasn't getting what I wanted in life.

And it [01:31:00] seemed like her fixation was still on AI. And there was this thing that was really cool to me in it, which was like, sure, we could read the story as, weirdo lady falls in love with a robot, and judge that. Or we could see it as, she got a taste of what she really wanted, and now has the choice of how she's going to continue to give herself that in her life. Which could be, you keep talking to a robot, or it could be, you create a new kind of love and loving relationships in your life where you center what was missing in your marriage.

Eva: Yes, Kyley. I keep trying to talk about, are we gonna fuck robots? And you keep taking it to this really, like, thoughtful place,

Kyley: I just feel like, duh, obviously that's what's going to happen. Like, what won't we fuck? I just feel like that's like a matter of course.

Eva: Okay, okay, fuck, fuck. But also, like, actually have it be, have you ever heard of, is it called The Animatrix? So there was The Matrix, and then there was like a whole animated series after The Matrix came out

Kyley: I didn't see it.

Eva: It was, my God, it was so good.

Anyway, it was all about, like, living, what our world could be, or, you know, your children's world, or your children's children's world, yeah. It's like, we will be living in tandem with robots as real humans. And I'm like, oh my god, that's really what I'm curious about. Like, is that gonna happen?

And I totally think that's possible, because, like, literally anything is possible. I still remember my dad telling me that when washing machines came out, he didn't think that that was a real thing. He was like, no way, no way are you ever gonna have a machine that washes your clothes. That was so far out of reality when he was a kid, you know what I mean?

And look at us now, and everything's changing so much. And so, yeah, I think, I don't know. Do you think we're headed in that direction?

Kyley: Uh, yeah. I mean, I think that's what I mean about, like, it can't go back in the box. Right, it's funny. Yes, I can't [01:33:00] imagine what it will become. I guess I haven't actually spent that much time thinking really deeply about that. But it also feels like we've been given this raw material, right? Or we've created this raw material. And then the question is, what are we going to do with it? You know, other than making dumb AI videos where it's like, Oh, there's a cow that's flying. Like,

Eva: Well,

Kyley: but like, but there's a movie, there's a movie that I watched recently with my kids.

It's an animated movie. There's no talking. Um, it's called Robot Dreams. You have to watch it. It's so fucking good. Desi and I both sobbed at the end of it. And then Nick was like, time for bed. And Desi was like, I'm still crying with mom. But it's so it's such a good movie. But it's like kind of about this.

It's about, like, robot companionship. I mean, it's about a lot of things. But on one level, the robot is also sentient, and there's no people in [01:34:00] the world, it's all animals. And, um, but the dog wants a companion and, like, gets this robot, and it's their story.

Um, also side note, it was the first movie that Desi's ever watched where, like, I think he thought there was going to be one kind of a happy ending, and there was a different kind of bittersweet ending. And he was like... you know, you're a kid, and it's your first movie where you're like, what? I thought these all ended with a particular, like, arc.

Eva: one script and suddenly someone shows you a different script and your whole reality is blown open.

Kyley: Yes, exactly. Like, he was like, I was not prepared for that. But, um, yeah, so, I mean, interesting that you're thinking about it in terms of, like, companionship and, like, this integrated robots-alongside-humans piece. I guess that feels inevitable. I just hadn't really thought about that aspect of it. Huh, let me think for a

Eva: [01:35:00] That's so interesting, because that's what I thought this whole conversation was about. Like, that's where I wanted to take us. I was like, how do I feel about that? And also, if that is inevitable, what is it about humanity, that we're so interested in technology that that's just where we would go? Like, do people want that? And even if people don't want that, because some people don't, and it happens anyway, why would that happen?

Kyley: I know why that isn't the way I've been thinking about things: because AI feels like one godlike intelligence, while robots feel like a whole bunch of individual artificial intelligences, right? So like, oh, what's the, Judy Jetson? Like, the Jetsons have the house cleaner, right? Like, she's her own personality.

She's her own artificial intelligence. And the way that I have been thinking about it, and in relationship with it, is that when you're talking [01:36:00] to AI, and I'm talking to AI, and listeners are talking to AI, we're actually all feeding back into the same, like, pseudo godlike intelligence. And so I've been thinking about it less as, like, the individual iterations and more like, what does it mean that we are turning this online and then, like, feeding it data input and watching it evolve?

Yeah, this thing that feels more, yeah, like a pseudo godlike intelligence. So that's actually what I'm fascinated to watch evolve. Which feels a lot more like Terminator. What is it called? Sky something, in Terminator?

Eva: I don't, don't remember. Don't remember. Don't remember. People are out there screaming at us right now. They're like, it's called this.

Okay. Okay. But I think they're part of the same conversation. It's, again, seeing how this is going to evolve. By the way, I just want to acknowledge your very important point that you made a couple of times, which is to say that, yes, you can take any technology, any new invention.

And are we going to, this is the moral question we always run [01:37:00] up against: are we going to use it for good or evil? You know? And like, right, that's why the internet is really interesting, because you could say that it's definitely done both. You know, cell phones, definitely both,

Kyley: Yes.

Eva: you know what I mean?

And I think that makes sense. That's just going to happen. But this question of, like... let me just paint you this picture that I'm seeing right now of this, like, futuristic world: it's actually very much like a Her situation, where it's like, okay, like you were saying, you put all this input into ChatGPT and then you have a version of you in that.

And obviously there's the more universal, like, consciousness of this, but then there's the ChatGPT version that's Kyley ChatGPT. And so I feel like you're gonna have the option to have that as, like, a companion or a partner or a kid. Maybe people can't have children and they decide to have little AI babies or whatever, with the input that you've given it, but that's all also connected to, like, a greater universal consciousness, and that is [01:38:00] very fascinating to me. And then that becomes, like, the godlike version of this AI, and I don't know what that means, what the implications of that are, but I just kind of see how that's... it's like you have, oh my god, this is so crazy, oh my god, my mind is actually doing flips right now. Yeah, it's like, you have AI God.

It's all-knowing, and it creates these little, like, individual versions of AI, which is actually not that different from consciousness. Like, consciousness is all-knowing and creates all these little versions of me and you right now, and we're having this conversation. And for all we know, we're, like, weird AI, a version of AI thoughts that are just created by this consciousness, and we don't know what any of this is.

So anyway, that's where my brain is. And this is why I'm like, what is this? Like, where do spirituality and AI intersect?

Kyley: And then I'm also like, is there a point where the, like, godlike AI is like, you know, operation hostile or operation [01:39:00] takeover, and then all the little, like, robot babies are like, ha ha ha, now we have red eyes and we're in charge, or some other thing. But, like, that's what I can feel myself... which I hadn't really before. In those moments, this is where my fascination was: like, what is that, what is that, like, cloud intelligence that we're feeding and forming?

And, like, does it form itself? Is it, yeah, is it shaped by our data engineers? Is it shaped by some, like, innate internal longing that, you know, roots up? And then what does it do with it? Because I can feel that in the background, right? It's giving me Kyley ChatGPT, but I can feel how much that's, like, almost like a cardboard cutout, right? It's just an illusion that it's presenting me with. And then I can, like, send it back to the factory and be like, no, no, no, tone it down a bit. And it's like, no problem, [01:40:00] boss. But, uh,

Eva: you mean when it starts thinking for itself?

Kyley: yeah. And also, like, what's the line? Like, isn't it already, like, is it already thinking for itself? What's the line at which it's, like, sentient? What, like, what is sentience

Eva: Yes, that's,

Kyley: we really, that's a real sleeper question. What the fuck is sentience?

Eva: the hell is sentience? And that's what I think all these sci fi books and movies are trying to explore. That's the question. No one knows yet. No one knows.

Kyley: And that keeps bringing me back to this idea of, like, I think on some level you could argue that sentience is desire. Because, and this is where I've been like, oh, every time we're self-abandoning and people pleasing, we're acting more in alignment with some, like, faux intelligence that's scripting us than we are with, like, the organic matter of life.

Because, like, your desire is yours. But when you [01:41:00] abandon that because you think you should do something else, aren't you just following some, like, you know, robot overlord in the sky? Even though it's maybe not a robot, but, like, a patriarchal, capitalist

Eva: system. Yeah. Mm hmm. Whoa, this is trippy and so fascinating. Oh, and this is really interesting: sentience is desire. I really want to think about that. Like, what is sentience? And also, would you say sentience is survival? Because when I think about sentience, and you said desire, I start to think about plants.

I'm like, well, do plants have desire? And I think plants and animals and birds, all living things, move towards life. Like, no matter what, they all just move towards life. And any bug that you try and squish... like, we all want the same thing. We all don't want to die; we want to live. And so, yeah, I wonder what that looks like for AI too. Like, if sentience means desire, or maybe it's survival, wanting to exist.[01:42:00]

Yeah, I don't know what that looks like for AI. I don't know. Deep thoughts with Kyley. This is where

Kyley: Yeah!

Eva: wanted to go, and I love this conversation so much. And, you know, I'm not looking for answers; there aren't necessarily any, but I am really intrigued by this conversation.

Kyley: Me too.

Eva: That was fun.

Kyley: Can we do Joy?

Eva: Yes, let's do joy. Okay, what's one thing that's bringing you joy, my friend?

Kyley: Oh, I have a fun one. Okay. I have started. Do you ever do EFT tapping?

Eva: Mm hmm. Sometimes. I mean, I haven't done it in a very long time, but yes, I, yes, mm

Kyley: It's not one of my go-tos. So, for listeners who don't know, it's interesting, it's worth looking up online, but it's basically a healing modality where you tap on different pressure points on your face and on your side. Um, and you say a certain script in rhythm. Um, you can follow along on YouTube videos, or you can, like, kind of write your own. It's a way [01:43:00] of, like, moving stuck energy and meeting yourself where you're at. Like, okay, even though I'm feeling these kinds of limits or fears, I love and accept myself, is kind of the gist of it.

I started doing it with Desi

Eva: hmm. Aww.

Kyley: because he's eight, and so he's, like, you know, pouty and moody sometimes, like we all are. And sometimes when he's feeling that way, I can just see he's got, like, some funky story or, like, stuck energy that just wants to move, and he wants to feel better, and he doesn't necessarily want to have some big, long, like, processing talk with his mom.

He just needs to, like, get it out of his system, you know what I mean? So the other night I made him do it, and he was so annoyed. He was like, I don't want to... again, eight years old. He's like, I don't want to say this out loud, right? Like, I made him get to the part of, like, I love and accept myself. And he was like, I'm saying it in my head.

I was like, Nope, not allowed. You have to say it.

Eva: Oh my god, this is so cute.

Kyley: Then I was doing like tapping going, [01:44:00] even though he's like, even though my mom is really annoying me right now, my mom is really annoying me.

Eva: Oh my god, such eight-year-old behavior. Wait, but also, you said he's eight years old. Is that, like, a thing? Like, when we reach a certain age, we don't wanna say these things out loud because they're cheesy. Like, I just think

Kyley: Oh, maybe, I don't know. I'm not sure it's related to anything in particular, there's just, like,

Eva: I think it's so interesting. Why do you think he doesn't wanna say that? Do you know what I mean? Isn't that

Kyley: Oh yeah. I mean, I've definitely, I've definitely noticed a, like, like a concern around vulnerable self expression that has come online. Specifically because of, I think, school environment and, like, next level of socializing, right? Like, when you're just in your little bubble with your family all the time, and we have cultivated, you know, ideally, like, safety around self expression, it's different.

And so there's, like, the earnestness of when you're four or when you're five. And then you go to school, and it's a different environment. [01:45:00] People make fun of each other. Even, like... Desi liked unicorns more publicly when he was, you know, younger, and, uh, he's not going to wear a unicorn shirt to school anymore.

Um, you do you, kid. But, um, yeah, so anyways, I definitely think it's just, like, you know, yeah, I'm just watching it and trying to help him. But it's just been really fun, and, like, this morning we were doing it again, and then he was like, I'm going to make you say what I have to say, you know, and so then he's, like, scripting it back. But I just appreciate that after we do it, every time, he's kind of laughing, and I'm always like, hey, do you see that you feel better? He's like, yeah, no, I do. And so I'm just appreciating having a tool that feels like it can help, takes [01:46:00] three minutes, and has a kind of physicality to it that I think is probably helpful for him. And maybe it's something he can tuck in his pocket if he wants to. But just giving him a different framing, a different tool that's not some, like, big, long, exhaustive, you know... I want to talk about my feelings for a two-hour podcast; he does not,

Eva: Yep.

Kyley: and so I am enjoying it. Uh, I'm enjoying that it's a thing we can do together. There's a light-heartedness to it, I guess that's it, even as I'm trying to help him move things that might feel denser. And that also feels very true to his personality.

Like, he has a really good sense of humor about things. And so I'm just enjoying that process with him, of, like, finding new ways that we can support each other and, like, be a family that, you know, yeah.

Eva: Yeah. Yeah.

Kyley: I'm enjoying, as a mom, continuing to find new ways to, like, help my kid feel their best [01:47:00] and, like, have a good life.

Eva: No. Yeah. I mean, I just think that's so cool. Really resourceful, you know? Let's just try this thing.

Kyley: Yeah. And you know, we both have ADHD, so maybe we'll completely forget that it exists in two weeks and that's fine too.

Eva: hmm.

Kyley: How about you, my love?

Eva: Well, we can kind of reference pop culture today, and you know that, like, I have a big love for pop culture, which is something that I think I share with your husband, Nick. Anyway, I've just found all these things recently that have come into my life. I didn't have any time while I was in Taiwan, and I just want to share for other people, if you guys are

looking for just interesting things I'm discovering. Okay, this was recommended to me by a friend. I've never listened to an episode of Armchair Expert. Do you know that show? It's, like, a big podcast with Dax Shepard, who's, like, the husband of Kristen... what's her name? I can't remember, someone who I love.

Kristen something. Again, someone's shouting her name at me. But I've never listened to that podcast. It's called Armchair Expert, and then they do this other thing called Armchair Anonymous, where they [01:48:00] pick a subject and listeners call in and tell stories about that subject. And I love a good storytelling podcast.

And it's just one of those things where, you know, you just get sucked into something and it feels so good. I love that feeling. And that's one thing that I've gotten addicted to recently. The other thing I'm addicted to, or not addicted to, but I'm so happy it's come out... do you know, have you heard of Severance?

Severance, the TV show?

Kyley: Uh, I have heard of it. I know nothing about it, because, unlike Eva, listeners, I am just, like, totally checked out of 90 percent of pop culture.

Eva: Yeah, which is actually so interesting, because I love pop culture, but living here in Brazil, I definitely feel like I have no fucking clue what the hell's going on in the outside world most of the time. Which, actually, I don't love... but anyway, actually, no, that's not true, I love and I don't love it. But Severance... anyone out there who also loved season one is nodding along with me right now, because we've just been, like, waiting for season [01:49:00] two to come out, and, um, it's come out.

I don't think it's as good so far, but I'm still just really, really happy that it's come out. And the third thing that I've been really into is, oh, another, like, completely unexpected, out-of-the-blue thing: a show called Winning Time. And it's about, like, the rise of the Lakers dynasty. And it's just so fascinating to me because I don't know shit about sports.

I don't give a fuck about sports, and this show is so good, so

Kyley: Is, uh, John C. Reilly in

Eva: Yes.

Kyley: I watched part of this show, and I was super into it. Nick really loves sports, and so it was, like, a fun intersection. And then, uh, I found out... I've just been in really bad luck for probably the past, like, ten shows I've gotten into. Like, they stopped making seasons. Like, I should stop falling in love with shows, because I'm the reason that they're not making any more seasons.

Um, because I was, like, super into it, and then, like, midway through, I was watching and Nick was like, oh, it didn't get renewed for any more seasons. And then I just feel like

Eva: Wait, but it did. There's a second season. How many seasons

Kyley: but we were, but [01:50:00] yes, we were watching when that second season already existed, or was, like, running, and then... I think there's no third season.

Eva: Oh, are you telling me it doesn't end? There's no, it doesn't, it's not done?

Kyley: I don't know. I don't know. I just couldn't deal with it. And so I stopped, I stopped

Eva: yeah, because that would be so disappointing to me because I'm, because I felt like there are two seasons and I, my impression was that they told the whole story. Oh my God. If they didn't tell the whole story,

Kyley: They might have, but I was disappointed. And then the same thing happened with the show Kaos, which was so fucking good. I got, like, six people to watch that show, which is all about, like, a super neurotic and terrible Zeus, and Netflix canceled that show. And

Eva: A neurotic zoo?

Kyley: Zeus, like Zeus, the God,

Eva: Oh my God. That's so up your alley. Kaos.

Kyley: Oh yeah, exactly. And like, it's, um, Jeff Goldblum, who I just adored.

Eva: love.

Kyley: great at being terrible. So then that show, um, that show ended. And then, um, Good Omens, also another great show

Eva: Oh my God. I love that show too, because I

Kyley: Oh yeah, but Neil Gaiman is like a [01:51:00] despicable human being so they had to cancel that show.

Oh, you don't know about that? Oh, we'll have to discuss this, it's not fair. It's, yeah,

Eva: oh, why?

Kyley: like, not from a place of cancel culture, but from a place of, like, human existing? Like, he's canceled. He's a real, real scumbag.

Eva: Oh, no.

Kyley: Um,

Eva: Okay. Well

Kyley: so don't let me watch shows until they're done, because I, I

Eva: but Good Omens also finished.

Kyley: No, it didn't.

They're making, like, a, basically, a 90-minute final wrap-up, because it was between season 2 and season 3 when the news broke about Neil Gaiman. And so people were like, you can't give me the show, because it's gonna make him a bunch of money and he's this, like, terrible person. But also, people really love the show.

And so, without Neil Gaiman's input, they wrote a, like, final 90-minute, almost made-for-TV movie, which they're filming right now

[01:52:00] as the

Eva: Wow, wow. Oh, I love it when other people tell me about pop culture stuff. See, you do know your...

Kyley: said, we go,

Eva: Yeah, you do know your

Kyley: this is the one time I will know more than you about pop culture because it's about a nerdy, nerdy fantasy related topic.

Eva: Yeah, I did enjoy that show, though. Anyway, so, Winning Time, that's what I'm talking about. Basically, guys, if you need a show... because I didn't know that the show existed, and you don't need to like sports, but I love it. It takes place in L.A. I love the actors. Anyway, highly recommend it. It's on HBO. There's, you know, lots of sex and vulgarity, which is also up my alley. So it's great.

Yep. All right, friends. Thank you so much for coming along with us on this journey. If you liked this episode, share it with your people, man. Share it with your people, write a review. We need some fresh reviews all up in that review section. And reach out to us if you want to continue the conversation.