A mess of AI mania, funding feminist tech and Silicon Valley scams with Jac sm Kee

Show notes

In this messy episode Tiffany Kagure Mugo sits down with Jac sm Kee to chat about feminist tech and Silicon Valley Syndrome, the scam that is an 'equal internet' and why conversations about AI need to be about more than shouting 'my toaster is getting smart with me.' It's all about building the internet that we all want to be on, but it's going to take a little while to get there; we just need to give the feminist girlies their coins.

Produced by yours truly, Tiff Mugo, and created in collaboration with the Global Unit for Feminism and Gender Democracy of the Heinrich Böll Foundation. Mixing and mastering by Rachel Wamoto and Sheldon Mutei.

Follow Tiffany on Instagram: @kagsmugo and HOLAAfrica: @holaafrica_org

Show transcript

00:00:00: You know that the feminist, the African feminist mafia, has activated.

00:00:04: I need to respond now.

00:00:08: I thought that I'd be eating a lot of good food and, like, you know, just be satisfying immediate desires, right?

00:00:17: Like eating good food, hanging out with, like, a couple of good friends, listening to and playing music.

00:00:22: That's what I thought I'd be doing.

00:00:24: But I guess instead what I'm doing is anticipating the end times and really taking this as a possibility for crafting with others on what can come out of the rubble.

00:00:41: What the actual... Hey, hey, hey, I am Tiffany Kagure Mugo, your favourite curator of chaos and local polymath, if I'm feeling confident.

00:00:57: and welcome to another edition of What Is This Hot Mess, a podcast that chills with fire feminists from across the globe to have conversations about the fact that the world is hot trash right now.

00:01:11: We chat about the ways we are clearing up the clutter by telling the world to pick up its fascism off the floor and put away its hate, violence, and general nonsense because nobody wants to see all that.

00:01:23: In this episode, we have Jac, who is a fave.

00:01:26: Let's start there.

00:01:28: When you track down Jac and see what they're all about, you see a whole bunch of things, but one of my favorite descriptions is a feminist activist and pioneering researcher, straight out of Malaysia.

00:01:40: Jac works in the wild vortex that is internet technologies, social justice and collective power, whilst moving swiftly into the realms of sexuality and gender justice.

00:01:52: Jac and I go way back and they are one of the folks who are making the sort of internet I want to exist in because they are the badass co-founder of the Numun Fund and Take

00:02:12: Back the Tech.

00:02:12: We all have those

00:02:13: people who, if there was a Grammy for feminism or an Oscars for activism, would make the 'I would like to thank' list during speeches.

00:02:23: When I was a baby feminist just trying to figure things out, I had some incredible people who put me on the right path.

00:02:30: And y'all know who you are.

00:02:32: Shout out to my loves.

00:02:35: So this is actually technically a PSA to thank your everyday feminist guardians.

00:02:41: Don't be nonsense.

00:02:43: No one is an island and you did not make it alone.

00:02:46: Call your feminist elders unless they're millennials who hate phone calls.

00:02:51: And in that case, text them.

00:02:53: Just shoot them a WhatsApp because no one needs that sort of harassment.

00:02:57: Okay.

00:02:59: When I asked Jac about how it all began, she told me all about the official and unofficial feminist guides in her life who formed part of her feminist origin story.

00:03:12: Ooh, that is a big question.

00:03:16: How did it all begin?

00:03:21: It's really funny. I think the feminist origin story always comes from meeting a badass feminist who then helps you to kind of... for me anyway,

00:03:34: meeting a badass feminist who helps me to unpack the world and my own lived experience from the lens of power, and understanding how, like, you know, systems and structures of power have really contributed towards how I have inhabited and experienced the world and my life, including discrimination, inequality and all of the things that have made me an activist my whole life.

00:04:01: So I think that's how it began, and I guess it has many different beginning points for different chapters of this, and that includes meeting people who don't call themselves feminists but have an eye to this, like my teacher when I was in primary school, who was really paying attention to who had space to speak and who didn't in class,

00:04:25: to my first job at Women's Aid Organisation and my very intimidating and badass boss, Ivy Josiah, and my very, very cool colleagues like Rosanna Issa and Shubham Wachila.

00:04:37: So different points. And then feminist tech activism: I remember very clearly being at this event where I thought I was going to learn how to build a website, but had my entire kind of political foundation shaken to understand how technology was a political space, and of course it's also extremely gendered,

00:04:57: and meeting these cool feminist tech activists from the Philippines, like Pai Villanueva and Chikai, whom I'm still really good friends with today. And yeah, I can give you a host of those along the way, up until today. Always a journey.

00:05:12: So

00:05:12: you basically had a feminist village who raised you.

00:05:19: I don't know about a village, but a lucky path.

00:05:23: Yeah, a path of many lucky encounters with brilliant people.

00:05:29: Part of why Jac took the path they took is that she realized tech is political, having stumbled onto feminist tech activism.

00:05:37: The idea that tech is neutral is a scam.

00:05:40: We all joined the digital revolution thinking we shall be the star of our own show, throwing pictures of smoothies and sandwiches and sunsets at family members and friends alike.

00:05:52: A simpler time, when Facebook was mainly deep lyrics from your favorite song, and Instagram was a picture of a cool puddle you saw.

00:06:01: Granted, it was one of a thousand puddles you'd seen that week, but there is beauty in the mundane.

00:06:08: Then something got wicked and now we are here.

00:06:11: Or maybe something was always wicked?

00:06:14: But here we are in the global playground that is the worldwide web and we see that it is a lot more complicated than we thought.

00:06:22: There are cliques and bullies and kids with better lunches than you and, I don't know, this metaphor is going somewhere strange,

00:06:30: but you get the general point.

00:06:32: We have all been on the internet and we have seen what thrives, what gets the most likes.

00:06:37: who is allowed to be topless, hashtag free the nipple, who gets to run rampant while some are shadow banned, demonetized and relegated to the side paths while some people are running the streets.

00:06:50: We see where all the money goes and how people are acting.

00:06:53: a hot mess on the internet.

00:06:56: Like why did you comment that?

00:06:57: Like when you logged on using your data, why did you feel that that was the way to, like, put your imprint

00:07:05: on this like general collective conversation.

00:07:07: Seriously.

00:07:08: And anyway, it all leads back to the idea of who creates the space.

00:07:13: I need to take a moment to talk about how the tech space purports, if that is the big English word, to monitor certain content for the sake of the community.

00:07:23: When the content is sex positive or talking about bodily autonomy or safe sex, it is banned.

00:07:29: It is going against community guidelines.

00:07:31: I was watching this German woman who was talking about Schmegs online.

00:07:35: Schmegs, a grown woman talking about her PhD research around sex and sexuality.

00:07:41: And she had to say Schmegs and Schmegsuality, which, granted, is actually a little bit adorable.

00:07:47: So I might be taking that one.

00:07:49: But the fact is there is a double standard.

00:07:52: As a sex positive practitioner and curator of sexy materials, I have had my fair share of being told by Facebook, do that one more time, Tiff.

00:08:01: Do that one more time and you are out.

00:08:04: And you know, sometimes I did push the boundaries and I was out.

00:08:08: I was.

00:08:08: I was chucked off.

00:08:10: Our Facebook page as HOLAAfrica has been flagged numerous times.

00:08:14: I have been banned from using Facebook numerous times.

00:08:18: I have been put in, like, Facebook jail numerous times because of my content. And Instagram?

00:08:25: Full-blown taken down.

00:08:26: Full-blown taken down.

00:08:27: We are actually on our second Instagram account as HOLAAfrica because we were told that we are going against community guidelines.

00:08:36: But if you know us, you know, all we're trying to do is make sure that everyone's getting laid properly.

00:08:41: But not me finding out that there was a Facebook group of thirty-two thousand men that ran for over six years, where the whole focus of the group was to post non-consensual intimate images of their partners.

00:08:58: It took six years for Facebook to take this group down despite reports and women coming forward to say that this group existed.

00:09:07: The group had men posting pictures they took of wives and partners and the comments were heinous.

00:09:14: This was not some red-pill, dark, 4chan nonsense, like men who are, like, skulking in the background, you know, dudes who, oh, we can't get a girl.

00:09:24: It's some incel nonsense.

00:09:26: No, this was lawyers.

00:09:29: It was doctors.

00:09:30: It was entrepreneurs.

00:09:32: There was even a mayoral candidate in the mix to boot.

00:09:38: It took six years.

00:09:39: Again, I need to keep coming back to this.

00:09:41: It took six years to get this taken down.

00:09:45: And now apparently they've all just scurried off to, like, undercover Telegram channels or something.

00:09:51: Whereas on HOLAAfrica, we post one thing, one image of two consensual adults having an intimate moment, not even that deep, because we learned our lesson.

00:10:01: It takes hours to take that down.

00:10:03: Sometimes minutes, sometimes I post it.

00:10:05: And by the time it's finished posting, it's already been taken down.

00:10:10: So the question is, who really

00:10:13: is the community that is being protected by these tech giants, and who is the space really made for?

00:10:22: You know, the tech space

00:10:24: is so political, because right now I think people are getting more of a dose of that, right? Like, people are seeing more and more how these tech bros are doing too much, turning up at inaugurations

00:10:36: they have no business being at, like, removing these sorts of platforms, doing that.

00:10:41: So, I think, can you speak a little bit more to the idea of tech being political?

00:10:47: Well, I think technology has always been not only political, but very gendered.

00:10:55: And I think it sort of, it has managed to make itself invisible so far because it hides behind this discourse or this like, you know, kind of, how do you want to call it?

00:11:05: Like, I don't know if you read like fantasy stories where they talk about fairies having glamour.

00:11:13: So you have this aura of glamour,

00:11:15: right?

00:11:15: Like this thing that you kind of project around you

00:11:18: that makes you seem to be something, but the reality is something else.

00:11:21: So science and technology has always cloaked itself in this glamour of neutrality.

00:11:28: and of, like, you know, objectivity.

00:11:31: But in fact, from the very beginning, it's been entirely embroiled in and complicit in the project of colonialism, in the project of creating gendered societies and disparities and extraction and, you know, creating rationale for extractivism and projects of empire, like all along science and technology has been really not only a driver but a real, cohesive collaborator of these projects.

00:11:59: Then we come to the current times, where digital technology plays such a big role and has such a huge influence in how we participate and engage in all facets of life, whether it's the most intimate, between you and yourself, you and your family, you and your body, you and your relationships, to the most public, between you and your government, you and your land, you and the economy and territories.

00:12:32: Technology is like a big systemic and structural force within all of that.

00:12:38: It sort of constructs how we make sense and make meaning out of these particular relationships.

00:12:43: and who has power, what is value, what is being said.

00:12:48: And you said it as well, like all these tech bros having no business in inaugurations. And unfortunately at this point it is very, very much monopolized and directed by, yeah, I sometimes say it as a joke in a very off-handed way, but four white guys and maybe one Indian dude.

00:13:08: And that's like, that's kind of like, you know, that's the imagination and the reality that we're living in right now.

00:13:14: It's not entirely true, but true enough.

00:13:16: Yeah, funny, not funny.

00:13:20: When we talk about the Silicon Valley model, Jac explained it as a few dudes, jokingly four white dudes and an Asian, who are seen as the saviors of the realm, the one true brain to save us all.

00:13:32: And in that belief, all of the capital flows towards them.

00:13:36: concentrating the resources in a couple of disruptive hands, rather than giving everyone a fair shot at this saving the world shtick.

00:13:47: Yeah, I think we're building similar models.

00:13:49: I think that's the sad thing about where we are at right now, right?

00:13:54: Like in terms of how science and technology is being developed and built, it's the same Silicon

00:13:59: Valley model, where every single country is trying to create itself as a Silicon Valley: Kenya as the Silicon Valley of Africa, India as the Silicon Valley of Asia.

00:14:13: Well, Asia is also, like, a big place, but there are several different Silicon Valleys in Asia.

00:14:20: And then when you have a Silicon Valley of wherever, that means you also want to be able to monopolize the field in the same way.

00:14:27: You also want to create kind of really unreal, like, you know, not only unrealistic but problematic idols within the field

00:14:37: that you want to put all of your faith in, to say, okay, no, you have the vision and you are the genius and you should lead us and we will put all of our capital investment in you, and it's okay, like, feel free to experiment on all of us, because you will get that.

00:14:49: It's important for us to become, like, um, competitive. And it's the same model of, um, yeah, how you invest in it, what you kind of prioritize and put a lot of energy behind.

00:15:03: I think that's also, yeah, the sad, problematic narrowing that we are in right now.

00:15:13: The internet can be

00:15:15: a

00:15:15: lot.

00:15:15: It doesn't matter if you're witnessing an influencer being the modern day version of a snake oil salesman.

00:15:21: No, that herb will not cure your chronic illness.

00:15:24: Or you're sitting on Facebook watching your older relatives post the most inappropriate and wild status updates.

00:15:33: The space is doing too much.

00:15:35: I asked Jac what they're feeling about the online space right now.

00:15:40: What is my gut feeling about online spaces right now?

00:15:45: Oh, it's a... it's a tough question.

00:15:49: I think on the one hand an immediate visceral reaction is to kind of coil away from, yeah, the toxicity and pollution and blatant capitalism that has been infusing all of its development and its perceived value and use.

00:16:15: Not all, but a majority of it.

00:16:17: So, you know, that's like the initial gut reaction.

00:16:21: And just wanting to really turn away from it.

00:16:27: And I think it's particularly, yeah, I mean, the last few years, especially has been particularly challenging because it's so blatant in terms of who has control and power over shaping those spaces and how little accountability there is and how little participation there is.

00:16:49: So that's like the first gut reaction.

00:16:52: But then, because I've been a feminist activist in this space for the better part of my life, a very quick follow-up response to this is, you know, wanting to elbow out and create this, like, yeah, direct attention and visibility also towards a lot of really incredible work and imagination and strategizing and organizing

00:17:28: that's being done by a lot of groups and activists.

00:17:31: in many parts of the world, especially in the larger majority world, like where you and I are.

00:17:38: And that is a different feeling.

00:17:42: That one sort of makes me want to step into the messiness of that more and give it more of everything for it to thrive.

00:17:56: Tech is sneaking into everything.

00:18:00: Your bed, your garden, your groceries, your menstrual cycle.

00:18:04: Everything.

00:18:05: You can get an app with optional AI for everything.

00:18:11: My personal fave is sex toys that can be used globally through, like, these sorts of online platforms.

00:18:18: And because tech is so prolific, we need to give the girlies the coins.

00:18:23: Give the feminist tech girlies the monies so that they can increasingly create a digital world in more than one image.

00:18:33: So how do you say the funds name?

00:18:35: Cause I feel like it's New Moon.

00:18:38: Yeah.

00:18:39: It is New Moon.

00:18:40: Okay, cool.

00:18:41: You know, sometimes people say things and you're like, absolutely not.

00:18:43: You have butchered that.

00:18:45: So 'cause, you know, we all know that the money in tech and the money for the digital space is going to just, yeah, like basically three dude bros and their protégés now, right?

00:18:57: So my question is what, what does feminist tech look like?

00:19:02: And what does funding feminist tech look like?

00:19:05: Because I gotta ask you about your actual work.

00:19:08: What is in the email address?

00:19:11: What does feminist tech look like?

00:19:14: I think feminist tech looks like a deliberate unpeeling of all of the different layers and complexities and interrelationality of

00:19:29: the different parts that make up what we think about technology or technology infrastructures in our lives.

00:19:37: That means feminist tech is also about communities.

00:19:42: Feminist tech is about thinking about power and decision-making and governance and accountability, and, like, you know, thinking about that not just in terms of articulating and understanding what's actually happening right now and who it is affecting, but thinking about it in a way that's also creating propositions and potential pathways forward.

00:20:10: Feminist tech is also about understanding technology's relationship to the environment and the ecosystems we're part of, and communities that inhabit them, and also all kinds of, you know, kinship with other kinds of living beings.

00:20:29: And feminist tech is about centering the communities who these technologies are important for, and then them sort of defining what it is and should be or needs to be, and also centering those who are most affected and making that very visible.

00:20:48: And feminist tech is about imagination, about different frameworks.

00:20:53: And it's also about building and developing and adapting and thinking through different kinds of technologies and the sorts of power that needs to fuel and drive them.

00:21:11: in really interesting and exploratory ways, and sometimes very, you know, and often, actually, not sometimes, often very connected to ancestral knowledge and technologies as well, building from our histories in order to be able to chart a pathway to the future.

00:21:32: And sometimes it's very pragmatic.

00:21:34: Like, for example, one of our partners in Rwanda is building a platform online, and I think now they've developed it into an application as well, that you can use to support people to be able to access sexual and reproductive health services in the country.

00:21:52: So it is also as pragmatic as that.

00:21:55: So Numun Fund is a feminist tech fund.

00:21:59: We are an activist fund.

00:22:01: So what does that mean?

00:22:02: That means we're kind of, you know, we're grounded in movement and led by movement for movement.

00:22:11: Like I said, give the kittens their coins.

00:22:13: Numun Fund was birthed, like most game changers, such as making banana bread and other breads actually from scratch, during COVID.

00:22:22: That time when people were sitting at home with their own thoughts.

00:22:25: A moment that was truly terrifying but magical.

00:22:28: So Jac wants to make it known that she does not, and I quote, have deep pockets.

00:22:34: So do not slide into the DMs trying to secure the bag.

00:22:37: But despite not having these deep pockets, she got into the business of thinking about and reimagining what funding movement building could look like.

00:22:47: In her view, and that of a bunch of other smart folks out there, funding models were and still are missing a big chunk of the larger picture, namely how organizing is happening, where it is happening, and who is making this magic happen.

00:23:03: Who is leading the things?

00:23:04: These are the questions that funding models are not asking.

00:23:08: COVID truly showed us two things: how deeply we engage with the digital world, and how much the social justice realm still had to do its job within this space.

00:23:19: So the question was, how could this deep relationship be leveraged?

00:23:24: And this was the thinking that Jac was doing along with all her co-conspirators.

00:23:33: So then it's like, oh, actually maybe now it's a really good time to try and think about setting up a dedicated fund to support feminist tech work.

00:23:42: Because, you know, the intersection between feminism and technology is not evident.

00:23:46: So much of our time, like, you know, so much of the time that I've been spending in this field is field building and kind of getting more people aware of and, like, you know, engaged with this connection, and why it's important to have a feminist analysis and lens on this question of technology.

00:24:03: So then a bunch of us, it was Esra'a, who built CrowdVoice.

00:24:07: I don't know if you know that, and who was the head of Majal.

00:24:11: Really, I admired her work for a long time.

00:24:14: She's a developer from Bahrain. And also Anasuya, who co-founded Whose Knowledge?

00:24:25: We were sitting there and thinking, and then, like, okay, how about a feminist tech fund?

00:24:29: Let's just, let's do it.

00:24:31: Like, I don't know, man, this is not my jam, but I'll help think, you know, I'll help to like think about how to make this happen with this question that I had, right?

00:24:40: About how to resource, you know, feminist organizing in this current time.

00:24:45: And I remember, in January twenty twenty-one, it was like, there was a clarity to me.

00:24:51: I'm like, Oh, you know what?

00:24:53: What this

00:24:54: could do is to really explore thinking about what it would take to also create alternative resourcing infrastructures. Like,

00:25:05: we need to also think about different sorts of economic models that will be able to fund and fuel our organizing and our movements and our activism.

00:25:15: And I think it's never been more evident than this year, especially, that the current funding and resourcing system is broken.

00:25:25: Not only is it broken, it feels like, you know, so much of our activism is being funded through composting the leftovers of capitalism and colonialism.

00:25:34: It's a problem.

00:25:35: So that's how it came about.

00:25:37: And so we like, yeah, we just set up this fund.

00:25:42: It is currently still the only fund that's dedicated to feminist tech organizing in the larger majority world.

00:25:50: We are an activist fund.

00:25:53: It also means we raise money to move money.

00:25:55: And we do a lot of advocacy and activism within philanthropy to highlight this intersection as well.

00:26:03: And to demonstrate the vision and leadership and amazingness of all these different groups and activists.

00:26:13: in the larger majority world who are doing this work.

00:26:15: And I actually didn't realise how many people were doing this work until we had our first open call in August. So when we had our first open call, it was like, let's just see who will apply, who's doing this work on feminism and tech.

00:26:30: My God, we got so many applications from people doing such cool work from all over, like so many different parts of the world.

00:26:38: It was just... So hopeful.

00:26:42: Yeah, it was really like such an injection of vision and possibility.

00:26:47: I can't tell you.

00:26:48: Yeah.

00:26:53: But we aren't perfect.

00:26:55: Even when we are doing big things like Jac is, we can still trip up.

00:26:59: Hence the Oops My Bad segment, where we look at the failings of feminists.

00:27:04: Yes, the failings because we're allowed to fail.

00:27:07: And what we learned from them.

00:27:10: Because if you're going to fail, learn something, right?

00:27:12: Don't fail and then be an asshole about it.

00:27:14: Fail, accept the L, and learn something from it.

00:27:18: And that is what this segment is all about.

00:27:21: So in the case of Jac, it was forgetting to invite the hosts to the party in their own country.

00:27:28: It's a terrible metaphor, but look, work with me here, and it will make a lot more sense when Jack tells the story.

00:27:35: Because we don't hear those stories.

00:27:37: We just hear people are shining lights, and then all of a sudden they're canceled for something that happened in, like...

00:27:52: And it's a small event, you know, we were bringing partners from different parts of the world and we had it in, yeah, it wasn't in Nairobi, it was somewhere a little outside of Nairobi.

00:28:22: And we brought people from Latin America, from, you know, North Africa, from Asia, and from different parts of the world, but we didn't have a partner in that particular small program who was in Kenya.

00:28:40: So we completely didn't think about it. We had other partners who were in Kenya, who were not part of this grant stream but part of other grant streams, and somehow we didn't think about inviting our partners who were in Kenya to come to this event that we were having in Kenya.

00:28:58: So it just was so weirdly decontextualized.

00:29:05: And the people that we had who were from Kenya were there partnering with us as documenters or as context accompaniment, but not as participants in the event, even though we had partners in Kenya.

00:29:19: And it was such a big... It's not even an 'oops, my bad.'

00:29:23: It was like a terrible, terrible, like, yeah, on record.

00:29:29: We really were, you know, called in on this very clearly, and it was something we sat with and reflected on, and are still having conversations around: that we need to thread through how we do things very differently.

00:29:45: But this, yeah, this kind of disconnection and dislocation, and not centering the activism and the people and our partners in the context where we were actually bringing people together, was, yeah, a huge, huge learning moment.

00:30:05: I admit it.

00:30:05: I spend too much time scrolling.

00:30:07: Granted, it's mainly memes.

00:30:09: Anyone who knows me, I'm always sharing a meme or eight or six.

00:30:13: I will bombard.

00:30:14: your inbox with reels.

00:30:16: Like that is my entire shtick.

00:30:18: But sometimes I'm out here on the internet at five AM being radicalized because I wake up at around five AM and I pop onto the internet and I'm like, oh my gosh, what's going on in the world?

00:30:28: And then I get radicalized.

00:30:29: I just, people are online saying the quiet part out loud and I am shook.

00:30:36: People are acting bold and wild on the worldwide web.

00:30:39: Look, we all have some sideways thoughts, but my thing is you turned on your camera.

00:30:44: pressed record, recorded that video and posted it.

00:30:49: Wicked behavior.

00:30:50: In light of that, I'm going to go on record and say, stop giving dude bros podcast equipment.

00:30:55: Just,

00:30:56: just stop it.

00:30:56: There's toxic influencers.

00:31:01: It's the toxic influencers for me that are really stressful.

00:31:04: Like I understand that the internet space was supposed to be like the great equalizer, but like it feels like a mess.

00:31:11: So I think you started touching on it, but like, how do we counteract and push back against that?

00:31:18: And also, how do we protect ourselves?

00:31:20: You know, for those of us who aren't like, you know, people in tech, how do we protect ourselves and how do we push back against all of that?

00:31:30: I'll start with the simplest, which is how do we protect ourselves and push back?

00:31:36: And I think we know this.

00:31:38: We have always protected ourselves and organized and pushed back by being in community, by really looking around us and, like, you know, building and growing community and strengthening our collective response and, you know, showing up and really thinking through and analyzing what's happening and paying attention to the situation and peeling back what the problematics are and thinking through solutions together.

00:32:03: I think that community piece, it doesn't matter where we are, whether we're online, whether we're offline, and we know that everything is interconnected.

00:32:12: There isn't that separation anyway.

00:32:15: That piece is the always,

00:32:18: that piece for me remains the most significant and important piece.

00:32:22: And how do we then create bigger and stronger and more interconnected communities

00:32:29: that are outside of this, like, you know, outside of our own safer tribes,

00:32:34: right? Like, to also then be quite intentional about crossing over this.

00:32:38: But you were talking also a little bit about the internet, like, you know, it's supposed to be...

00:32:44: It's meant to be a great equalizer.

00:32:48: But how can it be a great equalizer when there is still no equal access to, first of all, at the most basic and simple level access to technology itself?

00:32:59: When at this point, in twenty twenty-five, we operate as though, wow, we're in the future.

00:33:04: Now everybody has access to the internet, we will get rid of money and everybody will just use digital currency, la la la la.

00:33:12: When the reality is, I think, we're only like sixty-five or sixty-nine percent connected online.

00:33:19: That's... that's still.

00:33:22: Yes, it's twenty twenty-five.

00:33:24: And we only got to sixty-five or sixty-nine percent because of COVID, because there was a lot more investment in accelerating, like, you know, really pushing through connectivity and so forth.

00:33:34: But this is a this is a reality.

00:33:36: And when you don't have that equality, like, you know, when you have already, from the get-go, differences in

00:33:43: access to basic connectivity.

00:33:45: We're not even talking about the kinds of connectivity you have, like whether you have good quality, like, you know, what's the quality and cost of your connection, whether it can be very easily shut down by the telco or your government, or surveilled for different reasons, which, you know, you're familiar with,

00:34:02: also what's been happening in movements all around the world, including in Kenya. And, like, that doesn't even include that.

00:34:09: And then the next layer is actually: where is your, like, your capacity and ability and access to resources to be able to critically engage with these technologies and to create and shape them?

00:34:26: So I have a confession.

00:34:28: I sometimes forget that not the whole world is on the internet.

00:34:33: It's hard to remember this when it's such a central part of, like, these big global life moments, and also when some people have more followers online than countries have citizens, and not everyone has the same access.

00:34:51: Do I think my memes are funny?

00:34:52: Yes.

00:34:53: Do I personally feel I should have more followers to spread my hilarity on Instagram?

00:34:58: Also, yes.

00:34:59: But I do also get that my gender, location and a whole bunch of other things don't really work in my favor.

00:35:07: The access to the internet is a wickedly unequal thing.

00:35:10: I said this further up in the podcast.

00:35:13: Like, you know, listen back if you think I'm lying.

00:35:16: When one looks at the population size versus the number of people on the internet in certain countries, if we went by internet visibility, one would think that the world is made up mainly of white American twenty-something-year-olds with half-hearted aspirations to recreate their favorite dance movies.

00:35:36: So you have all these different levels of inequality stacked up.

00:35:39: And then a statement like the internet was supposed to be a great equalizer just sounds, you know, it rings very hollow, right?

00:35:48: At this point, like, actually, we know that that isn't it, right?

00:35:51: And there's so many different issues surrounding that.

00:35:54: And the reason why we also experience so much misinformation, and gendered disinformation specifically, online, and also, you know, gendered harassment and technology-facilitated gender-based violence online, is also because, to some extent, well, on the one hand, technology sits on top of existing disparities, right?

00:36:22: So then it sort of, you know, like it augments the ability to also express these disparities more.

00:36:30: Jac breaks it down.

00:36:31: The con is real.

00:36:33: Get you all worked up and then they can extract data points from you.

00:36:37: This is the business model.

00:36:40: When you're online fighting for your life in the comments, it's all about money in the bank.

00:36:45: When you're angrily commenting on a post about how women shouldn't ride bicycles because they will destabilize the economy, or you're replying to a post on X

00:36:55: about

00:36:55: how the queers are working with extraterrestrials to overthrow the government, they get you.

00:37:01: That is how they get you.

00:37:03: That's the scam.

00:37:04: I was telling my partner recently about how I am the target market.

00:37:09: Ragebait will have me watching a video four times in disbelief.

00:37:14: Checking the comments to see if anyone, somebody agrees with me that this is nonsense, right?

00:37:21: And boom, I've spent three minutes on a twenty-second post.

00:37:25: A mess.

00:37:27: So, for all of these reasons, it's like a, yeah, to put it bluntly, a pretty shitty space.

00:37:33: But then at the same time, that's not new to online spaces.

00:37:41: It's also been part of many of our other kinds of social and public and even, like, you know, private spaces, and community has always been a really important strategy and an important kind of, yeah, way of being for us to shift this, like shift the dynamics.

00:37:58: So, yeah.

00:38:00: Jac is not on the socials and I don't care for it, because Jac is our vibe.

00:38:06: I did try and find you on Instagram and I was like, what?

00:38:12: I feel like you're so Instagrammable.

00:38:14: This makes no sense to me.

00:38:17: Yeah, so I don't, yeah, I'm not sure.

00:38:20: So I actually, I would like to ask you, what do you feel like?

00:38:23: What's your sense?

00:38:25: I'm really, yeah.

00:38:27: Interviewed on your own podcast.

00:38:29: This is wild.

00:38:30: Um, I don't know.

00:38:32: I think, I think also for me, I think it's a dual thing.

00:38:35: Um, it's partly, I've taken, like, the more it became difficult, especially for HOLAAfrica, to maneuver online, the more we started taking, like, a step back from spaces, just giving, just engaging enough.

00:38:50: Um, but.

00:38:52: I think it's partly that, but also I think it's also an age thing.

00:38:55: So like a lot of the people who I used to like, you know, have community with online, aren't online as much, right?

00:39:04: And I don't know if it's like fatigue.

00:39:07: I don't know if it's burnout, but, like, we're not doing the thing where we all used to jump on a hashtag on what is now called X. Like, just, what is it?

00:39:15: An X convo?

00:39:16: I don't even know.

00:39:17: So I, yeah.

00:39:19: And for me, I feel like.

00:39:21: What I am seeing a lot more in spaces is influencers.

00:39:26: So, like, it's like a funky queer couple, or, you know, it's a feminist who goes online and is helping people in Sierra Leone, but she's always on lives by herself.

00:39:38: Like, it does seem to be of a very solo nature.

00:39:43: And especially as more and more people are leveraging that space for what I feel is the social capital of it.

00:39:54: It does not feel as much

00:39:56: with the, oh cool, we're going to build a community.

00:39:58: It feels more like, hey, come on my platform with thirty-five thousand people, and two major influencers talk to each other and the rest of us stand on the sidelines and clap our hands.

00:40:09: So high stakes, like you said.

00:40:10: No, everyone has become, not only, we've been invited to participate not as a community

00:40:16: member but as a spectator and a fan, right? Like, that's, and at the same time as a critic.

00:40:22: So you almost, like, you only have this very kind...

00:40:24: Your participation is just either critique or judgment, or fan and spectator.

00:40:29: So it's so high stakes.

00:40:30: So maybe the community conversations are happening in other spaces, where it's not so high stakes, when this is not the, yeah, when this is not the kind of, um, model you're running on.

00:40:42: So I, I mean, I want to give, like...

00:40:44: Yeah, I want to sort of, like, give space for hope, in also the expansiveness of my ignorance of different conversations

00:40:51: that are happening, hopefully.

00:40:53: But the influencer phenomenon is also a consequence of the, you know, we were also talking about the economic model of the current technology, digital

00:41:06: online sphere, right?

00:41:07: Like that is the economic model.

00:41:09: It's not just social capital, it's actually monetary capital.

00:41:12: Like that's the way in which you're also able to, when you build a brand, when you are the brand, then you also participate in, you know, you're able to take different kinds of gigs and do different things and often, and there is an entire economy of controlling narratives and shaping narratives.

00:41:29: that's both above ground, which is about consumerism, and also more grey, which is about political power.

00:41:38: And that is a real thing that happens all the time in many different spaces, where you do have influencers of different degrees for hire to shape narrative and public discourse in different ways.

00:41:52: So that's also a very real phenomenon in that sense.

00:42:00: Besties, AI is a lot for me.

00:42:02: Look, am I the naysayer you want to see in the world?

00:42:05: No.

00:42:06: Am I completely convinced AI will save it?

00:42:09: Absolutely not.

00:42:10: I'm a millennial.

00:42:11: I grew up on movies like I, Robot, Terminator, and The Matrix.

00:42:15: My vision is colored.

00:42:16: But what I have picked up is AI is a little bit more complicated than thinking about it as people using it to cheat on their partners or their exams.

00:42:26: The way we speak about AI and think about AI is Jac's fight-me-in-the-comments moment.

00:42:32: So again, just to tell you what fight me in the comments is about, it is about that idea and that notion that you are willing to break up a dinner party over, that you are willing to go boxing in the street over.

00:42:44: Like, you are just like, you know what?

00:42:47: This is my hot

00:42:47: take and you are going to have to hear it.

00:42:51: In this edition of the segment, Jac had some big feelings about how we're engaging with the conversation around AI.

00:42:58: So

00:42:58: deep were these feelings that they actually apologized to me off air about going on a rant.

00:43:03: And I'm like, no, do it.

00:43:05: That's exactly what the segment is about.

00:43:07: You're just like, nah, you must fight me in the comments because I got some big feelings.

00:43:11: And this is also a safe space.

00:43:13: And we love and we appreciate it.

00:43:16: Just a quick catch up though, before we get into the segment.

00:43:19: What is a large language model?

00:43:21: Large language models are advanced deep learning systems trained on vast datasets of text to understand and generate human-like content.

00:43:33: Basically, they aren't thinking.

00:43:34: It's plagiarism of text on another

00:43:38: level.

00:43:42: I have to ask the AI question.

00:43:43: You know I have to ask the AI question or else are we even talking about tech, right?

00:43:48: So AI, I know it's not a simple yay or nay conversation, but where you at with sort of AI right now?

00:43:57: Yes, because I know I've heard some good things, but then also in terms of like, again, the power dynamics and how this sort of stuff is built and what it focuses on and et cetera, et cetera, et cetera.

00:44:10: I've heard some really negative things.

00:44:12: So yeah, so where you at with AI generally?

00:44:14: I guess it depends on what you mean by AI, because I feel like so much of what the conversation around AI is, or what is being talked about as AI...

00:44:25: Number one, you kind of add AI and stuff to everything, right?

00:44:27: And suddenly it's like, no, no, we cannot talk about technology now without just adding AI.

00:44:31: Like my toaster has AI.

00:44:33: you know, my shoe has AI, like my pencil has AI.

00:44:36: Everything is, like, AI.

00:44:37: I'm like, dude,

00:44:38: what are you, like...

00:44:39: What are we even talking about?

00:44:40: And I think the reason for that is because AI is sincerely a huge marketing term that is supposed to encapsulate everything and nothing at all about developments in technology.

00:44:53: So I think it depends what we mean, right, like, when we talk about this.

00:44:58: The current trajectory and mania around AI right now is very, very much driven by the power forces that we were just speaking about from the beginning.

00:45:12: It is driven by Silicon Valley in the US, by very few companies who basically have been gobbling up all of the capital investment, all of the data and, yeah, all of the data and content that we have been creating, not for them, but they've just laid claim to it as though, oh no, it's on the internet.

00:45:39: therefore it's public and therefore we can train all of our systems on it.

00:45:42: And like, you know, it's for the greater good.

00:45:45: And it's also kind of increasingly gobbling up more and more energy, because AI, like, you know, large language models, like what things like ChatGPT is being run on, a lot of the so-called AI that we are familiar with, that we are being asked to use in our daily lives, is the kind of model that actually needs to be trained on huge amounts of datasets, which then takes up so much energy in terms of electricity as well as clean water and land.

00:46:14: And this isn't even talking about the sorts of resources that are needed to build these machines.

00:46:26: And it's doing all of these things and taking, you know, really being such a, if you imagine it as, like, an anime monster,

00:46:33: it's so greedy.

00:46:35: There's, like, two or three humongous greedy-ass monsters, like, you know, this nebulous shit that's eating up more and more and more.

00:46:42: And then selling us this manic dream that AI is going to be the best thing ever.

00:46:47: Like it will solve all of our problems.

00:46:50: It's so fun.

00:46:51: It's so great.

00:46:52: Just use it.

00:46:53: Please use it.

00:46:54: We'll put it in everything.

00:46:55: Oh, unless you are really intentional about using non-AI-enabled Google, it's automatic.

00:47:05: You don't even have a choice.

00:47:07: And the reason for that is because they are still continually training.

00:47:11: their systems, they still don't have a very good use case for you.

00:47:17: In what way has it significantly actually helped us, these large language models? In very dubious ways.

00:47:25: But this is where we are right now.

00:47:28: But when we talk about very specific forms of AI that are trained on much smaller and specific data sets to solve very particular problems, they've actually produced some really interesting and valuable results for us.

00:47:45: I'm sure you've been asked, Tiffany, how are you using AI in your thing?

00:47:49: What about you and AI in content?

00:47:52: I'm pretty sure.

00:47:54: And I almost want to go like, oh gosh, we have real problems.

00:47:57: You want to talk about AI?

00:47:59: Then, okay, let's talk about the kind of problematic, non-consensual use of data.

00:48:05: Let's talk about the people who are being paid very low wages to do categorizing and data tagging in our parts of the world, working in really shitty conditions.

00:48:15: Let's talk about that.

00:48:16: That is a conversation I'm interested in having.

00:48:21: Okay, so my confusion is allowed.

00:48:23: There are good parts of AI and bad parts.

00:48:26: Good parts include helping with research, brain mapping, and collating large amounts of data, like your Excel spreadsheet could never.

00:48:35: My dad told me something vague about engineering things, which sounded promising, but then there are also the wild parts of AI, where it's just straight up making stuff up.

00:48:46: Like there have been instances where it's made up case law.

00:48:49: and gotten lawyers into trouble, right?

00:48:52: Like, you don't need this.

00:48:53: There's also that controversy where they make up books while they plagiarize books.

00:48:58: But then they just also make up book titles.

00:49:01: Like, look it up.

00:49:01: There's been just a lot of stuff.

00:49:03: And then also the unethical use of AI within research papers.

00:49:09: AI is like a needy lover.

00:49:11: It just wants to make you happy, even if it's selling you dreams and chatting hot trash.

00:49:17: And do not even get me started on how it's being used for the creation of revenge porn, or, as we should call it, the creation of non-consensual intimate images.

00:49:26: But the thing is AI has slipped into all corners of our digital life.

00:49:31: On most platforms you have to actively switch it off.

00:49:34: It even has the older generation in a chokehold.

00:49:37: My mother was legitimately jazzed by AI writing a letter for her, and an uncle of mine told me to ask ChatGPT about something he is an expert on.

00:49:49: Sir, what do you mean?

00:49:53: Just before you move on, I think, you know, kind of the encounter with your uncle, right?

00:49:59: Like the fact that he's going, ah, just ask ChatGPT.

00:50:02: And the fact that ChatGPT hasn't really been around for that long, but it's been so pervasive and invasive in our world and in our, like, you know, experience of technology and expectation,

00:50:15: right? Like, you've got to be on ChatGPT if you want to be on it.

00:50:20: But every single query takes up so much energy,

00:50:25: not to mention the fact that it's often wrong and hallucinating.

00:50:31: But we are kind of calling it.

00:50:32: I have a friend who was also telling me.

00:50:33: Oh, yeah, you know, what did she say?

00:50:38: My, my friend Chat, like.

00:50:40: Yeah, but it's basically sort of, like, you know, becoming this companion to you, and this person

00:50:45: now, it's like, your friend Chat. And she meant it as a joke, but I think it's also this real, like, reliance on a function of technology that is untested, like, you know, that actually is unaudited, that has huge problems in terms of how it is, how it was created and

00:51:07: what underpins its use, and is also questionable in terms of its value, that it's become so much a part of our reality.

00:51:24: And the consequence of it as well is we are subcontracting our ability to synthesize, analyze, think through different content points, or create, you know, write sentences, to this.

00:51:41: And language, and the ability to synthesize, analyze, write, express ourselves in thought through our own writing, you know, in a sentence from A to Z, from the beginning to a full stop.

00:51:53: It's also such a critical part of critical thinking.

00:51:56: And that's also a particular consequence that we don't know what it is yet, because we're now in about year two and a half, right, of ChatGPT.

00:52:05: So what will happen in five years?

00:52:08: What will happen in five years?

00:52:10: As a psych and sex-positive girl, one of the major things I've been seeing is the rise of people dating chatbots, and also people outsourcing therapy to their ChatGPT bots and other platforms.

00:52:23: I will not lie, with all the hats I wear,

00:52:25: I did not think that mental health and getting people laid on a Friday night would be the parts of my offering that are moving towards becoming obsolete.

00:52:35: I don't think they're really going to become obsolete, but I just, I do not care for it.

00:52:40: I do not care for it.

00:52:42: It's wild out there.

00:52:43: Look, I can neither confirm nor deny, but I am so sure someone I was having a thing with was using ChatGPT to send me beautiful messages.

00:52:53: Like the person one minute was like, oh, one word answers and chilling.

00:52:57: And the next it's giving.

00:52:58: Shakespeare meets Shonda Rhimes, circa season three of Grey's.

00:53:02: Okay.

00:53:03: I was just like, wait.

00:53:04: What?

00:53:05: How?

00:53:06: Look, I am both flattered and confused and have to channel my inner Carrie from Sex and the City and ask, in this digital age, did I fall for the person or the bot?

00:53:18: Right?

00:53:19: Yeah.

00:53:20: Yay, he's correct.

00:53:22: It's like, yeah.

00:53:23: And it's, I mean, this person must be trying very hard to impress you, but maybe put a bit more impressive in the, you know, awkward but genuinely well-felt

00:53:33: sentence, probably.

00:53:34: Right.

00:53:35: Like, you know, just maybe sit with it.

00:53:37: And if you haven't got that many words, it's fine.

00:53:40: We'll figure it out.

00:53:41: But like, like, look, anyway, that ended.

00:53:43: So I guess that's that.

00:53:45: Oh,

00:53:47: shame.

00:53:47: No shame.

00:53:48: No shame for them.

00:53:49: It's fine.

00:53:50: Okay.

00:53:50: So because

00:53:51: you're going to put it in your dating apps, right?

00:53:53: Like, please, no ChatGPT.

00:53:55: I prefer your awkwardness.

00:53:56: Thank you very much.

00:54:00: So to wrap this all up, my people: write your own sexts.

00:54:05: You know, sexy texts, right? Write your own ones. Understand that the internet is a lot, and make sure you like and subscribe to the kind part of the internet, because it is wild out here.

00:54:18: Like, you genuinely have to ask, what is this hot mess?

00:54:23: Because the internet isn't quite the utopia we thought it would be.

00:54:27: But all hope is not lost.

00:54:29: There are people doing things to make the internet a better place.

00:54:32: As usual, we're going to put all sorts of dope links in the show notes, so make sure you check it out.

00:54:38: Lastly,

00:54:39: no AI

00:54:40: was harmed in the making of this podcast.

00:54:44: Shout out to the Global Unit for Feminism and Gender Democracy of the Heinrich Böll Foundation that is hosting this podcast, and my magical team Ray and Sheldon for that apocalyptic post-production.

00:54:57: This episode was executive produced by Yours Truly.

00:55:00: Till the next episode, this is Tiffany Kagure Mugo telling you to go to the mall in your fancy dress.

00:55:06: Use the special teacups your African parents said are for nice occasions or simply shower after midday.

00:55:12: Because it's all chaos anyway.

00:55:15: And this is not the apocalypse we signed up for.
