Moral Repair:
A Black Exploration of Tech
How can we all thrive as we navigate technology, automation, and AI in the Information Age? What have technologists, philosophers, care practitioners, and theologians learned about the innovations and worldviews shaping a new century of unprecedented tech breakthroughs and social change?
On Moral Repair: A Black Exploration of Tech, hosts Annanda Barclay and Keisha McKenzie talk with tech and spiritual leaders. Their conversations inspire curiosity about tech while showcasing practical wisdom from the African continent and diaspora to nurture wellbeing for all.
Moral Repair expands mainstream tech narratives, celebrates profound insight from Black philosophy and culture, and promotes technology when it serves the common good. Listeners leave each episode with new ways to think about tech’s impacts and apply practical wisdom in their own lives.
In Season 1, episode themes range from recommendation algorithms and a Black ethical standard for evaluating tech to interactive holograms and hip hop as cultural memory tools. Other episodes explore moral repair, ideologies and philosophies shaping Silicon Valley, AI ethics, inclusive design, and tech well-being.
Guests include Aral Balkan (Small Tech Foundation), Dr. Scott Hendrickson (data scientist), the Rev. Dr. Otis Moss III (pastor, filmmaker, storyteller), Stewart Noyce (technologist and marketer), Zuogwi Reeves (minister and scholar), the Rev. Dr. Sakena Young-Scaggs (Stanford University’s Office of Religious & Spiritual Life), Judith Shulevitz (culture critic), and Dr. Damien Williams (professor and researcher on science, technology, and society).
Season 2 | Trailer
On Moral Repair: A Black Exploration of Tech, hosts Annanda Barclay and Keisha McKenzie talk with tech and wisdom leaders. Their conversations inspire curiosity about tech while showcasing practical wisdom from the African continent and diaspora to nurture wellbeing for all.
-
Annanda: An ancient form of trauma has a new name. It’s called moral injury. And it defines a deep spiritual and existential pain that arises when something violates our core beliefs. In our increasingly connected world, we’re seeing lots of moral injury.
Keisha: Just take our AI age, which often feels like it's doing more harm than good.
OM3: What scares me about technology is the profit motive. Profit plus human frailty, historically, has meant tragedy for someone.
Damien: The very people who are supposed to be regulated by these policy changes are the people who are writing the laws, or at the very least “advising” on them.
Adorable: Technology, it's solving the problems of other technologists and industry; it's not really solving the problems of everyday people.
[SFX: A crescendo in the music]
Keisha: AI is overwhelming.
Annanda: So here's the question. Can we ever truly mend the damage that AI is causing us? That is, what could moral repair in our modern technological era look like?
Keisha: This isn't just about patching systems. It's about caring for people. Technology should serve humanity… not the other way around.
Annanda: We’ve got an antidote to moral injury.
[SFX: A shift in tone]
Jane: Africana philosophy, it's so rich with a broadened conception of technology, it's all about cultivating human well being, cultivating sustainability…
Rev. Dr. SYS: Is this life-giving or death-dealing? Because we need more life-giving things.
[SFX: Our theme song or a play off it leading into it]
Keisha: Welcome to a new season of the two-time AMBIE-nominated podcast Moral Repair: A Black Exploration of Tech. A series about the innovations that make our world… disrupt our societies… and how we can repair the damage.
Annanda: I'm Annanda Barclay, an end of life planner, chaplain, and moral injury researcher. I think a lot about the impact moral injury can have on living a meaningful and dignified life, all the way to the end.
Keisha: And I'm Keisha McKenzie, a technical communicator and narrative strategist with a knack for asking tough questions and making experts accessible.
Annanda: This season, we'll be your guides through the maze of AI. And its use in political strategies, conflicts…
Keisha: Government regulation and play. All while we try to answer the big question — who's actually responsible for the moral repair we so badly need in AI?
Annanda: And how can Africana wisdom guide us towards healing and accountability?
Keisha: Let's take a hard look at the AI that powers and distorts our world.
Annanda: And let's start the process of repair together. Join us for Moral Repair: A Black Exploration of Tech launching April 24th.
Government Regulation: Afrofuturism and Equity in Tech
Season 2 | Episode 1
What do we need to know about recent regulatory guidelines on AI trust and safety? What does one recent federal regulator think still needs attention? How could critical Black digital perspectives reshape the conversation? Annanda and Keisha talk Afrofuturism and equity with Dr. Alondra Nelson, deputy director for science and society at the White House Office of Science and Technology Policy from 2021-2023.
Talk to us online: at Instagram (@moralrepairpodcast), on X (@moralrepair), and on LinkedIn.
The Social Text Afrofuturism issue. About the Black Panther’s clinics.
Nelson + Lander explain the AI Bill of Rights (WIRED)
-
Lead Guest: Alondra Nelson, PhD
EPISODE BIG QUESTION: What does tech policy look like behind the curtain? And how can Afrofuturist and Black cultural principles make that ecosystem work for those who’ve been left behind?
DESCRIPTION: What do we need to know about recent regulatory guidelines on AI trust and safety? What does one recent federal regulator think still needs attention? How could critical Black digital perspectives reshape the conversation? Annanda and Keisha talk Afrofuturism and equity with Dr. Alondra Nelson, deputy director for science and society at the White House Office of Science and Technology Policy from 2021-2023.
[00:00] INTRO
SOUND: curious background music
Annanda: Welcome everybody to our first episode of our second season of Moral Repair: A Black Exploration of Tech!
Ya girls are back at it again! And we won’t stop.
So this season, we’re focusing specifically on AI and technology. We’ll be talking about government, cobalt mining, policing, and believe it or not, the impact big tech has on American farmlands.
Keisha: I’m excited to get into it. We’re talking about how AI shows up in different parts of our lives, how we can use it in positive directions and, where it’s harmful, mitigate that damage.
Annanda: There's a lot that Africana wisdom, Black wisdom, can say about technology: how do we take what AI gives, consider it, and sprinkle some Africana wisdom seasonings on it? That’s the business we’re up to this second season.
Keisha: In these conversations with tech experts, we’ve talked around the role government plays with tech. On today’s show, we’re focusing on the policy environment around new technologies. We ask the big question:
What does tech policy look like behind the curtain? And how can Afrofuturist and Black cultural principles make that ecosystem work for those who’ve been left behind?
Annanda: ‘Cause so many people are left behind.
Keisha: And we can do things differently.
SOUND: title music starts
Annanda: I’m Annanda Barclay…
Keisha: I’m Keisha McKenzie…
Annanda: This is Moral Repair: A Black Exploration of Tech. A show where we explore the social and moral impacts of tech… and share wisdom from Africana culture for caring for what tech has broken.
SEGMENT A: Regulating Emerging Tech (Problems in the policy ecosystem)
Keisha: Annanda, did I ever tell you I worked for Congress once?
Annanda: Nope, I’d remember that. *laughs*
Keisha: Yeah, I interned with committee lawyers in the House of Representatives halfway through grad school. I wanted to learn about government as a technology for managing power and decision making.
SOUND: title music fades out
While I was there, I learned a *lot* about the gap between the myths and the realities of government.
That was the first administration of the Obama era.
It was a chaotic time.
Annanda: But—but a historic time nonetheless.
You know, bless you, Keisha. I don't know if I could've had the stamina!
I kind of look at that time as a beginning of a—as a major shift. I mean, I can go back to Bush and Gore, but I went on YouTube and looked at past political debates from when I was a kid, and then from when I was a teenager and then now, and you can see like the disintegration of rhetoric and actually of knowledge of government and how it functions.
It was actually quite frightening to me to see how quickly bipartisanship, and actual knowledge of governance, have unraveled. And I think about [Gen] Zs, for whom this has always been their political reality. They have only known a dysfunctional government.
Keisha: Yeah, it’s a maze…
SOUND: Curious sound
So we brought in someone who could guide us through it.
Keisha: Dr. Nelson, thank you for being here.
Alondra: A pleasure to be with you both.
Keisha: Dr. Alondra Nelson was the deputy director for science and society at the Biden-Harris White House Office of Science and Technology Policy (OSTP) from 2021-2023.
Alondra: I have been very forthright in writing about problems of American society and the ways that African American communities in particular have both been innovators and also, struggled very hard, in American society to sort of compel it to live up to its professed ideals—
Keisha: Dr. Nelson told us about how she ended up as the first Black woman to hold her position. It started two presidential terms earlier, with relationships she built while writing a book about the Obama White House.
Alondra: I never thought I would serve in government.
I had been, since 2016, working on a book about the White House Office of Science and Technology Policy under the Obama administration. I was very interested in the kind of creative, innovative work that office was doing under a Black presidency, and in President Obama's personal interest in science and technology as a policymaker and also as a kind of self-identified nerd.
I think when the history books are written about that presidency, science and technology policy will be a really central part of it. I think it's one of the lesser acknowledged significant things of that administration.
One of President Obama's last foreign trips before he leaves office, he goes to Hiroshima, to the Hiroshima Peace Memorial Park…
Audio clip: Obama at Hiroshima (C-SPAN), 7:05-7:27. The line Alondra referenced: “Technological progress without equivalent progress in human institutions can doom us. The Scientific Revolution that led to the splitting of an atom requires a moral revolution as well.”
Keisha: This idea — of parallel technological and ethical revolutions — is what was behind Obama forming the Presidential Commission for the Study of Bioethical Issues.
Alondra: You see in the administration this sense that science and technology policy really shouldn't and can't be unmoored from thinking about the historical context and social implications of science and technology.
Annanda: One thing I appreciate about Obama, that man, he could tap into his emotions as a politician to the suffering of people. And I wish I saw more of that in general... I wish that was a standard.
Keisha: Yes, a little emotional awareness and capacity.
Annanda: Yeah, I want a lot, to be honest. *laughs* Not a little. I want—
Keisha: Fair!
Annanda: Not that one is run by their emotions, but to be able to tap into their own humanity and the humanity of others, especially those who are different from them.
Keisha: I listened to the speech and I'm struck by that too. I'm wrestling with whether it did end at the emotional performance of that moment, because he went over there for the first time in 70 years. He didn't actually apologize to the people of Japan for the U.S. dropping the atomic bomb. So he was present as the president, and it was extraordinary, and he acknowledged the moral breakdown, and he hugged the survivors. But I'm wondering if there was actual moral repair there.
Is there actual moral repair if you're not apologizing or taking responsibility for it? What do you think?
Annanda: I think that's up for the survivors, those who are morally injured. I think there is moral injury for sure in United States foreign policy. And I wonder what does repair look like for politicians who are in seats of power at times of war, making decisions as big as dropping an atom bomb.
Keisha: Right.
Annanda: The phrases moral injury and moral repair actually come from the military in the '80s, because they were trying to figure out what these troops were going through. It's like PTSD, but it's different. They could treat PTSD, but there's this other thing showing up as a comorbidity, and that is moral injury. And the impacts of it are profound.
Keisha: Mmmm. Mm hmm.
So there was a history-making moment with Obama at Hiroshima. And then back home, as Dr. Nelson joined the Biden-Harris administration, they were about to face another.
Alondra: We were at the high water mark of a once in a generation pandemic.
I mean, people were just dying. In Brazil, on the African continent, in Harlem, where I live, and the sort of racial demographics of that, the racial inequality of that, was just raw and real.
SOUND: conversation heading toward a break
Keisha: Phew: I can't even believe it's been four years since the pandemic started.
Annanda: You can't believe it? Oh, I can. *laughs* Why? Say more.
Keisha: It feels almost like world before, world after. We've collectively gone through multiple layers of moral injury, including abandonment from the people whose role in the political system is to protect and care for the public.
Alondra: What situation would the United States have to be in for an African American woman who studies health inequality and racial inequality to be invited to be a deputy at the White House Office of Science and Technology Policy?
Keisha: You’ve heard of the “glass ceiling,” the career limits you can see through but not break through. Being promoted into crisis is called the “glass cliff.” And it’s a set up.
Annanda: Being a chaplain in the hospital around that time, that was definitely a cliff. Life is forever changed when you're dealing with six to eight deaths a day on your on-call shift, right? Just death after death after death…
But I actually felt safer in the hospital than any other place. How do you long term manage supporting people in that much grief, bereavement, and death?
Keisha: What I love about what you said was not just to care for others, but also to care for yourself in the midst of caring.
People who are in cliff situations, they don't get that part of the resource. They might get enough to care for others or do the job function, but they usually will not get the stuff that keeps them nourished and able to navigate the change.
Alondra: The economic inequality, the health inequality, the pandemic, were what we were going into office to help mitigate the best we could. I don’t know if it’ll ever happen again, where government leadership was seeking people who had thought about issues of inequity and innovation, inequality in science and technology. I happened to be… one of those people.
SOUND: break cue
Keisha: Glass cliff or not, Dr. Nelson went into government to help change the climate in tech. After the break, we’ll hear all about that.
BREAK
[9:55] SEGMENT B: How equity improves the process
Annanda: We’re back with Professor Alondra Nelson, former deputy at the White House Office of Science and Technology Policy. We've been thinking about how she got into government and how a moral revolution could help shape tech. But how does influencing government policies really work?
SOUND: break cue ends
Keisha: Going back to what you said about the pandemic or the early part of the pandemic being a once in a century event or experience, when it comes to emerging technologies like machine learning or AI or the mRNA vaccines, who has the greatest influence on what regulations around those things actually look like?
Alondra: The mRNA vaccine is regulated within the U.S. Food and Drug Administration. But certain things were accelerated. I mean, we did clinical trials very quickly.
We were running them in parallel, as opposed to waiting for one to finish and another to start. So in that case, it was the regulatory authorities that were in place, the usual channels, accelerated. We had something called Operation Warp Speed, and the whole point of that was to do things much quicker than usual.
I think when you talk about things like automated systems and AI, it gets a little more complicated, in part because— and I’ve also worked on human genetics and genomics as an emerging field as well— when you get the collision or collaboration (pick your C-word!) between market-driven innovation, sometimes really exciting products and possibilities, and a PR hype cycle, regulation becomes harder. You know, in the public sphere, people say, “Oh, government’s so slow and they can’t regulate,” and all of that. But also those hype cycles just make it sound like you can’t regulate it.
Like, this new, magical, direct to consumer genomics is coming at you, and we've never seen anything like it in the world. Cut out “genomics” and put in “generative AI,” and it's like, how could you even think you could regulate this? This is like, magic, you know, and it's cool, and there'll be food for everybody, and we'll cure cancer, and everyone's gonna have a house, and the robots will be here, and it'll be awesome!
So there is this combination of hype cycle and the material interests of venture capitalists and companies that makes regulation hard, and the hard part is not necessarily that we can't keep up. It's: how do you regulate a fever dream? How do you regulate a hype cycle?
That said, it can be regulated!
Annanda: Y’know, when Alondra says, how do you regulate a fever dream? And then she talks about the mRNA vaccine. I remember my beloved gay uncle, Michael, being really freaked out during the pandemic because it reminded him of HIV/AIDS.
Keisha: Right.
Annanda: And for those who don't have older gay men as friends: they survived the AIDS epidemic. We wouldn't have the mRNA vaccine for COVID as fast as we did had it not been for the years and years and years of HIV/AIDS activists who demanded research and a search for a cure, in spite of government and in spite of the church and other religious institutions.
Keisha: The AIDS virus and the mRNA vaccine for COVID? How are those connected?
Annanda: Research on HIV/AIDS advanced our understanding of how viruses manipulate the immune system. The way for our immune systems to attack these unwanted viruses is to target specific viral proteins. Viruses have proteins as part of their makeup, and mRNA vaccines work by producing a strong immune response to a virus, one that targets a specific protein within that virus.
Right?
In the case of COVID, the spike protein is the key protein our immune system must attack for the COVID-19 vaccination to be effective. They took research grounded in HIV/AIDS, which showed that to kill this particular virus, you have to target a very specific viral protein. And it's from there that the COVID-19 vaccine was able to jump off.
Without gay men, trans folk, queers, lesbians, there would arguably not be a COVID vaccine at the speed the world received it.
Keisha: Right.
Annanda: mRNA vaccines were bought with gay, queer, and Black death, if we consider the pandemic ongoing in Africa and among African American populations.
And the truth of the matter is, it was and still is.
Keisha: Yes. What we heard from that generation of organizers was not taking the given world for granted. Not treating it as like, It came down from the mountain, like the tablets, and nobody can touch it.
And that's what breaks the fever dream. To say the world as it is, was made, can be remade. We can do something about it. And we're going to do something about it. That's a critical trait we're gonna need to navigate this latest round of “It's magic. It came down from the mountain. It's scary. We can't touch it.” But I think that's the part that Alondra's talking about, that it can be regulated, but we have to take responsibility to do it.
SOUND: shift to prefigure interview audio
When we talk about regulation, it's really important to be able to understand, as you were saying of politicians, how does the system actually work and therefore what is my influence in it? What's my moral responsibility in it?
Like us, Alondra also had to figure out her influence within the system she was embedded in.
Alondra: The President of the United States is the Commander in Chief, the highest office in the land.
It's a very powerful office, but with regard to legislation, it's a very weak office. The executive branch doesn't make legislation, and it sounds silly to say it, because you're working in the White House.
But you know, people will say, “Why can’t President Biden or Vice President Harris just, like, make the companies stop X, do X, do Y?” And you just can't; it's a free society, so you can't tell companies to do anything. And it is the job of Congress to do that. So the question becomes, in the domain of the powers, soft or otherwise, that the executive branch has, what can you do?
You can convey a vision, you can make a case, and assert a theory for how things might be done. Everyone could just ignore you; it has no power or anything. And so the Blueprint for an AI Bill of Rights was an attempt both to model what might be done and to advance a conversation.
Keisha: Fortunately, it was a conversation that had already started, not just in the US but even in UN agencies and global trade organizations. Leaders were starting to think about “safety in artificial intelligence” and setting common standards for technologies that members of the public would use. And that led to the publication of the White House’s AI Bill of Rights in October ’22, which Dr. Nelson and her team framed:
Alondra: Part of what we tried to do with the AI Bill of Rights was to say, if systems are supposed to be safe and effective, what processes does one use to guarantee that that's the case?
So organizations that are in charge of protecting consumers can intervene and ensure that AI tools and products, automated systems that are released for consumer use, have met that standard consumer bar.
But you could only imagine doing that if you didn't think that AI was like magic, right? Is ChatGPT a special, nuanced, fascinating consumer product, but a consumer product nevertheless?
Or is it in some other category that we can't even imagine regulating?
Whether you're talking about existing algorithmic systems or future ones, they should be safe and effective. They shouldn't violate people's rights. They shouldn't discriminate against people. People should have some sort of alternative or fallback in using the systems. And they should have some expectation of privacy.
All fairly common-sense things, but you can really only get there if you anchor in the kind of fundamental rights that people have, regardless of what technology you're talking about. Even if you’re talking about quantum computing, it should not be a vector for discrimination in your life.
It might affect discrimination in different ways, but the outcome for people's lived experience shouldn't be that they have unsafe systems or, you know, lose their privacy, or are discriminated against by a new or emerging technology.
Keisha: I'm curious about what you were hearing from the public during the year long listening process.
Alondra: We talked a lot to the people. When my appointment was announced, I was able to give a short speech. I thanked my mom and my daddy and my ancestors and my family, but also said: what are we going to do, effectively, about algorithmic bias?
Because that was an issue people were talking about. The film that Joy Buolamwini's involved in, Coded Gaze, had come out. Timnit Gebru had been fired unceremoniously from Google for raising issues around generative AI. And so there was already a public conversation happening, and there was a responsibility on the part of folks in government to respond to that.
SOUND: background music
Keisha: Trust in institutions like the government is the lowest it’s been for nearly 70 years. But here Professor Nelson’s describing government actually listening to members of the public.
They’re listening to conversations happening in media, journals, books and industry chats. And they’re also actively soliciting input in some interesting ways, thinking about equity and access as they do it. This reminds me of the call-and-response communication pattern used across the Black diaspora. It’s a way of interacting between speaker and audience that assumes both are an essential part of the experience and both have something valuable to add.
Alondra: The FDA has public listening sessions where anyone can come and speak their piece for two minutes. We did several of those at different times of day, so people could come after work, before work, after school…
We had high school students very worried about algorithmic bias and discrimination as we’re putting metal detectors and surveillance cameras and these sorts of things in their schools.
And then we had office hours. So we met with pastors and rabbis, researchers, sometimes industry people. But we tried to talk to as many people as possible and to reach them beyond the typical process in Washington, D.C., which is often to respond to a request for comment. I imagine in a couple of years, this will be something where we use privacy-preserving generative AI to filter through this kind of data.
Yeah, people do read them and they also become part of the historical record.
Keisha: Not many congressional reps or senators have a science background. And some people say it's probably unlikely we could get lawmakers to know enough about new technology to regulate it quickly in an informed way. What’s your perspective?
Alondra: When the Office of Management and Budget was making initial decisions about whether government should adopt personal computers, or buy the Microsoft Office suite, we didn’t think you had to have a PhD in computer science to make a policy decision around this.
I don’t want to suggest that artificial intelligence is not a powerful technology. This is a transformative technology for pretty much every facet of life.
At the same time, if we can have more legislators in government who understand technology and have that background, that is only to the good, in the same way that having legislators who have been doctors or nurses or medical professionals working on healthcare policy or public health policy is only to the good.
But if we think about the AI Bill of Rights: algorithmic systems are supposed to be benefiting and enhancing people's lives, benefiting and enhancing the experience of workers. If they're not doing that, a policymaker at the Department of Labor doesn't need expertise in AI to make that distinction.
Keisha: So the underlying ethic, the agreed core values, become the touchstone for everyone in the system, regardless of role. Values and ethics are how equity and other emerging norms can reshape organizations, industries, and entire countries.
SOUND: background sound ends
Keisha: You've talked a little bit about who's missing from the rooms when tech like AI is developed. Why does it matter if most of the people designing new tech come from the same cluster of schools or share the same social class?
Alondra: Oh, gosh. Because we keep getting it wrong. We keep getting it wrong.
I spent a lot of my career teaching introduction to medical sociology, some history of public health, sociology of health and illness.
And even if we just go back a hundred years and think about some of the early clinical trials, or early approvals for pharmaceuticals that were only tested in men, right? And it was assumed that they were just supposed to work on everyone, you know?
So, in some cases, they were only tested on white men.
Ruha Benjamin has examples of this from the design space, where you know, we've designed a chair, we've designed a seatbelt for this car, and it should fit everybody, and it just doesn't.
There's some instances, particularly when drugs were thought to be dangerous, where they might have been tested on veterans or on African Americans, other marginalized communities.
Keisha: Now, this isn’t a “might have been.” This is a “was.” Go look up Operation Whitecoat!
Alondra: We have quite a lot of research that shows us how science and technologies have been drivers of inequality. The Tuskegee Syphilis Study was stood up by the U.S. Public Health Service, a body of the U.S. federal government, and it left, you know, around 300 Black men untreated for syphilis after we had penicillin and knew how to treat it. The legacy of that study has been to build and fester mistrust and distrust of science, of technology, and of government in African American communities, and frankly in Black and Brown communities outside the United States as well.
There’s so much rich research on the impact and legacy of that study, and how it shapes people's help-seeking behaviors. And often it becomes a barrier to people feeling like they have enough trust to get the health care they need and deserve, or to participate in clinical trials and these sorts of things.
You can't just act like that didn't happen and then say, everybody go get a vaccine. If we're thinking about the pandemic, equitable science and technology policy means a government that's willing to say we have gotten this really wrong, not a boosterism that's like everyone should get a vaccine because the American government is great, go.
Being forthright about those mistakes, about those traumas, is really important.
And some of these mistakes are so basic, but you don't have anyone in the room who's saying, like, I don't know, that seatbelt doesn't fit me; I'm taller, shorter, bigger, wider. Those are just fundamental, basic questions.
And then when you have algorithmic systems that are based on often historical data, or on people making choices about data that then get baked in, almost hardwired (as a metaphor) into algorithmic systems, those choices become path dependencies. There's one way that they're going, and it sort of narrows the possibility for what the outputs can be: what the answers can be, what the quote-unquote predictions can be, if you're talking about a predictive system.
And so it doesn't matter in the sense of "get a rainbow coalition in every Google seminar room and things are going to be better," but it does matter in the design phase of algorithms, and the design phase of lots of science and its applications. Very basic errors, very wrong assumptions, could certainly benefit from having a broader perspective of people in the room.
And then of course, everyone won't share this, but there's just an equity mandate. If a company is making tools and resources that are literally going to be used—if you're talking about the big tech companies, Apple, Amazon, Microsoft—by potentially literally everyone in the world, there should be some obligation for some kind of democratic participation in shaping those outcomes. Even if those folks are not stakeholders and don't own the company, what is an equitable responsibility in relationship to the public and to the consumer?
Keisha: Equity is about defining key populations and getting the data that describe their situation. It’s also about telling the truth about what happened to them, who did it, what gaps resulted, and what it now takes to make up the difference.
Annanda… if we understand equity that way, does that make “moral repair” a kind of equity practice?
Annanda: I think it's a both-and. In order to have moral repair, we first have to name the injury, right? We can't heal what we don't name. There's collective moral repair, and there's individual. And so, when I think about equity, I think we're trying to repair something collectively.
Keisha: Right. Right.
Annanda: And what we know about moral repair is in order for that repair to occur, acknowledgement and hearing of the transgressions have to take place.
SOUND: investigative / curious cue
So people actually do need to be nonjudgmentally heard in their pain, seen and witnessed in their pain, which is something that we as a culture and as a country struggle with.
Keisha: Mmm, yes.
Annanda: Dr. Nelson broke down equity in a recent speech at the Federation of American Scientists awards:
CLIP FROM ALONDRA: Audio from Alondra Nelson’s speech at the Federation of American Scientists Awards (2023) (2:10-2:33)
“Equity isn't just who you serve. It's who is at the table when the policies are made. It's that the policies work and proactively uplift all. It's also who has access to the research, the data, the tools, and the resources to pursue the answers. We work to ensure greater access to and support within STEM fields for more people, and to more immediately open federally funded research to all American taxpayers regardless of their ability to pay.” —Alondra Nelson
SOUND: interview cue
Alondra: We also saw researchers, scientific researchers, talking about the danger of trying to go to their field site during a pandemic. People would be like, “Who is this Black woman with a backpack, out here in the woods when we're in a lockdown? What are you doing out here?”
And we started to hear stories about researchers who didn't feel that they had mobility. That would shape an office [OSTP] that does science and technology policy, but also has responsibility for the research ecosystem.
What was the responsibility of government to help people think about how else they can do field research, or to build a conversation in the scientific community about the fact that there's not equitable access to field research, right? So you're being asked to publish papers or get tenure, but you don't actually have full access to the resources you need to work at the height of your powers as a researcher. A few years ago, I ran an organization called the Social Science Research Council, and we had a grant for people doing international field work.
And we had a very unequal fellowship experience, right, so we had people without caregiving responsibilities who could take the fellowship and use all the money for their field sites and all of that. And then we had a population of fellows who had to find child care, had to find elder care and were trying to make that fellowship money stretch farther.
And then once they got to the field, often faced vulnerability in the field by virtue of their gender or sexual identity. And so those are the things that are part of a conversation about equitable science and technology policy.
Annanda: Repair isn't just “I'm sorry”, it's what structures then need to be put in place to rebuild trust and to build a new sense of hope. Repairing is mending the harm, and creating a new moral covenant or equity agreement that this harm will not be repeated.
Otherwise, if the harm is repeated, you're continuing the pattern of distrust and injury, making repair even more difficult, and there is a place of no return.
It's been argued by scholars that the harm of chattel slavery was so great that there is a place of no return with that.
The same with Native Americans: there's no way to repair the indigenous genocide—
Keisha: Because it's ongoing.
Annanda: You're not going to be able to repair what was done in the past, but how do you repair how those descendants are treated in the future, right?
Keisha: Yeah, so changing these sorts of patterns of disparate access and not being taken into account requires a systemic response, a different way of investing resources, a different way of taking in information from a population and a different way of designing for solutions to serve them.
And it was at the Social Science Research Council that Dr. Nelson helped to put together a major fellowship, reshaping that model for funding participants so that it took into account their research, sure, but also their life contexts, the life contexts in which they were doing that research. Were they parents or elder caregivers? Were they experiencing strain, and where? And how can the resource then serve those conditions?
Annanda: I look at the United States and your medical care is not guaranteed.
Keisha: Nope, it's tied to your job. So if you lose your job, what health insurance?
Annanda: Your income is not guaranteed to be a livable wage. And so you have so many things in this country, basic safety, security, and I actually think dignity needs, that are not guaranteed.
Keisha: Small-scale experiments like universal basic income, or a fellowship that actually treats you like a human being. Those are the kinds of experiments I want to see scaled up socially, so that you don't have to be the special person who gets the fellowship and you don't have to be in the right city to have the experiment of universal basic income. You get to have that not as an exception, but as a norm.
That's what I'm looking for.
Annanda: That would be amazing. The amount of flourishing that would take place because basic needs are met. The amount of crime that would go down, the amount of stress that would decrease, the amount of health and wellness, of innovation and creativity, the amount of kindness in the world. From your lips to God’s ears.
SOUND: interview cue
Alondra: In my time at the White House Office of Science and Technology Policy, we did a kind of vision document on STEM equity and excellence. How do you support the full person?
In postdocs and research lab experiences supporting the full person meant a range of things like how do we need to think about the design of laboratories so that anyone across a spectrum of ability can imagine being a scientist and imagine doing bench research or other kinds of research. So how do you think about universal design?
Are there ways that science and technology research funding agencies and the U.S. federal government can do more to support people who are caregivers, to make sure that they stay in the pipeline because in the bigger context we don’t have enough workers, right?
We need more people working in these fields, and sometimes that conversation is an immigration conversation, right? We don't have enough scientists coming from abroad, and technologists and engineers. But we are also not doing all that we can to support people here in the United States, whether they immigrated last year or 100 years ago or were brought here during the slave trade, to do the work that they want to do in science and technology.
The ecosystem approach is about supporting the full person and also about trying to strategically and intentionally link up all of the programs. So there's the, y’know, after school program, and there's the community college program, and the worker training program, and everybody's competing for funding and doing their own thing.
SOUND: break cue starts quietly
How do we put all of these programs in an ecosystem so they're like handing the baton to each other, not just competing with each other?
So they're passing the resources and sharing the resources in a way that's accessible, more than the sum of their parts.
Keisha: From ecosystem equity to Afrofuturism and AI: more, after the break.
BREAK
[34:04] SEGMENT C: Afrofuturist and other Black Perspectives on the Problem
SOUND: Break music
Keisha: Welcome back.
We’ve explored how equity approaches can repair the entire tech policy ecosystem from regulation standards and the research environment to opportunities for young people left out of the field. It’s an orientation that, for Dr. Nelson, is rooted in Afrofuturist perspectives.
SOUND: Break music ends
Afrofuturism is a movement that blends art, philosophy, technology, and activism to address Black life and liberation.
It assumes Black people are thriving in the future—not invisible or missing like in The Jetsons, and not subjugated like in most conventional sci-fi.
Annanda: It's the ultimate form of freedom. I think it's a collective prayer, a speaking into being, by Africana Black people all the time: we're here, we will be here, we will thrive here. And us being here isn't an over-and-against anybody else, right? It's not the supremacist notion of dominance, but it is dignity. It is respect, and care.
In every Afrofuturist work, Black people are living good, and the societies in which they are a part are thriving. And it's not limited to Black people; those societies, those whole worlds are thriving.
Keisha: Annanda, remember this?
SOUND: Fade up bars from Baaba Maal’s Wakanda theme a few minutes into the start of Black Panther. E.g. 0:01:12-1:28
Annanda: Oh my gosh, my little diaspora heart!
Keisha: I know! I know. When Black Panther came out in New York, it felt like an earthquake hit. Afrofuturism everywhere. The clothes. The memes. People doing Wakanda Forever gestures all over the place lol…
SOUND: Black Panther theme fades out
Annanda: Oh yeah. I was out here in these Bay Area streets. I went to go see it with Black Stanford. And, yeah, everybody dressed up in whatever Africana gear that they had. It was beautiful.
Keisha: Yes.
Annanda: What got me with Black Panther was certain things, or ways of being, that were just so unapologetic.
Keisha: I saw it in Harlem, the Blackest experience I've had seeing a movie. Those examples of music, culture, and Blackness remind me of the power of Afrofuturism to cast a different vision for people: where we belong, we're in the future, we get to survive—
And it's full color. It's technology, it's innovation, style…
Annanda: And all those little STEM babies and art babies… could see themselves into the future as well. It was an Africana space: folks from the continent, from the diaspora. It was beautiful.
SOUND: Drumming in the background
Keisha: In 1998, Professor Alondra Nelson — who we’ve been hearing from in this episode — started an international Afrofuturism listserv for scholars, artists, and others. She went on to publish on the movement through the early 2000s, using journal articles and books to break down the Afrofuturist vision of “making the impossible possible” and imagining “race after the internet.”
So it’s not surprising that Professor Nelson told us about the tremendous impact Afrofuturism has on her.
Keisha: How does it shape your work?
Alondra: Oh my gosh, uh, every day in every way. Absolutely it does. I started writing about Afrofuturism in the mid-90s, and I think as people take it up and use it and think with it and are empowered by it now, as a concept and a movement, it's a lot more about fashion and music and food, you know, it's this beautiful, aesthetic thing. But for me, it's always been also about science and technology.
So my early work on Afrofuturism tried to have a place for all of those things. It was a community that included Nalo Hopkinson, the Caribbean science fiction writer, and Fatima Tuggar, the Nigerian visual artist, people doing social theory, as well as people who were technologists, and the like.
Keisha: But an Afrofuturist view is quite different from how many of us, including Dr. Nelson, learned Black history and the past: a mix of tragedy and exceptional Great Men.
Alondra: The stories about technology and gender and race that were part of my upbringing were either the Great African-American First who invented the traffic light
Or you know George Washington Carver, “He’s such a genius, he did a hundred things with peanuts, like oh my god.” I’m just a regular, regular person; how do I get involved?
And then you’ve got these stories of science and technology used to expropriate, exploit, traumatize, repress. Afrofuturism, for me, has offered not an apology for any of that repression, um, and the trauma, but a way to say that's not the only story: that even over and against that history, which continues in some instances, you've had Black people just creating beauty and innovating and doing just extraordinary things. I still marvel at the fact that in 1968, Black teenagers, 16- and 17-year-olds in Oakland, were like, we're going to do genetic testing and screening.
I mean, people didn't know what genetic testing was. We didn't have universal newborn screening for genetic disease like we do now; it's very commonplace now. Talk about imagining another world: we can't get access to these resources and people like us are dying because of genetic disease, so let's do some tests. Like, what?
Keisha: Sidebar—
SOUND: historical sidebar note
Back in the 1960s and 70s, the Black Panther Party started free health clinics that expanded access to disease screenings including for sickle cell trait, which disproportionately impacts Black people. They started this in Oakland, and the clinics spread across the country, eventually influencing the US government to start screening programs it should have offered years before. And the early Panthers were young: many in their late teens and early 20s.
Alondra: Similarly, African American senior citizens were some of the earliest adopters of direct-to-consumer genetics. That's not how we tell that story. The history of chattel slavery has meant that Black people very much live in a world in which they're constantly seeking more information about their ancestors, about their past, about their families.
But what that has meant has been, like, tremendous innovation and a kind of courage around new and emerging technology. Afrofuturism has really been the bridge that has allowed me to tell that story. Like, who are the outré, you know, interesting, brave, ingenious people who are trying to think about science and technology and its implications in new ways?
Keisha: Coming back to the consumer side of gene tech, is there something that you'd want regulators to take a closer look at—
Alondra: What the national security and existential risk concerns, in part around generative AI, have done is made people more aware of the potential of bioweapons, and of using these new kinds of technologies to accelerate the ability of people without a lot of technological or scientific expertise to, potentially, use these tools in malign ways.
And already people can do CRISPR gene editing in their garages.
Keisha: “People can do CRISPR gene editing in their garages”—what?? *laughs*
Annanda: I'm telling you, these Bay Area streets, give some nerds some tools. *laughs* That's all they need.
Alondra: There'd been these dividing lines, like forensic genetics and medical genetics; those are important and legit. And then there's recreational genetics; and no one cares. Part of the through line of The Social Life of DNA is that there might be different use cases for the genetics, but it's the same genetic information.
The genome of a person carries all of this information at once, and we need to be careful.
I've been kind of trying to track the thinking about genomics and AI together. I mean, part of the project of the Human Genome Project has always been um, a big data supercomputing project, and so AI has been involved in some way in genomics research for a long time. It’ll be interesting to see what that means for genetics and genomics research.
So President Obama announces the Precision Medicine Initiative, which is this endeavor in the United States to get a million people's DNA in a database for medical research. It plans to oversample for underrepresented groups, and to do a lot of work, admirably, around consent and repeat contact with people who are subjects in the study, giving them sort of feedback and the like.
But they announce it in the East Room, and President Obama says something like, “Well, you know, if it's my genome, I like to think that that's mine,” you know, in his avuncular kind of delivery. And I think the next day there was a correction in the newspaper: if you go to the hospital and give tissue, it's not yours, actually; it belongs to the hospital. It does not belong to you.
Keisha: Even the President of the United States with all his staff and information resources—even he didn’t know that he wouldn’t own his own DNA if he went to hospital! So how much are regular folks at a disadvantage?
And what about the technologies that are emerging around us right now?
Alondra: I think, if we can not mess it up, that generative AI is really cool.
SOUND SHIFT
I mean, these tools and these systems are potentially fun, fascinating, time-saving. To the extent that people are anticipating that generative AI or some of these other tools might do work we don't want to do, I'm interested and excited about that.
I'm both interested, captivated, and daunted and worried about outer space innovation, about outer space policy. We're like throwing debris up there. There's no laws. Anybody can pretty much shoot up a satellite as long as they have the capacity to do it. You can explore up there. I think the story of technology and science, if we're wise, is the things that excite us always fill us with some trepidation as well, and our job, the job of policy and of imagination, and of good governance, is to try to mitigate the trepidation.
Keisha: Thank you again for your time this afternoon. It's been a delight.
Alondra: Thank you for your work. I've so just been honored to be hailed by you and, um, really appreciated it, enjoyed the conversation.
SOUND SHIFT: silence for closing reflection.
Keisha: Talking to somebody like Dr. Nelson is helpful because it gives you perspective: with all the things that go wrong, are there things that maybe are going right?
Annanda: My dad has this saying: always keep your wits about you. When he's anxious, it means one thing. When he's calm, it means another. But, you know, I think what's come up for me is that idea of what does it mean to right-size your wits… as it relates to AI?
I think it would be foolish to not have any caution and to have full abandon to the technology. I think there is a level of trust that is required because I think what could come from this can be so beautiful, y’know.
Keisha: I did get my matrilineal ancestry tested recently. The LDS church has a lot of civic records and so I've been poking around those. But for a long time, I was completely resistant for some of the same reasons you said. Um, who are these companies? What will they do with that information? What's the relationship to law enforcement?
And then on the back end, do they actually have the spread of data to interpret it for me? Like, for me and my people?
I'm really so grateful to be around at a time where, one, that thing is possible. It's possible to do that sort of genealogical study and make it generally accessible to ordinary people who don't know all about DNA and long timescales and migrations out of Central Africa and things like that. And to present it in a way that helps me feel like, oh, I now have a sense of my link to lineages 500 to 2,000 years ago, which before that test, I did not.
Annanda: There are biomedical startups that are trying to make unique medicines that can be specified just to your DNA. Imagine how helpful that can be with a myriad of diseases. I want the Wakanda version of that.
Keisha: And not the James Bond version.
Annanda: Oh hell no! No.
There are enough incredibly wealthy people that have shown you’re going to be alright if you do right by others. And we don't highlight that enough.
SOUND: theme twinkles
All too common is the idea that in order to be wealthy, that wealth must come at the cost of the dignified living and thriving of other people. And that's just not true.
Yes, that means there will also be a cap on how wealthy one could be because how could anybody have a right-sized relationship with the world if 1%-9% of people are hoarding so much? Like the math don’t math, right?
Keisha: Right.
Annanda: And our relationship with the world could be sustainable if folks who are wealthy honor that they’re part of a greater interdependent whole, and that when that whole thrives, they do too.
Keisha: Yes.
I'm struck by the fact that it is a choice, to go back to that question of magic and responsibility. It is a choice to choose the magic of life-giving ethics and to take in the perspectives of those who are different from you, to fill out your perspective on the world around you.
This whole panic about emerging technologies requires that any of us, any individual of us, remember that we're not here by ourselves and that we depend on the perspectives of those who are different from us to have a grounded reality and to help us design a future that's worth living in.
I end the episode, like, with a sense that it's possible. If you take the Afrofuturist route, it's possible.
SOUND: Closing theme starts
CALL TO ACTION
Annanda: We’re building community with listeners this season. So reach out and talk to us on Instagram—our handle is @moralrepairpodcast. Also catch us on X… formerly Twitter.
We’d love to hear from you!
Keisha: Follow the show on all major audio platforms—Apple, Spotify, Audible, RSS—wherever you like to listen to podcasts.
And please help others find Moral Repair by sharing new episodes and leaving us a review.
CREDITS
Annanda: I’m Annanda Barclay.
Keisha: And I’m Keisha McKenzie.
Annanda: The Moral Repair: A Black Exploration of Tech team is us, Ari Daniel, Emmanuel Desarme, Rye Dorsey, Courtney Fleurantin, and Genevieve Sponsler. The executive producer of PRX Productions is Jocelyn Gonzales.
Keisha: Original music is by Jim Cooper and Infomercial USA. And original cover art is by Randy Price. Our two-time Ambie-nominated podcast is part of the Big Questions Project at PRX Productions, which is supported by the John Templeton Foundation.
[PRX SIGNATURE]
SHOW NOTES
Talk to us online: on Instagram (@moralrepairpodcast), on X (@moralrepair), and on LinkedIn:
The Social Text Afrofuturism issue: https://www.dukeupress.edu/afrofuturism-1
About the Black Panther Party’s clinics: https://www.blackpast.org/african-american-history/institutions-african-american-history/black-panther-partys-free-medical-clinics-1969-1975/
“No Justice, No Health”: https://link.springer.com/article/10.1007/s12111-019-09450-w
Nelson + Lander explain the AI Bill of Rights (WIRED) https://www.wired.com/story/opinion-bill-of-rights-artificial-intelligence/
How many medical tech advances came from HIV-AIDS research: https://www.princeton.edu/~ota/disk2/1990/9026/902612.PDF
Interaction Question
What piece of emerging technology are you excited about right now?
Political Strategies & AI
Season 2 | Episode 2
It’s a major election year in American politics. This episode explores the big question: how is AI used in American political decision-making? What are the tools out there? How do they impact the political process? While this episode will not be partisan, it will touch on the evolution of political culture via AI and the impact it has on the everyday person. Our episode features special guest, cybersecurity expert Bruce Schneier of the Harvard Kennedy School.
-
The Big Question
This episode explores the big question: how is AI used in American political decision-making? What are the tools out there, and how do they impact the political process? While this episode will not be partisan, it will touch on the evolution of political culture via algorithms and AI, and how that impacts the everyday person.
What Utility Does Bruce Serve? Informing us and our audience on the perils and possibilities of AI and Democracy
Intro
Annanda: Hi, Keisha.
Keisha: Hey, Annanda.
Annanda: Do you remember the T, the scandal that was Cambridge Analytica, two U.S. presidential election cycles ago?
Keisha: I remember that name and I remember lots of drama on social media, but tell me about it.
Annanda: So long story short, Cambridge Analytica was a political consulting firm known for its use of data analytics and targeted advertising in political campaigns. The company gained notoriety for its role in the 2016 U.S. presidential election and the Brexit referendum, particularly for its controversial methods of harvesting and utilizing personal data from Facebook users without their consent. This data was then analyzed and used to create detailed psychological profiles to deliver hyper-targeted political advertisements. In the 2016 U.S. presidential election and the Brexit referendum, these tactics were employed to influence voter behavior and public opinion.
Annanda: Listen back to this epic interview with whistleblower Christopher Wylie, a data analyst formerly employed by Cambridge Analytica, interviewed by Carole Cadwalladr of The Guardian.
SOUND: shift to historical cue
[audio (0:28-0:58)]
Christopher Wylie: Throughout history you have examples of grossly unethical experiments, um
Carole Cadwalladr: And is that what this was?
Christopher Wylie: I think that you know, yes. It was a grossly unethical experiment. Because you are playing with an entire country. The psychology of an entire country without their consent or awareness. And not only are you like playing with the psychology of an entire nation, you’re playing with the psychology of an entire nation, in the context of the democratic process.
SOUND: end historical cue
Annanda: What does that bring up for you?
Keisha: Okay. So I remember. I was still using Facebook back then, but there was a whole scandal about people feeling hyper exposed. They were doing these free quizzes, Farmville or whatever, and surveys. And they were like, who's taking the survey data?
Who are they then selling it to? And why can't I just do a free quiz on my social media and be left alone? I know a lot of people who got up off of Facebook that year. And then, it was already changing, like, the atmosphere of the site. It was less personal and more agitated and aggressive. So yeah, it did start to sour people on the whole social media thing.
The casual connecting internet that we were promised? I think that was the decade we lost it.
Annanda: That makes sense.
2016 was definitely the year that I got off of Facebook and social media. Yeah. And this year is the year that I'm back on because of this podcast, and now it's like, oh, you really can't run a business or do anything without it. It's more of a utility for me now.
Keisha: Yeah. Yeah, and I feel like 2016, at least in the UK, was the year of Brexit, Britain leaving the EU by referendum. Everything up to that pulled on the same sorts of strings as the Cambridge Analytica scandal. So lots of behind-the-scenes data manipulation, changing what people saw, reading people's, or trying to read people's, emotions.
So yeah. Big season.
Annanda: Big season indeed. It's never all private.
Keisha: No.
Annanda: Yeah.
Keisha: No. Mm hmm.
Annanda: Cambridge Analytica speaks to the high-stakes possibility of collective moral injury, regardless of who's in office and from what political party. The use of technology when it comes to political strategy is not a political question, but an ethical one.
When a public ethic is violated, it's likely that moral injury follows. Moral injury occurs because political violations betray our sense of trust in government, directly impacting our lives with outcomes based on decisions that violate our moral and ethical consent.
Keisha: Yeah, with the scale at which government can act on people, it doesn't really matter whether you're directly affected, like if you don't use a particular service, or if you didn't submit particular data, it's just like the overall climate around you becomes corroded in some way.
So you have a stake. And even if you're not the person who's going to get a check because there's a class action suit, the whole soup that you're in, like, we all get to be part of that.
Annanda: Yeah. So if we take the class action example, Keisha. Like, all right, so I'm not gonna get a check because some company did people dirty with mesothelioma, right?
I don't have mesothelioma, but why should I care that a class action lawsuit is happening for folks with mesothelioma? Like, how does that impact me?
Keisha: Well, the hope is that the action for that particular company that did something wrong sends a warning shot to all the companies who think that they could get away with a similar sort of effect.
And so you hope that the specific remedy to the people directly impacted changes the climate around them. Kind of like what you were saying last episode, that it's the people who are directly impacted who shape what a moral repair looks like. The purpose of that moral repair is to change it so that the injury doesn't happen ever again.
And so, when it comes to political issues, people who can vote, they have a direct influence on the people who are at least in theory—Let me reframe that.
Annanda: I mean, no, no, we have to say we have an impact. Otherwise, there's nothing left, Keisha. There's nothing left in this dumpster fire that is the United States.
Keisha: Okay. Rewind.
Annanda: I need to believe that my vote can light a fire under multiple people's behinds who serve the public.
Keisha: Yeah. Cause that's one of the levers we have and we have to pull all the levers.
Annanda: Yes.
Keisha: So the people who can vote have a particular influence in the system, but even people who can't vote, the nonimmigrants, the student workers, the people who are on short-term labor contracts, are still affected by the outcomes of the system that the voters represent and influence. So I think about these sorts of processes in a wider circle than just who's directly in the target.
The person who might have their information released is one person, yes, but all of us are implicated, because what can potentially happen to them can happen to us, and often does.
Annanda: Got it. So you're talking about accountability and deterrence.
Keisha: Yeah, accountability, deterrence, and then being in solidarity because we're all connected. So like thinking about how we pay attention to the wellbeing of everybody is basically what politics is all about.
Annanda: Hear, hear! I mean, that is the idea of a democracy. Rumor has it.
Keisha: I've heard legends.
Annanda: And myths. Ahh
The goal of any political strategy is to win. And so the questions understandably surface in such a massive global election year. What role does AI play in the political strategy of winning? Where should we focus our concerns? How do we manage our anxieties? And — the big question of this episode is: What is an ethical win for the public when it comes to technology and political strategy?
I'm Annanda Barclay.
Keisha: I'm Keisha McKenzie.
Annanda: And this is Moral Repair, A Black Exploration of Tech, a show where we explore the social and moral impacts of tech and share wisdom from Africana culture for caring and repairing what tech has broken.
Keisha: When we come back, we'll talk with Bruce Schneier, a fellow and lecturer in public policy at the Harvard Kennedy School and Berkman Klein Center for Internet and Society.
[BREAK]
SEGMENT A
Keisha: Bruce Schneier calls himself a public interest technologist, and he works at the intersection of security, technology, and people.
Annanda: In your last essay on your website, you talk about how the frontier became the slogan of uncontrolled AI. Say more about that. What is, for our listeners, what is the frontier? What are you referring to with that?
Bruce: Frontier is what we [00:09:00] talk about when we talk about the newest AI models. The term you will hear from the researchers, the VCs, the companies is "frontier models." These are models on the edge. These are models that are doing the best. These are the ones that cost hundreds of millions of dollars to create, and a lot of energy.
And it's a complicated metaphor, right? Frontier as in the final frontier: space, Star Trek. But it also comes from the American West and subjugating the native population, which is the legacy of the frontier in this country.
Bruce: And when you peel back the edges of AI, you see a lot of frontier thinking. You see a lot of colonization, a lot of "the data is there for the taking, no one has it, we're going to take it."
There's a lot of rule breaking, right? The frontier was all about "the rules don't apply, [00:10:00] and we're going to make our own rules." Think of the American West and the cowboy, that metaphor. So, it is complicated. And I worry that we are making some mistakes in how we think about AI, and in our rush to create AI.
Keisha: What would we need to do either with AI or something else to avert the worst impacts of degraded quality in public debate?
Bruce: Right now AI is not helping. Right now there is so much AI nonsense being created by sites that realize they can fire the humans and have the AI write the stuff. And this isn't new.
And [00:11:00] we have seen AI generated content for years in three areas, in sports, in finance, and in fashion. Those are three areas where it was kind of formulaic, it was stylistic, it didn't take a lot of creativity to write the article. And AI has been writing those articles for a while now, but now they're writing more stuff, and it's not very good.
And the AI dreck is pushing out the quality human stuff. Now, this is hard. Some of it is our fault. We as media consumers don't demand quality in our reporting, in what we consume. So we accept poor-quality things. I think AI has the potential to write very nuanced articles about issues that matter.
I think about local politics again, right? We've had the death of local journalism because of platforms like Facebook, [00:12:00] but now lots of public government meetings at the local level are effectively secret because nobody's there. There is no local reporter covering the school board or the city council in a town.
Now, AI can fix that, right? AI is actually really good at summarizing stuff. So if the meeting is open and it's recorded, the AI can summarize it and write the article. I'd rather have a human doing this, but the problem is I don't have a human, because we can't afford the human.
We're not willing to pay the human, let's say it that way; of course we can afford it. And in that area, an AI is way better than nothing. It'll provide some accountability for local government.
Annanda: What would seem most urgent, if there is a sense of urgency?
Bruce: Urgent is this election. On a scale of 1 to 10, I'm pretty concerned about democracy in the United States. I see some very strong authoritarian tendencies. I see a lot of anti-democratic ways of thinking. I see a lot of people more concerned with results than with the process. Now, the annoying thing about democracy is you've got to accept it even if the vote doesn't go your way, right?
Democracy is really about not picking the winner, but convincing the loser to be okay with having lost. And that sometimes means your side loses. And if you're not okay with that, then you're not really for democracy. I see a lot of people in this country not okay with that, that think the ends justify the means.
And I don't just mean their side. I mean us too, a little bit. But I do see some very strong anti-democratic ways of thinking, and that gives me [00:14:00] pause. Turns out a lot of democracy is based on the way we do things rather than on laws. It's sort of just the way we do it, and it turns out if someone just breaks all the norms, there's not a lot that can stop them.
We learned that, and that was a surprise to many of us. So, I don't know. I like to think we are resilient. We were very resilient in 2020, and we've been resilient since then, despite the rhetoric. But overall, there are still a lot of places where democracy is not working.
Keisha: We recently spoke to Professor Alondra Nelson, formerly deputy director of the Office of Science and Technology Policy. She said that over a third of the planet is expected to have elections this year, and you've said you thought democracy globally is kind of under siege. Can you share what impact you think AI might have on some of these elections, and on how we think about government by the people, globally?
Bruce: It's an interesting statistic, and I've heard [00:15:00] different numbers: a third, I've heard 40 percent, I've heard over half. It kind of depends how you count.
Keisha: Mm-hmm.
Bruce: But it is the United States and the EU, and it's India. Australia will be early next year, the UK early next year. I mean, these are large countries with strong democratic traditions. And we are worried now about AI and its ability to create fakes. This is a very near-term AI risk. This is not AI subjugating humanity. This is not AI taking our jobs. This is AI making better memes.
Keisha: Mm-hmm.
Bruce: Now, the fear is that AI will be used to create fake audio and fake video.
There was a fake audio in Slovakia's election a couple of months ago that came out a week before the actual vote and might have had an effect. We don't know. So we [00:16:00] worry that what the Russians did in 2016 with the Internet Research Agency, creating fake ads on Facebook and fake memes on Twitter, can be replicated at speed, at scale, and at a much lower cost.
SOUND: shift to historical cue
Annanda: What Bruce is referring to is the Russian government's interference in the U.S. 2016 election through a coordinated campaign that included hacking Democratic Party emails and disseminating misinformation across social media platforms. This interference aimed to sow discord, undermine public trust in the democratic process, and skew public opinion in favor of certain political outcomes.
SOUND: end historical cue
Keisha: I read an article that you wrote for the Ash Center back in November about the four dimensions of change.
You said speed, scale, scope, and sophistication in terms of how AI might develop. Can you give us some concrete examples of what you mean by scale versus scope so that people might understand?
Bruce: Yeah, so let's take misinformation as an example: why misinformation might be worse. Speed is one reason, right?
AIs can make memes and fake facts and write tweets and Facebook posts so much faster than humans can. They can operate at a speed that will rival humans. They can do it at scale, right? They can make not just one post, but thousands, millions. You can imagine millions of [00:17:00] fake accounts, each tweeting once, instead of one account tweeting a million times.
That's scale. Scope is how broad it is: on Facebook, on Twitter, in different languages, optimized for different audiences. And then sophistication: they might be able to figure out more sophisticated propaganda strategies than humans can. And to me, that's what I look at when I look at these technologies, specifically when those changes in degree make changes in kind, when it's not just faster but something else.
Bruce: So it's not just the Russian government with a hundred-plus people in a building in St. Petersburg. It's everybody. And the worry is that the noise [00:18:00] will drown out the signal. Now, it's not just AI. Blaming that on AI, I think, is too easy by half. That's on a lot of us, who value the horse race and who's ahead in the polls.
We're having this interview the day after President Biden's State of the Union address, and everything I'm reading is how good it was, how bad it was, did he make gaffes, did he sound good. Not a lot about substance, and that's on us. That is the kind of political reporting we like, and that kind of reporting plays into memes, which plays into AI's strengths.
So the worry is that AI will make fakes. The fakes will be disseminated. And more importantly, there's something called the liar's dividend: if there are so many fakes out there, then when something real happens, you can claim it's a fake. And we have seen this in the United States, right? Trump has said about some audio that it was probably a [00:19:00] deepfake when it actually wasn't. But if deepfakes are everywhere, you can now claim whatever is real is fake.
So I don't know if that will have an effect. We don't need AI to create memes to denigrate the other side. There's a video of Nancy Pelosi that was slowed down to make her look drunk. That wasn't a deepfake; that was slowing down a video. Any 12-year-old can do that with tools on their phone.
But it was something that was passed around. So I don't think the deepfakes are going to be that big a thing. I do worry about them close to the election when they can't be debunked. And I worry about that ability to claim something real as fake.
Annanda: In what ways do you see the average voter being able to, not defend themselves, but keep their guard up? A sense of wisdom around how to navigate how technology is being used in the political [00:20:00] process?
Bruce: It's hard. I'm not sure I know an average voter. You might not either. The people we know are hyper-aware, maybe hyper-political. When I read about the average voter, it's not the kind of people I meet at Harvard, which is very elite. The average voter, near as I can tell, doesn't get a lot of information.
Keisha: Mm hmm.
Bruce: You know, not only does telling them how to be on their guard not help, they don't know they have to be on their guard. They probably don't even know what being on their guard means. It's sort of interesting. I think about the idea that we're going to put labels on memes, whether they're true or not.
To me, that doesn't make a lot of sense, because people who would believe the labels don't need the labels, and the people who need the labels aren't going to believe them if they're there. And I don't want to live in a world where the [00:21:00] average voter needs to be on their guard. That seems like a really not-fun place.
I want a world where the average voter is safe, where the average voter gets information they can use to make an intelligent decision about which candidate best represents their views, and can cast their vote without any undue pressure or intimidation or long lines or anything. That's the world I want.
A world where you don't have to be a political junkie in order to have a political voice. Where you could be someone who is just living their life in a democracy, and you still get your ability to have your say. So I don't know what to say to the average voter. I guess: I'm sorry. I want it to be better.
Keisha: Bruce, you've written about how AI can shape campaign advertising, [00:22:00] communicate with voters, write legislation, distribute propaganda, submit comments.
Can you talk about how AI is showing up in the practice of democracy and campaigns, lawmaking, and regulation?
Bruce: It depends how we do it. We've already had examples of rulemaking processes where hundreds of thousands, millions of comments were submitted not by an AI but by a machine, right? By a computer. It wasn't even clever. It was just multiple comments submitted by fake people. So already it's pretty bad. AI as assistive tech, if we do it right, increases democracy.
So if an AI can make you more articulate to your Congressperson, I think that's great. If the AI denies you a voice, that's bad. I think about AIs being used to ease administrative burdens, like helping people fill out government forms. Now, we can do that, and that is possible, [00:23:00] and that will engage people in democracy.
That will help them get the assistance they are legally entitled to. You could also imagine AIs being used to increase the division between the haves and the have-nots. And this is the problem with tech, right? The old saying: tech is neither good nor bad, nor is it neutral.
But we can design tech to favor democracy, or we can design tech to favor the powerful. And these AI models right now are very expensive to create, and relatively free to use if you're not going to use the best model. The tech monopolies are right now giving them away. But my guess is that's temporary, and they will become cheaper and more available, and people will have personal AI assistants.
Now, that can be an incredible boon for equality, right? To give you an advocate where you [00:24:00] couldn't have had one before. And here I'm thinking about people for whom going to a courthouse or a government building is a burden. They have a job, they have a family, it's not easy to get around. It's always easy to say "go do this" if you're a middle-class white guy.
And it's harder the more you deviate from that kind of zero level of ease. AI can make this better. It might not, right? It really depends. But we do have an opportunity here. So I like talking about the benefits of AI for democracy, because that gives us a chance of having them.
Democracy is an aspiration. It's always an aspiration, and our goal is to make it better.
SOUND: break cue
Annanda: After the break, we'll hear more about what Bruce has to say about making AI a tool for people to have hope, ease, and access to the democratic process. This is Moral Repair: A Black Exploration of Tech.
[BREAK]
[00:25:00] Segment B
Annanda: And we're back, talking with the cybersecurity expert for the people, Bruce Schneier, to hear what he has to say about AI and the democratic process.
SOUND: interview cue
Keisha: What excites and/or terrifies you about technology? What big questions do you still have about the future of tech in politics?
Bruce: I think a lot about power. When I think about tech, the element that to me is most important and exciting and disturbing is how it relates to power.
And whether power can co-opt it, whether it can be used to dismantle power and disaggregate power and distribute power, or whether it will consolidate power. We've been through these cycles: when Facebook first appeared, we all thought it would be democratizing, right?
That it would give voices to people who didn't have voices. And it did, for a while. And now it helps the powerful consolidate their power. And what comes [00:26:00] next will first be democratizing, and then someone will figure out how to consolidate. I think AI is going to be that way. I think about all the ways AI will democratize, will distribute power, will give power to the powerless. But the powerful will try to figure out how to centralize that power again. That's what power does. So that is what I think about most. I really think about how these technologies interact with power. How power can use them. How they can be used against power. And by that I don't mean revolution.
Things like WhatsApp breaking the monopoly on phone message charges, right? That kind of thing, very low level. The way these technologies can make it so people can organize and find themselves and find their tribes. One of the wonders of the internet is that you can find people like you, no matter how weird you are.
And that's wonderful. [00:27:00] On the other hand, conspiracy theorists find each other and now suddenly it's their reality. It's always the good with the bad.
Annanda: How can AI be used to support the democratic process?
Bruce: Let's think of the things I've seen. There is a public charity that has created an AI to help people run for local office. Now, these are nonpartisan offices, right? These are sheriff, or person in charge of sanitation, or dogcatcher.
I mean, very local government jobs that are not party-affiliated. And this AI helps regular people fill out the paperwork, have a website, get the signatures, do all the things necessary to run. That seems fantastic. That will increase engagement in local government around the country. That feels like a great thing.
I've seen AIs that are helping people navigate the legal process. There's one site called donotpay.com, which will help you get out of parking tickets that you don't deserve to pay. Again, very egalitarian. [00:28:00] You can imagine something that will help people in housing court or immigration court, all of these legal processes that have a lot of paperwork, a lot of bureaucracy, where you have to pay a multi-hundred-dollar-an-hour lawyer to help you navigate.
And if you can't afford that, you don't get a lot of representation. So that seems really important and really powerful. There are groups that are using AI to help engage voters, to get beyond the parties, with their big party databases of voters that they control access to as a way to limit primary challenges.
That feels more democratic. These are all very powerful things. I think about AI assisting in polling, in devising campaign strategies. Here again, we can imagine haves and have-nots. We can imagine people who are not in the normal party [00:29:00] machine being able to use them to run for office.
These all seem like really good things. And some of this is science fiction, right? This isn't all reality yet, but none of it is stupid science fiction; it's science fiction that in a couple of years is likely to be true. My question is: how do we build an AI that isn't just going to be beholden to corporate interests?
One that can do some of these good things for society, for democracy, and not just do what's good for the Silicon Valley tech billionaires.
SOUND: break cue
Keisha: I wonder about the theme of participation.
I was at a concert recently, Black gospel music mashed up with a local symphony. And I knew going in, the [00:30:00] minute they said this gospel artist is coming, I knew it was going to be church folks there, a completely different demographic than what usually participates in this cultural space.
Annanda: Yeah.
Keisha: The usual rules about waiting until the end to applaud, or not singing along, weren't going to apply, because the Black cultural space is participatory, right? The audience is as much a part of the performance as the person on the stage, and the person on the stage welcomes that.
It's not an imposition. It's a conversation between them, and it creates something that neither of them could create on their own. And it was amazing and beautiful and enlivening. And I think that gave me a taste of what governance of the people, by the people, and for the people can be: a richly participatory experience.
Not something that is gatekept by the people who had early access to the stage. And not [00:31:00] something that is exclusionary, where you're just spectating a performance that's happening, like at a campaign event, and your only role is to co-sign the thing that is happening in the stadium, or to cast one vote one time every two to four years.
But like, what's the quality of democracy in between those stage moments and how does even the stage moment welcome all of the richness of the so-called audience to create something new?
Annanda: The wisdom of call and response.
Keisha: Yes, exactly. Call and response, and a deeply robust inclusion.
Annanda: Mm hmm. That creates a co-facilitation of the performance, or of why we're gathered in the first place. Yes. Well, that would be a democracy.
Keisha: It would be. And I think that's part of what addresses [00:32:00] what could be seen as a threat from AI, of crowding out people. I think that's what would mitigate the risk: making sure that the system itself not only brings people in at select points but really centers the contribution of people. Then it doesn't really matter what the machines do, or the instrumental ends to which we put the machines.
Because you're crafting the system and the structure and the policy and process around what humans can do.
Annanda: I hear that. What comes up for me, as it relates to what Africana wisdom has to offer, is the importance of rhetoric.
Keisha: Mmm
Annanda: In several of Toni Morrison's essays, she speaks about the importance of language, and the use of it, as a technological tool. As we're talking about AI and the democratic [00:33:00] process: how has our language shifted when we talk about democracy?
How do we use it to make or break worlds? We shape worlds through language.
Do we even value that rhetoric, and what that rhetoric then creates or instigates or obligates the listener to?
Keisha: Is there an example that you have that's coming to mind?
Annanda: I think a great example is like book banning, right?
We're banning different thought, different cultural perspectives. So for her, the use of rhetoric, and in particular the word, is not an imposition but an understanding that the arts and the humanities really stand alongside technology. And she does this in a speech in 1991, where she's already seeing the trajectory of everything that is happening today.
If you look back, you'd be like, oh, Toni Morrison is a prophet. And it's like, no, she just saw the writing on the wall and asked, will it continue? And even in her speech she says, I've dramatized and exaggerated these things to prove a point. What I find eerie is that we are actually at the point of her dramatization; what she names as dramatized is actually here.
And so the work, the technology of the humanities, of engaging with cultures and people of differences, is vitally important, because you get a diversified understanding of the human experience, that somebody is not you. Yep. And that's okay, and actually vital, that somebody sees the world differently, thinks differently. It is actually an insult to knowledge, an insult to technological innovation, to consider one understanding supreme. But what does it look like to actually delve into the complexity, that we are always already dealing with multiple identities, multiple ways of seeing, and multiple ways of knowing? And what is the [00:35:00] art of negotiating those multiple ways so that we can be a better society, a better democracy, together? That is the gift that the arts give.
So, yeah, the technology of rhetoric and the humanities, from the wisdom of Toni Morrison, is what comes up for me.
Toni Morrison: No one should convince you that listening to authors and concerts and going to exhibits is secondary. Whatever your profession is, this may be the real work you are doing. Creating the juncture where artists, scholars, and the public intervene to create and facilitate genuine intellectual development.
To facilitate communities successfully to alter socially unhealthy situations in traditional and non traditional ways. To nurture the splendid consequences of fed up representatives and citizenry who no longer are waiting for the peace dividend or the campaign promises.
I wanted to call your attention to how much is needed now from you, and how vital it is now, with no more time to lose, that we become as innovative as possible if we don't want to continue to relegate dying and loving and giving and creating to pop bestsellers and greeting cards. I believe, as you do, that there are distinctions to be made and kept among data, knowledge, and wisdom. That there should be some progression among them. That data helps feed and nourish knowledge. That knowledge is the staging ground for wisdom. And that wisdom is not merely what works or what succeeds, nor is it a final authority.
Whatever it is, it will always be a search.
SOUND: Closing theme starts
CALL TO ACTION
Annanda: We’re building community with listeners this season. So reach out and talk to us on Instagram—our handle is @moralrepairpodcast. Also catch us on X… formerly Twitter.
We’d love to hear from you!
Keisha: Follow the show on all major audio platforms—Apple, Spotify, Audible, RSS—wherever you like to listen to podcasts. And please help others find Moral Repair by sharing new episodes and leaving us a review.
S2 CREDITS SEGMENT
Annanda: I’m Annanda Barclay.
Keisha: And I’m Keisha McKenzie.
Annanda: The Moral Repair: A Black Exploration of Tech team is us, Ari Daniel, Emmanuel Desarme, Rye Dorsey, Courtney Fleurantin, and Genevieve Sponsler. The executive producer of PRX Productions is Jocelyn Gonzales.
Keisha: Original music is by Jim Cooper and Infomercial USA. And original cover art is by Randy Price. Our two-time Ambie-nominated podcast is part of the Big Questions Project at PRX Productions, which is supported by the John Templeton Foundation.
[PRX SIGNATURE]
SHOW NOTES
The whole cup of tea: The Guardian Interview with Christopher Wylie
They are still going y’all! Chicago Festival for the Humanities
Check Out Bruce’s Corner of the Internet! Bruce Schneier
Sabbaths and Light Phones: Technologies for Rest
Season 2 | Episode 3
Always-on technology has amplified distraction, overwhelm, disconnection, and cultural polarization. What ancient and emerging tools can help us put tech in its rightful place? Keisha and Annanda talk to Judith Shulevitz and Joe Hollier about what we can learn from Sabbath traditions about community, connection, and thriving.
-
S2 E3: Sabbaths and Light Phones: Technologies for Rest
(Taking a Break from Tech)
Lead Guests: Judith Shulevitz and Joe Hollier
EPISODE BIG QUESTION: What ancient and emerging tools can help us put tech in its rightful place? What can we learn from Sabbath traditions about community, connection, and thriving?
DESCRIPTION: Always-on technology has amplified distraction, overwhelm, disconnection, and cultural polarization. What ancient and emerging tools can help us put tech in its rightful place? Keisha and Annanda talk to Judith Shulevitz and Joe Hollier about what we can learn from Sabbath traditions about community, connection, and thriving.
Keisha: Hi Annanda!
Annanda: Hey Keisha!
Keisha: I was on YouTube the other day and saw a TED Talk from Manoush Zomorodi. She’s a public radio podcaster at NPR.
Seven years ago, she started noticing how much of her free time and brain space was taken up by her phone. Check this out:
AUDIO: [2:17-2:42] “Now all the cracks in my day were filled with phone time. I checked the headlines while I waited for my latte. I updated my calendar while I was sitting on the couch. Texting turned every spare moment into a chance to show to my coworkers and my dear husband what a responsive person I was, or at least it was a chance to find another perfect couch for my page on Pinterest. I realized that I was never bored.”
The title of her talk? "How Boredom Can Lead to Your Most Brilliant Ideas." Zomorodi invited her listeners to experiment with her for one week: to keep their phones in their pockets when they went on walks, not to take pictures of their food at dinner, and to delete the games and other apps that sucked up hours of their attention.
How do you think you’d do with a challenge like that?
Annanda: You know, I think at this point in time, I would struggle not because I'm addicted to my phone, but because with my end of life planning business, I do most of my outreach via social media now because that's where the folks are. So yeah, I would be like, no, no, I need clients.
Keisha: You need to be connected?
Annanda: I tend to be with rebellious people in terms of tech. We're all kinda outdoorsy crunchy and so we're off in the woods anyway where you're not gonna get signal *laughs*. Like, you know? Or just out and about, but yeah, I have recently begun taking more photos… for the sake of memory.
Keisha: Mmm… mmm.
Annanda: Nine times out of ten I'm disappointed, because I'm like, "it's not capturing it," and then I remind myself, no, it's actually not supposed to hold all of this moment for me; it is to be a reminder of the time that was had.
Keisha: Right. I also love those dinner parties where people might pass the plate and everyone puts their cell phone in there, and nothing moves until all the cell phones are off the table and then we can actually have a chat.
Annanda: Interesting. Oh my gosh, I've never been to a cell-phone-controlled dinner party.
Keisha: *laughs* Because it is kind of out of control.
SOUND: title music starts
Annanda: This is Moral Repair: A Black Exploration of Tech. A show where we explore the social and moral impacts of tech… and share wisdom from Africana culture for repairing and caring for what tech has broken.
Keisha: Today we ask the big question: How can we put tech in its rightful place? What could we learn from an ancient tradition—Sabbath—and an emerging tool—the Light Phone—about community, connection, and thriving?
SOUND: End title music
Keisha: Good morning, Judith. It's a pleasure to have you.
Judith: Thank you so much, Keisha. It's really a pleasure to be here.
Keisha: Judith Shulevitz is a culture critic and the author of The Sabbath World: Glimpses of a Different Order of Time. The book is about tech, overwork, and ways we can pull ourselves and our culture back. I wanted to know what place mobile technology plays in her life.
Keisha: Technology has accelerated how we produce objects and knowledge since the wheel, the yoke, the printing press, the factory. How are you seeing web and mobile technologies influencing how you and the people around you work and rest?
Judith: It’s changed so much it’s hard to know where to begin. So much of culture has migrated online that my job as a cultural critic is far more sedentary than it used to be, which I don't particularly like or approve of.
Deadlines have speeded up, of course, with web magazines. I worked on one of the earliest webzines, Slate, when it started up, and we posted two or three stories a day, which just seemed like an enormous amount at the time. That was more than 20 years ago.
I now work for The Atlantic. Posts? I don't know how many a day, you know, probably 25. So I'm writing on a faster schedule, and I'm consuming more online, watching movies on my computer rather than at screenings.
I'm mostly a book critic nowadays, and all the books I get sent are PDFs. That is a radically different experience, reading a book as a PDF versus reading it in print. My daughter's boyfriend said to me recently, why do you read the New York Times on paper? That's such a good question. I read it much more thoroughly. I go from the beginning to the end. Do I read every article? No, but I have a map in my head of where all the articles are. And that often signals to me what they think is important.
Now this is all just, you know, legacy technology talking, and my nostalgia for it, but that's one way it's changed, on both the production end and the consumption end. The other thing I really miss, as I said, is going to the screenings and being with other people. I just miss the part where you go out and you pound the pavement and you ingest culture: you go to a lot of movies and a lot of theater and a lot of art shows and poetry happenings or whatever it is.
It feels like there's a lot less of that.
Keisha: On the one hand you're talking about the immersion opportunities of being out in the field, but you're also saying that there's an over-immersion from the internet and all the stuff you have to absorb.
Judith: You're completely right about that. The amount of things I have to consume from one black oblong all day long creates a funny sort of dependency. You know, I feel very odd when I'm not on my computer.
I feel disconnected. I feel that the real world isn't as real as the computer is to me, because it's so much more stimulating. And moving between the two kinds of spaces is surprisingly hard.
That's the big problem. If you are nostalgic when you get off your computer for life on the computer, you are going to be absent rather than present in the space you occupy in the relationships that are around you.
Keisha: Judith’s not alone in noticing the difference tech can make in our work, lives, and relationships. A Brigham Young research study from 2015 used the term “technoference” to describe the rejection some couples feel when one partner prioritizes using their TV, computer, or phone over interacting with their loved one, even when they’re in the same room as each other. Therapist Esther Perel calls it “artificial intimacy.”
Annanda: So we reached out to Joe Hollier, artist, entrepreneur, and inventor…
A man who loves the color yellow because the richness can't be picked up by our screens…
And a man who also believes that beauty in the world matters and our relationship to one another in that beauty is vital.
Keisha: And after the break, we’ll learn all about how Joe answered the overwhelming demands of tech—with a device he calls the Light Phone.
Audio: break music
SEGMENT B: Background
Audio: break music
Keisha: This is Moral Repair: A Black Exploration of Tech.
I was in church the other day and thought I felt a phone buzz.
Audio - background: buzz… buzz.. Soft ring. Maybe 2-3 rings. Over this next paragraph
It was wild how many people around me also started patting their pockets and checking their bags… Like all of us were on call!
I’ve had my ringer off since the early 2000s, but I’ve felt phantom buzzes before. It’s not a real notification; it’s just your brain inventing an interruption and filling in “dings” and “pings,” because if you use a phone the way it’s designed, it will always ding and ping.
Annanda: You know, Keisha, I’m not gon’ lie. I love a good ding, and I love a good ping, and if I'm not trying to hear it, *laughs* If I'm not trying to hear it, it will be on silent. I've used those dings and pings as a tool for my ADHD.
Loud and proud, I am an ADHD human being, and those dings and pings help me to help you and everybody else I'm working with, to be like, “Homie, you need to get on this next task on your to-do list.” And it's also like, “Oh, oh! A ding and a ping!” Thank you, phone, for seeing me, so that way I can do what I need to do.
Keisha: So it satisfies your brain. Do you notice it changing anything about your interactions with others?
Annanda: Honestly, no. But I think it's because I have been so social media averse, really until this podcast, that I have now actually developed a discipline. It's a habit at this point: the ding and the ping is here for a reason, because I want to interact or because I want to be notified, right?
So if I know I'm going to get a call or a text from somebody, I make sure that I can get the ding and the ping.
It's made me more efficient in time, but that's because I've consciously made it a point to use it that way.
Keisha: Our friend Joe Hollier saw how these phones were changing the way he lived his life… so he and his cofounder Kaiwei Tang made a different kind of phone: low tech, not high tech. The Light Phone just calls and texts. You can use it as a hotspot, but there are no apps, no news, no social media, no web browser. The tagline for the phone is “designed to be used as little as possible.”
We asked Joe about it.
Annanda: How did you, as an artist, get into designing what the internet commonly calls a dumb phone? Though I actually think it's a very smart, intentional phone.
Joe: Just looking in 2014 at the kind of problems I was seeing with smartphones: our overwhelming attachment to them, the way that it kind of diluted and distracted so many of these amazing moments that I cherished as an artist, whether being in the studio or something as simple as taking a walk, a road trip, going skateboarding, being lost in a concert, you know, present.
I really just saw it as an opportunity as an artist to make a statement that hopefully would resonate with everyone around reminding ourselves that we weren't always so connected to the internet. We didn't always bring it with us to the bathroom or have it next to our bed when we wake up or in line at the grocery store.
And that, you know, sometimes being bored or present in these moments can actually lead to creative thoughts. Maybe not ditching the internet for good, but at least carving out very intentional internet-free spaces was super important to me.
Annanda: Yeah. Yeah. I see that as a clear value. And in what ways do you think the technology that you have been a part of with your co-founder really addresses our attention economy?
Joe: The value is probably a little bit subjective to each person that might try what we call going light. But the really beautiful thing is that we don't really tell you what to do. We don't feel like that's our role. Going light for me might mean quality time with my cat or practicing piano, but it could be drastically different for so many different users.
And I think it's inspiring me as an artist, just how wide of a demographic the phone has been able to reach. Like we have Bible Belt families that see this as family values. Like, I don't want [that] my kids’ memories of me are looking up at me on my smartphone.
Or from the other angle, I don't want my kid to feel vulnerable to the temptations of social media, the internet at large. One value that Kai and I try to embrace in all of the designs that we do is intentionality.
Whether you want a smartphone and to be fully connected at times, or to be in the kind of light space that we champion, I think it's just about being intentional and not finding yourself scrolling away when you didn't really want to do that, that afternoon.
We have the agency and the power to take a little bit more control of our digital lives.
Annanda: You wrote an open letter in the New York Times, inviting Tim Cook, CEO of Apple, to a conversation on the phone's role in society and the importance of finding a balance with smartphone interaction.
If Tim was in front of you, what is the conversation that you wanna say?
Joe: We wanted to point out the irony of these statements that he had been making, in relation to just how much we were abusing the phone.
He said something like, “If you're getting 10,000 notifications a day, you're doing the wrong thing.” Yet, most smartphone users were getting 10,000 notifications a day. And you know, “If you're looking at your phone instead of someone's eyes, you're doing the wrong thing.” And so these things we completely agreed with, yet what we saw was the smartphone getting bigger and more encompassing, more data on users, more apps fighting more and more aggressively for a time.
So there was kind of this discrepancy between what you're saying and what you're doing. And we just thought it'd be funny to point that out in a loving way. We were as curious as anyone what his reaction to a phone like the Light Phone would be.
Annanda: Yeah. Any word?
Joe: No, we never got any word from Tim Cook, but I think that was, uh, expected.
Annanda: Yeah, that makes sense.
Keisha: How about others? Did you get feedback from other readers or others in the industry?
Joe: Yeah, definitely. It helped connect some dots for people. We're not anti-technology; we're just pointing out something so obvious that even the CEO of Apple, the biggest smartphone maker, is saying it.
Annanda: What's the legacy that you want this Light Phone to have? When Joe Hollier has spent his time on this planet, what's the legacy that you want this particular project— what's the gift that you want people to have experienced?
Joe: I'm not one to think too much of legacy or even needing to have my name attached to it in any way. I like, unlike art, that this project has allowed me to be more visible but I think what moves me the most about it is when we get emails from users who have had serious life changes since using the Light Phone.
And obviously I try to remind them, like they changed their own life. The Light Phone was just this catalyst that maybe set them on a path, but their persistence and hard work and not giving up when they were bored or dealing with, you know, emotions and not distracting themselves, that's what led to this positive change. It's having a tangible effect on people. Thousands of people have tried our phone and had some sort of positive experience. Like that just really moves me.
I'd love to imagine what those users could do and invent and create that's so much bigger than the Light Phone addressing other problems now that they aren't so glued to their phone. I just recently did a talk at Stanford and it was less about trying to sell the students on the idea of using a Light Phone and more trying to say like, look, if me with no technological experience was able to do this, you guys who are really, really sharp, what can you do? They were talking about the existential threat of like, not wanting to work for Big Tech or, potentially, military industrial complex. You know, those are sort of the options for a lot of these really sharp students.
And I was like, you can also invent a different path and it's less guaranteed, but it's possible.
Annanda: Joe, thank you so much for your time and thank you for your role in the Light Phone and disrupting this industry in a way that helps us return to ourselves and return to that, which provides us meaning and substance and taking notice of the world around us, not just the digital realm.
Joe: Thanks so much for hearing us out. We really do appreciate it and I thoroughly enjoyed myself. So my pleasure.
Keisha: When we come back, we’ll talk about another innovation—much more ancient than the smartphone—that could give us a different lens on tech, community, and how we can thrive together.
SEGMENT C: Solutions/Practices
Annanda: Welcome back to Moral Repair. We’ve been talking with Joe Hollier, cofounder of the Light Phone.
But… what if you don’t have a Light Phone? ’Cause, let’s be real, most of us don’t.
So what else could help us heal from the overwhelming culture that tech has made?
Keisha: We talked about tech culture and healing cultural resources with Judith Shulevitz, who you heard from earlier in this episode. Judith’s the author of The Sabbath World, a reflection on the ancient tradition of Shabbat or Sabbath and ways that we can rest.
Judith: Rest is not a thing you do by yourself. It is a thing you do within the context of a community that becomes more real to you when you're not working.
Keisha: Mm hmm.
Judith: And if you're not present in the moments when you stop working, you're not resting. You're just nostalgic for working.
Keisha: First, can you share your definition of Sabbath? What is Sabbath?
Judith: There's so many ways to answer that. It's what I call one of the great ideas of civilization, which is this idea that rest should be regular and for everyone. It was a democratic revolution in the sense that it extended the idea of rest to all the people, including animals who are not the people—everyone, everyone who worked. And even those who didn't work. It created this rhythm that had never existed before, which is this idea of one day in seven, the Shabbat. You know, there's different theories of the origin of the word, but one cognate is seven. This is one day in seven.
The Hebrew week is just day one, day two, day three, day four, day five, day six, Shabbat. You know, it's the only day of the week that has a name. Um, so it is this great idea. That's my first definition. My second definition would be obviously, um, it's a religious institution, which, you know, expanded from Judaism to Christianity, is practiced to a certain degree in the Muslim world on Fridays rather than on Saturday and Sunday.
So it has a lot of meanings bound up with whatever religion it sort of comes to you from that are specific to that religion. It's a religious institution and I also say it's a sort of important social institution in the sense that it is, I believe, the basis of a lot of social solidarity and civic solidarity.
Keisha: We've defined an algorithm as a recipe, like an instruction for using specific inputs and processes to get a specific outcome. How is Sabbath like an algorithm?
Judith: Wow, that's such a good analogy. One I wish I had written about in my book. Oh my God. I mean, it literally is an algorithm, right? It's six plus one, six plus one, six plus one, six plus one. It's a very steady and boring algorithm. The very regularity of it is its great strength.
And from that very basic backbeat, you generate all kinds of effects. I argue that the idea of having the leisure to read and to study, setting aside time for that, actually kind of comes from the Sabbath.
There's a reason that the Jews are called the people of the book. They read the book on the seventh day. And I think that spirals out into a lot of culture. Other effects are this idea of deep civic engagement, deep immersion in your community, because when you stop on the seventh day to rest, then you become much more integrated into your community.
Um, there's just, there's so many outputs that come from this algorithm.
Keisha: You were answering in the context of your book. What's the main thrust of that argument in The Sabbath World?
Judith: You cannot rest alone. You must rest collectively. When I go out and talk about it and, uh, find that people are trying out different days for their Sabbath, that they think they can have a Sabbath by themselves, I feel that that is the fundamental misunderstanding that needs correction in this world: this idea that you can just take Monday afternoon off and think you're getting a real restorative rest.
Annanda: One of the books I read in seminary which moved me to pieces was To Mend the World by Emil Fackenheim. I think there's something really beautiful and special about the history of what the Jewish people have been through and yet the consistency of Sabbath as a spiritual technology.
Judith: I would call it a sort of counter-technology, right? It's a technology to push against the sort of commercial uses to which most technology in our lives is being put. And of course, it's a non-tangible technology. It's a technology in time rather than in space.
There's just much more hunger for it than there was a decade ago. As fast as everything seemed in 2010, things seem a lot faster now; as online as everything seemed then, we're vastly more online since COVID. Another thing that has changed is the desynchronization of social time, by which I mean that just-in-time scheduling cascades throughout society. In terms of work schedules, you are at the mercy of your employer as to what days and what time of day you're going to work; schedules are so desynchronized, so uncalibrated, that you have to use a Doodle just to arrange dinner with a bunch of friends.
Just the idea of the machines never resting and you're always having to feed them. Things have speeded up and come apart and people are responding to that more.
I think people are tired of being on constantly, on both ends of the economy: on the salaried end and on the wage-earning end, you have either extremely long hours or extremely unpredictable hours.
Keisha: When the pandemic hit, I thought, Oh, what a boon to be able to break up my workday by taking the dog for a walk. And then the workday became its own monster.
Judith: And that has only gotten worse since I first started writing about this problem over the course of the past decade. I just think people are exhausted either from working too hard or working too erratically.
Annanda: And yet, what you're talking about, Judith, is a quiet place for people to go, no matter what is going on in their lives. And to me, that is a very moral reparative practice, because it's not denying what is going on in the world, but it is I think in a beautiful sense defiance and care in community, right?
Judith: I think the lesson of the Jewish Sabbath is this idea of boundaries.
The analogy I make is, I don't know if either of you or your listeners are in psychoanalysis or in therapy, but you know, you have that 45 or 50 minute block and it's bounded and it's kind of sacred in the sense that your therapist is not going to let you go beyond.
Audio: Could use a few lines of the Lehka Dodi or something similar in the background?
And it's a little weird in today's ideas of time to have this hard and fast boundary, but the Jewish Sabbath really makes it clear that that's one of the most important aspects of it… that 18 minutes before the sun sets, you light your candles, it's Shabbat, you wait till there are three stars in the sky at the end, you light another candle, you say a prayer that thanks God, for creating divisions between light and darkness, sacred and mundane.
And then off you go into your work week. And in between those times, there's all kinds of things you can't do, which a lot of people don't want to do, and I totally get that. The effect that they have is to make the world a different place in which you can rest, but it can be that way because it's bounded.
Annanda: As Judith was talking about Sabbath and the cultural importance for her as a Jewish woman, and the benefits of that wisdom and practice within Jewish culture for us all, what comes to mind for me is the ways Black folk, Africana folks…
SOUND: Transition to something like Maze’s Golden Time of Day here, as background for this section.
… have places of respite, of cookouts, you know? My birthday's on the 4th of July, so when I celebrate, I like to have a cookout. We're just all sitting back, chill, laid back.
Not everybody is invited to the cookout, right? Like, you only invite people who you know will curate the space that you can have a sense of rest.
I can have a Sabbath from some of the harmful ways of the world, some of the stereotypes of what it means to be Black. It is a space where I can be unapologetic and have a rest from the shenanigans. I love the idea of being able to rest and to be, and to connect with one another.
Cookout is a thing. It's a thing in so many, not just Black, traditions. Asian sibs, I know y'all be eating—
Keisha: I'mma pull in our Irish cousins too.
Annanda: Latino culture too. Latiné culture. I'm loving it. Good food, talking smack. Intentionally intergenerational, yes. It's a rest from the capitalist ways of the world. Time is suspended in a really beautiful way.
Keisha: Mmm. I learned about Juneteenth from the time that I was in Texas, and that was one of those spaces where I saw the entire community come together: The church folks and the mechanic yard folks and the stoop folks and all the different kinds of Black folks would come assemble in the park downtown.
And it was just like a space of *breathes out* exhale, you know, like You hold your breath going through life sometimes when the environment around you is oppressive and hostile, and to have those spaces in place and time where you can just *sighs* breathe.
I get some of that experience when I'm hanging out with my cousins, with their many, many children, celebrating graduations or birthdays or baptisms or whatever.
I know I'm going to eat really well. I'm gonna take some food home for the next day. *laughs*
Annanda: Everybody knows that one auntie… Auntie, if you're listening, you know who you are. You are a legend. Bringing the tupperware!
Keisha: Your tummy's getting fed and your soul's getting fed and, and that's the beauty of this experience for me. So let's hear a little bit more from Judith.
Keisha: How have you noticed marginalized communities or social groups today using rest as an innovative technology?
Judith: I'm very impressed with the American Black church, with its long history of Sunday dinners. The Black church held on to Sunday dinner a lot longer than white folk did.
Keisha: Mm hmm…
Judith: When I would go and, um, talk to synagogues who wanted to bring Shabbat back into their lives, I would say, start with Shabbos dinner on Friday night, because the most important thing is to “break bread.”
Keisha: Mm hmm.
Judith: A Christian phrase, but nonetheless, the most important thing to build a community is to break bread together. That's something that I think the modern world has forgotten. Cooking and food are very related to rest.
Keisha: You're actually making me feel a little nostalgic. When I was a kid, one of my treasured parts of Sabbath practice was the mac and cheese my mother made the day before, and if we invited people over, that would be the centerpiece of the Sabbath lunch.
Judith: Yeah, we have the same thing in Jewish culture. We have this stuff called cholent. It's a stew, and you throw in a lot of different things: beef bones, beef, lots of vegetables, beans. Beans are very important. Some people cook eggs in it. Now, this is traditionally made on Friday afternoon and then put in a slow oven and cooked basically till lunch, because you can't light a fire on Shabbat. It has to be lit before.
However, if you keep a fire going on a very low heat you can cook food. So this is cooked until lunchtime. It's really yummy.
Um, to the degree that you have cultures built around making food and eating food together, whether it's in a family or in a group of friends, people are creating their own spaces for something that is more than not-work, right?
It's not work, it's something pleasurable and rewarding also.
Annanda: Because one does not Sabbath alone… In community, you're saying there's a part of me that will rest from this, that will make a choice to engage differently.
Judith: I wrote my book on the Sabbath specifically so that people would appropriate it. I am so in favor of appropriation. It's a good idea. It should be adapted to the best of one's ability and in a culturally appropriate context. Meaning is made within the context of the community.
People who want to “Sabbath,” as it were, start with food. Start with gathering people together who are important to you. If you're single, if you feel alone, find a place. Religious institutions tend to be the ones that push back against the constant pulsing drive of commerce and work.
But it might not be that; if that's so alien and foreign to you, that is not going to be what it is… I think it's important that it not be part of the sort of endless technology of self-improvement. So it's important that it not just be, you know, a spin class.
That's great. But this is not about improvement or productivity or being a better, more competitive person. This is about being among others for the sake of being among others, because that is important to you.
Doing it regularly so that it becomes a habit, so that it becomes instinctive so that you can overcome that hurdle of, “Yeah, I know I really should go, but you know, I have this deadline,” there's like a group of people relying on you and you just can't get out of it.
And that's a technology too, of making sure that you do it, right?
Keisha: I want to drill into that a little bit because I think sometimes when we pull the practice out of various religious contexts, it then just becomes a weekend. On a weekend you might go to a restaurant and then somebody else is working to serve you at the restaurant.
Judith: That's, Oh boy. Yeah. That's a good point. Yes.
Keisha: You're talking about a cultural practice that allows everybody to rest, including those who might otherwise be serving us.
Judith: Yes, yes. And this is why I think the Jewish Sabbath eschews money: you can't go to the restaurant unless it's somehow been paid for in advance. But even in the most strictly Sabbatarian societies, there was a recognition that doctors need to be on call and nurses need to keep working.
And when things break down, there's going to be people who need to fix them. Judaism has a concept called Pikuach nefesh, and another one called Tzorchei Tzibbur. One is to save a life, and the other is for the needs or interests of the community, and these are exceptions to the Sabbath rules.
So there will be some working; you might call them essential workers. But there is this idea, Keisha, definitely, of not consuming in a commercial context and not working in a commercial context.
And that's going to depend a lot on whether it becomes possible not to work, which is harder and harder given the rising unpredictability of our work schedules.
Keisha: When you said “essential workers,” I thought of Annanda's work as a chaplain because chaplains were one of those communities that were framed as essential, particularly during the pandemic.
Judith: Oh, wow.
Annanda: Yeah, that was a fun time.
Keisha: Fun in quotes! What, what was rest like for you then, Annanda?
Annanda: Oh my gosh. Rest was put on hold for me. I lived in community at the time; there were maybe seven of us in a house in Oakland. I served at Stanford Hospital. It was pretty intensive, emotionally and with the spiritual care I was providing. I don't think I had a Sabbath until the summer of 2021, when I was able to take off. I also pastored a small church part time; it was a lot. March 2020 to September 2020, I alone in my shifts had 127 deaths. There were times where I was so afraid, like, please do not let one more person die, because the cold room doesn't have space.
What do you do with that?
There was never a full off during that time. I was in a different mode. Like military chaplains that are… in war? That's what I compare it to. You can't fully turn that off because to fully turn that off, then all the stuff comes out.
Keisha: Mm-Hmm. Mm-Hmm.
Annanda: And so I give thanks to my therapist for that time. And also great chaplains to work with. You're constantly processing what's going on, which is one thing I really appreciate about clinically trained chaplaincy. I had a great team to work with.
Keisha: I'm hearing the social network and social practice made that period easier to bear.
Annanda: There was this mix of a need to socially hold a container, while also a need to completely just break away and decompress and that's what I needed.
Keisha: And it can be hard to balance those competing needs as an individual. Especially when almost every structure in the surrounding culture is swimming in the direction of overwork, overwhelm, and isolation.
Keisha: Is this all then up to each of us, individually opting into some sort of collective cultural practice? Protestant theologian Walter Brueggemann talks about Sabbath as a way to resist empire, to resist everything that transforms people into things.
Judith: Judaism has in its Sabbath laws, the idea of not using the world, not just other people, but the world instrumentally, which is why you don't work. You don't use things instrumentally.
And that reminds you that you yourself are not a thing, that the world does not exist to be worked on by you. The world exists because, well, in the Jewish context, God made it; or, if you don't want to have it in a religious context, the world exists without you. One day a week, you have to be reminded that you are not master of the world, so you stop mastering the world. That's a very radical concept. And I do think that's what Brueggemann is talking about.
Keisha: Thank you, Judith.
Judith: Okay. Well, I just want to say thank you.
Reflection, Discussion
AUDIO: Reflection music - I just noticed there are birds in my raw audio, and I’d like to leave them in there. Maybe add some more, to rep nature and touching grass?
Keisha: I loved that theme of the world not existing to be worked on by us. But it’s hard to remember that with the kind of mobile technology that’s on all the time.
Remember that phone disconnection experiment from public radio podcaster Manoush Zomorodi? Here’s how she said that turned out.
AUDIO (14:22-14:41): “In the end, 20,000 people did ‘Bored and Brilliant’ that week. Ninety percent cut down on their minutes. Seventy percent got more time to think. People told me that they slept better. They felt happier. My favorite note was from a guy who said he felt like he was waking up from a mental hibernation…
(15:14-15:23): “The next time you go to check your phone, remember that if you don't decide how you're going to use the technology, the platforms will decide for you.”
Annanda: This planet really is a paradise. In the Abrahamic tradition, we call it Eden. And what I love about earth systems: it really is designed to do the work for you if you know how to work with it. We don't have to exert this additional labor that drains us. We already live on a planet that is designed to take care of itself if we can pick up on the patterns of its care, and in picking up on the patterns of the planet's care, we are caring for ourselves because we're not separate.
Keisha: Beyond what we do or don't do on any given day, Sabbath means that sense of reconnection to all of life. And in human culture, it also means acute attention to everything that runs contrary. Everything that uses people, that makes us tools or resources to optimize and profit from.
That feels like one of the major questions of this century, because of how much we have used technology to pull us out of the ecosystem, out of the slipstream of nature.
In this episode, we talked a lot about this being a time where it's harder and harder not to work, and we're also in the era of the Internet of Things, where even when you're not working, the machines around you in your house… are working because they're connected to the internet, they're accessible remotely, they're awake all the time, just primed to hear you say their name.
*whispers Alexa* *laugh*
Annanda: Don't be whispering—That is—I got the creepy shivers with that. Don't be whispering Alexa in that microphone like that!
Keisha: Oh, she’s like the Elf on the Shelf, watching us! I kid, but the impacts are real.
With all of this connectivity, my smartphone is always on.
And at some point, because of my connection to these things, I'm always on. So for me, Sabbath is an intentional time to switch off.
Annanda: I actually do want to be unplugged.
Keisha: Yes. Yes. I’ve been pulling away from spending so much free time in the digital world. Like, I picked up fountain pens a couple of years ago—
Annanda: Of course you did, yes.
Keisha: it's a way to really just slow down… and form the words with my hands… in a way that feels different and flows differently than when I'm typing it out. Seeking more of those analog experiences, it puts me in a different place. Maybe it's a more human place. So I get to spend Sabbath time writing a letter with a pen instead of mass texting a bunch of people, or endlessly scrolling social media, or being streamed at through conventional broadcast TV. I get to make those choices.
The Light Phone—app free, just the basics—it seems like such a solid stop on the way to being more free, more whole, more easeful.
Audio: closing music
And I hope that what we heard today, from Joe and Judith and us, inspires more people to check it out because it's more than switching on a meditation app and listening to the sounds of the forest and the waterfall… and I don't know if it's a real forest or a real waterfall.
In the end, there's really nothing quite like going outside and touching actual grass… without my phone!
Annanda: Yes, yes, yes, yes.
CREDITS - updated
Annanda: I’m Annanda Barclay.
Keisha: And I’m Keisha McKenzie.
Annanda: We’re building community with listeners this season. So reach out and talk to us on Instagram—our handle is @moralrepairpodcast. Also catch us on X… formerly Twitter… And LinkedIn. We’d love to hear from you!
Keisha: Follow the show on all major audio platforms—Apple, Spotify, Audible, RSS—wherever you like to listen to podcasts. And please help others find Moral Repair by sharing new episodes and leaving us a review.
Annanda: The Moral Repair: A Black Exploration of Tech team is us, Ari Daniel, Emmanuel Desarme, Rye Dorsey, Courtney Fleurantin, and Genevieve Sponsler. The executive producer of PRX Productions is Jocelyn Gonzales.
Keisha: Original music is by Jim Cooper and Infomercial USA. And original cover art is by Randy Price. Our two-time Ambie-nominated podcast is part of the Big Questions Project at PRX Productions, which is supported by the John Templeton Foundation.
[PRX SIGNATURE]
SHOW NOTES
Talk to us online: at Instagram (@moralrepairpodcast), on X (@moralrepair), and on LinkedIn: https://www.linkedin.com/company/moral-repair-podcast/
We’ll be in Oakland, CA, for a live event on June 11. Stay tuned for details!
Zomorodi’s TED Talk: https://www.ted.com/talks/manoush_zomorodi_how_boredom_can_lead_to_your_most_brilliant_ideas
Research on “technoference,” relationships, and quality of life: https://psycnet.apa.org/record/2014-52280-001; tips from Psychology Today: https://www.psychologytoday.com/us/blog/the-squeaky-wheel/201501/how-cellphone-use-can-disconnect-your-relationship
Learn about the Light Phone: https://www.thelightphone.com/
Try a Tech Shabbat! Tiffany Shlain explains the principles in WIRED: https://www.wired.com/story/everything-you-need-to-enjoy-one-tech-free-day-a-week/
Tell us: What are the community spaces where you find rest with others?
How Tech Impacts American Farmland
Season 2 | Episode 4
How do the profits from Big Tech impact family farmers, foodways, and the environment in the United States? Annanda & Keisha talk to Jamie Fanous, Director of Policy for the Community Alliance with Family Farmers (CAFF), and Dr. Amina Darwish, Associate Dean and Advisor for Muslim Life in the Office for Religious and Spiritual Life at Stanford University.
-
The Big Question
This episode explores the big question: how do the profits from Big Tech impact access to property, foodways, and the environment in the U.S.?
Intro
Annanda: You know, this Palo Alto Trader Joe's.
Keisha: Yes.
Annanda: $1.29 for a green bell pepper.
Keisha: I can't believe that. When I was in grad school in Texas, I remember going through Walmart, because Walmart was where I could afford to shop. And I remember looking at the price tags, like, what color of pepper can I afford to buy? Because the green bell peppers were, you could get two for 50 bucks… 50, not 50 bucks, two for 50 cents.
Annanda: Luxury.
Keisha: And the tricolor peppers were in a package, and they were $1.50 or something. I'm just like, the numbers don't work out for me to have beautiful peppers in my salad or in my cart. Green peppers for like four or five years. That's what it was.
Annanda: Wow. That seems like extra chlorophyll.
Keisha: I'm sure my skin thanked me.
Annanda: It feels like such luxury right now. Are you kidding me? Green peppers, two for 50 cents?
Keisha: Yeah, it's a different world. And that was only 15 years ago.
Annanda: Wow.
Keisha: Yeah.
Annanda: Wow. Yeah, the nuts are nutty. The prices are wild.
Keisha: But that's crazy, because you're right there in California, and isn't California Nut Central?
Annanda: Yeah. Number one nut producer in the world, baby!
Keisha: That's gotta count for something.
Annanda: Like, 16 ounces of almonds at that Trader Joe's was $7.99?
Keisha: But they didn't grow them for you, Annanda.
Annanda: Well, you ain't lyin'. On that note, y'all are probably wondering, why are we fussing about the price of almonds?
Keisha: Because that's a real thing one needs to fuss about.
Annanda: Here's the question for our episode. How do profits from Big Tech impact access to farmland, food supply chains, and waterways in the United States?
SOUND: title music starts
Annanda: I’m Annanda Barclay…
Keisha: I’m Keisha McKenzie…
Annanda: This is Moral Repair: A Black Exploration of Tech. A show where we explore the social and moral impacts of tech… and share wisdom from Africana culture for repairing what tech has broken.
Keisha: When we return we’ll learn about how tech investors are reshaping the world of food… and what community organizers are doing about it.
SOUND: End title music
Silence
Segment A
Annanda: Welcome, Jamie. Glad you're here.
Jamie: I'm happy to be here. Thanks for having me
Keisha: Jamie Fanous works with farmers to create and advance policies rooted in farmers' needs. We talked with Jamie about the impact of Big Tech on American farmland.
SOUND: interview cue
Annanda: Community Alliance with Family Farmers, which is the organization that you work for, is seeing big tech have a major impact on family farms. What is that impact on family farmers, and does it just impact family farmers?
Jamie: It's a big question. The short answer is yes, it's devastating and it's terrifying. The reality is they buy ag land because it's a really good investment. It's really stable, and it will always be subsidized and supported by the government, so it's always going to keep your investment portfolio stable. I'll just name the big names that I can name. Bill Gates.
Bill Gates owns 270,000 acres of land nationally. That's huge. There's around, I think it's like 900 million acres of farmland in the country. In Solano County, 50,000 acres of land has been purchased over the last five years through an anonymous LLC. In the last year, they've revealed themselves as this entity called California Forever, which is made up of a handful of the biggest names in tech: Marc Andreessen, Laurene Powell Jobs. They're the primary bankrollers of this, and it's just devastating, what they've done to that county and what they're planning on doing to that community.
{insert clip starting at 0:03-0:30}
Annanda: I want to explicitly say, as I was doing research for this episode, it proved difficult to find voices and interviews from local farmers. The available media paled in comparison to what I could find on California Forever and its representatives. More on Solano County and California Forever later…
Jamie: So there's a lot in here, but ultimately we need ag land to be owned by our community members, owned by our farmers. It's a slippery slope.
Keisha: I was wondering if you would share, is there a definition of what a family farm is? And I asked that because when I lived in Texas, there were a number of families that owned farms, um, and some of them owned a lot of acres. I always had questions about how they got that land in the first place.
But if I think of family farms, should I be thinking about John next door, who passes the farm down to his son and his daughter, or am I thinking of something at a larger scale?
Jamie: Totally. Yeah, that's a great question. There are a lot of big corporations that get away with calling themselves family farms because they're owned by a family. At CAFF, the terminology that we really use is family-scale agriculture. That means people who grow on anywhere from an eighth of an acre to 500 acres.
If you are growing food for your community, to feed your community healthy produce, you are a family farmer in my opinion. You are a person who cares about more than just trying to make money. We try to be really inclusive of community gardeners and of small nonprofit farms, because they're committed to growing food for more than just themselves.
With this concept of the family farmer, we don't really want to perpetuate this white-couple-with-a-pitchfork imagery that we all have. Family means a lot more than the small nuclear family that we think of. But also, we're realistic. It needs to be both the community gardeners and the small farms in urban areas. You also need to make a living.
Keisha: What do you mean by that?
Jamie: Most of our farmers, a lot of them, need a second job, because they're probably making under the poverty line. It is not really possible to be thriving as a family farmer these days; there's a lot of pressure on them to fail. Which is why we exist as a nonprofit. We are fighting every way we can to make sure that family farmers exist in the future.
We lose four family farms, or small farms, a day, and that's just in California. It's an impossible battle they're facing, from market pressures to climate change. Even if they did have a little bit of money to go buy land, that hedge fund or that tech company is coming in very quick, because natural resources are everything. Whoever owns the land and owns the water has power in this country, and in the world.
Annanda: So what impact does big tech buying up all of this farmland have on our food supply?
Jamie: Totally. I try to help people understand ag and ag land and our natural resources. I think about it in the context of renting or owning a home, right? If you own your home or you feel secure in your home, you're going to invest in it, you're going to plant flowers. You're going to paint the walls. You're going to make sure your home stays clean and is stable and secure. You're going to apply a very similar principle to ag land.
If a farmer owns that land and feels secure in that land, they're going to invest in the natural resources. They're going to make sure they're implementing healthy soil practices. They're going to plant what are called hedgerows, which is habitat. They're planting habitat. They are maintaining the ecosystem, because they want to be there for a long time. They want seven generations from now to be able to grow food and thrive in that space. That is not the core principle of how tech and hedge funds show up. When they purchase land, their goal is to make as much money as they possibly can in a short amount of time and get out.
So that means they're probably going to plant things like almonds or things that are really high value and have a really strong supply chain network. They're going to use all of the water they possibly can.
Keisha: To put it in perspective, California has 1.8 million acres of almonds. That’s half the size of the state of Connecticut.
Jamie: In the Central Valley, they take a lot of water. They don't really care about pesticides and herbicides impacting the communities. They don't care about air quality; they care about making money. So they're gonna use all the pesticides and herbicides to the degree that they can.
Keisha: A Central Valley local news station reported on the impact of pesticides last year.
{play clip starting at 1:09-1:29}
Jamie: Fortunately, we have standards in the states, but that's not true in other countries. And so you're going to do everything that you possibly can to destroy what you can, and then leave, cause that's what capitalism does. You can make a million dollars in one year off of a hundred acres of almonds. So it is a lucrative business you've got if you're buying this land.
Annanda: To put that into perspective using Jamie's math: a million dollars a year on a hundred acres works out to ten thousand dollars an acre. Across California's 1.8 million acres of almonds, that's roughly 18 billion dollars a year, in profit.
Jamie: I also want to talk about power. We know, and this comes up even in the state legislature here in California, that food security, in the context of climate change and our global supply chain, is coming up a lot. People are like, we don't want these countries buying up certain land in the U.S., because there's going to be a future where we're fighting over those things. The reality is that what's being grown here is not really to feed us. But it does have a really big implication for our food security, given that the majority of cropland in California is not owned by us.
Annanda: You mentioned water, which is life. It is the blood supply of the earth, right? So what impact does this have on water?
Jamie: Oh, that's a spicy one. When it comes to water, there is this trend now that we need to start regulating and controlling and monitoring groundwater. We're going to have water markets: people are going to be fighting over water, selling water and trading water, which is just sick. I mean, we already do that to a degree, but it's pretty terrifying. So whoever has the most money is going to control our water. That's already starting to happen in some other places.
Keisha: And you can tell governments are concerned about this because they’re gaming out security plans. This is a clip of Vice President Kamala Harris announcing a water security action plan in 2022…
Start clip from{9:44-10:20}
Jamie: The other piece that's really specific, that I could share when it comes to water, is this California Forever thing that's happening in Solano County. A couple of tech billionaires came together. Over a five-year period, through an LLC called Flannery Associates, they started buying up land. It was a mystery who they were, and it was really scary for the community.
[start clip from 0:01-0:15]
Jamie: They pit family members against each other to get them to sell their land. They used really malicious practices to get people to sell. The number one job in Solano County is ag, so they're gutting a very core piece of the community. And now there's this whole thing where they're literally suing the people who sold their land to them, saying that they colluded, and that's literally going through the court system.
{starting clip at 0:52-1:18}
Jamie: But the reason I bring up the full context: their plan is to build a city in Solano County, in California's grasslands. These are rangelands. These are dry-farmed. Nobody is irrigating this land, for the most part; it's all rainwater-reliant. But there's a very big component where they're buying up land along a waterway. The thinking behind it is, they failed so badly at developing Silicon Valley that they were just like, let's try again. Let's build a walkable city, because we messed up so bad, so let's just try it again from scratch.
And now it's on the ballot. In November, they're going to decide whether or not the city is happening.
Given all of that background, when it comes to water, a city is definitely going to take up and need way more water than 50,000 acres of dry-farmed grasslands does.
That's going to have a huge impact on water availability in the county. It's probably going to cause more droughts in the county. Right now that county doesn't have a ton of water issues, but it definitely will. They're also starting to purchase land that has water rights to the Delta there, which is a major source of water.
Our farmers don't have the time to be at the table in these discussions that are happening. They might just completely lose their water rights, or lose access to water, with no way to get it back. And they'll just go out of business. That's a reality some of them are facing, and we're terrified of it.
Water is a big deal.
Keisha: You mentioned water and these investment practices that buy up the resources they know people will need.
People need them now, but they know they will need them in a less secure future. On one hand, they're building technology to innovate. On the other, they're investing in weapons that increase the likelihood of war, which makes those resources more scarce. Can you speak to the relationship between Big Tech's investment and the overall environment? Is there a larger story there that we should underline?
Jamie: In any scenario, what's out there, that's happening, that big tech is influencing, that we don't know about? Are there going to be scenarios in the future where tech has control over water and we find out about it too late? The scary part is their ability to be anonymous. There's also a lot happening around tech getting data, like ag land data. What scares us the most is that the majority of the farmers we work with are farmers of color. A lot of them are immigrants. A lot of them do not want to be known, because they have so much distrust and fear, just with the state government and the federal government. And tech has ways beyond my understanding of getting people's information.
What is terrifying is, say they start posting the location of these Hmong farmers in Fresno, and then those farmers start getting harassed, because that happens. There are a lot of Black farmers that I know in Fresno, and the amount of times they've had their equipment stolen… There are communities in the valley that are very clearly still being discriminated against and harassed.
It's not safe for them. That's something that definitely concerns us like what happens when they get that data and they start using it for their own means? Maybe they're trying to help all of these people, quote unquote, and in reality they're causing serious damage. That's something that scares me the most, the security and the safety of the farmers we work with.
They should have a say in food systems and agriculture, and in what counts as a good farm versus a farm that is actually doing damage to its community. As an organization that is very much committed to racial justice, we know that we really need to show up when it comes to farm worker issues or access-to-water issues. But the ownership and the buying up of ag land is just not talked about the way housing is. Don't get me wrong, housing should be a human right. These are very important things. But we should be applying the same principle to our ag land, because at the end of the day, that's what's producing our food.
We need food to survive. It should be a right for everyone to have access to healthy food, in the same way that housing should be a right. We really need to be paying attention a bit more to what's going on and starting to connect the dots.
Annanda: What role do you think the everyday citizen, the voter, has in advocacy as it relates to tech companies buying up American farmland?
Jamie: I would love the Gen Zers to be tracking ag land purchases the way they track private jets. I would love to see us paying attention to these silent investing strategies the way we pay attention to famous people. I think we should take note of what's happening. I worry because so many people don't live in these places, but they're really important to pay attention to. Make calls. Make calls when organizations like CAFF, or other smaller ones, are asking you to make calls. We don't have the kind of power that unions do, or that tech does.
We really need to surface our food system in ways that we haven't in the past. We're losing farmers, we're losing farm workers, our communities are really not doing okay, and it's pretty scary to see this trend of losing small farms. The stories that constantly come out about the health and safety of farm workers are terrifying.
My other plug, to well-meaning tech folks: don't make an app without asking a farmer first. Don't do this saviorism anymore.
Annanda: What kind of apps? What are you seeing?
Jamie: The amount of times myself and my colleagues have been approached: oh, we made this app to make farmers' lives easier. And you've never asked them what they need. If you're a well-meaning tech person or VC person, ask first.
Ask us what we need before going and asking your buddies for a hundred million dollars to make an app. That would be great. It's sad, because these are really well-meaning people. They really are. It's just, your approach was a little flawed. You saw a problem from your vantage point and thought that you could fix it, but maybe they need something else. Maybe they need that hundred million dollars you got from investors instead.
Keisha: Annanda, can you imagine a world where the motive for agriculture as a sector was more about caring for others and feeding others and not just turning a profit?
Annanda: Yes, I can. I imagine it all the damn time because it needs to happen and it's not asking for much. Like that bar to me, Keisha, is so low. It's so low. And I feel like, go ahead.
Keisha: No, and I know Amina has wisdom from Islam to kind of guide our dream and hope about this, and I'm so appreciative that we're hearing from her.
Annanda: Same, same, same, same. And so, to me, that's how it's easy to imagine. It's not because I'm imagining something utopic that doesn't exist. I'm imagining something that has been proven over and over and over again: if you have good waterways, and intentionally have good hydrology, and you plant permaculture, you will have an abundant crop. And to me it's like, well, you feed people with the abundance and sell the extra.
Keisha: Yes, yes!
Annanda: It really bothers me that we have these simple ancient systems, and we don't put them in place at scale.
Keisha: Repair is in reach. Repair is in reach. Yes! Yeah.
Annanda: Yeah, because then, now we're not having to deal with some national security mess. Over some food and water. Can you please put my tax dollars towards that instead?
Keisha: You know what I mean!
After the break we’ll hear from another Black wisdom tradition—Islam. What can Islam teach us about ways to use wealth to benefit all our people?
SOUND: break cue
BREAK
Segment B
Annanda: Welcome back to Moral Repair: A Black Exploration of Tech. What we just heard from Jamie feels important, and overwhelming. But one of our core values for the show is not to be left in doomsday despair.
Keisha: So the question becomes, what options do we have for stewarding incredible wealth in a way that isn't problematic or imperialistic, that doesn't turn us into colonizers?
SOUND: interview cue
Amina: (reading from the Quran)
أَرَءَيْتَ ٱلَّذِى يُكَذِّبُ بِٱلدِّينِ ١ فَذَٰلِكَ ٱلَّذِى يَدُعُّ ٱلْيَتِيمَ ٢ وَلَا يَحُضُّ عَلَىٰ طَعَامِ ٱلْمِسْكِينِ ٣ فَوَيْلٌ لِّلْمُصَلِّينَ ٤ ٱلَّذِينَ هُمْ عَن صَلَاتِهِمْ سَاهُونَ ٥ ٱلَّذِينَ هُمْ يُرَآءُونَ ٦ وَيَمْنَعُونَ ٱلْمَاعُونَ ٧
So that was actually chapter 107 of the Qur'an, and it says: "In the name of God, the most merciful, the most compassionate. Have you seen the one who denies what's owed? The one who repulses the orphan, and does not encourage feeding the poor. So woe to those who pray, yet are mindless in their prayer, and those who only do it for show, and refuse to give even the simplest of aid."
And I really, really love this chapter in the Qur'an, because it brings this idea that giving people their rights, giving people what they're owed, is in and of itself actually what the faith is supposed to be. In some translations of the Qur'an, people said, "oh, it's denying the day of judgment, of: one day I will be held to account." But the root of deen is actually the same root as debt: "No, I owe someone this." And I just think it's so beautiful.
It's really one of my favorite chapters in the Qur'an.
Annanda: For our listeners, Amina, who are you?
Amina: So I'm actually the Muslim chaplain at Stanford. I have a really long title that I had to memorize before I met the president for the first time. I think the official title is Associate Dean in the Office for Religious and Spiritual Life and Advisor for Muslim Life.
Keisha: Amina, that was a beautiful reading. Thank you. So I'm hearing from it that there's a practice in Islam, which the Prophet Muhammad, peace be upon him, and the Qur'an have instructed, about having vast amounts of wealth. What is that practice?
Amina: So it's actually called zakat, and it translates to purification. It's one of the five pillars of Islam. If you have excess wealth, you owe a portion of it. I cannot, as a Muslim, believe that if I have excess wealth, it all actually belongs to me. It does not. That is the acknowledgement of the faith, which is incredible. And what this verse is talking about is really this larger ethic with money: the person that we have a problem with is the person who doesn't care about the orphan, who doesn't care about feeding those in need, who would prevent even the most common kindness.
The concept of zakat is: if you have wealth that you have not used, excess wealth, it's not your house or your clothes or the stuff that you use day to day and need to live. It's anything extra. So if you have home number two, car number two, all the extra stuff, like money sitting in a bank account that you haven't spent.
If after a year you still have not found a purpose for it, now you owe a percentage of that wealth. So it's not a wealth tax. It's a hoarding tax, which I think is really fascinating because what it's actually doing is economically incentivizing people who have lots of extra wealth to actually use it for an economic purpose, so if I had an extra million dollars by the end of the year, I would have to pay two and a half percent of the million dollars. If I take that million dollars and I build a company with it, I don't pay a percentage of the company.
As long as the money is moving, it's actually not a problem. You don't pay zakat on it, which I think is really fascinating. And if you think of water that stands still: it collects mosquitoes, it collects a lot of junk, whereas flowing water is purified. If you think of the economy in that way, the more money changes hands, the more services there are, the more goods that are changing hands, the stronger everyone is, and the more economic benefit there is for everyone.
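To make the arithmetic of the rule concrete, here's a minimal sketch of the hoarding-tax logic Amina describes, under simplifying assumptions: a flat 2.5% rate on monetary wealth left idle for a full year, ignoring the minimum threshold (the nisab) and the distinct asset categories that full zakat accounting involves. The function name is ours, purely illustrative.

```python
# Sketch of zakat as a "hoarding tax", per Amina's description:
# wealth left idle for a full year owes 2.5%; wealth put to productive
# use (a business, goods changing hands) owes nothing on the venture itself.

ZAKAT_RATE = 0.025  # the traditional 2.5% rate on idle monetary wealth

def zakat_due(idle_wealth: float, held_full_year: bool) -> float:
    """Zakat owed on wealth that sat unused (simplified: no nisab threshold)."""
    if not held_full_year:
        return 0.0  # the obligation only attaches after a year of idleness
    return idle_wealth * ZAKAT_RATE

# Amina's example: an extra million dollars sitting in an account all year
print(zakat_due(1_000_000, held_full_year=True))   # 25000.0
# The same million invested in a company is "moving" before the year is up
print(zakat_due(1_000_000, held_full_year=False))  # 0.0
```

The incentive shows up in the comparison: idle money shrinks by 2.5% a year while money in circulation does not, which is the standing-water-versus-flowing-water image Amina uses.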
Annanda: The fact that it's a pillar! I'm like, you could convert a ton of people right there, just on that alone.
Amina: Not the greedy. They tend to be like… which actually says a lot. Greed is actually a disease of the heart.
Annanda: That's important.
Amina: And this idea of an investment: if you are investing in the other person, it's not just, oh, I'm expecting to make money. Yes, you are expecting to make money, because that is inherently what investment is.
But because we are sharing in this material wealth, if we share in the success, then we also share in the failure. This idea that we share the risk of what is happening forces investors to be more conscientious.
If you are giving someone a mortgage, for them the line is, I might lose my home. That is a high level of risk, and that's the problem when it's completely based on money and they're using the house as collateral. Whereas: okay, I'm giving you a loan, and you don't outright own the house, so I can charge you a portion of the rent as long as you live there. But then if you default, I put your principal toward the rent, and you don't just get evicted, because I agreed to share some of this risk. It is an investment for me as well.
And this notion of a loan, a loan was always viewed as a good deed. Actually, there are some Muslim scholars that viewed it as an even better deed than giving charity. If you're giving someone a loan, you're betting on their success. If you're giving someone charity, you're saying, I don't think you're going to make it.
Which is fine. A lot of the time we are in tough spots, where I need the ease of mind of knowing that I don't actually owe this debt. But if I have hope that I will be able to make this money back, if you just help me out of my tough spot: one, it gives me so much more dignity, and two, you have actually helped me out of that tight spot. It was also part of the tradition and the culture that if someone gave you a loan when you were in a tight spot, you would be generous back. It's not interest, because you're not demanding a certain amount; they might not be able to pay. And if they can't actually pay you back, then you can forgive the loan and say, okay, it is now charity, because you were in a tough spot and then we had an economic downturn.
So stuff happens, life happens. I always have the option of forgiving the debt, but if I'm giving you a loan in the first place, I'm actually betting on your success.
Keisha: Often, charity in this culture is elevated; it goes with philanthropy, love of humankind. But you're saying the love of humankind that assumes the person has a future would make it a loan, and not just a gift.
Annanda: I love that so much.
Amina: Part of the post-colonial version of Islam that a lot of people have taken on is: money is bad and dirty, I will not touch it. But money is a means of material wealth. We are stuck in this world; it is how we attribute value to things.
Money is not bad. Money is powerful. How do I use money in a way that is ethical and pleasing to God? Instead of looking at it as, oh, money is just this material world. Like, no, we live on Earth. We are here now. We gotta figure this out. There's a spiritual sense to it, and we should be mindful of that spiritual sense, so that we are actually using our money correctly.
Annanda: Amina, we talked to Jamie Fanous earlier. She works with an organization called CAFF, the Community Alliance with Family Farmers, which works with family farmers in the state of California. Jamie shared with us some major concerns that CAFF has regarding tech billionaires buying thousands of acres of American farmland, a practice that is displacing family farmers and having a negative impact on the environment.
What wisdom, coming from zakat in particular, or out of Muslim teachings more broadly, would speak to this scenario?
Amina: There's a verse in the Qur'an where God says, do not hoard your wealth so that it just becomes a means of rotating between the wealthy among you. And again, we talked about this idea that zakat forces money to change hands; it sets up an economy where the more hands money changes, the better. So with that principle, if there are basically ten people at the top trading with each other, without the rest of us, that inherently defeats the purpose.
And this is part of the issue when we get obsessed with legality and the letter of the law over the spirit of the law. If I'm living in a society where people do not have their most basic needs, I'm actually saying that I don't have a right to just look away. And even this idea of shared risk, if I control the source of food and people do not have food, I am sinful because I prevented it.
I have no right to do that. When I talk to youth, I tell them: hey guys, food grows on trees. Money doesn't grow on trees, but food grows on trees. Part of our belief in God being the provider is that God created the earth able to feed all of us. There is a point where some people are going to be wealthier than others, and that's fine.
But there's a point where, if someone's hungry, no, that's not poverty; it gets to the point of oppression. I should not be willing to accept a society where people are hungry while I have food; I am then sinful. And part of my fear with this is that it's a lack of respect for the society we live in, for the land that grows the food.
Appreciate the earth, the earth will give you more. Appreciate humans, humans will give you more. Because this is how we've been built and designed. This is part of the beauty that God put in the world. And if all I see is , there's not enough, and I'm acting out of fear, of course this is what I look like. But if I come from a place of abundance and gratitude, the joy is endless.
Amina: We are conduits of love and life; we take and we give. If you take and don't give back, there's something broken in you. If you give and think, I don't need to take, you will not have enough to give. It's part of the cycle of life. We are of this earth; we go back to the earth. We all are. It is the humility of the earth that it continues to carry us.
We trample on it every day and it continues to carry us and love us and nurture us.
Keisha: How, Amina, would you like to see the world change?
Amina: The prophets were ultimately the greatest revolutionaries, that they taught us our dignity and our value as human beings, and because of that they were able to change the systems from within. If we are focused on the material of who has power now, eventually someone else will have power. And these cycles change.
If we fundamentally change ourselves, if we didn't live in a society where people had arrogance in their hearts, we would not have racism in our society. How do we change ourselves from within? How do we change ourselves from within consistently enough that we hit a tipping point where it becomes culturally unacceptable?
For us to hoard. For us, to see ourselves as just material existence. How do we all transform ourselves from within? My favorite quote from Rumi, he said, "I used to be clever, so I wanted to change the world. And then I had wisdom, so I decided to change myself." This is not someone who has not witnessed tragedy and war.
He did. And when he saw that much war, he said, this is anger magnified. I'm gonna work on the anger inside of me. This is hate magnified. How do I change it inside of me? And if we all do this, you literally transform what the world looks like.
It has been done. Again, we have a track record where we did do this. Is it perfect? No. And the more good that you put in the world, the more good there is in the world. So I'm just gonna keep feeding this cycle of good until one day I meet my creator and be like, I did my best.
Annanda: Thank you so, so, so, so much.
Amina: Thank you so much, too.
SOUND: break cue
BREAK
Segment C
Keisha: Thinking about what both Jamie and Amina had to say about the impact of big tech on farmland and ways we can use wealth for collective benefit and not just private profit.
I think this is where it's not just abstract tech, it's about how we relate to each other. And I'm just starting a book project around religion as an enabling force for better ways of relating through the economy, and was talking with the team recently about that. I sometimes question whether religion can be a positive force, but as we just heard from Amina, it can be an enabling force that inspires these deep values and commitments that change the way that we interact with each other, change our sense of accountability to each other, change even the way that we talk about charity. I welcome religion and spirituality when it reminds us that there are other ways that actually feel better and have more positive consequences.
Annanda: Positive consequences. We don't talk about positive consequences enough, you know? I'm inspired by you saying that. With zakat, you have a positive consequence with your wealth.
Have you ever heard of Potlatch? Does that mean anything to you?
Keisha: Like potluck?
Annanda: No, no, so potlatch. Potlatch comes from the American Indian tribes of the northwest coast. So around Washington State and Canada, the tribes in that area had what was called a potlatch, and the chiefs would have competitions every year of who could give away the most food, who could give away the most possessions to those in need. The most elevated chiefs were those who gave away what they had, to display how much wealth they had, that they could give it away so that nobody went without. That was actually the indigenous practice of the tribes in that region.
And so as you're talking about that with Islam, those cultural displays of wealth are actually about people having instead of have-nots. And what would that look like for tech in America? Right? What would that look like for the United States and the world?
Keisha: Yeah. To actually be a community where, and like national community, global community, where people have what they need and nobody lacks what nourishes them.
Annanda: Yes. Literally that. Yes. Everybody got food, everybody got water, everybody got a roof over their heads.
Keisha: Yes.
Annanda: And the people pay for it in their taxes. They're like, yeah, everybody should. The streets are safer. We can have nice things.
Keisha: We can.
SOUND: Closing theme starts
CALL TO ACTION
Annanda: We’re building community with listeners this season. So reach out and talk to us on Instagram—our handle is @moralrepairpodcast. Also catch us on X… formerly Twitter.
We’d love to hear from you!
Keisha: Follow the show on all major audio platforms—Apple, Spotify, Audible, RSS—wherever you like to listen to podcasts. And please help others find Moral Repair by sharing new episodes and leaving us a review.
S2 CREDITS SEGMENT
Annanda: I’m Annanda Barclay.
Keisha: And I’m Keisha McKenzie.
Annanda: The Moral Repair: A Black Exploration of Tech team is us, Ari Daniel, Emmanuel Desarme, Rye Dorsey, Courtney Fleurantin, and Genevieve Sponsler. The executive producer of PRX Productions is Jocelyn Gonzales.
Keisha: Original music is by Jim Cooper and Infomercial USA. And original cover art is by Sam Martin and Randy Price. Our two-time Ambie-nominated podcast is part of the Big Questions Project at PRX Productions, which is supported by the John Templeton Foundation.
[PRX SIGNATURE]
SHOW NOTES
We’re excited to be in Oakland on June 11 for a live in-person show at Kinfolx in the East Bay. Join us in person at 4pm PST or online for Making It: A Night of Big Questions! Sign up at Eventbrite
Community Alliance with Family Farmers
The Center for Biological Diversity’s take on California Forever
Zakat Calculator
Talk to us on Instagram (@moralrepairpodcast), on X (@moralrepair), and on LinkedIn.
Tech & Public Safety: Activism and Community
Season 2 | Episode 5
Keisha and Annanda talk to Sarah Nahar of Community Peacemaker Teams and Buddhist Peace Fellowship about technologies of public safety and conflict in the United States and how communities can organize themselves to repair the harms of oppression and policing. We look at the history and values of public safety in the USA, specific policing tech, and ways communities in and out of tech are responding.
-
S2 E5: Tech & Public Safety: Activism and Community
Lead Guest: Sarah Nahar
EPISODE BIG QUESTION: In today’s episode, we’re digging into the technologies of public safety and conflict in the United States…
… and how communities can organize themselves to repair the harms of oppression and policing.
DESCRIPTION: Keisha and Annanda talk to Sarah Nahar of Community Peacemaker Teams and Buddhist Peace Fellowship about technologies of public safety and conflict in the United States and how communities can organize themselves to repair the harms of oppression and policing. We look at the history and values of public safety in the USA, specific policing tech, and ways communities in and out of tech are responding.
Segment A: A Story, Problem frame: Conflict and surveillance tech
Keisha: Annanda, the first time I saw snipers in real life, their guns were pointed at some mothers… and me.
Annanda: Okay, Keisha, tell us that story.
Keisha: So it was about nine years ago, a National Moment of Silence gathering just a few days after police killed Michael Brown in Ferguson, Missouri.
Most of the people that I was with were parents grieving children that they’d lost to gun violence. We were walking toward a vigil at Baltimore’s inner harbor. We were mostly quiet.
And there were rows of officers peering down at us from the balconies of shops along the route. Helicopters flew back and forth overhead. And I will never forget the snipers…
“Hands up; don’t shoot!” In Baltimore, in Ferguson, in New Orleans, in Denver—all over the country that summer.
AUDIO: Audio from Denver CO #NMOS14
AUDIO: pensive sound
If there was any moment where the mythology of—in the UK, it's the good bobby with his bowler hat—the police officer who's out to look out for you if you're lost as a child, like that narrative, that belief just fell out of my body.
Annanda: I think it's a beautiful thing that you even had that belief.
Keisha: It was really sweet, but there's no way to keep that belief and experience the police as they operate with Black people here.
Annanda: Like, yeah. The first time my parents had to sit me down about the police, I could not have been more than seven years old. We were in Chicago at the time, and there was a Black woman driving in her car and she had a cell phone in her hand and the police shot her dead.
We already had conversations about the police by the time I was seven, but I remember asking something along the lines of why did they kill her?
And then I even remember when the Mike Brown protests, when they were happening, two things distinctly came to mind.
One, I remember following it live on Twitter because Twitter was reporting it before national news was and international news was picking it up before national news was.
And siblings from Palestine would tweet back and forth, like, here's how you navigate militarized police. That was the first modern experience I had of Palestinians talking to African Americans on here's how you navigate this. Literally online teaching us how to use, like, Maalox, all that stuff, for tear gas.
And I remember my mom. My mom, my dad, both were in American cities during the 60s. My mom was in Newark, my dad was in Chicago.
And my mom talks about being a little girl. We're talking about elementary school, and she remembers waking up and seeing a sniper by her window where she was living in downtown Newark. And the sniper didn't say anything, but put one finger over their lips to tell her to not make a noise.
Keisha: Wow!
Annanda: And so I think about this history that Black people have with guns, with snipers, and the intimacy of it. Because that was like in her home. That was her living space.
And that's one of her earliest memories.
Keisha: And like the thing about that, that just—I'm feeling it in my belly—is that they enrolled her… in their raid.
Annanda: Yeah. And at gunpoint!
Keisha: Right, right.
Annanda: Gun stories. Like, more of us have them than we’d like to admit.
AUDIO: title music
Annanda: I’m Annanda Barclay.
Keisha: And I’m Keisha McKenzie.
Annanda: This is Moral Repair: A Black Exploration of Tech.
In today’s episode, we’re digging into the technologies used to manage public safety here in the United States…
Keisha: … and how communities can organize themselves to repair the harms of oppression and the values and technologies of policing.
AUDIO: end title music
Keisha to Annanda: That day in Baltimore, it came home to me: Those snipers were not there to protect us. It felt more like they were there to protect something from us?
They were filming everything. Everything. And from everywhere.
Audio: rugged/gritty/curious vibes
So here’s what I learned that day: it wasn’t just those police officers who were watching us like that. It was also the tech.
Do you remember the film Minority Report?
Annanda: Yes I do: Tom Cruise sci-fi.
Keisha: Tom Cruise running.
Annanda: You’re not wrong! laughs
Keisha: Tom Cruise is always running and they keep trying to make us live in that movie.
So it’s based on a short story by Philip K. Dick from 1956. It describes a world where police can predict crime in advance, and they have some of the tech that we have today in real life.
AUDIO: bips and beeps / computer noises
Touch screens, iris scanners, spy bugs.
Annanda: The imagination comes to reality, right? Like, this is how humans create.
Keisha: I kinda want a spy bug… like a little beetle with a microphone and a camera.
Annanda: Oh hell no.
Keisha: But I’m not a cop, so I’d use it for something boring like keeping an eye on the elders downstairs or watching birds or something unobtrusively. Nothing weird. laughs
Annanda: Oh, so you'd be benevolent. I'm supposed to trust that. laughs
Keisha: All of the overlords are benevolent, right?
Gadgets aside, it’s predictive analytics that are the main tech feature in Minority Report.
Audio from Tom Cruise version trailer: 00:06-00:16 (turn down volume though)
John: Look at me…. beeps Positive for Howard Marks.
I’m placing you under arrest for the future murder of Sarah Marks and …. That was set to take place today April 2nd at 08:04.
Howard: No!
Annanda: You know, those who get away with white collar crime constantly, like where are the systems for them? Give me a Minority Report for white collar crime.
Keisha: Mmhmm.
There's some new research that shows that the large language models that are used in predictive policing are trained to actually call police on users more often when they're in minority neighborhoods. (Source) More than 11,000 computer vision patents are about surveillance tools. This is something that's endemic.
Annanda: Those Chinese robot dogs on CNN sighs Y’all, y’all, take a look:
AUDIO: 00:00-00:14
“China’s military just showcased a robotic dog… outfitted with an automatic rifle. You heard that right. They say the remote-controlled weapon is not only able to fight alongside soldiers in urban combat operations, but maybe could replace them.”
Keisha: I hold Boston Dynamics responsible for this because we wouldn’t be here without their dancing robots.
Annanda: It reminded me of seeing imagery of the 1940s and World War II. But instead of people marching in goosestep, we have robot dogs that Boston Dynamics has trained to more or less parkour.
And China’s now like, “Booboo, you could put a gun on it.” And that's not to say that America wouldn't.
Keisha: Or hasn’t. We don’t even know.
Annanda: Exactly. I’m like, it’s bad enough with the repression with a human police officer. Now you’re going to have a robot dog?
Audio: break cue
When we come back, we’ll talk about how the technologies of policing play out on the ground … and how everyday community members can respond.
Segment B: stories and context from organizing: what’s happening with conflict tech in the field?
Audio: end break music
AUDIO: MSN.com / CBS Detroit: “Michigan police using hi-tech ‘Flock’ camera system to track criminals” 00:00-00:06
“Taylor Police Department is rolling out high tech cameras equipped with license plate readers. They’re already helping them with some major investigations…”
Keisha: So Annanda, the city traffic monitors they use in Michigan aren’t that much different than the cameras that take your picture when you drive through a highway toll booth.
But my concern with these kinds of surveillance technologies is more about the data that they can pick up and piece together: how often you travel, where you go. So you can’t just anonymously jump in your car or walk around your town anymore because these cameras, networked as they can be… talk to each other. And the data that they produce creates this map—of you. You’re never just one person in a crowd anymore. You’re like one of 50,000 data points.
And in Australia, police are using drones to track migrants after they leave immigration jail. [Source]
So that’s the kind of technology politicians mean when they talk about “smart border control” or “smart policing.”
Annanda: That sounds on brand. We try to ignore the amount of cameras we see, but they’re there… The further I get from Palo Alto and Menlo Park… just the more and more public surveillance I see. And that's not to say that Palo Alto doesn't have surveillance, because it does, but it's nothing to the extent of other, like, places that aren't as wealthy.
I do think public safety is a necessity. But historically, the police in the United States, at least, come out of patrolling slaves, and the police and paramilitary policing certainly don't make me feel safe. I don't see the good that comes of it.
Annanda: To help us make sense of all this, we talked with Sarah Nahar about her experiences with community and peacemaking organizing in the United States and in other conflict sites around the world.
AUDIO: break/transition cue music
Keisha: Sarah, thanks so much for joining us today.
Sarah: My pleasure.
Sarah: I'm a descendant of both willing and unwilling settlers. Mennonites who came about seven generations ago, fleeing religious persecution in Europe because they were pacifists and they refused to participate in various religious and governmental structures around what it meant to register their children for the military, pay for war, or insist that faith wasn't an adult choice. So that's on one side.
And then on the other side, about seven generations ago, my ancestors were trafficked as enslaved Africans. And so as a biracial person raised in the Great Lakes watershed, I am connected to water and protecting water, as well as to recognizing the precarity in which we live and the ways that different groups do or do not talk to one another. And I feel like I live a lot of that in my body.
The other thing to know about me is that I'm a committed scholar activist, and I love so many of the experiments that I see in movements here in the U.S. and around the world, finding ways to be together not based on separation, scarcity, and powerlessness.
Keisha: Is there an example where you saw those traits display themselves in vivid color?
When you saw the system for what it is?
Sarah: I'll go to Charlottesville 2017 and Charlottesville 2018.
There was a rally to unite the right wing. This was in reactionary response to the invitation towards reorienting our society, paying attention to anti-blackness, and inviting us into the politics of care.
A group of us trained local community members who are already very well versed in how to engage actors who are armed, but often those are state actors. This was a moment in which there was going to be a lot of civilians with weapons. Civilian on civilian rules of engagement are different than the rules of engagement with the state.
And so we worked together on what it meant to, like, keep our communities safe. We trained in a lot of observation. Affinity groups. Learning street medic skills. It had to do with preparing our hearts and minds for being challenged and knowing how to quickly recalibrate our nervous systems.
It also had to do with practicing bullet dodging and noticing where weapons were and alerting people around us about the size, the activity, and the location and the timing of different armed actors.
Keisha: It also had to do with recognizing the white supremacists’ fear about demographic change.
This country has always been multi-ethnic. It has not always been an inclusive multiracial democracy—but we could choose to become one. And some people—White supremacists especially—are terrified.
Sarah: I think this realization had created fear for those involved in right wing politics and they were getting together to figure out how they could leverage the most power themselves. It was a movement based on a lot of separation, scarcity, and powerlessness. It has more or less been imploding and fighting amongst itself, because power-over within that space is so dominant.
There could have been an amazing moment in another world potentially where if a bunch of folks, who are white, who are feeling nervous about what the future holds for them, all came together to talk amongst themselves, but also talked with everyone else and said, like, how can I know that you will care for me, even though our histories have put you in a position and put me in a position that is not one of togetherness or connection?
There is a lot of fear among groups that have done significant oppressing that if others get power they will do to them what has been done. And there's a lot of really amazing stories from history where that hasn't happened.
It's amazing how much folks who are dealing with the impacts of oppression find resilience and togetherness and possibility when simply the yoke of oppression is removed.
Annanda: The fear that you spoke of is so palpable, it's so real, and it repeats itself in history. The fear of those who have oppressed: that the oppressed, when empowered, will then do the same to the oppressor.
Sarah: Those who are formerly oppressing can also find new ways of being if they're willing. And that is incredibly transformative.
Annanda: I mean there's a lot of folks of privilege in Berkeley and Oakland that are like, we want this to be a part of our legacy. Like, we want a different heritage. The healing of the generational traumas is real.
Sarah: But unfortunately in 2017 there was vehicular homicide by someone from the Unite the Right group that killed Heather Heyer and injured many, many others.
And it was the street medics, the people that we trained, and others who were first responders there who directly went and assisted them. It took a long time for the cops and for ambulances and official militarized police to respond, because the police had been standing back and letting the civilians go at each other.
This wasn't what was planned, because the police were supposed to intervene and do their understanding of their job of public safety. This allowed the militarized Unite the Right folks to have the upper hand, and that was severely injuring and severely impactful.
In 2018, when we had a memorial march in Charlottesville, the cops were everywhere. And there wasn't any significant observation from any neo Nazis or Unite the Right folks. This was a community moment to honor Heather, to honor all those who had come together the year before, to collectively grieve, and it was highly policed and that was devastating.
Keisha: You've been connected to groups like Christian Peacemaker Teams and Buddhist Peace Fellowship. What role do you think religion or spirituality plays or could play in public safety narratives and changing those narratives?
Sarah: Oh, I love that question so much.
It has quite a lot to connect. I consider religions long social movements with existential claims. And they often have to do with your ultimate orientation in life and how that informs everything else.
And it has to do with navigating tension and helping to release tension so that one can like cellularly feel their belonging or a sense of contentment or recognition that what is happening for oneself is not the only thing going on on the planet or in the universe. One is certainly part of something much larger.
Christian Peacemaker Teams now calls itself Community Peacemaker Teams to challenge Christian hegemony. Those roots of pacifism were seen in the life of Jesus and the creative nonviolent direct action and resistance to empire that was there.
We also recognize that in many other traditions as well—those threads of liberation—and we want to be as inclusive as possible and invite anyone whose spirituality moves them to do unarmed civilian accompaniment, coming alongside communities to transform violence and oppression and to deal with the militarized violence that they face, not to go toe to toe with it.
The weapons are so intense and the connections with techno-spatial systems of surveillance, even on a very small scale, the nanotechnology of control in societies… we can't compete with that even if we wanted to. And so what are the most humane and connective ways that we can live together despite this?
Annanda: Sarah, how have the technologies that oppose you changed?
Sarah: There used to be a thought that you could find the state in its buildings of the Capitol, in its offices, of its centers. And what we have seen is that the state is now much more diffuse.
People can work from home now due to widespread internet, access to computers, and security technologies that secure the data going across, where normally that would not have been centralized before in a hard copy file or a certain type of internal server.
So that is why you see more demonstration at people's houses. Cause they're like, “We know you're working from home. laughs We're here on your driveway and on your lawn!”
Wanting to find the people where they are to make a type of human connection is the idea, right? And to try to invite them into shared risk rather than consistent insulation away from the impacts they are having in the world.
Keisha: So the state doesn’t really have a single location anymore, and its tools and powers are dispersed or, as Sarah said, “diffuse.”
Police have always been able to show up at individuals’ homes, like we talked about, but for government workers (especially politicians and Supreme Court justices), the office used to be the only place ordinary people could reach them. And then a lot of places closed their offices or reduced their office hours and made it harder to show up in person.
Annanda: For those who actually go into government offices and talk with the people when they're available, their stuff gets done a lot faster than, like, via email.
Or if they call and are able to get somebody on the other end, like, their services are provided a lot swifter.
Keisha: Yeah. Even at local government, like city or state government. I was trying to fill out some forms the other day, and the extra effort it took! There's a phone number on it, but they didn't want me to call because when you call, you get shunted to a voicemail system. And then they, in the voicemail, they tell you to email them anyway. I can see it happening at the local level, which I think is shortsighted because the local level is where people actually experience government that works.
The national level, government does not work!
Annanda: No, but local elections, it's true, you get the immediate gratification. And the relationships, right? And the accountability.
Keisha: Yeah… What we've been talking about in this episode so far, like the relationship, the ability to human-connect-to-human as opposed to human-connect-to-data point or human… not even connect…. to the blob on the screen.
And how is our sense of possibility affected when we’re disconnected?
Sarah: I feel like we've had to retool our brains a lot, because in addition to the state being more diffuse, there's many other ways that they communicate that there is no alternative. Everyone can find their niche and just feel good and do their thing, and self-soothe while things are difficult.
And we're inviting people to reject self-soothing and say, we need a collective soothing.
And this collective soothing involves turning up the drama in the places where it can be decided on. And it involves staying with our grief. It involves managing our anger and it involves constant internal work in order to be ready to work together with others making a change with an entity that is consistently in motion and getting quicker and quicker at what it does.
Annanda: I think witnessing one another and our pain and grief is huge. We actually do grieve in community, and we're not the only creatures on Earth that need to grieve in community.
I wonder about the police officer, because who do you have to become in order to protect society?
Keisha: Yeah.
Annanda: And what does that do to rewire your brain and how you think about people and those that you love?
Why it's okay to put police officers through that kind of mental gymnastics that we also put soldiers through. And why we put civilians through the mental gymnastics—
Keisha: —Accommodate it or celebrate it or have to justify it and validate it and pay for it.
Annanda: Yeah, nobody's safe with this. The police aren't more safe being more militaristic. The populations aren't more safe. Their families aren't more safe. Our families aren't more safe.
Even the fact that there's a they and them shouldn't be the case when we're talking about policing. Like, it pains me to even have that dynamic.
Because my police officer should be my person. It should be the image that you had growing up, Keisha.
Sarah: The technology of guns is a real technology, and with so much gun pollution, we've had protests fired on in the States more than we used to. This has happened in many other places around the world. And there are paramilitary actors in much more highly configured groups around the world, but we're seeing more and more of that social technology come into play.
And so people are needing to prepare to deal with armed civilians in a way that we just, we hadn't as much before… with the exception of the constant public terror that indigenous and Black people have lived under.
AUDIO: Reflective music
Keisha: So the crisis isn't really a crisis or isn't acknowledged as a crisis until it impacts the “wrong” people.
There's this environmental justice concept called the sacrifice zone, which every time I think about it, I get chills. It's when a populated area becomes hazardous because people and companies have dumped toxic waste in it or they've economically abandoned the area.
So the place becomes unlivable, but the people who have always lived there still live there. So they're getting diseases and having disproportionate death rates because… they're considered to be acceptable sacrifices so that life as normal can continue for everybody else.
And it just makes me think about who our public safety system right now considers acceptable sacrifices.
Obviously Indigenous, Black, and disabled people…
But also… children…like… Columbine, Newtown, Parkland, Uvalde didn’t shift American gun culture.
Should I be shocked? I should be, but am I? Maybe not.
After the break, we learn about the community wisdom that’s helping folks move through fear, expose the violence that public safety technologies make invisible and ordinary, and how they’re building something new.
Segment C: solutions and wisdom to help heal the breaches conflict tech and surveillance are causing
AUDIO: reflective music continues
Annanda: Welcome back to Moral Repair, a Black exploration of tech. We've been talking with Sarah Nahar, a peacemaker and organizer who has a lot of wisdom about how communities can create alternatives to violence and militarized policing. For Sarah, public safety can be found in relationships.
Sarah: The type and quality of relationships that we have directly translates into safety.
We have a practice at home of thinking about who might we call instead of the cops? Part of thinking into abolition had a lot to do with knowing who else you could call. And if your relationships were not such that you had others to call, then some of your personal work around abolition could be to develop those relationships so that you had many others to call.
I credit Miki Kashtan for the work that she is doing around surrounding the patriarchal field with love and the community of nonviolent global liberation that is seeking to address separation, scarcity, and powerlessness through togetherness and flow and sufficiency.
This makes us have to think a lot about information, decision making structures, how people get fed, clothed, taken care of, as well as how their emotional needs are met and how we can weave togetherness.
What I do is, within movements, supporting movement articulators to get clear about what they're asking for and how they're doing it so that means and ends become as much unified as possible.
We think about how every tactic, action logic, every vision implementation, every goal reeks of the otherwise, of the beauty that we're seeking, of the hope that we have, of the way that we want to operate so that it's, as best as possible, undeniable.
Keisha: According to religious studies scholar Ashon Crawley, “the otherwise” is a way of invoking possibility, alternatives to systems of domination presented to us as natural and impossible to shift.
“In the midst of ubiquitous, seemingly unceasing violence,” Crawley writes, “we need imagination.”
So what can “the otherwise” mean in practice?
Sarah: What nonviolent direct action does is it helps to dramatize the situation of a violence that is occurring, especially when that violence has been ignored by the systems perpetuating it.
Dr. King was often accused of being a troublemaker or a rabble rouser or bringing up conflict. And he would remind people that we don't bring the conflict. The conflict is already here, but it's not close enough to have the friction so that everybody can see the pain that we're in. Everyone can understand beyond a shadow of a doubt that what is happening in the “status quo” is not okay, nor sustainable.
And so nonviolent direct action, rather than seeking to win over others, seeks to win others over. We want to invite people to know that there are other ways of being together in this world.
Yes! It will probably mean buying less stuff, but it will mean having more conversations. Yes, it will probably mean more conflict, but no, that conflict won't be as violent. It will mean going slower on your journey, but you'll go on your journey with more people and have more fun along the way. It's just these, kind of, reroutes.
We practice physical formations of standing, of sitting, of taking up space in particular strategic locations where the conflict is not being dealt with, but the people in those locations or the locations themselves have the power to change the situation.
Keisha: I love the theme of dramatizing the issue and placing that drama where it can activate action. To make it really concrete, that's a group of organizers locking hands outside of a venture capitalist firm that's funding pipelines and bombs.
Sarah: Precisely.
Keisha: Because they can do something about it.
Keisha: Sarah has protested with a community group called Upstate Drone Action, which protests drones outside the Hancock Field Air National Guard Base in upstate New York. They've been out there at the front gate of the base every first and third Tuesday since 2010, witnessing to another way.
Keisha: In the solutions that your groups have tried to practice—reinvesting in community ties, reinvesting in relationship, encouraging people to feel the hard things and stay with them. I'm curious about how you're responding to the increased depersonalization, the automation of violence.
I'm thinking of drone warfare, where somebody can push a button and lots of people die, but they don't have to see. You mentioned guns and the ways that guns shifting from manual to automatic kind of helps to accelerate the depersonalization and remove you from the consequences of pulling that trigger.
I'm curious what you think about that.
Sarah: There has been a group of faithful folks here in Syracuse, right outside Hancock Base, because they know that drones are flying from there to Afghanistan and Pakistan. And we stand outside the entrance and there are some faithful elders who are there on a monthly basis.
Some would call this protest ineffective because those drones continue to fly and the automation continues to go.
When they're there, they're brainstorming: what else can we do? What else do you need? It actually becomes a practice of hope, of meaning making. They're standing there with their signs and the images of the children that have been killed by drones, to make sure that they are those who remember, and that those who are leaving the Hancock Air Force Base cannot forget, or cannot go about their day without having the sensation through their eyes of some of their impacts.
Keisha to Annanda: I spoke to one of them recently and he said that they're now in their 70s and 80s. And they show up every other Tuesday, every Friday, Saturday morning at the farmer's market. That's what it takes.
Annanda: Oh my gosh, I love that.
Keisha: The determination, the persistence, to hold a different vision of the way things are, despite the evidence. This is not their first rodeo.
Annanda: What I'm hearing is young organizers talk to your organizing elders because they got stories and strategies!
Keisha: Yep. Knowledge, experience, wisdom. And if separation is fundamental to common stories about public safety, that group is a counterexample of what solidarity and connection can look like: reaching out and insisting on a human way to connect, even with people whose job is to kill in a removed and clinical way.
Sarah: This is another way that it feels depersonalized, because it's just all in code. What activists have decided to do is to make statements as much as they can at Google, and they're calling for some upcoming rallies to make visible what is happening behind the scenes: creating immense surveillance technology that has an impact if it's accurate and, given its levels of inaccuracy, has other types of devastating impacts.
And I don't know that we know necessarily any more than to say stop, don't make these contracts. You know, stop these data pipelines, like we would say, stop this other pipeline. As people are thinking about their relationship with data usage and participation, that all this begins to come into question. I know more and more people who are opting out of various types of social media as a personal practice and are not sure yet what they might call others to do.
Of course, in the academy, many have contracts with the Department of Defense, and this has to do with intellectual technology as well as the weapons technology. And so we are putting pressures on these types of spaces to show the connection between AI in the classroom and AI as a part of warfare and to figure out again, these fundamental questions of what does it mean to be human? What is enoughness? What is sufficiency? What might we be able to gift one another if we are not reliant on artificial intelligence?
Standpoint theory is a gift of feminism, to really invite people to take each other's standpoint seriously. And so even if you're having one experience with the tech, if someone else is having another one, and this other experience is violating their sense of flow, togetherness, belonging, human rights, connection with the environment and the larger web of life, then it is incumbent upon us to respond to that and to listen, and to stop when we need to stop, and to go only when we have found ways that this is going to be life giving for all.
AUDIO: Closing music
How might we then try to have an ethical stance? That takes a lot of conversation, but I would suggest that we defer in those conversations to those who are most impacted: where those bombs are dropped, where that surveillance technology is experimented with. That is who I would like to see centered in the conversations.
Annanda: Thank you so much, Sarah. You can really tell your whole spirit, your essence, your being is in this work and comes alive, and I'm so thankful.
Keisha: Thank you for raising the call to center relationship and to treat people as the gift.
Sarah: Thank you for this podcast and what you all are doing.
AUDIO: End closing music
Annanda: I long for a world that imagines technology not to be reductive and destructive, but to bring out the best of ourselves.
And you know, it's an achievable dream.
AUDIO: Credits music
Keisha: Yes, the technologies of policing and “public safety” get newer and fancier and more invasive all the time…. But the technologies of community care are actually old. And they’re simple: listening, imagining, making sure the people around you are ok, managing resources and risk together rather than every gang for themselves.
That’s what it takes to be safe.
###
CREDITS
Annanda: We’re building community with listeners this season. So reach out and talk to us on Instagram—our handle is @moralrepairpodcast. Also catch us on X… formerly Twitter… And LinkedIn. We’d love to hear from you!
Keisha: Follow the show on all major audio platforms—Apple, Spotify, Audible, RSS—wherever you like to listen to podcasts. And please help others find Moral Repair by sharing new episodes and leaving us a review.
Annanda: I’m Annanda Barclay.
Keisha: And I’m Keisha McKenzie.
Annanda: The Moral Repair: A Black Exploration of Tech team is us, Ari Daniel, Emmanuel Desarme, Rye Dorsey, Courtney Fleurantin, and Genevieve Sponsler. The executive producer of PRX Productions is Jocelyn Gonzales.
Keisha: Original music is by Jim Cooper and Infomercial USA. And original cover art is by Sam Martin and Randy Price. Our two-time Ambie-nominated podcast is part of the Big Questions Project at PRX Productions, which is supported by the John Templeton Foundation.
[PRX SIGNATURE]
SHOW NOTES
Talk to us online: at Instagram (@moralrepairpodcast), on X (@moralrepair), and on LinkedIn: https://www.linkedin.com/company/moral-repair-podcast/
Follow Sarah Nahar’s latest community project, EFERT: https://www.efert.org/
Ashon Crawley’s essay on “the otherwise”: “Stayed | Freedom | Hallelujah” (LA Review of Books): https://lareviewofbooks.org/article/stayed-freedom-hallelujah/
972 Mag reports on algorithmic systems the IDF has used to target buildings and people: https://www.972mag.com/lavender-ai-israeli-army-gaza/
Journalist Yuval Abraham talks to Democracy Now!: https://www.democracynow.org/2024/4/5/israel_ai
The Climate Reality Project on sacrifice zones: https://www.climaterealityproject.org/sacrifice-zones
The Bay Area’s KQED reports on “No Tech for Apartheid,” a campaign against Google’s and Amazon’s contract with Israel’s military. https://www.kqed.org/news/11986743/the-tech-employees-who-want-to-sever-silicon-valleys-deep-ties-with-israel
The NYT explains “the unseen scars of those who kill by remote control” (2022): https://www.nytimes.com/2022/04/15/us/drones-airstrikes-ptsd.html
GUEST BIO
Sarah Nahar née Thompson (she/her) is a nonviolent action trainer and interspiritual theologian. Now a PhD candidate in Syracuse, New York (Haudenosaunee Confederacy traditional land), she focuses on ecological regeneration, community cultivation, and spiritual activism. Previously, Sarah was a 2019 Rotary Peace Fellow and worked at the Martin Luther King, Jr. Center in Atlanta, Georgia. She is a member of the Carnival de Resistance and has been the Executive Director of Christian Peacemaker Teams. She attended Spelman College, majoring in Comparative Women’s Studies and International Studies and minoring in Spanish. She has an MDiv from Anabaptist Mennonite Biblical Seminary in her hometown.
Life on Spaceships and Mars
Season 2 | Episode 6
In this episode, Annanda and Keisha explore the Big Question: is it worth the expense to go to Mars, given the needs on Earth? And what would it be like to live on Mars or in space? They interview Kai Staats, Director of Research for SAM at the University of Arizona, to get the space tea.
-
The Big Question
This episode explores the big question of what it would be like for everyday people to actually live on Mars or on a spaceship in space.
Lead Guest: Kai Staats
Intro
SOUND Montage: NASA Cape Canaveral Countdown [Space Shuttle Launch Countdown 0:02-0:14], Neil Armstrong Moon Landing [One Small Step, One Giant Leap 0:40-0:48], Mae Jemison TED Talk [Want interstellar travel? Build interdisciplinary teams | Mae Jemison 10:54-10:58], Sally Ride Interview [An Interview with Sally Ride 04:29-04:33]
Annanda: Hi Keisha,
Keisha: Hi Annanda,
Annanda: Let’s talk about Mars, baby. Let’s talk about dreams to be. Let’s talk about all the good things and the bad things that can be. Let’s talk about space.
Keisha: I'm getting 1980s flashbacks right there.
Annanda: Then my job is halfway done.
Keisha: Do you really want to go to space?
Annanda: I would be intrigued to go.
I think it'd be cool. But I need several people that have gone before me for me to feel comfortable to go, but, why not? How about you?
Keisha: I don't know, like, you have your little suit on that they've worked so hard to engineer for you, and then something breaks or there's a solar flare or some kind of rock storm out there and I guess it just reminds you that things could be beautiful.
But that doesn't mean you should go up and touch them.
Annanda: Okay, Icarus. Okay. No, I'm with you. I will keep my behind in the ship or in the station. There's some outside I don't need in this lifetime, and space outside is one of them. I've come too far. The ancestors went through too much. You know what I mean? We're good.
Keisha: But I think it's just something about humanity where even if we know the risks, we're going to go try it out anyway. Like, the ocean is also deep and scary and we know a tiny fraction of it, but we still swim in the ocean. We still love the ocean. And we're still curious about going further and further into it.
I think space draws us in the same sort of way.
Annanda: When you were a kid, did you have dreams of being an astronaut?
Keisha: Actually no. And I'm not sure why. Because, as you know, I love Star Trek.
But I didn't imagine it as something for me, or maybe people like me. There were not a lot of engineering or physics models around us, like people in those specialties. So I didn't see myself that way. But the part of things like Star Trek that I loved and did see myself in was the idea of being part of a team. Everybody had a different role to play, and it was because each person played their role really well that the group prospered, and I just loved that concept.
Annanda: Yeah, I remember my grandparents, they retired in Florida, and we would go every year for a month in the summertime, and my granddad would always make it a point to take us to Cape Canaveral, to look at all the different spaceships. So space. Yeah I'm about it.
But Mars, I mean, with the billionaire Mars race going on, what are your thoughts about living on the red planet?
Keisha: I lived in West Texas. We have dust storms there. Yeah. I think I've had enough. If you've ever tried to clean red dirt from everywhere it has gotten and shouldn't have gotten, then you're not trying to live on the red planet.
Annanda: I'm interested in the development of that technology and also what it's sharing about what it means to live on earth and how to live on earth. I think it really goes with my values of sustainability. I am an Earth alien, I am here for this planet.
Keisha: The other part of thinking about living in space is how silent and how empty it is.
Annanda: This episode explores the big question: is it worth the expense to go to Mars, given the needs on Earth? And what would it be like to live on Mars or in space?
SOUND: title music starts
Keisha: I’m Keisha McKenzie…
Annanda: I’m Annanda Barclay…
Keisha: This is Moral Repair: A Black Exploration of Tech. A show where we explore the social and moral impacts of tech… and share wisdom from Africana culture for caring for and repairing what tech has broken.
Annanda: When we return: there are so many problems here on this planet that living on Mars would seem to be a reality only for the wealthiest of people, and a divestment from the issues we're facing on the blue planet now, making it a pipe dream for the rest of us, at the cost of most of us. We’ll talk to somebody who knows what it’s actually gonna be like to live on Mars.
SOUND: End title music
Segment A
SOUND: interview cue
Annanda: Kai Staats is Research Director of Project SAM at the University of Arizona. SAM is the world's largest and most technologically advanced simulator of life on the moon and Mars. A place that replicates, more or less, what it will be like to actually live on the red planet.
Kai, welcome, welcome. So glad to have you here on our podcast today.
Kai: It's an honor to be here. Thank you.
Keisha: SAM, is that an acronym?
Kai: It is. It's a Space Analog for the Moon and Mars. Technically it should be SAMM or SAM², but I just thought SAM was easier.
Keisha: I think you're right.
Annanda: Yes, yes. Why should we care about Project SAM?
Kai: Boy, we're getting right to the point, aren't we? So, like many kids, I grew up wanting to be an astronaut. I definitely went through that phase, and maybe I never quite outgrew it. So there's a certain fascination with wanting to live on another planet. Perhaps that's fueled by a hundred years of science fiction, or perhaps it's fueled just by some innate desire to explore.
SAM is, at one level, my contribution to that exploration. I was part of the Mars Desert Research Station, which is the longest running Mars analog, operated by the Mars Society. I was a member of a crew in 2014, and it definitely fueled my inspiration and understanding of how analogs play a role in preparation for human exploration of space.
Annanda: What's an analog?
Kai: Good question. Analog, meaning a place where you practice something before you do the real thing. NASA has been operating analogs for many years, since the 1950s, and the Apollo astronauts trained in various analogs on Earth. For instance, the Apollo astronauts went to Meteor Crater, Arizona, just outside Flagstaff, under the guidance of Gene Shoemaker. He was a planetary geologist at the University of Arizona and took the Apollo astronauts to Meteor Crater because so many of the geological features of that crater were similar to what they would experience on the moon. They practiced driving their buggies, wearing their spacesuits, moving over difficult terrain, and using different tools at Meteor Crater, as well as in Iceland and many other places on Earth. Those are outdoor analogs.
There are also indoor analogs, or controlled environments, in which you're creating a spacecraft, for instance. And that's where SAM comes into play. SAM is a Mars habitat analog, meaning what's on the inside of where we're going to live on another planet.
SAM is, and was, an opportunity for me to participate in that preparation by creating what is currently the world's only operating, hermetically sealed and pressurized Mars habitat analog.
It's a building that represents how we might live on the moon or Mars. It's sealed. When you're locked in and we shut the airlock and turn that big latch on the hatch, you feel like you're in an actual spaceship.
Annanda: That's pretty sweet.
Keisha: Can you zoom out to the Project SAM motivations overall? How would you describe what is driving that project overall?
Kai: What's driving SAM overall is a fundamental goal of demonstrating the transition from physicochemical to bioregenerative life support.
Those are fancy words for machine-based and plant-based life support. All human space travel to date, everything from the late 1950s with the Russians all the way through the International Space Station, has been using machines. That's why it's called physicochemical: it's a machine with chemicals that extracts or sequesters the carbon dioxide from the air, and all of the oxygen is brought up in tanks, essentially.
That's not sustainable for humans. On two-year missions or five-year missions or 20-year missions, in which we have maybe a dozen or a hundred people living on another planetary body, we can't be reliant on machines. They break down, they need replacement, and psychologically we want plants.
We're a species that's evolved and grown up with plants; plants make us feel good. So ultimately we do want to have hydroponics, aquaponics, greenhouses on the moon and Mars. And the function of those plants is multifold: it's aesthetic and emotionally pleasing. On the whole, they take in more carbon dioxide than they produce, and they produce more oxygen than they consume, so we think of them as air cleaning systems. By this time next year, we will have demonstrated the world's first transition from physicochemical, machine-based life support to bioregenerative life support, which means that in our facility we will have a mechanical CO2 scrubber similar to the one that's on the space station. We will have that activated, and we'll have one, two, three, or four people move into our habitat with seeds in their pocket.
They will then plant those seeds, grow them into seedlings, and transplant them into the hydroponic system. Depending upon the plant species (if it's a radish it's 28 days; if it's a lettuce it's a little bit more), that plant will germinate and grow, and as the plant matures, it will draw down the carbon dioxide level produced by the humans and produce the oxygen required by the humans to breathe. And we will be able to turn off, or at least reduce the reliance on, the machine. That's an exact demonstration of what's going to happen on the moon or Mars 20 years from now. So that's the number one goal. That's why we built SAM. That's why I designed it.
Now there are many other experiments that are happening, that will happen in and around that, but that's the primary goal. And for me, it's exciting because for lack of a better cliche phrase, it brings us back to earth and in an understanding of how we are related to the biomes and the ecosystems around us.
Annanda: Oh, Octavia Butler would be very proud and very nerdy, like, happily nerded out at Project SAM.
Keisha: Indeed.
Annanda: Initially with these space projects, as cool as they are, because they are cool (I feel like NASA and the fire department are two things that I will always happily give my money towards as a taxpayer), I always feel a way, like a Gil Scott-Heron kind of way, like "Whitey on the Moon." But now it feels like "Whitey on Mars," right?
And for those of you who don't know the reference, please look it up lest you misinterpret what I am saying. But basically, there are so many issues here on Earth, especially right now, and we are putting our funds towards space while we have people in deep, deep, deep need. That tension always comes to mind for me as we're thinking about space travel.
Keisha: Yeah. I remember us talking about some of the escapist literature and films showing people with wealth being able to run away from climate change and insulate themselves from the consequences.
And there has been that critique in the last hundred years of how really, really wealthy people and governments have put money that could otherwise have gone to education or health care or other more mundane things to literal moonshots, like the moonshots of the 1960s to get to the moon in the first place, and then to some of these more experimental research projects that simulate what it could be like on other planets.
Kai: There's a story that was told to me by Dr. Jim Bell, who's a world leading planetary geologist and also the designer of the cameras for the last four Mars rovers. I don't remember exactly who was the subject of the story, but there was a gentleman giving a public talk about Mars rovers and exploration of other planetary bodies, and someone in the audience who was disgruntled said, "What a waste of money.
I want my money back." So he reached in his pocket and he pulled out one or two quarters. I think it was 50 cents. And he walked across the stage and handed 50 cents to the guy and said, Here, here's your money back. It's nothing compared to everything else that we pay for. But the outcome is extraordinary. And so in the same respect, what we're doing at SAM, I believe, has extraordinary outcomes.
Annanda: What would living on Mars actually look like? Given what you've seen at SAM what are some quirks?
Kai: Number one, you're indoors all day long, every day. And a lot of people say, Oh my God, I couldn't do it. And yet I remind them that the average American doesn't spend that much time outside these days.
You go from your air conditioned home to your garage to get into your air conditioned car, to drive to a covered parking structure, to go into your air conditioned office. Then you go to a gym and work out in an air conditioned gym, and then you go to an air conditioned grocery store, and then back to your home.
The actual amount of time you've spent outside of some kind of man made object is maybe 20 minutes a day. Now, of course, there are many people who go for walks in a local park, or they go for a run in the morning, and that's awesome. But if you look at the average person, they don't spend that much time outside.
I'm not saying it's for everyone. It's actually probably not for me, because I'm an avid outdoorsman. But that is the case: on the moon and Mars, you will be inside all the time. And the only time you go outside, which would be relatively rare, is when you're encumbered by a full pressure suit, by an EVA suit.
That becomes your personal spacecraft. It's how you move across the lunar or the martian surface. And you can't spend too much time in that suit because the radiation protection is minimal.
Spacecraft, like homes, are manmade objects, and all things, according to the laws of entropy, break down. Everything breaks down. Rubber seals dry out. Closures break. Hinges need oiling. On the International Space Station, I don't know the exact percentage, but a significant portion of the astronauts' time is spent just maintaining and repairing the space station so it doesn't fall apart.
And that's a vehicle that has no influence of gravity. So it's not even being pulled the way that our houses are being pulled. We see shingles fall off our house in a windstorm, or we see water come in from heavy rains. Even in that relatively constant environment of microgravity and a constant temperature, or fairly constant temperature, in and out of shadows, the space station's falling apart constantly.
And so even on Mars, your habitat is being pulled. The thing that keeps you alive is going to need repair. You're going to be maintaining your vessel on a regular basis.
You're going to be constantly monitoring your air quality, your water quality. You're going to be monitoring your food stores. It's not unlike a voyage across the ocean on a single small sailing ship, in which every single item you brought with you is imperative for your survival.
Keisha: The weight of everything matters as much as the costs of maintenance and time to maintain it.
Kai: That's right. Until we have larger infrastructure on Mars, and I mean manufacturing facilities, 3D printed facilities, CNC mills, we won't have the ability to convert regolith, which is a fancy word for essentially inert soil, into usable objects such as aluminum or steel or glass.
And it's not a simple task to convert stone into usable materials. It actually takes quite a bit of effort. We take for granted that there are entire factories the size of small cities that do this for us all over the world every day. And to bring those small cities to Mars, we're going to have to completely change the way in which we do manufacturing.
Annanda: Oh, Jesus.
Kai: Just to get one kilogram to orbit, that's not even to Mars. It's two or three times that cost to get it to Mars. So you're not bringing bags of concrete with you.
Keisha: Or heavy machinery.
So staying with the, what would a kid ask, like, what do we do with the bathroom?
Kai: Oh, that's a good one. What goes into our body is what keeps us alive.
And what comes out of our body, we got to do something with it. You can't just throw it overboard.
Annanda: Oh my gosh, please tell me you're doing humanure. Please.
Kai: So there is a very real possibility of doing humanure. Yes. And to back up just a little bit, there's a modern fear of humanure that is not fully predicated on historic evidence.
Humanure has been used for thousands of years all over the world, and there's nothing wrong with it. You just have to do it right. You just have to make sure it's processed, or I should say dried properly, in the sun or by other means. My neighbors where I live have been doing humanure for decades, and they have no issues. So yes, the technology exists, and we know how to process human waste, both urine and feces, into usable materials again. The International Space Station has a urine processor that was designed and developed by Paragon Space Development Corporation, which is here in Tucson.
The urine processor is about 97 or 98 percent efficient. The efficiency runs out because of the salts. There's simply nothing you can do with the salts that are in human urine. It's not the kind of salt you can process and sprinkle on your eggs in the morning. It's a salt that simply doesn't go anywhere.
So in that respect, we do get 97, 98 percent of the water back. Human feces is a little bit trickier; they're not processing human feces on the space station for any usable material. However, if you do a little YouTube search, there are in fact people who have learned how to take human feces, and I think they mix it with baking soda and bake it on their stove, and turn it into spools of 3D printer plastic.
Keisha: What?!
Kai: Yes, so you can create, you can 3D print poo things.
And I'm not suggesting everyone try this at home. I don't believe the smell would be all that great. But it demonstrates that if we get down to the molecular level, the molecules in human feces are usable.
Keisha: I'm just imagining all the parents in our audience writing us angrily because it's summer time, so the kids are at home, something might go wrong.
Kai: Mom, dad, can I turn my poo into plastic, please?
Keisha: Oh my goodness.
Annanda: So Kai, what have you seen with the social impact of living in these enclosed spaces, right? Of, community, dating, the general need for touch. What about privacy, entertainment? What have you observed?
Kai: All good questions. These fall into the social and psychological research realms, and at SAM we are intentionally not focused on those particular subjects, although we do observe them, and they certainly come into play.
We leave the long term psychology studies to professionals at NASA, and NASA is almost entirely focused on long term psychological studies. In fact, there's a mission running right now at NASA Johnson Space Center in a new 3D printed habitat, and they did not 3D print it out of poo. It was out of concrete. At least that's what they tell us. They have, I think, four or five or six people living inside the facility for extended periods of time, several months, and this is their second mission in that new facility.
There will be physiology studies, there'll be food studies, there'll be tool use, and a very stringent daily routine as is true with astronauts as well, but they're principally focused on the psychology.
The best analogy is submariners. Navy folks who live on submarines for weeks at a time without surfacing. That's the one we should be looking to. There are many challenges in that, there are many ways of managing and governing human behavior.
We really have to look at individual characteristics of each human and train those people long before we take them to another planet. We're kind of a messy species.
We're very, very complicated when it comes to our social behavior and our individual characteristics. Despite the fact that we're the most genetically homogeneous species on the planet, our characteristics as individuals are widely diverse and disparate. So how do you take all these people, from different walks of life, from different ethnicities, different languages, different cultural backgrounds, and put them into a box and say, good luck?
It's going to take you seven months to get to Mars. There's no windows. There's no gravity. You're in something the size of a small school bus with three other people who chew with their mouth open and fart and don't like the same food as you and don't like the same music as you. Many years ago, we consulted with a former NASA astronaut psychologist. In a one on one conversation with him, he said, "do you want to go to Mars?"
And I said, "yeah, absolutely." "but I couldn't." And he says, "why not?" And I said, "I would be driven nuts by everyone else around me." And he says, "what do you mean?" I said, "well, I can't stand the sound of people chewing. It drives me insane. I have to duct tape everyone's mouth shut."
And he says, "No, That's not a problem. We just put you in a crew with three other people who also can't stand the sound of people chewing."
The point is that we look at all the characteristics, all the traits, all the skills and the education that people have. Are you a doctor? Are you an engineer? Are you an electrical engineer? Are you a programmer? And you think, we have to have the perfect balance of these traits to make sure we survive. But really, it's no different than a marriage. You want to be with somebody who doesn't drive you insane. What they do for a living doesn't really matter in the end.
All those other skills can be trained. You can give people enough of a medical skill set to save someone's life in several months or a year of training, but you can't change the fact that they chew with their mouth open.
That's how you build your teams. You build your teams based on who actually gets along for long periods of time. And NASA is very good at building those teams over years and years and years of training before they ever go to space.
Keisha: The chemistry matters in the end.
Kai: Exactly.
SOUND: break cue
Annanda: After the break, Kai will share positives he's uncovered with Project SAM about what it means to live practically on Earth as well as in space.
[BREAK]
Segment B
Annanda: And we’re back with Kai Staats, the person who runs the most complex simulator in the world of what it would be like to live on Mars, the Moon, or a spaceship.
SOUND: interview cue
Annanda: What are some interesting things that Project SAM is teaching us about living on earth?
Kai: SAM, at its forefront, when you first walk up to the building, feels like a technical home, in the sense that in some respects it's just another place to live, but there are nuances to it. There are details to it. The maintenance and the monitoring of the facility bring it into that space travel experience, that spaceship experience. When you're actually sealed inside, the moment that we close that hatch and everyone's standing in the airlock and waving goodbye to the folks outside, you are immediately aware of everything you do.
Every breath you take. In such a small space, it's only 1,100 square feet, every breath you take is immediately contributing to the carbon dioxide in that facility, and you are immediately aware of, we need to remove this carbon dioxide. We need to convert it back to oxygen or flush it out of the system.
The four people are given 55 gallons of water for every six days. Every team has come out of there with at least 10 gallons left. I think the least was eight, but the last team had 10 or 12, 15 gallons.
You're looking at really being conscientious of how much water you actually need, and how much water we actually waste here on Earth. So in that immediate respect, SAM is a demonstration of how we could be living completely satisfactory, completely enjoyable lives without sacrificing. The word sacrifice doesn't even come into play.
In fact, I would argue that when we're living simpler lives and we're more aware of these things, we're actually enjoying life more because we're not taking for granted everything that's given to us. Especially as Americans with the largest consumption of resources of any individual person on the planet.
The irony is that if you go to someone and you say, you should use less fresh water, you should eat a more vegetarian diet, or at least eat less red meat for many reasons, you should be conscientious of the plastic that you consume. There are no waste cans in SAM.
I don't care if you're in there for six days or a month. Everything you bring in stays inside and you have to carry it out when you're done. Just like backpacking or camping. If you go to people and tell them these things, the natural reaction is, don't tell me what to do.
It's my right to eat red meat, or whatever people say. And yet, the reality is if you say, "Hey, do you want to go to Mars?" "Yeah, that'd be great." "You want to live in space?" "Sure, I've wanted to be an astronaut since I was a kid." "Okay, you're probably going to eat less meat, less red meat, and you're not gonna be able to take full showers."
So it flips it on its head and it gets people excited about the very thing that they otherwise would feel like is somehow their God given right. And every person who's gone through our facility, we've had three teams inside, so 12 people total.
Every one of them has come out with some kind of realization, some comparison between how they lived in SAM and how they might want to improve how they live back in the real world.
Keisha: The thing I love about what you've said is that the people participating in these experiments are getting a very visceral sense of a more sustainable way to live.
Is actually living on the red planet Mars a pipe dream for most people, or something you expect will only be available to the wealthy, the military, and others who have either access to these research facilities or might be able to buy their way onto a ship or a commercial space flight? What do you expect?
Kai: The first answer is not everybody wants to go to Mars. It's a very small, select number of people who would actually want to live in such a harsh environment. It's not easy.
It's scary. But for those who do want to go, let's rewind 100, 110 years to the very first people who ever flew an airplane. There were no passengers. There was no such thing as passengers. Everybody who flew an airplane was crazy. And they were not just pilots, they were explorers.
They were nuts and people did not necessarily envision a day in which every single person would have a chance to fly an airplane as a passenger. Those experimental flights, in those first 10, 15 years pushed aviation from experiment to commercial.
They were funded by wealthy people. I mean, sure, you can build these things in your garage to some degree, but somebody has to provide the funding. And as we transitioned from those early prototypes that fell apart and crash-landed, we moved into higher fidelity systems, higher fidelity prototypes.
Somebody had to fund those. And it was wealthy individual people. The first people to fly were wealthy individuals. And yes, at the time, most of them were upper middle class Europeans or North Americans; the people who had the funds were principally of European descent. And so we have to look back and say, okay, that's how this started. And in some respects, space travel is similar. There's an interesting saying: the government agencies have the money, but they don't have the courage.
The private entities have the courage, but don't have the money, right? So Elon Musk is not afraid to blow up a spacecraft or two. Quite frankly, he's probably not afraid for one or two people to die. That's part of the experience. And it's true. Every experiment of this nature, every extraordinary excursion and adventure puts people's lives at risk. So not devaluing human life, but saying, these are people who have chosen to take this risk.
We're in the middle of a rapid evolution from what has been almost a 50-year stagnation, from Apollo to the Space Shuttle to the International Space Station. We haven't left low Earth orbit with humans in 50 years. It's insane.
And yet we're going to the moon again, and in that process, yes, there will be wealthy people who can pay for a ticket to the space station, and that's okay. Let them spend their money, because they're the ones who are paving the way for the rest of us to make it cheaper.
The difference is that we are in a different paradigm. We are in a different paradigm of, I won't say equality, but we have a much, much better mix of ethnicities and heritages going to space. We know that when people walk on the moon, there's gonna be a woman and there's gonna be a person of color. And that's cool. That's really cool. So it's following a similar pattern, but we're 50 years later and we're going to do it better.
Annanda: How has this research changed you, Kai? How have you been moved and transformed by Project SAM?
Kai: The work that I've been doing for the last three and a half years has taken me away from my computer from 7am until sunset. I work 80 hours a week. It's incredibly intense, which is also how we built it for so little money, because we are a hands-on team. All of us spend time away from our cell phones, away from our computers, with hands-on construction. So we're welding, we're shaping metal and plastic and wood, we're designing these pressurized interfaces of very dissimilar materials; shipping containers weren't designed to be pressurized.
We enjoy that time together in that physical real world. For me, as someone who's always been an inventor and always loved building, but got pulled into the computer interface for way too many years, I'm thrilled to be back in that physical world again. Honestly, if I never turn on a computer for the rest of my life, I'd be completely satisfied.
It's literally transformed the way that some of my team members think and see the world. And so I would recommend that anybody go out and garden, build things, do pottery, get the woodworking tools out and start doing woodworking on the weekends. And turn off the damn cell phone.
I think it just robs us of our creativity and robs us of our focus. Good ideas do not come in 30 second sound bites. The good ideas, inventions, come through focus time and the ability to rotate an idea in your head over and over again until you've seen it from all angles.
And when you're interrupted every 30 seconds, every 45 seconds, you simply cannot be as creative as when you have silence and when you have uninterrupted time. It's physiologically impossible.
Keisha: That's a mic drop.
Annanda: Indeed. Kai, thank you for your generosity and sharing about Project SAM. Thank you for being all of who you are. This was an amazing interview.
Kai: Thank you.
SOUND: break cue
[BREAK]
Segment C
SOUND: we’re back cue
Annanda: So Keisha, what are your thoughts, given Project SAM and all the rich things that Kai had to share with us today?
Keisha: So I'll tell you something really, really nerdy about me.
My mother used to sell encyclopedias and World Book Encyclopedia had a children's version called Childcraft. My favorite volume in Childcraft was about mysteries. And there was a mystery about why the Dogon people in West Africa knew about Sirius, the star system.
Because of course, why would Black people know anything about anything?
Why would they be able to study the stars for hundreds or thousands of years before colonial contact?
But I just remember reading that story and feeling, again, that sense of awe about people living in very different circumstances from me, but to whom I felt somewhat connected, looking up and using mathematics and spirituality and talking amongst each other and building a book of knowledge and sharing that generation after generation.
I was entranced by that in the same way that I get entranced by looking up at the stars. And I take from that that part of our cultural heritage is the practice of looking up, looking beyond ourselves, and then threshing the knowledge and the insight together as a community and seeing how it can reshape our daily life.
Annanda: That's great. I think for me, what has come to mind is, I don't know if you've ever read it, and it does not get enough press, but I think it will as space travel becomes more common: Octavia Butler's Xenogenesis. This is a sci-fi story where humans have devastated the planet, and an alien species comes along, and they're basically preservationists of species. Not to colonize them, but to be like, oh, y'all are dying, let us restore your natural habitat and see if you all can live together in harmony. The book basically begins with groups, pods of people, similar to what Kai was talking about, that the aliens determine, through certain instruments and testing, could be good with one another.
Meeting each other on this spaceship that is also living and breathing. The spaceship itself is a living vessel.
That whole book is about what it takes for people to not kill each other, to learn how to live in pods as total strangers so that they may survive. And the aliens are testing to see: will they have the social skills and the sustainability skills so that if we put them back on this planet in different places to repopulate the earth, they're gonna be okay? I think there is wisdom to that observance of, are we able to get along? Not this idea that you have to get along with everybody, but can you get along enough to survive, to be sustainable, to have common sense?
SOUND: Closing theme starts
CALL TO ACTION
Annanda: We’re building community with listeners this season. So reach out and talk to us on Instagram—our handle is @moralrepairpodcast. Also catch us on X (formerly Twitter) and LinkedIn. We’d love to hear from you!
Keisha: Follow the show on all major audio platforms—Apple, Spotify, Audible, RSS—wherever you like to listen to podcasts. And please help others find Moral Repair by sharing new episodes and leaving us a review.
S2 CREDITS SEGMENT
Annanda: I’m Annanda Barclay.
Keisha: And I’m Keisha McKenzie.
Annanda: The Moral Repair: A Black Exploration of Tech team is us, Ari Daniel, Emmanuel Desarme, Rye Dorsey, Courtney Fleurantin, and Genevieve Sponsler. The executive producer of PRX Productions is Jocelyn Gonzales.
Keisha: Original music is by Jim Cooper and Infomercial USA. And original cover art is by Sam Martin and Randy Price. Our two-time Ambie-nominated podcast is part of the Big Questions Project at PRX Productions, which is supported by the John Templeton Foundation.
[PRX SIGNATURE]
Show Notes
More Info on Project SAM https://biosphere2.org/research/research-initiatives/sam-mars-analog
Xenogenesis by Octavia Butler
AI & the African Diaspora
Season 2 | Episode 7
This week, Keisha and Annanda explore AI and tech from the perspectives of the African diaspora in North America, in Europe, and continental Africa. We ask: Where’s Africa in the story of AI? What does the diaspora have to say about inclusion in tech? Our featured guest is Mutale Nkonde (AI For the People), and we get into inclusion, colonialism, and what we can all learn from the Maori.
-
S2 E7: AI and the African Diaspora
Lead Guest: Mutale Nkonde
Description: This week, Keisha and Annanda explore AI and tech from the perspectives of the African diaspora in North America, in Europe, and continental Africa. We ask: Where’s Africa in the story of AI? What does the diaspora have to say about inclusion in tech? Our featured guest is Mutale Nkonde (AI For the People), and we get into inclusion, colonialism, and what we can all learn from the Maori.
---
Segment A: Where are Africans in the AI Story?
Keisha: Hey Annanda.
Annanda: Hey Keisha.
Keisha: So much of how we think about tech, machine learning, or AI is still shaped by people from a really narrow list of countries, folks from similar economic classes and cultures.
Annanda: It frightens the bejesus out of me, you know?
Narrow understandings of what it means to be in relationships, of culture, of ways of being… to make something so big…
There’s so many different ways to be, and those different ways are all valid, yet it seems to be very narrow cultures shaping AI.
Keisha: I was thinking about this the other day. Facebook has more users than some countries. You could be from the Republic of Facebook and you’d have more citizens [laughs] in your clan than some actual countries.
Annanda: I did not think about that. But you’re right, more people.
Keisha: And I guess it’s not actually a republic, which is the problem, right? It’s not just that so many key decisions are made by so few people and so few kinds of people, but that the rest of us have so little ability to influence those decisions. That’s what I’m thinking about: the diversity of boardrooms, STEM companies, and AI in particular.
Because AI is showing up now… everywhere. Everyone’s “featuring AI” or with an AI search bot or AI in my text messenger. It’s like sand!
Annanda: It is like sand. It’s that insidious and will likely be in people’s draws soon. [both laugh]
Keisha: If I say “Africana perspectives on AI,” what comes to mind for you?
Annanda: Freedom… joy... sense, which ain’t common… and hope.
Keisha: Mmm… We definitely need all of that right now! And Moral Repair is about expanding the table and bringing in Africana perspectives on this emerging technology… So take a listen to this clip… from Jake Okechukwu Effódúh (eff OH do) — a Nigerian lawyer who teaches about AI and human rights at Toronto Metropolitan University’s law school.
AUDIO: Jake Okechúkwu Effódúh on 300 Seconds: (01:34-01:44; 01:55-2:06)
“Several reports have said that in terms of adopting AI, African countries are slow or that we have a low AI adoption rate but this position is changing really fast…
Seven countries in Africa now have national AI strategies. We have nine countries in Africa that have established either an AI agency or an AI commission or an AI task force.”
Annanda: Okay, that’s pretty big news for AI on the continent. As a diasporic Africana person, I’m real excited. I’m like, what Wakandas have y’all cooked up? Knowing of course that [chuckles] these are not utopic, but I’m very curious to have that Africana perspective of, how do we use this?
Keisha: Yeah, so this audio is from a YouTube clip from 2022 when multiple African countries had already started to develop national approaches to artificial intelligence: countries like Nigeria and Ghana in the west; South Africa; Kenya and Tanzania in the east.
Advocates with African roots—they’re working together to ensure that AI is built around community well-being and human rights… and goes beyond national economic growth or companies’ profit.
Annanda: I have Black pride just for that because it just makes some damn sense! Keep telling the good news.
Keisha: Effoduh says, “We need more Africans in the AI space, more homegrown algorithms.”
And I understand that as more tech designed from African and Africana perspectives, not just imported Western tech with Black edits on it.
Annanda: Come on now.
Keisha: We want to know – what difference could that make? Not just in Africa. But in the world.
AUDIO: title music starts
Annanda: I’m Annanda Barclay.
Keisha: And I’m Keisha McKenzie
Annanda: This is Moral Repair: A Black Exploration of Tech.
Keisha: This week we ask how the information revolution and the age of AI are playing out among Black people around the globe. Where’s Africa in the story of AI? And what does our diaspora have to add to the conversation about inclusion and ethics in tech?
Annanda: We’ll get some answers from Mutale Nkonde, sociologist and founder of the tech racial justice organization, AI for the People… after the break.
AUDIO: end title music
Break
AUDIO: back from break music
Segment B: AI & the Black Diaspora
Annanda: Hi Mutale!
Mutale: It's a pleasure to meet you.
Keisha: Thank you for being with us today.
Mutale: I'm very excited.
Keisha: How, in everyday terms, do you describe what AI for the People does?
Mutale: AI for the People is a very smart advocate. We are not legal practitioners, nor are we lawyers, but we think about ourselves as social scientists who understand engineering practices and who advocate for the diminishing of discrimination and bias.
The ability to open your phone with your face is much more effective if your face is white. Imagine if you'd had an advocate that said simple things: your face is something that many of us will never change. Do you want to use it in a transactional way?
And then decide from there, are we going to build that application? And then, once we had spoken to our Native siblings—we're having this conversation on unceded territory, in what we now call the United States, but I too am an Indigenous person. Many other people from other subcontinents are Indigenous people.
And they have very strict ideas about what personhood means and is. Maybe facial recognition isn't appropriate for them. If we are going to recognize faces, are we recognizing humans, or are we just recognizing white humans?
These sticking points provide opportunities within the engineering stack for product teams to lean in, to ask questions. We can bring in our friends from marketing, design, UX, and then decide: what type of company and what type of infrastructure do we want to provide?
And then, say we are going to use facial recognition: are we going to give this to the security forces? Now, one of the great things about racist design is that facial recognition really doesn't work for us anyway. So I travel a lot. I have gotten to mad companies in Europe. They've been like, show your face here. I've looked at the score that comes up: 93% inaccurate. That's not safety.
Keisha: Wait, just so I understand what you just said, in looking at the back end of these facial recognition systems, you're saying it can be 93% inaccurate.
Mutale: Inaccurate.
Keisha: Oof!
Mutale: So I guess we get to shape-shift and slide under the radar. But I think that we should be using technology to improve everybody's lives, as opposed to making some people more miserable and then rendering some of us invisible.
I was recently in Germany, and I was coming from the UK. We're no longer in the EU, so I have to go through different customs, and I confront facial recognition.
I know that thing is optimized for Germans.
Keisha: Yeah.
Mutale: I don't look like no German. I go in front of the thing. If they had to find me again in Berlin, good luck, because they don't know what I look like.
AUDIO: Curious music
Annanda: Something appeals to me about being Black and being able to be hidden in plain sight. And there's some relief: there's a way in which she cannot be surveilled. I cannot be surveilled to that degree. Possibly when we get on the African continent, that mess is going to change, right? But at least in European contexts where there’s so much anti-blackness, I find good hope in that. And it tickles me.
Keisha: Imagine, of all the things, airport security being the one space where algorithmic bias works in Black people’s favor! Every time I fly these days, if they’re not targeting you because you’re Black or from the Middle East or whatever, then they’re targeting you around gender. Those machines, they’re matching you to a binary version of gender. So if you’re wearing a different kind of pants, or you have a different undergarment on than they expect you to have, then you get, you know, pulled for that extra patdown and the unnecessary human contact.
But seriously, I come back to the ethic that just because wine country in Washington or California dreams up something doesn’t mean it’s automatically going to work for every other community.
I think it’s really important to think about how our values can be different across cultures and often are, and that difference should be reflected not just in the design room, but also in how, or whether, these technologies get rolled out… Right now we have a situation where one culture’s values get imposed on everybody else without consideration.
Annanda: As you’re talking, it makes me think of Safiya Umoja Noble. She has this book called Algorithms of Oppression. And she talks about Yelp, and how the way you would rate a business was antithetical to the way Black culture does business. Yelp was designed to help businesses grow based off of feedback. You’re not gonna really be able to rate a Black barbershop or beauty supply by some Yelp reviews!
Like, we need to see the hair. We need to know if this is going to be the loctician in our case, because we have locs, you and I. Is this going to be the loctician that’s going to be in and out on time? Or is this going to be the loctician that’s going to keep you in there with their 50 million stories?
Keisha: And the overbooked client list.
Annanda: Bless their hearts.
Keisha: Is the vibe right?
Annanda: Yelp cannot give you that. Unless you're Black, you're not gonna know this culture. Like, you're just not gonna know it, and that's okay. You don't have to know it, because it's not for you.
Keisha: No, it's not for you, but everybody says they have the app from heaven that's going to help us all. You want somebody who knows something about the context of use. So in technical communication, the conditions under which a technology is used should be shaping the design of the technology.
So it comes down to relationship. Is it just me in my lab, cooking up something without knowledge of where it's gonna go? Or maybe it's me with relationships, willing to listen to the outside world and bring that outside world in.
Annanda: I'm gonna take it a step further, Keisha. It's you and your lab looking at the outside world saying, I don't have it.
Keisha: Somebody else should, Yes.
Annanda: Exactly. Somebody else should.
There is somebody from the other culture that has the gifts and skills that you have. And those gifts and skills can be taught. Period. So have them do it.
Keisha: Yes.
Annanda: And then be in relationships still with them.
Keisha: Yes, because we're better together.
Back to our conversation with Mutale.
Annanda: You have advocated for having more Black people and people of color in the AI development space. How do we create inclusive datasets, and how have the stakes changed since you started AI for the People?
Mutale: This notion that we should move away from diversity and towards merit really ignores the steps that it takes for people to get into the tech industry. I'm the mother of future knowledge workers, and my parents were both children of knowledge workers; we were able to leverage not just education but an understanding of how particular industries work to create economic viability.
When you take away diversity and replace it with merit, you're still gonna get diverse people. You're going to get Indian people, African people, Jamaican people, but you're going to get a particular class of person who is potentially insulated from the issues.
The reason I call for Black people to be in the design room isn't just that I want Black bodies around the table. I am speaking about Black people with a liberatory politics. If we are weighting a healthcare application, we shouldn't look at people who pay premiums and people who don't, because that is a racialized and class inquiry. Let's weight who isn't in the data set, because surely they need health too. As a sociologist, I'm always going to go to the sociological reasons they're not there. Let's create towards that. Let's center meeting the need rather than driving the profit.
Now, I say that as somebody who holds tech stock, as somebody who is building equity in a rotten system. We hold this asset, but then what do we do once we cash out? How are we using those gains to make sure that people who don't have the background and the access that we have can get them?
Because it's when you get those people from the margins of the margins into the design room that you really, truly understand not just the problem that you're solving, but the real problems. People don't necessarily want to hear about Audre Lorde. People don't necessarily want to hear these theorists. But I am going to bring them in, because they've pointed to the problem. And our inability to solve for those problems is why tech is so crazy and whack and doesn't really work, not just for this market, but for our future markets, as the consumer becomes increasingly Black, brown, queer, non-binary, and living on all the other colorful spectrums of life that there are.
Keisha: I love what you said about bringing Audre Lorde into the room with you, and I think it was Maya Angelou who said something like, you come in as one and you also stand as 10,000. There's a principle there of always bringing your people with you no matter where you are.
And on that point of bringing your people with you, one of my friends often says that geography is biography. Our places shape our stories just as much as our people do. My places and people are all over so I could tell my geo-biography as London-Jamaica-West Texas-Maryland-New York-Maryland again.
What is your geo-biography, Mutale, and how has it affected your approach to tech?
Mutale: I love that question because I've never been asked it. My geo-biography is Zambia, London, Edinburgh, Leeds, Newcastle, Japan, Abu Dhabi, Moscow, and then New York for most of my adult life.
Keisha: Mm-Hmm
Mutale: Particularly holding a British citizenship and an American citizenship, when I hear about migration,
I think of myself as a colonial subject, as somebody who comes from an African colonial country, as well as a Black person in America and in deep community with people who were actually captured because of their intellect, because of their knowledge of how to manipulate land and to create value.
When I became a tech worker, it knitted in these big, deep legacies, which are really about pain. And having worked within industrial settings and governmental settings and being given the opportunity to create something of my own within tech that was based around reparative action and systemic change, I think if you ever meet a colonial subject, you are meeting somebody who is in search and in need of reparations.
Annanda: Mutale, what values does AI for the People use as you advise governments on national AI strategies?
Mutale: Yeah, our primary governmental agitator is the United States. That's where I have chosen to call my home, and I've chosen the values of slave abolitionists.
Keisha: The network of clergy, teachers, writers, and researchers who organized for abolition of transatlantic slavery in the 18th and 19th Centuries applied all of their values and skills to pushing the US government toward fully expressing its own values—liberty, justice, equality under the law.
At the time, of course, women were rarely accounted for in public space—Black women least of all. But scholars like the sociologist Anna Julia Cooper, who inspired W.E.B. Du Bois, used a different lens.
Mutale: If you ever read her speeches, she was saying that there can be no emancipation of Black people from slavery unless there is emancipation of Black female people from patriarchy.
Annanda: Mutale has other influences too.
Mutale: I am really inspired by the people that authored the Combahee River Collective Statement, Black queer feminized people, because they said that we have to be socialists. And I am in the capital gains market. So part of me is like, ooh, I just need a few more years and then we can get there. But it's a challenge, even to me and my politics. I don't see a world where very few of us succeed and everybody else has no clean water, or what's happening in AI development, where we're going back to coal to cool down processing centers. I would rather not make capital gains than exacerbate environmental peril, which has a real impact.
I was supposed to be in Jamaica. I couldn't go because of Hurricane Beryl. Those islands are not going to be there if we keep behaving in the way that we do.
Keisha: The Atlantic hurricane season hit earlier and stronger than ever before this year. My family in Jamaica didn’t get power restored for weeks, and comms are still down. It really brings home the short-term profit-seeking Western countries have done over the last century and a half of tech development, at the risk of climate catastrophe. It’s not that we’re all reaping the consequences in the same way, because we’re not…
But I resonate with the value systems that Mutale’s talking about that recognize we all deserve dignity, care, and safety, and we’re better off individually when we take care of each other collectively.
Annanda: Yeah. I totally agree, Keisha. I think collective care really is the only way. And I think those values applied to AI can actually be really powerful. AI working for everyday people, not to mimic systems and patterns of oppression but rather what does AI do to enhance collective ways of being? How do we make it a part of the village?
Mutale: It's really interesting thinking about the way large language models are being used in the Middle East conflict right now, and how the accuracy of some of the Israeli weapons is coming from the fact that they hold the Palestinian data set, so they can actually profile Palestinian people and habits and use them in the theater of war. That could be any war, but we're really seeing it because Tel Aviv has such a huge startup economy.
I'm actually just about to start a PhD in the UK. So I will be at Cambridge, and I'm going to be doing things with the UK government, because in the UK, Black female politicians in particular are being targeted with online speech and abuse.
And I'm really looking to go into that work with partners who are doing incredible work, just adding whatever talents I have to try and solve that problem.
Mutale: One of the things I think the UK has done really well, and it was Black female advocates in the UK that did this, is conceiving of online safety as not just a gendered issue, but a national security issue.
Keisha: Mm hmm.
Mutale: They were able to use that argument to really make sure that women and girls were protected in the Online Safety Act.
The UK has a dwindling workforce. There are more older people in the UK than younger people. So if you're not thinking about gendered violence as a threat to your workforce, then you're really not understanding that you're taking out people within your workforce who are more likely to go to university, who are more likely to work and have a family, who have really interesting skills, and who outlive men. And so if we have a bunch of online violence that leads to these people not living out their natural lifespan, then you have a real problem for your economy. And I just love that the UK were able to talk their way through that.
Keisha: This Online Safety Act is designed to make sure that children and women, and really anybody who uses the internet, aren't subject to violent behavior or abusive harassment, and that people under a certain age have the protection they deserve. We're going to reframe online safety not just around the historical archetypical user of the internet, which was a young guy in his basement and/or office, but we're going to think about women.
All of those things come because advisors like Mutale are saying, this is too big for the platforms to manage by themselves and they've demonstrated that they cannot handle it on their own. It needs to be a social concern driven by guardrails… guidelines that come from what is just a collection of people at scale: the state.
Can you imagine that sort of reframing happening here in the US about online safety or public safety?
AUDIO: Reflective music
Annanda: The truth is, I imagine it all the time: what online safety looks like for vulnerable populations and technology. The question is, how do we build relationships to actually make the imagination manifest into reality?
Keisha: People are telling their stories about what they want their communities to look like, about how to use tech, and every time somebody does that, I think it does open up the imagination box, which is what you need to think about new policy approaches and new ways to define safety or belonging or welcome or security.
Annanda: So how are these questions playing out in the African diaspora right now?
Keisha: Let’s find out after this break.
Break
SEGMENT C: Repairing Colonialism: What can that look like in tech?
Annanda: Welcome back to Moral Repair. We’ve been talking to Mutale Nkonde, a sociologist and researcher who has advised the US government on AI strategy.
Keisha: The African Union just adopted an AI strategy a few weeks ago. Two years ago it was just nine countries on the continent that were working with AI. Now there are fifty-five. I think it’s an amazing development, and I’m curious what it’s going to mean for them.
Annanda: Me too. That’s awesome.
Keisha: Yeah. The African strategy is called the Continental Artificial Intelligence Strategy and African Digital Compact and it’s got similar themes as the AI frameworks that the Biden White House promoted recently: innovation, economic potential, responsible development, risks, and ethics.
With regional strategies like this, you’re going to hear about development goals, jobs, education, health, infrastructure, agriculture, and security.
Annanda: This is piquing my interest. This is exciting.
Keisha: But… I noticed something. When Western organizations talk about AI in Africa, they often talk about it using business and manufacturing language, legal regulations, what they call “preparing the workforce for AI”, which is kind of like jobs and skills.
But they don’t really talk about ownership. They don’t talk about who owns the data or the systems. And they don’t really talk about bias in how AI is trained and used with everyday people.
It feels kind of colonial to me. Like the power patterns haven’t changed at all.
Annanda: Oh yeah, walks like a colonial duck, talks like a colonial duck.
Keisha: Quack, quack.
Annanda: Go somewhere! No!
That’s why we’re talking to Mutale Nkonde of AI for the People—what other models are there that actually do shift power and support repair?
Keisha: So we've been hearing about researchers from the Global South who are using AI to preserve languages that might otherwise die out. How are you seeing or hearing of Black communities or organizations using AI in new and fruitful ways, not just as consumers, but leading development?
Mutale: The Nigerian startup economy very much dominates. And that just honestly looks like Silicon Valley. It's a bunch of guys. They're all from the same tribes. They all went to the same schools.
Where I have seen action is in Maori communities and other Indigenous communities, where they're also using AI to preserve language. But on top of that, they have all of these data sovereignty ideals.
That means that who holds that data, how it's used is not released to the wider world. So they have the best privacy protections in the world. While I am excited that African languages are being preserved, I'm worried that they're being preserved on maybe Chinese servers, maybe using Chinese tools. So how safe are they really? What are they being used for?
The Maori have their own servers, and in the tech industry, your server is your life. So if you can hold and store your own data, you have a level of freedom that I don't think we've ever known in human history.
Annanda: How are you seeing China interact with the African continent in ways that are alarming for folks who might not know?
Mutale: What the Chinese will do is they will come into an African country. They will build bridges. They will build roads. They will then give that government a loan and say to that government, if you miss even one payment, this road becomes part of the Chinese Republic.
And so what ends up happening is the colonization isn't just of the people, it becomes of the land because what they're interested in is what's beneath the road. And they're only building in mineral belts. So all you have to do is to develop a town using Chinese money on top of a diamond mine and suddenly China has diamonds.
Keisha: I wanted to look into this. In 2022, CNN reported on China’s infrastructure deals in Lesotho… The reporter talked to Prime Minister Sam Matekane about it. This clip gives you a little taste:
CNN [00:58]: Lesotho’s parliament… the state library… the convention center… are all built by the Chinese. Even their state house: a gift from the People’s Republic.
Reporter: Are these smart deals to make?
PM: Well, uhhhh… What happened in the past is in the past. I’m focusing on the future now. Because the debt is there.
US ambassador to Lesotho: I will often caution my African partners that if something sounds too good to be true, it probably is.
Keisha: I was in Jamaica earlier this year and learned China is building up Jamaica's highways left and right. And last year they invested more than US$1 billion in Jamaica.
Annanda: I am shook, struck, and frankly angry. When we talk about Jamaica and China's colonialism, to me, China is no better than Great Britain. The old fox led in a new fox to the same hen house.
Keisha: It's unsettling that these are not peer trades, because the terms of the agreements do not equally benefit both parties. And we get into the question of what it means for new colonial powers to be using infrastructure deals, investments, and technological resources, not necessarily to benefit the communities that they say they're partnering with, but to secure their own power.
Annanda: We just saw the African Union approve a regional AI policy. Why is even that approval important?
Mutale: Africa's voice and Black voices need to be part of the global AI conversation. It was Dr. Alondra Nelson, while she was at the Office of Science and Technology Policy, that really got us to an AI Bill of Rights and to an executive order that spoke about non-discrimination being an American value within AI.
That was done because Dr. Nelson's perspective was here. Dr. Nelson is a Black sociologist who has looked at science throughout her career. I think the reason she felt that this was important was a combination of her lived experience, her incredible scholarship and her proximity to power.
But we cannot have heroes who are going to get us to the promised land. We need to have movements who are going to keep up the fight. The African Union represents a movement, a bloc of sovereign nations who can say, as long as you come here, these are the rules of engagement, and it gives us in the diaspora something to look back to.
The other bloc of countries really doing great work, even though they haven't released their AI strategy is CARICOM.
Keisha: CARICOM is the Caribbean Community, a collection of 15 member states across the Caribbean and Atlantic Ocean.
Mutale: But what's so interesting about the Caribbean is that they also hold slave records.
Annanda: Yes.
Mutale: So they can actually quantify how much of the global economy even helps seed AI, and they are coming at AI through that. I've heard it described when I've been in UN meetings by particularly Bajan folks as the new sugar plantation.
Keisha: Mm.
Annanda: Yeah, I'm a member of the Caribbean Philosophical Association, and we had our annual conference in Cancun right before the hurricane hit. Compared to years prior, there were several blocks dedicated to AI.
China's current economic philosophy, the one that has gotten it out of its hole, comes from a Caribbean economist whom the majority of the Global North did not take seriously, but China did, and implemented it.
Mutale: And just on that point, we're not reading journals from Africa, from the Caribbean. So we're not even improving our own scholarship, and our own thoughts.
Keisha: You said earlier, we don't need heroes. We need movements.
And you're inspired by the African Union being 55 countries signing onto this strategy, thinking about their collective benefit and not just national interest. I think something similar at the interpersonal scale has to happen too: how are we seeing each other, and how can we help each other and not get sucked in by the power that the systems offer us?
Mutale: I'm speaking at a social impact investors conference. And one of the things I'm planning to speak about is, I love that you fund me, but just funding me is a waste of money.
How do you fund all of the talent from across the spectrum so that we can go from we funded a Black woman to we're building an ecosystem? Because I have always wanted to be a star in a constellation rather than a star in a dark sky.
I'm really actually inspired in this call by Annanda's role within the ecosystem, particularly as a chaplain.
Technology isn't going to heal the body, but are there ways in which, within our collaborations and within funding systems, we make one of the metrics: what is going to be the net benefit to society? Like, how is this going to heal the problem, as opposed to just solve the problem? And I think that diasporic communities, that Black communities, have a real competitive edge, because in order for us to do anything, we have to heal.
I would love to scale that healing through speaking to investors and family offices and VCs and saying, if we do not invest in healing technologies, then we're not investing in sustainable technologies. I would love, love, love to see it, and I think we can see it.
I tell almost every audience that I speak to, when they ask me this question of hope, “Well, do you think it can happen?” And I say to them, am I a slave? Because there were generations of people that lived and died in slavery who kept the hope that one day it would be no more.
Anything can happen, we just need the will and the movements to get us there.
Annanda: Oh, Mutale, I'm so appreciative of that. There are Black and Indigenous ways of being, knowledge, and context that understand the particularity of Western colonial harm as it relates to technology.
Because we forget, you know, like slave ships were high tech back in the day. I mean, that was wind power used to the max, right?
Mutale: Yes!
But also, you were able to transport human beings without temperature control, you did it in the most inhumane way, but when we look at those pictures of the way those people— somebody mapped that, somebody modeled that, somebody tested that, somebody engineered that.
Annanda: What moral repair can look like within the context of harm… I 100% agree with you that Africana people do have a leg up on that because we've had to contend with it.
Keisha: That's right.
Annanda: not just to survive, but also to thrive.
Mutale: Yes.
Can I just say, this is the most beautiful podcast. To think about technology and repair is so beautiful, because those of us who work in technology, who are committed to technology…
AUDIO: Closing / reflection music fades in.
We have wounds that we are healing. And I am definitely someone who is determined not to pass on my trauma, but I'm also not a martyr. I think a lot of the damage is when people feel scared. “It's her or me.”
I've built a career on doing what's right because it's right. And starting conversations that might end when I die. And I am good at that. I am good for that. I am good because I know somebody else is going to have that conversation.
Keisha: Thank you for spending this time with us.
Mutale: I'm so thankful.
Closing
Annanda: The sociologists have really come through for this podcast.
Keisha: They really have! I love it.
Annanda: I’m so mad at China. I’m like, can we not? Can we stop—
Keisha: The thing that doesn’t work?
Annanda: It makes me think of Caribbean philosopher Sylvia Wynter talking about the colonial legacy not just in the past but in the present. She talks about how these colonial practices actually constrain “the possibility of humanity.”
Why are we constraining the possibility of our humanity? Are we so uncreative?
Keisha: I don’t feel called to like a nationalist stance with technology and I have a problem with the national strategies that only frame the good of tech in terms of what it can do for that nation state. I think that’s just too small. We were talking about space recently. We’re part of a much larger neighborhood than the nation state. At the same time the nation state is the language of politics and culture and economics right now. So you can’t escape it exactly. How can it work for us?
AUDIO: Credits music fades in
Where I have hope, as I think about the diaspora’s perspective, is in the people who, like Mutale, like you, like me, are able to hop borders a little bit and understand how these patterns play out differently on different sides of those lines, and then are willing to take that perspective to shift what happens, how it’s designed, how it plays out for people beyond our own kind.
Creativity is about the surprise, and the potential of putting this by this and seeing what new thing emerges. You may not be able to control that third thing.
Annanda: Come on now.
Keisha: But we all grow because of that third thing.
AUDIO: Credits music
CREDITS
Annanda: We’re building community with listeners this season. So reach out and talk to us on Instagram—our handle is @moralrepairpodcast. Also catch us on X… formerly Twitter… And LinkedIn. We’d love to hear from you!
Keisha: Follow the show on all major audio platforms—Apple, Spotify, Audible, RSS—wherever you like to listen to podcasts. And please help others find Moral Repair by sharing new episodes and leaving us a review.
Annanda: I’m Annanda Barclay.
Keisha: And I’m Keisha McKenzie.
Annanda: The Moral Repair: A Black Exploration of Tech team is us, Ari Daniel, Emmanuel Desarme, Rye Dorsey, Courtney Fleurantin, and Genevieve Sponsler. The executive producer of PRX Productions is Jocelyn Gonzales.
Keisha: Original music is by Jim Cooper and Infomercial USA. And original cover art is by Sam Martin and Randy Price. Our two-time Ambie-nominated podcast is part of the Big Questions Project at PRX Productions, which is supported by the John Templeton Foundation.
[PRX SIGNATURE]
SHOW NOTES
Talk to us at Instagram (@moralrepairpodcast), on X (@moralrepair), and on LinkedIn: https://www.linkedin.com/company/moral-repair-podcast/
Follow Mutale Nkonde at AI for the People.
MIT Technology Review: “Africa’s push to regulate AI starts now.” (March 2024)
African Union: “African Ministers Adopt Landmark Continental Artificial Intelligence Strategy, African Digital Compact to drive Africa’s Development and Inclusive Growth” (June 2024)
Stanford Encyclopedia of Philosophy on Anna Julia Cooper
Combahee River Collective statement (1977) https://www.blackpast.org/african-american-history/combahee-river-collective-statement-1977/
China and African infrastructure projects: https://www.cnn.com/2023/09/26/china/china-african-loans-development-belt-and-road-intl-hnk/index.html
Tech Boom Or Bust? A Syracuse Story
Season 2 | Episode 8
Tech companies and American manufacturing have a history of booming and busting towns. The Big Question for this last episode of the season is: will this AI chip factory, scheduled to open in a suburb just north of Syracuse, NY, actually provide a reasonable, sustainable increase in quality of life, or will it be the latest iteration of the boom and bust cycle? Annanda and Keisha ask key Syracusans to find out.
-
Syracuse S2 E8
Intro
Annanda: Hi Keisha,
Keisha: Hi Annanda,
Annanda Narration: Today is not a day to sleep on upstate New York! Micron is the 4th largest maker of semiconductor chips in the world.
Keisha Narration: Semiconductors are the basic materials needed to make chips and integrated circuits.
Annanda Narration: These chips are a cornerstone of AI technology. They are behind everything from AI-powered video games to ChatGPT.
NVIDIA — the Silicon Valley company that continues to revolutionize the semiconductor computing power of AI — has chosen Micron to produce a good number of their chips.
And Micron is expanding…
(Insert Montage) [ 0:00-0:17 Micron to spend up to $100 billion for New York semiconductor factory] [0:40-01:07 President Biden: Syracuse area poised to ‘lead the world’ in advanced manufacturing]
Keisha Narration: Micron is building a chip manufacturing factory in Clay, NY, a suburb just north of Syracuse. It’s going to be a mega-factory: 1,300 acres when all is said and done. It’s expected to employ thousands of people and generate a boom cycle in Syracuse.
Annanda Narration: But is this too good to be true for locals who’ve lived through one technology manufacturing boom and bust cycle after another? This episode, we’re taking a look at whether Micron’s new plant could positively impact a city that lost its last manufacturing base decades ago.
SOUND: title music starts
Annanda Narration: I’m Annanda Barclay…
Keisha Narration: I’m Keisha McKenzie…
Annanda Narration: And this is Moral Repair: A Black Exploration of Tech. A show where we explore the social and moral impacts of tech… and share wisdom from Africana culture for caring for and repairing what tech has broken.
SOUND: End title music
Silence
Segment A
Annanda Narration: Tech companies and American manufacturing have a history of boom and bust in towns and cities across the Midwest and Northeast. These regions make up what’s known as the Rust Belt.
Keisha Narration: Micron is opening its biggest factory just north of Syracuse, NY. The company has pledged $100 billion to build a technology complex there.
Annanda Narration: So the question becomes: is all this hype too good to be true? Will Micron actually provide a reasonable, sustainable increase in quality of life while producing the chips needed for the AI revolution, or will it continue the history of moral injury of boom and bust?
We’re going to start with Bob Searing. He’s the curator of history at the Onondaga Historical Association in Syracuse.
Bob: We are sort of the de facto county historical agency of record.
Annanda Narration: We wanted to talk to him about Syracuse’s industrial past. To see where the city’s been as it confronts this new chapter.
Annanda: Thanks for being here Bob!
Keisha: What is the history of manufacturing in Syracuse?
Bob: If you could have dropped into Syracuse, New York in 1960, or even through the 1970s and into the ’80s, you would have seen people making very good money.
Working in very stable jobs, with a really nice housing stock with a lot of great benefits of living in a city. Great roads, great schools, great parks, great libraries, great opportunities, it was a wonderful place to live.
I think that manufacturing allowed that to happen.
Annanda: How so?
Bob: Manufacturing jobs allowed for a class of people to get employment. They didn't require a higher education. My own father didn't grow up in Syracuse, but graduated from high school, served in the Coast Guard, got a job at IBM and worked there until they sold the place.
My father's story is the story of millions of Americans, right? Who had an opportunity to earn a very good living and support a family and buy a house and have children, and go on vacation. Do the things that most folks want to do and provide for their families.
You can have a life. And I think that the pull of that, what used to be naively called the American dream or whatever the branding of that was, was a reality. And manufacturing was the way that most folks of a certain class, and background were able to crack into the ranks of a middle class that might have otherwise only been accessible through professionalization.
And so that pride, right, of work and pride of doing something, maybe with your hands, is something that is important to a lot of people. And when you have a community like Syracuse where at its peak, maybe 40 to 45 percent of the jobs in that city were manufacturing, that means that a lot of people in your community and your neighborhood are living the same life. Right?
Keisha Narration: This story reminds me of my grandfather’s story. He worked in a bauxite plant. He worked in sugar mills. He even worked on a US air force base in Jamaica. So many of these industries made it possible for these families to change their economic class, to get access to education they would never have had.
But it didn’t always last.
Bob: When you talk about maybe the bitterness of politics, we're living through a scary era in a lot of ways. When you drive through cities, whether it's here, or Camden, New Jersey, or Rochester, or Binghamton, New York, where I'm from, they're about to knock down the old IBM buildings.
You just see this vast wasteland that was, in large part, made to exist not through any fault of the workers, right? If anything, they're victims of their own productivity, and all kinds of changes in policy. So I think that people that are left behind, whether it was when Allied Chemical closed in ‘86 or when Chrysler and GM closed their gear factories, you had generations of workers there.
Grandfathers had worked in these businesses. That hurts a lot. And that leaves people feeling bitter and out of touch, feeling that the politicians and the corporate leaders don't care about them. And so you can understand why there becomes a real cloud about an area, a real sort of sadness or a bitterness.
And there's a lot of pushback that we have to do in communities where we're trying to be positive and say, “Look, I know that this has happened for 50 years, but we're trying to turn the corner.” Are things going to be the way they were in the “good old days”? And I'm of course making air quotation marks here. Probably not, because those days are gone. Right?
For a host of reasons, but understanding what happened and why it happened, helps me at least, but I'm a historian. Most people don't have the luxury of being able to take the time to understand this stuff. They're just living their life every day, and they lost their job and they didn't get another one. This is the story of the Rust Belt.
Annanda: What in particular, led to the last tech manufacturing bust cycle in Syracuse?
Bob: When you talk about manufacturing in Syracuse, there were a few things that led to the decline, right? Unionization was probably the most prominent one. But Syracuse is in many respects synonymous with Carrier, the air conditioner manufacturer.
Carrier was one of the primary producers in the community, and one of the major employers as well. At its peak, you've got maybe 11,000 people working in manufacturing and R&D here in the city. So the unintended consequence of success is that Carrier's machine, the air conditioner, allows people to move to the Sunbelt and the Southwest and the American South, which are historically open shop, anti-union states.
It also makes the business climate there hospitable. It's also hospitable to human beings living there, thanks to air conditioning. So you have this sort of weird sowing the seeds of your own destruction in a community like Syracuse. And now, in 2024, the dome is no longer the Carrier Dome; it's the JMA Wireless Dome, signaling a shift in this community's manufacturing base.
One of the things that Syracuse has always had going for it is that we are almost geographically directly in the middle of New York State. We are at the middle of vast international transportation networks.
When Micron was making the decision to build here, all that we hear about it is that this was a unicorn site: in terms of its water usage, in terms of the proximity to major university centers like Syracuse and Cornell and RIT.
We've got a base of manufacturing institutional know-how that's somehow left over from the heyday of the community. You can see why Micron would choose to come to this particular place, and in many respects, it represents a back-to-the-future moment for Syracuse and the community here in upstate New York in general.
That's what I think is amazing about being in Syracuse now: a lot of these cities aren't getting this chance at another go at it. And that's a testament to a lot of people in this community that have played a long game, and a great partnership between government, manufacturing interests, commerce groups, and things like that.
It's somewhat appropriate, and really rather poetic that a community like Syracuse that has been down for so long, has been fighting for its survival, is about to find itself at the center of this revolution in semiconductor packaging.
Annanda Narration: That revolution begins with what’s left…a series of driveways. Micron bought property and removed the houses left behind… to prevent squatters on the land.
Keisha Narration: Our producer Genevieve Sponsler met up with a friend of hers who’s a lifelong resident of the Syracuse area — Michelle Kivisto. And they brought their kids along as they drove through the site. Michelle used to work with her dad at one of the GM factories Bob mentioned, before it shut down. And she now lives across the road from the future Micron site in Clay. Nothing has been built yet, and you can drive through the huge parcel of land that will someday be Micron’s home.
Michelle: What's this over here on your side, Brody?
Brody: It's a driveway
Michelle: One driveway.
Brody: Two.
Michelle: Two.
Genevieve: Wetlands.
Michelle: There's a nice pond. So some of these houses look like they were pretty close together.
Genevieve: some of these trees are really beautiful.
Michelle: Yeah, it reminds me of The Notebook, where they go and just sit under trees and things like that. It's trees and meadows as far as the eye can see, and these are really mature trees. It's interesting to think about how the factories that will be here will be at least that tall. They'll kind of look like big, white monoliths, not too different from the Amazon factories that are already in town.
<<music cue>>
Annanda Narration: We sat down with Michelle before their drive through the site and asked her about her time working at New Venture Gear from 1997 to 2000, before it and the last major factories left town.
Annanda: What is a distinct and meaningful memory you have working at the factory with your dad?
Michelle: My first day at the factory, my dad and I actually drove in together. And I found out that on his way into work (it would be like 9:30, 10 o'clock at night), he would call his mom. I had never known that about him. Getting to hear him talk to my grandma, just ask her about her day, then bring me to the factory and show me around.
And it was, I think, half a mile to get from where we were in the parking lot to my actual department. So, we walked around a lot. He showed me his footlocker where I could put my things. He worked in a separate department than I did, but he introduced me to the foreman and made sure I knew my way before he left and then came to pick me up again at the end of the shift.
When I got laid off I was just like, Oh, I really miss that about my dad.
Annanda: Sounds like a family man.
Michelle: He's very much a family man. Anything he can do to take care of his family, he'll do. And he'll do it with a smile on his face and not let us see what's really going on. But having worked in the factory with him, I could tell the toll that working a really, really hard physical job would take.
He did it without any complaint and I have so much respect for him for doing that. But I also have respect for people who say, no, this is hard and this is really difficult and I can't do this or I don't want to do this or we deserve better working conditions because of it because humans are humans.
Annanda Narration: When New Venture Gear closed, it affected Michelle’s family a great deal. Her dad started working for a factory in Indiana, living there and driving home to Syracuse on the weekends with co-workers.
Keisha: Having lived in Syracuse for most of your life, how would you describe the town to people who don't live there?
Michelle: It's a place that has a lot of history but a place that's also been resilient to change. Our downtown area was really vibrant with retail stores in the mid ‘80s. It was a place that had really big department stores. My mom was actually part of a guild fashionada. She would go and be a living store mannequin and we would go and watch her. During the ‘90s those storefronts fell into disrepair.
When Carrier left, there was a lot of loss of business to the area. So places where we used to go that would be really vibrant, like Shopping Town Mall, which was the local mall closest to the Carrier Corporation, lost a lot of retail stores. It was a loss of choices: of places to shop, places to buy things that were different from other local area stores.
It was also a loss of people, because people that work there no longer had jobs there, so if they didn't find another job, then they had to move away. Some people also couldn't keep their homes once Carrier was gone. Neighborhood shifts and things like that would happen.
Keisha Narration: Neighborhood shifts are happening now, too. Michelle is thinking about selling her home near the Micron facility, to get farther away from potential traffic, even though she is excited about the company coming to town.
Annanda Narration: And we are thinking about shifts in how factory work itself is done. During Syracuse’s last manufacturing boom, there wasn’t talk about robots or AI replacing people. And one of my biggest concerns with Micron is that they may just up and replace the Syracusans they employ with robots — eliminating the people who are excited to finally have good, solid jobs. And that would be a modern repeat of the devastating history of boom and bust in this community.
Keisha: AI is giving us robots and other tools that don't need to take time off. But people do need to take time off.
Do you think that the Micron factory is a practical long term solution for the economic needs that so many people in Syracuse have?
Michelle: I saw the physical toll that working on a factory assembly line had on my dad. I was [00:12:00] like, what if robots took his job? Well, would that be such a bad thing if he literally didn't have to give his body for this job if robots did that instead and you could just have people fix the robots instead of trying to fix human bodies?
Micron is investing in local schools to build spaces to train employees coming up. So, long term, if, you know, 15 or 20 years from now, that ends up being a robot's job, those are still skills that people will have that can transfer in other ways, I believe.
It's not like all of those skills are completely gone. I think that if AI eventually takes on certain roles, maybe it could be a good thing, in a way, if people were able to find other ways to supplement their income. If Syracuse is built up enough by then to have many other industries attracted by [00:13:00] the Micron snowball effect, then there'll be other opportunities for people if that were to happen.
Keisha: What hope does the Micron Factory represent and offer for Syracuse's people?
Michelle: Oh my gosh, I'm so glad you used that word, hope, because again, this is a topic that people can be very divided on, where they're like, “I won't believe it until I see it.” For me, I've already seen what they're investing into the community. Rather than just coming in and putting down a factory, or putting up some fancy poster board and being like, look at the really nice things we're going to do, and we're going to build this many factories, and this many people are coming.
What Micron is doing is actually investing in the community before any concrete has been poured, any foundations have been dug, before anything else has been created. They're actually creating, I believe, very real goodwill [00:14:00] in the community.
Annanda: From what you’re saying, Michelle, it seems that Micron’s promises are golden so far, and I wonder what impact that goodwill and community investment has had for you and your family?
Michelle: My 13-year-old daughter just went to a chip camp that was sponsored by Micron, in which 110 seventh-grade students from her district were bused to a nanotechnology lab at Cornell University, and then they were brought to the Elite Gaming Academy to learn about e-gaming and all of the really wonderful things, like jobs and positions, that can come out of that. Not just being a gamer, but you can be a sideline commentator or coordinator for the games.
The last day they got to do some STEM activities as well. All of the busing was covered, and they were given t-shirts for each day to keep track of them at each place they visited. Plus breakfast and lunch and snacks were all provided by Micron. For me, that was just a really great moment to have that happen for her.
They've invested in our local science museum. They've invested in our local YMCAs to help build the infrastructure for childcare, knowing that families that come, or even people that come and wanna stay here and build a family here, are going to need that.
Keisha: Are there any other hopes you have with Micron coming to town?
Michelle: Another hope that I have is that they'll continue getting the feedback of the community. They've already asked several leaders of many different backgrounds to speak with them and to get together to give them ideas about what to do. They've established a fund to give back to the community, and asked a committee to get together to figure out how to disburse those funds throughout the community so that everyone can benefit. Not just people at the top, or people in construction, or people in certain other industries, but something that everyone can feel the love from.
Annanda Narration: Not going to lie… Michelle’s report sounds almost too good. But I don’t want to deny her experience, or a sincere possible reality I honestly hope to see in the world. The reality that a company, one that, let’s be real, is ultimately concerned with its bottom line, doesn’t see that bottom line as antithetical to being in good relationship and stewardship with the local community. That means something.
Keisha Narration: Yeah, I was waiting for the other shoe to drop. But when we get back, we’ll interview the person who makes sure Micron keeps its promises to the people and environment of Syracuse.
SOUND: break cue
BREAK
Segment B
Keisha Narration: So whose job is it to make sure Micron keeps their promises? Who’s minding the store? The one name we kept hearing from community members was “Melanie Littlejohn.”
Annanda Narration: Aside from a fabulous name. Melanie Littlejohn is also the President and CEO of the Central New York Community Foundation.
Keisha Narration: Ms. Littlejohn also serves as the co-chair of the Central New York Community Engagement Committee. The committee will identify and recommend where to spend the $500 million Community Investment Fund created by Micron and New York State. Some priorities will include education, housing, and workforce development with an emphasis on underrepresented and disregarded communities.
Annanda Narration: And before these latest roles, Ms. Littlejohn was a leader at National Grid in Syracuse for 19 years.
SOUND: interview cue
Melanie: Over the last three years, we've had a process to attract Micron here to locate its largest fab in the world. [00:17:00] There was a community benefits plan associated with their move here.
A part of my role as the co-chair of the central New York community engagement committee was to get the voice of the community in the plan. Right? Why have a plan and not ask community what's important? It's almost so simple, but it's not. And I do applaud, not only Micron, but the state of New York, which said, look, we're jumping in this with you.
Because we want to ensure that we make central New York the best place to live for all. All people. Central New York, unfortunately, has some really tough statistics around poverty, and around all of the isms associated with poverty.
Unfortunately, the city of [00:18:00] Syracuse is on the top-10 list in the country of the most segregated cities for Black and brown poverty. Micron and the State of New York understood we needed to lean in, more now than ever before, to ensure that everyone had an opportunity in this moment.
Part of our role was talking to community, synthesizing that data and what was said, then funding those organizations that will help begin to change the trajectory of the lives of people here, keeping an equity and equality lens throughout the entire process.
I'm a mom, a wife, a daughter, a sister, and my coolest and best new title is a Mimi, right? So when I think about this moment, creating that best place for my grandchildren to live in a world where they're going to be embraced. All of them will be embraced.
Annanda: In Boise, Idaho, Micron has a similar building facility project, which you might be aware of. Tax breaks and incentives on local, state, and federal levels will be given based on Micron's ability to keep its promises for [00:19:00] jobs, sustainable energy practices, and positive community impact.
Is there something similar in Syracuse?
Melanie: Absolutely. That's what that community benefits plan is. And it's the Green CHIPS Act in New York State. And then there's the Federal CHIPS Act, which Micron received, that has the same tenets associated with it. What you will see over the years to come is a focus on key areas: workforce development, childcare, housing, infrastructure, transportation, small business, supplier diversity, and K-12 education with an emphasis on STEM and STEAM. That's all part of our community priorities document. For corporations who receive these incentives, I love the fact that both state and federal government [00:20:00] are requiring this to be a fundamental part of receiving any of these benefits.
Now, I got to tell you, I've been in corporate America a long time and been around lots of corporations, but Micron has been pretty easy and forthcoming in working with us in this space. They come to all of the meetings. They show up for all the public events. When we're in the background with our sleeves rolled up, putting together documents, their people are there, and they're not there as a watch over your shoulder.
They are there to roll up their sleeves to really help frame out the document. And when I talk about the document, it has the good, the bad, and the ugly. But the only way that we move forward as a community, a region, and a nation is to look it square in the eyes. And more importantly, to do something about what [00:21:00] the voice of the people has said.
And that's what this moment is about. I'm actually so excited to be in the middle of it because I get to experience it. I'm exhausted, but it's a good tired, because I believe that if we don't get it right now, shame on us. Shame on us.
Keisha: We've talked about the community benefit agreement as a way to help Micron keep its promises, but why are you confident this time will be different?
Melanie: This time will be different because there are so many other people, who have experienced the past and know what happened, actively engaged in the moment. There is a trail of accountability that's a mile deep, and we didn't have that with the industry before. Industry came, industry went, industry did its thing without the voice of the people until after it was too late.
We have the [00:22:00] lessons of the past, and I've got to tell you, from what I've seen, there is no letup around accountability, questions, engagement, partnership, and the whole, “Prove to me, Micron.” And with all of those that will come because of Micron (other companies), the level of community collaboration is at an all-time high. And you know how hard it is to get folk to work together, right?
Whether it's government, education, corporate, community. Try getting everybody to work together. That's always difficult. But it's a clarion call, if you will. And so people are falling in and working together. And yep, sometimes you've got to elbow through it. [00:23:00] But the most important thing is, even when there is disagreement, or I'm not on the same page with you on this, people stay at the table.
So that's the fundamental difference that I see. Before, it was: I disagree with you and I'm getting up from the table, and I ain't ever coming back to the table to talk to y'all again. It's not that way this time. It's like, yep, I disagree with you. Here's the reasons. Where's the middle ground? Where can we find the common space for us to meet?
And it's hard. This is not a cakewalk by any stretch of the imagination, because sometimes the moment requires us to have hard conversations that no one wants to have. But in order for us to step through, we have to call a thing a thing. We've got to tackle big, hairy issues, because if we don't change the systems and we don't change the moment, again, it just becomes glitz and glam.
Annanda: What do you see as needed for the people of Syracuse?
Melanie: I need kids to know how to read by the time they get to the third grade. I need kids not to have lead poisoning. I need kids to have a reliable, safe roof over their heads.
I need folks to have livable, life-changing wages, right? Because sometimes when we say livable, it's just a little bit livable. I need reliable transportation. I need good childcare that falls outside of the traditional center that opens at 6am, closes at 6:01, and if you're not here by 6:01, you're going to get charged $10,000. I need reliable and different forms of care. I need the digital [00:25:00] divide not to be the digital divide. I need a bus to be able to pick folks up and take them to where the work is.
Annanda: Is there anything that you want to say to the people of Syracuse?
Melanie: Let hope and courageousness ride shotgun. Be engaged. Use your voice. You are heard. You are seen. And if you feel like you're not heard or seen, just speak louder and often. Everywhere you go. In the words of a very dear friend of mine, the one and only Gwen Webber McCloud, she always says, “I [00:31:00] have my hands on your back.”
So Syracuse, Central New York, I got my hands on your back. And you can take that to the bank.
SOUND: break cue
BREAK
SEGMENT C
SOUND: interview cue
Annanda: We're back. And this, being the last episode, was definitely filled with a lot of community and a lot of meaning. Keisha, what comes to mind for me is adrienne maree brown. In her book Emergent Strategy, she talks about how conversation is a crucial way to explore what we believe and to make new understandings and ideas possible.
And I see that so much with the people of Syracuse: what does it mean to have a higher quality of life? What does it mean to have manufacturing come back? And, I never thought I'd say this, but it appears that Micron, as a tech company, is also asking, what does it mean to be responsible and to invest in the community that we're in?
Keisha: Every single conversation gave us a different layer of understanding of what it means to be at this cross-section of this part of New York, of this part of the country, and at this kind of front line of what it means to be in tech. We heard it from people who've worked in the tech factories, from the historian with his big scope of history, from the community development leaders who are trying to make sure that it actually works for people.
And I love the theme of not just us hearing from them in conversations with us, but them talking to each other, co-creating what democracy actually looks and feels like. You get softened by somebody actually doing a responsible thing.
Annanda: Come on now! Yes! Which is a radical act, Keisha. As you're talking, it reminds me of this saying I heard somewhere: to not be bitter in our society, in our world, is actually a radical act. It is a spiritual and emotional discipline to not allow the world to make you bitter. And what I'm hearing from you is, Syracuse is open to not being bitter.
Keisha: This is why I was waiting for the shoe to drop. I was waiting for the "this is what they say, but actually we haven't seen it yet," or "this is what they've promised, but it's not come to pass." But it's coming to pass. People are already experiencing benefits. And I think the healing is not just the money transfer from the big company to the people.
It's almost like the healing of imagination and like people who have seen and experienced the resources getting taken away, being willing to put the hope back in the public sphere, put the hope back in their local community and dare to dream of something better. Yes.
Maybe their tagline should be the open hearted community. Because all we heard from them was open hearts. Like, willingness to trust and willingness to try and willingness to have those conversations that you're talking about. And you have to have an open heart to do that.
Annanda: Eyyy!
Well, there you have it. Thank y'all for having open hearts to listen to us for the past two seasons. Stay tuned for what may come next.
Keisha: It's been amazing.
SOUND: Closing theme starts
CALL TO ACTION
Annanda: Reach out and talk to us on Instagram—our handle is @moralrepairpodcast. Also catch us on X… formerly Twitter. We’d love to hear from you!
Keisha: Follow the show on all major audio platforms—Apple, Spotify, Audible, RSS—wherever you like to listen to podcasts. And please help others find Moral Repair by sharing new episodes and leaving us a review.
S2 CREDITS SEGMENT
Annanda: I’m Annanda Barclay.
Keisha: And I’m Keisha McKenzie.
Annanda: The Moral Repair: A Black Exploration of Tech team is us, Ari Daniel, Emmanuel Desarme, Rye Dorsey, Courtney Fleurantin, and Genevieve Sponsler. The executive producer of PRX Productions is Jocelyn Gonzales.
Keisha: Original music is by Jim Cooper and Infomercial USA. And original cover art is by Sam Martin and Randy Price. Our two-time Ambie-nominated podcast is part of the Big Questions Project at PRX Productions, which is supported by the John Templeton Foundation.
[PRX SIGNATURE]
SHOW NOTES
Syracuse area
2003 history of the decline of manufacturing jobs in the Syracuse area – Brookings Institution
Episode about Haudenosaunee
Micron Items
https://www.cnycec.org/ – Melanie Littlejohn is co-chair of the Central New York Community Engagement Committee for NY State’s Micron investment
Melanie Littlejohn also is the CEO of the CNY Community Foundation. Here’s a video
NVIDIA AI & Omniverse: Pegatron Digitalizes AI Smart Factory w/ Isaac Sim, Metropolis, and Omniverse
Inside Micron Taiwan’s Semiconductor Factory | Taiwan’s Mega Factories EP1
What will Micron Technology mega computer chip fab look like in Central New York?
Micron to spend up to $100 billion for New York semiconductor factory
Season 1
Algorithms: Follow the Purple Light
Season 1 | Episode 1
What do we do about recommendation algorithms? What ethical standards could we use to reshape technology? Hosts Annanda and Keisha talk to Stewart Noyce, a technologist who helped develop the internet, and Rev. Dr. Sakena Young-Scaggs, an Afrofuturist scholar and philosopher, to understand how we can all navigate recommendation algorithms in a life-giving way.
-
INTRO
Keisha: One time I hopped onto TikTok and I think I was watching these gym or body movement videos, and then, like, before I looked up, it was an hour later and there was some live talking about the president being a reptilian imposter, and I'm just like, [laughs] how did we get from one to the other? I don’t even know.
It actually made me wonder about, you know, not only how much time do these devices take from us, but also how much data is needed to create them in the first place?
[title music starts, laughter]
Annanda: Quite curious that the president is a reptile [laughs], and this is what, this is what we're fed. This is what we're fed.
Annanda: I'm Annanda Barclay, death doula, moral injury researcher, and chaplain who’s concerned with a life well-lived and cherishes the relational stories people share.
Keisha: And I'm Keisha McKenzie, technical communicator, strategist, and advocate.
This is Moral Repair: A Black Exploration of Tech. A show where we explore the social and moral impacts of tech advances and share wisdom from Black culture for healing what tech has broken.
Today on Moral Repair—what are we gonna do about algorithms?
[title music ends]
[upbeat hip hop instrumental music starts]
Annanda: What algorithms recommend shapes our worldview. It impacts how we think and feel. It informs our politics, how we shop, how we perceive one another.
Keisha: It's surprisingly like exposure, to share what's going on in your algorithm because you know that it's doing recommendations based on what you told it at some point or what it's assuming about you. When we share the Netflix password at home, like I, I have a whole separate profile for guests because first of all, you don't wanna see what they're seeing. You don't want them to see what you're seeing and you don't want what they're seeing to mess up what you're seeing.
Annanda: Recommendation algorithms are so incredibly intimate and personal, even though it's a shared experience within the company, the platform, it's oddly both public and private.
It's as though an algorithm is a pair of glasses that colors our view of the world. And that algorithm, it was created by a stranger for corporate interests to keep us watching and to sell us things. So yeah, what are we gonna do about algorithms?
[upbeat instrumental music ends]
SEGMENT A - STEWART
[mysterious background instrumental]
Stewart: I watched Instagram over the last six or nine months go from serving up my family and friends to serving up whatever it thought I needed to watch. I just watched a guy jumping off while he was skiing and juggling. It's the most amazing thing.
[Background yelling: “YES, the flip juggle!”] [laughs]
I did not have any interest in that at all when I woke up this morning.
Keisha: Stewart Noyce helped create the internet—no joke.
Stewart:
[mysterious background instrumental]
[In] 1992, our friend group had a guy who was the first webmaster at Sun. He had to explain to us what he was doing with HTML. And they were building the first browsers; they had some of the first websites and all of a sudden what we thought was just gonna be something that helped companies do a better job of making money now became something that everyone could interact with and everyone could immediately, uh, kind of create their own worldview.
Stewart: So, when they had the Lillehammer Olympics, Sun went over there and started broadcasting the news using their web browser and so you could get access to the news faster and in a more open and creative way.
CLIP AUDIO from IBM Olympics ad: 00:09-00:35
Before they could start at the 1994 Winter Games… we started. Four years ago, IBM was here helping plan and design every event…. The Lillehammer Olympic Organizing committee chose IBM not just to integrate thousands of computers and systems but to actually help manage the Winter Games.
Stewart: I think that was the moment I was like, we—I can’t believe, I think we've unleashed something.
Annanda: Something that upended who was in control.
Stewart: Being in the middle of the first push for the internet meant that we were taking people away from very centralized systems.
Individuals, then, were empowered to create and to interact in new and interesting ways. So Twitter ended up being a massive change model for the world because it gave people a place where they could throw out their voice.
Now let's step back though and say what's happened in response to all of that energy, right? Was that the people who wanted to continue to maintain their control over the social fabric, to make sure there's order in society, to make sure that we don't get too far over our skis in terms of chaos in the public forum, they wanna pull it back. And so surveillance came out as really a response to all of us being able to share our information.
Annanda: Stewart’s a technologist with a lot of experience in what powers the internet, including recommendation algorithms—
Keisha: which are based on that notion of surveillance.
You can think of them as recipes for serving up content based on what the app or site that we're using thinks we wanna see—
Annanda: —or what we've already chosen to tell it. So we feed it something, and then it keeps on suggesting the next thing, and the next thing.
Stewart: The first step you would do in the recommendation algorithm is you can just take the last thing that somebody's watched and keep feeding that. It's like, did I watch a comedian for 60 seconds? Okay, well I'm gonna get that comedian for the next two months. That's not a capable algorithm. It's a trivial algorithm. It just gives me the last thing I saw. It gets smarter when it can take in more inputs and it builds from a base of experiences that I have and maybe brings in other factors that give them clues as to my interests. And those can be coming in from external sources. If you have multiple feeds of information that can come from different browser history that I've had, then you might be able to get a little bit more information about who I am and what I'm interested in: including more features, right, in the algorithm itself. So as opposed to having one or two things that you clue off of, this might have 10, 15, or 20.
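Stewart's distinction between a trivial "last thing you watched" recommender and a more capable one that weighs multiple signals can be sketched in a few lines. This is a toy illustration only, not any platform's actual algorithm; the field names, weights, and sample data are invented for the example.

```python
from collections import Counter

def trivial_recommend(history):
    """Trivial recommender: just repeat the topic of the last thing watched."""
    return history[-1]["topic"] if history else None

def feature_recommend(history, browser_topics, catalog):
    """More capable recommender: score catalog items against several
    signals (in-app watch history plus, say, browser history) rather
    than a single input."""
    weights = Counter()
    for item in history:
        weights[item["topic"]] += 2   # in-app behavior weighted more heavily
    for topic in browser_topics:
        weights[topic] += 1           # external signal, lighter weight
    # Recommend the catalog item whose topic has the highest combined score.
    return max(catalog, key=lambda c: weights[c["topic"]])

history = [{"topic": "comedy"}, {"topic": "skiing"}, {"topic": "comedy"}]
catalog = [{"id": 1, "topic": "comedy"}, {"id": 2, "topic": "news"}]

print(trivial_recommend(history))                     # → comedy
print(feature_recommend(history, ["news"], catalog))  # → {'id': 1, 'topic': 'comedy'}
```

Adding more "features," as Stewart puts it, just means folding more weighted signals into that score before picking the top item.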
Annanda: One of the issues with recommendation algorithms is privacy. Corporate Big Brother is able to see what you watched and when. Think dating apps. Over time, they learn your preferences and you're able to chat privately, but it's always at least a three-way conversation: you, your potential cutie or cuties, and the corporation that designed the algorithm, which has forever access to your private conversations. Corporations with recommendation algorithms know us intimately. We pay to play, and we pay with our privacy.
The internet was groundbreaking for freedom of speech. But also a new threat to national security. Pandora’s Box unleashed.
Stewart: I was involved in the introduction of elliptic curve cryptography, which allowed people to have secure and private communications from your handheld computer into the back end of the internet, and to other people. All of that activity would be private and would accelerate communication between individuals. Well, you know, you can't have that, so we had to have backdoors put into all of the cryptography…
Keisha: “Backdoor.” Quick definition here.
[Background: swipe sound, double bass-heavy mysterious music, keystrokes, wicked giggle]
Let’s say you want to use a social media app on your phone. You’ll come into the system through the front door: you type in your username and password and you’re in. But the system owner also makes a “backdoor” to that system. They and whoever else they let in can use that backdoor to access the system and its data, including yours, even if they don’t have your direct permission. Those terms and conditions we don’t read but swipe through? We’re giving implied permission there, to God knows who. OK, back to Stewart.
[Background sounds end]
Stewart: And if anybody did not wanna play ball, then they had to leave the United States because this is where we were maintaining that control.
Keisha: The government required those back doors? Or that was industry consensus that there would be back doors?
Stewart: I wasn't in the room. All I know is that back doors have been inserted into all these algorithms.
Stewart: And if you look at the systems that are in place, like Signal or WhatsApp or Telegram, these peer-to-peer messaging applications are secure to the extent that most people can't break in and hear your communication, but whoever's running that system can. So whatever sovereign entity or large corporate entity is running that model, they can hear everything that you have, and they can use it for their own purposes.
Keisha: Since 9/11, particularly, the US government has been empowered to suck up tons and tons of data, whether through the NSA and the intelligence community or through the information that we volunteer through our interaction with corporations.
Stewart: Live in the light, then, because there's no, there's no privacy anymore, literally no privacy. And if you want privacy, then you should probably go and have a conversation with somebody at the beach and leave your phone in the car.
Keisha: You were saying capable recommendations require several different kinds of information sources, so not just the stuff that we might create as an Instagram post, but also information from the mobile device that we're using to log into Instagram, and maybe our network of connections on Instagram, and maybe the metadata for how often we use Instagram, and all that information then gets compiled into this more capable algorithm. Is that right?
Stewart: Yes that’s correct.
Keisha: So, like, if, as you're saying, what we're facing now is corporate control, decentralized control, and very little restraint because there's a profit motive—
How are you imagining a path forward? What's the next step if corporations are not being restrained by a government and also government isn't being restrained by government, what's the opportunity here?
Stewart: You mean our opportunity for survival?
Keisha: Yes! [all laugh]
Stewart: I would say that, yeah—
Annanda: Yeah.
Stewart: I thought I'd throw that out there. I would say that I'm very optimistic about the next generation and I think of them as generation AI or generation artificial intelligence.
They're growing up in a very new kind of world, where these massive global systems that have been collecting information about them and all their friends and everybody else, now get turned against them and they just look at 'em and say, oh no, if we're gonna live here for the next like 60 years, we're not gonna put up with that.
And so they're already changing.
Keisha: Several sites built on recommendation algorithms have lost credibility and users over the past year. As we start to worry about privacy and bots, some of us will share less with each other. As people lose confidence in the communities that the internet has given us, trying out smaller platforms and being more wary of algorithms makes a lot of sense.
But it might be too early to know how big a trend this is gonna be.
Annanda: I wonder how this moment in tech is affecting those of us who want to feel known and safe with each other, not watched by corporations who make money out of what they think they know about us, and not pushed apart by algorithms we don’t control that tuck us into separate corners.
The distributed web lets us find things and people we’d never have discovered on our own. And yet the recommendation algorithm can make it really hard to come back together.
Disconnection and endless advertising is not at all what the internet promised us.
Keisha: I came of age on the internet around 1998. I've seen people have really deep and meaningful human experiences, relationships, connection, self-discovery, all of that because of this distributed web. But then at the same time, in the last two decades, people have been crying out for a lack of local, in-person third spaces.
So work has zoomed into a massive influence on your life. You have less time to do things and to build connections that are meaningful. And so I do see people getting off of platform X, whether that's TikTok or Facebook or Twitter or Instagram, and going to these smaller servers on Mastodon and Discord.
It's like going from a walled garden that one guy in Silicon Valley controls—[Stewart laughs]—to a walled garden that somebody down the street controls, but at least you know the guy down the street and you can kind of work with them on the rules of the walled garden that you're now in. You're in somebody's patio instead of in somebody else's mansion.
And so there's something around this more localized version of the internet that's pretty interesting. If we play it right, if those smaller spaces, our spaces, don't turn back into places that corporations can weasel their way into to advertise to us through content, or to influence our political or social thinking, then I won't worry.
But if it's more that people isolate, I think that's a troubling trajectory.
Stewart: If you want to make more contacts and more relationships that are real and deep, then as an individual, you should be using the internet and using social media and public spaces to present your talking points, goals and interests, in such a way that those people who should connect with you find you and start having a relationship with you. But don't expect that “I represented myself” to result in immediately strong interconnected relationships that develop value. You're going to have to work at those.
Keisha: Our best shot, given the technology is here, is to try to use it to drive towards knowledge and wisdom. And part of that means that we get to use influence, whether it's influence and virality through the bots, or it's actually building on our human connections and persuading our neighbors to share the thing.
What's the difference between that sort of spreading the message and the kind of spread-the-message that unnerves you about this technology? If everybody's having the same dream, that's because somebody spread a particular dream. But we want to spread a particular dream, like a particular dream about flourishing and a particular dream about life and community.
Stewart: Every group that gets together and creates something new brings in the best of each individual to bear on the solution. We have a vision around something that is a flourishing human society that brings value to each of us. Each of us is worthy, each of us has value and should be able to contribute. And because we do that, the life becomes so much easier for all of us because we just have more instead of the scarcity.
Keisha: Where I hear people worrying the most about algorithmic technologies is where they are used to break the group apart, to break apart the symphony, where they're used to atomize us and to increase suspicion. And even the question of whether we're able to tell what is true because there's so much content and it's increasingly hard to distinguish between true and false, that feels fundamentally dangerous because if we can't tell what is true, then how can we be in the same reality with each other?
There has to be some sort of countermeasure to the profit motive that encourages that sort of true-false distinction to be washed away.
Stewart: If people practice division and deception, then that's what you'll get. And those of us who live in this world need to consider that acceleration in every part of our life and how fragile it makes our society, because we have to make decisions faster when we should be spending more time to think: is this true or is this false? How do we slow it down?
[bass-heavy episode music starts]
And that is why I believe that we will have more communities where we are better known and those communities where we are better known are more likely to push back on us when we do something foolish or say something that divides and deceives and that we will become better people because of the small community we're in.
God wants what is best for us. That means each of us is valued and each of us should be valued. So for there to be one meme that is the controlling meme for everyone, I don't agree with that. I agree that each of us do have value, but we are working within a framework of something greater than ourselves.
[midroll Break -- ads and promos go here]
Keisha: This is Moral Repair: A Black Exploration of Tech. We’ve been talking about algorithms and their impact on society and us. And now we’re thinking about ways to heal what those algorithms have broken.
Annanda: Technological innovation in the West has had a complicated impact on humanity over the last five centuries.
[background: driving, curious music starts]
Portuguese, Dutch, British, Spanish, and French enslavers used innovations in shipbuilding technology as their literal vehicle during the trans-Atlantic trade. Plantations also used state-of-the-art tech to process goods like cotton, rice, and sugar cane as well as the labor of enslaved people. That was the tech that made our world.
And tech’s still shaping our world. Algorithms are used in judicial systems to support how judges sentence people charged with a crime.
Many states also use recommendation algorithms to determine who gets access to food assistance and other social services.
Keisha: According to research from Professor Virginia Eubanks, automation of the Supplemental Nutrition Assistance Program has overwhelmingly resulted in fewer people receiving life-saving social services.
The resulting increase in poverty is so dramatic that she says these recommendation algorithms help to build “digital poorhouses”.
In the colonial past, as today, Black people disproportionately experienced poverty and incarceration. And today, Black people are also significantly underrepresented as engineers or decision makers in the tech industry shaping this world.
Annanda: A 2019 Harvard Business Review paper revealed that Black entrepreneurs receive just 1% of all venture capital funding, an amount that hasn’t really budged in the last several years.
I went online just a month ago to see how many C-suite executives at the top five tech companies were Black, and as of this recording, I saw maybe one or two. And out of those, they were all doing DEI. For those of you who don’t know, DEI stands for diversity, equity, and inclusion. Meaning that the work that they do doesn’t directly contribute to technological innovation. The work that they do is diversity, equity, and inclusion work within the organizational structure.
Keisha: That’s wild.
[background music ends]
Annanda: So with that, I'd like to introduce my mentor, dear friend, and scholar, Reverend Dr. Sakena Young-Scaggs.
SEGMENT B - REV DR. SYS
Rev. Dr. SYS: "Reverend Dr. Sakena Young-Scaggs" is a lot for most to say and definitely a lot for most to say on the go. So it just sometimes gets shortened to "Hey Rev SYS [Sis]," and that's fine.
Keisha: Rev Dr SYS explores tech’s problems and possibilities through the lens of her field, Afrofuturism. She’s a pastor, campus dean at Stanford University, and philosopher who thinks a lot about how tech shapes us all, who’s often left out of the design room, what that costs us, and how we can still find joy.
That’s where moral repair comes in. Something breaks down, not only in a kind of technology like algorithms, but also in our sense of what is right and how we interact with each other. Arizona State professor Margaret Urban Walker describes moral repair as the work of creating or stabilizing mutually agreed ways for people to be in relationship.
And that means we have to look at all of the features of the social world and the technological world that we’ve inherited.
Rev Dr SYS: One of the challenges in science fiction and futurist thought, historically, has been the erasure of race, and talking about the future without giving that consideration.
That has changed over the last three decades. The beauty of that, was a term, Afrofuturism. It crosses genre.
You take that philosophy of Afrofuturism and you imagine a future, you imagine a world in which we can shape and mold, but also reimagine a world that has caused harm and trauma systemically to Black and brown folk.
Keisha: So these patterns of surveillance and erasure go way back. But so do some of the tools that monitored communities have used for resilience and repair.
Annanda: Among Black enslaved communities, hush harbors were places our enslaved ancestors would go to “get out of the gaze” of their enslavers and an overseeing population—aka White people who were not directly their enslavers—to find privacy.
They would participate in religious and spiritual practices, ceremonies of releasing stress, trauma, pain. They'd also celebrate in joy. They would laugh, they would cry, dance, and sing.
They would exchange multiple forms of wisdom to keep on keeping on in conditions designed to reinforce their chattel condition. They found ways to be resilient in the moral injury of slavery.
Keisha: Lack of privacy and self-determination isn't new in the human experience or in recent US history. But neither is imagination.
[background: trumpets and whirly lounge jazz music, evoking the whistles of the original Star Trek series]
Rev. Dr. SYS: Every course that I open, that I teach—science fiction, race and religion—I have everyone take out their cell phone, can't get away from it. And I say, you're holding in your hand the imagination of Gene Roddenberry. I say, in 1963 he imagined this technology called a tricorder, and it was a communicator. And it was able to do wonderful imagined things of not only communicating but medicine and identifying, uh, body metrics and, maybe even doing healing. And as Star Trek, of course, uh, took on “the next generation,” the tricorder did lots of things.
Keisha: I really appreciate seeing the cell phone as the, almost the precedent of the tricorder. I watched Star Trek and my favorite captain for the majority of it was Picard. He would go into his office and then tell the computer to make him an Earl Grey, hot [AUDIO CLIP: Earl Grey Hot] or when anybody would say to a newcomer onto the ship, "Whatever you've got on your planet, the computer can make it for you."
There's something about the knowledge that is embedded in that computer system in this series that mirrors all of the data that has been collected and infused into Google's machine and into the new forms of AI. I think people are wondering now about what it does to us as humans to have acres and acres of human knowledge and experience at our fingertips, ready to serve us exactly what it thinks we want.
Rev. Dr. SYS: Very practical example: 3D printing is the manifestation of that science fiction, science happening in real time.
My husband is a chess coach. He was teaching at a STEM school in Phoenix, and the students came to him and said, you know, we'd love to play some chess.
And before they had a chess team, parents came and said, oh, we'll bring you some chess sets. But the school had a 3D printer. And they began to print out chess sets for these kids, in urban Phoenix, downtown, to, to play chess regularly. And so manifest on the tables were these 3D chess sets that was just printed up, um, like Earl Grey Hot.
Some say that the Stanford air has imagination and innovation just in it, uh, because things just kind of get constantly innovated and imagined, and then manifested and come to life. That's part of that imagination.
What do you do with it? People sometimes, yes, are afraid, and uncomfortable with imagined science that comes to life. But it's not the science or the imagined information. It's how it's applied and how it's used.
Those come back to what I talk about in ethics and theology. It's how it's used and how it's applied with the human element, uh, not just the science in and of itself.
We had this discussion, uh, two decades ago in the late ‘90s. When they broke the human genome, everybody got uncomfortable… because there was no controls and we had to put in ethical imperatives.
And I think the same can be said about AI. It's not the science.
It's how the human element interfaces with the science.
Annanda: My research is centered in the moral distresses and injuries directly informed by the implementation of technological innovation. So I’m curious, from a Black Africana perspective, what moral wisdom, resilience, and self-determination looks like as it relates to recommendation algorithms.
If you had a magic wand, what do you think would be a Black approach?
Rev. Dr. SYS: Is this life-giving or death-dealing?
Mercy Oduyoye, a womanist theologian, I always use her quote, how do we assess, how do we go about the world when we assess things? That has always been my litmus test, whether something is life giving or death dealing, and if there was a purple light that could assess whether something was life-giving, that would be what I would imagine.
Because we need more life-giving things, we have a lot of death-dealing things that are imagined every day. And so what good—does this enhance the human experience? How does it shift, the barometer of hunger and healing and wellbeing and not status or rank or you know, how much we are gonna make in terms of the cost and capital of something that we're gonna just rack up and this is gonna just make us rich.
No.
What good does it do for humanity overall? Is it life giving or is it death dealing?
Rev. Dr. SYS: does it disconnect us? If it's a technology that disconnects us from our humanity, then I find it problematic.
We know that we have experienced trauma and harm at the different hands of society, whether it be the state, whether it be the everyday citizen that just harbors intergenerational racism or hate.
We have, as a people, I'm gonna talk from the Africana perspective, from the I perspective, experienced that, so we have a healthy suspicion of anyone keeping track of us, monitoring our bio data. Exactly.
You have this cultural boundary, that is valid.
Annanda: What role do you think black Africana spiritualities and religious traditions have at this particular time of the fourth industrial revolution? You know, what wisdom and gift, um, can it provide at this time?
Rev. Dr. SYS: Who do we take our problems to, our heartaches, but also who are speaking from our pulpits that we flock to on Sunday and then live it out Monday through Saturday.
Africana spirituality includes the Black church, Christianity, but it also includes Black Muslims. It includes Black Buddhists, includes African traditional spiritualities. So the voices of our spiritual leaders, the Black church, all of those spiritual traditions, um, has always been part of community: sages and wisdom.
You know, we can talk nicely about shamans and witch doctors as well. They've all been part of the spiritual landscape. The Yoruba Baba, the Iya, they've always been the ones who we go to, to seek wisdom—the Nana in the Akan tradition.
So there is no way around in many ways, spiritual practice and spiritual leaders having a voice. Now, you know, whether someone is a humanist, they're still in some ways practicing spirituality. Because they're thinking about their human existence and having meaning.
So all of those voices play a role in shaping decisions and shaping the way in which people have decided to interface or not.
For me, leading with heart and head means making decisions that's wise intellectually, that takes the human capacity and the harm that it could do, or the help that it could do, it meaning technology.
Keisha: Mm-hmm.
Rev. Dr. SYS: And the, the joy that it could bring because we need more joy.
Dr. Prathia Hall coined the term “Sankofa ethic,” and it means that you live out both past, present, and future in taking those things into consideration.
Keisha: From all the moral streams that you've referenced today, choosing paths that are life-giving for technology, thinking about the past and remaining anchored in it while stewarding the future, is there a particular frame that you wanna offer people today who are navigating almost infinite choices through the technologies they hold in their hands?
Rev. Dr. SYS: Because there is so much to choose from, the marker has to be simple on how we decide. There is infinite—as we know now with AI, infinite combinations and possibilities of technology moving forward, and we have to wrestle and reckon with that.
[title music starts]
But whether something is life giving or death dealing, that's the bottom line, I believe, as an assessment. And there's the purple light: make a choice.
OUTRO closing music
Keisha: One of the things I loved most about Rev. Dr. SYS's conversation with us was the part where she talked about how Star Trek and music and other cultural media moments help her bring her students into the conversation about imagination, science fiction, and how science happening today is the imagination made real.
Annanda: I appreciate her introducing herself and showing the importance of what does it mean for her to be, uh, you know, a Black African American woman who's concerned with how we be— with the phenomena of things— and that how we be matters cuz it's how we show up in the world.
Keisha: As I'm looking at the way the algorithm is pushing me through different kinds of content, I’ve been thinking a lot about how I felt about the internet back in the ‘90s and early 2000s: hopeful about relationships, spending time in life-giving communities. I’d really love to have that back in my life again. But maybe it means I give less time to the algorithmic internet. We're more than what is manifest. We're more than what is obvious. There's something that's intangible and beautiful about humanity, and sometimes technology helps us like deepen into that and sometimes it distracts us from it. And that goes back to what Rev. Dr. SYS was saying, that there's a way that technology can enhance our humanity or distract from it.
Annanda: Stewart reminded me this technology is not going anywhere, so knowing how to use it to your benefit in this capitalist system is important. With Rev. Dr. SYS, the takeaway I have is that if the outcome and the impact are misaligned with the good intentions for this technology, there’s accountability and responsibility for the real-world ways in which it impacts people’s lives.
Stay strong out here folks! [laughs]
Keisha: Okay! Yeah.
Annanda: Y’know, make it work for you. In life giving ways.
Keisha: That's right.
CREDITS
[bass-heavy episode music starts]
Annanda: I’m Annanda Barclay
Keisha: And I’m Keisha McKenzie.
Annanda: The Moral Repair: A Black Exploration of Tech team is us, Ari Daniel, Emmanuel Desarme, Courtney Fleurantin, Rye Dorsey, and Genevieve Sponsler. The executive producer of PRX Productions is Jocelyn Gonzales.
Keisha: This podcast is part of the Big Questions Project at PRX Productions, which is supported by the John Templeton Foundation.
[episode music fades]
[PRX SONIC ID x 2]
SHOW NOTES
Learn more about Stewart’s work in marketing and consulting at StewartNoyce.com
See IBM promoting their work at the 1994 Winter Olympics in this vintage ad.
How do algorithms drive social inequality? Virginia Eubanks explains in Automating Inequality: How High-Tech Tools Profile, Police and Punish the Poor (St. Martin's Press)
What’s Afrofuturism all about? Read Afrofuturism: The World of Black Sci-Fi and Fantasy Culture by Ytasha L. Womack (Lawrence Hill Books)
Learn more about Black entrepreneurs receiving 1% of all venture capital: “Sources of Capital for Black Entrepreneurs,” Harvard Business Review, 2019, by Steven S. Rogers, Stanly Onuoha, and Kayin Barclay
Explore more on “life-giving and death-dealing” from African feminist theologian Mercy Oduyoye: Beads & Strands: Reflections of an African Woman on Christianity in Africa (Theology in Africa), Orbis Press, 2013
Holograms
Season 1 | Episode 2
Have you ever considered the moral dilemma of a hologram narrating someone's life story for them? Dive into the latest episode of Moral Repair as we interview Otis Moss III about the interactive AI hologram of his father, civil rights leader Otis Moss Jr., showcased at the Maltz Museum in Ohio. Discover with Zuogwi Earl Reeves how the Black wisdom of hip-hop plays a pivotal role in the moral repair of narrating our tales authentically. Tune into the episode and embark on this thought-provoking journey with us.
-
MORAL REPAIR - EPISODE 2: Holograms (& How We Remember)
Big Question: Can your hologram replace you?
INTRODUCTION
Keisha: Hey Annanda.
Annanda: Hey Keisha.
Keisha: Listen to this clip I found on YouTube. It’s from the Maltz Museum of Jewish Heritage near Cleveland, Ohio.
Audio from Maltz Museum’s promo video (00:19-00:59): (Maltz narrator) “Using state-of-the-art technology, Rev. Moss was filmed over the course of several days, answering thousands of questions about his life growing up in Georgia, being orphaned at the age of 16, and continuing on to earn a Master’s of Divinity from Morehouse College, to become one of the foremost thought leaders on civil rights.” (Maltz staff to hologram) “What’s your story?” (Moss hologram) “I am Otis Moss, Jr, an eternal… fighter… for justice… human rights… civil rights, and the right to be… a human being.”
Keisha: “The right to be a human being.”
But guess what. This isn’t a human being talking.
Annanda: It’s not actually Otis Moss, Jr.
Keisha: Nope. It’s a hologram trained with AI to sound and interact like him.
And he’s not alone.
Earlier this year, therapist Esther Perel learned that someone she didn’t know had made an AI bot of her.
They’d trained the bot on her voice and her stories, counseling sessions, and speeches, and then used AI to make it responsive… all without her knowledge or consent.
It sounded like her. It answered questions like her. But it wasn’t her. Here she is at South by Southwest.
Audio from Esther Perel’s SXSW talk, The Other Ai: Artificial Intimacy (01:09-01:18): As I’m getting older, I’ve been thinking a lot about mortality. But I didn’t imagine immortality to appear in the form of a machine…
Audio from Esther Perel’s SXSW talk, The Other Ai: Artificial Intimacy (01:31-01:54): Part of me was flattered. Part of me felt plagiarized. Another part of me was deeply concerned about the clinical and ethical implications. Mostly, however, I was fascinated. This “AI Esther”—is she better than me? [SXSW audience laughs]
<< Music >>
Annanda: I’m Annanda Barclay.
Keisha: And I’m Keisha McKenzie.
Annanda: This is Moral Repair: A Black Exploration of Tech. A show where we explore the social and moral impacts of technology… and share wisdom from Black culture for healing what that tech has broken.
Today on Moral Repair — can your hologram replace you?
<< Music >>
Let’s return to Rev. Otis Moss Jr — the civil rights leader — and the exhibit in Cleveland that turned him into a hologram. First off, he and his family authorized everything.
OM3: The Maltz Museum approached my father.
Keisha: This is Rev. Otis Moss Jr’s son — Rev. Dr. Otis Moss the Third.
He’s a multi-hyphenate—author, filmmaker, mentor to young ministers, community leader, and senior pastor at Trinity United Church of Christ in Chicago. I know him as OM3, or Pastor Moss. And he knows exactly how his father’s hologram exhibit came to be.
SEGMENT 1 - Rev. Dr. Otis Moss III
OM3: The Maltz Museum is based in Cleveland. It is the primary museum and repository in reference to Jewish history and also specifically the Holocaust. And they were partnering with StoryFile.
Keisha: StoryFile is a software company that makes AI-based conversational videos.
OM3: StoryFile had already done several of these holograms with very unique and interesting people. Some were astronauts, some were scientists, some were writers.
And they had been doing the work, of recording, uh, the experiences of Holocaust survivors so that another generation would be able to engage with them.
But they've expanded beyond that of looking at issues of the tragic and triumph among the human spirit among people across the board from Indigenous to people of African descent, uh, to people who are Latinx and so on.
And my father, you know, is quite an individual in the city of Cleveland as an icon around civil rights. And they asked him, could we record you?
He had no idea it was a hologram. He's just thinking they're gonna bring in a camera, and record him. But then they explained that this is going to be an AI-generated hologram that will learn over time to engage people based upon the questions that they ask.
When StoryFile presented the other people that they had recorded and the purpose, that it was an educational purpose, that it was an ancestral purpose, he was immediately intrigued and reached out, which is, it’s so funny — My father has this wonderful, deep resonant voice. And he calls, he says, “Otis.”
I was like, “Yeah, Pop. What's going on?” “Um, I've been asked… to be recorded… as a hologram. What do you think?” *laughs* So we had this conversation about holograms, which was fantastic.
So it was a three-day recording event, which was, uh, fascinating in and of itself. In terms of what he had to wear, how he had to sit, and all of that in order to get the right posture to make sure that his voice was recorded appropriately.
Keisha: You've picked up a couple of threads that I'd love us to touch on, one being the function of multi-generational storytelling.
You said, for the museum, it was really important for them to help capture the stories of Holocaust survivors. And then with regard to your father, somebody who was part of the civil rights movement.
You're a storyteller yourself too.
OM3: Story is one of the most important things in terms of the human endeavor. Story is how we live, it's how we function. If you live out the right story, you literally can change your life and change the world. If you live out someone else's story, you have a ceiling on what you can do. And that ceiling is based on someone else determining what your possibilities are.
I come from a tradition and a family rich with the telling of story and the importance of story. Both of my parents are Southern. And there are no storytellers, in America, quite like Southern storytellers.
Keisha: Mm.
OM3: There’s something special about it. Before there was TV, you, you sat on the front porch, you heard a story. When families get together, you listen to a story. The myths, the "big lies," as Zora Neale Hurston likes to say, and the deep truths that are within these stories that sometimes have a lot of hyperbole. Every faith tradition is a story tradition. And by operating out of the appropriate story, it can be transformative or it can be reductive and destructive. What I appreciate that StoryFile did is they allowed my father to articulate his story, to just go for it, which was very similar to the manner in which he teaches and preaches: There's going to be a story, there's gonna be a reference to, one, African-American history or African history; two, there will be a story that connects his direct experience in the American experiment.
And StoryFile gave the space to do that and allowed him to, to, you know, to talk and talk and talk and talk and talk, versus trying to truly edit his story so that it would be palatable. Because his story is a story of great triumph and great pain and you can't separate the two.
He mixes those two things together with the story of what it was like growing up in rural Georgia in the 1940s and the 1930s. Having four siblings, a father who made a decision not to get remarried after his wife passed away as a result of medical apartheid.
That story connected with his eventual acceptance to Morehouse College and connecting with Benjamin Elijah Mays and the StoryFile people allowed him to share the reality of what it meant to be a person of African descent who was a visible, invisible human being in the South.
Keisha (Narration): Dating back to the 1700s, European and American courts, circuses, science fairs, medical schools, and museums began collecting and exhibiting Indigenous and African people and their remains. Several universities and public museums still store Indigenous and Black remains today, usually without the consent of the families or the nations they were taken from.
Keisha: You're aligned with the project on the level of story and cultural transmission. Did you have any concerns about the technology itself? Black culture has had a mixed legacy of being engaged with museums.
OM3: The difference in this is that he had personal relationships with people who were involved with the project. So the individual who gave the initial money, uh, for the project he had worked with before back in the day, probably protests and everything else. So they had relationships.
Keisha: Pastor Moss says there are a few things that made this project really sing: how the staff was educated to work with his father and the civil rights generation… their sensitivity to the content of their lives… and what it might evoke for people interacting with those histories and experiences in such a vivid way.
Otis Moss Jr and his wife, Edwina, visited the Maltz Museum when the hologram of Rev. Moss first opened. On their way through, they walked through an exhibit of nine photographers who had chronicled the civil rights movement.
OM3: They had these amazing photos many people have seen. My mom stops and says, oh, this picture of Coretta—you know, I'm over on the right side of the photographer when this was taken. This was Coretta Scott King. You see her mourning because she had gotten news that Dr. King had been assassinated. So it's the first time we'd ever heard this, that she was in the room in that moment. And she's just naming all the people who were in the room.
So we're just absolutely blown away. And it brought back memories for all of them.
But the one that was really moving was when they stopped at the picture of the open casket of Dr. King. My father paused and my mother paused. And they stated that “We were never given the opportunity to mourn our friend.”
Annanda: Mm.
OM3: And they stood there for like 15 minutes. Because they had to go back to work. There was a short mourning period, but they just had to get back to work and let other people mourn.
Keisha: What if holograms had existed when Dr. King was alive? Could they have helped people and communities remember their loved ones and mourn more fully in that moment, not just sixty years later?
OM3: Imagine if the community and the family of Medgar Evers had that.
Keisha: Medgar Evers was a voting rights and desegregation activist from Mississippi. He was assassinated by a White supremacist in 1963. That whole decade, people like Evers were murdered… people who devoted their lives to practicing freedom and fighting segregation.
OM3: That family did not have the time to mourn in an appropriate way because they were essentially in combat. They were, they were at war. It’s like we do the rites, but we gotta get back to work and let other people mourn.
Keisha: One of my biggest questions for Pastor Moss was whether the holographic representation of his dad felt authentic.
OM3: It—it does. They use multiple cameras to film. So you’re getting a video essentially of my father. But then the AI system is drawing on your questions.
So the AI system is trying to figure out, based upon the file that we already have, where should we go in terms of an answer. You know, "Did you have a dog when you were a kid?" "Oh yeah, we did."
But if it's something that the system does not have information on, there's a particular answer that it gives for that. You know, sometimes people ask, just like, you know, an out-the-box question, you know, do you put ketchup on a hot dog? Something like that. And, and the system says, "Excuse me?"
Annanda: Yes. Oh my gosh. Yes.
OM3: It was really great when, when some of the little kids were asking these, you know, kind of out-the-box kid questions.
Keisha: Mm-hmm.
OM3: And some of them were, “Excuse me?” And some of them the system would learn “Excuse me?” with a smile and a little laughter: “I'm not familiar with that.”
So it felt authentic because it was authentically his voice. I was familiar with the moments that were recorded. It was now the system doing the curation versus a reconstruction of my father physically.
Annanda: How is it that you hope people engage with your father’s hologram?
OM3: I hope that they will have a new picture of people of African descent. And I mention that because the gatekeepers of media, and even on social media, these stories are not held. They're not elevated.
These are stories that are deeply rooted in the human spirit and rooted in the spirit of God moving in magnificent and sacred, serendipitous ways, and I hope that another generation would raise a question:
How could a people denied everything in this country create space and institutions that rewrite the history of the country? How do they do that?
Keisha: A few possible answers… Resilience. Telling stories to stay connected. Leaning into community when folks cheat or steal. And being savvy about questions of dignity… and ownership.
Keisha: How do you understand ownership when it comes to stories? There is a way in which stories are cultural property, or collective property.
OM3: Mm-hmm.
Keisha: But we also are in a society that has intellectual property law—You can privatize that ownership. Did you have conversations with the museum about who owns—
OM3: Yes.
Keisha: —the stories and your father's image and so forth? Talk to us about that?
OM3: Yes. I'm glad you raised that. Yes. That was very clear that we, the Moss family, Otis Moss Jr., my father, it is his story. It is his image, and they are only leasing, per se, that image. And they do not have ownership.
We, as the family at any time could say, take it down. We don't want you to use us anymore. This is his intellectual property. That's very, that's very important. And Black culture has been appropriated so often, um, purchased and then saying, we own it.
Keisha: Yep.
OM3: When in many ways we, we should lease, but not let anybody own. So you want to use my beats? Sure. Lease them. You wanna use my estate? Sure. You're more than welcome to, but the, the ownership comes back to me. And we've seen that in music. We've seen that in film and art and others in books and things of that nature over and over and over again.
But there's some things that you can't, you know, even if you attempt to say that you own it, there's no way that you can, it's like Emeril Lagasse trying to say he owns gumbo. It don't work. *laughs*
Keisha: We came across statements from Chance the Rapper and Sheila E talking about Prince. Prince had very, very rugged ideas about whether holograms of his work should be used. Even Justin Timberlake—
OM3: Oh Lord.
Annanda: God bless.
Keisha: —introducing holograms at the Super Bowl or pretending that he would.
What’s your perspective on the use of post-mortem holograms?
OM3: In terms of educational purposes, I'm fine with it.
Keisha: Mm-hmm.
OM3: Uh, to create spaces where people can interact, I think there's an appropriate place for a museum to do this work.
In these other spaces? These are not for educational purposes. These are purely for profit purposes, and it goes back to which company owns it.
So, in other words, we're doing Henrietta Lacks with holograms.
Keisha: Mmmmm.
OM3: If the company was really about “We wanna save lives,” then the family also would've retained the rights to her own doggone DNA. And that’s what’s happening with the MJs and the Tupacs and all these other things: somebody’s figured out “I can make a buck.”
They're not trying to educate on Tupac's trajectory and his connection to the Panthers and his experience in the School of the Arts in Baltimore. They're not trying to tell us about his fascination with the seventies Black poets. They're not interested.
Educational purposes put things in public domain, not for-profit.
So the spirituals are in public domain. Mm-hmm. And they should be—
But I don’t want a hologram… of a bunch of Black folk… from 1842 and go to Disney—
Keisha: Singing “Deep River.”
OM3: —Singing “Deep River,” doggonit. I don’t want a hologram of Paul Robeson. You need to read about him.
Keisha: Oh Lord.
OM3: The only place it needs to be is if I'm going to the Museum of African-American Culture, to the Museum of—in Nashville—on African-American Music. Uh, yeah. That's fine.
But I do not want to go “Tickets available. $35. Sponsored by Coke.”
Keisha: Come on.
OM3: I don’t. I don't want the hologram like Henrietta Lacks to become like the NAACP programs have many times become. Where they're sponsored by the same people who are selling certain things to our neighborhood that is destructive.
Keisha: That's real.
Keisha: Sidebar. Just this summer, Thermo Fisher Scientific — one of the tech companies that profited from using Henrietta Lacks’ body without consent — finally settled with her family—over 70 years later. That’s how long it’s taken ethics to catch up with medical tech.
Now, hologram tech isn’t good enough to be confused with real people. Yet. But it’s reasonable to expect it’ll improve, and fast. As it does, how could community relationships and knowledge protect us from being misled, plagiarized, or preyed on for profit like Lacks was?
Keisha: You put a lot of content out there yourself: sermons, lectures, talks, interviews, films. Because of the density of that content available on the internet, we have deep fakes, deep fakes being very, very credible imitations of the real.
How concerned are you on a scale of 1-10 about deep fakes drawing on the content you've already put out there and applying it to purposes you don't agree with?
OM3: At this point, six to a seven. Within the Black church tradition, deep fakes have been around for a long time. Preachers have been preaching other people's stuff and telling a story. "I once heard"—it's a deep fake. So we have just given an algorithm to a cultural practice among human beings.
And that's why Zora Neale Hurston talks about how, when you are of a culture that knows story, you can make the distinction between the mythos and that which is the truth, and the bigger T “Truth” that’s embedded within it. So you then become an automatic editor.
OM3: And you know the wink that's happening. Soon as someone says, “I heard it said,” it's like “He got that from somebody.”
Annanda: It’s the “If you know, you know” culture. To that wink and the nod, if you are in community, Black community is efficient. It has an efficiency to it— that is designed to deal with the road bumps, the blockades, that capitalism specifically targeting Black bodies deals out. I really appreciate you naming that because that is how we operate and move through the world.
And hearing Keisha’s questions and your responses on how we do that with holograms, how we incorporate this technology into a Black sensibility and way of being is really fundamentally important and educational, for ourselves, for our audience.
OM3: What scares me about technology is the profit motive. More than anything else.
Plus, the inherent human frailty. So profit plus human frailty, historically, has meant tragedy for someone.
And where are the spaces that allow conversations of public good and not for quarterly profit? And we’re in this age—I struggle with that because AI in itself is held in the hands of a few companies. That terrifies me more than the AI because if it's a profit motive, it's not for public good.
Keisha: That's right.
OM3: They’re raising the question “Can we do this?” instead of “Should we do this?” And those are two fundamentally different questions. And they come from two different moral places.
Keisha: That's right. That's right. One is a pragmatic question and the other is an ethical question.
OM3: Mm-hmm.
Keisha: A moral question.
OM3: Yeah.
Keisha: Did you ever watch the Netflix series Love, Death, and Robots?
OM3: Yes. *laughs*
Keisha: There's an episode in there about what happens after humanity is gone. If that happens to us, your dad's hologram at the Maltz Museum might be one of the last sentinels on earth. What do you think that hologram would be saying to the universe?
OM3: Hmm. Ooh. What a question. It would be very clear that there lived a people and a person that deeply believed in love and the power of love, and the power of faith, and the power of a family and of story. Beings from another space and time, another dimension in the universe would marvel at a little community that produced these magnificent human beings who had a deep commitment to the idea of love.
Annanda: Thank you so much, Pastor Moss.
[BREAK]
Annanda: There's a long history of attacks on Black ownership that continues to this day, which makes the family ownership of Reverend Moss's hologram all the more unique and important.
And on this podcast, Keisha and I easily celebrate that ownership. But I'm skeptical that the ethical standards afforded the Moss family would be applied to us humble common folk. And as I record this episode, the actors' and writers' guilds have been fighting and negotiating, um, protesting this very issue of giving up the rights to their image and likeness, because AI is primed and ready to use the content created by writers and actors to generate entertainment for the public in perpetuity.
And the situation is understandably terrifying, but it's also not surprising when it comes to American business practices, right? Like, our Black ancestors’ enslavement was considered a practical necessity for ruling class elites, and the political and economic ideas of liberty and freedom for some could only be achieved through the super-exploitation of others. This is as American as American, or rather, apple pie.
Keisha: Last century, the pattern of claiming ownership over Black bodies and lives repeated during the eras of Jim Crow and the Great Migration, Civil Rights, and Black Power. In this century, the pattern repeated again, inspiring the Movement for Black Lives.
Annanda: And now, in this age of AI, history repeats itself. Again. And holograms could be another way that pattern of claiming ownership over Black bodies repeats itself. What does the past have to teach us about the present?
SEGMENT 2 - Elder Zuogwi Reeves
Zuogwi: The use of recreating someone's body, especially after they've passed, given what America has done with Black bodies historically. I think Black people really need to really start thinking about how precious their body is, number one for themselves, and how precious the Black body is within the American construct of making money.
I think it's just a complex conversation about technology. And how we use it, within our own experiences.
Annanda: This is Zuogwi Earl Reeves, a mentee of Otis Moss Jr. and Otis Moss III. He’s a Howard University trained public theologian. And his work is centered in the storytelling culture of Hip-Hop as a Black indigenous tradition.
At first glance, it might seem odd to consider Hip-Hop a resource for hologram moral repair. But it’s been a dominant way Black people have challenged and managed the moral crisis of our narratives being told, owned and/or controlled by others.
Zuogwi: Black people have spoke themselves into the future since we arrived on these shores.
When I think of holograms, I think of African Americans preserving African indigenous religions from West Africa, and how over time you start to see an amalgamation of Christianity, Islam, and Indigenous religions. To preserve some kind of resilience in which our language spoke futuristic. Spoke. In holograms. We spoke in holograms before we could actually see a hologram. “I'm going to go see Jesus.” “Swing low chariot, why don't you let me ride.”
Look where, what? What chariot are we talking? We out here in the plantation. What's, what's happening here?
Annanda Narration: Holograms have a history in Hip-Hop and Black music culture. From Tupac’s hologram at Coachella to Chance the Rapper saying he doesn’t want to become a hologram. {play from 0:26-0:28 capturing his iconic ahhs}.
“Speaking in holograms,” as Zuogwi put it, is a practice that’s evolved in the Black imagination over time. It’s a way to project power and wholeness in the face of the repeating unjust patterns of history. One arena where this is clearly visible is hip-hop and rap music. Where did hip-hop even start?
Zuogwi: When you look at hip-hop, rap, even trap music, there is this ability to tell exactly what is going on within the community. And I think the resistance and the celebratory pieces of that is that, through hip-hop, you can hear a warning about what's happening in the community, what's happening in the projects.
When I think about holograms, I'm always thinking about, yes, we have the technology to show this person, but our minds have created spaces for us to go.
When we think about hip-hop, it's in its 50th year, we have to look at where we first see it emerge, which is of course in New York City. You have to deal with the history of the major divestments within New York City education, and how it's repeating itself right now. Right?
The emergence of Black people needing a space and creating their own economic resilience to this catastrophic experience. The over-policing within those communities, and also the convergence of other oppressed communities coming together to create these brush harbor experiences within the hip-hop community. We start to see the theories of Black power, even some of the theories of King, start to evolve within these destitute places, these roses that grew from these project concretes all across the United States.
When I think of sacred memory with the artist, I also think about the first time I've heard a song somewhere. And it takes me back to the memory of what I felt as a kid hearing this versus what I'm feeling now. It could bridge a lot of emotions with that memory.
Annanda: When Zuogwi said the phrase “sacred memory,” it reminded me of the moment Pastor Moss told us about, when the hologram and photo exhibit helped his parents remember their slain friend Rev. Dr. Martin Luther King, Jr. Sacred memory lets us look at what’s happening right now, and also think back to where we have been—to feel the contradictions, to be deeply connected in them. So we asked Zuogwi to share a moment when he’d experienced sacred memory himself.
Zuogwi: A member of my home church passed away. And she had a nephew that me and him grew up with.
We were already in the space in which we had grown up and learned so much. When I heard "God Is" played on a pipe organ, it took me back to a time when me and her nephew were in the balcony of the church playing around. We dropped a program from—this balcony's kind of high—onto whoever was sitting below us.
We didn't care. We were 10, 11 years old. And it was a point in time where, my parents were going through a divorce, and we were sitting in a funeral. That memory right there brought a mixed emotion.
You ever watched that movie with the girl, "Inside Out", and how you can experience joy and sorrow within the same emotion in the development of growing up?
It was literally that moment.
That's what I believe sacred memory is: the ability to understand historically what we, as a people were going through, what you're individually going through, but then also being able to smile and look at that moment. And also cry at the same time cuz look where we are right now. Look what's happening right now. But look where we have come.
Keisha: As Zuogwi said, Black people made holograms through imagination and vivid word pictures, long before there were technological holograms made with photography and video.
It makes sense that hip-hop culture, an art form rooted in wordplay and vivid pictures, would also become a space in conversation with hologram technology. Holograms in hip-hop have two sides: to help community members remember their dead and to help the industry that keeps artists and creators producing.
Annanda: What's the meaning of holograms in Black hip-hop culture? And where does that bring resilience?
Zuogwi: We should be in the 11th anniversary of seeing the first hologram at Coachella, which celebrated the patron saint, Tupac Amaru Shakur. Can I tell you how excited I was to see it on YouTube, because I could not afford a Coachella ticket in 2012? I think that experience of seeing Tupac in a digital form to me was quite amazing because we lose him tragically.
And so to see this experience of celebrating a very radical artist who centered African Americans, centered the struggles of Black people, centered the struggles of Black women, though being chastised by the US government or whoever. The excitement of seeing someone celebrated that way in a community, engaging that idea.
I think I remember seeing an interview with Snoop Dogg, and he talked about how emotional it felt just to be standing beside him after not seeing him for almost, I'm guessing, 15 or 20 years. {insert this clip https://youtu.be/TVzbapajkbI?si=4iVLXf4VA9CPwFmY&t=22 starting at 0:22-0:36} The idea of holograms, from a hip-hop perspective, started historically for me at that point.
I think media and how music portrays, how different voices speak to you, are all forms of, holograms in my opinion.
There's verbal, there's visual, there's also this intrinsic spirit that you can feel as well. And so when those all meet together at that Coachella moment, we don't know how much money it took to do that kind of, let's say, conjuring. This kind of conjuring only happens when there's some kind of capital involved for someone to make money, whether it be the family or the concert.
But it's always amazing to see how imagination jumps from just our brains into reality.
Zuogwi: Chance is one of the special artists that I love because, you can tell the maturity and also you can hear the development of oneself over time, right?
Annanda: Chance the Rapper is an artist who has explicitly named that he does not wish to be made a hologram after he dies. Hip-hop offers us the possibility of moral repair — by empowering us to tell our own stories and own our own bodies. That’s very different from the moral distress and despair when our narratives are told and sold by others.
Zuogwi: If you were to just listen to Chance the Rapper at the time of, let's say, Acid Rap, you have this reality that, wait a minute, Chance the Rapper was dealing with a major drug addiction. You know, you come out with this album 10 Day Suspension, which is also a classic within itself as well. And you've gone from just being a senior in high school to experiencing all these different things, and then you also have this addiction.
And sometimes in dominant culture's idea of a hologram being preserved that way, it might only preserve the dominant culture narrative of that one person, right? But there's an article where he talks about how right before his grandmother passed away, she looks at him and attempts to banish the spirit of addiction from him.
That I found very intriguing. And he thought that she was banishing the spirit of his artistry. No, his grandmother was performing as a healer. Sometimes it takes a moment to see that this person is plagued with these things because of the exposure and needs a resetting. And that's what his grandmother did. And I think that's why he says he doesn't want to be a hologram, which is okay, but we can always tap back into where are you at the moment when you heard, “what's good gangsters, what's good people?”
“I wonder if I call God, will he answer?” “Jesus’ Black life ain't matter” Who, like—
Annanda: “I talked to his daddy.”
Zuogwi: Yes, “I talked to his daddy”. Yeah. Yes. “I wonder if I call on God will he answer?”, Black artistry is our hologram.
That delicate dance we do of secular and sacred, preserving our indigenous traditions.
Annanda: Zuogwi values storytelling mediums that allow us to live out our stories on our own terms.
Zuogwi: I really want us to think that when we pick up books, these are holograms of stories that people have preserved so that you can know about this, right? Or this whole idea of quotes: listening to someone's voice because your mind can take you back to a time when you've seen them in the flesh, you know?
Annanda: Is there anyone's recorded voice you listen to, to take you back to a time you saw them live, in person?
Zuogwi: In 2015, my father moved back to Liberia, West Africa, and that was during the height of the Ebola crisis.
And subsequently, my father passed away because he was not able to return to the States to get some of his medicine for diabetes. At the height of grief, dealing with the funeral logistics, there was one day where I just decided to go through my voicemails to be like, do I have my father's voicemail anywhere?
And, this is like June or July. He like called. He called me from Africa and he leaves this voicemail. Thank goodness to technology. I was able to save that voicemail and to preserve that voicemail to today. I can go into my voice recording and just, I've edited only to the part where I can just hear “I love you,” you know? Mm. And the importance of as much strife that we had as father and son and you know, Black men and their dads, they always have this little rift sometimes. So for that to just be there that I could tap into is great. Or also being able to tap into recording my dad singing a Teddy Pendergrass song at one of his friend's houses or being able to preserve some of the voices of the elders. I am grateful for the opportunity to preserve stories.
And not just to base it off of my own brain's capability, but also to have technology to say, well, maybe I want to use this part of this story, so let me just listen to what this person said. So, yes, there are ways that we could use technology to continue on the legacy of resistance.
As long as it doesn't come expensive to save space on the cloud.
Keisha: Memory is kind of like time travel for you.
Zuogwi: Memory—it's not even kind of. I think it is time travel for me.
Annanda: And also time travel as emotion, right? Like you are literally put back into a different place and time, but also the feelings that came with it. You are literally embodying that moment captured in time. In the memory of your body too.
I think that's so powerful and often not talked about, but that is a Black sense of memory. Of time travel.
Annanda: Would you preserve yourself as a hologram?
Zuogwi: No, I wouldn't preserve myself in dominant culture's hologram experience, but as in leaving the bread crumbs so that more theory, more ideas, more spaces of freedom can occur from what I write or what I say, yes, that's the kind of hologram preservation I want.
Annanda: Appreciate you, brother. Appreciate you.
Zuogwi: Thank you.
[Closing music starts]
Keisha: I'm still taking as a threat the idea Pastor Moss shared that Disney or Coca Cola or some other corporation could one day sponsor a hologram of sharecroppers or Black elders singing “Deep River” in the field. I just, I find that obnoxious. Um, last year, last June, I went to a plantation in South Carolina.
It felt like it was trivializing Black memory for the sake of a tidy tourist experience.
Annanda: Yeah, that trip was horrific. One, I don't put it past Coke or anybody for that matter. These streets are not safe. There's one thing America has taught us: to sleep with one eye open, right? Mm hmm.
But oh my gosh, that plantation experience haunted me, haunted me. I remember when we asked, could we see the graves of those enslaved? And we could not. And they said that they have reached out to the descendants and those who are descendants have access.
And also, they didn't call them slaves at first. As I recall, they called them “workers.” I was like, who was earning wages?
Keisha: Right, right.
Annanda: “The workers.” I mean, the, the attempt to sanitize—
Keisha: —and flatten.
Annanda: Yes. The horrors. I was like, you have got to be kidding me. And they didn't even really talk about those who are enslaved.
They mostly talked about like what the culture, um, like the different rooms of the house and what they held. I'm like. We are on a plantation like—
Keisha: It's not about the bricks. I'm not interested in the bricks. I'm interested in the person who made the bricks.
Annanda: Come on now.
Keisha: Yeah.
Annanda: Come on now.
Keisha: Yeah.
So I can totally see companies using holograms and history that kind of way, to shore up a story that only includes certain kinds of people, if we don't rein things in right now.
But we do have some more respectful options, and I'm glad that we got to talk about an example of something that went in a better direction with the Moss family.
Annanda: Yeah, and that better direction was rooted in relationship, right? Like, uh, Otis Moss III was talking about how, you know, how they felt comfortable because they're people that they had worked with before. And I think relationship continues to be a key in how we approach technology and innovation and actually do right by the greater community, because there is a human connection called relationship that we're nurturing and fostering, as opposed to a transactional relationship, which is what that plantation was about, which still to this day hosts many weddings, many events, you know.
Keisha: Prom pictures...
Annanda: Prom pictures, my blood still boils. I'm like, and everybody, nobody, people don't see the issue. And very specifically, white people do not see the issue.
So yeah, relationship. That is what I'm leaving with is how can we make sure that we're in a relationship and how, how do we look to the past, even the past experience of the plantation, my God, and see, you know, how could, how could we make this better? I got some ideas off the bat.
Keisha: Yeah. But the descendants of the enslavers do not own the property. That's number one.
Annanda: Agreed. My grunts, for those of you who can't read them, have more to do with... oh my, I mean, just.
Keisha: Yeah, it's overwhelming.
Annanda: Yeah, of what is owed. Like monetarily, relationally, what is owed?
Keisha: So part of the theme, I think, is that how we remember is not just about the layers of facts. Some of these sites, whether plantations or big houses or otherwise, go through the raw history and lay out a timeline, and they might include some primary sources, and that's a quality of remembrance, but that's not the fullness of it.
I think we hear from both Reverend Moss and from Zuogwi a richness in memory: a connecting that memory allows us to do with both the past and the present, connecting us to the future; making the stories of the people who experienced whatever it was vivid; helping us understand their motivations and concerns and values; and, uh, asking us questions: what would we do differently now that we've experienced the story? It pulls us into responsibility instead of just spectating.
Annanda: Yes. I have a friend whose mother, uh, just died. And as I think about looking at the plantation, as I think about Otis Moss Jr. and where he is in his age, right?
We're looking at holograms as telling our stories and ancestry, and, uh, hearing the story of how my friend, who’s Muslim, helped to wash her mother's body, because in the Muslim tradition you need to bury the body within 24 hours, right? Yes. How she washed the body, how she knew what her mother wanted, the particular prayers prayed over her mother's body, uh, because there were prayers that came to her in a dream, right?
Of what her mother would say or what her mother-in-law would want to say, right? Um, and I use mother and mother-in-law interchangeably because technically it's her mother-in-law, but for her, it was also just her mother. That's the relationship that they had. And hearing the story of Pastor Moss sharing his parents' grieving of the loss of Martin Luther King and not having the ability to grieve, I think about so many of us today in our respective communities, Black or otherwise. How so many things die. So many people die, but also aspects of life or ourselves die, and we do not have time to grieve.
And how do we hold on, uh, to the meaning and the memory of what has passed, not just out of joy and nostalgia, but also, you know, the meaning of how do we carry that wisdom, that life, that love into the future. And so when I think about my friend who's, you know, washing her mother's body, doing all these things at the grave site, which in some cases, depending upon where you are geographically, as a Muslim woman you might not be able to do.
How she carried on her mother's legacy of independence, of care, of self-determination. And I wonder for us, when we think about holograms and memory, how do we carry on the rich legacies of our ancestry? You know, kind of what Zuogwi was talking about, and to a certain degree, Pastor Moss.
And how do we, you know, use that sankofa wisdom of going back to the past and fetching, um, fetching what is wise, what is good, what is pertinent right now for the present, as we intentionally, uh, know that whatever we're doing in the present is creating a future. In order to do that, you have to be in relationship with the past, not haunted by it, you know, not traumatized by it.
That's all I'm gonna say right now.
Keisha: No, no, I feel you.
In fact, this reminds me a little bit of the book Beloved by Toni Morrison, where the past haunts people because the past has not been resolved yet. Yeah. And so, like, the interrupted griefs that you're talking about, where you experience a trauma but you aren't able to process it for various reasons: those incidents haunt you in a way until you come to terms with them, until you're able to fetch that wisdom, and until you're able to be buoyed up by the past in a way that helps you face what is now and what's to come. So I think of Beloved, and the ways that memories can bring communities together, and can also wrench communities apart if not held with love. Our memories, when held in good community and with respect for the ancestors, can totally strengthen us collectively.
Annanda: Oh, I think that's the collective prayer for the United States because we do not, we don't do the past.
Keisha: We do not do that. *laughs*
And the struggle, and because of that, those same struggles evolve and continue. And I think that, to me, is the fear underlying, like, the fear that I think this podcast addresses when it comes to technology, when it comes to AI, and specifically this episode when it comes to holograms: because we don't deal with our past.
Well, we have not healed those wounds. You know, having certain days of celebration, that's not it.
It's not enough. No.
Annanda: And because we don't have that skill as a society and a country, we continue to pay the high, high prices.
CREDITS
Annanda: I’m Annanda Barclay
Keisha: And I’m Keisha McKenzie.
Annanda: The Moral Repair: A Black Exploration of Tech team is us, Ari Daniel, Emmanuel Desarme, Courtney Fleurantin, Rye Dorsey, and Genevieve Sponsler. The executive producer of PRX Productions is Jocelyn Gonzales.
Keisha: This podcast is part of the Big Questions Project at PRX Productions, which is supported by the John Templeton Foundation.
---
SHOW NOTES
• About the Maltz Museum exhibit on Rev. Dr. Otis Moss, Jr.: https://www.youtube.com/watch?v=diHzrepVY_U&t=19s
• About aiEsther, the bot based on Esther Perel: https://www.youtube.com/watch?v=vSF-Al45hQU
• About museums and human remains: https://theconversation.com/us-museums-hold-the-remains-of-thousands-of-black-people-156558; see also https://www.science.org/content/article/racist-scientist-built-collection-human-skulls-should-we-still-study-them about the U of Penn Museum
• About Hologram-2Pac at Coachella with Snoop Dogg: 2012 NPR story with still from the performance
• About the Henrietta Lacks family settlement: https://www.statnews.com/2023/08/01/henrietta-lacks-family-settlement-hela-cells/
Machine Learning: What’s Good?
Season 1 | Episode 3
Is it possible to control AI? If so, how can we make it more ethical? Damien Williams, PhD, a philosopher of science, technology, and society, has some ideas. Annanda and Keisha chat Marvel movies, Ultron’s bad manners, and what lessons machine learning could take from the world of plant medicine and harm reduction.
-
Episode 3: MORAL REPAIR - Machine Learning: What’s Good?
INTRO
Keisha: In Marvel Studios’ movie Avengers: Age of Ultron, Tony Stark and Bruce Banner make a super-intelligent computer.
[Audio: light curious background music.]
INSERT AUDIO: Clip from The Age of Ultron: Avengers - Age of Ultron - Born to Kill
[1:01-1:10] Ultron: What is this? What is this, please?
JARVIS: “Hello, I am J.A.R.V.I.S., you are Ultron…
[1:19-1:26] Ultron: Where’s your body?
JARVIS: I am a program. I am without form.
Ultron: This feels weird. This feels wrong…
[Audio: keyboard click]
Annanda: Ultron is supposed to defend Earth. But when JARVIS, an older AI system, introduces Ultron to the internet, Ultron wakes up—shushes JARVIS—and it’s game over.
[1:59-2:03] Ultron: Oh no.
JARVIS: You are in distress.
Ultron: No. Yes—
[2:03-2:15 ] JARVIS: If you will just allow me to contact Mr Stark—
Ultron: Why do you call him ‘Sir?’
JARVIS: I believe your intentions to be hostile.
Ultron: Shhhh… I’m here to help.
[Audio: keyboard click]
Annanda: “Shushed.”
[Audio: silence… then both laugh]
Annanda: I’m so upset by that. I’m so upset by that.
Keisha: I know. This is like a breach of Black politeness or something!
Annanda: Yes! It’s like you were just born! You were just born! What do you know?
And so… once Ultron’s on the web, scanning videos, news articles and all the things people have uploaded… in seconds, he learns who we are at our best… and our worst.
[Audio: Show music fades in]
Keisha: Like God before the Flood in the Hebrew Bible, Ultron looks at everything humans know, have made, and have ever been… and decides the biggest threat to Earth… is humanity itself.
Annanda: I’m Annanda Barclay, a moral repair researcher, chaplain, and death doula.
Keisha: And I’m Keisha McKenzie, a technical communicator and strategist.
Annanda: This is Moral Repair: A Black Exploration of Tech. A show where we explore the social and moral harm of technology… and share wisdom from Black culture to heal what tech has broken.
[Audio: Show music ends]
So, Keisha, in the movie, Ultron gets trained on the internet, and immediately starts threatening people?
Keisha: Yeah, it’s a quick slide down to hell. *laughs* Garbage in from the internet and the news, and then garbage out: He framed human existence as garbage, and we’re in big trouble!
Annanda: Way to work with trash. So in today’s episode, we wonder: how much of a threat is artificial intelligence in real life? And will it shush you?
[Audio: curious background music fades up]
Keisha: I think shushing humans is the least of our concerns with AI, but I feel like you’re really moved by that.
Annanda: I’m—ooh. Just don’t go around shushing people.
Keisha: Yeah. We want to ask: Can we control AI? And if so, what can we do to make it more ethical?
Annanda: And we know AI is evolving quickly, right? There’s billions of dollars invested in the tech. It’s already in our email, browsers, our socials. It’s used to approve or deny welfare benefits…
So… can we expect our future to turn out like the tale of Ultron? Or are we overreacting?
SEGMENT A - Can we control it?
Damien: Ultron learned very specifically from a certain set of inputs. And it took those inputs and it expanded on them down a particular line of thinking.
I actually thought that that was something they could have spent a little bit more time with, and it would've strengthened the core of the movie in a potentially interesting way.
Keisha: This is Damien Williams, a philosopher at the University of North Carolina at Charlotte. His work focuses on the moral value of technologies like artificial intelligence.
Bias
Keisha: So when I heard that a lot of the language models were trained on the open internet, and even Reddit perhaps… it gave me a lot of pause.
Damien: Yeah.
Keisha: And it goes back to what you were saying about whether the people who created the movie Age of Ultron could have spent more time thinking about what Ultron was being fed before Ultron woke up.
Annanda: And luckily for us, Williams wrote an article on this very topic, and it starts like this —
[Audio: background music ends]
Damien:
"Recently I learned that men can sometimes be nurses and secretaries, but women can never be doctors or presidents. I also learned that black people are more likely to owe money than to have it owed to them. And I learned that if you need disability assistance, you'll get more of it if you live in a facility than if you receive care at home.
“At least that's what I would believe if I accepted the sexist, racist, and misleading ableist pronouncements from today's new artificial intelligence systems. The creators of these systems promise they'll make our lives easier, removing drudge work such as writing emails, filling out forms, and even writing code.
“But the bias programmed into these systems threatens to spread more prejudice into the world. AI-facilitated biases can affect who gets hired for what jobs, who gets believed as an expert in their field, and who is more likely to be targeted and prosecuted by police.”
Damien: One of the things that we increasingly find when we build artificial intelligence and algorithmic systems, so-called AI, we find that they do things that seem surprising to us when they're given a set of parameters and starting points and then told to complete a task. They then go about that task and somehow we are surprised by some of the outcomes. So there's AI systems which are built to try to predict whether somebody is going to re-offend if they're given bail or parole.
And those algorithms have a very clear racist, prejudicial bias in the predictions that they make. They almost always give a lower probability of re-offense to white subjects and a higher probability of re-offense to Black and other racial minority subjects. And this is because the training data that they are given to work with is built out of the kinds of determinations that are made by judges and by cops about how one should view those kinds of subjects.
It is based on projection layered over top of what is a field of data, right?
So if you're given a set of measures that says, okay, this person committed aggravated assault four times in the past. They have a history of drug use, and they were found with a weapon on them, and that person happens to be white.
They will be given a moderate likelihood-of-re-offense score within that algorithm, whereas someone who has an aggravated assault, no history of previous offenses, no weapon found on their person, a history of drug use, and happens to be Black, will be given a high likelihood of re-offense.
Keisha: Mm-hmm.
Damien: You'd think that how many times this person has re-offended, how many times this person has been within the system previously, would be the markers that drive the prediction those systems put out on the other side. But it is very frequently not the case that those things line up that way.
Keisha: There's so many places that this technology is already being used. Given the bias being programmed in, does the trajectory make disproportionate impacts feel inevitable to you?
Damien: Yeah. Not inevitable. There are very few things I think of as inevitable in this world. Um, they are predictable. They are foreseeable. We can understand from the nature of how these systems are built what they are most likely to do if left unchecked. But that's the crucial part: if left unchecked. If we don't take steps to intervene on them, if we don't work to put regulations in place to change the values, to change the beliefs, to change the understandings of the people who build these systems, who commission the systems to be built in the first place, who ask for certain types of functionality within these systems, who deploy these systems, who then hold the results of these systems up as inviolate gospel truth, written-in-stone laws of the universe.
If we don't change those things, then the high likelihood of what comes out the other side is an exacerbated bias, exacerbated prejudice. It's a higher likelihood of disproportionate harm to those already being harmed.
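The pattern Williams describes, biased historical labels producing biased predictions, can be sketched in a few lines of Python. Everything here is a made-up toy (hypothetical groups "A" and "B", invented labels), not any real risk-assessment system:

```python
from collections import defaultdict

def train(records):
    """records: (group, features, labeled_high_risk) tuples."""
    totals = defaultdict(lambda: [0, 0])  # group -> [high-risk count, total]
    for group, _features, high_risk in records:
        totals[group][0] += int(high_risk)
        totals[group][1] += 1
    # The "model" is just the historical labeling rate per group.
    return {g: hi / n for g, (hi, n) in totals.items()}

# Hypothetical training data: the case facts are identical, but the
# historical labels (made by humans) were skewed against group "B".
history = [
    ("A", {"priors": 1}, False), ("A", {"priors": 1}, False), ("A", {"priors": 1}, True),
    ("B", {"priors": 1}, True),  ("B", {"priors": 1}, True),  ("B", {"priors": 1}, False),
]

scores = train(history)
# Same facts in, different risk scores out: the bias lives in the labels.
```

The toy "model" never looks at the case facts at all; it simply reproduces whatever skew the historical labelers put in, which is the "garbage in, garbage out" dynamic in miniature.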
Regulatory Capture
Keisha: So it is possible to control this technology, and we should, and we should direct it toward collective wellbeing.
How hopeful are you that the trajectory, the current trajectory, can be turned toward wellbeing?
Damien: I'm always hopeful that it can change. I'm not, in this particular moment, encouraged by the direction. [pause] Most of the work being done right now, when this is being brought into the light, when it's being brought into the realm of public policy and into law, the vast majority of the work in the United States especially, is to bring in as experts the corporations who are doing the work. And that results in what we call regulatory capture, right? The space in which the very people who are supposed to be regulated by these policy changes, by these laws, are the people who are writing the policy, writing the laws, or at the very least “advising” on them.
Annanda: So yeah, there’s a lot of bias built into AI and algorithmic systems. And some of the biggest AI producers influence how the government regulates the AI sector.
So, Sam Altman, the CEO of OpenAI, was up before a congressional hearing a few months ago now. When people got very concerned about what ChatGPT was doing, he was in front of a congressional committee, and he ended every statement about problems and potential concerns by telling the senators, “and we would love to help you with that.”
AUDIO from Altman’s reply to Senator Blumenthal (41:49-42:45):
(41:49-42:08) Altman: My worst fears are that we cause significant—we the field, the technology, the industry—cause significant harm to the world. Uh I think that can happen a lot of different ways. It’s why we started the company. Um it’s a big part of why I’m here today. And why we’ve been here in the past. And have been able to spend some time with you.
(42:08-42:22) Altman: I think if this technology goes wrong, it can go quite wrong. And we want to be vocal about that, we want to work with the government to prevent that from happening. But we try to be very clear-eyed about what the downside case is and the work that we have to do to mitigate that.
(42:22-42:45) Sen. Blumenthal: Thank you. And our hope is, that the rest of the industry will follow the example that you and IBM (Ms. Montgomery) have set by coming today, and meeting with us, as you have done, privately, in helping to guide what we’re going to do so that we can target the harms and avoid unintended consequences.
Keisha: You create the problem, you create the solution too.
Damien: Exactly. And so every step of the way, his response was, we wanna be the ones in place to help craft these tools so that we can be the ones who guide how it gets deployed in the future.
At no point does any of what they are actually saying indicate they want real regulation. When the Federal Trade Commission here in the United States actually started to step up, Sam Altman's response was, we're very disappointed that secret internal emails were deployed in such a way that the public got a misleading view of what's actually happening within OpenAI. But all of these things were retrieved by the FTC in due process of their power as a federal agency. None of this was secretive. It was the FTC doing its job. But because it stood in the path of what Sam Altman understood as his job of guiding people to understand what AI's really about, he saw it as an imposition. He saw it as a threat.
If he was truly interested in being regulated, he would go, okay, that's a step towards regulation from a body that regulates. I'm not exactly happy about it, but it is their job and if they would like us to comment or if they would like us to elucidate any of the things that they're looking at, happy to do so.
And that's a very different position than how dare you, right? Which was their first response.
Keisha: Right? Right. And will probably be their enduring response. 'Cause if it's a regulatory capture position, then it's kind of like a teenager being asked about what time curfew should be.
Damien: Whenever I come home and go to bed, that's my curfew.
Keisha: And don't ask me where I've been.
[Audio: New background soundscape]
Keisha: Earlier this year, the White House published some voluntary guidelines for AI development focused on trust, safety, and security—but there was a carveout in those guidelines for the military, the police, and the government itself. If they won’t have to follow those guidelines, where does that leave the rest of us?
Keisha: One of our previous guests mentioned security back doors being installed into text messaging apps that many of us use in our everyday lives, in the name of national security. And so there are these spaces where we as consumers have become used to giving up the protection of so-called private conversation because of that larger cultural story about protecting “us” from the “other.”
Damien: Yeah.
Keisha: And what we've learned in the last, like, 30 years of surveillance culture expanding “beyond, beyond” is that the standard that supposedly protects Us against the Other inevitably comes home to us.
Damien: That’s exactly right.
Keisha: I'm a recent immigrant, and so, like, I remember in the early 2000s, when you'd come into the border airports, having to give your fingerprints, and not just fingerprints but then digital prints. And then within 7-10 years, everybody was giving ten prints and eyeballs in the airport. And it just made me, I guess, sensitive to the ways that you need a culture of solidarity to counter this culture of mass surveillance.
Damien: Yes. And you know, it's difficult to have that culture of solidarity. Because there is a convenience culture in the United States in particular, but in the west writ large, the idea of “I just want this to work” is built into so much of our technology.
In a very real sense, it's why you have the divisions between personal computing that you have, right? You have Windows and you have Mac. And then if you wanna get really deeply fiddly with your technology, you can become a Linux user.
Keisha: Which most people will never be.
[Audio: Background soundscape ends]
Damien: If you tell people, here's this system that will allow you to literally change absolutely everything about it. You are in control but you're also responsible for all of the changes that you make. And if you make the wrong change, your entire system might just stop working because something doesn’t agree with something else now. You'll be able to get it back. It'll, you know, come back for you. But you'll have to do the work to get it back. Most of them will be like, why would I ever want that?
Keisha: Right. You become a god and then you can brick the universe.
[Audio: lightning and percussive background audio]
Damien: Exactly. Most of the people I know who work in IT and security are Linux users, because that's what they want to be able to do. The vast majority of the people who need a laptop, a tablet, a phone, a computing system of any kind for their daily lives, they just want it to work.
And so the trade-off for it just working is you're increasingly being told that you have to agree to this provision in the terms of service. You need to agree to this amount of data being collected. You have to allow us this algorithmic application within the system. You have to let us use your camera if we so choose to.
Our ability to have a thing that just works shouldn't be tied to this increasing divestment of our right to consent, that should not be the trade-off when these tools are such a foundational and fundamental part of our everyday lives. Some interest is served by that and it's not necessarily the users.
Concrete Examples Of Harm
Damien: Certain groups of people are just foundationally, fundamentally oppressed within those frameworks, to a degree that automating, algorithmizing, and just having that all done at the push of a button is only going to make it drastically worse. And it has made it drastically worse.
Keisha: I'm wondering if we can offer, like, a plain language example, and was thinking of maybe insurance decisions?
[Audio: percussive background ends]
Damien: Yeah. Algorithmic insurance decision-making systems recommend lower standards of care for Black patients and specifically Black women. Because the kinds of correlated markers for decision-making within AI systems that they're fed as their data, in a human system, those are all balanced against each other.
Human judgment says, okay, well what are the determinate factors that are actually at play here? An AI system correlates across the board and says, well, these kinds of markers, these kinds of recommendations from doctors tend to be correlated with lower outcomes and lower survivability for this group of people. So why spend the money at the outset?
And that correlates with lower survivability because those people tend to be given lower standards of care to begin with. What course should this person take? What drugs should they be given? What trials are available? Maybe this person falls into a category where they need something outside of the norm of what we tend to think of for these conditions. Those kinds of thoughts, those kinds of extra steps are taken less often with Black patients.
BREAKPOINT / Music /
SEGMENT B - Building more ethical systems
Keisha: We’re talking with Damien Williams about artificial intelligence, what can make it ethical… and whether ethical AI and ethical algorithms can help us make a good future for everyone.
But what do we mean by “ethical”?
The field of science, technology, and society looks at tech from the perspectives of the people who make it and the people who are impacted by it. Technology comes from us but we don’t get to collective well-being by default. We have to make intentional choices about tech from the start, and that means the human values underlying those design choices count for a lot.
[Audio: Synthy sound ends]
Human Values Shape Tools
Damien: One of the core research questions within science and technology studies is that idea of how values make their way into human-built systems, into the tools, the systems that humans create.
We have this habit of thinking that human-created things, somehow because they're artifacts or because they're systems or 'cause we think of 'em as representations of natural forces or concepts that they don't have the hallmarks of human beings bound up within them.
You know, it's just math or it's just science. It's apolitical, without values, without perspectives or prejudices or biases. But as we've come to see increasingly over the past several years as we've gotten more and more entangled with AI-type systems in our lives, um, that's just not true.
And it never really has been true. Everything that humans make carries the hallmarks, the fingerprints of human beings. It carries human values in it in some way, shape, or form.
And the key thing is to try to figure out how and what that means. What does that do? What can we do with it?
Defining “Good”
Keisha: Dr. Williams, you’ve used the word “should” a couple of times in your responses, and so let's drill down under the should: The ethics that are underlying your sense of what is good, or that should be challenged about AI, what’s the source of those ethics?
Damien: Yeah, I think that's a very important question. There’s this kind of assumption that we all mean the same thing when we talk about ethical standards, and not enough understanding that we don't mean the same thing within the United States when we talk about ethics, let alone when we start to think about other cultures in this world.
That question of what do we mean by good? Good for whom? Good for what purpose? Are we talking about goodness in terms of certain things that simply should never be done?
There are certain things that I think AI shouldn't do, ever.
[Audio: mysterious background sound starts]
Those include being used towards carceral applications, determining jail sentences, determining who should be arrested, who should be prosecuted, who should be surveilled more often. Because the foundational nature of carceral interaction is, by its very nature, oppressive. It disproportionately harms some group or another for no reason other than that some in power say they ought to be harmed.
As you said earlier, “those people” is us. Eventually it comes back to us, because if someone in power with control of those tools gets to use those tools to determine that some group is bad just because they're bad, then that's just who happens to be in their focus today. And if they manage to properly corral or even eliminate all of those people, they're gonna shift focus, and who's next?
Now, there's no meaningful way to have control over literally everything that we put into the world. Because we put things into the world with every breath we take and every interaction we have with another human being or any other thing in the world, we just do.
[Audio: mysterious background sound ends]
But when it comes to systems that gather up the things that we make, that learn from those things, that build upon those things, whether those systems are technological or whether they're interpersonal and human, we should have some measure of say, in whether we approve of that or not.
If somebody takes up my writing, if a human being takes up the things that I've written, that I've done, and they say, I'm going to use these writings to justify some atrocity. And they go off and they commit, you know, horrible acts and they say that it is in my name. I should be able to have a very meaningful say in going, no, you don't speak for me, and you have fully and completely misunderstood everything that I've tried to do. You don't get to use me that way. That's a conversation that we can have.
When it comes to the way that these systems are currently being built, there's no way for me to say, I don't approve of this. I don't approve of this tool being used to plagiarize or to misrepresent someone's work or to cheat on a paper. I don't approve of this tool being used to generate misinformation and disinformation in the world and to, to muddy the groundwork of knowledge and the ability for people to understand what's actually happening. I don't consent to my work, my writing, the things that I've done in the world being used to make this world a less understanding and understandable place.
I don't agree with that. And I think that's harmful at base.
The Wisdom Of Lived Experience
[Audio: Gentle lo-fi hip-hop background track]
Keisha: What do you think Black culture contributes to collective wisdom about technology and science?
Damien: Uh, so much. *laughs* Look at the strategies of resistance and survival, but also the strategies of flourishing, the cultural, communal mechanisms of building family and building community that [exist] within black communities, but also within other minoritized and marginalized groups. How we know how to live in a space, how we navigate certain types of interactions in the world. Each one of those things is built out of a piece of knowledge.
There's a host of questions, um, that built a system of, you know, kind of getting people to dig down on their assumptions. It sits in the same tradition as Peggy McIntosh's invisible knapsack: what do you know? What don't you know? And it's questions like: without looking, how many exits are there from here to the lobby? And which one can you reach the fastest without encountering too many people? What do you do when a police officer pulls you over? What's the highest you can reach unassisted? What kind of things do you struggle with whether and when to tell a new romantic partner? How do you walk to your car at night? Where are your keys?
Each one of those questions is a lived experiential knowledge set that certain people will have an immediate visceral reaction and answer for. Certain communities have a known knowledge protocol built around that exact scenario, and certain other people hear that question go, what does that even mean?
That's what this kind of engagement with and focusing on marginalized experiences allows us to build for these systems. It's a kind of knowledge that is otherwise opaque to the vast majority of the people doing the designing of these systems. If you don't understand what the question “Where are your keys when you walk home at night” means, if you don't understand what the question “Where are your hands when a police officer stops you” means, then you're missing a vast wealth of knowledge about the world, about the way that people interact and live in the world.
To be able to include those kinds of questions, to be able to include those perspectives directly into the building of these systems allows for us to have systems that actually represent the world in a real way that they're missing right now.
[Audio: Soundscape ends]
Keisha: Which means the mental model has gaps.
Damien: Massive gaps. They simply cannot access the real world. And not only can they not access the real world as it is, they can't figure forward what the world might be like, what it could be like tomorrow.
Plugging The Gaps In The System
Keisha: So if Damien Williams were running a company like a much improved OpenAI, what would that company look like?
[Audio: Lo-fi beats fade in]
Damien: The main issue is where are these tools being trained from? How are they being trained? What's the data inside them and how is that data being manipulated? Because one of the key things that a lot of people don't fully get yet is that it's not just the training data itself, it's the architecture of the system. It's the specific operational algorithms that the system is built to take and how it is built to manipulate that data. You have to have a set of protocols that tells the system how to do the work you want it to do, and built into those protocols are biases, values, perspectives, prejudice based on who is asking the question and designing protocols.
[Audio: beats end]
Keisha: Let's say you have flour and eggs—these are your ingredients—you would need to create protocols on top of those ingredients that say “bake a cake” versus “make pancakes.” Or fritters.
[Audio: Jolly background music with kitchen sound effects]
Damien: That’s a perfect example because if you give me flour, eggs, milk, and butter I can make all kinds of things with those four things. You say, I would like some food please. Well, you might get a cake, you might get pancakes, you might get some bread, you might get pasta and cream sauce. I don’t know what food you’re in the mood for.
Keisha: It's the human influence to take the ingredients and then shape the protocol.
Damien: Exactly. The way that I'm told to do something with those ingredients that will determine what comes out the other side. Who builds those protocols into these systems? Who tells these systems what to do and how to do it? That's one of the kind of foundational and fundamental things that I would start with changing.
[Audio: jolly background sound ends]
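Keisha's kitchen analogy maps neatly onto code: the same "ingredients" (data) passed through different protocols yield different products. A minimal illustrative sketch, with all names invented for the analogy:

```python
# The same four "ingredients" (data), run through different protocols,
# yield different products. All names here are invented for the analogy.
INGREDIENTS = {"flour", "eggs", "milk", "butter"}

def bake_cake(pantry):
    # This protocol demands flour, eggs, and butter.
    return "cake" if {"flour", "eggs", "butter"} <= pantry else None

def make_pancakes(pantry):
    # This protocol demands flour, eggs, and milk.
    return "pancakes" if {"flour", "eggs", "milk"} <= pantry else None

# Identical inputs, different protocol, different output:
cake = bake_cake(INGREDIENTS)          # "cake"
pancakes = make_pancakes(INGREDIENTS)  # "pancakes"
```

The point of the sketch is that the data alone determines nothing; the protocol, written by a person with particular aims, decides what comes out the other side.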
When OpenAI started back in 2015, I was intrigued and semi-hopeful. And then I saw who their board was, and at that point in time there were zero women and zero non-White people involved in that project whatsoever. It changed over the next year or so. But who was going into that space was still disproportionately white and male. And that has remained true.
I'm not saying that someone who is white and male can't learn about the world. In fact, I'm desperately hopeful that we can all learn about each other's experiences in the world when they are different from ours, because it is crucial that we do so.
But my learning about somebody else's lived experience is not the same as letting the person who lives that experience, who has that direct knowledge, direct the work.
Uses and Misuses of AI
Keisha: In June 2023, The Verge reported that only a third of the people they surveyed had already used generative AI-based tools like ChatGPT. It’s being used in education and business: to draft writing, summarize records, and save administrators and programmers time.
There are also some positive uses for AI in the arena of cultural preservation.
Damien: There's people who are using large language model frameworks to revive dead Indigenous languages. There are currently living speakers taking audio and the written language and using it to teach new generations to connect with their native Indigenous culture.
That's amazing. But that's a very specific framing of language and conservation and culture. That's not about asking that system to tell you how you ought to perform a ritual within that culture. And to some people, those seem like the same thing, but they're not. It can string together a grammatically correct chain of sentences that are statistically acceptable, that use words like ritual, culture, community. But the truth content of any of that, and whether it matches the culture in question, is incidental at best.
[Audio: dramatic, adventurous background sound]
Keisha: So there are reasons to be cautious, especially as we barrel into another season of election and campaign communication. It matters a lot right now whether something is true, or merely within the “statistically acceptable” range of what we, as the audience, show the press, the campaigns, and the social media platforms we want to read, see, and hear.
In spring 2023, a major US political campaign started using AI-generated images of opposing candidates. As the AP reported, “Generative AI can not only rapidly produce targeted campaign emails, texts or videos, it also could be used to mislead voters, impersonate candidates and undermine elections on a scale and at a speed not yet seen.”
Casual TikTok and WhatsApp chain messages are one level. Political campaigns are a whole order more serious. As fast as this tech is changing and being adopted across society, we have to help people learn to parse truth from credible nonsense.
[Audio: soundscape ends]
Damien: I don't think that we can simply, um, rest on some kind of blanket prohibition against these tools. We have to think very differently, because there's a need that's being met, there's a value that is being satisfied. We have to be able to demonstrate to people the places where, okay, it does cut down on time, it can make things better for us. But we have to be careful, because, again, these things make stuff up.
They're not trying to tell you the truth. They're not trying to give you a clear, real picture of the world. They're concerned with a statistically acceptable output.
They want to give you the thing that has the highest likelihood of you accepting it. The thing that has the highest likelihood of making you happy with its output. And that's very different from the truth. This is about confirmation bias: the framework that we already hold to be true being fed back to us.
What Shannon Vallor calls the AI mirror is this notion of our putting ourselves out there and having it reflected back. I also see it somewhat differently, 'cause it's not just being reflected back to us. It's being amplified.
Keisha: Philosophers like Vallor and Williams are warning us away from AI misinformation and confirmation bias.
Damien: There's this book, written in 2004 by Harry Frankfurt, called “On Bullshit.”
Annanda: Y’all, this is a real book!
Keisha: *laughs* Yes, a friend gave it to me about 10 years ago.
Annanda: Harry Frankfurt spent over 50 years writing about knowledge, ethics, choice, and character—virtues we desperately need in this time, especially as we think about the moral impact of tech and machine learning in our society. He died in summer 2023.
Damien: Frankfurt calls bullshit “not-truth.” If I tell you the truth, I'm trying to give you a clear picture of the world as we understand it. If I'm lying to you, I'm trying to give you a picture of the world that directly, intentionally, specifically contravenes my understanding of the world to get you to understand something differently. I’m trying to deceive you. If I accidentally tell you something that isn’t true, but I believe it’s true, I’m not lying to you. I’m mistaken. But I was trying to tell you the truth.
Bullshit is none of that. Bullshit is just saying stuff. I don't care about the truth value. I don't care about whether you're deceived. I don't care about whether you actually believe me. I'm just trying to serve the end of you liking me. That's what these systems do. That is all that they do. That's their knowledge stance. Any truth or falsity that they produce is not intentional. It is incidental.
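Frankfurt's point about "statistically acceptable" output can be illustrated with a toy bigram model, vastly simpler than any real language model but sharing the same knowledge stance: it emits whatever most often followed the previous word in its training text, with no regard for truth. The tiny corpus here is invented for the example:

```python
from collections import Counter, defaultdict

# A tiny invented "training corpus" that happens to contain a falsehood.
corpus = "the sky is blue . the sea is blue . the sun is blue .".split()

follows = defaultdict(Counter)  # word -> counts of what came next
for a, b in zip(corpus, corpus[1:]):
    follows[a][b] += 1

def most_likely_next(word):
    # Emit whatever most often followed `word`; truth never enters into it.
    return follows[word].most_common(1)[0][0]

# "is" was followed by "blue" every time in the corpus, so the model will
# cheerfully continue "the sun is ..." with "blue": frequent, fluent, false.
```

Any truth or falsity in the continuation is, exactly as Frankfurt and Williams put it, incidental: the model optimizes for what is statistically acceptable, not for what is so.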
/ Closing music starts up /
An Ethic for Everyone
Keisha: Even Aristotle talks about the core characteristic of intelligence: looking toward wellbeing for the individual and for the group. Not just looking for the wellbeing of a single company, not just the wellbeing of any single ethnic or cultural or economic group, but thinking about the impacts for a wide spectrum of people, and not just today but also into the future. It’s an intelligence ethic.
[Audio: gentle lo-fi transitional sound fades in]
Damien: Yeah. And if we're going to try to build something that we are willing to call artificial intelligence, we could do worse than having intelligence be tied to how well we seek to take care of ourselves and each other.
We could do worse than having consciousness and community be a relational act, rather than being some metric that we measure.
Keisha: I do think we can build tech ethics on caring for everybody. And maybe it sounds idealistic because of the stories we’ve allowed to shape our imagination about technology and ourselves.
I watched 2001: A Space Odyssey this year. And HAL is a murderous computer in that movie. From HAL to Ultron, who looks online and says “Nope!” to our species—I love a good apocalypse story! But maybe these stories about runaway tech that’s independent from us don’t help us reshape how we use machine learning today or guide its course for tomorrow.
So maybe we can start to tell a different story. What do you think?
Reducing Harm
Annanda: Well, first off, I think… we should not be shushed.
*both laugh*
First things first. This is not how we build relationship.
Keisha: That’s right.
Annanda: It’s not by shushing. That’s the first.
The second is I went to the Oakland psychedelics conference this year, 2023. And a huge topic of conversation was harm reduction within our communities as it comes to plant medicine or just drug use in general.
I think some of the strategies around harm reduction, or even the consideration of harm reduction, that are happening in community streets in both urban and rural areas across the country could be implemented in the tech sector. And the idea is that harm reduction is done in community, right?
Somebody is using a particular drug or substance. Is there addiction? Is there not addiction? But regardless, how do you make sure that they have access in a safe way?
How do you make sure that if there is an addiction, right, that they are not seen as a criminal for an act that could be or frankly could not be self harm, right? Because that criminalization now just adds a whole host of other issues.
But you need to be involved in the community and make sure that folks have spaces to be who they are going to be, in a safe non judgmental way. There is a “there” there with harm reduction that I would love to see explored as it relates to generative AI.
Keisha: It makes sense to me that it could be a window into something different in part because there's always been this overlap between the tech communities, developers, the billionaires, and communities that have been able to use some of these substances even before they've been available to the general public.
Annanda: Mmhmm
Keisha: And it's been used in those communities under the rubric of, like, optimizing your life, optimizing how your brain works, or optimizing your body so you're more productive. I don't think that's what we're talking about, just re-inscribing that ideology. We're talking about learning from the communities that have always natively been in good relationship with the plant world and treated it as medicine. As you say: in community, under the care of people who have trained and practiced for years, who are attentive to the impacts of these substances on people and understand how to hold them with care as they might use them.
I love that sensibility and I do hope we create more space for it.
Annanda: Yeah, yeah, agreed. I hope so too, you know. And even for the folks that aren't as, you know... because we talk about certain psychedelics and that real beautiful space around them, but then we look at crack, right? And you have folks that have been using crack forever, and for some reason we criminalize them because they're using crack, even though they're getting the same thing somebody gets from a mushroom trip. Their drug of choice is different, but they're getting a similar thing, and there's a narrative around crack that is different than the narrative around psilocybin.
Keisha: Mm-hmm. 'cause of the social context around it.
Annanda: Exactly. And so the harm reduction for psilocybin is far more accessible than the harm reduction for crack.
There's a ton of harm reduction for crack, but most people wouldn't know those communities because most people wouldn't darken those doors.
Keisha: right
Annanda: And so I wonder, for whatever the version of crack is out there within the tech world, within this particular topic, who are the communities doing harm reduction in those spaces, as opposed to, like, the accessible psilocybin, which I would argue is your ChatGPT, where folks are cool with it for the most part. It has traction.
Keisha: And it’s trendy.
Annanda: It’s trendy, right? So yeah, harm reduction is really central to this, and again, as with almost every episode, that comes from being in relationship and in community. And so the gap technologists have from the average everyday person needs to be mended, and the diversity of the human population needs to be reflected in tech.
Keisha: In design and conception, in development, in material sourcing, in all of the layers that make the system go.
Annanda: Yeah, I just don't get it. You'll literally have a better product. Like, would this not actually help your bottom line? Because it actually meets the needs of more people and understands the breadth of the human experience in a more intentional way. That, to me, is the rub. That is the “shush.” It's like we're going to have just a narrow group of folks create technology for the global majority, and that dog just don't hunt, as is said in the South.
Keisha: And what was it Auntie Lorde said?
Annanda: Oh, you can’t dismantle the master's house with the master's tools.
CREDITS
Annanda: I’m Annanda Barclay.
Keisha: And I’m Keisha McKenzie.
Annanda: The Moral Repair: A Black Exploration of Tech team is us, Ari Daniel, Emmanuel Desarme, Courtney Fleurantin, Rye Dorsey, and Genevieve Sponsler. The executive producer of PRX Productions is Jocelyn Gonzales.
Keisha: This podcast is part of the Big Questions Project at PRX Productions, which is supported by the John Templeton Foundation.
###
[PRX Sonic ID]
### 45:07 ###
SHOW NOTES
NEW: Come talk to us on social media! We’re at @moralrepairpodcast on Instagram and @MoralRepair on X (Twitter). You can also reach us by email: moralrepairpodcast at gmail dot com
The Verge surveys Americans on who’s using AI tools and what worries them (June 2023)
A 2020 note in the Federal Register on how US border-crossing tech expanded and evolved: first for so-called “aliens” (non-citizens) and then to other categories of immigrant or citizen
The Associated Press reports on concerns about generative AI producing disinfo during the 2024 election cycle (August 2023)
Harry Frankfurt’s On Bullshit (2004)
A philosophy anthology where listeners can find the Aristotle essay Keisha and Damien mentioned and many other reflections on science, tech, and human values from the fields of technical communication and science, technology and society: Philosophy of Technology: The Technological Condition: An Anthology (2nd edition)
Annanda’s closing nod to Black poet, professor, and theorist Audre Lorde references a 1979 conference address later republished in the storied collection, Sister Outsider: Essays and Speeches (Penguin, 1984/2020)
Web3 & The Pursuit of the American Dream
Season 1 | Episode 4
How has the American Dream transformed in the wake of the Great Recession? Annanda & Keisha examine the impact of the Great Recession on the American Dream focusing on the rise of Bitcoin and blockchain. Through the lens of bell hooks’ philosophical perspectives, they explore the deeper moral stakes. Featuring a captivating conversation with Adorable Earthangel, a web3 entrepreneur and spiritual technologist, who offers unique insights on how to navigate this new landscape.
-
INTRO
[montage]
“It has no intrinsic value. You can't hold it. Often you can't spend it. So what is it about cryptocurrency?”
“It turns out that the internet and with it the web are entering a new era known as Web3.”
“Bitcoin nearing the $35,000 mark again.”
“We have progressively built up a Bitcoin position up to 140,000 Bitcoin over the past two and a half years.”
“I've discovered that countries in Asia are building for a Web3 future.”
“hitting fresh highs, now up triple digits. Bitcoin... Crypto... EtherToken... Cryptocurrency... Cryptocurrency... Cryptocurrency... Crypto... Crypto... Crypto…”
Keisha: Oh, that was a lot.
Annanda: This whole world of cryptocurrencies and Web3 is a lot.
Keisha: It feels overwhelming, a little opaque, as if, if you're not already knee-deep in finance, you're not supposed to understand it. It also seems like the only people talking about it are crypto platform founders recruiting us to join.
Annanda: So let's talk about it. Welcome to Moral Repair: A Black Exploration of Tech, a show where we explore the social and moral impacts of tech advances and share wisdom from Black cultures for healing what tech has broken.
I'm Annanda Barclay, a death doula, moral injury researcher, and chaplain who's concerned with a life well lived and cherishes the relational stories people share.
Keisha: I'm Keisha McKenzie, technical communicator, strategist, and advocate.
Today on Moral Repair, what are Web3, Blockchain, and Bitcoin? What fell apart in finance and other social institutions that made these technologies possible? And will these kinds of technologies really create the opportunities we've been promised for people to thrive socially and financially?
<music>
SEGMENT B
Annanda: We're going to start with bell hooks, the late Black feminist who talked about interlocking systems of domination. We've drawn this clip, on her definition of those interlocking systems of domination and why they matter, from a documentary made several years ago.
bell hooks: “I began to use the phrase in my work, white supremacist capitalist patriarchy, because I wanted to have some language. That would actually, um, remind us continually of the interlocking systems of domination that define our reality.”
Keisha: bell hooks was a force. She was so clear. She once said we can't make much progress if we transform images without shifting paradigms, changing perspectives, ways of looking.
Hold on, hold on. This episode is about Web3, blockchain, and cryptocurrency, right?
Annanda: You ain't lying. I know. Yes, it is, Keisha. Just bear with me.
Keisha: What do you want us to look at and think about differently today, Annanda?
Annanda: Her words about white supremacist capitalist patriarchy take me back to college, which was during the Great Recession of 2008.
[montage]
“If you are watching us from the last home you'll ever own tonight, consider yourself lucky.”
“I think this is the most significant financial crisis in the post war period.”
“Soaring gas prices, falling home prices, and rising unemployment.”
“May was a bad dream. It was a bloody year on Wall Street and a real capitulation for a system run amok.”
Annanda: At the heart of the Great Recession was the financial system's unchecked speculative behavior, driven by a profit motive without concern for the broader social or economic implications. This aligns with bell hooks's critique of capitalism, which prioritizes accumulation and profit over human wellbeing.
Keisha: We've talked about profit over people a few times this season, in the data grabbing that makes algorithms work, in how companies sometimes capitalize on our history, in unregulated machine learning systems. What do we need to understand about how the supremacist and economic patterns played out in people's lives during the Great Recession?
Annanda: Well, Keisha, the racial dimension of the housing crisis is notable. Predatory lending targeted Black and Latiné communities, saddling them with subprime mortgages. The disproportionate effect on these communities is in line with historical practices of systemic racism in the financial system, which is the white supremacist part of bell hooks's framework.
Economic downturns often exacerbate gender inequalities. Women, and particularly women of color, faced higher rates of unemployment and economic insecurity during the Great Recession.
The patriarchal structures of society place women in more vulnerable economic positions. That's just patriarchy, right? Black philosophy, in general, is about the interconnectedness, or intersectionality (shout out Kimberlé Crenshaw), of systems. Meaning in this case, for bell hooks, the white supremacist capitalist patriarchy is an interconnected system of domination. The Great Recession is an example of how these systems can work together.
The capitalist drive for profit led to risky lending behaviors. The white supremacist system ensured that these behaviors disproportionately affected people of color, specifically Black and Latiné populations. And the patriarchal system further compounded the effects on women. Central to her life's work, bell hooks often talked about the importance of solidarity and resistance.
The Great Recession saw the rise of movements like Occupy Wall Street, which critiqued the capitalist system and its inequalities.
Keisha: I remember Occupy. It felt like that was a generation that had just lived through 9/11 and the march to war in Afghanistan and Iraq, and they were just waking up to how wealth makes power, and power has always flowed downhill in this country.
But it wasn't all critique. It was also about vision and the kind of world they wanted to live in, right?
Annanda: Exactly, Keisha. Through this lens, such movements can be seen as forms of resistance to the white supremacist capitalist patriarchy. And the Great Recession was an economic event whose causes and effects can be examined through bell hooks's framework, revealing deeper systemic issues in society.
Now, you have to understand, economics and financial stability are a fickle and often painful reality for Black people globally. But as it relates to the American dream, there's a history there.
We've got the Great Depression. We've got the time after the Great Depression through the Civil Rights Movement, and we've got up to today. Mind you, I left out slavery. But clearly, for Black folks, the economy and finances are a sore spot, and it hits not just our hearts and minds, which is often talked about, but, frustratingly and obviously, our bank accounts.
Keisha: So the American Dream says you get a white picket fence and a lawn of your own. Until 2008. The Wall Street banks are approving mortgages for people who couldn't have possibly repaid them. They're selling bundles of these mortgages back and forth amongst each other.
And they're gassing up the mortgage insurance policy industry to make themselves money. They specifically targeted black and brown people when they did this. And it was a dangerous, dangerous game because when people lost their jobs, they lost their homes. Mortgage insurance groups like AIG couldn't pay up, the economy collapsed, and trust in institutions collapsed with it.
Annanda: It shows the lack of care and accountability to the people these financial institutions are allegedly supposed to serve.
Keisha: All right, so take us back. You were in college in 2008.
Annanda: Yes, right. In the middle of the recession. I went to school in middle America, southern Illinois to be exact, and witnessed for the first time the economic plight of rural white folks.
And it was the first time I saw the economics of white supremacy impact working class white people in a way that was all too familiar. Bell's framework of white supremacist capitalist patriarchy was coming alive for me and my understanding of the world.
It was right in front of me, Keisha. I couldn't deny it. It was the summer working at ACH Foods Factory between my freshman and sophomore year of college that changed my life in a very particular way. I realized that while not equal, there are more people having a shared economic experience than I might have been led or taught to believe.
And it was my first time realizing that white supremacy hurts white people too, even while it's in service to them. And it definitely hurts some more than others, depending upon where you are economically, you know? Just like patriarchy harms men, even as it gives them license to dominate.
Keisha: It sounds like a really intense realization. What was going on for you back then?
Annanda: I got this factory job as a summer job, and everyone, including myself, was laid off. To my relief, as a privileged college student, I was able to go back to school full time without a full-time job. I remember going to Walmart, and as I was walking in, Sally, which is not her real name, whom I worked alongside every day at the factory, was walking out. Our eyes met and we both started tearing up, because we knew... the difference of our lives.
Sally was caretaking for her current husband who had a terminal illness. They met at the factory. This was her second husband. Her first husband she also met at the factory and he died of what I want to say was cancer. Her son was a mechanic at the factory and that particular factory had been open since 1956 in this tiny town in the middle of Nowhere, Southern Illinois. There were no other job opportunities that could sustain the lifestyle the factory had provided. And unlike Sally, I was in college, right? I worked at the factory because the recession hit my family hard. In fact, I worked two jobs. The factory and Aerie by American Eagle at the Springfield, Illinois mall.
Both of our families had these economic setbacks, but Sally and her family, because they lived in a small town, were stuck. And so much was exchanged between us and our watery eyes. And it was in that moment that I saw, for the first time, that the American dream doesn't really work for anybody. That's the thing about interlocking systems of domination. The jargon of bell hooks, the imperialist white supremacist capitalist patriarchy, was just laid bare. No one came out unscathed.
Keisha: That's that American dream becoming a nightmare, right? So when you talk about this woman, a wife, a mother, caring for her husband, working in the same factory as her son, I can't shake the image of Rosie the Riveter.
That “We can do it” energy from mostly white American women during and after World War II. They were making waves in industry and labor and setting up the feminist movement. And of course Black women had always worked outside of their homes, but for women like Sally it's 50 or 60 years of glass ceilings and second shifts: working all day and then coming home to work up to 19th-century sexist expectations. And then by the time you meet her, that whole system is just coming apart.
Annanda: Exactly.
Keisha: So you said that year with Sally and your colleagues at the factory was your first time seeing how poor white folks lived. Was that what had surprised you about what was happening at the factory?
Annanda: No, it was definitely the first time I saw poor rural or working class white folks, but no, it didn't surprise me, but it also didn't make it any less heartbreaking.
I'd seen the recession take its toll elsewhere by that point; 2007 was very clear on that for me. But upperclassmen also weren't getting jobs, and folks who graduated were moving back home or going to graduate school sooner than they would have planned, if they'd planned to attend at all. To graduate college and not find a job that paid enough to start your adult life was unheard of; falling on hard times seemed to be a common experience amongst many of my peers. So people were being cast aside in the wake of a faltering economy, and the American Dream was just... fading, just disappearing before my very eyes.
SEGMENT C
[montage]
Barack Obama: “The news from Wall Street has shaken the American people's faith in our economy. The situation with Lehman Brothers and other financial institutions is the latest in a wave of crises that have generated enormous uncertainty about the future of our financial markets”
Keisha: Back then, I was in the Southwest, not far from the Texas Panhandle. It's a part of the state that people lovingly call the armpit of Texas, because it's right in the part where, you know, the panhandle sticks up. Sidebar that New York's hot garbage has nothing on manure winds coming from Plainview, Texas.
Annanda: That speaks volumes, Keisha.
Keisha: Oh my god.
But the kind of desperation that you're talking about was so common then for the young folks who were coming into this big college town from small country towns. They were cattle farmers, and kids of cattle farmers and chicken factory workers, going to school in the hopes that they would be able to escape their parents' fate.
These were also still functionally segregated areas, where white folks who'd grown up locally were warning me to be careful in the less developed east side of town, where the Black folks and the Hispanic folks lived. It was wild to me, because they were all struggling. But they didn't seem to know that they were all facing that similar oppression and could be allies.
Annanda: Yes.
Keisha: But didn't the term hillbilly originate as a slur for poor whites in the rural South? They've organized with African Americans around their shared oppression at so many different points in U. S. history, from the 1600s to the Civil Rights era. And that's made them a target for cynical politicians who want to use them rather than change their conditions. And it still does.
Annanda: Exactly. Exactly, Keisha. Poor whites are often invisibilized, just like Native Americans and immigrants. I shouldn’t say just like, but similar to Native Americans and immigrants. That helps to paint a picture of what was going on in the United States when Bitcoin was first revealed to the public.
The 2008 financial crisis is in some ways a springboard for cryptocurrency. There was a lot of moral despair, injury, and pain at that time.
This is why, in part, Bitcoin's debut in 2008, which is historic in its own right, is noteworthy.
Keisha: We got there, folks! Bitcoin!
Annanda: Yes. Thank you for bearing with us. But I think that background is really important to understand the world that these cryptocurrencies were born into. OK. So Bitcoin started on a new version of the internet that has the potential to disrupt or at least minimize predatory lending and denial of purchase by racial apartheid. It could support the economic and moral repair of the economy.
Keisha: So Satoshi Nakamoto first wrote about Bitcoin in a white paper on Halloween 2008. Now, we still don't know if Nakamoto is an individual or a group because their identity has never been revealed.
But the concept of Bitcoin represented a new technological promise, an answer to the American dream that was falling apart. Instead of what we have now, centralized banking from the same Wall Street banks that tanked the economy in 2008, Bitcoin promised a decentralized way to move money. Bitcoin runs on transparency through something called a public digital ledger. We asked Adorable Earthangel to break this down for us; she uses technology as a way to understand how people can connect spiritually and share power.
Adorable: Rather than relying on the banks who are centralized, it is in the structures of the blockchain that you have a ledger, which has all of the transactions.
Annanda: Bankers literally used a big book to log transactions. It had columns and rows for tracking deposits, withdrawals, and running balances. That big book was their ledger. And in crypto, unlike at your local bank, these ledgers are visible to everyone.
Adorable: The difference between the bank is that the bank knows how much money you have and knows how much money everybody has. But with blockchain, everything's visibilized.
It allows us to play with this concept of money and currency in a way that takes it away from a centralized banking system. There's people who've set up these different servers, and we're using the community to create a bank.
Keisha: So what if logging cryptocurrency were like building a Lego tower with a group of friends? So every time you add a new Lego block to the tower, it would have a note explaining who added that block and where they put it. No one would be able to change the note or remove the block without everyone else seeing and knowing.
That's what makes the tower like a blockchain, a chain of blocks or notes that everyone can see.
Adorable: I know that if I send this wallet this money, everyone can see the transaction so it's verifiable. So the blockchain in essence, creates a lot of things, it creates a trustless system, meaning that you don't have to have trust in what's happening because we can all see what's happening.
Whereas with the centralized model, I can't see what's happening, I don't know where they're investing my money. I don't know if the bank is using my money to invest in guns or other kinds of things.
Annanda: Or banks selling subprime mortgages to an investor who sells to another investor and so on
Keisha: With the group Lego analogy, you and your friends would keep track in the Lego ledger every day. Who brought which Lego pieces and how many. This ledger helps to ensure that no one loses their Legos and everyone gets back what they brought.
At least that's what it sounds like. Am I on the right track, Annanda?
Annanda: Keisha, you're totally on the right track. And, let's be real, if we're playing with Legos, I want all my Legos back. [LAUGHING]
If only one person keeps the Lego ledger and decides what gets written in it, that's a centralized system. And this is how the banks and Wall Street operate. In the game you play with your friends, if they make a mistake, lose the notebook, or are unfair, you all might disagree.
Someone might lose some Legos. But because the centralized Wall Street ledgers are at the heart of the U.S. and global economy, if they make mistakes, millions of people might lose everything. During the Great Recession, the government had to give these banks billions of dollars. They bailed them out to make sure they could cover the mortgages and losses on their ledgers.
Keisha: So, Annanda, you've taught me one more thing that sounds pretty cool. We've said blockchain creates a ledger and it makes that ledger visible to everybody. There's another step to it. Instead of there just being one notebook, everyone building Legos has their own copy of the Lego ledger.
Then after playing, everyone gathers and updates their notebooks together, making sure they all match. This way, even if one person loses their notebook or tries to cheat, the group can check their notes against each other. That's what makes the ledger decentralized. Everyone has a record and everyone works together to keep it accurate. In the world of blockchain, the Lego ledger is digital. So every time a Lego, or in reality a piece of data or a transaction, is added, it's like adding a new note to our notebook.
And just like our decentralized system where all friends keep their ledgers in sync, in blockchain tech, multiple people or computers around the world store copies of the ledger and work together to make sure they match.
This process is called consensus. Bitcoin reaches it through a mechanism called proof of work; some newer blockchains use proof of stake instead. Either way, it pretty much makes it impossible for any single person to cheat or change past records without everyone else noticing.
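[Editor's note: the Lego-ledger idea can be sketched in a few lines of code. This is a hypothetical, stripped-down illustration, not Bitcoin's actual implementation; the function names `make_block` and `verify_chain` are invented for this sketch. The core idea is that each ledger entry commits to a hash of the entry before it, so rewriting any past record breaks the chain in a way anyone holding a copy can detect.]

```python
import hashlib
import json

def make_block(prev_hash, transaction):
    """Create a ledger entry that commits to the previous entry's hash."""
    payload = {"prev_hash": prev_hash, "transaction": transaction}
    # The block's hash covers both the transaction and prev_hash, so
    # tampering with any earlier block changes every later hash.
    block = dict(payload)
    block["hash"] = hashlib.sha256(
        json.dumps(payload, sort_keys=True).encode()
    ).hexdigest()
    return block

def verify_chain(chain):
    """Re-derive every hash; any edit to a past block makes this fail."""
    for i, block in enumerate(chain):
        payload = {"prev_hash": block["prev_hash"],
                   "transaction": block["transaction"]}
        expected = hashlib.sha256(
            json.dumps(payload, sort_keys=True).encode()
        ).hexdigest()
        if block["hash"] != expected:
            return False  # the block's own note was altered
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False  # the link to the previous block is broken
    return True

# Build a tiny three-entry ledger.
chain = [make_block("genesis", "Alice pays Bob 5")]
chain.append(make_block(chain[-1]["hash"], "Bob pays Carol 2"))
chain.append(make_block(chain[-1]["hash"], "Carol pays Alice 1"))
assert verify_chain(chain)

# Try to cheat: rewrite an old transaction. Every copy catches it.
chain[1]["transaction"] = "Bob pays Carol 200"
assert not verify_chain(chain)
```

[Real blockchains add networking, consensus rules, and proof of work on top of this basic tamper-evidence, but the hash chain is the "chain of blocks or notes that everyone can see."]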
Annanda, have you used this tech before? I haven't.
Annanda: Oh, oh yeah. No, I have. I've, I've definitely, I have gained and lost in this.
Keisha: Okay. Cause I, I heard about the big crypto rush during the pandemic and honestly it made me feel a little wary, but I do know what it's like to buy a stock online and it feels a little bit unreal to me whenever I do it because it's all virtual.
It doesn't feel like real money. Like, I can log on to, say, Fidelity and tell the site to buy a stock for me, it will pull the money from my checking account, which is also not real money because it's online, I can only see the numbers. And since I'm not going to sell the stock, it will just sit there on the screen in a list.
I can't see the underlying ledger, like it sounds like I would be able to see it with crypto, but at least I can see the transaction I just did. Is that that much different from buying Bitcoin or other crypto coins?
Annanda: Actually, not at all. I think the difference is literally volatility. And I, I cry, because in 2020 I had to sell the majority of my crypto in order to, you know, make ends meet. If I had had the financial stability to HODL, I'd be in a very different financial situation than I am today. For those of you who don't know, HODL, according to Investopedia, is a term derived from a misspelling of "hold" in the context of buying and holding Bitcoin and other cryptocurrencies.
So basically all that means: set it and forget it, or buy it and don't touch it.
Keisha: Hold on for dear life. Isn't that what they say up there?
Annanda: That is, yes.
Keisha: Okay.
Annanda: Yes. And that is the way, Keisha.
Keisha: That's what I thought you were doing.
Annanda: It is the way . Yes, yes. Hold on for dear Life is right y'all. And don't touch it. Don't touch it. Unless you ran into a situation like me, you absolutely had to touch it and then cry. Get yourself a bottle of Jack and cry.
You know, death by a thousand economic paper cuts, alas.
The use of blockchain ushered in the new technological age of the internet called Web3. While the United States government was busy at work trying to keep our economy from collapsing, which also means the global economy from collapsing, Bitcoin quietly came online as if it was a direct response to our 2008 crisis.
Or arguably, a logical and moral referendum on our banking system. So, what makes Web3 different from Web 2, or what all of us today refer to as the Internet?
Brian Brooks: “What makes Web3 different is the ability to own the Internet. The actual network and that's what crypto assets themselves represent is an ownership stake in an underlying network. So when you hear people talk about, for example, layer 1 tokens, what they mean is this is your reward for providing the ledger maintenance services, the computing power to the network that on web 1 and 2 was done by Google.
So now people in my hometown of Pueblo, Colorado can actually own the Ethereum network.”
Annanda: That's former Bitfury CEO Brian Brooks, explaining and defining Web3 to the United States Congress at a hearing over a year ago.
Keisha: So the crypto network is kind of like a co-op ownership model, where because you participate in it, that means you own it and get to make decisions about it.
Unlike in the regular internet or banking system where the people who make decisions are Google and Microsoft and Fidelity and Schwab, not me, no matter how much money or effort I put in, I might have understood that for the first time.
Annanda: 100%. You know, he makes it really clear that Google owns the internet, Facebook owns the internet... er, Meta. Sweet Jesus. Zuck. Meta. Um, but crypto is the reward. NFTs, not necessarily; NFTs are kind of different. Web3 is a co-op.
Keisha: Okay.
Annanda: The current advertisements for Web 3, like the metaverse from Mark Zuckerberg, seem to be disconnected from the lives of everyday people. Hashtag prayers for Zuck. Hashtag prayers for Meta.
Keisha: I can't. [LAUGHING]
Annanda: Prayers by obligation. The obvious question is, like, why should we care about Web3 and these forms that these tech billionaires are introducing?
The Great Recession created a rift that has only continued to widen the racial wealth gap to this day. 53% of all Black wealth was lost during the recession. And by Black wealth, I'm specifically talking about African American wealth, African Americans with a history of chattel slavery. And so, I wonder... how Web3 could be used to repair what has been economically lost due to the white supremacist, capitalist patriarchy. It seems so disconnected from our everyday lives.
Keisha: Yeah. So it's a whole experience for those who want it, and I don't even think that having a space where you get to play out your imagination is the thing that makes them attractive, or a reason that people should care about it.
I think it tracks back to what you were saying earlier about ownership and authority over your story and authority over your life. The underlying ideology behind Web 3.0 takes us away from the system of domination that has just morphed and changed its form so many times in the last 400 years.
Is it that it puts the power back where it should always have been—with people? If that's so, then yeah, I definitely want to know more about it.
Annanda: So do I. That's why I reached out to Adorable Earthangel, founder of the Deeper Tones Collective, who we heard from earlier. The Deeper Tones Collective is a social impact company that creates Web3 products and events specifically for the Black community. Adorable has navigated the Web3 world as a Black woman… and she's made it work for her.
Adorable: I produced a couple of festivals called Black MetaFest that were held in the metaverse. And it was pretty phenomenal to bring people into that landscape. When I was researching what was going on within the Web3 community, the crypto community, I found that Black people, Black women, women of color, we were very much still marginalized in the ownership of all of these digital assets, and also in the types of NFTs and products. I don't want to buy that and be like, that's me, that's my identity, this monkey thing.
Annanda: By "this monkey thing," Adorable is referring to NFTs. NFTs or Non-Fungible Tokens are digital assets that can be used in Web3. For example: art, baseball cards, or commemorations of some sort. It’s like a digital collectible item. Think Beanie Babies or pokemon or baseball cards. It was like that. But with random animals and made up creatures. what Adorable’s saying is — "These animal things don't reflect my culture or what I value"
Adorable: NFTs come on the scene and I'm studying blockchain technology and I also happen to be an artist. So what do I do? I start creating NFTs.
I start getting into this movement and unpacking how to not just be a consumer but a creator. I'm completely embraced by the Web3 community. I start creating NFTs, I start consulting within that. I bring my product management skills and everything that I know. And that's kind of what led me to get into Web3.
I guess at my core is service to community and service to people. I've always been fascinated by the wellness of community. It came to me in a dream. I was like, wait a minute, we create our own money. I could create money. Why don't we just put Harriet Tubman on the $20 bill? We've been talking about that forever. And so I did that within the crypto space. I created Black money, and it was something that I was really excited about.
Annanda: Forever we've been trying to put Harriet Tubman on the $20 bill.
Keisha: Right.
Annanda: Right, and so I think what Adorable is talking about is Black people as faces on U.S. currency in particular. Forever we've been debating Harriet Tubman on the $20 bill. She has yet to show up on it.
We still have, I believe, is it Thomas Jefferson? Or Andrew Jackson? Some slaveholder is on the $20 bill. I think what Adorable is doing is saying the country has taken too long. So, if y'all aren't going to put Harriet Tubman on it, I'm going to go and make an NFT, and that NFT is going to have Black people on different forms of fake, quote unquote, US currency, but it's real in terms of an NFT.
Keisha: Mm hmm. And it is Andrew Jackson.
Annanda: He was gnarly, man.
Keisha: There's a New York Times note from 2021 that says Biden's Treasury will seek to put Tubman on the $20 bill. That was two years ago. It took, like, 30 years for Tubman to get her federal pension from the Army.
Annanda: Good God, America, see? See, that's why Adorable went ahead and made black money.
Keisha: And they were probably waiting for her to die, because life expectancy was shorter back then.
Annanda: Wow. Wow. She's African Americans' Joan of Arc, in terms of how she literally heard the voice of God and would act, and that's how she knew how to move between the North and the South. Whew.
Annanda & Keisha: Okay.
Adorable: I put it on a blockchain called Solana. It's still there; Solana was a newer blockchain at the time. It was one of the best experiences I've ever had. I did it without money too. I made all of that happen within my own means. It opened the door for me to see what's possible.
Annanda: I asked Adorable what she was seeing in the web3 industry that made her want to create a startup focused on the Black community.
Adorable: One, there aren't that many software engineers who are Black people, Black women, people of color in tech, period, who are making the technology. I think people think that San Francisco is diverse, and we think that the space is diverse, but here's what you have to understand about technology:
They want to bet on a sure thing. Capitalism within technology is about winning. It's about solving problems, but solving them in a way where you're creating a return on investment.
I worked at a venture capital firm, so I understand what's really happening here. It's almost a self-fulfilling prophecy, right? Because money is building technology, and the technology is solving problems, but it's solving the problems of other technologists and industry; it's not really solving the problems of everyday people. That's why the disparities in poverty are so huge.
There's a moral crisis within technology, and the moral crisis is this: within venture capital, there are more diverse funds now, there are funds for people of color, for immigrants, those are happening, but they're few and far between.
The majority is traditional technology that is solving for things that support the middle or upper class. It's this feedback loop that's happening over here, but it's not going all the way down into the everyday problems of everyday people.
Annanda: Adorable's using decentralization as another form of looking at intersectionality, right? She's going beyond identity into aspects of our relationships with others and with the self.
Adorable: I mentioned before blockchain technology being a decentralized set of nodes. You are a decentralized set of nodes. You, Annanda, are this node that has different smart contracts within each node.
That's real. There's a smart contract called your relationship with your mother. There's a node that's called how you relate to food. There's a node for your relationship with your mind and your mindset. And you can update your smart contract for all these different parts of you.
You're not one thing, you're decentralized. We all are. And I want to talk about how in our decentralization, we can heal ourselves at all these different micro levels. We're all connected on our healing journey. And when we come into that oneness and that understanding. And we do the work.
There's nothing that we can't do as a humanity.
[Closing music starts]
Annanda: If bell hooks were alive today, because she hasn't been gone that long, what would she be shaking her finger at, and putting her hand on her hip about? 'Cause she does it all.
Keisha: yes, she does
Annanda: And she was a woman of deep faith, in particular Christian faith.
Keisha: Yeah, I would say I don't often hear about that side of her life and contribution.
I think she would be in cahoots with the network of scholars and activists who are trying to describe ways forward, imaginative ways forward, that help to build a different way to thrive in and beyond the system that we inherited. A different way of building wealth, a different way of building influence, a different way of achieving success. I think she would be skeptical about the crypto hype, but she would be curious about the potentials it opens up, and she would want to see the economists and the technologists and the critics making real talk amongst themselves about the risks, and mitigating the risks for the people who have little.
I don't think she'd be easily swept away by the claims that it would fix everything. I don't think she thought anything would fix everything. But I think she would wonder about who's being used to message this change, whose image is being put on the thing, who's determining what has value within the system, and what new images or representations would help us see a way forward. That's what I would expect from her.
How about you?
Annanda: That's excellent, I have to sit with that.
Keisha: It was a great question, it was your question.
Annanda: I mean, it's true. I'm talking about your response, I have to sit with that.
Keisha: Oh, thank you.
Annanda: But thank you, I'll take it, because you got the solid question. So I'm gonna take that up and pat my back right here.
Keisha: Yes, yeah, it was a great question.
Annanda: You know, I think with bell, it's hard for me to look forward, and what I mean by that is we are still so stuck in these systems of domination. Out of all of the works that bell hooks has published, I think she's published over like 40 titles, something insane.
She read a book a day. Her self care was a book a day, right?
Throughout the themes of these books there is still this echo, this talking about undoing the system of white supremacist, capitalist patriarchy. Part of me wants to be like, you know, I hope she could wrangle some of her colleagues that are part of the economic minority, right?
Who can speak and bend those ears and say, how are you having your people do this work? I think that's where I'm stuck. There are very few things in which I hold little to no hope, but economic mobility, closing the wealth gap in the United States is one of them.
Keisha: As in you have no hope that that's possible?
Annanda: As in, no, I believe it's possible. I have no hope that those who have the agency to shift things because they actually have the economic upper hand will.
Keisha: So we've talked a lot about what individuals can do, but what about the government?
Annanda: What about the government? I feel like it is truly a government responsibility because the lack of access for affordable housing or even to own a home in the United States historically for Black folks, and other historically marginalized groups, but in particular for Black folks and Native folks, has been made inaccessible due to government policies and apartheid economic policies as well.
For me, this really is an issue that the U.S. government is responsible for and needs to correct. And do I think that the U.S. government, in all of its glorious functionality...
Keisha: And shutdowns.
Annanda: Yes, and shutdowns. All of it, Keisha. I do not believe that the U.S. government will do the right thing by righting the wrongs of the years of apartheid for Black people.
And I say that because it has had the option to do so, and it has yet to do so. I want to be proven wrong. Please prove me wrong. I don't think it's just on private interests and entities, but I do not believe the U.S. government will do the right thing.
Keisha: We started the episode asking if it was technologies that would create the opportunities we've been promised, and what I'm hearing from both Adorable and you is that so many of these promises have been broken.
And individuals and collectives and cities and some states have been making experiments in fulfilling the promises, whether by local-scale reparations projects or by giving people direct funding so that they can buy or refurbish houses that have been in their families. So those are local, small-scale experiments that have been happening below the level of the federal government, but there's still a missing layer there.
And that's the overall system, and the direction in which it's pointed: whether it's pointing to people's flourishing or to some people's profit. And I think we end the episode with that question still: where does the promise get fulfilled, and who's responsible for making decisions that take us in that direction?
CREDITS
Annanda: I’m Annanda Barclay
Keisha: And I’m Keisha McKenzie.
Annanda: The Moral Repair: A Black Exploration of Tech team is us, Ari Daniel, Emmanuel Desarme, Courtney Fleurantin, Rye Dorsey, and Genevieve Sponsler. The executive producer of PRX Productions is Jocelyn Gonzales.
Keisha: This podcast is part of the Big Questions Project at PRX Productions, which is supported by the John Templeton Foundation.
[PRX SONIC ID]
SHOW NOTES
Work with our Guest Adorable Earthangel!
bell hooks defines white supremacist capitalist patriarchy
bell hooks shows the difference perspective makes in Black Looks: Race and Representation (1992).
Some background on the history of the New Deal and the 2008 crash:
How the New Deal Left Out African-Americans (Smithsonian)
Codeswitch explains the history of housing discrimination and redlining
Economics Professor Richard Wolff (The New School) explains the 2008 subprime mortgage problem
Investopedia breaks down the AIG Bailout
In January 2021, the New York Times reported the Biden administration’s intent to include an image of Harriet Tubman in a redesigned $20 bill
The Income Gap: the US Department of the Treasury on Racial Inequality in the United States
Why the Great Recession Made Inequality Worse by Ken-Hou Lin and Megan Tobias Neely
Explore more on the 53% loss of African-American wealth during the Great Recession in “The Color of Money” by Mehrsa Baradaran
Caribbean Insights Upend Tech Norms
Season 1 | Episode 5
-
SEGMENT A (INTRO)
Annanda: So I love Adam Curtis documentaries.
Keisha: Love a good documentary—and what’s the draw to Adam’s work?
Annanda: They are heady and visually stimulating, and he has a very keen talent, one that works well with my brain, for taking things that seem totally unrelated and showing how they intersect. They aren't the kind of documentaries that are necessarily restful. But I find Adam Curtis incredibly engaging. Watching his documentary “All Watched Over by Machines of Loving Grace” was the first time I heard about this internet manifesto called the Californian Ideology. Here's a clip from the documentary.
(play clip 0:00-0:23)
Annanda: The Californian Ideology summed up the culture and moral beliefs that drove Silicon Valley tech in the '90s and continue to this day.
Keisha: I’m ready to get into this!
Keisha: I'm Keisha McKenzie, technical communicator, strategist, and advocate.
Annanda: And I'm Annanda Barclay, death doula, moral injury researcher, and chaplain who’s concerned with a life well-lived and cherishes the relational stories people share.
Annanda: This is Moral Repair: A Black Exploration of Tech. A show where we explore the social and moral impacts of tech advances and share wisdom from Black cultures for healing what tech has broken.
Today on the show— Is the dominant worldview of Silicon Valley a reasonable representation of the real world?
<music>
BREAK
SEGMENT B (Introduce the Californian Ideology Concept)
Annanda: The term Californian Ideology was coined by Richard Barbrook and Andy Cameron in 1995, in a manifesto essay that celebrated and critiqued the emerging belief system they were observing in the culture of Silicon Valley. The essay is a staple taught in most university history of computer science classes.
The Californian Ideology mixes Silicon Valley's business-directed and libertarian free-spirit attitudes with a belief in technology's ultimate power to improve society.
Keisha: That sounds a little bit like PayPal's Peter Thiel. What does it mean?
Annanda: The term Californian Ideology critiques the idea that digital technology can solve societal problems without first addressing underlying political and economic structures. Here's a clip of Richard Barbrook talking about the ideology last year.
(play clip starting from 0:13-1:06)
Annanda: This ideology helps to explain how America's tech culture blends technological utopianism, individualism, and anti-establishment sentiments, which shape big tech's successes and controversies.
(Insert a montage of tech company successes and controversies: Surveillance clip starting at 0:00-0:13,0:25-0:31, 0:01-0:06,3:19-3:26, 0:07-0:15)
Annanda: It's like mixing ideas of personal freedom and challenging the system with a deep trust in the latest technology and putting entrepreneurship first.
This combination is reflected in the ethos of tech companies, which tend to prioritize innovation, disruption, and a libertarian approach to regulation.
Annanda: First, tech has an impact on our work culture. The American tech industry is known for its informality, emphasis on creativity, and non-traditional work environments. Think Google, Meta, and Apple having everything you need on site: three meals a day from chefs in multiple cafeterias, cafés, gyms complete with yoga and workout classes, medical and dental clinics, real talk, y'all, laundry, dry cleaning, and child care.
Yes, Google has child care. They supply you with everything you need, including a hefty salary, so you have no excuses for not having enough material support to achieve your work. It's designed so you don't have to leave. Creepy. Creepy.
Second, tech is well known for its pro technology, pro market, and individualistic business mindset. The impact of which is reflected in how tech leaders engage with social and political issues, often focusing on technological solutions to complex problems, which is an issue we'll dive into in the second part of this episode.
Finally, tech companies also tend to show a preference for minimal government intervention, which is slowly becoming more of a household conversation, right? Companies accumulate vast amounts of power and data, leading to concerns, again, as we've talked about in prior episodes, about privacy, surveillance, and monopolistic behavior.
Hashtag Peter Thiel, to Keisha's point earlier, who really does enjoy a good monopoly.
(insert clip 0:54-1:05)
Annanda: In short, we know the Californian ideology well. We live it, and the results of it, in our everyday lives.
We just don't refer to it by its formal manifesto name. And Richard Barbrook and Andy Cameron totally called it in 1995. It's ubiquitous in American tech culture. It's the water we all know and swim in. And it seems like an ideology that's become too big to fail.
Keisha: When you talk about the libertarianism, the utopianism, the individualism, it just sounds like the latest evolution of some of those early formative American ideals: the work ethic ascribed to Protestants in the colonial era, the triumphalism before World War I, when the Industrial Revolution was transforming the country.
It was outliers in religion who were saying, well, maybe human nature hasn't caught up with the technology. And then we got the world wars to prove it. So it's like some of these streams evolving over the last two, three centuries to bring us this network of extraordinarily resourced—they're not industrial barons, they're tech barons—billionaires who are able to influence policy, but also literally influence the devices we use to live our lives.
And it makes sense that we should try to unpack what they're thinking about and what assumptions they're making about our world and us.
Annanda: I like how you said tech barons, because we don't talk about them that way, but that's exactly what they are. And if we compare them to the industrial barons before them, tech barons do know a lot more about us and have way more access to ourselves, our habits, our histories.
They know us in a frighteningly intimate way through our data. Let's unpack this because it's real. And this ideology has named and pointed to it.
Keisha: I also think it makes sense to see it in the light of those older ideologies, because if we treat it as this outlier, then it becomes a thing that's other and foreign, but it's actually very native to the U.S. It's almost as if it would have been a surprise if it didn't emerge from the U.S., given all of the streams feeding it. And I remember, not long after I came into the country, it made sense for me to look with some sense of aspiration and idealism at the people being successful in technology, because of my discipline and because of the spaces that I was in at the time.
They lived the sort of plucky, reinvention, self invention, do it yourself kind of spirit that America preaches to the rest of the planet. And yet, nobody ever really unpacked what that sort of entrepreneurialism-to-the-extreme cost everybody, in terms of a tunnel vision focus on individual success, perhaps at the expense of the communities around.
Annanda: As you're saying that, given that it's called the Californian Ideology, I think it actually shows the extreme even more, because California is out west, and there's the history of what it took to shape and cultivate the West.
This idea of libertarian free market is very much overemphasized in the West.
I've really noticed the difference of that culture compared to, say, Chicago or aspects of the East Coast where you're at. There is a very particular tinge on it, like the ghosts of the gold rush coming back to haunt, and it really has shaped the culture of California. To me, it makes total sense. It's as American as apple pie.
It's clear. This is very California.
Keisha: So given this approach to reality, it makes sense for where America has come from, but it also is incomplete. So where have we benefited from the Californian ideology as a culture? What's still left out? And is there a better way?
BREAK
SEGMENT C (set up Jane as a re-framing of what’s possible)
Annanda: We interviewed Jane Gordon. She's an Africana Studies professor at the University of Connecticut. We wanted to get her take on a Black Caribbean point of view compared to the Californian Ideology.
Jane, how would you describe the dominant worldview of Silicon Valley?
Jane: My sense, which does feel like it's probably pretty caricatured, to be honest, is of a lot of very young, unattached people, who are increasingly international, who really have tremendous faith in the ability of technical forms of human creativity, to do really innovative things.
Some of which are likely to actually benefit many people and some of which I think they're not actually even thinking about what the large scale effects will be on people.
It's interesting because their influence is so massive, right? And it's so outsized. And I think that's why when I stop to think about who actually is embodying, like, who we're actually referring to, I realized that it's probably a pretty caricatured image.
What I'm thinking about are the consequences and effects. I know that there's a pretty massive gap between folks who are developing technologies with all kinds of expectations about what will come with them and the effects that's having on everyone else.
Annanda: How do you think the caricature that you describe is different from dominant or alternative narratives outside of Silicon Valley?
Jane: I think about IT as a really innovative area. Right? I'm at a big public university in New England, and the range of people who are drawn to opportunities, and problems opened up by technologies are vast. A lot of the people who are flocking to those fields the most, are immigrant kids, they're kids of color.
My presumption is that the dreams of some of those people I'm describing is to work in Silicon Valley and to transform it and remake it in their own image.
I think other people just assume in advance that that world will always be closed to them, and that they'll be doing either subsidiary work or work that's trying to develop other forms, other conceptions of technology and global technology, that'll have, I don't know what kind of relationship that'll have to Silicon Valley.
But it feels as if it's the sun around which everything orbits. And I think people make all kinds of choices when opportunities and resources are distributed that way.
Annanda: What social concerns surface for you around this idea that tech can be a panacea for all and can save everybody? And why should we care?
Jane: Yeah. What I do find a little bit challenging is that I think of tech very broadly, right? Just last week, I was reading the work of a political theorist on North American Indigenous political thought. And one of the figures who he studies most closely is a man named George Manuel, an Indigenous person in what is now Canada who was absolutely fascinated by Tanzanian experiments in socialism.
And so he had this really interesting approach to tradition, which was never at odds, he thought, with emerging technologies. He used to say, for instance, just because you're a farmer and want to continue to be a farmer doesn't mean that you need to use a wooden plow. So he saw technology, in its valuable expressions, as simply anything that enabled people to make their values live, and live dynamically into the present. So that could be a wooden stick. I mean, it could be any number of things. And so I have to admit that when I think of technology, it's really more in that vein. It's sort of anything that amplifies human abilities that we're committed to using.
A pen is a technology. I think about it in this sort of, very broad way that I think in some ways has been narrowed by the rise of Silicon Valley. Only a certain number of things have come to mean tech, when the truth is tech is human creativity trying to amplify our own abilities.
When I think about Africana philosophy, it's so rich with a broadened conception of technology; it's all about cultivating human wellbeing, cultivating sustainability. There's this narrowed version of what we mean by tech that is coming to eclipse the broader world of human creativity that it makes more sense to embed it back in. And that's even more of a concern if people want a narrowed version of tech to solve all our problems, because it divorces very particular technologies from what motivated their development, and then from questions about how they should be distributed.
There's always a human desire for technical solutions to complex human-made problems. And so I think we often channel our creative energies into more decontextualized technical ways of addressing problems that are always embedded, and are always social and cultural and thickly enmeshed in everything else. We need this broader understanding of technology if we're going to think about the role of technology in solving our problems.
But it's we who solve our problems. And we have to remember that we also are the creators of the technology that we're looking to solve our own problems. And so there's this strange relationship to our own agency when we turn to things that we've produced but act as if they're independent of us and that they're going to solve things for us.
Because of course they bear the marks of us. Right? Good and bad and every kind of combination of those. So my own feeling is human beings, I mean, I'm completely indebted to Octavia Butler and what I'm about to say. I think human beings have this massive capacity for imagination and that we're terrible at solving our own collective problems.
Keisha: Jane is referring to the late science fiction writer Octavia Butler's works, she's pointing to a theme in those books around human beings creating meta existential crises and the struggle it takes these characters to try to solve collective problems.
In recent years, Butler's works Parable of the Sower and Parable of the Talents have had a renaissance, coming back in podcasts and a theater production. So a new generation is experiencing these themes from the mind of Octavia Butler.
Here's Jane Gordon again.
Jane: You know, we're great at producing things, we're terrible at distributing them well, and we just do it over and over and over again. We're fantastic at the less human, less social dimensions of creativity, and we're really bad at working with each other on a collective scale that's not small, in figuring out what these should actually be in the service of.
I see it so much say in film right now that you'll go and you'll see these films and they're technically brilliant and they're superficial often. I mean, not everything. There's amazing film and TV getting made right now. I don't mean it like that. But there's just, there's such an emphasis on what we can do at a narrowly technical level.
But I wish that same attention were devoted to what it is that we're bothering to explore.
Annanda: How does Black Caribbean philosophy make social sense of, or maybe contextualize, the dominant ideology you see in Silicon Valley?
Jane: Right.
I'd answer that in a couple of ways. The first is that there's a remarkable kind of how-to orientation that really runs through Caribbean identity, political identity especially. I mean, one of the things that I think kind of dumbfounds folks who haven't been exposed directly to Caribbean people is that there's a sensibility of, like, who are you to tell me this is impossible?
Like, we're going to make this happen. You know, it's just a question of, like, give us some time maybe. But the will is always present. There's this kind of remarkable creativity that comes with that, 'cause rather than devoting all your energy to convincing yourself that you can do something, you're devoting it to doing the something.
The thing that I think is, in a simplistic way, really attractive about Silicon Valley is that there is this kind of "let's do this new thing, let's be pioneering" spirit of innovation that is hard not to be drawn to. But where the Afro-Caribbean sensibility is so different is that the idea that you would create something separate from the social world of which it's a part would be alien, right? The idea that you would do that with absolute disrespect for any sense of the world, elders, the earth, right? I mean, one of the things that always really struck me about Caribbean fiction is that they're not the narratives of individual protagonists. I mean, Caribbean literature of all kinds. There's a thick social world that always comes with it. Yeah, you have heroines, but there are worlds that they're acting in. So that sense that a technique would transform the world and fix all our problems, I mean, people would laugh you out of the room.
I think the other thing too, that you see in Caribbean philosophy and how it's written and what's prized is there is a much less constrained approach to creativity.
Creativity is in everything. It's not just in how you solve a political problem or an economic problem or a technical problem. It's in how you speak. It's in how you make music. It's in how you cook food, you know? That creativity must draw from everything, I think, then shifts how people think about what technology should be in the service of, right?
Because if it's drawing from all of these domains, then it's also answerable to them, I think, in a way that's quite different. So in other words, just as we bring our character to the technology, I think we bring our philosophical anthropology to that technology as well. So we imagine the technology as a decontextualized, individuated thing, rather than a thickly enmeshed, you know, social creature with ancestors and descendants to whom we're obligated, to listen to and to respond to and to be accountable to.
Annanda: You gave examples of food and being accountable to ancestors, to the planet that you're in service, you're in community and not separate. Is there a particular, philosophical or spiritual lens that you could give our audience as everyday people to consider an alternative approach to innovation outside of Silicon Valley?
Jane: The first thing that comes to mind really is how profoundly creolized Caribbean music is.
There are different moments and there are different kinds of Caribbean music. I don't want to treat it as a monolith, because it's not one; that would be an injustice to it. But there's a way that, say, in Caribbean music, there's no missing that it is Caribbean.
And yet what's Caribbean about it is how it brings so many different cultural influences, and things that move people, into a coherent whole. And so you listen and you think, oh, that's from there, and that's from there. There may be things that you don't actually know the origins of, and yet it's in their combination that you recognize their expression of Caribbean-ness.
And so I think there's something about the ability to draw from, while acknowledging indebtedness. There's no pretending that you haven't borrowed, but there's a sense that you can borrow in a way that enriches and that doesn't leave you indebted. When we're talking about technologies that we should prize, that we should reinforce the availability of, I just wonder if there's a way of emulating that way of borrowing so generously from all of the different contributions that come from distinctive experiences, distinctive locations, and yet what we're proud of, and what makes them an expression of who we are, is how they've come together.
So it's not about pretending that we're the unique source and it's not saying that to borrow means we're imitative. There's a model in that of human community that I wish we could emulate in so many other domains.
We acknowledge that there are different words, different music, different ways of engaging in ritual, that really carry the wisdom of different experiences of the human world. And if we know how to sew them together, that's when we do best. And by do best, what I mean is we're figuring out how to actually be in human relation.
Keisha: What if the tech industry is also a space of different cultures and contributions being knitted together to create something new, with some of the same sorts of power dynamics determining which contributions have value or rise to dominance?
Jane: The tech industry itself is creolized at the level of the creativity that we see at work in it.
Especially good for this conversation is the 1972 Jamaican film The Harder They Come. Jamaica and Jamaicans had often been the backdrop for movies that really centered other protagonists, whether they were James Bond or anyone else; this was the first film where Jamaicans in Jamaica, 10 years after independence, were the focus.
I think it's useful here for a few reasons. The first is that if you see that film, it's obviously not just a film.
(insert clip 41:15-41:31)
Jane: Unchanged in its focus on Jamaica, it really was a Jamaican film. The story itself is a quintessentially Jamaican story.
What I think it captures so powerfully is that it's a working through of the promises and fantasies of Hollywood: knowing those, but speaking back to them.
Annanda: So, in this case, Hollywood and its dominance in film would be a metaphor for the Californian ideology and its dominance in tech.
Jane: You have somebody who comes from the countryside after the death of his granny and wants to come to Kingston and make it big. He has this dream of becoming a superstar musician. And of course, he's played by Jimmy Cliff.
Annanda: For those who don't know, Jimmy Cliff is a pivotal figure in reggae music. His work helped to bring reggae, as a genre, to a global audience through the soundtrack of the film, The Harder They Come. This is the film that Jane is describing now. For those of you who are diehard Bob Marley fans, it was Jimmy Cliff that put reggae on the global map and Bob Marley was able to use the platform that Jimmy provided to amplify reggae music even further.
Jane: What he encounters at every turn is the brutality of colorism, the brutality of stark class inequality. And yet he persists; he does become a superstar musician, and he's completely ripped off when the song is recorded. The radio hosts who control what can become a hit try to block it from circulating, until he's become so notorious in outrunning and outsmarting the cops that it becomes the soundtrack to his life and to his story.
And the song is "You Can Get It If You Really Want": but you must try, try and try, try and try, you'll succeed at last, right?
(insert clip 0:06-0:24)
Jane: And there's this irony of ironies: there's this sort of shallow version of meritocracy that should be true. But in those circumstances it's so absurd that the song takes on this kind of irony, because you see someone trying, but what does trying mean? And you see them succeeding, but what does succeeding mean? There's also a nod in it to the Jamaican love of the Western.
At the end the cops show up and there's the shootout, and he suddenly imagines himself as the protagonist of the spaghetti Western that he sees when he first comes to Kingston. And so there are all of these interesting ways of taking on tropes that circulate in Hollywood, but speaking back to them through a protagonist who's a member of the lumpenproletariat.
Annanda: Jane cites this example because the spaghetti Western scene shows the mindset of Jamaican Black creolization. By adding a cheeky, ironic nod to spaghetti Westerns, putting that plot within the narrative of the film, Jamaicans are speaking back to Hollywood with a wink and a nod, as Jane is saying, and yet the film really is still uniquely Jamaican.
It's a Jamaican take on the spaghetti Western, and I think that's a wonderful moment of creolization that Americans, or rather anybody who's seen a spaghetti Western, can recognize in the film. And so what Jane is doing is giving an example of creolization within Jamaican art and culture, through film and music.
Jane: The other piece of it that I think is interesting for our conversation is the life of the soundtrack, because the film features Jimmy Cliff, and the album made from the movie circulated far, far more broadly than the film. The soundtrack had a life completely of its own, and many people who don't even know about the film have encountered the music. It was considered the first global reggae album. And really it was one of the first encounters globally with Jamaica speaking back to the world about promises of freedom.
It's a great example of people being very excited to employ and use a technology, but absolutely remaking that technology to tell the kinds of story that they wanted to tell.
When it comes to who's thinking about ways of expanding the reach of human beings, or the capacity of human beings, the contributions come from everywhere. But it's a little bit like the radio stations with Jimmy Cliff's record: they were only willing to circulate it once they were assured that the profits would come to them, and that was actually only when the notoriety began to surround him.
It's similar with technology: there's a big difference between what's actually being generated and what's potentially in circulation, and then there's the controlling of that circulation and that distribution. What isn't creolized is who gets to make those decisions.
Keisha: The Jamaican national motto is "Out of Many, One People." I just became a U.S. citizen, and the U.S. talks about e pluribus unum, the out of many, one. How is that different, or is it different, from the Creole impulse to pull things from many sources into coherence?
Jane: An argument that many Latin American scholars have made about the ideology of mestizaje or sort of a national identity that's already mixed, is that you can include people in ways that actually erase them.
One of the policy approaches to indigenous dispossession in the U.S. was forcibly incorporating indigenous nations that had no interest in U.S. citizenship, that didn't want to belong, because that was essentially discrediting the sovereign status of their own states, right? And so there are ways of incorporating that are actually about not only diluting difference but literally erasing it, bleaching it, for lack of a better word.
Whereas I think the point with creolization is the coherent-whole idea: that we can actually be enriched by the continued difference within that whole. We don't want to flatten it and we don't want to mute it; we want it to speak fully, you know, vocally. The emphasis on creolization, and not that everything that's called creolized gets it right, because it certainly doesn't, but I think as an ideal it's such a useful one, because it says we want that difference to continue to speak, and that the relationship is the differences speaking together,
rather than letting one drown out the other. And while there are imperfections in how that Jamaican ideal is realized, as an ideal it's such a valuable one, because it's really genuinely rare. It's saying that within the nation we're a multi-nation, that our pluralism is a strength, that it's not a site of threat, that it's not a weakness that we haven't all become modulated into one, you know?
Annanda: What Jane has been talking about is a prime example of the limits of dominant ideologies, like the Californian ideology. Richard Barbrook and Andy Cameron critiqued the dominant idea in tech that technological innovation could solve all of our problems.
'Cause they're like, no, it can't, not without first addressing the underlying political and economic structures and the experiences that differ from those of the dominant tech culture. And I'd take it a step further to say that technology can never solve all of our problems. I really do believe that it boils down to how we relate to one another, how we are in relationship with one another; we can use technology to facilitate that, to make it more easeful, to enhance it.
But ultimately, there is no tech, in the form of tech that we consider to come out of Silicon Valley or computer innovation, that is going to solve all of our problems. Jimmy Cliff's character served as that example in the film. The daily experiences of the global majority serve as a constant reminder of the massive gap between the ideology that runs American and Western tech companies and the lives of everyday people. It's an ideology that leaves no room for meaningful creolization.
Keisha: When I was listening to Richard Barbrook talking about the origins of his read on the Californian ideology, I was moved by one of the things he said, which is that the empire is in decline.
So his read is that the American empire is in decline, but he's saying that from the UK, an empire that declined long ago, during the era that Jimmy Cliff was emerging on the scene. And so you see creolization happening both in the shadow of empires and in the aftermath of empires. It points the way to a future of differences living together in a functional and healthy and creative way, versus differences living together in a consumptive way, which is what empire brings you.
[Closing music starts]
Keisha: When they used to talk about the global village through technology, or even in the early days of social media, they were talking about, oh, this creates connection and it drops all the barriers and borders. Well, yes, but no, it didn't. It did in some ways, in that cultures were able to be connected instantly, but it didn't in that the power structures didn't change, so cultures weren't actually on equal terms; they related as object and voyeur, you know?
So it's a different sort of dynamic that the tech industry actually gave us versus the rhetoric of global village.
The internet said we would be able to know each other, and what it actually has meant is that I will know all about your first-date presentation of yourself through the social media update. But the care and the community and the connection that we need to survive the crises that we are already in, whether those are economic or military or environmental or otherwise, that's the challenge that Earth is throwing at us, and we're going to need different strategies, including this extraordinary creativity, to navigate them.
At core, we are still mammals, and mammals like soft things. Good relationships and community are soft things, and we need more of them.
Annanda: The narrowness of Silicon Valley and its approach to life is just not reflected in my everyday experience with the diversity of humanity and relationships that I encounter and cultivate.
And so I feel like the gift of Black Caribbean Creole, or the gift of Jamaican Creole in particular, is this idea of, like, we're all here. We're all on this tiny-ass island and we can make it work by honoring each other, because we have to. You just expand the island to a planet, and I think the point is really easily made.
And so what's the Black Jamaican Caribbean wisdom here? It's creolize. Break down barriers to access. Break down barriers to knowledge. Break down barriers and acknowledge our shared value that out of many, we are one.
And also, have some good Jamaican, Caribbean, you know? That's like, who are you to say we can't do it? Who are we to actually say we can't do it? Especially when, in this area of Western and American tech, we really haven't. When we really haven't tried.
CREDITS
Annanda: I’m Annanda Barclay
Keisha: And I’m Keisha McKenzie.
Annanda: The Moral Repair: A Black Exploration of Tech team is us, Ari Daniel, Emmanuel Desarme, Courtney Fleurantin, Rye Dorsey, and Genevieve Sponsler. The executive producer of PRX Productions is Jocelyn Gonzales.
Keisha: This podcast is part of the Big Questions Project at PRX Productions, which is supported by the John Templeton Foundation.
[PRX SONIC ID]
SHOW NOTES
Check out our guest, Professor Jane Gordon
“All Watched Over By Machines of Loving Grace” an Adam Curtis Documentary Series
Data Science, Consent, Colonialism—What We Can Learn from the Woods
Season 1 | Episode 6
-
Keisha: Annanda, let’s go back to 2010.
SOUND: open, folksy montage music -- maybe fading up the I’m From Driftwood background will do it.
It was the heyday of public storytelling profiles like Brandon Stanton’s photo series Humans of New York and Nathan Manske’s LGBTQ video profile site I’m From Driftwood. Platforms like these were based on the idea that people who believed in equality, equity, and justice needed to “tell their stories.” And if they did that, it would help them find a home in a harsh world—online and offline. You heard of them?
Annanda: “Humans of New York,” definitely.
Keisha:
The website for I’m From Driftwood tells us about this strategy of belonging through visibility:
SOUND: Clip from “I’m From Driftwood - The LGBTQ Story Archive”
[00:00-00:08] Narrator: “I’m From Driftwood: The online archive of lesbian, gay, bisexual, and transgender stories from all over the world…
[00:24-00:31] Narrator: “LGBTQ people aren’t just in the big cities. I’m From Driftwood’s message is that no matter where you are and what you’re going through, you’re not alone.”
Keisha: It sounds a little naive to me now, but it was common at the time to think that if people who were different came out of the shadows and left their anonymity behind, that was what would change hearts and minds and help create just communities.
If they really knew us, we thought, they’d treat us better.
Even the most hypervisible Black woman on Earth got in the game, using her stage to spotlight others. Here’s Oprah on her show with John Amaechi:
[00:05-0:08] Oprah: “Do you think there are other professional athletes who are still in the closet?”
[0:08-00:14] John: “Yeah, my mind boggles when people ask this question. Of course, yes, there are definitely—there are definitely other athletes. And it’s a workplace thing…
[00:19-00:28] John: “There are gay people in all professions. It’s not just the ones where people find it acceptable or expected. It’s not just makeup artists and hairdressers and, uh, flight attendants so [laughter] we do exist in other areas.
[00:30-00:31] Oprah: “Yeah. Ok.”
I was a bit ambivalent about this idea of revelation as revolution, but I did it myself. I helped others do it too.
By then I’d already spent more than a decade and a half online. I rarely used my government name, and I mostly connected with people I’d already met in person or interacted with for a while in discussion rooms.
I got to know these people in some really meaningful ways—we went through graduations, bereavements, marriages, divorces.
But I didn’t know much about their day-to-day lives, or expect that I or anyone else should have access to that depth of information.
I’d have been mortified if I’d known how easy it was, even then, for tech companies to put together all of those data points from the texture of our lives online.
They can do that now.
SOUND: Add Title music
Annanda: I’m Annanda Barclay.
Keisha: And I’m Keisha McKenzie.
Annanda: And this is Moral Repair: A Black Exploration of Tech. A show where we explore the social and moral impacts of tech… and share the wisdom from Black culture for healing what tech has broken.
Keisha: On today’s show: Tech companies have access to an immense amount of data about each of us. What does it mean to be known in a world where no one can be anonymous?
Annanda: How are technologists shaping that world—and how is it shaping all of us?
SOUND: End title music
BREAK
SEGMENT B: The people behind the data
Annanda: I’m not even on social media anymore. Except YouTube.
Keisha: How come?
Annanda: Well… I had a major relationship that I broke up from years ago. We're talking like 2016. For those in the camp of divorce, it was a marriage. And I decided that I had too much online. Like, I realized how public I had intentionally made my relationship.
SOUND: Fade in audio from Jill Scott’s “Watching Me” -- e.g. 00:00-0:25 (vocal version)
We were both going for ordination in our respective denominations, and so we used social media as a platform to push our "queer agendas" within the church. So we intentionally made ourselves hypervisible, to really hit on what you talked about earlier, Keisha: maybe if they hear our stories or know us, it could be a strategy against LGBTQ discrimination in the church.
But I also realized, like, I actually wanted to mourn the loss of that, and not in public. And so, from that moment on, I began to question: what are we putting online? Who owns it? And I really backed down and took some time to patch myself and come up with an approach to social media.
Because I realized social media is not public and I don't own it, so why am I giving my life, quite literally, to some corporation?
Keisha: What helped you heal from that time?
Annanda: Oh my gosh, uh, not being on the internet. Um, this is just me, right? I don't think there's a right or wrong way to do it; it's just my way for myself. So yeah, I think not being on the internet, and actually getting really clear: this tool no longer serves me in the way that I intended, so how do I establish a different relationship with it? What does that relationship look like, and what do I want to use it for? For a while I just needed a break. So I took a year off and kind of observed things a little bit after, you know, making those couple's statements, "we're no longer together," that kind of thing.
Keisha: It sounds like it was a devastating experience on a personal level, but also unmooring professionally, because you put out your information and then had to figure out how to pull it back.
In 2016, we were both in the same space of ordination, gender, and LGBTQ justice. I was working with Seventh-day Adventists at the time, and then later with just about anybody who believed that the world should change. I wasn't directly impacted clergy, though, like you were. I was just supporting clergy I knew who were doing good work and were being hurt by their communities' policies.
We did a lot of video profile work, and 10 years later, I still hear from people who have found some of those stories and profiles. Once that information is out there, you no longer control it, and you no longer control the conditions under which people find it or use it. It's a little unsettling.
Annanda: Yeah, a lot unsettling.
I stalk my social media accounts frequently. Yeah. I mean, I'm not on it often, but I know what's happening.
Keisha: When was the last time you googled yourself? You wanna do it right now? Tell me what you see?
Annanda: Okay, the first thing that comes up for me is my profile at Stanford, then it's like LinkedIn, then my past at First Presbyterian Church, and then it's, uh, like a piece done on me from a local radio station on chaplaining, and then, yeah, it's very clear that I'm, uh, that I'm a preacher. *laughs*
So I look at social media the way people watch a movie and eat popcorn, and there has been a gain. I don't feel the anxiety that I used to, of like, oh, I have to post, or this hypercompetitiveness that I think can lead to a deprecation of self-worth.
But there's also a loss there. Social media is creating culture, is creating life. And so I fully accept what I lose from not participating in social media. There is something to be lost there. There are connections that I'm not making because of it.
Keisha: Did you feel like you had a choice in terms of what parts of your life you were sharing with the people you were organizing with and the people you were trying to influence?
Annanda: Oh yes. I always have a choice. I'm always in my own agency, no matter what is going on in the world. And with that decision, I was aware that the choices I made came with real consequences. I was then, as I am now, incredibly strategic. There is an anchor of: I want to make a choice, I want to make it for myself, and, when it's appropriate, I want to make it in community and to uplift my community, but I need to care for myself first.
I tend to research and observe. So I was like, okay, I need to get off social media, and, like, what is this?
How do I make informed decisions about how to engage with it, when and if I want to re-engage with it again?
Keisha: So you were looking for the midpoint between being hypervisible on one extreme and completely having to stay out of public life?
Annanda: I don't know if it was a midpoint. I was looking for what feels good for me. What doesn't bring up anxiety? What doesn't bring up bereavement and grief in a way that keeps me stuck?
I think I have something like 36 things on my Instagram, right? I don't have that much. I'm kind of a digital nobody, you know? But I also don't have that digital stress.
Keisha: So that's the perspective as a tech consumer. I'll let you in a little bit on the perspective as a tech creator. Not long after 2016, I was on an advocacy campaign where we used early advertising tech to engage people who were supporting the campaign. We used web services that were hooked into social media and email.
And that let us have basic information about the people who were following the campaign's Facebook page, just their demographic groups or interests that they had volunteered already. It was enough information to let us connect with people who might want to learn about the legislation that we were trying to influence, or who wanted to help share positive stories about minoritized communities.
And eventually, because I was also a consumer, I started to ask the other campaigners questions about the system behind the page. Like, what were the ethics of collecting that data? And what was becoming "best practice" in the industry of campaign analytics? Mostly I got back blank stares. And I started being even more intentional myself about the data I was putting online, because of how it could be used.
And then there was something I found the other day. When the pandemic started, some churches were starting to use digital marketing to find their members online and to attract new members. Some of those marketers were monitoring search data to do it.
So there's a Christian Broadcasting Network report on Gloo, which is one of those companies:
SOUND: Clip from CBN report
[00:00-00:03] Anchor: Church attendance continues to drop in America…
[00:59-01:06] Reporter: Gloo, a ministry platform, is building on that trend by working behind the scenes online to help people connect with a church.
[01:15-01:29] Reporter: If you’ve ever wondered why you randomly get online ads for mental health support, marriage help or dating sites, it’s because your search activity isn’t a secret. In fact it’s a gold mine for companies like Gloo, that are paid to connect you with people who can help.
Annanda: Oh, wow. As soon as you said it, I'm like, oh my gosh, I'm not surprised. But I honestly had no clue that this kind of targeting was going on when I was a pastor. You know, now I'm a chaplain. Still clergy, but now I'm a chaplain. Given my stance on social media, I had no clue churches were doing this, because, obviously *laughs*, I was never put on the social media team.
You know, something like this would never be professionally permissible in a place of higher education, or a medical institution either, any hospital, frankly. Violation of student or patient privacy is pretty serious, and it's protected under federal law with FERPA, the Family Educational Rights and Privacy Act, and HIPAA, the Health Insurance Portability and Accountability Act.
Even the research that I'm doing on students at Stanford required an extra committee in addition to the normal research committees. So, yeah, I find that ethically concerning, and I also think it shows that congregations are institutions that need people to buy in to a certain degree in order to survive.
Like, that's the model, right? You can't open up a nonprofit organization and not have participants. What I'm hearing is the cost of some of that, and also the desperation right now in America, with fewer and fewer people going to church in general.
Keisha: Yep. There's a business framework wrapped around the way that they engage the people they serve on one hand, and then define as an audience in another sense.
So if you're thinking about audience, then you get to marketing and product and value proposition and all of those kinds of things. It's fascinating. But I've been talking to my tech friends about the whole sector, because of the experiences we talked about, being online, being hypervisible, but also because of what I was seeing in my professional work.
Scott Hendrickson is one of those friends. He's a data scientist who's worked with some well-known companies, like X, the company formerly known as Twitter, and some smaller business-to-business data shops that most of us will never hear about. But over the years, as I've talked to him about this stuff, I've realized there's a whole web of data and analytics companies busy behind the scenes of the internet.
Pull quote from Scott’s interview: There isn't a practical solution that says companies are going to choose not to use this data to their advantage—unless their competitors are also constrained by the same ethical standard.
Data science and analytics teams like Scott's share reams of user data with corporations and researchers. These are the enterprise solutions that help companies make choices about what products to sell and to whom, and hopefully make customers happy at the same time.
So Scott's seen how much data companies have on us, data that can be exposed, eating away at our anonymity online, and he's had enough experience with it to doubt whether the way these systems develop, and how much data they draw from our lives, is really to our benefit. After the break, we dive in with Scott.
BREAK
SEGMENT C: The data behind the people—and what we do about it
Keisha: So we’re talking with Scott Hendrickson, a data scientist, about how the systems behind social sites and shopping apps actually work.
Keisha: How much data are we talking about?
Scott: There are two kinds of answers to that question that might be meaningful, and maybe the sheer volume of data is the least meaningful one.
Every company is collecting customer data, contact data, product purchase data.
If you shop for something on Amazon and don't buy it, you might be visiting Facebook, for example, and see an ad for that same thing that you were shopping for. Those kinds of things are the result of data sharing between organizations.
The second answer, though, might be more interesting: the kinds of data that we're able to capture are really exploding right now, in ways that we can't imagine.
There are many kinds of biometrics: facial expressions can be interpreted with AI more and more reliably, so there are many kinds of data that are available. I think those are probably more interesting than the sheer terabytes.
Let me give one example that's actually not far away in terms of technology coming together. We can put tiny sensors in our headphones and track brainwaves. We can easily sense some things like alertness or excitement, and we would be able to tell if you liked product A more than product B just by observing you wearing headphones and looking at brainwaves.
SOUND: *record scratch*
Keisha: Wait, what? Tracking brainwaves?!
Annanda: The technology doesn’t surprise me. The lack of consent doesn't surprise me. And the likelihood of how these technologies will be used, and the compounding impacts they will have, are vexing and maddening to think about. But to know is to be empowered. So keep talking, Scott. I want to learn more so I know my options. So we know our options.
Scott: So for today, it’s almost certain that you’re going to know that that's what’s happening. There is actually one trial at IKEA where you can step into the little square and be tracked, and they can show you different products and tell how excited you are.
But that’s very explicit.
Keisha: Imagine this kind of biometric data deployed at the scale of LLMs, the large language models like those behind ChatGPT and Bard. Early on, it's still clunky and explicit, so you have to seek it out and you have to opt in, but all of the incentives point to baking it into different consumer and social systems in ways that make it much harder to opt out of.
That's already happening with AI-based identity verification tools like ID.me, which now collect face data and IDs to gatekeep Americans’ access to the IRS, Social Security, and several state departments of labor. So the pattern's likely to repeat with other kinds of data, including information so organic to what it means to just be alive that we might not even realize we're producing it, or that someone might collect and analyze it.
Again: Brainwaves?
Scott: The challenge is that the rate of change of that kind of technology is very fast, and it's for some of the same reasons that LLMs have become so exciting. We now have ways of taking a large amount of data and interpreting meaning from a fairly disorderly data set.
Interpreting someone's brainwaves as excitement or attention or inattention or boredom follows the same path. The more data there is available, the more specifically we are going to be able to interpret what someone is doing.
There are probably thousands of heavy equipment and trucking operations that are tracking people's sleepiness through these kinds of devices. And they're able to give a warning when someone's dangerously fatigued out on the road.
Keisha to Annanda: Some sort of feedback for sleepy truck drivers seems like a good thing, right?
Annanda: I’m all about safety first. I’m just too American to believe the benevolent use of this technology will be prioritized or legally regulated as the normative business standard. That’s not how we tend to do things.
Keisha: If trucking companies can reduce accidents, they’ll save money on workers' comp and insurance, and they'll also save their reputation. That's just a good business plan!
I also don't mind if hospitals take steps to protect junior doctors and surgeons who are chronically underslept. I just want the measures to be consensual. And maybe administrators could start by making worker shift lengths more humane instead of reaching for eye tracking off the bat, because not all management breakdowns are best resolved with tools.
Annanda: During the pandemic I was a chaplain at Stanford Hospital. I will say that the pandemic positively impacted the shift lengths. They shortened them to prevent long-term burnout in the midst of an unknown disease, like, I'm talking early 2020. In my experience, in the role that I played as a chaplain, they have been fantastic.
I think everyone has benefited from that decision making.
For me, increasing surveillance sounds like widening mistrust and leaving little room for circumstances of what it means to be human.
Right? Like, who wants their eyes to be tracked all the time? Nobody.
Keisha: Like keyboard strokes in a remote work environment.
Annanda: That's not a work incentive. That's like an instant F-you incentive.
Now it’s like, I no longer trust you as an employer.
Keisha: Yeah. I feel you.
Data Coherence
Scott: Cookies are pretty low level in the browser. You visit a site and someone can say, you’ve been to google.com, so we know that about you.
More coherent things can happen with us knowing less about it. CCTV-type surveillance could be used very coherently to track where you’re moving and make interpretations about what your daily schedule is.
Keisha: Imagine driving from Baltimore to New York. I’ve made that trip a couple of times. It’s 4 hours depending on traffic. Along the highway, every tollbooth takes a picture of your car. So someone with access to the toll cameras can put those photos together and track you across four states.
They could also do it by stitching together security cameras, and gas station purchases, and phone call logs and cell tower pings. We’re talking about making a single profile of someone out of several sources of data.
Annanda: That reminds me of Enemy of the State, when you say that.
Keisha: Yes, that was my mental picture.
SOUND CLIP: The NSA can read the time off your f--- wristwatch!
Annanda: But now anybody can do it. Enemy of the State was, like, government agencies, but now you're saying, like, anybody that has enough tech savvy…
Keisha: Yeah, like you could break into those systems. You still have to have access to the system, but you'd have to break into it if you're not like already part of the system or part of the government.
Scott: It could be book buying or it could be television watching. A single incident isn't so very interesting. But we can learn a lot about someone when we put those pieces together and string them into a more elaborate story.
Coherency can be something very comfortable with your intimates. You are known well, and they know what you're going to like and not going to like, how to make you happy and how to pull you into an argument.
Coherence can be just downright creepy when the relationship part is missing. When someone's putting together a story on you without your consent or knowledge, it becomes the opposite of what we want from our relationships. It becomes a threat, or a potential invasion of privacy or safety.
Coherence is putting the story together. And in the data world, we talk a lot about telling a story with the data. We talk about it as if it's always benign, but clearly we tell marketing stories that are not always benign. Sometimes they are designed to be manipulative, to change someone's thinking in a particular direction, to get them to spend money.
Keisha: So a business that owned, say, a grocery store and a rec center would be able to use coherent tracking to build a picture of Fred, who likes Fritos and also swipes into pickleball every other Wednesday. And maybe that would help them customize his experience in both places a little more.
That’s a gentle example. It’s already gotten more serious.
Keisha: When the laws around reproductive health changed last summer, people, for maybe the first time, were thinking about data tracking and sharing policies in period apps, or data collected when they visited health clinics, for example.
Annanda: The Vanderbilt University Medical Center allegedly shared the health records of transgender patients with the Tennessee Attorney General without authorization. Patients sued them in August. The US Department of Health and Human Services is still investigating.
Keisha: On top of that, CyberRisk Alliance reported that the same medical center got hit with ransomware over Thanksgiving holiday.
Annanda: First off, Trans rights forever. Period.
Keisha: Amen.
Annanda: Second, there is never such a thing as a “benign story.” Stories are never neutral. This is in part why the humanities cannot be thrown out when it comes to computer science and math. The humanities help us to be better. Because coherence is clearly out here and very ready and able to snatch all of our wigs. And ain’t nobody got time to deal with that. Not this data breach. This HIPAA breach. On Thanksgiving. That’s serious.
Keisha: The best case scenario for a cyberattack is that it’s inconvenient for you, so you have to update your password or keep an eye on your credit report. But the aggressive prosecution of reproductive access and gender care takes data concerns way beyond mere inconvenience.
Crooks and fascists can’t breach your data if the data doesn’t exist. But if sensitive data does exist, then the people whose data it is should always be able to approve or unapprove who gets to see it.
Why do ordinary people always seem to be the last group considered in scenarios like this?
Data Colonialism, Disadvantage for Consumers
Scott: The pattern that I have observed over and over is that once we move to a new realm of things that we hadn't thought of before, there does seem to be, I guess I'll call it, a bit of a land grab, where the commercial interests jump in and try to take as much of the space as possible before someone pushes back.
And I think it's a very natural thing to do that, because you feel like you're gonna get a competitive advantage in your business and you're gonna be able to sell more product. But I think what happens there is the consumer is often behind in figuring out which part of that land grab is for their good and which part of the land grab they—we want to resist. All of us want to resist.
An old example: the search and seizure laws in the United States were very much based on someone breaking into your house and walking to your desk and reading your private papers. When we moved to cell phones containing so much about our lives, the search and seizure laws didn't follow. The Constitution didn't contemplate cell phones.
Annanda: A document written in the 1780s didn’t envision all the conditions we’d encounter some 250 years later. I’m shocked.
Keisha: The tech environment has changed so much. In the 1700s searching required investigators to physically interact with an accused person’s physical things.
But in our world, it can mean grabbing human data without explicit consent or ripping data off a phone as it passes by. Police can do that with devices called stingrays. So how we judge a violation of privacy is changing too.
Annanda: The question that’s coming up for me with all of this is what does it mean to live in a world where equal opportunity violation is expected and deemed normal? What does that lack of consent do relationally to a society?
Keisha: I love that question: what quality of society are we making where that violent extraction is the norm?
Annanda: All of this is incredibly private, personal information. And so if that is a normal way certain businesses are conducting themselves, right, you walk in a store and—brainwaves—and they’re able to extract the data they need to support their business. That feels incredibly violating and nonconsensual.
Here in Oakland, folks have scanners. They'll scan your car, and then they'll do what we call bipping, which is they'll break the glass, either the passenger side or usually the quarter panel, and then, get into your trunk and take whatever prize they feel like they've scanned.
Thinking about consent, the ownership that one has, not just over our bodies but also the mind and the spirit: when you go to a therapist, you’re like, yes, observe my behavior. And we’re doing this, you know, for my improvement, with my agency. 'Cause you can always tell your therapist no, right? Or, like, this doesn't feel good for me, let's try another way.
You can't do that with this.
It says that we are creating a culture and a society that politically, economically, and socially says the society must run on nonconsensual extraction of all parts of you.
That is what, to me, is truly morally at stake, and that's a problem.
Scott: We've all been moved to a new expectation for what someone can do in terms of reading from our phone. It's a different standard, partly because the context changed so much:
If someone can interpret my facial expressions or read brainwaves while I'm shopping in their store, and they can do this without telling me 'cause they don't actually need to connect anything to me to do this, I, as a consumer, need to be aware of being able to say “no” earlier, and that's difficult.
We don't have laws that cause a company to disclose everything they can do with new technologies. It's concerning to me 'cause the back and forth is almost always led by a commercial interest taking ground that the consumer wants to protect.
Keisha: Public researchers did a lot of work on social media because they could access Facebook and Twitter's API, for example, but that research is being cut off or metered, or you have to pay for it now, and it's very expensive. What does it mean for us if the only people who can study the data that exists about us, or how companies are using it, are those companies themselves?
Scott: The inevitable conflict of interest is playing out right in front of us. Many of the largest companies that run social media and have large-scale data around our search habits and other things have gone through a cycle of building teams that were focused on data ethics, but have actually ended up fencing those teams in, in terms of what policies and what kind of communication can come out of those teams. So, not surprisingly, it is hard to trust the internal voices that are studying data ethics at these companies. At the current time, I don't think we have much political will to have any kind of public oversight of the data stored in social media.
Keisha: And without that visibility, people can't really opt out.
Choices in the Tech Ecology
Scott: I'm not seeing evidence of enough influential people with understanding working in a direction that gets us to a workable place, with policies that both allow us to see what's happening inside large organizations from a data perspective that protects consumers, and also let companies go about their business in creative and innovative ways.
Keisha: If we think about the tech social system as an ecology—this is an image that's been sticking with me for months at this point.
What if it's like a system where you have the technologists, the lawyers, the ethicists, the researchers, the general public, the regulators, all of us kind of connected through these tools, systems, choices, moral dilemmas, and breakdowns?
None of it works unless we're all in some sort of relationship to the system: the person who clicks through, the person who is driving and trying to mind their business, the person monitoring them on the camera. It requires all of us either to support this structure that extracts violently, or to challenge it and change it and make it something that respects consent.
Annanda: The possible future that you named, I think, is a beautiful vision. And it requires more work, if that's the future that you want, if those are the choice points that enough people want to make, 'cause it doesn't have to be everybody.
It just needs to be enough people. Well, there's a cost to that, right? That means you actually can't be as passive about technology. It means you actually do need to be educated about what's going on.
And you do need to be informed, and you need to be an active citizen. It's part of, I think, a crisis that's going on in the United States right now: how many people are informed? Are they even able to vote? Are they getting gerrymandered?
Keisha: Do they feel it matters when they do?
Annanda: Right. Like, you can’t have a miseducated or uneducated populace and still run a successful democracy.
Keisha: Mm hmm.
Annanda: So I think that same method applies to what you're talking about, but it requires more effort. My skepticism is, like, most people aren't willing to do the work.
Scott: Having consumers become more sophisticated about their data is one of the broader answers. I think the voice of more people who understand the benefits as well as the risks, and can talk about them and vote about them, is the most powerful part of this that's missing right now.
It's tricky to have our government leaders up to date enough to create a policy that is going to allow the kind of innovation that's gonna come from new technologies, like large language models, while still protecting people. I don't know how to do both of those at the same time. It's always a race to build the security model, build the visibility, build the capability, explore some of the things that we hadn't imagined we could do with new products and new services, and then maintain safety the whole time.
That takes too much foresight. So I don't think we can look to government.
Data Shenanigans
Keisha: While we were in production, Cali passed a law—
Annanda: Oh, you—people do call us Cali outside of—wow….
Keisha: We do. We do. I’m so sorry.
Annanda: It’s kind of rude, but I understand.
Keisha: Put some respect on the name of the state. Okay.
Annanda: like a little respect!
Keisha: *laughs* While we were in production, California passed a law that will require all data brokers to delete the data when a consumer makes one request. Can you imagine they had to pass a law?
According to the LA Times, there are over 500 brokers registered in California alone. I love this law.
Annanda: Listeners, get your data deleted.
Keisha: Love it, love it, love it. Wish it were federal.
Annanda: Oh, that’s your Europeanness coming out, you know? For the people! The EU has way better internet privacy protection laws than the United States.
Keisha: It does, and it's complicated even there, but it's a start. It's a start to a different vision that puts the agency back where it should be: with the people.
Scott: A number of data vendors make no pretense about it: they keep a database on everyone, and they will sell that data to any customer that comes along. That availability starts the data record, and it's awfully tempting to build on that with any other data you can pull in.
SOUND: shift tone between interview and dialogue
Scott: There is a sense of trying to build a complete-circle view of someone who's potentially your customer. It's very tempting to try to do that. I worry about the world in which we choose to give up our data, because I don't know how to design a world where we can choose not to give up our data, in some sense.
Not buying groceries, not driving a car, not paying your taxes. Like, these aren’t options for people.
Scott: The three major credit reporting companies are keeping a record on everyone in the United States.
There are some ways to limit access, but there really is no opt-out. There is no practical way to live in our world and have people protect their data.
We're absolutely gonna be giving up our data. Visibility into how it's going to be used, and explicit controls on how it's going to be used, have to be the answer to this.
And I think it has to come from government, because any restriction that is going to give your competitors a disproportionate advantage in the commercial space, you're not going to take on unilaterally. It's too costly to be the company that steps forward and says, we're not gonna use data in this way.
If the government imposes these restrictions, and we can broadly agree that they're reasonable for protecting people, the competitive landscape is leveled back for everyone to the same spot, and they can all afford to exercise some restraint when the law says they can.
Keisha: But these restrictions and constraints — they’re not in place yet.
Scott: I’m very concerned about the data I have a hard time choosing not to give up as a consumer, because I don't think I'm being broadly protected by the policies of our government.
Keisha: Scott’s not particularly dramatic so if he’s concerned, I’m concerned.
Annanda: Keisha, I’ve been consistently concerned. I don’t trust most of these tech pimps, and my language is intentional, because we're all getting pimped out and I did not agree. I did not agree. *laughs*
Future Data Scientists
Keisha: So Annanda: if you were writing a moral repair curriculum for the tech workers who are making these systems, what would be on it?
Annanda: You know, I’m a massive advocate of experiential learning alongside theory, so it can be integrated for practical application. This is how chaplains are taught. I learned better not just thinking about it, but also doing it. If you’re better at theory, alright, you’re gonna excel in that. And if you’re better at praxis, or the doing, then you’re gonna excel at that. But either way, you gotta integrate them, right? You can’t be in the ivory tower with just theory in your head, thinking it’s gonna work out in the world, and you can’t just be doing things without a better understanding of possible theories to support the doing. The integration is important.
And so a core component of my curriculum would be tech folks shadowing social workers, chaplains, and other human-centered service providers to see the real impact of the technologies they are creating on the lives of everyday people. I’d have them do this Monday through Friday for at least the majority of the course, and reflect.
Scott: I think it starts with empathy. Walk in someone else's shoes for a while and see how that feels.
Keisha: So empathy is core to ethical design. It's a way of helping us understand the environment and context of use for the person who's using whatever product or process we're trying to create. And it helps us to understand their world and, um, therefore design with a bit more of an accurate eye for how it's actually going to play out.
Annanda: Empathy is good. I'd prefer compassion. And compassion is empathy with boundaries, and this is something we chaplains talk about all the time: I don't need somebody to feel the pain that I feel as much as I need them to value and center my experience of pain when it has occurred.
And if we're empathetic all the time, you're gonna get worn out. If I was empathetic to every patient I saw in the hospital, every student going through a crisis or a death of a loved one, I'd be so in tune with feeling what they are feeling, as opposed to compassion, which will allow me to feel with boundaries. I don’t have to take on their emotions, but I can acknowledge them. I can value them.
Keisha: Maybe it's about building secure relationships and setting up permissions to be known in advance. And that's what can help to make data matchmaking feel more like a tool that's supportive of us than a violation at the whim of an unaccountable person or company.
Something else is coming up for me around whether we should simply account for the fact that our brains are wired to catch disaster.
So we might be biased against seeing good possibilities.
Scott: When I'm looking for a new part for my heater, I get a recommendation for a new filter that goes with it. I install that, and my heater’s back up and running. If that recommendation engine hadn't matched the idea that if you buy a new part for your heater, you probably need to replace the filter, I wouldn’t have gotten that advantage. I don’t usually tell that story to people. It’s just mundane. *chuckles*
But when the opposite happens, when I feel violated by some kind of recommendation that's too close to home, then I tend to have a reaction to that. The upside and the downside are both real human experiences. There's value and risk in both.
Sometimes the coherent tracking does get to me.
I've definitely used technology skills to build ad blocking into my home network, and into some of my laptops, mainly to stop the irritation.
Annanda: Scott needs to teach me how to do this ad-blocking Dougie, and probably a handful of things I don't even know that I need right now.
Keisha: Yeah. I want a training too from Scott.
Ultimately, I think, though, beyond the protective measures, I really am interested in how tech like this can enable us to do good, not just to protect against the bad raiders, but to think about how we are being changed as we become accustomed to using these tools. Because it's not just the data and it's not just the systems, it's also us. And whether we can wrangle our lizard brains to learn to see the good and to move toward it.
Learning from the Trees
Annanda: You know, as you say that, Keisha, it makes me think of biomimicry. In particular, the redwood trees.
Keisha: I love them. Yes.
Annanda: Their roots form a network, and they're actually holding one another up through their roots. And so if there is a tree that is a little lower on nutrients or needs something, they'll all come together and, like, support and lift up this tree, based off the data, the information that they have of what one another needs.
And as you're talking, I'm like, yeah, you could design this entire ecosystem that's created to mimic what life has shown us already. I think that would be so fascinating.
Keisha: It was the cleanest air I've ever breathed walking through Muir Woods.
Annanda: Wait, wait, wait, where's Muir Woods located? What state? Yeah, in Cali. Okay, as long as we're clear about that. Proceed. Proceed.
Keisha: I absolutely loved it. And to have the humility, as the masters and crafters of this technology, to learn from what life has already shown us works. Love it.
Scott: I’m very excited about where we are now and where that takes us.
I'm also very hopeful that we are not going to be the humans in the future that we are now, that we actually are gonna grow in ways that connect us like we've never been connected before.
We should always look at scary things for a minute and make sure we can manage those risks. But all of the powerful tools, even the ones we complain a lot about, have an upside: a way of allowing us to do more with less, to live longer, to have healthier lives, to have more fun, to share with people farther away.
All of those things have been amazing. Celebrating what’s been accomplished and what we can do next is amazing and wonderful.
SOUND: Reflective music
Keisha: Scott opened up a theme for me when he talked about how tech development sometimes operates colonially, grabbing land for the company without regard for the wider ecosystem of other people, other groups, and the society that we're all a part of. And so many of the concerns that we raised in this episode come back to that question of colonial behavior and the question of consent.
From the worlds of gender and sexual justice, I learned the principle of affirmative consent. Affirmative consent is consent that's explicit, free from compulsion, and offered in real time versus once for all time.
Even though this model is increasingly part of legal standards around sexuality and it's shaping some of the data and privacy models in technology, our social systems aren't yet built around this understanding. Humans aren't binary. We don't always know what we want until we're doing it. So it's a practice to be able to recognize what we want in the moment and then take the liberty to say so and have that heard and respected.
A few years ago, Alondra Nelson, a professor at the Institute for Advanced Study in New Jersey, talked to the New Yorker about the ambivalence that's part of simply being human. And the law doesn't account for that ambivalence. Sometimes the law tries to flatten the realities that we actually live in.
If I go to a website and get a cookie pop-up, I actually have no idea whether I want only the "necessary" parts of the page to work. I don't know if I want a long-term relationship with the company behind that page. How do we craft a world that doesn't force people to pretend to be more binary and self-aware than we actually are, but respects our decisions as they become more clear to us?
Dr. Yvette Abrahams works on gender, climate, and Indigenous knowledge at the University of Cape Town. In the essay “Thank You for Making Me Strong,” she says, “Our task here on Earth is to live well together, bearing in mind the action of each has responsibilities for the well being of the whole. No doubt we human beings are so proud that we would live alone and independent if we could, but we cannot.”
In relationships, as in tech, grasping for individual benefit, whatever the cost to the group, puts us all at risk. So these are the ethics that recognize we already live in an ecology, and we can only have the health and prosperity we need by acting like it: across the full scope of our personal and professional lives, in cookies and consent, and in challenging colonial impulses wherever they surface.
Because in the end, it's not about creating an ecological reality. It's about dropping the illusion that we've been independent all along.
SOUND: Bells and holiday vibes!
Keisha: Hey y’all: Our next show is a special holiday episode… and we need your help.
Annanda: We want to know, when it came time for the holidays, what were some nostalgic memories you’ve had around tech? Was it gifts? A tradition? A hilarious or heart warming story?
Keisha: You can reach us at @moralrepairpodcast on Instagram, @moralrepair on Twitter/X, and at moralrepairpodcast@gmail.com
CREDITS SEGMENT
Annanda: I’m Annanda Barclay.
Keisha: And I’m Keisha McKenzie.
Annanda: The Moral Repair: A Black Exploration of Tech team is us, Ari Daniel, Emmanuel Desarme, Courtney Fleurantin, Rye Dorsey, and Genevieve Sponsler. The executive producer of PRX Productions is Jocelyn Gonzales.
Keisha: This podcast is part of the Big Questions Project at PRX Productions, which is supported by the John Templeton Foundation.
[PRX AUDIO SIGNATURE]
Show Notes
For the next episode: tell us about your nostalgic tech memories! Find us at @moralrepairpodcast on instagram, @moralrepair on Twitter/X, or moralrepairpodcast at gmail dot com
How did Cambridge Analytica use 50M people’s Facebook data in 2016?
California bill makes it easier to delete online personal data (LA Times)
“Churches target new members, with help from Big Data” (Wall Street Journal)
In the film Enemy of the State, characters uncover all the ways they’re being tracked—it’s a lot
Digital safety for people seeking reproductive care (Digital Defense Fund)
“Thieves Use Tech Devices to Scan Cars Before Breaking Into Them” NBC Bay Area.
Scott has recommended a few books for our audience:
Re. ChatGPT-4: Impromptu: Amplifying our Humanity Through AI by Reid Hoffman.
Prof. Yvette Abrahams on social ecology ethics in “Thank You for Making Me Strong”
Holiday Special: Tech Nostalgia
Season 1 | Episode 7
-
Coming soon
An Answer to Big Tech? Tech at a Human Scale
Season 1 | Episode 8
-
Keisha: Hey Annanda.
Annanda: Hey Keisha.
Keisha: Last time Netflix raised their prices I dumped them.
SOUND: Decelerating Netflix logo sound
Annanda: I have my same subscription from when I was in college, so I'm grandfathered into one of the oldest price points possible. Maybe it's raised like a dollar on me.
Keisha: Wow.
Yeah they’re up to something ridiculous for a subscription that doesn’t have ads on it. You can get one with ads for $7 plus the digital tax… but if you want it ad-free that’s like $16-23 a month.
Annanda: Oh yeah, definitely don't pay that. I don’t pay that with Netflix.
SOUND: Light casual background music
Keisha: Absolutely, ‘cause it’s not just them. It’s also Hulu and Disney and Paramount and Apple TV… There are so many of these entertainment services you might as well bundle them all together and call it “cable” ‘cause it costs as much!
And now Amazon’s ringing in the new year by changing Prime Video. Even those subscribers will have to pay more for an ad-free service.
Annanda: See, my Netflix and my Amazon Prime are both old, because Amazon came out really wanting to sell books, right? And so that became a cheaper place for me to get all my books in college. But also, I do what so many of us do, right?
I have one family member or friend that's close to family, they cover one thing, and together we make a medley of access to media, which I know they're trying to bust up, but I'm like, try it. Don't think your content is so addicting or so necessary that I won't say goodbye off the principle of it.
Keisha: That's right. They push you hard enough and you're like, “Okay, thank you for the clarity. Let me move on.”
And we’re already paying for the services they offer. Our relationship with these Big Tech companies is basically “pay more and more, and it’ll never be enough.”
If you're buying from a platform that not only knows what books you read but also what you bought at midnight five years ago, and they've just bought your doctor's office…you shouldn't be paying more for surveillance.
Annanda: No! You definitely shouldn’t be paying more for surveillance. Also, this is why I want to bust it up. I don’t want so many companies or corporations to have so much knowledge on me.
Keisha: I was reading the Amazon Ad blog in November. They told advertisers they expect to reach 115 million viewers a month by up-charging for ad-free. And that’s just in the U.S.!
They want to make more money off the streamers that they know are also shoppers. So Big Tech is a whole ecosystem of services based on tracking users from transaction to transaction, click to click, action to action.
They could just as easily have built that system around people’s actual needs. We want to be connected. We want to enjoy great stories. Maybe, yes, buy some stuff from time to time!
But it’s like data scientist Scott Hendrickson put it: Big Tech development is a “land grab.” And in that exploitative form, it’s driving degraded outcomes for consumers: privacy breaches, suspicion of businesses, and suspicion of each other. We’re means to an end for them.
Annanda: It's definitely come to “land grab.” There’s so much technology out there.
People can basically recover a majority of your genome just from traces of your breath, or, like, where you have sat. So I’m like, oh, soon this will be something commercialized. That’s been a thing that I’ve been realizing: this commodification goes so much beyond shopping.
SOUND: Start fading in Title music
Annanda: As I look at what new technologies are coming out, what new horrors and gifts are on the horizon, I'm always following the money. Once I see one thing, I look for the equal and opposite reaction within the market to it.
Keisha: In terms of equal and opposite reactions to Big Tech, what about small tech?
SOUND: Title music feature pause
Keisha: I'm Keisha McKenzie.
Annanda: I’m Annanda Barclay.
Keisha: This is Moral Repair, a Black Exploration of Tech, a show where we explore the social and moral impacts of technology…
Annanda: and share wisdom from Black culture for healing what tech has broken.
Today on Moral Repair: can small tech make the internet good again?
SEGMENT B: DESCRIBING THE SYSTEM
Annanda: Keisha, I have been so focused on Big Tech that I have not taken the time to look at small tech.
I've been focusing on how I support the mental and spiritual health of those in Big Tech companies who are feeling like they're compromising their morals, their ethics, their values. But I think what you're talking about is so important. I am in the Bay Area, so, *laughs* I’m just inundated.
Keisha: | Big Tech’s in the air you’re breathing over there.
Actually I’ve been hoping Big Tech workers would catch the labor rights wave that’s live right now. I’ve been rooting for everybody striking. | Amazon staff. UPS drivers. Grad students. Nurses and doctors. Postal workers. Writers, actors, everybody.
Annanda: I've definitely been rooting and very eager to see | where their civil disobedience lands because these are | precedent setting strikes and responses that are happening right now.
Keisha: Organized labor everywhere: get the working conditions you deserve!
Digital workers shaping the AI revolution in developing countries around the world… don’t have the conditions that they deserve.
In October, WIRED reported that data companies have been hiring people from the Global South to do piecemeal data work like tagging images or sorting the data that trains AI chatbots. And these folks earn pennies.
Annanda: Oh this sounds like good old fashioned American capitalism.
Keisha: Yeah. The poorhouse era, so not even American, that’s European too.
And in Kenya, Meta, which runs Facebook, has been hiring local Kenyans to do content moderation for the site. 184 content moderators recently sued Meta over inconsistent pay, union-busting, privacy violations, and work conditions. Negotiations between the company and the contractors just failed again, though, and Meta doesn’t seem motivated to make those conditions humane. They’re big enough as a global corporation to be able to outlast any individual workers and keep trading in the country—even if people suffer for it.
Jeffrey Sachs talked to the UN about this pattern a couple years ago. He’s an economist at Columbia’s Center for Sustainable Development.
Annanda: I love Jeffrey Sachs. I've been following his work since college.
Keisha: In a speech at the UN Food Systems Summit, he described how colonial exploitation and private sector extraction have wrecked countries like the Democratic Republic of Congo since the 1880s.
Clip: 00:03-00:12; 00:16-50: “What we’ve been hearing is how the system actually works right now… It’s based on large multinational companies, |private profits, | a very, very low measure of international transfers to help poor people, sometimes none at all, | extreme irresponsibility of powerful countries with regard to the environment and it’s based on a radical denial of rights of poor people.”
Keisha: “We have a system,” he said. “We need a new one.”
If we use tech to build systems that respect people's dignity, privacy and autonomy, what could that look like in practice?
SOUND: Clicky, curious background music
Someone who’s trying to create a new system is Aral Balkan (Uh-RAHL Ball-KAHN) of the Small Tech Foundation. He’s a designer and developer based in Ireland, and one-half of the first tech outfit I ever heard of that wants to catalyze a different way for ordinary people to use and own technology—at a much smaller scale than Big Tech allows.
Aral defines small tech as web sites and services that are designed for individual people. Corporations and governments don’t own and control our data, we do. Small tech protects basic human rights like privacy and so breaks away from the Big Tech model of watching the people who use it in order to trade their information. It’s funded differently than Big Tech to make that focus possible. We’ll get to what that can look like, but first — here’s the system we have right now.
Aral: Most people are | familiar with | big tech in terms of everything | that they know as | mainstream technology. |
| it's Google, Facebook, | Snapchat, | TikTok.
Follow the Money
Keisha: Companies like these—the army of corporations that make tech hardware, software, and services… they all say their mission is to innovate, but they don’t innovate when they don’t make money, and they don’t make money if they don’t grow.
So they’re motivated to grow and grow and grow and grow. And that’s how a scrappy outfit that starts in someone’s garage or as a pet project in a lab can become a marauding beast, like an orc run amok.
Aral: | it really comes down to following the money.
| How much do you pay for Google? How much do you pay for Facebook? | you don't pay anything, right? | and yet these are trillion dollar corporations. So how are they making money?
| you're too valuable for them to sell. They're actually renting access to you. | and | in a way they own you as well. They don't own your physical body, but their goal is to own everything else about you that makes you who you are. | that's basically the business model.
That's what we call surveillance capitalism. That's what | I call people farming. | it's just a new form of colonialism, | a new form of imperialism.
Annanda: Surveillance Capitalism, | by Shoshana Zuboff, was one of the first | books that really grabbed my attention about what is going on | within the tech industry as it relates to surveillance algorithms and AI.
Here she defines the term while speaking at Suffolk University:
SOUND: Soundbite click
1:23-1:49—”Surveillance capitalism has claimed something from outside the market and brought it into the market to sell and buy. But in this case, the territory that it has claimed is private human experience… to be sold and purchased as behavioral data.”
SOUND: Soundbite click
Annanda: | The phrase people farming is apt. | through algorithms and different choice points | that we are given on the internet, | our behavior is herded.
| we're not necessarily in control of how we surf the internet as freely as we think. We're given a series of choices | and those choices | are curated.
I do think that it’s a subtle form of imperialism | and it’s quite concerning.
Keisha: It reminds me of | studies on | how they present pricing options in stores or online. | the company | will really have one choice they want you to pick and then they'll offer you something that's obviously worse and then something that's marginally better and you "naturally" choose the one they wanted you to choose all along.
Annanda: Exactly. | imagine that with all of our decisions. | I don't think we realize | how out of control we are, | when it comes to | our own | choice and agency.
Keisha: We wanted Aral’s perspective on how “people farming” is playing out around the world. In 2017 he wrote, “People farmers are reaping the rewards of their violations into our lives to the tune of tens of billions in revenue every year… [and] farming us for every gram of insight they can extract.”
So what about today? What are these companies strategizing about now?
Aral: If you talk to someone from Silicon Valley about what they're most excited about these days, you'll probably get an answer along the lines of the next 5 billion, | which sounds like an amazing concept. | What does that mean? | People in parts of the world that they probably haven't even visited, | they want | all of these people to be their | next set of users, that they extract value from and exploit, | for their profit motive | and whatever other motives they might have. Because once you become a trillion dollar entity, and once you have people who are billionaires, it's not just about the money anymore, it's about power.
| So when you hear someone from Silicon Valley salivating about the next 5 billion, | colonialism | is exactly what it is. | It's not necessarily limited in geographical scope anymore. | it's about a very small group of | people who have a lot of wealth, | who are basically exerting control over, | everyone else | over what they get to see, | what they get to hear, | and, | having the means to manipulate their behavior. |
| that's | the next step of colonialism, | and different people,| are of course affected by it differently. | the most marginalized are the most affected by it right now. | but to a degree we all are because if you compare yourself to a billionaire, well, | we're talking about a very exclusive club.
Caste + Capital Bias: “A Very Exclusive Club”
Keisha: A concept that might be helpful here is caste, which Isabel Wilkerson defines as the basis on which people are given or denied social privileges, respect, resources, and fundamental considerations like consent and autonomy.
And the club is indeed very exclusive. As the United States emerged as a nation, its dominant caste also formed around race and class: Whiteness, wealth, and the absence of both.
Wilkerson’s book, the award-winning Caste: The Origins of Our Discontents, focuses on how that caste system systematically violated Indigenous and Black people across this hemisphere and other subordinated people around the world.
Marjorie Kelly of the Democracy Collaborative adds another layer to the story when she describes the role of wealth.
In Wealth Supremacy: How the Extractive Economy and Biased Rules of Capitalism Drive Today’s Crises, Kelly explores just how much U.S. culture values wealth and people who own wealth, and what this “capital bias” does to all of us. Here she’s explaining it to Laura Flanders:
CLIP: 00:23-00:27: https://www.youtube.com/watch?v=mH2eGIHXH5M
“There’s an extraction going on and it’s sapping the resilience of our society.”
People consciously and unconsciously use wealth as a standard for judging others and determining the kind of lives they can have. More than that, capital bias shapes how businesses ensure that people who already have wealth benefit.
We first see caste and capital bias in this hemisphere in America’s broken colonial treaties and plantation systems. And these influences play out today across the modern tech sector.
Changing that is gonna take some work.
SOUND: Fade in and out the intro and first verse of Billie Holiday’s “God Bless the Child.”
Key vocal: “Them that’s got shall have; them that’s not shall lose…”
Fade out as Aral begins to talk again.
Stories of Privilege and Access
Aral: Our aim is to try and not replicate this big tech process. | I am a, | white passing guy with two Turkish parents, |who now has French citizenship and | who has enjoyed | a certain level of privilege. | I have never gone to bed hungry. | That puts me in a very small percentage of people. | As someone with this sort of a background, | the only way I know of not perpetuating colonialism | is to try and build systems that | don't require me to be at the center. | if these systems become more successful, I don't keep growing and getting wealthier and getting more powerful. |
Keisha: This principle is at the heart of the small tech approach.
Aral: Small Technology Foundation is just the two of us. | we don't say we know what's best for everyone else that's out there. We don't, we can't.
| if there's one thing about design, it's that you can only ever best design for yourself. | if you're saying, | we're going to design this solution for this country on the other side of the world. No, you're not. You're still going to be designing with your own biases, with your own experiences |. You're just going to expect those people to use whatever product you've created and | be used by that product that you've created because it's going to be in your interests.
| we're trying to build technology in ways where as it scales, we don't scale. | How do we do this? | Build things in a way where it's | free and open source so that anyone can take it and modify it,| where people can customize it for their needs. | And you don't need my permission. You don't need me involved. | how do we | encourage other groups to be sustainable with these technologies, without it necessarily feeding back into us, without us getting bigger |?
| we're talking about decentralization. | communities are empowered to build their own technologies and to use their own technologies, | and not be used by some center somewhere. |
Annanda: Aral, | I'm wondering, is there a personal story for you there? Because you do not often see white-passing men such as yourself talk about decoloniality | and act on it, create organizations, foundations to actually act on it. So I'm curious as to what | might empower some of our listeners to maybe see themselves in your reflection. |
Aral: | I basically grew up in Malaysia until I was about 13. | We weren’t rich, but | my parents’ | relatives included the | ambassador from Turkey, | so we were in those circles.
| Very early on I was subjected to | extreme wealth, | not in a way that was in any way critical. Don’t get me wrong. I was just in the middle of it | going, why am I not that rich? | Why don’t I have the Lamborghini that drives me | to school?
SOUND: Fade in something like drum intro and first verse for Arkadas through narration; no vocals.
| Coming back to Turkey | to a very different culture when I was 13, | maybe that slap in the face was the first step | in | reflecting | on these things. | It might have been studying critical media theory for way too long at university. Chomsky will do that to you. | It was just step by step. |
Keisha: For decades, social change organizers have taught that the personal is political. Our individual choices and perspectives are so deeply shaped by the histories and social politics of the cultures around us.
Annanda and I get to talk about massive cultural themes on this show… the tech sector’s hypergrowth, capital bias, and how to build for community well-being. It’s also fun to see how these big influences come together in our individual stories, framing our orientation to technology.
Aral: When you grow up with privilege, you get these opportunities where | you’re shown how things are not right. You're shown the inequalities. | and then I guess you have a choice to make, | am I okay with this because it benefits me? Am I okay with perpetuating that? Or should this inequality not exist? | whether that benefits me | or not. | Do you | go, okay, well, screw it, | this is good for me. | or | well, no, this is wrong and this shouldn't be.
| At the same time, | being Turkish, for example, I also got to see other things like I’m white-passing and that’s fine until I get to a passport control and then they see where I was born and then it’s kind of like, you trying to fool us? | Are you a terrorist? |
This system that we have, this unequal, unjust system that we have doesn’t benefit anyone. It doesn’t even benefit these billionaires.
| Like we’re on the verge of climate catastrophe | our habitat being unlivable | The world is going to survive. Homo sapiens sapiens, maybe not so much. | I get a lot of people saying, oh | this is | about charity. No, none of it is! | If anything, it might be enlightened self interest, but that’s it.
| The people benefiting the most from this horribly corrupt system that we have, | my goal is not to save those people. But | we’re all in the same boat here. | It’s just that | some people are going to | have the life rafts maybe, and the rest of us are | going to be the first to go.
SOUND: Evocative space-like background sound
Keisha: Annanda, did you ever watch the film Elysium? |
Annanda: I have not.
Keisha: Rich people escape a wrecked Earth for a luxury satellite that they’ve built in orbit. But the rest of us have to stay here on the surface, in poverty and catastrophe.
Of course it’s fiction, but | when NASA shut down the shuttle program in 2011, that left a development vacuum that Richard Branson, Elon Musk, and aerospace companies like Boeing have been really happy to fill. | They’re talking not just about supporting | the International Space Station but also about colonizing Mars and seeding the galaxy.
Annanda: I’m not surprised. | The money, intellectual property |, and market dominance | as a result of the innovative technology to get to space, is a natural, normal next step when it comes to capitalism. | it's the same old same old… just in space.
Keisha: The land grab strikes again.
Annanda: What this dilemma reminds me of is Gil Scott-Heron’s song “Whitey on The Moon”. | you just have to replace the word “moon” with “Mars”. | My dad grew up in the west and south sides of Chicago. My mom grew up in Newark, NJ. Even though both lived in completely different parts of the country, the sentiment in their respective Black communities was the same. |
SOUND: Fade in and out 00:24-34 from the “Whitey on the Moon” audio
| There’s a 60+ year history of Black and Brown communities critiquing the US government for prioritizing | space races over the dignity of its most vulnerable within society. It’s “most vulnerable” not by choice. It’s “most vulnerable” by design.
Keisha: | The | space stuff is | expensive escapism. |
But it’s also what | happens | when | a dominant caste is disconnected from consequences... and it doesn’t have much of a commitment to making capital and tech systems | less toxic or | more beneficial to people who don’t have those resources.
We’ve talked about how tech shapes us as much as we shape it. I also think how we come into the world of tech can influence how we see and use it.
Aral: I was seven years old, my dad got me a computer, and a BASIC manual. Again, privilege, | cause I'm | quite old, so back then it wasn't a common thing. | he just said, | “Play with it. You can't break it.” And I | started programming.
Keisha: BASIC is a programming language that ran on home computers through the 1970s and ’80s, and Microsoft helped make it popular by embedding it in early PCs. As someone whose family didn’t get any kind of PC until the 1990s, I can’t imagine having had access to this tech so early on.
Aral: | when I was | in the States doing my master's degree, I got a job | making money with technology, | consulting with Flash. | very quickly I was being offered six figures to work in this area. | but I never worked at a Google or a Facebook | And I think there was always something, again, that pushed me away.
| the more I understood, the more I saw, the more uncomfortable I was with it. | I don't want to build something that tracks what everybody does | so I can manipulate them.
I don't want to do this. Oh, you don't? Well then, then you don't want to create a startup, because that's what all startups are | trying to do. | then you won't get any venture capital. | well, I guess I'm okay with not getting venture capital then, if that's what it involves—this is a really toxic system.
SOUND: grim, gritty electronica, downtempo through segment, behind our voices
Keisha: Aral | saw a system that he was supposed to be attracted to. And then once he got close to it, he began to feel repelled.
While he’s choosing to consult around the system |, several decade-long trends are unfolding in tech and society.
Computer hardware improves so much, so fast, that tech users can access far more personal storage. | We can walk around with mp3 players and | stream | films online. But within that same wonderful system, musicians and artists aren’t getting paid properly, and they can’t control where their work goes.
At the same time, government agencies are mass surveilling the public in the name of preventing the next national security disaster. And tech companies cooperate by giving law enforcement agencies on-demand access to people’s information.
It’s in 2013 that we start learning just how far-reaching those surveillance programs are. The news alienates a lot of folks who, like Aral, got into tech to make life better for everyone.
Annanda: | This is where those who have wealth and access can really play a crucial role to disrupt monolithic business practices and provide products with a broader range for the public.
I’m mindful that the social and emotional cost to make that happen is quite high | and | our society undervalues emotions. But the reality is, emotions shape our individual and collective identities and cultures. | the core emotion of fear that clearly exists within the tech industry | is a serious roadblock when it comes to catalyzing change.
I’m not just talking about fear through disenchantment and overwhelm, for folks like Aral. I’m also talking about fear of losing market dominance, or losing access to surveil populations to mine data for business or government purposes.
An American intelligence contractor, | Edward Snowden, leaked the news that at least five governments were surveilling people’s phone data. The US charged him with espionage and theft, and he fled, eventually receiving asylum in Russia.
The program Snowden leaked was challenged in court, but 10 years later governments all around the world can still access our phone and internet records. And there isn’t much accountability for it.
Keisha: | It's sobering, really, to think that it was so | scandalous at the time | but have we become used to those patterns? | Accustomed to the toxicity?
Aral: | The Snowden revelations were | the final drop, | where it's like, my goodness, the extent of this thing. | people always ask |, "Well, what ramifications are there? | I'm such a boring person. Nobody wants to know about me."
Trust me. It’s not just about you. | You’re a statistic to them. But also that's very much against your interests that you're a statistic to someone.
Countering Dehumanizing Systems
Keisha: Toni Morrison gave a lecture about people becoming statistics at Portland State University in 1975. She starts | by describing a ledger of commodities traded in the US—food, solvents, and trafficked Africans.
Listen:
06:59-8:00: “Right after pitch and rice but before tar and turpentine, there is listed the human being. The rice is measured by pounds and the tar and the turpentine is measured by the barrel weight. Now there was no way for the book … to measure by pound or tonnage or barrel weight the human beings. They used only headcount.”
Like Isabel Wilkerson and Marjorie Kelly, whom we talked about earlier, Morrison names the root of this dehumanization. It’s not racism. It’s “greed and the struggle for power.” Racism and disparities are consequences of the caste system that allows a few billionaires to target their next 5 billion marks.
Morrison also says “You don’t waste your energy fighting the fever; you must only fight the disease.”
“Fighting the disease” includes thinking about how issues intersect and how policies like mass surveillance could affect others.
Aral: if you happen to be gay in a country where it is illegal for you to be gay, | then maybe the ramifications are that you're going to be dead or incarcerated, right? | as a middle class | white person | in some | Western country| it might be that your insurance premiums go up and then you're like, why do my insurance premiums go up? And they're like | your smart fridge told us what you're eating.
| it's at those points where some people get it: “Oh, I'll have to pay more money for something.” And it's sad | that's when they pay, start paying attention. Because | people are suffering and dying.
Annanda: Thinking about “fighting the disease” is a vital start, but I worry that it can all too easily become a comorbidity of the disease, | a norm in liberal spaces. At this point, it is a rare thing for me to give any “you know whats” about a damn book study, panel discussion, or documentary watch party, explaining to well-intentioned people the reality we’ve been swimming in for generations. I’m far more interested in and willing to collaborate with folks who are rolling up their sleeves and doing active work on how they are in relationship with other people, and how those relationships model with clarity, depth, and complexity | undoing historic wrongs.
Keisha: Yeah, Toni Morrison’s work is all about building new futures. She’s literally asking creators to stop shadowboxing! And to make something real, rooted, grounded. Because the current system has its own inertia keeping it in place. It takes energy to move in new directions.
Aral: | the companies | whose business model it is to track your every move and then use it against you | it's not that they're going to go | out of business tomorrow.|
| Hugo Boss | created the uniforms for the Nazis. They're still in business. | VW | built the cars. They're still around. | We seem to have a very hard time getting rid of large corporations, even when they've done the worst things in the world. So, | I don't believe that | we can destroy Google or destroy Facebook |
| but I don't think that's what we should be trying to do either. | They have an unethical business model, so as far as I'm concerned, they don't deserve to exist.
| There's a book called Hooked, | that was a bestseller| in the U. S. | How to Build Addictive User Experiences. You'd think it was a warning. No, | it actually is an instruction manual. | So this is the problem, | our success criteria are wrong. We encourage psychopathic behavior. |
| We don't have healthy cells that are, | doing something good for the organism. | We have cells where we say, you need to grow and grow and grow | to get as large as you can. We just described a tumor. |
| but my concern is how do we build better alternatives. How do we build things differently? | Whenever they do something where I can raise some awareness, | yes, of course. | But I wish I could spend zero time on them.
Annanda: So what’s the antidote? |
Keisha: More about that after the break.
SEGMENT C: SOLUTIONS
Keisha: Welcome back to Moral Repair. We’ve been talking to Aral Balkan, designer, developer, and co-founder | of the Small Tech Foundation. | If the small tech approach could be a real alternative to Big Tech, what questions would new founders need to ask? And how might this approach need to be funded?
Funding the Sector
Aral: if the company hasn't been formed yet, | you can sit down with the people who are going to form it and say, okay, let's talk about your business model. | it all comes from the business model and | funding model. If they want to be a startup in the Silicon Valley sense of the word, | and if they're looking for venture capital,| the only thing I can tell them is, | you have to either out-Google Google, or you need to build something that is useful for a Google or a Facebook and then you will get bought by them, right? But you have to | build tracking technologies.| because that's the game you're playing.
| A Silicon Valley startup is | a temporary business. Its goal is to either fail fast or | grow exponentially, right? We love that whole exponential | growth with finite resources. That seems to be what we're basing our society on. And if you think about it for more than about 30 seconds, you | realize | that's just a euphemism for extinction. |
Aral: At the beginning, it's all about the product. | “Hey we want to let you guys talk to each other.” | What a lot of us don't know is they've already taken venture capital!
Keisha: Aral’s laying it all out here.
Aral: They've already sold the company. | and they’ve already sold a business model to their investors. | we're going to build this thing. | Why are we making it free?
Because we want to get millions and millions and millions of people. | That’s what we’re selling in our exit: those people and the potential to exploit them. | So that's the | Silicon Valley model. |
Can we talk about different types of funding? | then we have an opportunity here. | are you going to be okay if you build | a cooperative? | are you going to be okay with not becoming a billion dollar corporation? | with not selling this company |, but | investing in it being a long term sustainable business?
| if those are your goals, | then you can't take | venture capital. |
Keisha: So VC/venture capital creates perverse incentives. What can we do instead?
Aral: | here in the EU, where you know we have more social funding for things, | if I am just coming into technology, and I want to create | something, | Silicon Valley's over there with their venture capitalists going | “Here's 5 million | for you.” | “I just invested in 10 startups. I don't care about nine of them failing. | I want that last 5 million to be a billion dollar corporation so I can get my money back and more.”
| Do we have commons-based funding? | are we paying for technology with our taxes? | we're not. | “You want to build technology, but you don't want to be a douchebag? | Here's your funding.” | That's what we really need.
| how it's funded matters.
Investing in the Commons
Keisha: I really love this concept of commons-based funding, and I want to think about it broadly.
SOUND: Audio clip
A lot of my parents’ friends in the UK were also first-generation immigrants and part of the Windrush Generation—that’s people | born in former British colonies who moved to England and helped rebuild the country after World War 2.
Because of the hostility that they faced when they got there, many of them struggled to get bank loans and mortgages. So they leaned on each other. They shared resources in a peer-to-peer circle lending system that didn’t charge anyone interest, called susu, or partner. And that’s how some of them were able to buy houses, to fund their kids’ education—and, yeah, start business ventures.
So today, could groups of tech folks… especially those who don’t have access to conventional VC support… use susu circles to fund each other’s small tech projects?
At a different scale, I’m also curious about what Arlan Hamilton has been doing at Backstage Capital: it’s | focused on founders who are women, people of color or queer, | not just in traditional tech, and aims for pretty significant profits |. They say they’ve worked with over 200 companies in the last 5 years, at the $100K level on average.
If someone wants to send me $100K to go be great, I won’t say no. And also there are also lots of cool things I could imagine doing with even less funding than that.
Annanda: What Backstage Capital is doing is great. This is definitely the thing we need to see more of, and yet $100k in the tech market— | it’s not enough funding to seriously compete with White and Asian founders. | VCs should choose to invest in all founders the way they do White and Asian men. | my frustration about this is how basic an expectation this actually is. It reminds me of this idiom said in the South, “and then you want to know why?” Like, if you see | the gaps that we see | $100,000 isn’t even a drop in the bucket | to be | competitive in today’s tech market. | And what Arlan Hamilton has been doing is necessary. It needs… six more zeros behind it.
Keisha: Absolutely, the | investment playing field has not been level. I’m less interested in answering Big Tech at their scale though—because it’s been destructive—and more interested in experiments in different ways to operate. That’s what small tech sounds like to me.
Annanda: | If | we have a listener that's like, Hey, I want to do what Aral is doing, | what steps would they need to take? | actually this opens the door for a lot more people who | don't really have access to VC capital anyway. That | makes space for the global majority.
Aral: we need to support these people from the commons, from our taxes, | “you want to build technology for the common good, | then here's 5 million euros for you.” |
“And | you don't have to give us a detailed, | list of every feature that you want to add to it. We want you to experiment. | We want you to fail fast. We want you to do all those things that actually work. But | you're not going to have that toxic business model.
In fact, you won't be able to sell this organization that you're building. | Cause we're funding it from the commons. It's going to stay in the commons.” | if we did | just that very simple thing, | I think we would be awash with alternatives.
Keisha: I grew up hearing about the commons as a way to talk about public parks anyone can use, however much or little money and social status they have.
Investing in the commons, then, would be kind of like any one of the 50 million trees the Kenyan biologist and conservationist Wangari Maathai [One GAH-ri Mah-Tah-EYE] inspired women to plant in Kenya over fifty years.
Commons investments, like those planted trees, don’t just yield one effect per cause. They produce generations of interacting effects that spread across a system. As Maathai explained in her 2004 Nobel Lecture, they conserve resources, add new value, and reinvigorate the whole landscape:
Clip: [01:28-01:43] from Wangari Maathai’s Nobel Lecture
“Tree planting is simple, attainable, and guarantees quick successful results within a reasonable amount of time. These are all important to sustain interest and commitment.”
Keisha: Small tech aims to be like one of those trees.
Keisha: You describe Small Tech as everyday tools for everyday people designed to increase human welfare. What kind of tools are you talking about?
Aral: What if each one of us had our own place on the web that we owned and controlled?
SOUND: Fade in worldbeat or chill lofi sound.
Where we could be public if we wanted to be, but where we could also be private if we wanted to be.
I mean private in the correct sense of the term, not as in “Facebook private.” Facebook's always in the middle. Being the entity in the middle is their business model, so I can't talk to you directly. I have to tell Mark first, and then Mark has to relay it to you. But Mark also takes notes and goes, “Oh, who are these people? How can I exploit them?”
If I had my own place on the web, let's say aral.smallweb.org, and you had yours, keisha.smallweb.org, and we wanted to talk to each other privately, I could go directly to your place: “Here's a message for you, Keisha.” End-to-end encrypted. Only the two of us can see that, and you'd reply back to me.
But if I want to be public, like an old-school website, I could make a message public, and people could follow me and see it, other people who are on this system. That's what I mean when I talk about decentralization. We each have our own place that we own and control.
We tested this out with the city of Ghent to see: what if a municipality sponsored this for all of its citizens? What if it was a human right that you had a place on the web?
Keisha: What would they need to know to manage that?
Aral: That's the thing. Nothing at all. Unless people can set it up and manage it with no technical knowledge whatsoever, we don't win. Otherwise, we will get a handful of people who are ideologically aligned with what we're trying to do, who have technical knowledge, who are probably quite privileged.
It needs to be that no technical knowledge is required, which is not easy. But it's also not impossible.
The web is a client-server platform. It was built without any of the safeties that I'm trying to build into the small web: not scaling, and being designed for individuals, so that we don't have the concept of users; we only have the concept of a person, and we design for a single person.
It's single people together that form communities, but they have to have personhood at the core, or else we end up violating those human rights.
SOUND: Start closing reflection music.
Breaking down Moral Distress, Injury + Repair
Annanda: Studies show that the experience of emotional or spiritual pain activates the same part of the brain as physical pain. The cultural norms of profit over people, of gatekeeping diverse people groups from being active competitors in the industry, are actually forms of legitimized violence.
For those of us who work in the field of moral injury, helping people repair those injuries, a huge thing that we talk about is that for somebody to even get to a place of moral injury, they've first had a sense of moral distress, which looks like normal stressors; moral injury itself actually often shows up like PTSD.
Aral is actually a great story of moral resilience, in the sense that he had this moral distress in seeing the beliefs and values of Big Tech function in a way that totally compromised his morals, and he said, “This is not for me. How do I shift and change it?”
The amount of moral injury within STEM and tech is pretty high. Folks are constantly negotiating their values in order to participate in the space. And this has serious impact: higher rates of anxiety, higher rates of depression, a lack of self-worth and sense of self… because you’re being used; your intelligence, your work is being used to support harmful things in society that you don’t agree with, and not to uplift society.
SOUND: Fade in title music
And in order to rebuild that, it's not as if we erase the past or act like this injury didn't happen, right? We have to work with the emotional, existential, and spiritual scars that occur, and deal with the complex messiness of the world, which takes a lot of work.
Keisha: Thank you for that. I think I’m—I’m moved by that theme of recovery. In Toni Morrison’s and Wangari Maathai’s work it’s a kind of creative recovery, making ourselves strong in the face of dehumanization: acknowledging that it is dehumanizing, but responding in a creative way, not just pushing against commodifying systems that wound our sense of self, degrade us, or conflict with our values, but actively leaning into community, investing in common spaces, and creating things of value for all people.
These are Morrison’s and Maathai’s signposts for sustainable futures. And that spirit is rooted in our cultural history, in Black wisdom. It’s not the snake-eat-tail consumption that passes for business success. It’s a different logic entirely, and it makes provision for generations to come.
CREDITS
Keisha: Thank you for joining this season of Moral Repair: A Black Exploration of Tech. We return for a new season, Spring 2024.
Annanda: I’m Annanda Barclay
Keisha: And I’m Keisha McKenzie.
Annanda: The Moral Repair: A Black Exploration of Tech team is us, Ari Daniel, Emmanuel Desarme, Courtney Fleurantin, Rye Dorsey, and Genevieve Sponsler. The executive producer of PRX Productions is Jocelyn Gonzales.
Keisha: This podcast is part of the Big Questions Project at PRX Productions, which is supported by the John Templeton Foundation.
SOUND: PRX Sound Signature
###
SHOW NOTES
Talk back to us at @moralrepairpodcast on Instagram, @moralrepair on Twitter/X, or by email: moralrepairpodcast at gmail dot com
Follow Aral Balkan’s work at the Small Tech Foundation; read his essay, “We didn’t lose control. It was stolen” and listen to his talk at NextM Warsaw, December 2019
ORIGIN is showing nationally in selected theaters.
Listen to Marjorie’s conversation with Laura Flanders (December 2023) and talk at the University of Colorado (September 2023)
Read the transcript or listen to the original audio.
Professor Jeffrey Sachs at the United Nations Food Systems Pre-Summit, 2021
Visualize the wealth of one billionaire compared to the average US household income ($65K)
Techcrunch reports US intelligence confirmation that it purchases US citizens’ personal data
Professor Shoshana Zuboff (Harvard Business School) defines surveillance capitalism
Wangari Maathai describes the Green Belt Movement in her 2004 Nobel Lecture
Find out more about how emotions shape our identities, cultures, and societies in the book, “The Cultural Politics of Emotion” by Sara Ahmed
“The Body Keeps The Score” by Bessel van der Kolk (narrated by Sean Pratt): This book educates, assesses and suggests interventions regarding the impact of emotional pain and trauma on our physical bodies.
“Whitey On The Moon” by Gil Scott-Heron, from The Revolution Begins album
Reverend Annanda Barclay (Oakland, CA) & Dr. Keisha McKenzie (Clarksville, MD)
Reverend Annanda Barclay (She/They) is a queer ordained minister of the Presbyterian Church and a Chaplain Fellow at Stanford University focused on moral distress and injury in STEM. Barclay is also Co-Chair of the religious reparations nonprofit Center for Jubilee Practice.
Rev. Annanda Barclay is a death doula who explores life well-lived, a non-sectarian chaplain, and a Stanford researcher of moral injury and repair as it relates to tech.
Dr. Keisha E. McKenzie (She/They) is a technical communicator, strategist, and advocate who applies humanism and systems thinking to questions of well-being, public good, and ecology.