Note
Resistance to data colonialism requires imagination and action at multiple levels: working within the system, working against the system through protests, and working beyond the system to imagine alternatives. The social quantification sector, including big tech companies, data brokers, and individual entrepreneurs, plays a key role in data colonialism by quantifying and exploiting social data. Historical warning signs about the impacts of AI and technology, such as the environmental costs and biases, have been evident through the work of individuals like Timnit Gebru and the development of technologies like Clearview AI. The narrative of progress, convenience, and innovation used to justify data colonialism mirrors the civilizing mission of historical colonialism, emphasizing the need to resist and imagine a different future. Decolonizing data will require changing curriculums and restructuring education so that scientists, engineers, policymakers, and artists learn to think critically about data issues.
Highlights
- 2024-12-22 23:00 Joining me to discuss Data Grab are the authors: Ulises Mejias, professor of Communication Studies at the State University of New York, Oswego, and Nick Couldry, professor of Media, Communications and Social Theory at the London School of Economics and Political Science, and a Faculty Associate at Harvard University's Berkman Klein Center for Internet and Society.
- 2024-12-22 23:00 Your car is gathering data about you every time you speed up, every time you slow down, maybe even what you're listening to on the radio or via your iPod or whatever it is. That's all being fed in: data, fuel to the machine. And it's not just the car company who keeps that. They probably sell it to a data broker, who for sure wants to make money by selling it on to an insurer, for example. People have suddenly found they've lost their car insurance, so they're committing a criminal offense just because some insurer's algorithm doesn't like the way they speed up or slow down, and no one told them.
- 2024-12-22 23:00 Well, the way we approach it in the book is through a new idea we hit on as we were trying to get across our ideas, developed in a previous book for Stanford Press about five years ago, to a much wider audience. And we hit on the idea of the data territory, because obviously data is not land. It's not like land. It's not anything like land at all. It's made: it happens when people write code to capture data and put it in a database. Highly technical. We know all that. On the face of it, it's nothing like land. And yet it's possible, through writing code and building software, to create a space where you have absolute control over what goes on in that space. We call these spaces platforms. We think we know they're useful, friendly things that we use to do the nice stuff with the people we want to do nice stuff with, send nice pictures and memes to family. But they are only possible because we're doing it on a data territory that gives just as much control, if not more, to the owner of that data territory as physical land does.
- 2024-12-22 22:59 But I'm sure Nick can mention others as well. Well, I mean, those are really valid points about ChatGPT, because what's really going on is that we think about AI as a technology. We're told this is the great tech, this is the future we have to hold on to, because tech is always good. But it's also a redefinition of what knowledge is and what expertise is. And that's a much more subtle thing. That's what teachers care about, that's what parents care about, but it's being redefined without asking them what they want.
- 2024-12-22 22:59 Well, I think one of the key things was amazing work, well known if you're working in the data industry but not so widely known beyond it, which was the work around the hidden costs of generative AI technology. There is a researcher who was born in Ethiopia and did her training in data science in the US, Timnit Gebru, a brilliant data scientist who became so brilliant that she became Google's chief ethicist. She became the lead ethics advisor on programs that at the time none of the rest of us knew about, but which became Google's version of generative AI. And around about 2018, 2019, she had the temerity, with two other academics, to publish a paper called Stochastic Parrots, with a lovely little parrot sign in the title.
- 2024-12-22 14:28 Land was cheap, territory was cheap. It was just there for the taking. Colonizers would arrive in, you know, remote parts of the world, like Australia, like Latin America, Africa, and say, look at all of this land.