The first season of Raw Data has been thought-provoking, fun, educational, hard-hitting, futuristic—and valuable for us at the Cyber Initiative, both in promoting and disseminating research happening on campus and in generating conversation about the societal changes driving that research. We organize our research agenda around cyber-social systems: the idea that cyber technologies are fundamentally changing (not just disrupting) the ways we interact in the world. As Mike Osborne explains in this episode, technology is changing our relationships with our doctors, with our bosses and co-workers, with our elected officials, with money, with love, and even with our own minds. Grouping those changes into systems helps us ask big questions, like how the ubiquitous availability of information is changing our capabilities as individuals, and how technology that lets people connect instantly and nearly seamlessly is changing the way we form groups and communities. Individual researchers will always find interesting questions to think about. But putting them together in groups to talk through their thoughts and assumptions—and putting them behind a microphone so that Mike and Leslie can draw out their conclusions—is essential to ensuring that their research, and our efforts, have relevance beyond Stanford, beyond Silicon Valley, and beyond those already concerned with tech.
We need to keep talking about how technology affects—and is affected by—society
It’s easy to forget, when you’re immersed in the latest encryption standoff between tech companies and law enforcement, or the newest update to the algorithm that determines whom you’re going to vote for and which news you’re going to see first, that the majority of Americans have not only not participated in, for example, the ride-sharing craze—they haven’t even heard of it. Just because we happen to live and work miles from the largest tech corporations in the world and the newest, hottest apps doesn’t mean those technologies are available to—or in demand by—everyone. What it does mean is that those of us involved in this cyber evolution, or those of us who want to be, have a particular obligation to think through what we’re doing and what effects it will have beyond increasing a share price or letting people send new types of text messages. Extending an insight from the security field—that security needs to be considered at all stages of development, not added onto a nearly finished product—the societal effects of technology need to be a topic of discussion from the ground up. Sure, the new app or the new algorithm lets us do something we used to do faster, or more cheaply, or with better record-keeping and analysis. But we also need to think about where it’s being deployed (and where it’s unavailable), who will use it (and who won’t be able to), and whose jobs it may make slower, more expensive, or even non-existent. That’s where discussions like those on Raw Data come in: by taking a question like “how do we transfer money?”, “how do we address mental health crises?”, or “how do we pick a news story to read, or to report on?”, the podcast moves from technology to social system and back, examining what researchers are trying to do with technology, what the people who operate in a system are trying to accomplish, and how the two will work together (or butt heads).
It’s easy to try to fix technological problems in a binary or algorithmic way: collect the data or don’t, return a “yes” or a “no”, rank one search result first, another second, and another third. But the grey areas, where people disagree and want a “maybe”, an option to collect data for some but not for others, or a ranking that changes dynamically, are the most important ones to work through together.
Are we losing control, or gaining trust?
In this episode, Worldview’s director Brie Linkenhoker asks where we’re seeing people begin to cede control to algorithms, allowing technology to make decisions for us. This raises an interesting tension: while warnings about security, hackers, identity thieves, and power-grid saboteurs make good headlines and pop up in our news feeds more and more, the cultural and technological undertow seems to be headed in the opposite direction. For all the hue and cry about the insecurity of the technology we use, we’re not using it less, and we don’t seem to be taking back the personal data we’ve entrusted to it. Instead, we’re relying on a combination of law, insurance, security researchers and innovators, and government to run interference while we keep moving down the field with more and newer technology. In many ways, this is a familiar pattern: we’ve rarely addressed the risks of exploration by deciding to limit our curiosity.
Continuing the conversation in Season Two
Even as we learn more about the security implications of technology—from data security to job security, financial security, and national security—I’m still optimistic about our ability to thoughtfully develop and implement technology that will help our societal systems operate more effectively, fairly, and creatively. As long as we keep having conversations like those on Raw Data—and we’re going to. Season Two begins later this summer; let us know if you have comments on Season One or suggestions for topics to explore in the future.