Championing human rights in a world of artificial intelligence
April 16, 2024
About
Name: Wendy H. Wong
Role: Professor; Principal’s Research Chair in Datafication (Tier 1)
Program: Political Science
Faculty: Irving K. Barber Faculty of Arts and Social Sciences
Campus: Okanagan (Kelowna, BC)
Education: PhD, University of California, San Diego; Master of Arts, University of California, San Diego; Bachelor of Arts, University of California, Berkeley
Hometown: Los Angeles, California
Every few weeks, it seems like a new or improved artificial intelligence (AI) technology reinvents what’s possible. Today, AI can write reports, create “deep fakes” that look like photographs or videos of real people, and diagnose diseases.
For Dr. Wendy H. Wong, this disruptive change is an opportunity to think carefully about what comes next, and then to actively shape it.
A Professor of Political Science and Principal’s Research Chair in Datafication (Tier 1), Dr. Wong was named UBC Okanagan’s 2024 Researcher of the Year for Social Sciences and Humanities for her work exploring AI’s effects on human rights. She is especially interested in the digital data sourced from humans that make these AI technologies possible: data that track everything from people’s locations to their reading habits on websites.
“Data changes what we know about each other,” says Dr. Wong. “It changes what we know about ourselves. It also changes who can use those data to nudge us towards different types of outcomes.”
While people often talk about, and can be protective of, “their” data, Dr. Wong points out that these data—though they may contain information people consider private—technically also belong to whoever collected them, often a company. She proposes thinking about data as co-created—by the people who are the sources of the data and by whoever decides to track and collect information about their various behaviours, from their daily movements to how they scroll through Instagram.
In her recent book, We, the Data, Dr. Wong develops this idea further, arguing that people should be recognized as “data stakeholders” and empowered in decisions about data practices. “I think about autonomy. Do we have choice in how data are collected?”
One issue is how bias in AI and data can intensify inequities that already exist. “Like any other human creation, algorithms or data are going to be biased because people are biased.”
This bias can cause real-world problems, as when AI has been used to predict where police should patrol. As Dr. Wong points out, the data an algorithm uses to make this prediction are based on where police have already patrolled in the past. If police already disproportionately patrol poorer neighbourhoods or neighbourhoods with more BIPOC residents, then the AI, trained on those biased data, will predict more crime in those areas.
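To make that feedback loop concrete, here is a minimal, hypothetical sketch, not drawn from Dr. Wong’s research or any real policing system: two neighbourhoods have the same underlying crime rate, but the one patrolled more often accumulates more recorded incidents, so a naive predictor trained on those records sends even more patrols there.

```python
# Hypothetical illustration only: two neighbourhoods with the SAME true crime
# rate, but neighbourhood A is patrolled three times as often, so more crime
# is recorded there, and a naive predictor trained on those records
# reinforces the original patrol imbalance.

import random

random.seed(0)

TRUE_CRIME_RATE = 0.10            # identical underlying rate in both areas
PATROLS = {"A": 300, "B": 100}    # assumed historical patrol counts

# Step 1: historical records reflect where police looked, not where crime is.
recorded = {
    hood: sum(random.random() < TRUE_CRIME_RATE for _ in range(n))
    for hood, n in PATROLS.items()
}

# Step 2: allocate future patrols in proportion to recorded incidents.
total = sum(recorded.values())
future_share = {hood: count / total for hood, count in recorded.items()}

print("Recorded incidents:", recorded)
print("Share of future patrols:", future_share)
# Expected outcome: A receives roughly three quarters of future patrols,
# even though its true crime rate is no higher than B's.
```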
With Dr. Pourang Irani, Professor of Computer Science, Dr. Wong co-leads the Digital Transparency Research Excellence Cluster at UBC Okanagan. A main goal of this interdisciplinary research team is to use both technical and non-technical perspectives to help the general public understand issues of digital safety and data literacy.
“Having digital data is actually changing the way we live,” says Dr. Wong. “We should all be data and AI literate. We need to understand the basic assumptions built in behind these technologies, and then how they can potentially affect our lives.”
While AI is rapidly changing the world we live in, Dr. Wong puts this new technology in the context of other major technological changes humans have adapted to over time, like cars. Just as humans developed traffic laws and separated sidewalks from roads to keep both pedestrians and drivers safe, she advocates for governments and the creators of these technologies to be aware of, and responsive to, the potential harms of AI and digital data.
“I’m hoping the more I talk about data and human rights, the more people will start thinking about technology differently. These AI technologies have been created by people who haven’t been forced to think about the social and political ramifications of their inventions. Now we have to do that.”