Algorithms, Privacy, and the Future of Tech Regulation in California
What is the best approach to regulating a potentially harmful cutting-edge technology like AI while still encouraging innovation?
"We need to think about regulation at the right time in just the right amount," says Jeremy Weinstein, Stanford professor of political science and co-author of a recent book. "We need to understand what regulatory models will get us to that Goldilocks-type outcome and engage more stakeholders in the process."
Weinstein, a senior fellow at the Stanford Institute for Economic Policy Research (SIEPR), shared his thoughts during a virtual conversation on "Algorithms, Privacy, and the Future of Tech Regulation in California." Joining Weinstein at the Jan. 18 event, which was co-hosted by SIEPR, were Urban, board chair of the California Privacy Protection Agency and a clinical professor at UC Berkeley Law, and Fu, a California 100 commissioner and venture partner. The panel discussion, moderated by California 100's executive director, covered technology regulation in California and beyond, examining harmful beliefs about regulation and low consumer trust in technology.
Setting the Stage for Broader Regulation
Algorithms have proliferated as decision-making engines in domains from smart cities to bail setting. But the quality of the data matters, Fu says. Problematic data could lead to racial, gender, or other biases and serious social harms.
Often, algorithms are optimized for a single end, such as engagement in the case of social media platforms. But focusing on only one goal can lead to harmful side effects, such as the spread of misinformation.
The California Privacy Protection Agency (CPPA), created in 2020 as the first dedicated privacy agency in the U.S., is working on rules to regulate algorithms and other technologies through data. "We're attending to how consumers understand and make decisions about algorithm-based processes," says CPPA chair Urban. The in-the-works rules would govern consumer rights related to opting out of automated decision making and securing information about the logic behind such decisions, among other areas.
Kill the Regulate-Versus-Innovate Construct
"Regulation always underlies markets," Weinstein says. It's why we don't get sick drinking milk, fall ill from a headache medicine, or live in unsafe housing.
However, "we have to do away with binary notions like regulation versus innovation," Weinstein adds. "It's a false narrative that the effective functioning of an innovative economy depends on there being zero regulation."
Urban says that well-informed regulation can benefit businesses and consumers alike: "Regulation aims to provide guardrails, allowing a robust market to develop and businesses to flourish while reflecting the needs of consumers." Regulators need to understand an industry's business models and whether their actions would break something in it.
The speakers agreed that companies must do a better job of balancing their own interests with those of the broader public. That is, as regulators work to catch up with technology, businesses should work to cultivate clearer professional ethics around responsible AI and other areas.
Creating Trust with Control
Moving forward, companies must give people more discretion over how their personal data is collected and used.
"There's a lack of trust with regard to companies and the government handling people's personal data," Urban says. "People don't feel they have a real choice."
The CPPA is trying to create more control for citizens, but that requires giving people access to the information companies hold about them so they can make that choice, Urban says.
California can be a test lab for how to build a future that balances the interests of corporations and citizens, Weinstein adds, but that won't come from the state's ballot system, which is too often influenced by a small number of wealthy players. Instead, it should come from companies and government engaging diverse stakeholders in key decisions, and from more education for people making choices about their data. "Even if people don't know the technology, they can voice their values and concerns," Urban says.
And technologists need to own the problems arising from these tools rather than simply hide from the threat of regulation, Weinstein says. He points to Snapchat's recent move toward greater content moderation.
In the end, "our technological future is the responsibility not of CEOs or engineers, but of our democracy," Weinstein concludes. "People have been passive about technology's impact on society. It's time to exercise our democratic muscles more fully."
A version of this was originally published Jan. 31 by the Stanford Institute for Human-Centered Artificial Intelligence.