Uncommon Collaborators: Data science for clean water
A Stanford geophysicist and lawyer team up to use big data for water quality monitoring and governance.
Water is the great homogenizer, mixing agricultural runoff, livestock waste, and industrial effluent as it travels downstream. That very mixing makes untangling the web of responsibility for water pollution from its various sources an intricate puzzle. In this episode of the Uncommon Collaborators series, two Stanford researchers discuss how they forged a unique partnership between geophysics and law to address the challenges of freshwater monitoring in the U.S.
When the U.S. Environmental Protection Agency (EPA) launched an initiative to reduce rates of noncompliance with the Clean Water Act, the agency scouted experts who could help it optimize its limited resources for the cause. Among the partners it found: regulatory policy expert Daniel Ho, the William Benjamin Scott and Luna M. Scott Professor at Stanford Law School and faculty director of the Regulation, Evaluation, and Governance Lab (RegLab).
The RegLab team used machine learning to highlight how the basic design of EPA objectives could have unintended consequences. “Really quickly we realized that we were sort of in over our heads on some of the pollutant transport side of things,” Ho said. So, looking to extend the work to more proactive water quality monitoring and enforcement, Ho sought a Stanford collaborator who could help him wrangle the complex fluid dynamics in freshwater systems.
At a children’s birthday party, Ho posed his problem to Jenny Suckale, an assistant professor of geophysics at the Stanford Doerr School of Sustainability. The two found a common thread of using data-driven approaches in their research.
With funding from the Realizing Environmental Innovation Program through the Stanford Woods Institute for the Environment, the team mined the trove of daily data already being collected at EPA monitoring sites along rivers and lakes. Bringing together their expertise in data science, artificial intelligence, and regulatory policy, the researchers developed a model that the EPA and its state and federal partners could use to better understand baseline levels of pollution and identify likely suspects for pollutant spikes along rivers.
The project could help the EPA allocate its limited resources more efficiently when visiting sites and enforcing compliance with water quality standards. Beyond rapid response to pollution offenders, models like the one Suckale and Ho developed could inform how the EPA writes regulatory water policy, helping the agency anticipate where pollution may occur, who may be responsible, and how much various industries may contribute to freshwater pollution.
For Suckale, the collaboration demonstrates the creative potential of approaching environmental governance questions with complementary disciplines. “It’s not like it just generates a new approach,” said Suckale. “It generates a question that you never would have thought about asking in the absence of these kinds of discussions.”
Ho is the William Benjamin Scott and Luna M. Scott Professor of Law, a professor of political science, a professor (by courtesy) of computer science, a faculty affiliate at the Stanford Woods Institute for the Environment, a senior fellow at the Stanford Institute for Economic Policy Research, and an associate director of the Institute for Human-Centered Artificial Intelligence (HAI). Suckale is also a center fellow (by courtesy) at the Woods Institute, a faculty affiliate at the RegLab, and a faculty affiliate at HAI.