Interview with Payson Stevens
Payson R. Stevens was President and Creative Director of InterNetwork, Inc. and InterNetwork Media, Inc., science/consulting groups with clients in government, industry, and academia. He has received the US Presidential Award for Design Excellence. Originally trained in molecular biology at the City University of New York and in biological oceanography at the Scripps Institution of Oceanography/UC San Diego, Stevens also studied at the Art Students League and the School of Visual Arts in New York City. He has been involved with traditional and new media as an artist, designer, writer, and filmmaker for over 50 years. Since 2000 he has supported philanthropic activities, benefiting the Great Himalayan National Park and humanitarian work in India. He is now exploring the role of AI in society, including authoring a book entitled "Before AI Decides: Nine Ways to Stay Human".
Dear Payson,
Your recent book addresses a general audience but many of the points you make are applicable to the problems discussed by scientists and scholars on this website. Can you summarize your thoughts about the importance of transparency and of collective human thinking to deal with the challenges posed by AI?
Payson:
My focus is on preserving human judgment inside AI systems. There are two main concerns: AI systems evolving beyond human control, and the fact that control of the most advanced AI systems is currently concentrated in a few hands.
At the same time, the scientific community might reflect on how we can use this technology to benefit society. It is very important for scientists of every generation to participate in this debate, including students. Transparency is essential; there is a section of my book called "Show the work." This must apply to AI, and it is also relevant to our human efforts.
Another guideline in the book is "Choose the future." This means refusing to let important decisions happen early and out of view. What we choose to protect decides who we become. This is essential for democracy, and it means insisting on transparency, oversight, repair, and collective authority before AI hardens into infrastructure. The book exists to slow that process while there is still time — not by stopping technology, but by helping people stay present within it.
Without transparency we cannot exercise our collective judgment. Our collective voice can shape transparency, justice, and accountability. This will allow us to keep decisions human and shared and to meet the challenges posed by AI.
How can scientists, scholars, and philanthropists help this process?
Payson:
I would add artists. It is always the artists and the poets who ask the deep questions about society and civilization. All these groups can play a role. We can exchange ideas and define the right questions. One of the things that I learned from Roger Revelle is that it's the questions that frame where you're going. And it's the process that develops from them that brings the purpose and the excitement.