Ladies and gentlemen, have I told you that there is no news in the news? And how many times have I told you that what you see on the TV news you should always be suspicious of? You should always suspect that there’s something else. You should always suspect that it may be entirely wrong. You should always suspect, with major network news, that there is a political agenda attached to everything they do.
Among the great challenges posed to democracy today is the use of technology, data, and automated systems in ways that threaten the rights of the American public. Too often, these tools are used to limit our opportunities and prevent our access to critical resources or services. These problems are well documented. In America and around the world, systems supposed to help with patient care have proven unsafe, ineffective, or biased. Algorithms used in hiring and credit decisions have been found to reflect and reproduce existing unwanted inequities or embed new harmful bias and discrimination. Unchecked social media data collection has been used to threaten people’s opportunities, undermine their privacy, or pervasively track their activity—often without their knowledge or consent.
You should be protected from unsafe or ineffective systems. Automated systems should be developed with consultation from diverse communities, stakeholders, and domain experts to identify concerns, risks, and potential impacts of the system.
Led by the University of Maryland, TRAILS aims to transform the practice of AI from one driven primarily by technological innovation to one guided by ethics, human rights, and support for bringing the voices of marginalized communities into mainstream AI. TRAILS will be the first institute of its kind to integrate participatory design, technology, and governance of AI systems, and it will focus on investigating what trust in AI looks like, whether current technical solutions for AI can be trusted, and which policy models can effectively sustain AI trustworthiness. TRAILS is funded by a partnership between NSF and NIST.