

MPS Project measures trust in generative AI

January 5, 2024

By Louis DiPietro

Tools we don’t trust are tools we don’t use. For generative artificial intelligence (GenAI) tools, which often require users to supply their own data, measuring trust is a critical part of any evaluation.

This semester, a student team from the Master of Professional Studies (MPS) program in the Department of Information Science led a research project alongside Google to develop metrics to measure user-perceived trustworthiness in burgeoning GenAI tools like ChatGPT and Google Bard. The team then homed in on the most important factors contributing to trustworthiness. 

The project was one of 10 in this semester’s MPS Project Practicum (INFO 5900), the program’s linchpin, project-based course where students build implementable solutions to real-world problems for clients – from Fortune 500 companies and startups to nonprofits and government agencies.   

Through a survey of roughly 110 student developers and full-time Google employees, the student team found that respondents mostly use GenAI tools for idea generation, and that privacy and accuracy are the most important factors when evaluating GenAI trustworthiness. Users want assurance that the information they plug into these tools is kept confidential, and they want the answers the tools return to be correct, according to the team’s findings.

“The class provided us with practical experience in developing and implementing solutions for real-world problems,” said Zhuoer Lyu, an MPS student and member of the research team. “We felt fortunate to collaborate with industry partners like Google to enhance our understanding of GenAI applications.” 

Among its recommendations to Google, the student team said users of GenAI tools should be given control over their data through clear options for opting in or out of data collection, personalized experiences, and information sharing. On accuracy, the team recommended allowing users to verify the AI’s answers; in turn, that feedback would help hone the AI models.

The student team consisted of Lyu, Jingruo Chen, Elisabeth Kam, Tung-Yen Wang, Xiaohan Wang, and Yahui Zhang.  

Elsewhere, a separate team of MPS students carried out a combined UX design and UX research project for Google Cloud, examining how AI could better integrate tools for building and maintaining customer relationships. MPS students Jinmo Huang, Haochen Hu, and Miles Ma worked on the UX design side, while Bandar Qadan, Pika Cai, and Jai Chandnani handled the UX research.

"This semester's projects were complex and provided a healthy combination of meaningful practical experience coupled with intellectual expansion,” said Sharlane Cleare, lecturer of information science in the Cornell Ann S. Bowers College of Computing and Information Science, and the course’s instructor. “Students eagerly embraced, navigated and addressed a myriad of comprehensive end-to-end technical solutions." 

Louis DiPietro is a writer for the Cornell Ann S. Bowers College of Computing and Information Science.