Generative AI and Protecting Private Information

As generative artificial intelligence (AI) tools like ChatGPT and Bard become more advanced and ubiquitous, [U]Tech reminds users to protect private information—their own, and the university’s.

“Entering data into a generative AI tool or service,” said [U]Tech Vice President Miro Humer, “is like posting that data on a public website.”

Generative AI tools collect and store user input as part of their learning process. Any information entered into an AI tool may become part of its training data, which can then be exposed to other users.

Among the types of data that should not be used in AI tools are:

  • Personal information: Home and email addresses, phone numbers, dates of birth, employee or student information, disciplinary or legal documents, patient records, other protected health information and personally identifiable information.
  • Research materials: Grant proposals, unpublished data, research drafts, documents related to intellectual property and information subject to export control technology control plans.
  • Information subject to local, state and/or federal protections: Academic records, data requiring parental consent to provide or accept, data cited in executed nondisclosure agreements, and identification numbers (e.g., driver’s license or passport numbers).

If you have concerns or questions about specific kinds of information you have or would like to share, please contact [U]Tech at help@case.edu.