Eight Things You Should Never Share With an AI Chatbot


It probably goes without saying at this point, but your conversations with AI chatbots aren't private—everything you type or upload to Gemini, ChatGPT, and other models might be read and used in a variety of ways. If you wouldn't send a document or repeat information to someone you don't know, you shouldn't include it in a chatbot prompt either.

Researchers at Stanford reviewed the privacy policies of the six U.S. companies that developed the most popular AI chatbots, including Claude, Gemini, and ChatGPT, and found that all of them use chat data by default for training purposes. Some retain that data indefinitely, and most merge it with other information collected about consumers, such as search queries and purchases. In most cases, you can opt out of having your data used to train LLMs, but chats can also be read by human reviewers, and long-term retention policies increase the risk of your stored information being exposed in a breach.

If you're going to use an AI chatbot, these are the things you should avoid sharing:

  • Login credentials: Obviously, you should never include usernames or passwords in a chatbot prompt, or upload documents that contain login credentials. AI is also abysmal at generating secure passwords—use the tools in your password manager instead, or better yet, opt for a passkey if available.

  • Financial data: AI chatbots aren't financial experts, and you shouldn't upload documents or use data related to your specific finances in prompts. This includes bank statements, credit card numbers, investment information, account numbers and balances, etc. Sharing financial details anywhere that isn't secure increases the risk of theft, fraud, and targeting by scammers.

  • Medical records: AI chatbots also aren't medical professionals and shouldn't be relied upon for medical advice. You probably don't want your medical records to be used to train LLMs—plus, uploading them exposes them to potential data breaches.

  • Personally identifiable information (PII): AI prompts should never include information like your name, address, email, phone number, birth date, Social Security number, passport number, or any other data that could be used to steal your identity. (Financial information and medical records are also considered sensitive PII.)

  • General health information: In addition to keeping your sensitive medical records private, you should avoid giving chatbots seemingly benign information about your health that could be used to profile you. For example, the Stanford report notes that it's possible for AI chatbots to infer health status from a request for heart-friendly dinner recipes, which could eventually be accessible to insurance companies. This also includes information related to topics like sexual health, medication use, and gender-affirming care.

  • Mental health concerns: Another thing your chatbot isn't is a therapist. AI has been unhelpful at best and harmful at worst when it comes to mental health. Even with updates intended to protect users in crisis, chatbots aren't a replacement for real, human support.

  • Photos: AI image editing is popular, but that doesn't mean it's without risk. You may not want your personal photos used for training purposes, and image metadata contains information like your GPS location. At the very least, avoid uploading images of people (especially minors), and consider stripping EXIF data before sharing.

  • Company documents: AI may be useful for summarizing documents, creating presentations, drafting emails, and completing other work-related tasks more quickly, but you should use caution when uploading files containing sensitive company information to a chatbot. Your employer may even have a policy prohibiting it.
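On the password point above: if you don't have a password manager handy, a few lines of Python using the standard library's `secrets` module (which is designed for cryptographic use, unlike `random`) will do the job locally, without sending anything to a chatbot. This is a minimal sketch; the length and character set are arbitrary choices, not a standard:

```python
import secrets
import string

def generate_password(length: int = 20) -> str:
    """Generate a random password locally using the cryptographically
    secure `secrets` module -- nothing leaves your machine."""
    # Letters, digits, and punctuation: ~94 characters per position.
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(generate_password())
```

At 20 characters drawn from roughly 94 symbols, the result has far more entropy than any memorable phrase a chatbot might suggest.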
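To illustrate the photo-metadata point: in a JPEG file, EXIF data (including GPS coordinates) lives in an APP1 marker segment near the start of the file. The sketch below, in plain Python with no dependencies, walks the JPEG segment headers and drops APP1 while copying everything else through. It assumes a well-formed JPEG and is only a demonstration of where the metadata sits; for real use, dedicated tools such as exiftool are more robust:

```python
def strip_jpeg_exif(data: bytes) -> bytes:
    """Return JPEG bytes with APP1 (EXIF/GPS) segments removed.

    JPEG files are a sequence of marker segments (0xFF + marker byte,
    usually followed by a 2-byte big-endian length). EXIF metadata is
    stored in the APP1 segment (marker 0xFFE1).
    """
    if data[:2] != b"\xff\xd8":  # SOI marker: must start every JPEG
        raise ValueError("not a JPEG file")
    out = bytearray(b"\xff\xd8")
    i = 2
    while i + 2 <= len(data):
        if data[i] != 0xFF:
            # Unexpected byte: copy the remainder verbatim and stop.
            out += data[i:]
            break
        marker = data[i + 1]
        if marker in (0xD9, 0xDA):
            # EOI (end of image) or SOS (entropy-coded image data
            # follows): copy everything from here on unchanged.
            out += data[i:]
            break
        if 0xD0 <= marker <= 0xD7 or marker == 0x01:
            # Standalone markers with no length/payload.
            out += data[i:i + 2]
            i += 2
            continue
        length = int.from_bytes(data[i + 2:i + 4], "big")
        if marker != 0xE1:  # keep every segment except APP1 (EXIF)
            out += data[i:i + 2 + length]
        i += 2 + length
    return bytes(out)
```

The same idea is why "re-saving" an image through an editor often drops metadata: the pixel data is copied, but the APP1 segment isn't.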

The bottom line is that you should be cautious about what you share with AI chatbots—assume everything in your prompts is stored and could be read by someone else. Avoid anything personal or identifiable, and enable all available privacy settings (such as data-sharing and training opt-outs).
