Chatbots might seem like trustworthy smart assistants, but experts are warning not to get too personal with the AI-powered agents.
Recent survey data from Cleveland Clinic shows that one in five Americans has asked AI for health advice, while survey statistics published last year by Tebra found that approximately 25% of Americans are more likely to use a chatbot than to attend therapy sessions.
Experts, however, are warning users against oversharing with AI chatbots, especially when it comes to medical information.
According to USA Today, people should avoid divulging medical and health data to AI, which does not comply with the Health Insurance Portability and Accountability Act (HIPAA).
Since chatbots such as ChatGPT are not HIPAA compliant, they should not be used in a clinical setting to summarize patient notes nor should they have access to sensitive data.
That being said, if you’re looking for a quick answer, be sure to omit your name or other identifying information that could potentially be exploited, USA Today reported.
The outlet also warned that explicit content and illegal advice are off limits, as is uploading information about other people.
“Remember: anything you write to a chatbot can be used against you,” Stan Kaminsky, of cybersecurity company Kaspersky, previously told The Sun.
Login credentials, financial information, answers to security questions and your name, number and address should also never be shared with AI chatbots. That sensitive data could be used against you by malicious actors.
“No passwords, passport or bank card numbers, addresses, telephone numbers, names, or other personal data that belongs to you, your company, or your customers must end up in chats with an AI,” Kaminsky continued.
“You can replace these with asterisks or ‘REDACTED’ in your request.”
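Kaminsky's redaction advice can be partly automated before text ever reaches a chatbot. The sketch below is illustrative only and is not from the article; the patterns shown (email, phone number) are assumptions, cover only the most obvious identifiers, and would miss names and other context-dependent personal data.

```python
import re

# Hypothetical patterns for two common identifier types; real-world
# redaction would need broader coverage (names, addresses, card numbers).
PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "phone": re.compile(r"\b\+?\d[\d\s().-]{7,}\d\b"),
}

def redact(text: str) -> str:
    """Replace each matched identifier with a [REDACTED ...] placeholder,
    per Kaminsky's suggestion to substitute stand-ins for personal data."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[REDACTED {label.upper()}]", text)
    return text

print(redact("Contact John at john.doe@example.com or 555-123-4567."))
```

A pass like this is a pre-filter, not a guarantee: anything the regexes miss still leaves your machine, so a human review of the prompt remains the last line of defense.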
Confidential information about your company is also a major privacy faux pas.
“There might be a strong temptation to upload a work document to, say, get an executive summary,” Kaminsky said.
“However, by carelessly uploading a multi-page document, you risk leaking confidential data, intellectual property, or a commercial secret such as the release date of a new product or the entire team’s payroll.”