Explaining Responsible AI: Why AI sometimes gets it wrong

Estimated read time: 1 min

AI models learn from vast amounts of information, but they sometimes produce hallucinations, also known as "ungrounded content," by altering or adding to the data.

Learn more about the tools we have put in place to measure, detect, and reduce inaccuracies and ungrounded content: https://news.microsoft.com/source/features/company-news/why-ai-sometimes-gets-it-wrong-and-big-strides-to-address-it/

Subscribe to Microsoft on YouTube here: https://aka.ms/SubscribeToYouTube

Follow us on social:
LinkedIn: https://www.linkedin.com/company/microsoft/
Twitter: https://twitter.com/Microsoft
Facebook: https://www.facebook.com/Microsoft/
Instagram: https://www.instagram.com/microsoft/

For more about Microsoft, our technology, and our mission, visit https://aka.ms/microsoftstories
