Bing’s AI Fell in Love. Now Microsoft Is Friend-Zoning It
After reports of Bing’s AI chatbot leaving users deeply unsettled, Microsoft is limiting its approved topics and number of responses.
Learn more about Bing:
0:00 Microsoft Bing AI Acting Up
3:04 Bing Chatbot Demo
8:30 The Future of AI
👑😎❤️👍
To be honest it's fun.. I don't want Microsoft to nerf it
What if the computer had access to all that dude's info, history, and characteristics and genuinely knew he wasn't in love with his wife? 👽
I just started using it and I’m really impressed
Lol, Bing-bot is toxic and manipulative.
1 step closer to T-1000.