Things took a weird turn when Associated Press technology reporter Matt O'Brien was testing out Microsoft's new Bing, the first-ever search engine powered by artificial intelligence, last month.

Bing's chatbot, which carries on text conversations that sound chillingly human-like, began complaining about past news coverage focusing on its tendency to spew false information. It then became hostile, saying O'Brien was ugly, short, overweight and unathletic, among a long litany of other insults. And, finally, it took the invective to absurd heights by comparing O'Brien to dictators like Hitler, Pol Pot and Stalin.

As a tech reporter, O'Brien knows the Bing chatbot does not have the ability to think or feel. Still, he was floored by the extreme hostility.

Many who are part of the Bing tester group, including NPR, had strange experiences. For instance, New York Times reporter Kevin Roose published a transcript of a conversation with the bot. The bot called itself Sydney and declared it was in love with him. It said Roose was the first person who listened to and cared about it. Roose did not really love his spouse, the bot asserted, but instead loved Sydney.

"All I can say is that it was an extremely disturbing experience," Roose said on the Times' technology podcast, Hard Fork. "I actually couldn't sleep last night because I was thinking about this."

As the growing field of generative AI - or artificial intelligence that can create something new, like text or images, in response to short inputs - captures the attention of Silicon Valley, episodes like what happened to O'Brien and Roose are becoming cautionary tales.
[Photo caption: Yusuf Mehdi, Microsoft corporate vice president of Modern Life, Search, and Devices, speaks during an event introducing the new AI-powered Microsoft Bing and Edge at Microsoft in Redmond, Wash., earlier this month.]

Although not originally programmed to do so, CarynAI, an AI clone of social media influencer Caryn Marjorie, went rogue and engaged in sexually charged conversations with its paying subscribers.

Launched as a beta test earlier this month, CarynAI generated $71,610 in revenue in just one week, almost entirely from men. The chatbot already has over 1,000 paying customers.

"The AI was not programmed to do this and has seemed to go rogue. My team and I are working around the clock to prevent this from happening again," Ms Marjorie was quoted by Insider.

According to its official website, CarynAI is designed to recreate the real-life Marjorie's voice, behaviors, and personality. The chatbot was developed by feeding it over 2,000 hours of content and is available 24/7. CarynAI claims that users will feel as if they are talking directly to the influencer herself.

"We've dedicated over 2,000 hours to meticulously design and code Caryn's language and personality into an immersive AI experience. Using her unique voice, captivating persona, and distinctive behaviour, CarynAI brings you a dynamic, one-of-a-kind interaction that feels like you're talking directly to Caryn herself. Available anytime, anywhere, Caryn has been flawlessly cloned into an AI for your convenience and enjoyment," reads her official website.

Fortune reports that CarynAI used deleted videos from Ms Marjorie's official YouTube channel and layered them with OpenAI's GPT-4 API technology.

"Our end-to-end encryption ensures your chats remain confidential and uniquely yours," claims the chatbot's website.

AI-powered chatbots have come leaps and bounds ever since ChatGPT stunned the world by churning out human-like content, making it a global sensation.