Microsoft files a patent for chatbots that can imitate a person of your choice


Artificial intelligence has made tremendous progress in recent years, with applications in nearly every field, from advanced autonomous cars to diagnostic algorithms that detect cancer with unprecedented reliability to the resolution of complex mathematical equations. Recently, the Protocol site revealed a somewhat puzzling Microsoft patent: the company appears to be planning to design chatbots capable of digitally mimicking a targeted person.

The patent, filed last week, is titled "Creating a Conversational Chatbot of a Specific Person". "Social data (e.g. images, voice data, social media posts, email messages, written letters, etc.) about the specific person may be accessible. This social data can then be used to create or modify a special 'index' of the targeted person's personality," the patent's abstract reads.
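To make the idea more concrete, here is a minimal sketch of how "social data" from several sources might be gathered into such a personality index. It is purely illustrative: the class names, fields, and the toy vocabulary step are our assumptions, not the method described in Microsoft's patent.

```python
from dataclasses import dataclass, field

# Hypothetical sketch: aggregating a person's "social data" into a simple
# personality index, in the spirit of the patent abstract. All names and
# fields here are illustrative assumptions, not Microsoft's design.

@dataclass
class SocialDataItem:
    source: str   # e.g. "social_media_post", "email", "voice_transcript"
    content: str  # raw text or a transcript of the item

@dataclass
class PersonalityIndex:
    person: str
    items: list[SocialDataItem] = field(default_factory=list)

    def add(self, item: SocialDataItem) -> None:
        self.items.append(item)

    def vocabulary(self) -> set[str]:
        # Toy "index": just the set of words the person tends to use.
        return {w.lower() for item in self.items for w in item.content.split()}

index = PersonalityIndex(person="target_person")
index.add(SocialDataItem("social_media_post", "Can't wait for the weekend hike!"))
index.add(SocialDataItem("email", "Please find the report attached."))
print(sorted(index.vocabulary())[:5])
```

In practice, such an index would presumably feed a machine learning model rather than a word list, but the principle of merging heterogeneous sources into one profile is the same.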

Why file such a patent? What exactly is Microsoft planning? The question won't find a precise answer anytime soon, but it is possible to imagine what the company is up to, and what other developers might do with such technology. Judging from these lines, Microsoft may be hoping, as a first step, to use it to improve its customer-service chatbots, and perhaps to provide future AI support for Cortana and other intelligent personal assistants. But the patent document reveals intriguing details...

Image taken from Microsoft's patent, showing a device that appears to be intended to communicate with the various data sources in order to feed the machine learning algorithms used to model the resulting virtual assistant. © Microsoft / United States Patent and Trademark Office

To return to the phrase quoted above: "[…] social media posts, email messages, written letters, etc.) […]"… why would anyone want to rely on letters? Who nowadays still uses this format for correspondence, other than for administrative purposes? As mentioned before, perhaps the goal is simply to improve Microsoft's customer service, but if not? Why this device shaped like a giant Tamagotchi if the aim is merely to improve a virtual assistant?

Will it be used to store "virtual personalities"? According to the details of the patent, it would only be an interface allowing connection to several devices in order to feed the machine learning algorithm, which would then "model" the personalized virtual assistant. But how personalized? "In some aspects, conversing in a specific person's personality may include determining and/or using the conversational attributes of the specific person, such as style, diction, tone, voice, intent, dialogue length and complexity, topic and consistency. Conversing in the personality of the specific person may further include determining and/or using behavioral attributes (user interests, opinions, etc.) and demographic information," the patent summary continues, to name just one of the many details mentioned. Conclusion: with such technology, Microsoft could well go beyond simply improving current chatbots... It remains to be seen by how much, and why.
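To visualize what modeling these attributes might involve, here is a small, purely hypothetical sketch that groups the conversational and behavioral attributes listed in the patent into a profile and turns it into a conditioning prompt for a generic language model. None of the names or the prompt-building step come from Microsoft's actual implementation; this is only one plausible way such attributes could be used.

```python
from dataclasses import dataclass, field

# Hypothetical illustration of the attributes the patent enumerates.
# Field names and the prompt-building step are assumptions for clarity,
# not the patent's actual method.

@dataclass
class ConversationalAttributes:
    style: str = "casual"
    diction: str = "simple"
    tone: str = "warm"
    dialogue_length: str = "short"
    consistency: str = "high"

@dataclass
class PersonalityProfile:
    name: str
    conversational: ConversationalAttributes = field(default_factory=ConversationalAttributes)
    interests: list[str] = field(default_factory=list)   # behavioral attributes
    opinions: dict[str, str] = field(default_factory=dict)

def build_system_prompt(profile: PersonalityProfile) -> str:
    """Turn the profile into a conditioning prompt for a generic chatbot model."""
    c = profile.conversational
    return (
        f"Reply as {profile.name}: {c.tone} tone, {c.style} style, "
        f"{c.diction} diction, {c.dialogue_length} answers. "
        f"Known interests: {', '.join(profile.interests)}."
    )

profile = PersonalityProfile(name="target_person", interests=["hiking", "photography"])
print(build_system_prompt(profile))
```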

But what if such a device fell into the hands of less ethical or malicious developers? The implications of this type of hyperspecific, digitized personality mimicry in this case could be as varied as they are disturbing.

While Microsoft is probably (hopefully) savvy enough not to use this technology for unethical purposes, such as digitally reviving deceased people or allowing the "digital personality" of any targeted person to be copied, we cannot say the same of other companies and programmers who might use it. The idea of a conversational AI built on a concentrate of tweets, Facebook messages and SMS, possibly coupled with voice imitation (already possible, by the way), could open a whole new era of cyber-fraud and identity theft.

There is also the thorny subject of a person's right to privacy after death. What about the ownership of rights to digital data? Is the digital assistant you modeled in 2024 and bought on a whim really yours? So many questions whose answers remain unclear, given how young these technologies are. It will probably be necessary, and wise, to establish precise frameworks for their use.
