- Taking its lead from Siri and Alexa, Salesforce on Wednesday announced new voice tools for its artificial intelligence system, Einstein.
- Einstein Voice includes an assistant tool, which can interpret voice memos and enter data from what it hears, as well as surface critical data from Salesforce using only voice commands.
- Salesforce also announced Einstein Voice Bots, which let companies create their own voice bots for answering customer service questions via a smart speaker.
Consumer voice assistants like Alexa and Siri have completely rewired the way people interact with technology, and soon the same will be possible for workers using Salesforce.
On Wednesday, Salesforce announced Einstein Voice, an extension of the platform’s artificial intelligence capabilities that adds the ability to interpret spoken language.
Soon users will be able to update their customer relationship management (CRM) software databases by dictating memos to Salesforce Einstein. Einstein Voice Assistant will then interpret the voice memo, transcribe it into text, and log that information in Salesforce.
The idea is to save Salesforce users time on data entry.
“It’s one of the most dreaded parts of using a CRM,” said Richard Socher, chief scientist at Salesforce, about the time spent inputting data. “But if you have natural language understanding, on top of the transcribed speech, then you can automate that process too.”
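The automation Socher describes, running natural language understanding on top of a transcript to fill in CRM fields, can be sketched in miniature. This is a hypothetical illustration, not Salesforce's implementation: a toy parser that pulls a contact name, a deal amount, and a next step out of a dictated memo with simple pattern matching.

```python
import re

def memo_to_crm_update(transcript: str) -> dict:
    """Toy sketch: extract CRM fields from a free-form dictated memo."""
    update = {}
    # e.g. "met with Dana Lee" -> contact name
    m = re.search(r"met with ([A-Z][a-z]+ [A-Z][a-z]+)", transcript, flags=re.I)
    if m:
        update["Contact"] = m.group(1)
    # e.g. "deal worth $50,000" -> numeric amount
    m = re.search(r"\$([\d,]+)", transcript)
    if m:
        update["Amount"] = int(m.group(1).replace(",", ""))
    # e.g. "follow up next Tuesday" -> a follow-up task note
    m = re.search(r"follow up ([\w ]+)", transcript, flags=re.I)
    if m:
        update["NextStep"] = "Follow up " + m.group(1).strip()
    return update

memo = "Met with Dana Lee about a deal worth $50,000, follow up next Tuesday"
print(memo_to_crm_update(memo))
```

A production system would replace the regular expressions with a trained language model, but the shape of the pipeline is the same: transcript in, structured record out.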
While business people could, in theory, dictate notes through a consumer artificial intelligence tool like Siri or Alexa, Socher said that many professionals, such as bankers, are restricted in what data they can share with those tools, which weren’t made with business-class security requirements in mind.
Since its clients are businesses that sometimes compete with one another, Socher said Salesforce doesn’t cross-contaminate customer data, and Einstein is only able to pull data from that user’s own account.
Einstein Voice Assistant can also be configured to understand slang and vocabulary that is unique to a particular company. One company may frequently reference an acronym or the name of a product that other companies don’t say at all, for example.
“At Salesforce we have the ‘V2MoM’ process, where every employee types out their vision and values, methods, obstacles and measurements. ‘V2MoM’ is an odd term, but in our own speech recognition system, we can very much integrate that,” Socher said.
“Basically every company has its own lingo,” Socher added. “Every set of people has their own language, which is why language is so interesting to work with in AI. That configurability is something you expect in the enterprise world.”
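One way this kind of per-company configurability could work, purely as an illustrative sketch and not Salesforce's actual mechanism, is a custom dictionary that maps phonetic spellings of company jargon back to canonical terms after transcription:

```python
import re

# Hypothetical per-company dictionary: how a transcriber might spell
# a term phonetically -> the canonical in-house spelling.
COMPANY_TERMS = {
    "v two mom": "V2MoM",
    "v2 mom": "V2MoM",
    "sfdc": "Salesforce",
}

def normalize_transcript(text: str) -> str:
    """Replace phonetic spellings of company jargon with canonical terms."""
    for spoken, canonical in COMPANY_TERMS.items():
        text = re.sub(re.escape(spoken), canonical, text, flags=re.IGNORECASE)
    return text

print(normalize_transcript("Finished my v two mom draft for SFDC"))
```

Real systems typically push custom vocabulary deeper, into the speech model itself, but a post-processing dictionary conveys the idea: each company supplies its own lingo.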
In addition to the Voice Assistant, Salesforce announced that customers can build their own voice bots on the Einstein Bot Platform. This means a company can launch a customer service bot, for example, which customers can interact with through their Google Assistant or Amazon Alexa. A customer can ask routine questions through their smart speaker, and then Salesforce will source the appropriate response from the company’s Salesforce profile.
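The routing a service bot does for routine questions can be reduced to a small sketch. This is a hypothetical keyword-matching toy, not the Einstein Bot Platform's method: it scores each canned answer by how many of its keywords appear in the customer's question and falls back to a human agent when nothing matches.

```python
# Toy FAQ: keyword tuples mapped to canned answers.
FAQ = {
    ("hours", "open", "close"): "We're open 9am to 5pm, Monday through Friday.",
    ("return", "refund"): "Returns are accepted within 30 days with a receipt.",
    ("ship", "delivery"): "Standard shipping takes 3-5 business days.",
}

def answer(question: str) -> str:
    """Pick the canned answer whose keywords best match the question."""
    words = question.lower()
    best, best_hits = None, 0
    for keywords, reply in FAQ.items():
        hits = sum(1 for k in keywords if k in words)
        if hits > best_hits:
            best, best_hits = reply, hits
    return best or "Let me connect you with an agent."

print(answer("When do you open?"))
print(answer("How do I get a refund?"))
```

A real bot would use intent classification rather than keyword counts, and would pull answers from the company's Salesforce data rather than a hardcoded table, but the request-route-respond loop is the same.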
Both tools are in a private pilot right now, but Einstein Voice Assistant will be in open pilot in October, and Einstein Voice Bots will be in open pilot in June 2019.