Sir John Kirwan talks to a Digital Human. He’ll be using FaceMe’s platform to design AI-powered digital mental health coaches.
IBM has published a case study on FaceMe’s Digital Humans.
Customers love the speed and convenience of digital channels, but the absence of personal interactions makes it harder for businesses to differentiate themselves. FaceMe uses IBM technologies to create lifelike Digital Humans who respond to spoken inquiries in a natural way – providing help, advice and the all-important “human” touch.
Analysts estimate that, within ten years, 85 percent of interactions between businesses and their customers will happen through digital channels. While these channels offer speed and convenience at low cost, substituting a digital interaction for a personal one creates a less differentiated customer experience. And with 73 percent of customers ranking quality of experience alongside price and service as a key influence on brand loyalty, can businesses really afford to give up the opportunity to stand out from their competitors? The stakes are high: churn and lost revenue are the natural consequences of poor customer experience.
What if there were a way to bring the human element into digital interfaces to create compelling interactions? To personalize the digital customer relationship experience at scale? These questions prompted Danny Tomsett, Founder and CEO of FaceMe, to imagine how the positive impact of face-to-face communications in sales could be integrated into digital channels.
“Ultimately, people embody a brand’s values, and the emotional connection with people creates engagement and loyalty,” Tomsett explains. “We believe customer experience is the new currency. By enabling companies to understand emotions, express empathy and converse naturally over digital channels, we aim to help them boost the value of those experiences.”
Drawing on its deep expertise in fields such as computer vision, emotional understanding and real-time animation, FaceMe set itself an exceptionally ambitious goal: to create realistic, three-dimensional Digital Humans who can understand spoken language, process visual cues about the speaker’s mood, assess the customer’s needs, and respond naturally with appropriate facial expressions.
Read the full case study here.