Did you get it? Are you sure? Missed and mixed signals are common problems in human interpretation, and perceptive technologies have the power to correct both. New advancements in perceptive technologies can help you design and deliver more compelling and personalized experiences, from entertainment to engagement, that not only track how your end users respond in real time but also adjust accordingly.
Why should you care? Because your competitors are already incorporating emotion detection and personalized responses into their experience design. Customer service long ago adopted caller heat maps to prioritize and route complaints, but there are less obvious and more compelling ways in which perceptive technologies can advance your brand and product roadmap.
Small but Mighty
Sensors are the foundation of perceptive technologies and generate signals for real-time or asynchronous analysis. Sensors fall into two basic categories: hard and virtual. The distinctions between the two types and their use cases are significant, and the rapid emergence of new classes of virtualized sensors will have a major impact on applications.
Hard Sensors
Hard sensors measure the physical attributes of an object. They can measure weight, vibration, pressure, touch, movement and so on. Lights in a room might sense your presence, for example, and dim or brighten accordingly. Soap and paper towel dispensers that sense your hand are quite common. Parking sensors in congested areas alert drivers to open spots, and you’ve probably also heard your car chirp at you when an obstacle is detected as you try to change lanes.
Common functions of hard sensors include the following:
- Environmental sensors measure light values, temperatures, air quality and so on
- Chemical sensors measure allergens and toxicants in the air
- Presence sensors measure the location and movement of objects
- Biometric sensors measure physiological attributes such as heart rate, blood pressure and glucose levels
Virtual Sensors
Virtual sensors, also referred to as soft sensors, allow for abstraction. They aggregate one or more sensor data streams and produce a derived output, powered by simple software analytics, change detection algorithms and other feature detection techniques. They can recognize well-known patterns as well as atypical behaviors. You can imagine virtual sensors being extended to analyze voice recordings in real time to “hear” anger in someone’s voice, or to analyze real-time video to “see” sadness in a detected frown.
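To make the idea concrete, here is a minimal sketch of a virtual sensor that fuses several hard-sensor streams into a single derived signal. The sensor names, weights and thresholds are hypothetical, chosen only to illustrate the aggregate-and-smooth pattern described above:

```python
from collections import deque


class VirtualOccupancySensor:
    """Derives a single 'room occupied' signal from several hard-sensor
    streams. All inputs, weights and thresholds are illustrative only."""

    def __init__(self, window=5, threshold=0.5):
        self.readings = deque(maxlen=window)  # recent fused scores
        self.threshold = threshold

    def update(self, motion, sound_level, co2_ppm):
        # Normalize each raw stream to a 0..1 score (illustrative scaling).
        score = (
            0.5 * (1.0 if motion else 0.0)
            + 0.3 * min(sound_level / 60.0, 1.0)              # dB vs. a quiet room
            + 0.2 * min(max(co2_ppm - 400, 0) / 600.0, 1.0)   # rise above outdoor CO2
        )
        self.readings.append(score)
        # Smooth over the window so a single spike doesn't flip the output.
        avg = sum(self.readings) / len(self.readings)
        return avg >= self.threshold


sensor = VirtualOccupancySensor()
occupied = sensor.update(motion=True, sound_level=55, co2_ppm=900)  # True
```

The same shape generalizes: swap in an audio stream and a change-detection model in place of the weighted sum, and the derived output becomes “anger heard” rather than “room occupied.”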
Emotion Sensing—Powered by AI
The human emotional state is never static, which makes sensing an evolving emotional state highly complex. The key is multi-modal signals derived via speech and image recognition, natural language understanding, biometrics and other AI techniques that detect deviations from established baselines. Combined in analysis, these signals provide subtle cues about a person’s intention and state of mind, as well as their physical, psychological and emotional well-being. Want to detect sarcasm in your teenage children? New solutions might soon help you.
Multi-modal input data types include the following:
- Spoken words (natural language processing)
- Voice tone (prosody)
- Facial recognition (emotion)
- Hand gestures
- Gaze and focus
- Body language/posture
- Dynamic physical behavior
- Spatial proximity
- Excitement and stress levels (heart rate, pupil dilation)
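One common way to combine these modalities is late fusion: score each modality independently, then blend the per-modality confidences with a weighted average. The sketch below assumes hypothetical modality names, emotion labels and weights; it is not any vendor’s actual pipeline:

```python
def fuse_emotion_scores(modality_scores, weights=None):
    """Late fusion of per-modality emotion estimates.

    modality_scores: {modality: {emotion: confidence in 0..1}}
    weights: optional {modality: weight}; defaults to equal weighting.
    Returns (most_likely_emotion, fused_scores). Illustrative only.
    """
    weights = weights or {m: 1.0 for m in modality_scores}
    total = sum(weights[m] for m in modality_scores)
    fused = {}
    for modality, scores in modality_scores.items():
        w = weights[modality] / total
        for emotion, conf in scores.items():
            fused[emotion] = fused.get(emotion, 0.0) + w * conf
    return max(fused, key=fused.get), fused


label, fused = fuse_emotion_scores({
    "face":    {"happy": 0.8, "angry": 0.1},   # facial-expression model
    "prosody": {"happy": 0.3, "angry": 0.6},   # voice-tone model
})
# label -> "happy": the face signal outweighs the conflicting voice tone
```

Disagreement between modalities, as in the example, is exactly the sarcasm problem discussed below: the words and the tone point in different directions, and context decides which to trust.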
Emotion sensors are already on the market. Affectiva, for example, is a leading provider of “Emotion as a Service.” The company provides Software Development Kits (SDKs) that analyze facial expressions, word choice and vocal patterns to indicate emotions such as sadness, anger, happiness, confusion and so on. Affectiva has enabled the next generation of immersive experience developers to create authentic and emotionally rich games through its integration with the Unity platform.
Emotional Disambiguation
Let’s go back to the notion of sarcasm. The same comment delivered in the same way could mean very different things from different speakers or even the same speaker in different moods. Context is critical. An individual’s life experience, use of vocabulary, implicit and explicit social cues, biases, psychology, cultural norms and non-universal nuances all shape context.
AI systems are getting smarter in matters of the human psyche. Automated systems will eventually be able to project empathy, and simple matters such as detecting a genuine versus a sarcastic “thanks” will be possible too. Keep in mind that although humans can exhibit defensive behavior, AI systems won’t. Think you adequately resolved a crucial disagreement? Sensors can help augment your understanding of the other party and eliminate interpretive error.
Applications of Perceptive Technologies
Conversational AI
The current era of communication with virtual assistants was ushered in by chatbots and the convergence of machine learning, speech recognition and natural language processing. Today’s commonly referenced voice assistants, Siri and Alexa, are one dimensional, designed for a specific kind of interaction or task, such as finding directions, answering simple queries like weather forecasts, or placing orders. They are limited.
As perceptive technologies mature, expect new immersive experiences to incorporate conversational AI techniques that can sense intention and emotions in a real-time context and respond appropriately. New experiences on the horizon will include virtual scholars teaching seminars, virtual therapists that are available 24×7 and virtual personal shoppers who know your preferences and real-time body specifications.
Active Storytelling – Story-Specific AI Agents
Today, you might sit back and watch a story unfold. In the future, the story will choose its flow and finale based on how you respond and appear to be feeling. Sensors that read the audience will help direct an AI agent to alter the story experience. AI agents continue to evolve in their ability to mimic individuals (like actors and celebrities) and generate two-way conversation, although fidelity of impersonation is still a work in progress.
Game designers have been introducing AI agents as non-player characters for some time. Incorporation of emotional intelligence into character design is further evolving, and the University of California at Santa Cruz’s Expressive Intelligence Studio is at the forefront.
Intelligent Assistants and Companion Robots
Smart speakers hit their stride in 2017. Today, one in six Americans owns one. Although it was spoofed on Saturday Night Live, there is truth to Alexa helping seniors stay socially connected. The intuitive voice-first interface with its ever-evolving set of services—entertaining games and content, on-demand video collaboration, Alexa-to-Alexa messaging and helpful reminders—significantly enhances their daily lives. Bloomberg has reported that Amazon is working on a mobile Alexa, a sort of social robot that would enable more personalized experiences.
The next generation of personal social companion robots is likely to sense your state of mind, learn your likes and dislikes, monitor your daily routines, perform basic household chores and even entertain you. In some markets, owners have anthropomorphized their robots. Japan’s population has taken to “physical” robotic companions such as Paro and Kirobo, which can forge emotional connections with people of all ages and help avert feelings of isolation and depression. Cozmo, the charming and playful toy robot produced by Anki, was designed to change its behavior and grow with its owner as it forges an emotional attachment.
Predictive-Sensitive Homes
How are you feeling? Your house may soon be able to tell you. With an integrated array of sensors and cloud-based AI and machine-learning algorithms, predictive-sensitive homes will monitor the baseline behaviors of you and your loved ones. Through a combination of sensor types, important changes such as reduced mobility, early symptoms of dementia, anger, fear, anxiety and depression will be detected earlier and with greater accuracy than through self-reporting or human observation. Environmental, floor and behavioral sensors will be able to “mood wash” the home to reflect or influence your state of mind.
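The baseline-monitoring idea reduces to a simple statistical test: learn what is normal for a person, then flag readings that drift far from it. The sketch below uses a z-score against historical values; the metric (daily steps through the house) and the threshold are hypothetical, and a real system would use far richer models:

```python
import statistics


def detect_baseline_shift(history, recent, z_threshold=2.0):
    """Flag when a new reading drifts from a person's established baseline.

    history: past values of some behavioral metric (e.g., daily steps
    through the house). recent: the newest reading. Returns (flagged, z).
    A plain z-score test, purely illustrative.
    """
    mean = statistics.fmean(history)
    stdev = statistics.pstdev(history) or 1e-9  # guard against zero spread
    z = (recent - mean) / stdev
    return abs(z) >= z_threshold, z


# A week of normal mobility, then a sharp drop worth surfacing to a caregiver.
flagged, z = detect_baseline_shift([120, 118, 121, 119, 122], 60)  # flagged is True
```

The per-person baseline is the point: 60 steps might be perfectly normal for one resident and an alarming change for another, which is why population-wide thresholds underperform learned individual baselines.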
Product and Talent Implications
Missed signals won’t entirely be a thing of the past, but sensor-stimulated empathy will undoubtedly be a thing of the future. Building perceptive technologies into your product roadmap will require a considerable amount of user testing and adoption smoothing. Acceptance of these types of technologies and their implications ranges considerably across demographic and psychographic cohorts and use cases. It will be critical to articulate the value of monitoring and the uses of data generated as well, so that end users embrace new solutions.
Sophisticated product teams are already bolstering their roadmaps with sensors and sensor-driven data, powered by new networks and platform capabilities. Working with perceptive technologies early and training your systems to get smarter with them will give your company an early advantage. To power your product roadmap with perceptive technologies, seek out futurists and people experienced in AI, psychographics, mechatronics, human-robot interaction design, and privacy and security. Interested in collaborating with us on this topic? Reach out to the CableLabs Market Development department.
Take a look at how some of these perceptive technologies will come to life in this 2017 CableLabs vision-casting video: The Near Future: A Better Place.
In the next installment of our Emerging Technology Timeline, we will discuss how professions will be reimagined in a world where emerging technologies are dramatically impacting companies, customers and employees.
About the Authors
Anju Ahuja
In our ever-evolving marketplace, Anju believes that taking a “Future Optimist” approach to solving challenging problems manifests solutions that benefit both the individual and the enterprise. Today Anju takes this approach to answer questions for emerging technologies like AR, VR, MR, AI and how they will work with traditional media, communications and the broader global cable industry. As Vice President of Market Development and Product Management, Anju leads the team whose charge is to enable transformative end user experiences, and revolutionize the delivery of new forms of content, while also unleashing massive monetization opportunities. Anju also serves on the Board of Directors of Cable & Telecommunications Association for Marketing (CTAM) as well as the President’s Advisory Council of Northern California Women in Cable Telecommunications (WICT). She is a Silicon Valley Business Journal Women of Influence 2018 honoree.
Martha Lyons
Inventor, Futurist and Technologist, Martha Lyons is the Director of Market Development at CableLabs. With a wide-ranging career at Silicon Valley high tech companies and non-profits, Martha has over two decades of experience in turning advanced research into reality. A leading authority in the initiation and development of first-of-a-kind solutions, her current focus is the identification of industry-leading opportunities for the Cable industry. She is personally interested in how advances in the areas of intelligent agents, blockchain, bioengineering, novel materials, nanotech and holographic displays will create opportunities for disruptive innovation, to the delight of end users, in industries ranging from healthcare, retail and travel to media and entertainment. When Martha is not inventing the future, she enjoys disconnecting from technology and spending time outdoors, preferably near some body of water.