Opsis is a spin-off from A*STAR / Advanced Digital Sciences Center (ADSC) based on 5 years of R&D in medical research.
The technology already has use-cases in:
- Tele-Health, where video care providers get real-time emotional feedback on the state of people in need. Emotional signals can be discovered and interventions facilitated with any video platform or device.
- Tele-Education, with insight metrics on the receptiveness of teaching content, helping educators identify the state of online learners, e.g., confused, stressed, or bored.
- Tele-HR for online recruitment or training, with behavioural-analysis data interpreted on the basis of psychometric evaluation.
The key factors differentiating the company's emotion-recognition A.I. from others in the market are:
- The only solution in the world using the psychology circumplex model, resulting in more precise emotion measurements based on two-dimensional data points.
- Real-time processing and analysis of emotional state for both individuals and large crowds; most competitors support only single-person analysis.
- Winner of six international awards and accolades. Accounts for ethnicity differences, with an accuracy of 93% vs. the 70% available in the market.
- Detection of thousands of moods with interactive responses, vs. basic seven-label classification.
- The technology accounts for cultural differences with high accuracy, especially for Asian emotional expression.
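The circumplex model referenced above represents each emotional state as a point in a two-dimensional valence–arousal plane. As a rough illustration of how such 2D data points relate to coarse mood labels (the quadrant labels and thresholds below are hypothetical, not Opsis' actual taxonomy):

```python
def circumplex_quadrant(valence: float, arousal: float) -> str:
    """Map a (valence, arousal) point in [-1, 1] x [-1, 1] to a coarse
    emotion quadrant. Labels are illustrative only."""
    if valence >= 0 and arousal >= 0:
        return "excited"   # positive valence, high arousal
    if valence < 0 and arousal >= 0:
        return "stressed"  # negative valence, high arousal
    if valence < 0:
        return "bored"     # negative valence, low arousal
    return "calm"          # positive valence, low arousal
```

Because the underlying measurement is continuous, a 2D model can distinguish thousands of intermediate moods rather than forcing each face into one of seven fixed labels.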
Seamless deployment options include: app, SDK, Raspberry Pi, PCB, and cloud service.
We are looking for a partner to co-develop innovative solution(s) for challenges in an era where society is impacted by COVID.
They could be:
- Service Providers
- System Integrators
- Consumer Devices
- Digital Interactive Displays
- Online platforms
- Computer chips / PCB
- Any hardware/IoT/edge devices
Technology Features, Specifications and Advantages
While most AI solutions focus mainly on the “IQ” aspect of intelligence, emotional intelligence or “EQ” is just as important for machines to be able to interact with humans effectively and naturally.
Opsis' facial-expression detection technology is an essential component for understanding human emotions. Using advanced image processing and computer vision, trained on close to a million real-person psychometric data points, Opsis applies machine learning to conduct non-intrusive emotion analysis automatically. Our machine learning algorithms, trained on thousands of moods, can detect subtle expressions through real-time analysis to support intervention.
We have built a video platform that supports video calls via different services (e.g., Zoom, Skype, and others). Our focus is ease of deployment on computers or smartphones. This avoids the need for the caller to install a special app and allows both caller and consultant to use their existing video-call services. We have implemented live screen capture from different video-communication services or apps, which is fed directly to our API for analysis of the caller's expressions and emotions. The live analysis can be used to provide immediate feedback to the different stakeholders.
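The capture-and-analyze loop described above can be sketched as follows. This is a minimal illustration, not Opsis' actual implementation: `analyze_frame` stands in for the real API call, and the moving-average window used to stabilise live feedback is an assumed parameter.

```python
from collections import deque
from statistics import mean

def analyze_frame(frame):
    """Placeholder for the emotion-analysis API call. In a real
    deployment this would send the captured screen frame to the
    service and return a (valence, arousal) pair; stubbed here."""
    return frame["valence"], frame["arousal"]

def live_feedback(frames, window=5):
    """Feed captured frames to the analyzer and emit one smoothed
    (valence, arousal) reading per frame. Smoothing with a moving
    average keeps live feedback stable rather than jittery; the
    window size is an assumption for this sketch."""
    buf = deque(maxlen=window)
    readings = []
    for frame in frames:
        buf.append(analyze_frame(frame))
        readings.append((
            round(mean(v for v, _ in buf), 3),
            round(mean(a for _, a in buf), 3),
        ))
    return readings
```

In practice the `frames` iterable would come from a screen-capture source, so neither caller nor consultant needs to install anything beyond their usual video-call client.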
1. Market Research : To evaluate whether advertising evoked the intended emotion and to predict sales. A solution in which participants view ads while their facial reactions and emotional responses are captured via webcam.
2. Digital Advertising : Bringing emotional intelligence to digital advertising experiences for consumers, improving brand outcomes for advertisers and increasing revenues for top-flight publishers, using a content-evaluation tool that helps advertisers maximise the emotional and business impact of an ad.
3. Retail : The use of emotion A.I. to track customer satisfaction, predicting and influencing customer behavior, predicting demand via user activities such as visual product search and chatbots. There are retailers who have already installed emotion A.I. technology in stores to capture sentiment data such as visitors’ mood and customer satisfaction.
4. Education : Create a platform to support educational interactions that can adapt to kids' emotions. Such EdTech could sense and respond to students' facial expressions as they go through the learning content. This emotion-tracking mechanism can allow for the incorporation of gamified EQ for kids without using text or language.
5. Recruitment and Employee Safety : Utilise emotion A.I. during job interviews to understand the behaviour and personality of a candidate. The technology can also be used to analyse the stress and anxiety levels of employees.
6. Car Safety : Develop experimental emotion A.I. to understand drivers' and passengers' states and moods, in order to address critical safety concerns and deliver enhanced in-cabin experiences for public transport.
7. Robotics : Incorporate emotional IQ into robots or devices. This hybrid technology integration can allow for robotic attendant or companions for use-cases such as supporting nursing home residents, as well as acting as greeters in retail stores, banks, and hotels.
- Gain emotion insights in a non-intrusive way for content, A/B testing and packaging.
- Improve content design with a practical approach that goes beyond spoken or written feedback, allowing for more genuine emotional feedback.
- Enhance survey measurement, addressing the problem of high survey "satisfaction" scores that mask true emotions, as users tend to be polite.
- Strengthen market-research methods by correlating verbally formulated preferences with emotions for new services or products.
- Deepen behavioral methods and automate the emotion analysis to scale the data collection efforts from multiple sources.
- Discover emotions in real time for customer service; improve productivity, save costs, and analyse user experience to plan future improvements.
- Quantify consumer behavior trends. Optimize everything, from shelf-level displays to store layout, based on customers' emotion analytics.
- Link emotion analytics data from multiple IoT devices capturing emotions/facial expressions, for a broad understanding of user behavior.
- Extend emotion detection for better measurement in interviews and employee morale, and as an HR tool for recruitment strategies.
- Link "human" element to computer interactions, to support creation of EQ powered intelligent personal assistant, for front-facing roles such as concierge service and customer service.
- Many devices have I.Q. but no E.Q. Infusing emotional intelligence into devices can make A.I. truly social, empathetic, and smarter.
- Expand digital out-of-home (DOOH) marketing measurement in public places, in transit, and in commercial and retail venues, on billboards, streets, and roads.
- Widen emotional analysis at the point of experience to provide greater understanding of behavior patterns, which can help predict likely purchasing trends.
- Gain insights into what impacts customer emotions, providing valuable information that drives sales and informs the design of improved products, services, and consumer experiences.
There could be many other possibilities for the integration of such a technology. Technology seekers who see a translational application opportunity are welcome to drop an enquiry for further discussions.