Overview
NextEVI's emotion recognition system analyzes vocal patterns and speech characteristics to detect user emotions in real time. This enables the AI to generate contextually appropriate, empathetic responses, creating more natural and engaging conversations.
Emotion recognition is one of NextEVI's core differentiating features, providing deeper insight into a user's emotional state than traditional voice AI systems offer.
How It Works
Vocal Analysis Pipeline
1. Feature Extraction: Extracts acoustic features from the voice audio
2. Prosodic Analysis: Analyzes speech rhythm, stress, and intonation patterns
3. Spectral Analysis: Examines frequency-domain characteristics
4. Emotion Classification: Identifies specific emotions using ML models
5. Confidence Scoring: Provides reliability scores for detected emotions
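The pipeline's output reaches your application as a map of per-emotion scores plus an overall confidence value. A representative payload is sketched below; the field names match the WebSocket and SDK examples later in this guide, while the values themselves are illustrative:

// Illustrative emotion payload; the shape matches the examples in this guide
const emotionData = {
  dominant_emotion: 'joy',
  confidence: 0.84,   // overall reliability of the classification (0-1)
  emotions: {         // per-emotion scores, each in the 0-1 range
    joy: 0.72,
    neutral: 0.15,
    surprise: 0.08,
    sadness: 0.05
  }
};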
Supported Emotions
NextEVI detects the following emotions with confidence scores:
- Joy: Happiness, excitement, contentment
- Sadness: Sorrow, disappointment, melancholy
- Anger: Frustration, annoyance, irritation
- Fear: Anxiety, worry, nervousness
- Surprise: Astonishment, shock, amazement
- Disgust: Aversion, revulsion, distaste
- Neutral: Calm, balanced emotional state
- Empathy: Compassion, understanding, care
- Confusion: Uncertainty, bewilderment, doubt
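Several examples later in this guide group these emotions by valence for trend and wellbeing calculations. A small constants module keeps those groupings in one place; the constant names here are illustrative, not part of the SDK:

// Supported emotion keys, grouped for the trend/wellbeing examples below.
// These constant names are illustrative, not part of the SDK.
export const SUPPORTED_EMOTIONS = [
  'joy', 'sadness', 'anger', 'fear', 'surprise',
  'disgust', 'neutral', 'empathy', 'confusion'
];

export const POSITIVE_EMOTIONS = ['joy', 'empathy'];
export const NEGATIVE_EMOTIONS = ['sadness', 'anger', 'fear'];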
Accessing Emotion Data
React SDK Integration
Use the emotion data from voice messages:
import React from 'react';
import { useVoice } from '@nextevi/voice-react';

function EmotionAwareChat() {
  const { messages, connect } = useVoice();

  const handleConnect = async () => {
    await connect({
      auth: {
        apiKey: "oak_your_api_key",
        projectId: "your_project_id",
        configId: "your_config_id"
      },
      // Enable emotion detection
      sessionSettings: {
        emotion_detection: {
          enabled: true,
          confidence_threshold: 0.7
        }
      }
    });
  };

  return (
    <div>
      <button onClick={handleConnect}>Start Emotion-Aware Chat</button>
      {messages.map(message => (
        <div key={message.id} className="message">
          <div className="content">{message.content}</div>
          {/* Display emotion data when present */}
          {message.metadata?.emotions && (
            <EmotionDisplay emotions={message.metadata.emotions} />
          )}
        </div>
      ))}
    </div>
  );
}

function EmotionDisplay({ emotions }) {
  // Pick the entry with the highest score; [0] is the emotion name
  const dominantEmotion = Object.entries(emotions)
    .reduce((a, b) => (a[1] > b[1] ? a : b))[0];

  const emotionEmojis = {
    joy: '😊',
    sadness: '😢',
    anger: '😠',
    fear: '😰',
    surprise: '😲',
    disgust: '🤢',
    neutral: '😐',
    empathy: '🤗',
    confusion: '🤔'
  };

  return (
    <div className="emotions">
      <span className="dominant">
        {emotionEmojis[dominantEmotion]} {dominantEmotion}
      </span>
      <div className="emotion-scores">
        {Object.entries(emotions).map(([emotion, score]) => (
          <div key={emotion} className="emotion-score">
            <span>{emotion}:</span>
            <div className="bar">
              <div className="fill" style={{ width: `${score * 100}%` }} />
            </div>
            <span>{(score * 100).toFixed(1)}%</span>
          </div>
        ))}
      </div>
    </div>
  );
}
WebSocket Integration
Listen for emotion update events:
const ws = new WebSocket(
  'wss://api.nextevi.com/ws/voice/conn-123?api_key=oak_your_api_key&config_id=your_config_id'
);

ws.onmessage = (event) => {
  const message = JSON.parse(event.data);

  if (message.type === 'emotion_update') {
    const emotionData = message.data;
    console.log('Dominant emotion:', emotionData.dominant_emotion);
    console.log('Confidence:', emotionData.confidence);
    console.log('All emotions:', emotionData.emotions);

    // React to specific emotions
    handleEmotionChange(emotionData);
  }
};

function handleEmotionChange(emotionData) {
  const { dominant_emotion, confidence } = emotionData;

  // Only react to high-confidence emotions
  if (confidence < 0.7) return;

  switch (dominant_emotion) {
    case 'sadness':
      console.log('User seems sad, responding with empathy');
      // Trigger an empathetic response or change the conversation tone
      break;
    case 'anger':
      console.log('User seems frustrated, switching to calm tone');
      // Switch to de-escalation strategies
      break;
    case 'confusion':
      console.log('User seems confused, offering clarification');
      // Provide additional explanations
      break;
    case 'joy':
      console.log('User seems happy, matching positive energy');
      // Match the user's enthusiasm level
      break;
  }
}
Configuration Options
Configure emotion detection behavior:
Session Settings
{
  "type": "session_settings",
  "data": {
    "emotion_detection": {
      "enabled": true,
      "confidence_threshold": 0.7,
      "update_frequency": "continuous",
      "emotions_to_track": ["joy", "sadness", "anger", "neutral"],
      "analysis_window": 3.0
    }
  }
}
- enabled (boolean): Enable or disable emotion detection
- confidence_threshold (number): Minimum confidence score required to trigger emotion updates (0-1)
- update_frequency (string, default "continuous"): Update frequency: "continuous", "on_turn_complete", or "interval"
- emotions_to_track (array): Specific emotions to monitor (empty array = all emotions)
- analysis_window (number): Time window in seconds for emotion analysis
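With a direct WebSocket connection, these settings are applied by sending the session_settings message shown above. A minimal sketch, assuming ws is the open connection from the earlier example:

// Apply emotion detection settings over an open WebSocket connection
ws.send(JSON.stringify({
  type: "session_settings",
  data: {
    emotion_detection: {
      enabled: true,
      confidence_threshold: 0.7,
      update_frequency: "on_turn_complete",  // less chatty than "continuous"
      analysis_window: 3.0
    }
  }
}));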
React SDK Configuration
const { connect } = useVoice();

await connect({
  auth: authConfig,
  sessionSettings: {
    emotion_detection: {
      enabled: true,
      confidence_threshold: 0.8,
      update_frequency: "continuous",
      analysis_window: 2.5
    }
  }
});
Advanced Features
Emotion History
Track emotion changes over time:
import { useState, useEffect } from 'react';
import { useVoice } from '@nextevi/voice-react';

function EmotionHistoryTracker() {
  const { messages } = useVoice();
  const [emotionHistory, setEmotionHistory] = useState([]);

  const getDominantEmotion = (emotions) => {
    // Entry with the highest score; [0] is the emotion name
    return Object.entries(emotions)
      .reduce((a, b) => (a[1] > b[1] ? a : b))[0];
  };

  useEffect(() => {
    // Extract emotion data from messages that carry it
    const emotions = messages
      .filter(msg => msg.metadata?.emotions)
      .map(msg => ({
        timestamp: msg.timestamp,
        emotions: msg.metadata.emotions,
        dominant: getDominantEmotion(msg.metadata.emotions)
      }));
    setEmotionHistory(emotions);
  }, [messages]);

  const getEmotionTrend = () => {
    if (emotionHistory.length < 2) return 'stable';

    const recent = emotionHistory.slice(-3);
    const positiveEmotions = ['joy', 'empathy'];
    const negativeEmotions = ['sadness', 'anger', 'fear'];

    const positiveCount = recent.filter(item =>
      positiveEmotions.includes(item.dominant)
    ).length;
    const negativeCount = recent.filter(item =>
      negativeEmotions.includes(item.dominant)
    ).length;

    if (positiveCount > negativeCount) return 'improving';
    if (negativeCount > positiveCount) return 'declining';
    return 'stable';
  };

  return (
    <div className="emotion-history">
      <h3>Emotion Trend: {getEmotionTrend()}</h3>
      <div className="timeline">
        {emotionHistory.map((item, index) => (
          <div key={index} className="emotion-point">
            <span className="time">
              {item.timestamp.toLocaleTimeString()}
            </span>
            <span className="emotion">{item.dominant}</span>
            <div className="confidence-bar">
              <div
                className="fill"
                style={{ width: `${item.emotions[item.dominant] * 100}%` }}
              />
            </div>
          </div>
        ))}
      </div>
    </div>
  );
}
Adaptive Response System
Create AI responses that adapt to user emotions:
import { useState, useEffect } from 'react';
import { useVoice } from '@nextevi/voice-react';

function AdaptiveVoiceAssistant() {
  const { messages } = useVoice();
  const [responseStyle, setResponseStyle] = useState('neutral');

  useEffect(() => {
    const recentEmotions = messages
      .slice(-3)
      .filter(msg => msg.type === 'user' && msg.metadata?.emotions)
      .map(msg => msg.metadata.emotions);

    if (recentEmotions.length > 0) {
      const avgEmotions = calculateAverageEmotions(recentEmotions);
      const newStyle = determineResponseStyle(avgEmotions);
      setResponseStyle(newStyle);
      // Send the style preference to the assistant
      updateAssistantStyle(newStyle);
    }
  }, [messages]);

  const calculateAverageEmotions = (emotionList) => {
    // Average each emotion's score across the recent messages
    const avgEmotions = {};
    const emotionKeys = Object.keys(emotionList[0]);
    emotionKeys.forEach(emotion => {
      const sum = emotionList.reduce((acc, emotions) => acc + emotions[emotion], 0);
      avgEmotions[emotion] = sum / emotionList.length;
    });
    return avgEmotions;
  };

  const determineResponseStyle = (avgEmotions) => {
    if (avgEmotions.sadness > 0.6) return 'empathetic';
    if (avgEmotions.anger > 0.5) return 'calming';
    if (avgEmotions.confusion > 0.5) return 'explanatory';
    if (avgEmotions.joy > 0.6) return 'enthusiastic';
    return 'neutral';
  };

  // Illustrative lookup helpers for the style payload and UI indicator
  const getEmpathyLevel = (style) =>
    ({ empathetic: 'high', calming: 'high', explanatory: 'medium' }[style] || 'standard');
  const getDetailLevel = (style) =>
    (style === 'explanatory' ? 'detailed' : 'standard');
  const getStyleIndicator = (style) =>
    ({ empathetic: '💙', calming: '🧘', explanatory: '📖', enthusiastic: '🎉' }[style] || '💬');

  const updateAssistantStyle = (style) => {
    // Build a settings message to adjust the response style
    const systemMessage = {
      type: "session_settings",
      data: {
        response_style: {
          tone: style,
          empathy_level: getEmpathyLevel(style),
          explanation_detail: getDetailLevel(style)
        }
      }
    };
    // Send systemMessage via WebSocket if using a direct connection,
    // or use an SDK method if one is available
  };

  return (
    <div className="adaptive-assistant">
      <div className="current-style">
        Response Style: <strong>{responseStyle}</strong>
      </div>
      <div className="style-indicator">
        {getStyleIndicator(responseStyle)}
      </div>
    </div>
  );
}
Emotion-Based Analytics
Track emotion patterns for analytics:
import { useState, useEffect } from 'react';
import { useVoice } from '@nextevi/voice-react';

function EmotionAnalytics() {
  const { messages } = useVoice();
  const [analytics, setAnalytics] = useState({});

  useEffect(() => {
    const emotionData = messages
      .filter(msg => msg.type === 'user' && msg.metadata?.emotions)
      .map(msg => msg.metadata.emotions);

    if (emotionData.length === 0) return;

    const stats = {
      totalMessages: emotionData.length,
      dominantEmotions: getDominantEmotions(emotionData),
      emotionDistribution: getEmotionDistribution(emotionData),
      emotionalJourney: getEmotionalJourney(emotionData),
      wellbeingScore: calculateWellbeingScore(emotionData)
    };
    setAnalytics(stats);
  }, [messages]);

  const getDominantEmotions = (emotionData) => {
    // Count how often each emotion was dominant, keep the top three
    const counts = {};
    emotionData.forEach(emotions => {
      const dominant = Object.entries(emotions)
        .reduce((a, b) => (a[1] > b[1] ? a : b))[0];
      counts[dominant] = (counts[dominant] || 0) + 1;
    });
    return Object.entries(counts)
      .sort(([, a], [, b]) => b - a)
      .slice(0, 3);
  };

  // Illustrative implementation: average score per emotion across the session
  const getEmotionDistribution = (emotionData) => {
    const distribution = {};
    emotionData.forEach(emotions => {
      Object.entries(emotions).forEach(([emotion, score]) => {
        distribution[emotion] = (distribution[emotion] || 0) + score / emotionData.length;
      });
    });
    return distribution;
  };

  // Illustrative implementation: sequence of dominant emotions, in message order
  const getEmotionalJourney = (emotionData) =>
    emotionData.map(emotions =>
      Object.entries(emotions).reduce((a, b) => (a[1] > b[1] ? a : b))[0]
    );

  const calculateWellbeingScore = (emotionData) => {
    const positiveEmotions = ['joy', 'empathy'];
    const negativeEmotions = ['sadness', 'anger', 'fear'];
    let positiveSum = 0;
    let negativeSum = 0;

    emotionData.forEach(emotions => {
      positiveEmotions.forEach(emotion => {
        positiveSum += emotions[emotion] || 0;
      });
      negativeEmotions.forEach(emotion => {
        negativeSum += emotions[emotion] || 0;
      });
    });

    const total = positiveSum + negativeSum;
    return total > 0 ? (positiveSum / total) * 100 : 50;
  };

  return (
    <div className="emotion-analytics">
      <h3>Emotion Analytics</h3>
      <div className="wellbeing-score">
        Wellbeing Score: {analytics.wellbeingScore?.toFixed(1)}%
      </div>
      <div className="dominant-emotions">
        <h4>Top Emotions</h4>
        {analytics.dominantEmotions?.map(([emotion, count]) => (
          <div key={emotion}>
            {emotion}: {count} occurrences
          </div>
        ))}
      </div>
    </div>
  );
}
Best Practices
Emotion Detection Accuracy
- Use confidence thresholds to filter low-quality detections (see the sketch after this list)
- Consider cultural and individual differences in emotional expression
- Combine emotion data with conversation context for better accuracy
- Allow users to provide feedback on emotion detection accuracy
Privacy and Ethics
- Always inform users that emotion detection is active
- Provide options to disable emotion tracking
- Don't make assumptions about a user's mental state based on a single interaction
- Respect user privacy and the sensitivity of emotion data
Response Adaptation
- Make gradual adjustments rather than sudden tone changes
- Maintain consistency with your brand voice
- Provide fallback responses for uncertain emotion states
- Test emotional response patterns with diverse user groups
Performance
- Cache recent emotion data for trend analysis
- Implement rate limiting for emotion-based actions
- Handle emotion detection failures gracefully
- Monitor emotion detection performance and accuracy
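As referenced above, thresholding and graceful fallback can live in one small helper. A minimal sketch, where the helper name and the 'neutral' fallback are illustrative choices:

// Return the dominant emotion only when the detection clears the threshold;
// fall back to 'neutral' so downstream logic always has a safe value.
function getReliableEmotion(emotionData, threshold = 0.7) {
  if (!emotionData || typeof emotionData.confidence !== 'number') {
    return 'neutral'; // detection failed or data is missing
  }
  return emotionData.confidence >= threshold
    ? emotionData.dominant_emotion
    : 'neutral';
}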
Use Cases
Customer Support
// Detect customer frustration and escalate to a human agent
import { useState, useEffect } from 'react';
import { useVoice } from '@nextevi/voice-react';

function CustomerSupportBot() {
  const { messages } = useVoice();
  const [escalationTriggered, setEscalationTriggered] = useState(false);

  useEffect(() => {
    const recentUserMessages = messages
      .filter(msg => msg.type === 'user')
      .slice(-3);

    const highAngerMessages = recentUserMessages.filter(msg =>
      msg.metadata?.emotions?.anger > 0.7
    );

    // Two or more recent high-anger messages trigger a one-time escalation
    if (highAngerMessages.length >= 2 && !escalationTriggered) {
      triggerHumanEscalation(); // app-specific; one possible sketch below
      setEscalationTriggered(true);
    }
  }, [messages]);

  return null; // rendering omitted for brevity
}
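How escalation actually happens is up to your application. A minimal sketch that notifies a hypothetical backend endpoint; the URL and payload are assumptions, not part of the NextEVI API:

// Hypothetical escalation hook: POST to your own support backend
async function triggerHumanEscalation() {
  await fetch('/api/support/escalate', {  // illustrative endpoint
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ reason: 'repeated_high_anger' })
  });
}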
Healthcare Assistant
// Monitor emotional wellbeing during health consultations
import { useState, useEffect } from 'react';
import { useVoice } from '@nextevi/voice-react';

function HealthcareAssistant() {
  const { messages } = useVoice();
  const [concerningPatterns, setConcerningPatterns] = useState([]);

  useEffect(() => {
    // extractEmotionHistory and suggestSupportResources are app-specific;
    // a sketch of detectConcerningPatterns follows this example
    const emotionHistory = extractEmotionHistory(messages);
    const patterns = detectConcerningPatterns(emotionHistory);
    setConcerningPatterns(patterns);

    if (patterns.includes('persistent_sadness')) {
      suggestSupportResources();
    }
  }, [messages]);

  return null; // rendering omitted for brevity
}
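As noted in the comments, these helpers are application-specific. One way detectConcerningPatterns might flag persistent sadness, as a sketch; the window size, threshold, and pattern name are assumptions:

// Flag 'persistent_sadness' when nearly all recent readings are sad-dominant
function detectConcerningPatterns(emotionHistory, windowSize = 5) {
  const patterns = [];
  const recent = emotionHistory.slice(-windowSize);
  const sadCount = recent.filter(item => item.dominant === 'sadness').length;
  if (recent.length >= windowSize && sadCount >= windowSize - 1) {
    patterns.push('persistent_sadness');
  }
  return patterns;
}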
Educational Tutor
// Adapt the teaching approach based on student emotions
import { useState, useEffect } from 'react';
import { useVoice } from '@nextevi/voice-react';

function AdaptiveTutor() {
  const { messages } = useVoice();
  const [teachingStyle, setTeachingStyle] = useState('standard');

  useEffect(() => {
    // getRecentEmotions averages recent scores; see the sketch below
    const recentEmotions = getRecentEmotions(messages);

    if (recentEmotions.confusion > 0.6) {
      setTeachingStyle('detailed_explanation');
    } else if (recentEmotions.neutral > 0.5) {
      // Sustained neutral affect can signal disengagement
      setTeachingStyle('interactive_engagement');
    } else if (recentEmotions.joy > 0.7) {
      setTeachingStyle('accelerated_learning');
    }
  }, [messages]);

  return null; // rendering omitted for brevity
}
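A minimal sketch of getRecentEmotions, averaging scores across the last few user messages; the window size is an assumption:

// Average each emotion's score over the last few user messages
function getRecentEmotions(messages, windowSize = 3) {
  const recent = messages
    .filter(msg => msg.type === 'user' && msg.metadata?.emotions)
    .slice(-windowSize)
    .map(msg => msg.metadata.emotions);
  if (recent.length === 0) return {};

  const averaged = {};
  recent.forEach(emotions => {
    Object.entries(emotions).forEach(([emotion, score]) => {
      averaged[emotion] = (averaged[emotion] || 0) + score / recent.length;
    });
  });
  return averaged;
}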
Emotion recognition is a powerful feature that should be used responsibly. Always prioritize user privacy and provide clear information about how emotional data is used.