New look, still the highest-accuracy emotion engine.


Communication Is a Two-Way Street


Chloe Duckworth

Jan 10, 2023

I've been thinking a lot about communication standards—who sets them, who follows them, who can get away with breaking them, and who can't.

Our thesis at Valence Vibrations has been that the burden of altering communication has been unduly and unequally placed on the most marginalized people in a given conversation.

Whether that be a Black DEI expert being expected to "check her tone" or code-switch when educating a team, or an autistic man having to mask his natural microexpressions in his workplace, or an immigrant taking accent-reduction classes to make it easier for others to understand them—we are constantly asking people to alter their natural expressions to make others comfortable.

And this behavior continues because we expect it to. I say *we* because, as humans, we all struggle to correctly identify patterns we are less familiar with. Just as it's hard to distinguish a Golden Pheasant from a Partridge if you've never seen either, it's difficult to correctly perceive someone's emotions when you have limited experience of them, or of other people with their exact demographics.

Our advisor, Sharena Rice, describes our goal for Vibes as an emotional passport: a way to experience and learn from the patterns of a more diverse set of voices than we typically encounter (and encode) day to day.

If you’d like to try Vibes for Apple Watch, you can download it here.

Improve Customer Understanding with Emotion AI

Enhance every interaction with emotion AI

