New look, still the highest-accuracy emotion engine.


Experience Emotion as a Sixth Sense

Real-time emotional analysis watchOS app as a communication tool for neurodivergent people.

Apple Watch showing Vibes emotion sensing app by Valence AI

How Vibes Works

Designed to help you respond with empathy, build trust, and form stronger relationships.


Analyzes every few seconds

Conversation near the watch is analyzed every few seconds, keeping the result up to date.


Confidently interprets the emotion

Once the watch confidently recognizes an emotion, a custom tap and animation notify you.

Emotion recognition

Recognizes 7 Basic Emotions

  • Sad

  • Angry

  • Nervous

  • Neutral

  • Surprised

  • Disgusted

  • Happy

See Vibes in Action

Discreet Haptic Feedback

Feel a unique tap for each emotion, paired with a visual label that appears only when your screen is turned toward you, then disappears.

The Power of Neurodiversity

Valence recognizes the power of neurodiversity and seeks to empower people of all neurotypes to better interpret emotions without having to change their fundamental communication style.

Used and trusted by the community

  • "It’s fun to share! And friends love trying to experiment with it!"

    An Autistic Adult

  • "The app was extremely useful and helpful with my everyday life! Conversation was a lot less stressful with this helping me."

    A Neurodivergent Student


Frequently Asked Questions

What data does Vibes take in?

Vibes uses only on-device processing: your vocal data and emotional classifications never leave your device, so neither we nor any third party can access them. Your watch temporarily records brief intervals of speech to analyze their emotional content, then automatically deletes them within seconds, so you cannot access them either.

How much does Vibes cost?

Vibes for Apple Watch costs $8.99 per month (with a one-week free trial) or $89.99 per year on the App Store. We are currently running a promotion for 44% off the first year of a yearly subscription, bringing it to $49.99.

Which devices are supported by Vibes?

Currently, Vibes is available only for Apple Watch Series 4 and later. A standalone iPhone app is coming soon, and in the future we’d like to support other smartwatches and platforms.

Which languages or accents are currently supported?

Vibes currently supports only North American English, with training data spanning diverse ages, genders, races, neurotypes, geographies, and communication styles. More languages and accents are coming soon!

