How It Works

It's simple: Observant uses the infrared depth sensors in iPhone X, XS, and XR to create a map of the user's face, and derives emotion from that map using machine learning combined with additional heuristics. Data is provided in real time through our iOS SDK.

The SDK

The Observant SDK (Software Development Kit) is a lightweight, high-performance module that can be embedded in any iOS app. Observant is easy to integrate into your app, requiring as few as four lines of code. Once installed, the SDK combines infrared sensor data with our own machine learning layer to recognize users' emotions.
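For illustration, a minimal integration might look like the sketch below. The module name, the Observant type, and the callback are hypothetical stand-ins used to show the shape of the idea, not the actual API:

    import ObservantSDK

    // Hypothetical sketch -- identifiers are illustrative, not the shipping API.
    let observant = Observant(apiKey: "YOUR_API_KEY")
    observant.onEmotion = { reading in
        print("Dominant emotion: \(reading.dominant), confidence: \(reading.confidence)")
    }
    observant.start()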

The Observant SDK is flexible, providing two modes of operation: (a) streaming emotion data in real time, or (b) taking emotion snapshots triggered by events. Developers can also attach arbitrary metadata via the SDK.
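Continuing from the sketch above, the two modes and metadata attachment could look roughly like this (method and parameter names are assumptions for illustration only):

    // (a) Streaming: a continuous feed of emotion readings.
    observant.startStreaming { reading in
        print("Current emotion: \(reading.dominant)")
    }

    // (b) Snapshot: capture emotion at a specific event, with arbitrary metadata attached.
    observant.snapshot(event: "level_complete",
                       metadata: ["level": 7, "score": 12_400]) { reading in
        print("Player felt \(reading.dominant) on completing the level")
    }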

The Observant SDK is fast. Latency to the first emotion in a stream or snapshot is between 10 and 200 ms, less time than it takes to blink! All further data is delivered in real time, and Observant scores every supported emotion at every instant.

Depth-Sensor Driven

Using the depth sensors on iPhone X, XS, and XR, Observant recognizes facial expressions with groundbreaking accuracy and speed. Observant uses the IR depth sensor in the iPhone's TrueDepth camera to capture a map of the user's face. This produces better input data for machine learning models than the more commonly used 2D RGB-only images, because edge detection is much easier, particularly at oblique angles or against low-contrast backgrounds.
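The SDK handles all of this internally. For context, the sketch below shows roughly how an app can read TrueDepth-derived face data using Apple's standard ARKit face-tracking APIs; it illustrates the kind of depth-sensor input involved, not Observant's own implementation:

    import ARKit

    // Standard ARKit face tracking -- roughly the kind of depth-sensor input Observant builds on.
    final class FaceDepthReader: NSObject, ARSessionDelegate {
        let session = ARSession()

        func start() {
            // Face tracking requires a TrueDepth camera (iPhone X and later).
            guard ARFaceTrackingConfiguration.isSupported else { return }
            let configuration = ARFaceTrackingConfiguration()
            session.delegate = self
            session.run(configuration)
        }

        func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
            for case let faceAnchor as ARFaceAnchor in anchors {
                // faceAnchor.geometry is a 3D mesh of the face derived from the IR depth map;
                // blendShapes provides per-expression coefficients (smile, frown, etc.)
                // that make useful features for emotion models.
                let smile = faceAnchor.blendShapes[.mouthSmileLeft]?.floatValue ?? 0
                print("mouthSmileLeft coefficient: \(smile)")
            }
        }
    }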

Real-time, Cloud Optional

The Observant SDK performs all processing on device and gives you real-time access to your users' emotions, no cloud required! You can use it for games, for optimally timing help, feedback, or app-rating dialogs, and more. If you choose, emotion analytics can also be uploaded to the Observant web dashboard so you can see exactly what causes your users to leave or convert. Want to get your data out of the cloud? We also offer a Web API (currently in beta). The choice is yours!
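As a rough illustration of the cloud-optional model, configuration might look something like the sketch below, reusing the observant instance from the earlier example. The Configuration type and uploadAnalytics flag are hypothetical, shown only to convey the idea:

    // Hypothetical configuration sketch -- option names are illustrative, not the actual API.
    var config = Observant.Configuration(apiKey: "YOUR_API_KEY")
    config.uploadAnalytics = false      // keep all emotion data on device
    // config.uploadAnalytics = true    // or opt in to the Observant web dashboard
    observant.start(with: config)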

Empowering Research and Focus Groups

Testing mobile apps or content? With Observant, there's no more sifting through hours of video or relying on honest survey answers. You can not only ask questions but also understand subjects' feelings from instant to instant. And with Observant's real-time emotion data available, you can ask individual participants to delve deeper into particular flashes of emotion, ones they might not otherwise remember a minute later.

Observant can also be used to test non-mobile experiences. Affix an iPhone to a MacBook to test web content, or set up an iPhone in a mock retail environment to evaluate the in-store customer experience.

Like the sound of Observant-powered focus groups, but don't want to set one up yourself? Contact us.

FAQ

Which platforms do you support?

We support any native app on iPhone X, XS, and XR. We also support mobile web via a native wrapper.

Desktop environments are supported as well. We do this by affixing an iPhone X to your desktop display and using custom native modules that map the user's emotion (captured via the infrared dot projector) to what is shown on that display.

Is this a battery hog?

Nope! As it turns out, iPhone X continuously runs the infrared dot projector whenever the screen is on, in order to prevent the screen from dimming while the user is attentive.