Signals: typing rhythm · gaze pattern · micro-pauses · context switches · fatigue trend

Device-level cognitive state modeling for adaptive AI systems

Synstate Labs is an independent research and engineering lab focused on device-level modeling of cognitive state for AI systems.

Our core technology, the Cognitive State Engine, runs on personal devices and turns continuous interaction signals into a live model of user state that applications and AI agents can use to adapt their behavior.

Cognitive State Engine

A privacy-first, on-device engine that fuses interaction signals into a compact set of state variables, enabling adaptive timing, pacing, and interaction intensity across applications and AI agents.

[Live console preview: Typing · Mouse · Context → Focus 0.82 · Workload 0.64 · Fragmentation 0.31]

Multimodal signal fusion

Typing, mouse, gaze, context switching, micro-pauses — fused into a unified state model.

Live state variables

Focus, workload, fragmentation, fatigue trend — updated continuously in real time.

On-device processing

Raw content stays local. State extraction happens entirely on the user's device.

State interface for adaptation

A compact API for apps and AI agents to adapt timing, pacing, and interaction intensity.
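As a rough illustration of what such a compact state interface could look like, here is a minimal sketch. All names here (`CognitiveState`, `suggest_pacing`, the variable ranges) are our own illustrative assumptions, not Synstate's actual API:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class CognitiveState:
    """Compact live state an app or agent could poll (illustrative only)."""
    focus: float          # 0..1, higher = more focused
    workload: float       # 0..1, higher = heavier load
    fragmentation: float  # 0..1, higher = more fragmented attention
    fatigue_trend: float  # -1..1, positive = fatigue rising

def suggest_pacing(state: CognitiveState) -> str:
    """Map state variables to an interaction-intensity hint."""
    if state.fatigue_trend > 0.5 and state.focus < 0.4:
        return "minimal"   # short answers, suggest a break
    if state.fragmentation > 0.6:
        return "chunked"   # small steps, fewer context switches
    return "normal"

# Example: the values shown in the console preview above.
state = CognitiveState(focus=0.82, workload=0.64,
                       fragmentation=0.31, fatigue_trend=0.1)
print(suggest_pacing(state))  # normal
```

The point of a small, typed surface like this is that apps only see a handful of derived variables, never the raw interaction stream.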

See it working

Synstate Live Console

System-level live demo of real-time state modeling: from interaction signals to live state variables.

Synstate Focus Playground

Application-level demo to test state-aware interaction patterns such as pacing, simplification, and break timing.

Interactive Demo

Live demo (by request)

See how Synstate's Cognitive State Engine tracks user state in real time — from interaction signals to live state variables, in a desktop console and interaction playground.

We offer a guided live demo session where we walk through the Synstate Live Console and the Focus Playground, discuss your use cases, and answer technical questions about integration and roadmap.

Book a guided demo

What changes when AI can see state

Without state awareness
User: "I'm fine, just need to finish this report"
AI: "Sure! Here's a template for your report. Let me know if you need anything else! 😊"

With Synstate perception
State: Focus Low · Fatigue Rising · Session 3h 42m
User: "I'm fine, just need to finish this report"
AI: "You've been at this for nearly 4 hours and your focus pattern suggests fatigue. I'll keep the template minimal — here are the 3 essentials. Consider a short break after this section."

Synstate doesn't create understanding. It reveals what was already there in the signal.

From snapshot to trajectory

Current AI sees each interaction as a fresh start. Synstate tracks patterns over days — detecting the slow drift from baseline to burnout before it becomes a crisis.

Rested baseline: normal workload, regular breaks, stable patterns

Back-to-back meetings: context switching rises, micro-pause frequency drops

Late-night work sessions: 3 consecutive days of irregular patterns detected

Burnout trajectory detected: Synstate alert → manager notified → schedule review

Intervention and recovery: adjusted schedule, resilience stabilizing
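A trajectory check like the one in this timeline can be reduced, in its simplest form, to a streak rule over daily pattern classifications. This toy sketch is our own assumption about one way to express it, not Synstate's detection model; the three-day threshold mirrors the "3 consecutive days" signal above:

```python
def burnout_trajectory(daily_irregular: list[bool], threshold: int = 3) -> bool:
    """Flag a burnout trajectory after `threshold` consecutive irregular days (toy rule)."""
    streak = 0
    for irregular in daily_irregular:
        streak = streak + 1 if irregular else 0
        if streak >= threshold:
            return True
    return False

# A week with three irregular days in a row trips the alert;
# isolated irregular days do not.
print(burnout_trajectory([False, True, True, True]))   # True
print(burnout_trajectory([True, False, True, False]))  # False
```

In practice the per-day classification would itself come from the state variables (fragmentation, fatigue trend) rather than a hand-set boolean, but the snapshot-vs-trajectory distinction is exactly this move from per-interaction values to patterns over days.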

Research areas

Multimodal interaction signals

Modeling user state from everyday human–computer interaction: keyboard and mouse dynamics, gaze and micro-pauses, context switching and session structure.

Cognitive load and attention dynamics

Inferring attention, fragmentation and workload over time, rather than momentary "focus scores", with an emphasis on stability and interpretability.

State interfaces for AI systems

Designing APIs and protocols that let agents and applications adapt timing, pacing and intensity of interaction without creating new failure modes.

On-device privacy and safety

Methods for extracting useful state while keeping raw content local, minimizing identifiability and misuse risks.

Current work

Cognitive State Engine prototype — on-device engine that fuses typing, mouse, gaze and context signals into a small set of state variables (focus, workload, fragmentation, fatigue trend).

Live console and playground demos — internal tools to visualize state in real time and test how applications can adapt navigation, pacing and breaks.

Early design-partner studies — qualitative and quantitative studies with early partners to evaluate behavioral change and perceived overload reduction in state-aware systems.
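The fusion step named in the engine prototype above could be approximated, at its crudest, by exponentially smoothing normalized per-signal features into state variables. This is a deliberately simplistic sketch under our own assumptions (pre-normalized inputs, hand-picked proxies), not the engine's actual model:

```python
class EmaFuser:
    """Fuse per-tick signal features into smoothed state variables (toy model)."""

    def __init__(self, alpha: float = 0.1):
        self.alpha = alpha  # smoothing factor: higher = faster response to new signals
        self.state = {"focus": 0.5, "workload": 0.5, "fragmentation": 0.5}

    def update(self, typing_speed: float, switch_rate: float, pause_rate: float) -> dict:
        # Inputs are assumed pre-normalized to 0..1.
        targets = {
            "focus": (1 - switch_rate) * (1 - pause_rate),  # steady, uninterrupted work
            "workload": typing_speed,                       # crude proxy
            "fragmentation": switch_rate,
        }
        # Exponential moving average pulls each variable toward its current target.
        for key, target in targets.items():
            self.state[key] += self.alpha * (target - self.state[key])
        return dict(self.state)

# Repeated low-switch, low-pause ticks drive the focus estimate upward.
fuser = EmaFuser(alpha=0.5)
fuser.update(0.6, 0.1, 0.1)
print(fuser.update(0.6, 0.1, 0.1)["focus"] > 0.5)  # True
```

The smoothing is what makes the variables a trend rather than a momentary score: a single context switch nudges the state, while a sustained pattern moves it.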

Open questions

We are currently exploring questions such as:

How stable and transferable are state representations across tasks and devices?

How should agents negotiate with each other when they all want user attention?

What is the minimal state interface that is still useful for real applications?

How do we measure long-term impact on cognitive overload and burnout risk, not just short-term engagement?

Read the full scientific foundation

We collaborate with research groups and product teams interested in human-state modeling for AI systems.

Get in touch about research collaboration