Introduction

Learn what Cephable is and how to embed its personal assistant or adaptive controls into your app or game.

What is Cephable?

Cephable is a personal AI assistant platform that processes user input — facial expressions, head movement, voice commands, eye gaze, breath, and custom button switches — entirely on-device. All inference runs locally on the user's device inside the Cephable app, so no camera feed, microphone audio, or biometric data ever leaves the device.

The Cephable app acts as the on-device engine. Users configure their personal profiles with the inputs that work best for them. Your app or game receives only the resulting command strings (e.g., "eyebrows_raised", "move forward") — never raw sensor data.

How it works

┌─────────────┐        Device Hub         ┌──────────────────┐
│  Cephable   │ ─────────────────────────▶│  Your App / Game │
│ (on-device) │   sends command strings   │                  │
└─────────────┘                           └──────────────────┘
  1. Developer registers a project at portal.cephable.com and receives an OAuth Client ID, Client Secret, and Device Type ID.
  2. User authenticates in your app via Cephable OAuth — your app creates a Virtual User Device on their behalf.
  3. Virtual device appears in the user's Cephable app — the user assigns a profile with their preferred inputs.
  4. Your app connects to the Cephable Device Hub at https://services.cephable.com/device using a real-time WebSocket connection and a device token.
  5. Commands arrive as DeviceCommand events — facial expressions, voice, gestures, virtual buttons — and your app maps them to actions.
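Steps 4–5 can be sketched as follows. This is a minimal illustration, not the official client: the JSON frame shape, the query-string token, and the event field names are assumptions — the Web SDK and C# client wrap this handshake for you, and the actual wire format is documented in the Swagger UI at services.cephable.com/swagger.

```typescript
// Assumed frame shape: the Device Hub delivers each DeviceCommand as a
// JSON text frame like { "command": "eyebrows_raised" }.
type DeviceCommand = { command: string };

// Parse one incoming frame; returns null for frames we don't recognize.
function parseDeviceCommand(frame: string): DeviceCommand | null {
  try {
    const data = JSON.parse(frame);
    return typeof data?.command === "string" ? { command: data.command } : null;
  } catch {
    return null;
  }
}

// Open the real-time connection using a device token obtained via OAuth.
function connectToDeviceHub(
  deviceToken: string,
  onCommand: (cmd: DeviceCommand) => void,
) {
  // Hypothetical query-string auth — the official SDKs handle this for you.
  const WS = (globalThis as any).WebSocket; // browser / recent Node global
  const ws = new WS(`wss://services.cephable.com/device?token=${deviceToken}`);
  ws.onmessage = (event: { data: unknown }) => {
    const cmd = parseDeviceCommand(String(event.data));
    if (cmd) onCommand(cmd); // e.g. map "move forward" to a game action
  };
  return ws;
}
```

After the OAuth flow completes, your app would call `connectToDeviceHub(token, cmd => dispatch(cmd.command))` and route each command string to an action.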

Integration paths

Cephable supports two primary integration patterns:

Apps and Web

Embed Cephable's personal assistant into your web or Windows app. Users interact with your app hands-free through voice, facial expressions, and head movement — all processed privately on their own device. No data is sent to external AI services.

Use the Web SDK npm package for browser apps, or the Cephable.WPF NuGet package for Windows desktop apps.

Best for: Web apps, desktop Windows apps, Electron apps that want to embed a private, on-device personal assistant.
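One common pattern for a web app is a small command router that maps incoming command strings to page actions. The command names below ("scroll down", "scroll up") are illustrative — the actual strings depend on the user's Cephable profile.

```typescript
type Action = () => void;

// Routes Cephable command strings to app-defined actions.
class CommandRouter {
  private actions = new Map<string, Action>();

  // Register an action for a command (case-insensitive).
  on(command: string, action: Action): this {
    this.actions.set(command.trim().toLowerCase(), action);
    return this;
  }

  // Run the matching action; returns false if the command is unmapped.
  dispatch(command: string): boolean {
    const action = this.actions.get(command.trim().toLowerCase());
    if (!action) return false;
    action();
    return true;
  }
}

// Illustrative wiring — in a browser these callbacks would scroll the page,
// focus elements, trigger clicks, and so on.
const router = new CommandRouter()
  .on("scroll down", () => console.log("scrolling down"))
  .on("scroll up", () => console.log("scrolling up"));
```

Returning `false` for unmapped commands lets you log or surface unrecognized input instead of silently dropping it.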

Web Integration Overview
.NET Integration Overview

Games

Integrate Cephable's adaptive controls to make your game more accessible. Players who can't use a standard controller can use facial expressions, head movement, voice commands, or custom switches — all configured through their personal Cephable profile — to play your game.

Use the C# client (for Godot Mono, MonoGame, or generic .NET game engines) or the Unity sample with its VirtualController.cs and OAuth2Manager.cs scripts.

Best for: PC games and Unity games that want to support players with motor or physical disabilities through adaptive controls.
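Games typically poll held input every frame, while Cephable delivers discrete command events. A sketch of bridging the two (shown in TypeScript for consistency with the examples above — in Unity or Godot Mono you would write the same idea in C#; the command names and press/release convention here are assumptions):

```typescript
// Tracks which virtual actions are currently "held", so the game loop can
// poll it like any other input device. Assumed convention for illustration:
// "<action>" presses, "stop <action>" releases, "neutral" releases all.
class VirtualGamepad {
  private held = new Set<string>();

  // Feed each incoming DeviceCommand string into this method.
  handle(command: string): void {
    const cmd = command.trim().toLowerCase();
    if (cmd === "neutral") {
      this.held.clear();
      return;
    }
    if (cmd.startsWith("stop ")) {
      this.held.delete(cmd.slice(5));
      return;
    }
    this.held.add(cmd);
  }

  // Poll from the game loop each frame.
  isHeld(action: string): boolean {
    return this.held.has(action.trim().toLowerCase());
  }
}
```

With this shape, a "move forward" command keeps the character moving until a "stop move forward" or "neutral" command arrives — useful for players who can't hold a physical button down.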

Virtual Controller Integration
Unity Integration

Prerequisites

Before you start building:

  1. Start a free 30-day trial at services.cephable.com/trial/developers — no credit card required
  2. Create a project at portal.cephable.com to receive your OAuth Client ID, Client Secret, and Device Type ID
  3. Install the Cephable app on an iOS, Android, Mac, or PC device for testing

Resources

Resource                      Link
Cephable Portal               portal.cephable.com
API / Swagger UI              services.cephable.com/swagger
Virtual Controller Samples    github.com/Cephable/Cephable-VirtualController-Sample
Unity Samples                 github.com/Cephable/Cephable-Unity-Samples
Tutorials                     cephable.com/tutorials

Next steps