Bose AR BLE API



Table of Contents:

  1. What is Bose AR?
  2. Connecting to Your Bose AR device
  3. Gestures
  4. Gesture Information
  5. Gesture Configuration
  6. Gesture Data
  7. Sensors
  8. Sensor Information
  9. Sensor Configuration
  10. Sensor Data
  11. Active and Controllable Noise Cancellation
  12. Why Bose AR Failed
  13. The Potential of Bose AR

What is Bose AR?

"Bose AR" is just a motion sensor Bose slapped on a bunch of their headphones, specifically:

It contains an accelerometer, gyroscope, and magnetometer, and exposes gesture and sensor data over Bluetooth LE.

The official Bose AR SDKs (iOS, Android, and Unity) simply provide access to this motion sensor and nothing more. Since Bose has removed their official SDKs, this tutorial will show you how to access the motion sensor yourself.

Note that while Bose refers to it as an SDK, it's really more of an API, since it's just an interface to the motion sensor.

You may need to update your Bose AR device's firmware for the demos to work


Connecting to Your Bose AR device

Service UUID: "0000fdd2-0000-1000-8000-00805f9b34fb"

Start by connecting to the Bose AR device via Bluetooth LE (Low Energy). The service UUID for all Bose AR devices is "0000fdd2-0000-1000-8000-00805f9b34fb". You can see it in action in the Bose AR Web SDK.

Note that the "Bose AR device" is separate from the headphones itself. The "Bose AR Device" refers to the motion sensor that you connect via Bluetooth LE, whereas the headphones is just a regular pair of audio headphones you connect via Bluetooth Classic. This requires you to connect to your device twice - once to get the motion sensor data and again to get audio playback and microphone access.

While this may be annoying, it also has its advantages. You can have the Bose AR device connected to one system while the audio headset is connected to another system, decoupling motion sensor streaming and audio streaming.

Also note that a single Bose AR device can be connected to multiple systems at once, since it's just broadcasting motion sensor data. This is a bit of a security issue since anyone can enable/disable/access motion sensor data without the wearer's permission or even awareness.

You can also connect to multiple Bose AR devices from the same system for multi-user applications. Thus, you can even have a many-to-many connection (several Bose AR devices connected to several systems).

This tutorial just shows you how to connect to the motion sensor; you can connect to the audio headset like any other Bluetooth headset (through whatever Bluetooth settings your OS has).
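With that in mind, here's a minimal Web Bluetooth sketch of the motion-sensor connection (the function name is mine; `requestDevice` has to be called from a user gesture, like a button click, in a browser with Web Bluetooth enabled):

```javascript
// The BLE service UUID shared by all Bose AR devices.
const BOSE_AR_SERVICE_UUID = "0000fdd2-0000-1000-8000-00805f9b34fb";

// Request a Bose AR device, connect to its GATT server, and grab the
// Bose AR service. Must be triggered by a user gesture (e.g. a click).
async function connectToBoseAR() {
  const device = await navigator.bluetooth.requestDevice({
    filters: [{ services: [BOSE_AR_SERVICE_UUID] }],
  });
  const server = await device.gatt.connect();
  const service = await server.getPrimaryService(BOSE_AR_SERVICE_UUID);
  return { device, service };
}
```

Remember that this only connects the motion sensor; the audio side still pairs separately over Bluetooth Classic through your OS settings.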


Gestures

Bose AR devices allow you to enable/disable gestures, as well as be notified when they are triggered by the user.

There are 3 characteristics related to gestures: Gesture Information, Gesture Configuration, and Gesture Data.


Gesture Information

Characteristic UUID: "a0384f52-f95a-4bcd-b898-7b9ceec92dad"

This is a Read characteristic that tells you how large each entry in Gesture Configuration is, to help with parsing. The payload size is always 1 byte (a Uint8 indicating whether the gesture is enabled or not), but we'll still use it when parsing Gesture Configuration anyway.

Below is each gesture's id number (it's pretty random, especially the gap between 131 and 192):

  129. Double Tap
  130. Head Nod
  131. Head Shake

  192. Unknown (on the Bose Frames it behaves the same as Double Tap, though it may be something else on the QC35's or the 700's)
  193. Unknown (same as 192, but for Head Nod)
  194. Unknown (same as 192, but for Head Shake)

Characteristic Value

byte #   entry #   type    name
0        0         Uint8   gesture id
1        0         Uint8   payload size
2        1         Uint8   gesture id
3        1         Uint8   payload size
...


Gesture Configuration

Characteristic UUID: "21e550af-f780-477b-9334-1f983296f1d7"

This is a Read/Write/Notify characteristic that describes which gestures are enabled. The byte array is composed of an array of [gestureId, isEnabled] entries, each a Uint8 byte.

To enable or disable a gesture, write a byte array in the same format as the characteristic value itself (see below), substituting a 0 or 1 in the isEnabled byte after the given gestureId.

See Gesture Information to see each gesture's id number.

Characteristic Value

byte #   entry #   type    name
0        0         Uint8   gesture id
1        0         Uint8   is enabled
2        1         Uint8   gesture id
3        1         Uint8   is enabled
...
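As a sketch of that layout (the helper names are mine, and `gestureConfigCharacteristic` stands for the characteristic with the UUID above):

```javascript
// Gesture ids from the Gesture Information section.
const GESTURES = { doubleTap: 129, headNod: 130, headShake: 131 };

// Parse the configuration value: an array of [gestureId, isEnabled] Uint8 pairs.
function parseGestureConfiguration(dataView) {
  const config = {};
  for (let offset = 0; offset + 1 < dataView.byteLength; offset += 2) {
    config[dataView.getUint8(offset)] = dataView.getUint8(offset + 1) === 1;
  }
  return config;
}

// Build a single [gestureId, isEnabled] pair to write back.
function gestureConfigBytes(gestureId, enabled) {
  return new Uint8Array([gestureId, enabled ? 1 : 0]);
}

// Sketch of enabling double-tap (gestureConfigCharacteristic is hypothetical):
// await gestureConfigCharacteristic.writeValue(gestureConfigBytes(GESTURES.doubleTap, true));
```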


Gesture Data

Characteristic UUID: "9014dd4e-79ba-4802-a275-894d3b85ac74"

This is a Notify characteristic that notifies you when gestures are triggered by the user. The first byte (Uint8) is the gesture id, and the next 2 bytes (big-endian Uint16) are the timestamp.

Characteristic Value

byte #   entry #   type                  name
0        0         Uint8                 gesture id
1-2      0         Uint16 (big-endian)   timestamp
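A sketch of parsing and subscribing to those notifications (helper names are mine; `gestureDataCharacteristic` stands for the characteristic with the UUID above):

```javascript
// Parse a Gesture Data notification:
// byte 0 is the gesture id (Uint8), bytes 1-2 are a big-endian Uint16 timestamp.
function parseGestureData(dataView) {
  return {
    gestureId: dataView.getUint8(0),
    timestamp: dataView.getUint16(1), // DataView reads big-endian by default
  };
}

// Subscribe to gesture notifications on the Gesture Data characteristic.
async function watchGestures(gestureDataCharacteristic, onGesture) {
  gestureDataCharacteristic.addEventListener("characteristicvaluechanged", (event) => {
    onGesture(parseGestureData(event.target.value));
  });
  await gestureDataCharacteristic.startNotifications();
}
```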


Sensors

Bose AR devices allow you to enable/disable sensors (accelerometer, gyroscope, rotation, game rotation), as well as be notified at given intervals of new values.

There are 3 characteristics related to sensors: Sensor Information, Sensor Configuration, and Sensor Data.

Sensor Information

Characteristic UUID: "855cb3e7-98ff-42a6-80fc-40b32a2221c1"

This is a Read characteristic that describes how large each entry in Sensor Configuration is to help with parsing.

Below is each sensor's id number:

  0. Acceleration
  1. Gyroscope
  2. Rotation
  3. Game Rotation

Characteristic Value

byte #   entry #   type    name
0        0         Uint8   sensor id
1        0         Uint8   payload size
...


Sensor Configuration

Characteristic UUID: "5af38af6-000e-404b-9b46-07f77580890b"

This is a Read/Write/Notify characteristic that describes which sensors are enabled, and at what sampling rate. The byte array is composed of an array of [sensorId, sampleRate] entries, the first a Uint8, the second a Uint16.

See Sensor Information to see each sensor's id number.

Characteristic Value

byte #   entry #   type                  name
0        0         Uint8                 sensor id
1-2      0         Uint16 (big-endian)   sample rate
3        1         Uint8                 sensor id
4-5      1         Uint16 (big-endian)   sample rate
...
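A sketch of building one configuration entry to write (helper names are mine; I'm assuming the sample rate Uint16 is big-endian like the other multi-byte values here, and that a sample rate of 0 disables the sensor):

```javascript
// Sensor ids from the Sensor Information section.
const SENSORS = { accelerometer: 0, gyroscope: 1, rotation: 2, gameRotation: 3 };

// Build a [sensorId (Uint8), sampleRate (Uint16)] Sensor Configuration entry.
// The big-endian Uint16 is an assumption, matching the other characteristics.
function sensorConfigBytes(sensorId, sampleRateMs) {
  const bytes = new Uint8Array(3);
  new DataView(bytes.buffer).setUint16(1, sampleRateMs); // big-endian by default
  bytes[0] = sensorId;
  return bytes;
}

// Sketch of enabling game rotation every 20ms (sensorConfigCharacteristic is hypothetical):
// await sensorConfigCharacteristic.writeValue(sensorConfigBytes(SENSORS.gameRotation, 20));
```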


Sensor Data

Characteristic UUID: "56a72ab8-4988-4cc8-a752-fbd1d54a953d"

This is a Notify characteristic that notifies you with new sensor values at a given sample rate. The first byte (Uint8) is the sensor id, and the next 2 bytes (big-endian Uint16) are a timestamp. The following bytes are parsed differently depending on the sensor id. I've provided a parsing table for each case (the table is the same for the accelerometer and gyroscope).

The accelerometer and gyroscope provide a 3D vector, while Rotation and Game Rotation provide a quaternion. Also, while Rotation provides an 'accuracy' value (a big-endian Uint16), Game Rotation does not.

The orientation of the 3D vector is relative to the direction the wearer is facing.

Sometimes you will receive 2 entries in the same notification byte array, but the byte tables below just show a single entry for simplicity. However, the code/demo will parse the entire notification byte array, so you'll sometimes see 2 sensor values in the output if you run the demo.

Characteristic Values

Accelerometer (sensor id = 0) and Gyroscope (sensor id = 1)

byte #   entry #   type                  name
0        0         Uint8                 sensor id
1-2      0         Uint16 (big-endian)   timestamp
3-4      0         Int16 (big-endian)    x
5-6      0         Int16 (big-endian)    y
7-8      0         Int16 (big-endian)    z

Rotation (sensor id = 2)

byte #   entry #   type                  name
0        0         Uint8                 sensor id
1-2      0         Uint16 (big-endian)   timestamp
3-4      0         Int16 (big-endian)    x
5-6      0         Int16 (big-endian)    y
7-8      0         Int16 (big-endian)    z
9-10     0         Int16 (big-endian)    w
11-12    0         Uint16 (big-endian)   accuracy

Game Rotation (sensor id = 3)

byte #   entry #   type                  name
0        0         Uint8                 sensor id
1-2      0         Uint16 (big-endian)   timestamp
3-4      0         Int16 (big-endian)    x
5-6      0         Int16 (big-endian)    y
7-8      0         Int16 (big-endian)    z
9-10     0         Int16 (big-endian)    w
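Putting the tables above together, here's a sketch of a parser for Sensor Data notifications (helper names are mine; the Int16 component type and x/y/z/w ordering are assumptions, and the raw values would still need scaling to physical units):

```javascript
// Bytes of sensor values after each 3-byte [sensorId, timestamp] header:
// accel/gyro: x,y,z (6); rotation: x,y,z,w + accuracy (10); game rotation: x,y,z,w (8).
const SENSOR_VALUE_BYTES = { 0: 6, 1: 6, 2: 10, 3: 8 };

// Parse a Sensor Data notification, which may contain more than one entry.
function parseSensorData(dataView) {
  const entries = [];
  let offset = 0;
  while (offset < dataView.byteLength) {
    const entry = {
      sensorId: dataView.getUint8(offset),
      timestamp: dataView.getUint16(offset + 1), // big-endian
    };
    offset += 3;
    entry.x = dataView.getInt16(offset);
    entry.y = dataView.getInt16(offset + 2);
    entry.z = dataView.getInt16(offset + 4);
    if (entry.sensorId === 2 || entry.sensorId === 3) {
      entry.w = dataView.getInt16(offset + 6); // quaternion component
    }
    if (entry.sensorId === 2) {
      entry.accuracy = dataView.getUint16(offset + 8); // Rotation only
    }
    offset += SENSOR_VALUE_BYTES[entry.sensorId];
    entries.push(entry);
  }
  return entries;
}
```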


Active and Controllable Noise Cancellation

The Bose QC35 II's and the 700's also have the ability to control Active Noise Cancellation (with 3 levels of noise cancellation) and Controllable Noise Cancellation (with 10 levels of noise cancellation), respectively.

I haven't played with either of these headphones so I don't know how to access that stuff. I only have the Bose Frames and I don't plan on wasting any more money on Bose to get either of the other 2 to find out.


Why Bose AR Failed

There are 3 main reasons why I believe Bose AR failed:

  1. Bad Marketing
  2. Lackluster Developer Experience
  3. Incompetent Developer Relations

However, I think Fabien Bénétou put it best:

"Bose lacked respect for the developer community. There are numerous examples but the most poignant ones are IMHO closing the SDK or not supporting Android.

Bose is an audio hardware company that tried to emulate the success of the Apple AppStore and the Google PlayStore. Unfortunately they completely misunderstood the importance of having a successful ecosystem. They went back and forth between opening and closing their ecosystem. Developers have to strategically pick platforms to target. Platforms that are first of all working yes but also popular. If a developer invests time to learn a new SDK for a platform that could be popular they are taking a bet over actual revenue from a platform that already is, but also have more competition.

Developing an ecosystem goes way beyond having an amazing product, it's about the relationships you establish with the community and this comes from trust. Closing an SDK or not communicating clearly with the strategy does not create trust.

Despite a cool product that I still tinker with I'm sadly not surprised."


The Potential of Bose AR

Bose AR was led by John Gordon, who had a remarkable vision for what it could become. Unfortunately he was let go back in May 2019, and Business Lead Vivek Agrawal left in February 2020. Others soon followed; one reported that the team had become toxic.

It's interesting that Bose is closing its AR SDKs at the same time Apple is introducing Spatial Audio to their AirPods Pro. The Bose AR/Frames leadership team was built by Mehul Trivedi, who had previously worked at Apple developing the now-deprecated OpenAL spatial audio library for iOS/Mac OS X. One of the members of this team was Kapil Krishnamurthy, who had also worked at Apple on spatial audio.

Despite Bose AR being a simple 3DoF head-tracker, it still had a lot of potential. They could've marketed it as a cheap research tool ($200 for the Frames, or free if you already have a pair of Bose AR headphones) for head-tracking, like the Leap Motion was for hand-tracking. Plus Bose attempted to approach 6DoF (position tracking) by partnering with Navisens, which tracks your position using your smartphone's motion sensors.

Bose could've allowed developers to experiment with the platform (while sharing their own research from Bose Labs) in an open collaborative environment to study its design patterns, heuristics, and markets. Over time they could've iterated on the hardware/software, such as programmable haptics or 6DoF (possibly by applying Navisens sensor fusion on the Bose AR device's own motion sensors instead of the smartphone's).

They also could've partnered with audio-related organizations such as the International Community for Auditory Display or the Stanford Center for Computer Research in Music and Acoustics for explorations in Audio-based User Interfaces, including Audification, Sonification, Earcons, and Voice Messaging. Carla Scaletti gives a great talk on such topics.

The official SDKs are closed, so now a bunch of people are walking around with headphones that have an exposed motion sensor they can't even access. That's why I made this: so people can use something they've paid hundreds of dollars for.