These smart wireless earbuds can soon be controlled by touching your face
Bragi wants to turn your body into a remote control.
The Munich-based tech company is announcing the latest software update to its smart, fully wireless Dash earphones on Wednesday. It'll be released on November 21.
Bragi OS 2.2, as it's called, has some expected upgrades - there's an update to the voice technology powering the device, and a new "Windshield" feature that's supposed to block the sound of wind in its "audio transparency" mode (which allows you to hear the outside world while the earphones are playing).
The interesting bit, though, is something Bragi calls "MyTap." This lets you double-tap the side of your face, right toward the top of your jawbone, to access a digital assistant: Siri on iOS, or Google Now on Android.
The Dash supports a suite of touch and gesture controls as it is, but those are all bound to tiny, often inexact touchpads on the earphones themselves.
This new feature is the first step toward overhauling that. It's fairly minor on its own - and technically in beta - but Bragi says it's the first part of a new set of controls called the Kinetic User Interface (or KUI). Buzzwords aside, the company says that will consist of multiple movement-based controls, to be added in subsequent monthly software updates.
Jarrod Jordan, Bragi's chief marketing officer, says you might be able to change, fast forward, or rewind songs with separate taps of your face, call a specific person by nodding your head in a certain manner, or get the forecast just by looking up at the sky, among other things.
Trying to make "hearables" a thing
The idea here - as it was when the Dash raised nearly $3.4 million through Kickstarter in 2014 - is to move many of the things you'd normally do on your phone to a computer in your ear.
The Dash was one of the first of these so-called "hearables" to hit the market. Right now it can track fitness stats, noise-cancel, and use touch commands to control music. In the future, Jordan says the company is working on translation (where the Dash translates foreign languages to your native one in your ear) and mapping tools. But the grand goal, Jordan says, is to build a screen-less platform to anchor an Internet of Things. The company partnered with IBM earlier this year to work toward that.
Bragi isn't alone, however. It's soon to be joined by similar buds like the Doppler Labs Here One and, to a lesser extent, Apple's AirPods. Several other earphones have adopted the fully wireless form factor, too, like Samsung's Gear IconX.
Like all of those devices, though, the Dash has gone through growing pains. Though Bragi says it's sold more than 100,000 units thus far, press and user reviews have been largely mediocre. After testing the device myself for the past several weeks (without the new update), I can say that using it means confronting short battery life and frequently choppy connections. At $300, they aren't cheap either.
Part of those struggles come down to making something new. The other part comes down to Bluetooth - it's extraordinarily difficult to maintain a stable stream and save power using the current protocol. The Bluetooth SIG is planning to move all audio applications onto the more accommodating Bluetooth low-energy standard by next year, which should help greatly, but that's a ways off.
For now, there's a reason Apple delayed its AirPods to ensure they're "ready."
Supplanting the touchscreen
Many other complaints about the Dash are addressable through software updates. ("Windshield" is one example.) If Bragi and other hearable makers can work out all the kinks, though, they then have to convince people that actually using an in-ear computer is something that can be done comfortably.
The KUI looks to be Bragi's stab at that. Tapping a tiny sensor in your ear can be hit-or-miss, and calling out to a digital assistant in public is awkward. Pressing the side of your face is also awkward, but at least it has the potential to be discreet while leaving less room for error.
Does that mean we're in for a future where tapping on your face - perhaps with the help of an augmented reality display - is more natural than tapping on a touchscreen? I'd guess not, but who knows.
Either way, Bragi knows it's early days. "It's part of the big idea," Jordan says. "To remove a lot of the needs for looking at a screen, and instead let you do things or get feedback or assistance or enhancements in your life through auditory systems. Letting you manipulate those systems while using your body is just a step in that same direction."