Apple is thinking about using Apple Watch-tracked gestures to control your devices
In a patent application published on Thursday, the California-based technology company outlines a way to interact with the Apple Watch and other linked devices using special sensors built into the wearable.
These sensors would monitor hand and arm movements, work out what the user is doing, and translate those movements into commands.
You might rotate your wrist to adjust the volume, tilt your hand upward to accept a phone call, or lower it to scroll through a list of apps.
"The device can be attached to, resting on, or touching the user's wrist, ankle or other body part," the filing says. "One or more optical sensors, inertial sensors, mechanical contact sensors, and myoelectric sensors can detect movements of the user's body. Based on the detected movements, a user gesture can be determined. The device can interpret the gesture as an input command, and the device can perform an operation based on the input command."
Gesture control for computers has never really hit the mainstream, and it usually relies on camera input to track the user's hands. (Gaming is a different story.) Making use of a wearable the user already owns is an interesting alternative approach, and one that would work even in low light or when the user's hands are not directly in front of a camera.
It's worth noting that Apple files thousands of patents every year, and many never make it into finished products. They can be wildly experimental, or precautionary, or intended to trip up competitors who might be exploring similar areas.
But it does demonstrate that Apple is thinking about the potential of gesture controls as a new input method, and has been for years. The patent application was published on Thursday, but it was first filed back in October 2013, a year before the first Apple Watch was even announced.
It's also similar to another patent application published in March 2016, which 9to5Mac reported on at the time.