Movidius, based in San Mateo, California, has essentially put a deep learning chip inside a USB drive. Deep learning involves "training" a computational model on large amounts of data so that it can carry out tasks such as recognising images or deciphering natural language.
The "Fathom Neural Compute Stick" has been designed to connect to existing systems (running Linux) and increase the performance of deep learning tasks by 20-30 times.
Movidius chips are also used to help drones to avoid obstacles and thermal cameras to spot people in a fire. The company also signed a deal with Google that will see its chips deployed in some of Google's upcoming products.
Facebook's director of artificial intelligence research, Yann LeCun, said in a statement: "As a participant in the deep learning ecosystem, I have been hoping for a long time that something like Fathom would become available. The Fathom Neural Compute Stick is a compact, low-power convolutional net accelerator for embedded applications that is quite unique.
"As a tinkerer and builder of various robots and flying contraptions, I've been dreaming of getting my hands on something like the Fathom Neural Compute Stick for a long time. With Fathom, every robot, big and small, can now have state-of-the-art vision capabilities."
The Fathom contains the same chip that DJI uses in its drones and FLIR uses in its cameras. That chip, the Myriad 2, can handle many processes simultaneously without consuming much power (no more than 1.2 watts).
The Fathom works with existing deep learning frameworks, including Google's TensorFlow and Caffe, a deep learning framework developed by the Berkeley Vision and Learning Center (BVLC).
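In practice, supporting those frameworks means a developer first builds and trains a network in the usual way and then hands the exported model to device-specific tooling. The sketch below is a minimal, hypothetical example in modern TensorFlow of defining and saving a small convolutional network of the kind such an accelerator targets; the Fathom-specific conversion step is not shown, as Movidius's own tools are not covered here.

```python
# A minimal sketch (not Movidius's tooling): build and save a small
# convolutional network in TensorFlow. The saved file is the kind of
# artefact a hardware-specific converter would take as input; that
# conversion step for the Fathom itself is not shown here.
import tensorflow as tf

# Small image classifier in the style of embedded vision networks.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(224, 224, 3)),         # RGB image input
    tf.keras.layers.Conv2D(16, 3, activation="relu"),   # convolutional features
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(10, activation="softmax"),    # 10 example classes
])

model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# Export the model; training is omitted for brevity. A device-specific
# compiler would consume a file like this one.
model.save("small_convnet.h5")
```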
Google's AI technical lead, Pete Warden, added: "Deep learning has tremendous potential - it's exciting to see this kind of intelligence working directly in the low-power mobile environment of consumer devices. With TensorFlow supported from the outset, Fathom goes a long way towards helping tune and run these complex neural networks inside devices."
The device is going on sale at less than $100 (£70), according to CNET.