This app can tell what you Instagram the most, but sometimes it's hilariously wrong
Image recognition is hard, and your Instagram filters aren't helping.
Fast Forward Labs released a new app called Pictograph that scans your Instagram photos and identifies what you photograph the most.
It doesn't track hashtags or locations, the labels the photographer chooses. Instead, the app analyzes the images themselves to categorize each shot, sorting photos of shorelines and burgers into their own categories.
The result shows how accurate artificial intelligence can be, and how hilariously wrong it can be.
One Instagram account from a burger restaurant was shown as being really into burgers and fries...and crabs. Those "crabs" were actually just photos of fries.
It didn't just label them as crabs, though. Pictograph tried to identify the darker fries as king crabs and the sauce-covered ones as Dungeness crabs.
When I added my Instagram, I was not surprised to see that I photograph food, dogs, and shorelines a lot. Considering I enjoy brunch, volunteer at a dog shelter, and live in San Francisco surrounded by water, it made sense.
However, I was surprised to see that it had correctly labeled some photos of my dog as a Spitz, a breed most people have never heard of; she's usually assumed to be a chow or told she looks like a fox.
Any photos scanned by the service are fed back into the algorithm so it can learn over time, although there is an option to delete your data entirely after you generate your most-Instagrammed categories.
With Pictograph, Fast Forward Labs founder Hilary Mason particularly wants to dispel the myth that artificial intelligence is dangerous. "For one thing, deep learning is just too exciting," Mason said.
It may be exciting, but it's obviously not totally there yet.
While Pictograph may have pegged my top three categories, it also mislabeled the fourth, mistaking shots of wineries in Napa for photos of fences.
In my entrees category, it identified a photo of hedgehog-shaped buns filled with red bean paste as a "custard apple" and "ice cream." I'll give it a pass on that one since that's a more obscure food, and the algorithm, Mason explained, had likely never seen it before.
The algorithm is trained on a set of images called ImageNet. The photos are all stock photos, which don't have people in them and are shot on a white background, Mason said.
Instagram, by contrast, is designed for selfies and filters, making it a tough problem for computers to nail.
"That data set has no people in it. That means someone got upset because her toddler in a white t-shirt was classified as a bandage," Mason said. "My selfies come up as wigs, and my photos of the New York subway come up as a correctional system."
I can see how a computer could mistake rows of grapes for a fence or a pile of fries for crab legs. Algorithms may be able to guess many things about us, but I'm comfortable knowing they can be hilariously wrong, too.