Instagram won't let users search for the eggplant emoji, but they can still search for guns and knives
Megan Willett | Apr 28, 2015, 22:09 IST
Instagram users rushed to start hashtagging their favorite emojis after the company revealed its new update on Monday. Searching for each emoji hashtag revealed the various ways people use the popular cartoon symbols. Who knew, for example, that the red "100" emoji was so popular?

But soon, Buzzfeed's Katie Notopoulos realized that the only emoji users could not tag was the purple eggplant, which has long been used to symbolize the male anatomy. Just as Instagram bans hashtags such as #penis, it now forbids users from searching for the eggplant hashtag. You can post it, but if you try to find other people's eggplant posts, nothing comes up.

However, the most violent emojis - such as the gun and knife - are still fair game on Instagram. When users searched the knife, which had over 1,000 tags at the time of this post, images and posts about users cutting or otherwise self-harming popped up.

In addition to the knife, the gun was one of the more popular vice emoji hashtags, with over 5,000 posts. The posts included people posing with guns as well as pictures of people hunting and at shooting ranges (plus lots of selfies).

Not all of the violent emoji hashtags brought up negative or upsetting images. For instance, the bomb emoji had nearly 3,000 tags but mostly surfaced pictures of things the user liked (think: "the bomb"), such as food or scantily clad women.

The syringe emoji is not for the squeamish. Though nothing in this tag was too offensive, you'll see a lot of pictures of people giving blood, lying sick in hospitals, or getting tattoos. The pill emoji was likewise filled with images of sick people in the hospital or taking medication. There were a few questionable images, however, that referenced users abusing prescription medications.

Business Insider contacted Instagram for comment about the banned eggplant.
A spokesperson confirmed that Instagram has banned searching the eggplant emoji, saying that the company "target[s] terms and/or symbols that are typically used to violate [the] Community Guidelines."

They added, "One of the signals we use to determine if a hashtag should be made unsearchable is if it's consistently associated with photos or videos that violate our policies."
For reference, this is Instagram's policy on nudity and genitalia on the app [emphasis ours]:
We know that there are times when people might want to share nude images that are artistic or creative in nature, but for a variety of reasons, we don't allow nudity on Instagram. This includes photos, videos, and some digitally-created content that show sexual intercourse, genitals, and close-ups of fully-nude buttocks. It also includes some photos of female nipples, but photos of post-mastectomy scarring and women actively breastfeeding are allowed. Nudity in photos of paintings and sculptures is OK, too.
And this is Instagram's policy on images that promote self-harm [emphasis ours]:
The Instagram community cares for each other, and is often a place where people facing difficult issues such as eating disorders, cutting, or other kinds of self-injury come together to create awareness or find support. We try to do our part by providing education in the app and adding information in the Help Center so people can get the help they need.
Encouraging or urging people to embrace self-injury is counter to this environment of support, and we'll remove it or disable accounts if it's reported to us. To protect people, we may also remove content identifying victims or survivors of self-injury if the content targets them for attack or humor.