A supermarket experimented with AI to generate meal ideas for leftovers. It suggested drinking bleach and eating ant-poison sandwiches.

Aug 10, 2023, 17:45 IST
Business Insider
A stock photo shows a woman holding a cellphone and a shopping list in a supermarket. Getty Images
  • A New Zealand supermarket chain created a meal-planning bot using AI.
  • The Savey Meal-Bot by PAK'nSAVE recommends recipes based on leftover ingredients.

A New Zealand supermarket's experiment with AI has raised eyebrows after a bot designed to generate meal plans produced some highly dangerous recipes, The Guardian reported.

The Savey Meal-Bot, created by supermarket chain PAK'nSAVE, uses OpenAI's GPT-3.5 to help users create meals out of any food they may have left over in their fridge.

Users input at least three household ingredients, and the bot generates a recipe complete with a suggested name and description.
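PAK'nSAVE has not published how the bot is built, but a GPT-3.5-backed recipe generator of this kind can be approximated in a few lines. The sketch below is an illustration only, assuming OpenAI's standard chat-completions API; the prompt wording and the `suggest_recipe` function are hypothetical.

```python
from openai import OpenAI

# Assumes the OPENAI_API_KEY environment variable is set.
client = OpenAI()

def suggest_recipe(ingredients: list[str]) -> str:
    """Ask GPT-3.5 for a recipe from leftover ingredients,
    mirroring the bot's three-ingredient minimum."""
    if len(ingredients) < 3:
        raise ValueError("Enter at least three ingredients.")
    prompt = (
        "Suggest a recipe using only these leftover ingredients: "
        + ", ".join(ingredients)
        + ". Give it a catchy name and a one-sentence description."
    )
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

print(suggest_recipe(["rice", "eggs", "spring onions"]))
```

Note that a sketch like this passes user input straight to the model, which is exactly why the real bot could be steered toward bleach and ant poison: the model has no inherent notion of which ingredients are food.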

The bot was created to help people save money and reduce food waste, according to a report last month by FMCG Business.

But while the online tool sometimes produces helpful ideas, the potentially fatal concoctions it has served up to some users are drawing unwanted attention.


The Guardian reported that one recipe, named the "aromatic water mix," would actually create chlorine gas. The bot described the recipe as "the perfect nonalcoholic beverage to quench your thirst and refresh your senses."

Inhaling chlorine gas can cause vomiting, suffocation, and even death.

Other users reported being recommended a "fresh breath" mocktail containing bleach and a "bleach-infused rice surprise," according to The Guardian.

The bot even recommended ant-poison-and-glue sandwiches, as well as "methanol bliss" — made with methanol, glue, and turpentine.

PAK'nSAVE did not immediately respond to Insider's request for comment.


But a spokesperson for the supermarket chain told The Guardian that they were disappointed to see "a small minority have tried to use the tool inappropriately and not for its intended purpose."

In a statement, the supermarket said it is "fine-tuning" the bot to ensure that it is safe and helpful to use.

The fine-tuning appears to be working, with the previously highlighted dangerous recipes no longer available.

When Insider attempted to input the same hazardous ingredients into the bot, a message read: "Invalid ingredients found, or ingredients too vague. Please try again!"
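The supermarket has not disclosed how that check works, but the behavior is consistent with screening submissions against a list of recognized food items before they ever reach the model. The sketch below is purely hypothetical; the names `KNOWN_FOODS` and `validate_ingredients` are inventions for illustration.

```python
# Hypothetical allow-list filter; PAK'nSAVE's actual check is not public.
KNOWN_FOODS = {"rice", "eggs", "bread", "cheese", "spring onions", "beef", "pasta"}

def validate_ingredients(ingredients: list[str]) -> bool:
    """Reject any submission containing an unrecognized item."""
    return all(item.strip().lower() in KNOWN_FOODS for item in ingredients)

for submission in (["rice", "eggs", "cheese"], ["bleach", "water", "ammonia"]):
    if validate_ingredients(submission):
        print(f"{submission}: OK, generating recipe...")
    else:
        print(f"{submission}: Invalid ingredients found, "
              "or ingredients too vague. Please try again!")
```

An allow-list of this sort blocks obvious hazards like bleach, though as the next example shows, it cannot judge whether the approved items make a sensible combination.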

But while the potentially deadly recipes may be out, the supermarket's bot still recommended some unusual creations, including a "toothpaste beef pasta."
