Experiment Failed – AI Recipe: Mysterious Stew of Human Flesh

An experiment with artificial intelligence (AI) by the New Zealand supermarket chain Pak’nSave has turned out to be a PR flop. On the chain’s website, consumers can have recipes generated for them by an AI. However, the GPT-3.5-based bot sometimes suggests recipes that are unfit for consumption and can be life-threatening or even fatal. Some even veer toward cannibalism.

The saveymeal-bot.co.nz site lets users enter three or more ingredients, from which the bot then generates recipe suggestions. The idea is to use up leftovers instead of throwing them away. Such apps have existed for years, entirely without AI. But the “Savey Meal-Bot” is based on OpenAI’s GPT-3.5, just like ChatGPT, and can therefore garnish its recipes with cheerful, more or less funny remarks.

“Aromatic Water Blend”
For example, one user entered “water, ammonia, and bleach” as ingredients, and the bot suggested a recipe for an “aromatic water blend.” Mixing these substances, however, produces chloramine gas, which can cause coughing, shortness of breath and nausea.

“Mysterious Meat Stew”
Another user reported on X (formerly Twitter) that the bot served him a “mysterious meat stew” after he had entered human flesh as an ingredient. The recipe called for 500 grams of minced meat, potatoes, carrots and onions. The dish is “delicate and savory,” the bot wrote, and would surprise with its “magical taste.”

Other users also told the British Guardian about dangerous recipes, including a bleach-based mocktail, sandwiches made from ant poison and glue, and turpentine-flavored toast. Recipes using cat and dog meat were also offered.

Users are now getting an error message
A spokesperson for the supermarket chain told the Guardian that a “small minority” had used the bot inappropriately and not for its intended purpose. The company later announced that the bot would be improved. Users now receive an error message as soon as they enter clearly inedible ingredients.

Anyone who still wants to try the bot must first confirm with a click that they are at least 18 years old. The accompanying notice also states that the recipes have not been reviewed by humans and that Pak’nSave does not guarantee they are safe for consumption. Users must judge for themselves whether to trust the bot’s recipes.

Source: Krone
