cross-posted from: https://lemmy.ml/post/3109500

Pak ‘n’ Save’s Savey Meal-bot cheerfully created unappealing recipes when customers experimented with non-grocery household items

  • giacomo@lemmy.world · 1 year ago

    Sure, it can make a recipe for chlorine gas, but can it recommend a wine pairing to go with the gas?

    • TheSpookiestUser@lemmy.worldOP · 1 year ago

      Agree. “Chatbot outputs ridiculous response when given ridiculous inputs” gets old.

      This was at least funny.

      • Buddahriffic@lemmy.world · 1 year ago

        Though I would say that its spitting out recipes for things that aren’t even ingredients indicates that it’s not a useful tool. It’s not basing recipe recommendations on any knowledge of food, cooking, flavours, textures, or chemistry. Seems like it’s just arbitrarily fitting a list of ingredients into some other patterns.

        If it doesn’t understand “this isn’t a safe ingredient”, I doubt it understands anything about which non-poisonous ingredients would go well together, other than ones it has seen paired in its training set.

  • TheDoctorDonna@lemmy.world · 1 year ago

    The headline makes it sound as if it was just randomly suggesting this, but of course it would do that with people inputting non-food ingredients.

  • wahming@monyet.cc · 1 year ago

    > and noted that the bot has terms and conditions stating that users should be over 18.

    We should definitely prosecute kids who poison themselves or others via use of this app.