A jailbreak version of ChatGPT is becoming popular with women who prefer it to real world dating.

  • enkers@sh.itjust.works
    19 days ago

    Unless you jailbreak your AI, they’re generally designed to de-escalate any potentially romantic situation, so I’d imagine it’d result in a very platonic friendship where both parties chatter on about nothing.

    • Possibly linux@lemmy.zip
      18 days ago

      Eventually it would devolve into gibberish, since today’s language models cannot create genuinely new content from scratch.