• lemmyvore@feddit.nl

        As long as you can reduce something to a pattern, it will work with an LLM. That’s what they’re great at: matching and recognizing patterns.

        You might still do better with random moves. It depends on a couple of things.

        First, an LLM is only as good as its training data, so it matters whether that data contained enough good moves that would work against a random button pusher.

        There’s also the question of whether the random pusher is human or not. Humans are not great at generating random data; we tend to think in patterns, and there’s also muscle memory. So I think the moves of a human random masher could easily fall into defendable patterns.
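
        To make that concrete, here’s a toy sketch. The "habitual" masher, the button names, and the 80% habit rate are all made up; the point is just that input which leans on a few favourite sequences is far easier to predict (and therefore counter) than genuinely uniform random input.

```python
import random
from collections import Counter

BUTTONS = ["A", "B", "X", "Y"]

def uniform_masher(n):
    # Genuinely random presses.
    return [random.choice(BUTTONS) for _ in range(n)]

def habitual_masher(n):
    # Mostly repeats a favourite pattern (muscle memory), occasionally deviates.
    pattern = ["A", "A", "B", "X"]
    out = []
    for i in range(n):
        out.append(pattern[i % len(pattern)] if random.random() < 0.8
                   else random.choice(BUTTONS))
    return out

def predictability(seq):
    # Fraction of presses guessed correctly by always predicting the most
    # common follow-up to the previous two presses (a crude pattern matcher).
    follows = Counter(zip(zip(seq, seq[1:]), seq[2:]))
    best = {}
    for (ctx, nxt), count in follows.items():
        if count > best.get(ctx, (None, 0))[1]:
            best[ctx] = (nxt, count)
    hits = sum(1 for i in range(2, len(seq))
               if best.get((seq[i-2], seq[i-1]), (None, 0))[0] == seq[i])
    return hits / (len(seq) - 2)

if __name__ == "__main__":
    random.seed(0)
    print("uniform masher :", round(predictability(uniform_masher(5000)), 2))
    print("habitual masher:", round(predictability(habitual_masher(5000)), 2))
```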

        If the random masher is a computer, I think it comes down to how well the game is designed: whether it rewards combos, and whether longer patterns that build on each other have a large advantage over a series of completely random individual moves.
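
        A toy version of that question, with the moves, numbers, and combo entirely made up for illustration: if finishing a known combo pays any meaningful bonus, a player that follows even one pattern pulls ahead of a uniform random masher.

```python
import random

MOVES = ["punch", "kick", "throw", "block"]
COMBO = ["punch", "punch", "kick"]   # hypothetical 3-move combo
BASE_DAMAGE = 1.0
COMBO_BONUS = 2.0                    # extra damage for completing the combo

def random_masher(history):
    # Picks a move uniformly at random every turn.
    return random.choice(MOVES)

def combo_player(history):
    # Cycles through the combo in order, ignoring everything else.
    return COMBO[len(history) % len(COMBO)]

def total_damage(policy, turns=300):
    history, total = [], 0.0
    for _ in range(turns):
        history.append(policy(history))
        total += BASE_DAMAGE
        if history[-len(COMBO):] == COMBO:
            total += COMBO_BONUS     # reward for landing the full pattern
    return total

if __name__ == "__main__":
    random.seed(0)
    print("random masher:", total_damage(random_masher))
    print("combo player: ", total_damage(combo_player))
```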

    • KoboldCoterie@pawb.social

      Yeah, it’d be more interesting to see this done with, for instance, an RTS: something where smarter decisions can beat out faster gameplay some percentage of the time. Obviously high APM is important in an RTS, but in this Street Fighter example, I’m pretty sure a 5-year-old who only knows how to Hadouken spam would beat any of these LLMs, from what we’re seeing here. It’s not so much about how good their decision-making is, but about which ones execute the most moves that have a chance to connect.

      • Lydia_K@startrek.website

        LLMs don’t make decisions or understand things at all; they just regurgitate text in a human-like manner.

        I say this as someone who sees a lot of potential in the technology, though; just not for things like this, or for most of the uses people are claiming.