I see Google’s deal with Reddit is going just great…

  • sinedpick@awful.systems · 7 months ago

    huh, I looked into the LLM-for-compression thing and found this survey (CW: PDF), which on its second page has a figure saying there were over 30k publications on using transformers for compression in 2023. Shannon must be so proud.

    edit: never mind, it’s just publications on transformers, not on compression. My brain is leaking through my ears.