
I see Google’s deal with Reddit is going just great…

  • Soyweiser · 34 points · 2 years ago

    I also wanted to post this. But it is going to be very funny if it turns out that LLMs are, in part, very energy-inefficient but very data-efficient storage systems. Shannon would be pleased with us reaching the theoretical minimum of bits per character using AI (see the sketch after this thread).

    • sinedpick · 19 points · 2 years ago (edited)

      huh, I looked into the LLM-for-compression thing and found this survey (CW: PDF), which on its second page has a figure claiming over 30k publications on using transformers for compression in 2023. Shannon must be so proud.

      edit: never mind, it’s just publications on transformers, not compression. My brain is leaking through my ears.
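
The joke above rests on a real connection: feed a model’s next-symbol probabilities to an ideal entropy coder (e.g. arithmetic coding) and the output size approaches Shannon’s bound, the sum of −log2 p over the sequence. Below is a minimal sketch of that bound only; the bigram character model is a toy stand-in for an LLM, and the function name is made up for illustration.

```python
import math
from collections import Counter, defaultdict

def ideal_code_length(text: str) -> float:
    """Bits an ideal entropy coder needs for `text` under a bigram char model."""
    # Toy stand-in for an LLM: bigram counts gathered from the text itself.
    # (A real scheme shares the model between encoder and decoder up front,
    # which is exactly the "huge shared model, tiny payload" trade-off.)
    follows = defaultdict(Counter)
    for prev, cur in zip(text, text[1:]):
        follows[prev][cur] += 1
    bits = 8.0  # charge one raw byte for the first character
    for prev, cur in zip(text, text[1:]):
        p = follows[prev][cur] / sum(follows[prev].values())
        bits += -math.log2(p)  # Shannon information content of this char
    return bits

if __name__ == "__main__":
    text = "it is going to be very funny if it turns out " * 40
    bits = ideal_code_length(text)
    print(f"{bits / len(text):.3f} bits/char vs 8 bits/char raw")
```

On repetitive text like the example, the per-character cost falls far below 8 bits/char, which is the sense in which a strong predictive model doubles as a very data-efficient (if energy-hungry) store.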