• merc@sh.itjust.works · 3 days ago

    Summarizing requires understanding what’s important, and LLMs don’t “understand” anything.

    They can reduce word counts, and their statistical models can flag which words are likely filler. But the hilarious state of Apple Intelligence shows how often that approach breaks.
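
    For the sake of argument, here's a toy sketch of the purely statistical kind of "summarization" being described: score sentences by raw word frequency, treat a hardcoded stopword list as the "filler" detector, and keep the top scorers. Everything here (the stopword list, the scoring) is illustrative, not how any actual LLM or Apple Intelligence works; the point is that nothing in it understands anything.

    ```python
    import re
    from collections import Counter

    # Crude stand-in for "statistical filler detection": a fixed stopword list.
    STOPWORDS = {"the", "a", "an", "and", "or", "but", "is", "are", "was",
                 "of", "to", "in", "it", "that", "this", "on", "for", "with"}

    def summarize(text: str, n_sentences: int = 2) -> str:
        # Split into sentences and count non-filler word frequencies.
        sentences = re.split(r"(?<=[.!?])\s+", text.strip())
        words = re.findall(r"[a-z']+", text.lower())
        freq = Counter(w for w in words if w not in STOPWORDS)

        # A sentence's "importance" is just the sum of its word frequencies.
        def score(sentence: str) -> int:
            return sum(freq[w] for w in re.findall(r"[a-z']+", sentence.lower()))

        # Keep the top-scoring sentences, restored to original order.
        top = sorted(sorted(sentences, key=score, reverse=True)[:n_sentences],
                     key=sentences.index)
        return " ".join(top)

    print(summarize("LLMs are impressive. LLMs are also unreliable. "
                    "The weather is nice today."))
    ```

    Run it and you get whichever sentences repeat the most common words, which is exactly the failure mode: repetition gets mistaken for importance.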