Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you'll near-instantly regret.

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut'n'paste it into its own post; there's no quota for posting and the bar really isn't that high.

The post-Xitter web has spawned soo many "esoteric" right wing freaks, but there's no appropriate sneer-space for them. I'm talking redscare-ish, reality challenged "culture critics" who write about everything but understand nothing. I'm talking about reply-guys who make the same 6 tweets about the same 3 subjects. They're inescapable at this point, yet I don't see them mocked (as much as they should be).

Like, there was one dude a while back who insisted that women couldn't be surgeons because they didn't believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can't escape them, I would love to sneer at them.

(Taking over for Gerard this time. Special thanks to him for starting this.)

  • swlabr@awful.systems · 12 points · 2 days ago

    It probably deserves its own post on techtakes, but let's do a little here.

    People are tool-builders with an inherent drive to understand and create

    Diogenes's corpse turns

    which leads to the world getting better for all of us.

    Of course Saltman means "all of my buddies" as he doesn't consider 99% of the human population as human.

    Each new generation builds upon the discoveries of the generations before to create even more capable tools: electricity, the transistor, the computer, the internet, and soon AGI.

    Ugh. Amongst many things wrong here, people didn't jerk each other off to scifi/spec fic fantasies about the other inventions.

    In some sense, AGI is just another tool in this ever-taller scaffolding of human progress we are building together. In another sense, it is the beginning of something for which it's hard not to say "this time it's different"; the economic growth in front of us looks astonishing, and we can now imagine a world where we cure all diseases, have much more time to enjoy with our families, and can fully realize our creative potential.

    AGI IS NOT EVEN FUCKING REAL YOU SHIT. YOU CAN'T CURE FUCK WITH DREAMS

    We continue to see rapid progress with AI development.

    I must be blind.

    1. The intelligence of an AI model roughly equals the log of the resources used to train and run it. These resources are chiefly training compute, data, and inference compute. It appears that you can spend arbitrary amounts of money and get continuous and predictable gains; the scaling laws that predict this are accurate over many orders of magnitude.

    ā€œIntelligenceā€ in no way has been quantified here, so this is a meaningless observation. ā€œDataā€ is finite, which negates the idea of ā€œcontinuousā€ gains. ā€œPredictableā€ is a meaningless qualifier. This makes no fucking sense!

    1. The cost to use a given level of AI falls about 10x every 12 months, and lower prices lead to much more use. You can see this in the token cost from GPT-4 in early 2023 to GPT-4o in mid-2024, where the price per token dropped about 150x in that time period. Moore's law changed the world at 2x every 18 months; this is unbelievably stronger.

    ā€œMooreā€™s lawā€ didnā€™t change shit! It was a fucking observation! Anyone who misuses ā€œmooreā€™s lawsā€ outta be mangioneā€™d. Also, if this is true, just show a graph or something? Donā€™t just literally cherrypick one window?

    1. The socioeconomic value of linearly increasing intelligence is super-exponential in nature. A consequence of this is that we see no reason for exponentially increasing investment to stop in the near future.

    "Linearly increasing intelligence" is meaningless as intelligence has not been… wait, I'm repeating myself. Also, "super-exponential" only to the "socio" that Ol' Salty cares about, which I have mentioned earlier.

    If these three observations continue to hold true, the impacts on society will be significant.

    Oh hm but none of them are true. What now???

    Stopping here for now, I can only take so much garbage in at once.

      • swlabr@awful.systems · 11 points · 1 day ago

        You'd think that, at this point, LW-style AGI wish-fulfilment fanfic would have been milked dry for building hype, but apparently Salty doesn't think so!

        • istewart@awful.systems · 5 points · 12 hours ago

          The cult will be the very last constituency to drop away, and likely forms the majority of his hiring pool at this point, especially to replace the high-level attrition they suffered last year.

          • skillissuer · 6 points · 11 hours ago

            You know, I used to wonder how the presidency of a modern, tech-literate, Western-allied country (South Korea) got captured by a cult. I don't wonder about that anymore.