Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you'll near-instantly regret.

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut'n'paste it into its own post - there's no quota for posting and the bar really isn't that high.

The post-Xitter web has spawned so many "esoteric" right-wing freaks, but there's no appropriate sneer-space for them. I'm talking redscare-ish, reality-challenged "culture critics" who write about everything but understand nothing. I'm talking about reply-guys who make the same 6 tweets about the same 3 subjects. They're inescapable at this point, yet I don't see them mocked (as much as they should be).

Like, there was one dude a while back who insisted that women couldn't be surgeons because they didn't believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can't escape them, I would love to sneer at them.

Last weekā€™s thread

(Semi-obligatory thanks to @dgerard for starting this)

    • BlueMonday1984@awful.systems (OP) · 8 hours ago

      It was a pretty good comment, and pointed out one of the possible risks this AI bubble can unleash.

      I've already touched on this topic, but it seems possible (if not likely) that copyright law will be tightened in response to the large-scale theft performed by OpenAI et al. to feed their LLMs, with both of us suspecting fair use will likely take a pounding. As you pointed out, the exploitation of fair use's research exception makes it especially vulnerable to repeal.

      On a different note, I suspect open licenses (Creative Commons, the GPL, etcetera) will suffer a major decline in popularity thanks to the large-scale code theft this AI bubble brought. After two-ish years of the AI industry (if not tech in general) treating anything publicly available as theirs to steal (whether implicitly or explicitly), I'd expect people are gonna be a lot stingier about providing source code or contributing to FOSS.

      • gerikson@awful.systems · 7 hours ago

        Yeah, I'm no longer worried that LLMs will take my job (nor, of course, that AGI will kill us all). Instead, the lasting legacy of GenAI will be an elevated background level of crud and untruth, an erosion of trust in media in general, and less free quality stuff being available. It's a bit like draining the Aral Sea: a vibrant ecosystem permanently destroyed in the short-sighted pursuit of "development".

        • BlueMonday1984@awful.systems (OP) · 6 hours ago

          the lasting legacy of GenAI will be an elevated background level of crud and untruth, an erosion of trust in media in general, and less free quality stuff being available.

          I personally anticipate this will be the lasting legacy of AI as a whole - everything you mentioned was caused in the alleged pursuit of AGI/Superintelligence™, and GenAI has been more-or-less the "face" of AI throughout this whole bubble.

          I've also got an inkling (which I turned into a lengthy post) that the AI bubble will destroy artificial intelligence as a concept - a lasting legacy of "crud and untruth", as you put it, could easily birth a widespread view of AI as inherently incapable of distinguishing truth from lies.