Tech really can’t see begging VCs for more money as anything short of a revolution. Rolls eyes.

I can’t believe he invented Markdown, one of the most genuinely amazing things to come out of languages and the like, especially after what I just read.

JOHN GRUBER wrote a… thing? About OpenAI’s begging for more money they absolutely don’t need. I stopped reading at this paragraph. Maybe he proves me wrong in the later parts of the post, but this paragraph sure as shit did nothing to convince me he’s worth keeping up with for serious tech journalism.

> Thus, effectively, OpenAI is to this decade’s generative-AI revolution what Netscape was to the 1990s’ internet revolution. The revolution is real, but it’s ultimately going to be a commodity technology layer, not the foundation of a defensible proprietary moat. In 1995 investors mistakenly thought investing in Netscape was a way to bet on the future of the open internet and the World Wide Web in particular. Investing in OpenAI today is a bit like that — generative AI technology has a bright future and is transforming the world, but it’s wishful thinking that the breakthrough client implementation is going to form the basis of a lasting industry titan.

I mean, if you wanna read it, have at it, but I simply have to know: where does his journalism shine?

https://daringfireball.net/2024/12/openai/_unimaginable

#AIHype #AI #Technology @techtakes

  • YourNetworkIsHaunted@awful.systems · ↑5 · 3 days ago

    I think his criticism of the economics and business sense is pretty reasonable, even though he is definitely being pretty credulous about the capabilities of the underlying tech. One of the fun side effects of the diminishing returns on raw scaling is that the competitors are rapidly catching up with the capabilities of ChatGPT, which is going to be bad news for Saltman and the gang. What goes unaddressed is the bigger underlying problem: these systems don’t actually do what they’re being advertised for, and they burn an unsustainable and unconscionable amount of money (and actual resources, in case anyone forgot) to do it. That’s going to be the difference between OpenAI merely falling apart and being overtaken by another company with better monetization, and the entire tech sector facing a recession; I’m pretty sure the latter is more likely.

  • jaschop@awful.systems · ↑10 · 4 days ago

    Fits a pattern I’ve seen before. Kinda critical of OpenAI and not buying their PR wholesale, but also accepting the framing that AI is some kind of critical foundational tech instead of another shitty magic trick.

  • gerikson@awful.systems · ↑7 · 4 days ago

    I stopped reading Gruber years ago to preserve my blood pressure, but this particular piece is not that bad. In particular the Netscape analogy rang true.

  • luciole (he/him)@beehaw.org · ↑7 · 4 days ago

    People were formatting text-only emails and README files following loose conventions, and Gruber came along, documented common practice, and called it Markdown. I don’t know if I’d call that “inventing” anything.

      • luciole (he/him)@beehaw.org · ↑7 · 4 days ago

        Before Markdown there were setext, Textile, reStructuredText and atx, which were basically the same thing but less trendy. Gruber had the huge advantage of being popular with the MacBook Starbucks coder crowd.

        • froztbyte@awful.systems · ↑8 · 4 days ago

          so, for instance, in the case of markdown specifically: it’s actually a shitshow of a format to implement. the reference implementation driving the “spec”, as done by gruber, was (iirc) essentially a big stack of regexes with language-specific behaviours baked in, and lots of by-design implementation presumptions rather than by-specification dictation

          want to make an implementation in another language? better hope you know every single corner case and behaviour!

          (this is one of the reasons why markdown impls varied for so many years, and why even now “commonmark” is still trying to fix the issue)
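          to make that concrete, here’s a rough sketch of what a “stack of regexes” converter looks like. this is python rather than the original perl, and it is not gruber’s actual code, just an illustration of the approach: every rule is a blind substitution over the whole document, so what happens in the weird cases depends entirely on which regex happens to fire first.

```python
import re

# Illustrative only: a tiny regex-substitution "markdown-ish" converter,
# NOT Gruber's Markdown.pl. Each rule rewrites the whole document in turn,
# so corner-case behaviour is an accident of rule ordering rather than
# something a specification pins down.
RULES = [
    (re.compile(r"^### (.+)$", re.MULTILINE), r"<h3>\1</h3>"),
    (re.compile(r"^## (.+)$", re.MULTILINE), r"<h2>\1</h2>"),
    (re.compile(r"^# (.+)$", re.MULTILINE), r"<h1>\1</h1>"),
    (re.compile(r"\*\*(.+?)\*\*"), r"<strong>\1</strong>"),
    (re.compile(r"\*(.+?)\*"), r"<em>\1</em>"),
    (re.compile(r"\[(.+?)\]\((.+?)\)"), r'<a href="\2">\1</a>'),
]

def render(text: str) -> str:
    for pattern, replacement in RULES:
        text = pattern.sub(replacement, text)
    return text

# Ambiguous input: is *** strong, emphasis, or both? What about brackets
# inside link text? A port in another language only agrees with this one
# if it reproduces the same rules in the same order, corner cases and all.
print(render("***bold or italic?*** see [a [bracketed] link](http://example.com)"))
```

          commonmark exists pretty much entirely to nail those ambiguities down into an actual spec.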

  • iii@mander.xyz · ↑3 ↓9 · edited · 4 days ago

    I mostly agree with the author. OpenAI was the first to demonstrate that LLMs, trained on huge datasets, can create a convincing parrot. But alternatives are catching up, to the point that I’m no longer paying for OpenAI and am advising clients to use alternatives, too.

    • froztbyte@awful.systems · ↑7 · 4 days ago

      a swing and a miss! maybe your new year’s resolutions should include knowing something about where you’re posting

      • Soyweiser@awful.systems · ↑11 · 4 days ago

        But chatgpt said this was a place where smart people talked about AI/LLMs/AGI.

        (Which reminds me that on reddit, being active on r/buttcoin also got you targeted by DMs trying to get you to buy into cryptocurrencies, and the sub itself often had pro-crypto posts from bots that didn’t know where they were posting.)