Complex internet services fail in interesting ways as they grow in size and complexity. Twitter’s recent issues show how failures emerge slowly over time as the relationships between components degrade. Meta’s quick launch of Threads demonstrates how platform investments compound over time, allowing it to build quickly on existing infrastructure and expertise. Layoffs may sometimes be necessary, but companies must be strategic about preserving what matters most: the ability to navigate complex systems and deliver value. Twitter’s inability to ship new features suggests it has lost that expertise, while Threads may out-execute it thanks to Meta’s platform advantages. The case of Twitter and Threads is a lesson for companies about who they want to be during times of optimization.

  • shiri@foggyminds.com · 1 year ago

    @scrubbles My favorite early moment was him firing people based on lines of code written… which of course meant he fired all of his best because the worst programmers write many lines that do less while great programmers write few lines that do more.
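
    As a toy illustration (code invented here, not from anyone’s actual codebase), both methods below do exactly the same thing; a lines-of-code metric would simply reward the padded one.

    ```java
    // Toy example: "many lines that do less" vs. "few lines that do more".
    import java.util.List;

    class LineCountDemo {
        // Verbose version: ten-ish lines for one simple idea.
        static int sumOfEvensVerbose(List<Integer> numbers) {
            int total = 0;
            for (int i = 0; i < numbers.size(); i++) {
                int value = numbers.get(i);
                boolean isEven = value % 2 == 0;
                if (isEven) {
                    total = total + value;
                }
            }
            return total;
        }

        // Concise version: identical behavior in a fraction of the code.
        static int sumOfEvens(List<Integer> numbers) {
            return numbers.stream()
                          .filter(n -> n % 2 == 0)
                          .mapToInt(Integer::intValue)
                          .sum();
        }
    }
    ```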

    • Scrubbles@poptalk.scrubbles.tech · 1 year ago

      The last few months have been me going back and removing code, lol. Most of my time is spent reviewing my juniors’ code now, even less writing my own! But no, he very smart, line count = good programming.

      • Grimpen@lemmy.ca · 1 year ago

        Wasn’t that an old example of perverse incentives? IBM ranked or paid bonuses based on lines of code. In short order, all their code became bloated and inefficient.

        This was an old example in the ’90s and maybe the ’80s, so it could have been one of the other OG computer companies (Digital, Sun, HP, etc.). Could also be apocryphal. Point is, it’s a classic example of dumb management ideas.

        • Scrubbles@poptalk.scrubbles.tech · 1 year ago

          I remember a corporate rule came down that we needed something like 70% of all code unit tested for stability.

          Damn were our getters and setters rock solid. No errors there. Business logic however…
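
          A minimal sketch of how that plays out (class, names, and the test framework — JUnit 5 here — are my own assumptions for illustration): the accessors end up fully “covered” while the only code that can actually break goes untested.

          ```java
          // Getters and setters: rock solid, 100% covered.
          import org.junit.jupiter.api.Test;
          import static org.junit.jupiter.api.Assertions.assertEquals;

          class Order {
              private int quantity;

              int getQuantity() { return quantity; }
              void setQuantity(int quantity) { this.quantity = quantity; }

              // The business logic nobody wrote a test for.
              double totalPrice(double unitPrice, double discountRate) {
                  return quantity * unitPrice * (1.0 - discountRate);
              }
          }

          class OrderTest {
              @Test
              void getterAndSetterAreRockSolid() {
                  Order order = new Order();
                  order.setQuantity(3);
                  assertEquals(3, order.getQuantity()); // pads the coverage percentage
              }
              // Nothing exercises totalPrice(), where the real bugs would live.
          }
          ```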

          • MasterBuilder@lemmy.one · 1 year ago

            Well, then the developers committed fraud, as getters and setters generally have very little logic. I’m surprised the code coverage reports failed to show the low coverage… You did have code coverage reports, right?

              • MasterBuilder@lemmy.one · 1 year ago

                I find it a bit obnoxious to claim unit testing is a waste of time and then point to worthless testing of logicless code as proof.

                All that illustrates is that worthless tests are worthless. Basically, a tautology. If one wants to convince people that tests are worthless, show how actual test coverage added no value.

                The reason most coverage requirements are about 80% is precisely that testing should not be done on code that has no business logic, like getters and setters.

                So, testing the one thing for which tests are worthless is fraudulent behavior and ironically just makes their own jobs that much more painful.

                • Scrubbles@poptalk.scrubbles.tech · 1 year ago

                  Yes. That was the joke of it all: a useless business rule came down and made developers more focused on hitting a metric than on building useful tests. Thank you for explaining my own story to me.

                  • MasterBuilder@lemmy.one · 1 year ago

                    Aha, well I like to think I would have picked up on the joke if this was an in-person discussion. I’ve heard that talking point as a serious condemnation of automated unit tests.

    • uberrice@feddit.de · 1 year ago

      Meh, the best programmers are probably somewhere in the middle.

      This also depends on what kind of work you’re doing.

      Writing some frontend with lots of boilerplate? That’s lots of lines.

      Writing efficient code that for example runs on embedded systems? That’s different. My entire master’s thesis code project on an embedded system consisted of around 600 lines of C code, and it did exactly what it should, efficiently.

      A better metric to that effect would be the git activity graph. People who do important changes don’t commit 20 times a day; they usually push a commit anywhere from once a day, tops, to once every two weeks.

      • shiri@foggyminds.com · 1 year ago

        @uberrice
        It’s the fact that, from what I saw, he didn’t differentiate by project, and the fact that better programmers do more with less.

        He literally had them print out their code to judge off of. A security engineer tracking down vulnerabilities? Since that’s just a couple of lines of code (if not just characters) after long stretches of testing, they get fired for “productivity”.

    • bluebockser@lemmy.ml · 1 year ago

      “the worst programmers write many lines that do less while great programmers write few lines that do more.”

      That doesn’t sound exactly right. Readability is IMO the most important code quality, followed by things like maintainability. Conciseness is a lot further down the list. If I have to use more lines of code or even leave out a little performance optimization for readability, I generally do.
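
      For example (a made-up rule, just to illustrate the trade-off), both methods below implement the same free-shipping check; the longer one is the one I’d rather maintain.

      ```java
      // Small invented example of spending a few extra lines on readability.
      class ShippingRules {
          static final double FREE_SHIPPING_THRESHOLD = 50.0;

          // Terse and correct, but the intent has to be reverse-engineered.
          static boolean freeShippingTerse(double total, boolean member, int items) {
              return (member && items > 0) || total >= 50.0;
          }

          // A couple more lines, but each condition is named and the magic
          // number has a home.
          static boolean freeShipping(double total, boolean member, int items) {
              boolean memberWithItems = member && items > 0;
              boolean overThreshold = total >= FREE_SHIPPING_THRESHOLD;
              return memberWithItems || overThreshold;
          }
      }
      ```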

      • conciselyverbose@kbin.social · 1 year ago

        Great programmers aren’t playing code golf.

        Their code is naturally smaller because they recognize patterns and understand what should be turned into functions/classes/etc. and what should not. There absolutely is a point where cutting out lines of code is a negative, but well-structured code just takes so much less code than a mess that that’s not what really moves the needle on the metrics.
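
        As a rough sketch of that point (validation code made up for illustration), the duplicated version is longer not because it does more, but because the shared pattern was never pulled out:

        ```java
        class Validation {
            // The "mess": the same check copy-pasted three times,
            // and three places to fix when the rule changes.
            static void validateUserDuplicated(String name, String email, String phone) {
                if (name == null || name.isBlank()) {
                    throw new IllegalArgumentException("name must not be blank");
                }
                if (email == null || email.isBlank()) {
                    throw new IllegalArgumentException("email must not be blank");
                }
                if (phone == null || phone.isBlank()) {
                    throw new IllegalArgumentException("phone must not be blank");
                }
            }

            // The pattern, recognized and extracted once.
            static void requireNonBlank(String value, String field) {
                if (value == null || value.isBlank()) {
                    throw new IllegalArgumentException(field + " must not be blank");
                }
            }

            static void validateUser(String name, String email, String phone) {
                requireNonBlank(name, "name");
                requireNonBlank(email, "email");
                requireNonBlank(phone, "phone");
            }
        }
        ```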