Linux is a branch of development of the old Unix class of systems. Unix is not necessarily open and free; FOSS is what is classified as open and free software. Unix was, since its inception, deeply linked to specific industrial private interests; let's not forget all this while we examine the use of Linux by left-minded activists. FOSS is nice and cool, but nearly 99.99% of it runs on non-open and non-free hardware. Apolitical proposals of crowd-funding and DIY construction attempts have led to ultra-expensive idealist solutions reserved for the very few and the eccentric affluent experimenters.

Linux vs Windows is cool and trendy, isn't it? But does it, by itself, carry any political content? If it does, what is it? So let's examine it from the base.

FOSS: people, as small teams or individuals, "producing as much as they can and want", offering what they produce to be shared, used, and modified by anyone, "as much as they need". This is as close to a communist system of production and consumption as we have experienced in the entirety of modern history. No exchange whatsoever; collective production according to ability and collective consumption according to need.

BUT we also have corporations, some of them mega-corps, multinationals that nearly monopolize sectors of the computing market, creating R&D departments specifically to produce and offer open and free code (or conditionally free code). Why? Firstly, because other idiots will join their projects and contribute further development (labor) for "free", while the corporations still retain the leadership and ownership of the project. Somehow it is considered cool to use their code, without asking why they were willing to offer it in the first place, as long as we can say we are anti/against/MS-Windows-free.

Like false class consciousness, we have fan-boys of IBM, Google, Facebook, Oracle, Qt, HP, Intel, AMD, … products set against MS.

Back when Unix would only run on ultra-expensive enterprise large-scale systems and expensive workstations (remember DEC, Sun, SGI, … workstations, each priced similarly to two brand-new fast sportscars) and the PC market was restricted to MS or the alternative Apple crap, people tried and tried to port forms of Unix onto a PC. Some really gifted hacking experts were able to achieve such marvels, but the results were so hardware-specific that they couldn't be generalized and utilized massively.

Suddenly this genius Finn and his friends devised a kernel that could make most available PC hardware work, and a Unix system with a Linux kernel could boot and run.

IBM eventually saw a way back into the PC market it had lost by handing DOS out to a subcontractor (MS), and saw an opportunity to take over and steer this "project" by promoting RedHat. After two decades of behind-the-scenes guidance, once the projected outcome had succeeded in cornering the market, IBM openly bought RH.

Are we all still anti-MS and pro-IBM, Google, Oracle, FB, Intel/AMD?

The bait thrown to dumb fish was an automated desktop that looked and behaved just like the latest MS-win edition.

What is the resistance?

Linus Torvalds and the few others who sign off on the kernel today make six-figure salaries, ALL paid by a handful of computing giants that, by offering millions to the foundation, control what it does. Traps like Rust, telemetry, … and other "options" are shoved into the kernel daily to satisfy the paying clients' demands and wishes.

And we, on the left, are fans of a multimillionaire's "team" against a trillionaires' "team". This is not football, cricket, or F1. This is your data in the hands of multinationals and their fellow customers/agencies. Don't forget which welfare system maintains the hierarchy of those industries, whether the market is rosy or gray. Do I need to spell out the connection?

Beware of multinationals bearing gifts.

Yes, there are healthier alternatives, requiring a little more work and study to employ; the quick and easy has a "cost" even when it is FOSS.


  • iriyan@lemmygrad.ml (OP) · 1 year ago

    Linux and Unix were built on alternatives. If you don't like a piece of code offered as a tool to do something, you write something better and offer it / share it with others. So you as a user have a choice among similar tools. Even the most basic ones, like the GNU utilities, have busybox and other specific alternatives.

    The latest trend is to have NO ALTERNATIVES, to get everyone to use one core system. So instead of diverging as a system (as some of the BSD Unix projects did), Linux is showing a tendency to converge into one system (Fedora, Debian, Arch) with little difference among them.

    You get corporate media publishing articles on the "top-ten" Linux distributions, or the "top-ten" desktops, all based on the very same edition of IBM software, no exception, as there is none. This is marketing, steering the public in a single direction. The question you should answer for yourself is why! Without someone spelling it out for you and drawing the attention of three-letter agencies.

    • debased@lemmygrad.ml · 1 year ago (edited)

      Okay, I can totally see why you wouldn't like Linux as a whole becoming "one thing", but what is your opinion on the growth of Linux on the desktop? By far the biggest factor, in my opinion, pushing people away (consumers as well as devs) is having to deal with so many different distros, packaging apps with different libraries on so many different systems. Having standards that aim to reduce that load can only be beneficial for the masses adopting an objectively better, even if not perfect, operating system, wouldn't it? I.e., the rise of AppImages and Flatpaks as a means to curb that issue is, to me, a good thing, even if not "the most optimal way of doing things".

      • Prologue7642@lemmygrad.ml · 1 year ago

        I always wonder if that is an actual issue. Apart from some duplicated effort with things like packaging for different distros (which is something that distro maintainers do anyway), I don't really get this point. For me, this only makes sense for proprietary packages, not for open source.

        Apart from some small differences in how you install packages, using most distros is basically the same.

        I am always confused by this point because I see it repeated everywhere, but never with a good argument supporting it.

        • FuckBigTech347@lemmygrad.ml · 1 year ago (edited)

          I only ever see people who work on proprietary software make this argument. For FOSS this is a non-issue. If you have the source code available, you can just compile it against the libs on your system, and in most cases it will just work, unless there was a major change in some lib's API. And even then you can make some adjustments yourself to get it working. Distro maintainers tend to do this.

        • debased@lemmygrad.ml · 1 year ago

          For many admittedly smaller apps, it's always a bit of a pain to have to install them manually because the dev simply gave up trying to package for "the big 3", and distro maintainers can't care about every small program, although the current system works well enough for most programs.

          However, I am not a developer, so I can't speak firsthand about the difficulty of packaging and maintaining an app across different distros over the years, and I'm not sure the brunt of maintaining all these apps should fall onto distro maintainers.

          About users and using distros, I can agree that it's roughly the same either way, with the only real difference most of the time being "do you use apt or pacman to install packages".

          • Prologue7642@lemmygrad.ml · 1 year ago (edited)

            Fair enough, but I only see that for some niche projects. And at that point you are probably not a regular user and can do it yourself.

            There is an issue on the other side: if you only provide an AppImage/Flatpak, it is much less customizable. You can't optimize the software for your CPU, and you can't mix and match which versions of the libraries it uses. Personally, I think it is always a good idea to provide a Flatpak alternative for those who want it, but I don't see it as a replacement for regular packaging.

            Edit: I would much rather see something like Nix being used to describe the dependencies. That is, in my opinion, the best solution, and it also lets you port software to other systems more easily.

            • debased@lemmygrad.ml · 1 year ago

              Ideally, it'd be good enough to simply have, say, an AppImage/Flatpak plus the source code, and then let distro maintainers/end users build it how they want or need to. I have had the pleasure of trying to get NVENC working in OBS under Debian 10, and that was a massive pain: due to outdated nvidia drivers, I had to recompile ffmpeg with the right flags, and that would break after every update. The easiest way was to get an OBS flatpak that came prebuilt with it all, IIRC. I guess my problems were mainly because I used Debian stable at the time; it's probably not as much of a pain now that I'm on sid.

              I don't know anything about Nix. I've heard a lot of good things about it and how it's "all config files" or something, but the prospect of learning a whole new world scares me. Still, I trust your judgment on that. I'll stick to what I know on my boring-ass Debian sid :D

              • Prologue7642@lemmygrad.ml · 1 year ago

                I would imagine that if you weren't on Debian stable, it would be much better. From what I've seen, dealing with anything Nvidia on stable distros is a pain.

                I just recently started working with it and it is really nice. You have NixOS, where you can define basically everything with Nix config files. You want to run MPD on some port? Just add the relevant option, and it will create the config file and put it in the right place. It is really easy to define your entire system, with all its options, in one place. I don't think I've ever had to change anything in /etc; I just change an option in my system config. I think something like this is probably the future of Linux.
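                As an illustration, a minimal sketch of what that MPD example looks like (assuming the stock NixOS mpd module; the port number is just an example):

                ```nix
                # Fragment of /etc/nixos/configuration.nix -- hypothetical example
                { config, pkgs, ... }:
                {
                  # Enabling the service generates mpd.conf and a systemd unit for you
                  services.mpd.enable = true;
                  # Listen port, set declaratively instead of editing /etc by hand
                  services.mpd.network.port = 6600;
                }
                ```

                After `nixos-rebuild switch`, the service runs with that config; nothing under /etc needs to be touched by hand.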

                Nix by itself is just a language that is used to configure things. You can use it to define all the dependencies of your project, so it is easy for anyone with Nix (which you can install basically anywhere) to build it. By doing it like this, you can be sure all the dependencies are declared, so the software is easy to port to other distros even if they aren't using Nix.
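                For instance, a minimal per-project environment might be sketched like this (the SDL2 dependency is just an illustrative assumption):

                ```nix
                # shell.nix -- hypothetical example; running `nix-shell` in this
                # directory drops you into a shell with exactly these dependencies
                { pkgs ? import <nixpkgs> {} }:

                pkgs.mkShell {
                  buildInputs = [
                    pkgs.gcc         # toolchain
                    pkgs.pkg-config  # for locating the libs below
                    pkgs.SDL2        # example library dependency
                  ];
                }
                ```

                Anyone who checks out the project and has Nix installed gets the same build environment, regardless of their distro.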

    • Prologue7642@lemmygrad.ml · 1 year ago

      That just depends on what you use. There are loads of distros that let you use whatever you want. There are only so many ways to do things, and it doesn't make much sense to differentiate without a reason. And you do have some genuinely diverging distros, like NixOS, that are significantly different.

      Not really sure what corporate media you read. In my experience, most of those lists are just popularity contests. And they usually include non-corporate distros like Arch, Debian, etc. As for desktops, I am not even sure there even are ten desktop environments (at least with a reasonable number of users).