• 3 Posts
  • 34 Comments
Joined 1 year ago
Cake day: June 26th, 2023

  • I cannot say that I have done extensive testing, but the Acer Swift 315-51G and Gigabyte Aero WV8 that I have both worked fine with Linux with zero prior research on my part. No issues with any drivers, even the SD card readers, although I have not checked the fingerprint sensor on the Acer. Maybe I have just been lucky.

    Both have hybrid Nvidia graphics, though, and as I understand it, 10-series and earlier hybrid graphics in particular have issues with high idle power draw unless you manually disable the dGPU when not gaming. I had to do that using envycontrol, and it nearly doubled my battery life on both machines. I would avoid hybrid dGPUs, especially older ones, unless you need one.
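    As a sketch of what that looked like for me (envycontrol's exact flags may vary by version; check `envycontrol --help`):

```shell
# Switch to integrated graphics only (powers off the Nvidia dGPU).
# Takes effect after a reboot.
sudo envycontrol -s integrated

# Switch back to hybrid mode when you need the dGPU for gaming.
sudo envycontrol -s hybrid

# Check which mode is currently active.
envycontrol --query
```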

    Used laptop-wise, I agree with others that a used business laptop like a Dell would probably be your best bet.



  • Veraxis@lemmy.world to Linux@lemmy.ml · "Linux on old School Machines?" · 5 points · edited 20 days ago

    That covers a pretty wide range of hardware, but that era would be around 2009-2015, give or take, so you would be looking at around Intel 1st gen to 6th gen most likely (Let’s be honest, there is nearly zero chance institutions would be using anything but Intel in that era). Pentium-branded CPUs from that time range, unfortunately, likely means low-end dual core CPUs with no hyperthreading, so 2C/2T, but I have run Linux on Core2-era machines without issue, so hopefully the CPU specs will be okay.

    2-8GB of DDR3 RAM is most likely for that era, and as others point out, it will be your biggest bottleneck for running browsers. If the RAM is anything like the CPUs, I am assuming you will be looking at the lower end, 2-4GB, depending on how old the oldest machines are, so I second the recommendation of consolidating the RAM into fewer machines. Alternatively, if you can get any kind of budget at all, DDR3 sticks on eBay are dirt cheap; a quick look shows bulk listings of 20x4GB sticks for $26.

    In terms of distro/DE, I second anything with XFCE, but if you could bump them up to around 8GB RAM, then I think any DE would be feasible.

    Hard drives shouldn’t be an issue, I think, since desktop hard drives in the 320GB-1TB range would have been standard by then. You are also most likely outside of the “capacitor plague” era, so I would not expect motherboard issues, but you might open the machines up and dust them out so the fans aren’t struggling. Re-pasting the CPUs would also be a good idea, so maybe add a couple of $5 tubes of thermal paste to a possible budget. Polysynthetic thermal compounds, which do not dry out easily, would be preferable, and something like Arctic Silver 5 would also be an era-appropriate choice, lol.


  • Veraxis@lemmy.world to Linux@lemmy.ml · "What is/was your distrohopping journey?" · 4 points · edited 22 days ago

    I am not sure that I can really call what I did distrohopping, but

    Mint w/ Cinnamon (several years ago on an old junker laptop and never ended up using it as a daily driver) -> Manjaro w/ KDE Plasma (daily driver for ~1 year) -> Arch w/ KDE Plasma (~2 years and counting).

    I have also used Debian with no DE on a file server I made out of an old thin client PC, and I have used Raspbian on a Raspberry Pi.


    For most things it has not been an issue. Mice and keyboards have all been plug and play for me, and Bluetooth headphones work just fine. Setting up a printer was probably easier than in Windows. My USB DAC, external hard drives, USB SD card readers, etc. have all been plug and play as well.

    A persistent issue in Linux, however, is gaming peripherals. Anything which requires proprietary vendor software to configure RGB settings may be a problem. OpenRGB detects and lets me configure the RGB on my Logitech G Pro Wireless mouse, and I picked up a secondhand Drop CTRL mechanical keyboard which I was also able to reprogram in Linux, but broadly speaking, whether a peripheral that needs dedicated software can be reconfigured is a case-by-case matter. The last time I had to boot into Windows was to re-bind the key map on an off-brand USB footswitch; that was a one-time fix, and it has worked fine since. Similarly, the RGB on my Gigabyte laptop’s keyboard can only be configured from Windows.

    On the laptop side, the main things to watch out for will be compatibility issues with fingerprint readers and certain oddball WiFi chipsets, but generally speaking my peripheral experience has been good.



    Welcome! Coming from Windows myself, I made the jump to Manjaro (it has certain issues and I do not recommend it), then to Arch less than a year after. I have been on Arch full time for around 2 years now. After the initial setup, I have found Arch to be pretty low-maintenance and no harder to maintain than any other distro, hardly requiring more than the occasional yay -Syu --noconfirm in the command line to update things. As someone with less computer knowledge than an IT professional, I think Arch’s reputation for being difficult is overblown, mostly due to intimidation from the more involved setup process before the install script became available.

    I don’t know if you have any familiarity with Linux already from your work, but regardless of which distro you go with, I would go in with the mindset that you are learning a new skill. Some things are simply done differently in Linux than in Windows and will take getting used to, such as drives being mounted at mount points rather than assigned drive letters.
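    As a minimal sketch of the mounting model (the device name /dev/sdb1 and the /mnt/usb directory here are just placeholders for illustration):

```shell
# List block devices and where they are mounted (no drive letters in Linux).
lsblk

# Mount the first partition of a hypothetical second disk at a directory.
sudo mkdir -p /mnt/usb
sudo mount /dev/sdb1 /mnt/usb

# The drive's contents now appear under /mnt/usb like any other directory.
ls /mnt/usb

# Unmount when done.
sudo umount /mnt/usb
```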

    Realistically, setting things up for the first time often requires additional steps and may not “just work,” but when using my laptop and gaming desktop from day to day, it works just like any other OS. Gaming has been great for me generally, and the work Valve has done to improve game compatibility on Linux has been spectacular. Most Steam games do, in fact, “just work” for me.

    In the 2-3 years I have been using Linux, I have rarely had things spontaneously break the way many folks seem to worry about. When something does break, it is usually because a company is not supporting its Linux community, like Discord not pushing out updates on time, or because of a major transition, like the move to the Wayland graphical stack in KDE Plasma 6, which undid some of my desktop customization settings.



  • The Arch installation tutorial I followed originally advised using LVM to have separate root and user logical volumes. However, after some time my root volume started getting full, so I figured I would take 10GB off of my home volume and add it to the root one. Simple, right?

    It turns out that lvreduce --size 10G volgroup0/lv_home doesn’t reduce the size by 10GB, it sets the absolute size to 10GB, and since I had way more than 10GB in that volume, it corrupted my entire system.
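    For anyone reading later: with the LVM tools, prefixing the size with a sign makes it relative instead of absolute, and --resizefs shrinks the filesystem before the volume. A sketch of what I should have run (volume names here match my setup; lv_root is assumed, back up first regardless):

```shell
# Absolute: sets lv_home to EXACTLY 10G -- this is what bit me.
# sudo lvreduce --size 10G volgroup0/lv_home

# Relative: shrinks lv_home BY 10G, resizing the filesystem first.
sudo lvreduce --resizefs --size -10G volgroup0/lv_home

# Then grow the root volume by the freed 10G, resizing its filesystem too.
sudo lvextend --resizefs --size +10G volgroup0/lv_root
```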

    There was a warning message, but it seems my past years of Windows use still have me trained to reflexively ignore dire warnings, and so I did it anyway.

    Since then I have learned enough to know that I really don’t do anything with LVM, nor do I see much benefit to separate root/home partitions for desktop Linux use, so I reinstalled my system without LVM the next time around. This is, to date, the first and only time I have irreparably broken my Linux install.




  • Veraxis@lemmy.world to Linux@lemmy.ml · "With ou without desktop env?" · 7 points · edited 2 months ago

    What is your use case? For something like a file server which I am mainly SSH-ing into anyway, I may not install a DE at all, but if this is going to be a general-use desktop, I see no reason not to install the DE right from the beginning. Selecting a DE is part of the install process of most Linux distros, and some distros offer separate install images for each DE they support.

    If you are very concerned about keeping your system lean and want full control of what gets installed, you might want to look up guides for installing Arch Linux. The initial setup process is more involved than other distros, but once you have it installed, I think its reputation for being difficult is overblown.
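    On Arch, adding a DE after the base install is just a couple of commands. A sketch using Xfce as an example (package and service names per the Arch repos; swap in whichever DE and display manager you prefer):

```shell
# Install Xfce plus a display manager for a graphical login screen.
sudo pacman -S xfce4 xfce4-goodies lightdm lightdm-gtk-greeter

# Enable the display manager so the desktop starts at boot.
sudo systemctl enable lightdm.service
```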


  • Thanks for the reply!

    I tried this, and it does not seem to have helped much, if at all. Visual elements like scroll bars and text boxes in Steam continue to flicker, and apps like Discord flicker or go completely black randomly. My main use case for this desktop is gaming, so sadly without more of a fix I am not sure that I can move away from X11 yet.


  • I do not think your issues are in any way unique. I am also using Plasma 6 on Arch with a 3080 Ti. Initially, it simply booted to a black screen with nothing but a cursor, and no ability to do anything.

    I added the changes to /etc/mkinitcpio.conf per @markus@hubzilla.markusgarlichs.de 's reply, as well as those kernel params to my /boot/refind_linux.conf, and now it boots into a desktop, but I am seeing large amounts of flickering, with windows going black and visual elements flickering in and out. Looking at my installed packages, I already had both xorg-xwayland and wayland-protocols (per @Varen@kbin.social 's reply) installed.

    It seems like there are still more steps before what I would consider a usable result. Personally, it was not “obvious” to me when Plasma 6 rolled out that I needed to do any of this.
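    For anyone else hitting the black screen: the changes in question are the widely recommended early-KMS setup for Nvidia, roughly as follows (a sketch; the kernel parameter line goes in your bootloader config, which in my case is /boot/refind_linux.conf):

```shell
# /etc/mkinitcpio.conf -- load the Nvidia modules in the initramfs (early KMS):
#   MODULES=(nvidia nvidia_modeset nvidia_uvm nvidia_drm)

# Kernel parameter to add to the bootloader config:
#   nvidia_drm.modeset=1

# Regenerate the initramfs after editing mkinitcpio.conf.
sudo mkinitcpio -P
```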



    I am not sure what graphics you have, but I have an older-ish laptop with hybrid 10-series Nvidia graphics which does not fully power down even with TLP installed; it was drawing a continuous 7W even at idle. I installed envycontrol so that I can manually toggle hybrid graphics on and off or force the use of integrated graphics. My battery life jumped from 2-3 hours to 4-5 hours after I did this, and unless I am gaming (which I rarely do on this laptop) I hardly ever need the dGPU.

    I also use TLP. I have tried auto-cpufreq and powertop, and I found TLP offered the most granular control and worked the best for my system/needs.




  • Blah blah blah blah blah…

    tl;dr the author never actually gets to the point stated in the title about what the “problem” is with the direction of Linux and/or how knowing the history of UNIX would allegedly solve this. The author mainly goes off on a tangent listing out every UNIX and POSIX system in their history of UNIX.

    If I understand correctly, the author sort of backs into the argument that, because certain Chinese distros like Huawei EulerOS and Inspur K/UX were UNIX-certified by Open Group, Linux therefore is a UNIX and not merely UNIX-like. The author seems to be indirectly implying that all of Linux therefore needs to be made fully UNIX-compatible at a native level and not just via translation layers.

    Towards the end, the author points out that Wayland doesn’t comply with UNIX principles because the graphics stack does not follow the “everything is a file” principle, despite previously admitting that basically no graphics stack, like X11 or MacOS’s graphics stack, has ever done this.

    Help me out if I am missing something, but all of this fails to articulate why any of this is a “problem” which will lead to some kind of dead-end for Linux or why making all parts of Linux UNIX-compatible would be helpful or preferable. The author seems to assume out of hand that making systems UNIX-compatible is an end unto itself.


  • My particular testing was with an SSK SD300, which is roughly 500MB/s up and down. I have benchmarked this and confirmed it meets its rating.

    I have thought about buying something like a Team Group C212 or Team Group Spark LED, which are rated at 1000MB/s. The 256GB version of the C212 is available on places like Newegg and Amazon for around $27 USD at time of writing, but they make variants as high as 1TB.
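    For reference, the benchmark I ran was nothing fancy; a rough sequential test with dd is enough to sanity-check a rating like 500MB/s (the path below is a placeholder for wherever the drive is mounted, and oflag=direct bypasses the page cache so the numbers reflect the drive, not RAM):

```shell
# Placeholder: point this at a file on the drive under test.
TARGET=/mnt/usb/testfile

# Rough sequential write test: 1 GiB of zeros, bypassing the page cache.
dd if=/dev/zero of="$TARGET" bs=1M count=1024 oflag=direct status=progress

# Rough sequential read test, again bypassing the cache.
dd if="$TARGET" of=/dev/null bs=1M iflag=direct status=progress

# Clean up the test file.
rm "$TARGET"
```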