I’m the administrator of kbin.life, a general-purpose/tech-oriented kbin instance.

  • 2 Posts
  • 907 Comments
Joined 1 year ago
Cake day: June 29th, 2023


  • OK, here’s one possibility I can think of: at some point, files may have been created in a location that now has a mount point over it, so those folders still exist on the root partition but are hidden by the mount.

    You can remount just the root partition elsewhere by doing something like

    mkdir /mnt/rootonly
    mount -o bind / /mnt/rootonly
    
    

    Then use du or similar on /mnt/rootonly to see if the numbers more closely match what df reports. I’m not sure whether the graphical filesystem viewer you used can see files hidden this way, so it’s probably worth checking just to rule it out.
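
    For example (a rough sketch, assuming the bind mount above at /mnt/rootonly):

    df -h /                  # space the root filesystem says is in use
    du -xsh /mnt/rootonly    # total size of files on the root fs, including any hidden under mounts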

    Anyway, if you see bigger numbers in /mnt/rootonly, then check the mount points (like /mnt/rootonly/home and /mnt/rootonly/boot/efi). They should be empty; if they’re not, what’s in them is likely the files/folders being hidden by the mounts.
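
    A quick way to check that (same paths as above; adjust to whatever is mounted on your system):

    ls -A /mnt/rootonly/home /mnt/rootonly/boot/efi    # should list no entries under either path
    du -sh /mnt/rootonly/home /mnt/rootonly/boot/efi   # size of anything hiding under the mount points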

    When finished, you can unmount the bind mount with

    umount /mnt/rootonly

    Just an idea that might be worth checking.




  • I think we should qualify the question. I’d like to hear a reason for society as a whole to exist that is reasoned, has a firm basis in logic, and contains no emotive or circular reasoning.

    Because I cannot see the point of it (and I’ve been accused of being a pessimist, depressed and worse for expressing this opinion). So, I would really like to hear an actual reason for us all to be here.



  • Yeah, but they’re not. That’s the modern world. But even if it were a web server, there are usually ways to advertise the IP for the app to connect to; I’ve seen other devices do that, so getting an IP is easy. Once the app knows the IP, and if you really want to allow connections from outside to your IoT devices (I wouldn’t), it could remember the IP and allow that.
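
    As a rough sketch of what I mean by advertising (this assumes an Avahi/mDNS setup, which not every device uses; the service name and port are made up for the example):

    # on the device: advertise a service on the local network
    avahi-publish -s my-iot-device _http._tcp 8080

    # on the client: browse and resolve advertised services to get the device’s IP
    avahi-browse -rt _http._tcp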

    You really don’t need to give a fixed IP to everything. I think I’ve given 1 or 2 things fixed IPv6 IPs. Everything else is fine with what it assigns itself.



  • Hah. But to be fair, ATM did have a specific use that it worked great for, namely the move to digital voice circuits. The small fixed cell size and built-in QoS meant that for a given line speed you could fit X voice channels, and they would all be extremely low latency and share the bandwidth fairly. You didn’t need to buffer beyond one cell of data, and you didn’t need any overhead beyond the cell headers.

    ATM was designed to handle the “future” of digital network needs, but the immediate use was voice frames, and I’d expect that dictated a lot of the design.
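
    A rough back-of-the-envelope example of why the cell size suited voice (assuming standard 53-byte cells and 64 kbit/s G.711 voice):

    53-byte cell = 5-byte header + 48-byte payload (47 bytes usable under AAL1)
    64 kbit/s voice = 8,000 bytes/s, so one cell carries roughly 47/8,000 ≈ 6 ms of audio
    i.e. no channel waits more than a few milliseconds to fill and send a cell, and cells
    from many channels can be interleaved fairly on the same link.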





  • I think in 99% of use cases, upgrading isn’t a problem. Most of the time new SQL versions are backward compatible. I’ve never personally had a problem upgrading a database for a product that expects an older version.

    They do have compatibility modes, but those only go back so far.
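
    For what it’s worth, on SQL Server that’s just a per-database setting (a sketch; the database name is made up, and how far back you can set it depends on the server version):

    # check the current compatibility level of a database
    sqlcmd -S localhost -E -Q "SELECT name, compatibility_level FROM sys.databases WHERE name = 'AccountsDb'"

    # pin the database to SQL Server 2008 behaviour (level 100) after an engine upgrade
    sqlcmd -S localhost -E -Q "ALTER DATABASE AccountsDb SET COMPATIBILITY_LEVEL = 100"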

    But I think companies running production databases for older, complex systems are likely very wary of upgrading a working database. That’s most likely where this situation comes from. Imagine being the person responsible for IT who upgraded the DB server and database to the latest version. Everything seemed to be working fine. Then accounts run their year-end process, it falls over, and now there are months of data in the newer version that won’t work properly. It’d be an absolute pain to get things working again.

    Much safer to leave that SQL 2005 server doing what it does best. :P


  • I think most people who were gaming held onto their CRTs as long as possible. The main reason was that the first generation of LCD panels took an analogue RGB input and had to map it onto the digital panel. They were generally ONLY 60 Hz, and you often had to redo their settings when you changed resolution. Even then, the picture was generally worse than a comparable, good-quality CRT.

    People upgraded mainly because of the reduced space usage and because they looked better aesthetically. Where I worked, we only had an LCD panel on the reception desk, for example; everyone else kept using CRTs for some years.

    CRTs, on the other hand, often had much better refresh rates available, especially at lower resolutions. This is why it was very common for competitive FPS players to use resolutions like 800x600 when their monitor supported up to 1280x960 or similar: the 800x600 resolution would often allow a 120 or 150 Hz refresh.

    When LCD screens with a fully digital interface became common, even though they were pretty much all locked to 60 Hz, they started to offer higher resolutions and generally comparable or better picture quality in a smaller form factor. So people moved over to LCD screens.

    Fast-forward to today, and now we have LCD (LED/OLED/Whatever) screens that are capable of 120/144/240/360/Whatever refresh rates. And all the age-old discussions about our eyes/brain not being able to use more than x refresh rate have resurfaced.

    It’s all just a little bit of history repeating.


  • Are you sure it was CRT technology? Bear in mind that colour CRTs had to focus the beam so accurately that it only hit the specific “pixel” for the colour being lit at that time. What blur there was came from bad focus settings, age, and phosphor persistence (which is still a thing on LCDs to an extent).

    What DID cause blur was the act of merging the image, the colour and the synchronisation into a composite signal. All the mainstream systems (PAL, SECAM and NTSC) would cause a blurring effect. Games on 80s/90s consoles generally used this to their advantage, and you can see the dithering effects clearly in emulators of systems from that period. Very specifically, the colour signal sharing spectrum with the luminance signal would lead to a softening of the image that looked like blurring. Most consoles from the time only output an RF signal for a TV or, if you were lucky, a composite output.

    Good computer monitors (not TVs) of the time were extremely crisp when fed a suitable RGB signal.


  • I’d go further than that and say that leaving the house and staying home are both gambles.

    But when you’re spending money and the only net result is that you lose money, make money, or keep the same money, with no other goods or services provided in return, then gambling is the primary attribute of that spend.

    Bookmakers and investments meet that criterion; your other purchases don’t.



  • Both; each has its place. I have a desktop in my office with a decent, recent spec, kept fairly up to date.

    The laptop has a reasonable “gaming” spec; it lives in the lounge and we both use it.

    The laptop will always be a compromise. You can’t shift the heat dissipated by a full-power GPU in that form factor, and most CPUs will also be lower-power editions because they need to run on battery as well as mains power. But they’re still definitely usable.

    The desktop will always outperform it. Even the stock CPU and GPU options run at a higher TDP, and you can usually improve the cooling in a big case either to sustain stock boost frequencies or to overclock.

    Physics is the limiting factor for laptops, both in terms of power delivery and heat dissipation.