• 0 Posts
Joined 1 year ago
Cake day: July 16th, 2023


  • It should work fine in a virtual machine. Just make sure you provide suitably ancient hardware like IDE storage and old Ethernet cards. On something that old, I would only provide a single CPU. To be safe, I would also try installing with a low amount of RAM and then increase it later. Older kernels could not handle multiple processors or more than a certain amount of RAM. I think I might start with 700 MB of RAM to do the install. That might sound like nothing but it probably runs in 8.
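    A sketch of the kind of QEMU invocation I mean ( the flags and the ne2k_pci NIC model are from memory and the image names are made up; adjust -m to taste and check your QEMU documentation for the NIC models your build actually offers ):

    ```
    $ qemu-system-i386 \
        -m 64 -smp 1 \
        -hda oldlinux.img \
        -cdrom install.iso -boot d \
        -nic user,model=ne2k_pci
    ```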

    It is easy today in our era of resource richness to forget just how meager the hardware was when these distros were new.

    A distro that old is going to require some fiddling to get XFree86 ( X11 ) up and running. It should be OK in a desktop VM but I have had problems with older versions of X in Proxmox, in case you are using that.

    I kind of want to go install this myself now. Or an old version of SLS ( the precursor to Slackware ). I ran them both at some point in my Linux journey but it has been a while.

    What I really want to do is to make OCI containers from these old distros and try to run them in Distrobox on top of a modern kernel. Has somebody done that already? Really old versions of Red Hat ( not RHEL; Red Hat Linux, versions < 6 ) would be cool too.
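    If anyone tries it, I imagine the rough shape would be something like this ( the tarball name is made up, and I have not verified how images this old behave under Distrobox ):

    ```
    $ podman import slackware-1.01-rootfs.tar localhost/slackware:1.01
    $ distrobox create --image localhost/slackware:1.01 --name slack
    $ distrobox enter slack
    ```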

  • I “switched to Linux” from Windows 2000 but I have also had machines running Windows and macOS during that time. My last work computer was Windows 11 ( but I hardly used it ).

    Hard to really put into words what kept me in Linux. At times, it has required work and knowledge Windows would not have demanded of me. At the same time, Linux has been largely free of “nonsense”. It just always felt like home.

    [ Edit: thinking about it more. I have used Linux since 1992 and honestly moved from primarily OS/2 to mostly Linux. I really liked Windows 2000 though and used it well into the XP era. ]

  • Don’t get me wrong, I am very excited by the possibility of having an independent browser engine. Firefox was that. Mozilla as an organization has compromised that. I hope Ladybird can avoid the same issues.

    The rationale for SerenityOS beyond therapy and fun was precisely the “being responsible for everything ourselves” aspect of the project. Andreas has previously described it as a form of accountability. He has also described it as a kind of freedom in contrast to the Linux model with its inevitable politics, bike-shedding, and inefficiency as you try to get dozens of projects to agree on direction and merge code. Ultimately, the monorepo allowed the project to do things right in the right place in the stack ( in the code ). It allowed the project to move quickly, to avoid legacy, and to continuously improve and modernize. It allowed harmony across the entire developer AND user experience.

    Linux is famously fragmented. There is no open source desktop OS that provides “whole system” design and user experience harmony. Haiku comes close but it has never gotten much native app momentum. On Haiku, you are going to run the Falkon browser, use GIMP, and run LibreOffice. None of those are developed in concert with the OS.

    SerenityOS held the promise of Ladybird, and Hack Studio, and the userland utilities ( and everything else ) all being developed in concert. The same GUI toolkit was used for everything. The same standard library. The same error handling and code level security measures.

    This was the “need or use case”. Not anymore.

    And it is not just SerenityOS. Ladybird is not as independent as it was. Not just the sponsorships but the code. SerenityOS is no longer a dependency but 8 other projects are. No more monorepo, and goodbye to all the reasons it was considered a good thing.

  • Well, that is like, your opinion man.

    Ok. Obviously different licenses are useful in different circumstances. So, what you are saying is clearly true.

    That said, even though I believe the MIT license is already the most used license, I wish MIT were used more and GPL less.

    I do not want to create or get drawn into a debate ( because we likely have the same facts and just disagree ) but what I dislike about the GPL is that it does not respect freedom—specifically developer freedom. It constrains freedom and hopes that what it calls “the 4 freedoms” are a side effect. In my view, the GPL restricts freedom to bestow rights ( a net negative for freedom ).

    My opinion is no more valuable than yours. We do not have to convince each other. I am just explaining my view.

    Don’t get me wrong, the ability of the original author to choose the GPL is something I totally support. It is a totally valid constraint to place on people that want to use your code. A developer should get to choose the terms under which people can use their code. It is exactly this freedom that the GPL restricts. Again, I think this is totally ok ( as would be demanding money ) but it is certainly a restriction which, by definition, is not freedom.

  • All the id Software stuff worked ( e.g. DOOM, Quake ) but outside of them and Loki, not a lot of commercial stuff. And I don’t think the Loki catalogue was very current or extensive.

    There were lots of Linux native games but they were much more primitive ( though not necessarily less fun ) like Tux Racer and Pingus. There were also adventure games.

    There was also a thriving game engine “clone” scene, especially for Blizzard stuff. Not all of it ever really got there in terms of features or quality. These were designed to work with the “assets” from actual commercial games. There was Wargus for Warcraft II and Stargus for StarCraft, for example. There is DevilutionX for Diablo, which is great. Often there were fully open source games built on these same engines ( Bos Wars? ).

    By the standards of today, the Linux gaming scene would have seemed pretty shitty. You were not playing the same AAA titles as your Windows friends. However, if you were a Linux enthusiast, there were plenty of really fun options to keep you entertained.

    I think Linux has always been a bit better off than Mac with regards to gaming.

    [ Edit: Memory correction - DevilutionX is way more recent. Even Stargus did not appear until 1998 as did Pingus. Tux Racer was not until 2000. Loki was a 1999 thing too. So, my comments above are perhaps more valid for 1998 - 2005 than 1991 - 1998. DOOM was 1994 at least and Quake was 1996. ]

  • Well, XFree86 ( before Xorg and before KMS ) was an adventure. I spent hours guessing the vertical and horizontal frequencies of my monitor trying to get decent resolutions.
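    For the curious, this is the sort of thing you were hand-editing in XF86Config ( the sync ranges and the 1024x768 modeline here are generic textbook values, not from a real monitor manual; get them wrong and you could genuinely damage a 90’s CRT ):

    ```
    Section "Monitor"
        Identifier   "MyMonitor"
        HorizSync    31.5 - 57.0
        VertRefresh  50.0 - 90.0
        # dotclock  hdisp hsyncstart hsyncend htotal  vdisp vsyncstart vsyncend vtotal
        Modeline "1024x768" 65.0  1024 1048 1184 1344  768 771 777 806
    EndSection
    ```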

    Other than that, Linux was way more work but “felt” powerful relative to OS options of the time. Windows was still crashy. The five of us that used OS/2 hated that it still had a lot of 16 bit under the hood. Linux was pure 32 bit.

    Later in the 90’s, you could run a handful of Windows apps on Linux and they seemed to run better on Linux. For example, file system operations were dramatically faster.

    And Linux was improving incredibly rapidly so it felt inevitable that it would outpace everything else.

    The reality though was that it was super limited and a pain in the ass. “Normal” people would never have put up with it. It did not run anything you wanted it to ( if you had apps you liked on Mac, Windows, OS/2, Amiga, NeXTstep, BeOS, or whatever else you were using; there were lots of potential options at the time ). But even for the pure UNIX and POSIX stuff, it was hard.

    Obviously installation was technical and complex. And everything was a hodge-podge of independently developed software. “Usability” was not a thing. Ubuntu was not released until 2004.

    Linux back then was a lot of hitting FTP sites to download software that you would build yourself from source. Stuff could be anywhere on the Internet and your connection was probably slow. And it was dependency hell so you would be building a lot of software just to be able to build the software you want. And there was a decent chance that applications would disagree about what dependencies they needed ( like versions ). Or the config files would be expected in a different location. Or the build system could not find the required libraries because they were not where the Makefile was looking for them.

    Linux in the early 90’s had no package management. This is maybe the biggest difference between Linux then and Linux now. When package management finally arrived, it came in two stages. First came packages, but you were still grabbing them individually from FTP. Second came the package manager, which handled dependencies and retrieval.

    The most popular Linux in the mid to late 90’s was Red Hat. This was before RHEL and before Fedora. There was just “Red Hat Linux”. Red Hat featured RPMs ( packages ) but you were still installing them and any required dependencies yourself at the command line. YUM ( the precursor to DNF ) was not added until Fedora Core 1 was released in 2003!
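    To make that concrete, resolving dependencies by hand looked roughly like this ( hypothetical package names; the error text is from memory of rpm’s output ):

    ```
    $ rpm -i gnotepad-1.0-1.i386.rpm
    error: Failed dependencies:
            libgtk-1.2.so.0 is needed by gnotepad-1.0-1
    $ # ...now go find and download GTK yourself, then:
    $ rpm -i gtk+-1.2.10-1.i386.rpm gnotepad-1.0-1.i386.rpm
    ```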

    APT ( apt-get ) was not added to Debian until 1998.

    And all of this meant that every Linux system ( not distro — individual computer ) was a unique snowflake. No two were alike. So bundling binary software to work on “Linux” was a real horror-show. Companies like Loki gave it a good run but I cannot imagine the pain they went through. To make matters worse, the Linux “community” was almost entirely people that had self-selected to give up pre-packaged software and to trade sweat-equity for paying for stuff. Getting large numbers of people to give you money for software was hard. I mean, as far as we have come, that is still harder on Linux than on Windows or macOS.

    You can download early Debian or Red Hat distros today if you want to experience it for yourself. That said, even the world of hardware has changed. You will probably not be wrestling IRQs to get sound or networking running on modern hardware or in a VM. Your BIOS will probably not be buggy. You will have VESA at least and not be stuck on VGA. But all of that was just “computing” in the 90’s and the Windows crowd had the same problems.

    One 90’s hardware quirk was “Windows” printers and modems though, where half the firmware was implemented in Windows drivers. This was because the hardware was too limited or too dumb to work on its own; to save money, your computer would do some of the work. Good luck having Linux support for those though.

    Even without trying old distros, just try to go a few days on your current Linux distro without using apt, dnf, pacman, zypper, the GUI app store, or what have you. Imagine never being able to use those tools again. What would that be like?

    Finally, on my much, much slower 90’s PC, I compiled my own kernel all the time. Honestly multiple times per month I would guess. Compiling new kernels was a significant fraction of where my computing resources went at the time. I cannot remember the last time I compiled a kernel.
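    From memory, a 2.x-era kernel build went something like this ( exact targets varied by version; `make dep` was gone by 2.6, and the `#` lines were run as root ):

    ```
    $ make menuconfig
    $ make dep
    $ make bzImage
    $ make modules
    # make modules_install
    # cp arch/i386/boot/bzImage /boot/vmlinuz && lilo
    ```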

    It was a different world.

    When Linus from LTT tried Linux not that long ago ( he wanted to game ), he commented that he felt like he was playing “with” his computer instead of playing “on” his computer. That comment still describes Linux to some extent but it really, really captures Linux in the 90’s.