• 0 Posts
  • 31 Comments
Joined 1 year ago
Cake day: June 20th, 2023

  • If you’re getting back pain from an office chair, then your arse is likely too far forward on the seat and you’re putting pressure on your spine because it’s at an angle other than 90 degrees to the seat; or your table is too low, pulling your arms down, so you’re bending forward.

    You’re supposed to feel your arse pushing against the back of the chair, not leaving a gap between the chair and your lower back big enough to fit an arm in. And when your arms are resting on the table (which they should be pretty much all the time, if your keyboard and mouse are sufficiently far forward), you should feel no pressure, either downwards or upwards, on your shoulders.

    I’ve been coding for over 3 decades, often for massively long hours: by the age of 17 I had RSI from how my wrists rested on the edge of the table, and some years later, already doing it professionally, I went to the doctor with chest pain (which I feared was a heart condition) that turned out to be work-posture related. At some point in my mid-20s I moved to The Netherlands, to a company which had its own Ergonomics Consultant (this was back at the peak of the 90s tech boom, so there was lots of money sloshing around) who would come around when you joined, adjust everything for you (they even had height-adjustable tables) and explain all about correct work posture.

    I’ve been following that advice and haven’t had posture-related problems since, whilst always using pretty standard office chairs (always with adjustable height, though).

    I have, however, seen plenty of people with the lazy (and unwise) posture of sitting all the way forward on their chair, and quite a lot with arms too low or too high (which is more understandable, since most cheap office tables don’t have adjustable height).


  • It’s not what makes them money, so they don’t really have the business incentive to maximize hardware sales — the incentive that leads to relentlessly pushing out new hardware versions barely better than the last one, and to all manner of tricks for early obsolescence of older devices (such as deliberate OS and app under-performance on, or even incompatibility with, older versions of the hardware).

    Also, in the big picture of gaming the Steam Deck is tiny and in its early stages, so business-wise this is not the time for a strategy of relentless new hardware versions and enshittification; quite the opposite.

    Absolutely, they’re doing the right thing, but as the right thing aligns with their business objectives, it’s a bit of wishful thinking to claim it’s because they care so much about their customers as people.


  • One of the first things they teach you in Experimental Physics is that you can’t derive a curve from just 2 data points.

    You can just as easily fit an exponential growth curve to 2 points like that (one 20% above the other) as you can a sinusoidal curve, a linear one, an inverse-square curve (which actually grows to a peak and then eventually comes down again), or any of the many curves where growth has ever-diminishing returns and can’t go beyond a certain point (literally “with a limit”).
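As a sketch of that point (the numbers below are made up purely for illustration): three different curve families can pass exactly through the same two points, one 20% above the other, yet predict completely different futures.

```python
# Two made-up data points: "version 1" scores 1.0, "version 2" scores
# 1.2, i.e. the second point is 20% above the first.
x1, y1, x2, y2 = 1, 1.0, 2, 1.2

# Exponential y = a * b**x: keeps growing without bound.
b = y2 / y1                        # 1.2
a = y1 / b**x1
exp_fit = lambda x: a * b**x

# Linear y = m*x + c.
m = (y2 - y1) / (x2 - x1)          # 0.2
c = y1 - m * x1                    # 0.8
lin_fit = lambda x: m * x + c

# Limited growth y = L - d/x: tends to the ceiling L as x grows.
d = (y2 - y1) / (1 / x1 - 1 / x2)  # 0.4
L = y1 + d / x1                    # 1.4 -- the limit
lim_fit = lambda x: L - d / x

# All three fit the two observed points exactly...
for f in (exp_fit, lin_fit, lim_fit):
    assert abs(f(x1) - y1) < 1e-9 and abs(f(x2) - y2) < 1e-9

# ...but by x = 10 they diverge wildly: ~5.16, 2.8 and 1.36.
print(exp_fit(10), lin_fit(10), lim_fit(10))
```

Same two data points, three mutually contradictory extrapolations — which is exactly why 2 points can’t tell you which curve you’re on.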

    I think the point many are making is that LLM growth in precision is the latter kind of curve: still growing, but ever more slowly, and tending to a limit which is much less than 100%. It might even be more like the inverse-square one (in that it might actually go down) if the output of LLM models ends up polluting the training sets of future models, which is a real risk.

    You showing that there was some growth between two versions of GPT (so, 2 data points: a before and an after) doesn’t disprove this hypothesis. It doesn’t prove it either: as I said, 2 data points aren’t enough to derive a curve.

    If you do look at the past growth in precision for LLMs, whilst improvement is still happening, the rate of improvement has been going down, which does support the idea that there is a limit to how good they can get.


  • At one point in my career I actually designed mission-critical, high-performance distributed server systems for a living, so I’m well aware of that.

    You can still pack thousands of users per server and have very low latency, as long as you use the right architecture (it’s mainly done with in-memory caching and load balancing), even when accessing gigantic datasets which far exceed the data space of a game. In a game, the actual shared data space is minuscule, since all clients hold a local copy of most of it — i.e. the game level they’re playing in. Even with the most insane anti-cheat logic, checking every piece of data coming in from the user side against a server-side copy of the “game level data space”, it’s still but a fraction of the shared data space in equivalent situations in the corporate world. Plus, it tends to be easily partitionable data: even in an MMORPG with a single fully open massive playing space, players only affect limited areas of the entire game space, so you don’t really need to check the actions of a player against the data of all other players.
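The partitioning idea can be sketched roughly like this (names and the cell size are my own illustrative choices, not any real game server’s): the server keeps players in a coarse spatial grid, so validating one player’s action only requires looking at players in neighbouring cells, not the whole game space.

```python
from collections import defaultdict

CELL = 50.0  # grid cell size in world units (arbitrary for the sketch)

def cell_of(x, y):
    return (int(x // CELL), int(y // CELL))

class World:
    """Toy spatial partitioning ("interest management") sketch."""

    def __init__(self):
        self.cells = defaultdict(set)  # cell -> set of player ids
        self.pos = {}                  # player id -> (x, y)

    def place(self, pid, x, y):
        old = self.pos.get(pid)
        if old is not None:
            self.cells[cell_of(*old)].discard(pid)
        self.pos[pid] = (x, y)
        self.cells[cell_of(x, y)].add(pid)

    def nearby(self, pid):
        """Players in the 3x3 block of cells around pid -- the only
        ones whose data an action by pid needs to be checked against."""
        cx, cy = cell_of(*self.pos[pid])
        out = set()
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                out |= self.cells[(cx + dx, cy + dy)]
        out.discard(pid)
        return out

w = World()
w.place("alice", 10, 10)    # cell (0, 0)
w.place("bob", 60, 10)      # cell (1, 0) -- adjacent to alice
w.place("carol", 500, 500)  # cell (10, 10) -- far away
print(w.nearby("alice"))    # only bob is relevant to alice's actions
```

However many thousands of players are on the server, each action is validated against a handful of neighbours, which is what keeps per-player server cost so low.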

    Also keep in mind that all the static stuff (never-changing or slow-changing things like achievements or immutable level configuration) can still be served with “normal” latencies.

    Further, the kind of Tier 1 ISP that provides network access for companies like Sony, servicing millions of users, already has more than good enough latency in its normal service, so Sony need not pay extra for “low latency”.

    Anyway, you do make a good and valid point; it’s just that, IMHO, that’s the kind of thing that pushes the running costs from a cent or less per player per month to, at most (and this is likely quite a large overestimate), a dollar per player per month — unless they only have tens of players per server (which would be insane, and they should fire their systems designers if that’s the case).


  • After over 3 decades as a gamer and tech user, this is maybe the single most consistently important benefit of any open platform where you can just install Linux.

    The rest is nice, but this one means that 10 or 20 years from now your hardware might have been repurposed for something else and still be useful and in use, whilst a closed platform will just be more junk in a junkyard — or sitting in a box with those things you’ve kept just because you don’t like throwing expensive stuff away but will in practice never use again.




  • I have an Orange Pi 5 Pro 16GB in a box that smoothly runs a full-blown Ubuntu Desktop and would fit in a pocket, though it’s maybe a little too thick (from memory, the box is about 3x5x2 cm).

    Total cost was about $170.

    The board itself would fit a thinner box, but you might have to 3D print one.

    Mind you, an N100 mini-PC that costs the same is even more capable as a Linux desktop, but it’s significantly larger and will definitely not fit in a pocket.

    You can find cheaper SBCs capable of running desktop Ubuntu, but in my experience (with a $35 Banana Pi P2-Zero), if you go too far down the price scale, desktop Linux performance stops being smooth, even if the board is a tiny thing.

    It was actually quite surprising to me when I recently found out that some of these things are perfectly capable Linux desktops.




  • Aceticon@lemmy.world to Linux@lemmy.ml · edited 2 months ago

    The amount of effort I put in to try and avoid using nested parentheses is truly Herculean.

    I think that stuff is the product of a completionist/perfectionist mindset: as one is writing, important details/context related to the main train of thought pop up in one’s mind, and as one writes those down, important details/context related to those details/context pop up in turn (and the tendency is to keep going down the rabbit hole of details on details).

    You get this very noticeably with people who, during a conversation, go off on a tangent and often even end up losing the train of thought of the main conversation (a tendency I definitely have), since one doesn’t get the chance to go back and re-read, reorganise and correct during a spoken conversation.

    Personally, I don’t think it’s an actual quality (sorry to all upvoters), as it indicates a disorganised mind. It is, however, the kind of thing one overcomes with experience, and I bet Mr Torvalds himself is mostly beyond it by now.



  • In a highly simplified way:

    • Think of Windows as an electricity provider with their own specially shaped wall socket.
    • Linux is also an electricity provider with a differently shaped wall socket.
    • In this metaphor Wine is just some guys providing an adaptor that makes the electricity of the Linux electricity provider available in a wall socket that has the same shape as the Windows provider’s.

    Wine isn’t breaking Windows copyright because it doesn’t copy any of the Windows internals: instead, it provides contact points with the right “shape” for programs made to work on Windows to connect to and get their needs fulfilled, and then internally Wine does its own thing, which is mainly using the Linux underneath it to do the heavy lifting.

    Mind you, this simplification seriously understates just how complicated it was to implement what Wine implements, because the Windows interface is a lot more than just the shape of a wall socket.
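A tiny sketch of the idea (in Python, with made-up names — real Wine is C and enormously more complex): reimplementing an interface means offering the same call “shape” the foreign program expects, while internally delegating everything to the host platform and copying none of the original’s internals.

```python
import io

class WineLikeShim:
    """Provides the call 'shape' a foreign program expects (made-up
    example API), while the work is done by the host's own facilities."""

    def write_file(self, handle, data):
        # Same signature and contract the foreign program was built
        # against; no foreign internals here, just the host doing the job.
        handle.write(data)
        return len(data)  # report bytes written, as the contract demands

shim = WineLikeShim()
buf = io.StringIO()                    # stands in for a host OS file
written = shim.write_file(buf, "hello")
print(written, buf.getvalue())         # 5 hello
```

The program calling `write_file` never knows (or cares) that the implementation behind the “socket” is entirely different — which is the whole trick.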


  • I got an Orange Pi 5 Plus to play with smallish AIs (because it has an NPU), and I normally access it remotely, so I need to know its IP address to do so.

    To easily know its IP address, I wired a little 128x64 monochrome OLED screen to it (Orange Pis, like Raspberry Pis, have a pin connector giving access to GPIO and interfaces like I2C, Serial and SPI), which talks via I2C.

    It turns out those interfaces aren’t active in Linux by default (i.e. no /dev/i2c-x), so I figured out that I had to add a kernel overlay to activate that specific interface, which I did (unlike on the Raspberry Pi, whose Linux version has a neat program for doing it, on the Orange Pi you have to know the low-level details of activating those things).

    To actually render characters on that screen, I went with an ARM Linux port of a graphics library for those screens which I’d used before with Arduino, called u8g2.

    Then I wrote a program in C that just scans all network interfaces and prints their names and IP addresses on that screen, and installed it as a cron job running once a minute.
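The actual program described above is C; as a rough stdlib-only Python sketch of the simplest version of the idea (finding just the primary outbound address, rather than enumerating every interface the way the C program does):

```python
import socket

def primary_ip():
    # A UDP connect() sends no packets, but it makes the kernel pick
    # the outbound interface and source address for that route, which
    # we can then read back with getsockname().
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        s.connect(("192.0.2.1", 80))  # TEST-NET-1 address; nothing is sent
        return s.getsockname()[0]
    except OSError:
        return "127.0.0.1"            # no route at all: fall back to loopback
    finally:
        s.close()

print(primary_ip())
```

A script like this could be run from cron once a minute exactly as described, with the output pushed to the display instead of printed.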

    Now, as it turns out, when you shut down Linux on that board, if you don’t disconnect it from power there is actually still power flowing through the pin connector to any devices wired there, so after shutdown my screen would stay ON, showing the last thing I had put there — but because the OS was down, it would naturally not get updated.

    So the last thing I did was another small C program which just sends that screen the command to go into power-saving mode, shutting it down. This program was then installed as a systemd service that runs when Linux is shutting down.
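A shutdown hook like that might look roughly like the unit below (the file path, unit name and binary path are my own assumptions for illustration, not the author’s actual files):

```ini
# /etc/systemd/system/oled-off.service  (hypothetical name and path)
[Unit]
Description=Put the I2C OLED into power-saving mode at shutdown
DefaultDependencies=no
Before=shutdown.target

[Service]
Type=oneshot
# Hypothetical install path for the small C program described above
ExecStart=/usr/local/bin/oled-off
RemainAfterExit=yes

[Install]
WantedBy=shutdown.target
```

`DefaultDependencies=no` plus `Before=shutdown.target` is the usual pattern for getting a one-shot task to run while the system is going down.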

    The result is that there is now a little screen hanging from the box where I put this board, listing its IP addresses, with the info updated if it connects via other interfaces or reconnects and gets a new IP address. Curiously, I’ve actually been using that feature because it’s genuinely useful, not just a fun little project.


  • The whole business model of Apple is to force a hardware upgrade cycle on you and to force all your devices into that same ecosystem.

    I mean, I can see the advantages of it in the short term, but in the longer term, having stuff that keeps on working as always, even on older hardware (or you just install new hardware under it and it just recognizes it and keeps working), is a massive benefit versus a $1500+ bill every two to five years and having to migrate all your stuff.


  • Having done the transition some months ago: there is still some stupid shit one has to deal with at times (especially, but not only, for games NOT from Steam), more than on Windows, but it’s all so much better than it was before, and by now quite close to the gaming experience on Windows.

    Then, on top of that, there are all the longer-term peace-of-mind advantages versus Windows: upgrading your Linux costs zero; changing your hardware won’t invalidate your Linux “OEM license”; games that work on today’s Linux will keep working on tomorrow’s; and so on. In fact, more often than not, hardware migration with Linux means just moving your SSD to a whole new machine — with all your stuff set up the way you like it and all your files in place — and it just boots and keeps on working, rather than throwing you into driver nightmare. This is a massive advantage of Linux versus Windows which is seldom talked about.

    (PS: This is especially relevant for gamers who have to upgrade because of the increasing hardware demands of games, even though the hardware is fine for everything else they do on that machine, and who would rather that all those other things they’ve installed and kept on using — rather than uninstalled after “finishing the game” — carry on configured just the way they like and working just as they always did, even when they do upgrade the hardware for games. People who are fine with hardware dedicated to gaming, and with replacing the whole thing — hardware and software — for newer games, just get Xboxes or similar consoles, not PCs.)

    Linux not only saves you from enshittification, keeps control in your hands and preserves your privacy; it’s also a reliable and functional long-term OS layer for your hardware that doesn’t force hardware upgrades on you.



  • I did the same transition a couple of months ago (the Windows to Pop!_OS one, not the desktop environment one), and even though I’m a gamer (something which had stopped me from moving to Linux as the main OS on my home desktop since the late 90s, when I’ve usually had it in dual boot but not used it that much), I’m very happy with it.

    I’ve actually been familiar with Linux since way back in the Slackware days, but only now have I started using it as my main desktop.

    I do think it’s becoming the Year Of Linux On The Desktop for a lot more people than ever before, thanks to the aligned forces of Windows “all your computerz belongz to us” 11, software-as-a-service with its general enshittification, and just how much easier it is to game on Linux — thanks mainly to Valve and the steady, unrelenting stream of improvements from the Wine devs.


  • More generally: delegate anything critical to a 3rd party and you’ve just put your business at the mercy of the quality (or lack thereof) of their business processes, which you do not control — especially dangerous in the current era of “cheapest possible” hiring practices.

    Having been in IT for almost 3 decades, a lesson I learned long ago — and which I’ve also been applying to my own things (such as having my own domain for my own e-mail address rather than using something like Google) — is that you should avoid as much as possible having your mission-critical or hard-to-replace stuff depend on a 3rd party, especially if the dependency is live (i.e. actively connected, rather than just buying and installing their software).

    I’ve managed to avoid quite a lot of the recent enshittification precisely because I’ve been playing it safe in this domain for 2 decades.