• 1 Post
  • 160 Comments
Joined 11 months ago
Cake day: October 4th, 2023

  • If I need to do an emergency boot from a USB stick to repair something that can’t boot, which it sounds like is what you’re after, pretty much any Linux distro will do. I’d probably rather have a single, mainstream bootable OS than a handful.

    I’d use Debian, just because that’s what I use normally, so I’m most familiar with it. But it really doesn’t matter all that much.

    And honestly, while having an emergency bootable medium with a functioning system can simplify things, if you’re familiar with the boot process, you very rarely actually need emergency boot media on a Linux system. grub is a pretty flexible bootloader, and the Linux kernel can come up and be usable enough to fix a pretty broken system if you pass it something like init=/bin/sh (or point it at busybox for a really broken system), remount root read-write (mount -o rw,remount /), and know how to force syncs (echo s > /proc/sysrq-trigger) and reboots (echo b > /proc/sysrq-trigger). There’s a sketch of the whole sequence at the end of this comment.

    I’ve killed ld.so and libc before and brought systems back without alternate boot media. The only times I think you’d really get into trouble truly requiring alternate boot media are (a) installing a new kernel that doesn’t work for some reason and removing all the old, working kernels before checking that the new one works, or (b) killing grub. Maybe also if you hork up your partition table or root filesystem badly enough that grub can’t bring the kernel up, but in most of those cases, I’m not sure that rescue tools are likely to bring things back anyway – you’re probably gonna need to reinstall your OS.

    EDIT: Well, okay, if you wipe the partition table, I guess that you might be able to find the beginning of a filesystem partition based on magic strings or something and either manually reconstruct the partition table or at least extract a copy of the filesystem to somewhere else.
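    For concreteness, a sketch of that whole sequence – this assumes stock grub2 keybindings, and the fix in the middle obviously depends on what’s actually broken:

        # At the grub menu, press 'e' on the boot entry, find the line
        # starting with "linux", append init=/bin/sh, and boot with Ctrl-x.

        # Root typically comes up read-only; remount it read-write:
        mount -o rw,remount /

        # ...fix whatever is broken: restore files, edit configs, etc...

        # There's no init running to shut down cleanly, so use magic sysrq:
        echo s > /proc/sysrq-trigger   # sync dirty data to disk
        echo u > /proc/sysrq-trigger   # remount filesystems read-only
        echo b > /proc/sysrq-trigger   # reboot

    And for the partition-table case in the EDIT, that’s essentially what testdisk does: it scans the disk for filesystem signatures and offers to rebuild the partition table from what it finds.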



  • Internet Archive creates digital copies of print books and posts those copies on its website where users may access them in full, for free, in a service it calls the “Free Digital Library.” Other than a period in 2020, Internet Archive has maintained a one-to-one owned-to-loaned ratio for its digital books: Initially, it allowed only as many concurrent “checkouts” of a digital book as it has physical copies in its possession. Subsequently, Internet Archive expanded its Free Digital Library to include other libraries, thereby counting the number of physical copies of a book possessed by those libraries toward the total number of digital copies it makes available at any given time.

    This appeal presents the following question: Is it “fair use” for a nonprofit organization to scan copyright-protected print books in their entirety, and distribute those digital copies online, in full, for free, subject to a one-to-one owned-to-loaned ratio between its print copies and the digital copies it makes available at any given time, all without authorization from the copyright-holding publishers or authors? Applying the relevant provisions of the Copyright Act as well as binding Supreme Court and Second Circuit precedent, we conclude the answer is no. We therefore AFFIRM.

    Basically, there isn’t an intrinsic right under US fair use doctrine to take a print book, scan it, and then lend digital copies of the print book.

    My impression, from what little I’ve read in the past on this, is that this was probably going to be the expected outcome.

    And while I haven’t closely monitored the case, and there are probably precedent issues that are interesting for various parties, my gut reaction is that I kind of wish that archive.org weren’t fighting these fights. The problem I have is that they’re basically an indispensable, one-of-a-kind resource for recording the state of webpages at some point in time via their Wayback Machine service. They are pretty widely used as the way to cite a page on the Web.

    What I worry about is that they’re going to get into some huge fight over copyright on some not-directly-related issue, like print books or something, and then someone is going to sue them and get a ton of damages and it’s going to wipe out that other, critical aspect of their operations…like, some random publisher will get ownership of archive.org and all of their data and logs and services and whatnot.




  • released for public testing

    I mean, it’s not publicly-available either; it’s just available to a select group of testers.

    I haven’t been following the game’s development. But my guess is that the devs are going to prioritize targeting the machines that they’re using to do development of the thing, and they won’t be using a Deck for that. This probably won’t be the only tradeoff, either – I’d guess that performance optimizations aimed at the Deck or other lower-end machines are further down the list. I’d guess that any kind of tutorial or whatever probably won’t go in until late in development – not that that isn’t important for bringing new players up to speed, but it’s just not something that the devs need in order to work on the thing. Probably not an issue for this game, which looks like it’s multiplayer, but I’d also guess that breaking save or progress compatibility is something that they’d be fine with. That’s frustrating for a player, but it can make development a lot easier.

    Doesn’t mean that those don’t matter, just that they won’t be top of the priority list to get working. What they’re gonna prioritize is stuff that unblocks other things that they need.

    I worked on a product in the past that had a more “customer-friendly” interface and a command-line interface. When a feature got implemented, the first thing a dev put in was the CLI support – it’s low-effort, and it’s all that the dev needs to get the feature into a testable state for the internal people. The more customer-friendly stuff, documentation, etc. all happens later in the development cycle. Doesn’t mean that we didn’t care about getting that out, just that we didn’t need it to unblock other parts of the development process. Sometimes we’d give development builds to customers who urgently needed a feature early on and were willing to accept the drawbacks of using stuff that just isn’t done, but they were inevitably getting something half-baked.

    I mean, if it bugs you, I’d just wait. They aren’t gonna be trying to provide an ideal customer experience at this point in the development cycle; they’re gonna be using it as a testbed to see what works, and it’s inevitably gonna be a subpar experience in various ways. The folks using the thing at this point are volunteering to do unpaid testing work in exchange for getting to play it very early, maybe at a point where they can still substantially alter the gameplay. Some people really enjoy that, but it depends on the person; it’s not really my cup of tea. I dunno about you, but I’ve got a Steam games backlog that goes on forever – it’s not like I’ve got a lack of finished games to get through.



  • I haven’t played it, but it sounds like the situation may be in flux:

    https://www.oneesports.gg/gaming/does-deadlock-have-controller-support/

    At the time of writing, the action game is in closed beta, and it doesn’t offer native controller support. However, it does have an option that players can use to play the game with a controller.

    With that in mind, the game is likely to feature controller support when it releases on PC, as it is expected to be Steam Deck compatible.

    However, you must keep in mind that since the game is still in early development, it doesn’t offer any key binding or customization feature.

    Additionally, even with a controller on default settings, some key actions in the game may not be mapped, so you might encounter limitations during gameplay.

    In the near term, if a keyboard can do what you want, and you can dig up macro software for your platform that watches for specific gamepad inputs and sends keystrokes in response, I imagine that you could make it work that way.
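    As a sketch of the sort of thing I mean, on Linux/X11 – jstest (from the joystick package) and xdotool are real tools, but the device path, the button number, and the exact event text to match are assumptions you’d have to check against your own controller’s output:

        # Translate one gamepad button into a keystroke.
        # Run jstest by itself first to see what your controller emits.
        jstest --event /dev/input/js0 | while read -r line; do
            case "$line" in
                # type 1 = button event; here, button 0 pressed (value 1)
                *"type 1,"*"number 0, value 1"*)
                    xdotool key Return   # send Enter to the focused window
                    ;;
            esac
        done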


  • CIFS supports leases. That is, a client can ask for exclusive access to a file, so that it can assume the file hasn’t changed underneath it.

    IIRC sshfs just doesn’t care much about cache coherency across hosts: it kind of assumes that things haven’t changed underfoot and uses a timer to expire the cache.

    considers

    Honestly, with inotify, it’d probably be possible to make a newer sshfs that does support leases.

    I suspect that the Unixy thing to do is to use NFSv4, which also handles cache coherency correctly.

    It is easy to deploy sshfs, though, so I do appreciate why people use it; I do so myself.

    kagis to see if anyone has benchmarks

    https://blog.ja-ke.tech/2019/08/27/nas-performance-sshfs-nfs-smb.html

    Here are some 2019 benchmarks that show NFSv4 to generally be the most-performant.

    The really obnoxious thing about NFSv4, IMHO, is that ssh is pretty trivial to set up, and sshfs just requires a working ssh connection and sshfs software installed, whereas if you want secure NFSv4, you need to set up Kerberos. Setting up Kerberos is a pain. It’s great for large organizations, but for “I have three computers that I want to make talk together”, it’s just overkill.
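    To illustrate the gap – hostnames and paths here are made up:

        # sshfs: anything you can ssh into, you can mount.
        sshfs user@fileserver:/srv/data /mnt/data
        # The cache-expiry timer I mentioned is tunable:
        #   sshfs -o dcache_timeout=20 user@fileserver:/srv/data /mnt/data

        # Secure NFSv4: one line to mount, but only after the Kerberos KDC,
        # host/service principals, and keytabs are set up on both ends:
        mount -t nfs4 -o sec=krb5p fileserver:/srv/data /mnt/data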


  • Oh, and one other factor. I was just reading a paper on British housing policy. I’m not taken with the format – it’s written from an imagined future in which planning restrictions on new housing were reduced, looking back at the benefits – but it does make a number of good points, including that part of the problem is that the UK hasn’t been building housing at anywhere near the ideal rate for some time. Since newer buildings are better-insulated, that also means the present building stock is less-well-insulated than it would have been had more construction occurred:

    https://iea.org.uk/wp-content/uploads/2024/03/IEA-Discussion-Paper-123_Home-Win_web.pdf

    Although this was not initially the motivation, there have been environmental benefits as well. For a long time, Britain used to have poorer energy efficiency standards than most neighbouring countries. It is not that all British homes were energy inefficient. It is just that Britain used to have the oldest housing stock in Europe (European Commission n.d.), and the energy efficiency standard of a dwelling is strongly correlated with its age (ONS 2022). Rejuvenating the housing stock has therefore accidentally driven up its average energy performance.

    This is the “the paper is from a potential future looking back at the imaginary past” format talking here.


  • But with the UK it always comes back to having the worst insulation in the world.

    Most of the UK has relatively-comfortable temperatures, so the impetus to add lots of insulation is relatively low.

    https://en.wikipedia.org/wiki/Climate_of_the_British_Isles

    Temperatures do not often switch between great extremes, with warm summers and mild winters.

    The British Isles undergo very small temperature variations. This is due to its proximity to the Atlantic, which acts as a temperature buffer, warming the Isles in winter and cooling them in summer.

    Over here, in the US, the places with the lowest temperature variations are also islands, like Hawaii. Extreme temperature swings happen in places like the Dakotas, far away from the ocean.

    You’ve been cursed with fairly comfortable temperatures. :-)


  • why the UK is pushing so hard for air to water is a mystery to me.

    I was a little confused when I started following a European forum. The transition to heat pumps has been a big deal there, with a lot of discussion in the UK and Germany, and it had a bunch of people talking about boilers and about having heat pumps heat water, which confused me. In the US, I’d mostly only heard about boilers in the context of large buildings with old steam heating systems.

    In the US, I’ve seen plenty of air-source heat pumps, and some water-source heat pumps. But both are used to heat (or chill) air, which is then blown into ducting and circulated through the house; that also provides ventilation and humidity control. Yet it seemed to be overwhelmingly the case that people in Europe were talking about heating water and circulating that instead.

    Small window or through-wall air conditioning units obviously heat air directly, usually for one room. Mini-splits move refrigerant, which ultimately heats or cools air.

    And it seems like you’d rather have ducting, since it gives you control over how much fresh air reaches each area of a building, and lets you filter that air.

    I was able to find a few companies in the UK dealing with ductwork, but they focused on new office buildings.

    I eventually figured out what was going on.

    A lot of houses in the US were built after the introduction of air conditioning. Not only that, but the US has more areas that get quite hot than Europe does, so once air conditioning was an option, people really wanted it. The result is that a lot of US housing was built with central air conditioning.

    This meant that when houses were designed, ductwork was built into the design.

    Ducts are relatively large: air is a thin medium, so it takes a lot of cross-section to move a given amount of heat.

    It is not easy to retrofit ducting into an existing house. You have to have this big thing jammed into the house somewhere that runs to all rooms.

    Water is much denser and holds far more heat per unit volume, so if you use water to move your heat around, you don’t need much space (rough numbers at the end of this comment). If you retrofit an existing house, you don’t have to tear the house up. Not only that, but a lot of buildings in the UK had apparently already been set up with systems that heated water with natural gas and circulated it around the building, so a heat pump could slot into that existing system.

    My guess is that people did the math and decided that it didn’t make sense to massively go rip up existing houses when they could stick comparatively-unobtrusive additional water pipe in.

    My guess is that new buildings will incorporate ductwork, so there will be a very slow transition to ducted systems – not overnight, but as buildings age out and are demolished.

    The current transition to heat pumps that they’re doing is on a much shorter timeline than that.
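    To put rough numbers on the water-versus-air point, using standard textbook values for density and specific heat (back-of-the-envelope; this is volumetric heat capacity, ρ·c_p):

        \rho c_p^{\text{water}} \approx 1000 \times 4186 \approx 4.2\ \mathrm{MJ/(m^3\,K)}
        \rho c_p^{\text{air}}   \approx 1.2 \times 1005 \approx 1.2\ \mathrm{kJ/(m^3\,K)}
        \text{ratio} \approx \frac{4.2\times10^6}{1.2\times10^3} \approx 3500

    So per degree of temperature difference, a given volume of water moves about 3,500 times the heat that the same volume of air does – hence small pipes where air would need big ducts.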



  • https://en.wikipedia.org/wiki/Great_raft_spider

    In October 2010 the first introduction of a great raft spider population into a new site in the UK was carried out in a joint project by Natural England and Suffolk Wildlife Trust and supported by a grant from the BBC Wildlife Fund. The project saw around 3000 spiderlings bred and reared by Dr. Helen Smith and the John Innes Centre, 1600 of which were released into suitable dykes at the Suffolk Wildlife Trust Castle Marshes nature reserve. The site is part of the Suffolk Broads and lies 50 kilometres (31 mi) downstream from Redgrave and Lopham fen, between Lowestoft and Beccles. Work was carried out to improve the ditch network at the site to prepare for the reintroduction and provide optimal habitat for the new spider population.

    Dr. Helen Smith knew that the one great problem with the UK was an insufficient number of giant spiders running around, and she intended to remedy that.

    Each spiderling was hand reared in separate test tubes and fed with fruit flies.

    “Eat. Eat and grow large and strong.”