I have one and it works as a drive, but none of the seagate software runs in Linux.
The Apple support window is pretty predictable. You get about seven years from device release to the last OS update.
It used to be that they didn’t talk about it and it was kind of a “he who has eyes, let him see” situation.
Of course, we’re talking hardware here so that’s sort of neither here nor there.
The enterprise dell experience is indeed very good all around. I’d even include hp in the pile if I had any experience with em. Their scopes used to be decent.
If it’s like the older vtech educational toys it uses some z80 processor, so you won’t be able to run Linux, but you can run a few different hobbyist microcomputer operating systems like zeal8bit and fuzix.
This is sometimes true, but big brands have the market penetration to make hardware support very easy through second hand and third party parts and to be enticing fruit for third party support (see opencore and the dosdude patchers for apple stuff).
Generally if you want long support windows you go for big boring brands’ simplest business class laptops. Or Apple.
Small companies can make a commitment to support, but they often have neither the money, the customer base, nor the manpower to follow through when the going gets tough.
I have found that popularity is a better predictor of spare part availability than any commitment from a company of any size. When they stop selling parts, there’s always the second hand market. When that dries up there’s always third party parts.
Firmware updates are one of the places that dell, Lenovo and Apple shine. Because of their customers’ expectations, they tend to release new updates and drivers as functionality expectations or security conditions change.
I haven’t worked on this particular model, but gaming laptop build quality is generally very bad.
Msi build quality is also generally pretty bad.
With that said, most people don’t need build quality because they don’t actually take their computers anywhere.
I think the defaults on your tunnel apply themselves to all interfaces (or whatever the active one(s) are).
If you wanna troubleshoot this from the ground up you’d start with looking at your routing table.
If you run into problems using the process enumerated in the link you posted a couple of replies down, you can start to troubleshoot it by looking at the routing table with ip route (iptables -L shows firewall rules, not routes).
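For example, the routing table will tell you which interface the default route actually goes out. A minimal sketch (the wg0/eth0 names and the sample output are hypothetical; on a real box you’d just run `ip route show` and read it):

```shell
# On a live system: ip route show    (iproute2; 'route -n' is the older net-tools way)
# Sample output resembling what a tunnel that grabbed the default route looks like:
routes='default dev wg0 scope link
10.0.0.0/24 dev wg0 scope link
192.168.1.0/24 dev eth0 proto kernel scope link src 192.168.1.50'

# Pull the interface name that follows "dev" on the default route line
default_if=$(printf '%s\n' "$routes" | awk '/^default/ {for (i=1;i<=NF;i++) if ($i=="dev") print $(i+1)}')
echo "default route goes out: $default_if"
```

If that prints your tunnel interface when you expected your physical NIC (or vice versa), you’ve found where to start digging.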
Theoretically, no.
In reality, possibly/yes!
What do you have?
Yeah, there’s a baseline of network stack understanding that you gotta have in order to use some of the tools, even the ones that are supposed to make it easier.
What don’t you get? Maybe I can point you in the right direction.
I always forget what subset of bins come on the livecd, does it do lsblk?
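If lsblk didn’t make the cut on that livecd, /proc/partitions carries most of the same information. A quick portable check might look like this (a sketch; I haven’t verified it against any particular live image):

```shell
# Prefer lsblk if it's present, otherwise fall back to the kernel's own view
if command -v lsblk >/dev/null 2>&1; then
  lister="lsblk"
else
  lister="cat /proc/partitions"
fi
echo "using: $lister"
```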
Maybe industry specific stuff like photoshop or something.
Web browsers and normal stuff will keep on trucking as long as the os has a valid root certificate.
Oh this is entirely different than soldering the ram to the motherboard (which is really common on pc laptops now too, it’s harder to find one with sockets now than it’s ever been!).
The ram is inside the cpu. The processor isn’t “just” a cpu (although you can’t call even the old pentium “just” a cpu, they do so much nowadays!), it’s got the video card, bus controllers, ram and all kinds of other stuff built into that one IC!
It’s a SoC, System on a Chip, just like the processors that run phones and tablets and stuff.
If you go the cheap m1 route, get the most ram you can find in it. The m series have ram built into the chip, so you can’t upgrade it later.
Also, if the previous owner says it’s getting slow, then nuke the ssd with the dd command after you have confirmed ownership is transferred. You’ll have a longer process to reinstall the os from first principles, but it’ll fix slowness from the ssd’s old blocks having never been rewritten.
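The dd step is destructive, so it’s worth rehearsing the invocation on a scratch file before pointing it at the real device (on hardware you’d use of=/dev/sdX, where sdX is a placeholder for the whole disk). A minimal sketch:

```shell
# Demo on a 1 MiB scratch file; on real hardware you'd use of=/dev/sdX (the whole disk)
target=$(mktemp)
dd if=/dev/zero of="$target" bs=1M count=1 status=none   # status=none is GNU dd

# Verify it's all zero bytes: tr deletes NULs, so nothing should be left
leftover=$(tr -d '\0' < "$target" | wc -c)
echo "non-zero bytes left: $leftover"
rm -f "$target"
```

For an ssd specifically, blkdiscard (from util-linux) discards every block in one shot and is gentler on wear than writing zeros, if the drive and kernel support it.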
Maybe not as expensive as you think. The classic getting-into-the-mac-game choice is the 2012 mbp 13”, which can run a supported macos with opencore legacy patcher and costs <$200 with 16gb ram and an ssd.
The next best starter option is probably to make the big long leap to a first gen m1 air which can be had for ~$400 if you keep your eyes open.
Those are both expensive to me lol, but not the multiple thousands for a new computer.
The alternative route I took is maintaining a mac computer for when I need to “be normal”.
A good project between now and then is to investigate the iot sku. It has everything “unnecessary” cut out because it’s intended to be installed on refrigerators and has a much longer support window (2032?) for the same reason.
You should set up dual boot now so you don’t get surprised by differences when support ends and you feel the need to switch to an ltsc sku or use Linux.
Don’t wait, prepare!
Keep a hold of windows for a little while so that if something critical comes up that you can’t figure out you have a fallback.
No need to worry, disk failures almost never result in fires or hazardous conditions.
A-yuk-yuk-yuk.
Seriously: based just on that little snippet of the logs, you have a disk that has failed internally (ICRC ABRT). You can either use a tool like spinrite to try to repair it (you may lose all the data in the process) or replace it.
A user suggested bad cabling and that’s a possibility, one you can check easily by swapping the cable if the error is reproducible. Before I swap cables, I’ll often confirm the diagnosis with smartctl and look for whatever the drive manufacturer calls the errors that happen between the media and the disk controller chip on the drive. If it has those, then there’s no point in trying a cable swap; the problem is not happening there.
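One way to split the two failure domains with smartctl: UDMA_CRC_Error_Count (attribute 199 on most drives) counts errors on the cable/link side, while things like reallocated sectors point at the media side. A sketch of that check, where the real invocation would be `smartctl -A /dev/sdX` (needs root and smartmontools) and the sample output below is fabricated for illustration:

```shell
# Fabricated SMART attribute lines standing in for real `smartctl -A` output:
smart_sample='  5 Reallocated_Sector_Ct   0x0033   100   100   010    Pre-fail  Always       -       0
199 UDMA_CRC_Error_Count    0x003e   200   200   000    Old_age   Always       -       12'

# The raw value is the last field on the attribute's line
crc_errors=$(printf '%s\n' "$smart_sample" | awk '/UDMA_CRC_Error_Count/ {print $NF}')
if [ "$crc_errors" -gt 0 ]; then
  echo "CRC errors on the link: a cable swap is worth trying"
else
  echo "link looks clean: suspect the drive itself"
fi
```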
People will say that you can’t “fix” bad disks with tools like spinrite or smartctl. I’ve found that to be incorrect. There are certainly times when the disk is kaput but most of the time it’ll work fine and can go back into service.
Of course, I’m recovering from errors the first time I get an email or text about them, and the disks go back into service in a multi-parity array, so early detection and lowered criticality could have a lot to do with that experience.
I don’t know of any msi or asus boards with problems. Of course, I rejected coreboot as a requirement so that plays into it.
My personal experience is: don’t overclock and everything will run fine for at least ten years.
Blender works faster with nvidia, and it’s been the optimal hardware for maybe two decades now. There’s just so much support and knowledge out there for getting every feature in the tool working with it that I couldn’t in good faith recommend amd cards just for a slightly nicer Wayland experience or a slightly better deal.
If you’re only doing llm text work then a case could be made for a non cuda (non-nvidia) accelerator. Of course at that point you’d be better served by one of those coral doodads.
Were you only doing text based ml work or was there image recognition/diffusion/whatever in there too?
When was the last 4yr window on a computer? I think the ati 2011 15” mbp got dropped fast af, but that’s the last real short one I remember. I haven’t dealt extensively with the touchbar models though.
The m1 air looks to be another 2012 mbp 13”. It would surprise me if they cut it off at 7 years, although that decision seems to have been driven by the enterprise install base, and who knows if that’s still what it once was.
I think the reason mobile-style os support windows are apple’s thing on computers is that they don’t have a separate business line. iirc xps used to be dell’s enthusiast brand and now it’s part of the business line.
Thinking more about it, the core line of processors was a real stumble for intel, because they were really good and lasted forever, and manufacturers had to start pushing updates to fix realtek and qualcomm chip problems or get blamed for shit not working or not being supported.
Also, this is kinda tangential, because the op is asking about firmware support and hardware availability; firmware support is not as important on macs, and they have an incredible second hand hardware market.