You can also delegate a subdomain to another provider with an API, but yes I see what you mean. Although I feel like getting port 80 open would be difficult as well in those situations.
It does but it’s a bit of a weird way of doing things.
I’d say they’re actually easier, at least in my experience. Since wildcard certs use DNS-01 verification with an API, you don’t need to deal with exposing port 80 directly to the internet.
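As a sketch of what DNS-01 looks like in practice, here's a certbot invocation using its Cloudflare plugin (the domain, credentials path, and provider are illustrative; other DNS providers have their own plugins):

```shell
# Hypothetical example: wildcard cert via DNS-01 with certbot's
# Cloudflare plugin. Nothing needs to listen on port 80/443;
# certbot publishes a TXT record through the provider's API instead.
# cloudflare.ini holds the API token (path is illustrative).
certbot certonly \
  --dns-cloudflare \
  --dns-cloudflare-credentials /etc/letsencrypt/cloudflare.ini \
  -d 'example.com' -d '*.example.com'
```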
You shouldn’t have to do anything specific at all; local network stuff works without internet, and Jellyfin doesn’t rely on any internet servers for authentication the way Plex does.
I just do full system images for that reason, easier than trying to pick and choose what should be backed up. Used to use Veeam, currently using Synology Active Backup.
For online backups I don’t due to size, but for local backups it’s just way easier.
To install, at minimum you’ll likely need to shrink existing partitions and create new ones for Linux if you don’t want to wipe the drive; that would be a dual-boot setup with Windows still installed alongside. Or you can just wipe the drive entirely and run only Linux.
Regarding the files, you should already have backups of anything important; if you don’t, set them up ASAP.
Messing with partitions can easily cause data loss if something goes wrong.
You also never know when hardware failure, malware, power surges, lightning strikes, or whatever other disaster will happen and cause data loss. 1 copy of files might as well be 0 copies.
How did they end up thinking that everything must be done in the terminal while using Ubuntu?
Most guides on installing things or help on fixing things will offer terminal commands, so I can see how that could certainly lead to that feeling as a new user.
Also, depending on the DE, some very basic, obvious settings aren’t available in the GUI, like fractional scaling on KDE, which has to be enabled by editing a config file first.
Odd, I’ve had a Pixel, a OnePlus 7 Pro, and now a Galaxy S21, and they all pick up my DNS server from DHCP without any issues.
If you have private DNS turned off it doesn’t, unless maybe you have some manufacturer specific weirdness going on with extra software.
That seems a bit rough combining all those into one, can’t upgrade anything separately.
I’m not sure on the security/safety of combining your gateway and NAS either.
Does a PC connected to the same wifi network as the phone get the proper DNS servers and work like it should?
Strange, have you checked the interface info on Android to see what DNS info it’s getting from the DHCP server?
Also check that it’s getting an IP on the 192.168.x.y network, and not some other subnet if the AP is doing funky things.
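From a PC on the same Wi-Fi you can check both of those quickly; `192.168.1.53` below is a placeholder for your DNS server’s address:

```shell
# Show the IP address and subnet the machine actually got (Linux)
ip addr show

# Query the local DNS server directly; if this resolves but normal
# lookups don't, the client isn't using the DHCP-provided server.
nslookup example.com 192.168.1.53
```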
I’ll have to try it out for YouTube. I’m on gigabit internet (and hardwired), but YouTube will often stall out when trying to buffer partway through videos and take quite a while to sort itself out.
Do you have private DNS enabled on Android? That would use a public DNS server by default regardless of what DHCP configures.
Also check your browsers, some have their own DNS settings.
Frigate has been great, I’ve run it for years now.
Using OpenVINO on my Intel iGPU for hardware accelerated object detection and encode/decode.
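For reference, the relevant part of the Frigate config looks roughly like this. This is a sketch based on Frigate’s OpenVINO detector and its VAAPI ffmpeg preset; the detector name is arbitrary and your model settings will differ:

```yaml
# Sketch: Frigate using OpenVINO on an Intel iGPU for detection,
# with VAAPI for hardware-accelerated decode.
detectors:
  ov:
    type: openvino
    device: GPU

ffmpeg:
  # Intel iGPU hardware decode preset
  hwaccel_args: preset-vaapi
```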
The tunnels are encrypted, but I don’t know whether they use TLS/SSL or something else.
I looked around a while ago and didn’t really find anything good.
I think the best option is a Raspberry Pi and one of those 12-15" portable HDMI monitors.
True, although once per hour would still be a lot of data.
For example, a single fast.com test on my connection uses about 1.5GB of data, so around 1TB per month if run hourly.
Your router doesn’t handle LAN traffic, so an upgrade shouldn’t make any difference, unless you have multiple VLANs, are passing traffic between them, and don’t have a Layer 3 switch handling the inter-VLAN routing.
I would probably start with an iperf test for download bandwidth to the Pi from the server. If that looks OK, then I would benchmark the NFS share for read speed on the Pi and make sure that’s not doing something weird.
If that all looks good then I would probably suspect that Kodi either isn’t using hardware acceleration properly, or the specific media codec is not supported by the Pi for hardware acceleration.
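As a sketch of those two checks (hostnames and mount paths are placeholders):

```shell
# 1) Raw network throughput from the server to the Pi.
#    Run `iperf3 -s` on the server first, then on the Pi:
iperf3 -c server.lan -R   # -R reverses direction: server sends, Pi receives

# 2) Sequential read speed from the NFS mount, on the Pi.
#    Drop the page cache first so the read actually hits the network.
sudo sh -c 'echo 3 > /proc/sys/vm/drop_caches'
dd if=/mnt/nfs/somefile.mkv of=/dev/null bs=1M status=progress
```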
Garage definitely seems better suited for self-hosters and small setups; MinIO is just so large and complex, with very specific requirements now.