If there is a problem at low power usage, you can easily solve it by temporarily adding more load. Say, add a 40 W lamp or something, and later subtract it from the calculation.
I see it as: thank you, I don't have to go to the Mozilla website and download the installer. So much time saved, and it only takes about 5 seconds without doing anything manually. On Linux I saw a "please restart Firefox" tab and clicked it. No problem. I got the update fast.
Thank you for all this time. I still use it today, along with Pidgin.
That is something that should be possible: choosing a backup sync server that has my data (profile). I hope they implement that.
I can see that threads.net may harm the Fediverse. But there might be some people who don't like the Threads client and want to use the Fediverse to interact with them. This will divide us. Also, trying to spread a standard where threads.net is blocked is very hard. Maybe it has to be the default on the server side, or, even better, a blocklist subscription.
People will not leave YouTube. It has too many videos. I believe they will just find some proxy solutions or similar. For the authors, using anything else will just result in getting paid less.
Yes, they can, like any other company or organization can. But they can't remove the humans producing the content. That means the humans will just go somewhere else. YouTube is a standalone product that they probably want to keep, as I think it pays for itself with that amount of ads.
YouTube is just the database with the videos. Just use a different frontend. The problem is if I actually want recommended videos, but without Google knowing about it; then it is hard due to the massive amount of videos. Only Google has the money to scan everything.
Yes, you can write bad code, and that matters most. Rust is more a low-level than a high-level language. Rust is new, so not many bloated libraries have been written yet :) So far I have seen many lean Rust applications in the open source world. Please note I used the word "should" - no guarantees.
SQLite makes the minimum memory usage much lower than MySQL. Many who would self-host this do it for just a single user and don't need a standalone database.
I can imagine that the application itself doing this would not require much RAM at all, but running MySQL alongside it requires orders of magnitude more.
I love that it is written in Rust. That should mean fast and efficient, with low memory usage.
Edit: It uses MySQL as its database, so it is heavy.
By save in terms of load, I mean CPU usage. The question is how much money they will save by us utilizing this instead. I don't think it is as heavy as ChatGPT or anything like that.
Personally I don't save passwords in the browser. I use KeePassXC and its web extension for the browser. But Firefox Sync for all Firefox settings, sending tabs, etc.
How much will we save on the production Mozilla Firefox servers in terms of load?
KeePassXC + Syncthing to my phone in read-only mode and to another machine. So 3 copies on different machines, while one of them is on me.
I use very simple software for this. My firewall can do route monitoring and failover, and use policy-based routing. I just send all traffic to another machine that handles the diagnosis part. It pings through the firewall and fetches some info from the firewall. The page itself is not pretty, but it says what is wrong. Enough for my parents to read what the error is. I also send DNS traffic to a special DNS server that responds with the same static IP address - enough for the browser to continue with an HTTP GET that the firewall will forward to my landing page. It is sad that I don't have any more problems since I changed ISP.
Had a scenario where the page said the gateway was reachable but nothing more. An ISP issue: the DHCP lease slowly ran out. There was a fiber cut between our town and the next. Not much I could do about it. I just configured the IP statically and could reach some friends through IRC in the same city, so we could talk about it.
The webpage itself was written in PHP; it read ICMP logs and showed the relevant up/down entries. Very simple.
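The checks described above (gateway reachable, DNS working) can be sketched like this. This is not the actual PHP page from the comment, just a minimal Python illustration; the gateway address and test hostname are placeholder assumptions:

```python
import socket
import subprocess

# Placeholder values - adjust for your own network (assumptions, not from the comment).
GATEWAY = "192.168.1.1"
TEST_HOST = "example.com"

def ping(host: str) -> bool:
    """Return True if one ICMP echo to `host` gets a reply within 2 seconds."""
    result = subprocess.run(
        ["ping", "-c", "1", "-W", "2", host],
        stdout=subprocess.DEVNULL,
        stderr=subprocess.DEVNULL,
    )
    return result.returncode == 0

def dns_works(name: str) -> bool:
    """Return True if `name` resolves to an address."""
    try:
        socket.gethostbyname(name)
        return True
    except OSError:
        return False

def format_report(checks: dict) -> str:
    """Render pass/fail lines, like the landing page would show the parents."""
    return "\n".join(
        f"{name}: {'OK' if ok else 'FAILED'}" for name, ok in checks.items()
    )

if __name__ == "__main__":
    checks = {
        "gateway reachable": ping(GATEWAY),
        "dns working": dns_works(TEST_HOST),
    }
    print(format_report(checks))
```

The point is just that each check is a yes/no probe, and the page only has to render the first one that fails.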
The DVDs are fine for offline use, but I don't know how to keep them updated. It would probably end up taking loads of space, as I guess they are equivalent to a repo mirror.
I use it with Kubuntu. Doing apt update is now much faster. I did some testing and found a good public mirror, so I could max out my connection (100 Mbit) with about 15 ms latency to the server. But I think the problem was that there are so many small files. Running nala to fetch the files in parallel helps, of course. With apt local ng I don't need nala at all. The low latency and having the files on a gigabit connection to my server make for fast access. I just need to find a good way to fill it with new updates.
A second problem is figuring out whether something can be done to speed up apt upgrade itself, which I guess is not possible. A workaround with snapshots and sending diffs does not sound efficient either, even on older hardware.
apt update: 4 seconds vs 16 seconds.
apt upgrade --download-only: 10 seconds vs 84 seconds.
First off: if the Internet goes down, I have an HTTP captive portal that does some diagnosis, showing where the problem is: link on the network interface, gateway reachable, DNS working, and DHCP lease. Second: now that it is down, show the timestamp of when it went down. Third: phone numbers for the ISP and the city fiber network owner.
Fourth: watch my local RSS feed and email folder. I also have something from YouTube or Twitch downloaded locally to watch.
Use Veeam. If you hit the limit, just configure it to send to an SMB share and you need no license.
It might be enough to just rsync things to the secondary regularly, and have the inactive machine monitor the active one and start all services when the active machine stops responding.
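That monitor-and-take-over loop could look something like this minimal Python sketch. The host address, service names, and thresholds are placeholder assumptions, not anything from the comment:

```python
import subprocess
import time

# Placeholders - adjust for your own setup (assumptions, not from the comment).
ACTIVE_HOST = "192.168.1.10"
SERVICES = ["nginx", "postgresql"]   # services the standby should start on takeover
FAIL_THRESHOLD = 3                   # consecutive failed checks before takeover
CHECK_INTERVAL = 10                  # seconds between checks

def host_alive(host: str) -> bool:
    """One ICMP echo; True if the active machine replied."""
    return subprocess.run(
        ["ping", "-c", "1", "-W", "2", host],
        stdout=subprocess.DEVNULL,
        stderr=subprocess.DEVNULL,
    ).returncode == 0

def should_take_over(failures: int, threshold: int = FAIL_THRESHOLD) -> bool:
    """Fail over only after `threshold` consecutive missed checks,
    so a single dropped ping does not trigger a takeover."""
    return failures >= threshold

def take_over() -> None:
    """Start the services on this (standby) machine."""
    for svc in SERVICES:
        subprocess.run(["systemctl", "start", svc], check=False)

if __name__ == "__main__":
    failures = 0
    while True:
        failures = 0 if host_alive(ACTIVE_HOST) else failures + 1
        if should_take_over(failures):
            take_over()
            break
        time.sleep(CHECK_INTERVAL)
```

Requiring several consecutive failures before starting the services is the cheap way to avoid a split-brain on a momentary network blip; a real setup would also need to fence or demote the old active machine.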
Just stop supporting the biggest actor in the market.