Seems like a decision for the better; as long as they have the name Mine attached to it, there will be higher expectations and lower initial trust than with an entirely different name.
You apparently have little interaction with regular users, because one of the top problems a non-power user has is “oops, I accidentally hit delete on this important file I don’t have a backup of”.
Not saying that qbittorrent-nox of all things switching makes a ton of sense, but at least for desktop applications there is a very good reason why deleting things is a two-step process.
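The usual implementation of that two-step process is “move to trash now, actually delete later”. A rough sketch of the same idea from a terminal, assuming GLib’s gio tool is available (the file name is just a placeholder):

```
# step 1: "delete" only moves the file to the trash, so it can still be recovered
gio trash ~/Documents/important-file.odt

# step 2: the actual, irreversible deletion happens separately and deliberately
gio trash --empty
```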
Excuse me, they want how much for what boils down to a graphics upgrade?! Damn, that is insanely greedy and definitely won’t affect sales negatively.
A) Funny how that works with Steam every time. “We don’t need Steam” > sales plummet > “Release on Steam in 60 days”
B) I don’t think releasing the game on Steam will save them here; from what I’ve seen it’s just a bad game, plain and simple. It will maybe fill the gap a bit, but probably not by enough to actually achieve the sales numbers they would like to see.
I somewhat disagree that you have to be a data hoarder for 10G to be worth it. For example, I’ve got a headless Steam client on my server that has my larger games installed (all in all ~2TB, so not in data hoarder territory), which allows me to install and update those games at ~8 Gbit/s. That in turn allows me to run a leaner desktop PC, since I can just uninstall the larger games as soon as I don’t play them daily anymore, and it saves me time when Steam inevitably fails to auto-update a game on my desktop before I want to play it.
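In case anyone wants to roughly replicate that, a minimal sketch using Valve’s steamcmd (not exactly my setup; the username, install path and app ID are placeholders):

```
# install or update a game into a dedicated directory on the headless box
# 620 is just an example app ID (Portal 2)
steamcmd +force_install_dir /srv/steam/portal2 +login mysteamuser +app_update 620 validate +quit
```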
Arguably a niche use case, but it exists alongside other such niche use cases. So if someone comes into this community and asks how best to implement 10G networking, I will assume they have (or at least think they have) such a use case on their hands and want to improve that situation a bit.
Personally, going 10G on my networking stuff has significantly improved my experience with self-hosting, especially when it comes to file transfers. 1G can just be extremely slow when you’re dealing with large amounts of data, so I also don’t really understand why people recommend against 10G here of all places.
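If you want to sanity-check what a 10G link actually delivers before blaming disks or protocols, iperf3 is the usual tool (the IP is a placeholder):

```
# on the server
iperf3 -s

# on the client; -P 4 runs four parallel streams, which helps saturate 10G
iperf3 -c 192.168.1.10 -P 4
```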
Yeah, they definitely could have been quicker with the patches, but as long as the patches come out before the articles they are above average in how they handle CVEs; way too many companies out there just don’t give a shit whatsoever.
If I buy a switch and that thing decides to give me downtime in order to auto-update, I can tell you what lands on my blacklist. Auto-updates absolutely increase security, but there are certain use cases where they are more of a hindrance than a feature. Want proof? Not even Cisco does auto-updates by default (and from what I’ve managed to find in this short time, neither does TrendNet, which you’ve been speaking well of). The device deciding on its own to just fuck off and pull down your network is not in any way a feature their customers would want. If you don’t want the (slight) maintenance load that comes with a managed switch, do not get one; get an unmanaged one instead.
So first of all, I see no point in sharing multiple articles that contain the same copy-pasted info; one of those would have been enough. That aside: again, patches were made available before the vulnerability was published, and MikroTik not pushing updates is arguably more of a feature than a flaw, since automatic updates cause network downtime via a reboot, and that would be somewhat problematic for networking equipment. Could they have handled it better? Yes, you can almost always handle vulnerabilities better, but their handling of it was not so egregious as to warrant completely avoiding them in the future.
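And if you do want timely updates without surprise reboots, RouterOS lets you schedule them into a maintenance window yourself. A sketch, assuming the built-in scheduler (exact syntax may vary between versions):

```
# check for and install updates at 03:00 every night; the device only reboots if something was actually installed
/system scheduler add name=nightly-update start-time=03:00:00 interval=1d on-event="/system package update install"
```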
Can you elaborate on how their response was lacking? From what I found, the stable branch had a patch for that vulnerability available for several months before the first report, while the LTS branch had one available a week before the first article (arguably a brief period to wait before releasing news about the vulnerability, but not unheard of either).
MikroTik also offers a 2-year warranty, since they legally have to; no idea what you’re on about there. Also not sure what you think they sell other than networking gear, because for the life of me I can’t find anything other than networking-related stuff on their website.
Baldur’s Gate is a prime example of people not actually being bothered by the “woke” stuff all that much. It just gets the blame when the product turns out shit, because it’s an easy tool to explain why everything else about a game sucks.
The mechanics are bad? They must’ve spent more time arguing about how to include pronouns than about how to make the game fun.
Buggy? Obviously revisions to make the game more inclusive had higher priority.
Are those speculations true? Idk, but stuff like the leaked Sweet Baby Inc. talks about how to include more progressive aspects in a game makes it seem believable enough for most people.
There is (at my guess) a sizable chunk of soulslike fans who want to play Bloodborne but won’t get a console just to do it. The noise around it is likely as loud as it is because it is the only Fromsoft soulslike released after the initial success of Dark Souls that is not available on PC; the other games all either got a PC port (Dark Souls 1-3) or released on PC from the start (Sekiro, Elden Ring).
Difficult to say. Arm is a bit weird when you compare it to x64 CPUs, because it does not have complex instructions (by design), which means that for low-intensity, ‘simple’ workloads an Arm CPU will be vastly more power-efficient. However, the more complicated the workload gets, the more x64 has an advantage due to its specialized instructions.
So for most users, yes, Arm will start being very competitive, since the #1 metric there is battery life. However, for datacenter, workstation and gaming usage, Arm just cannot compete and very likely never will.
As much as I currently prefer AMD processors over Intel, I would hate to see them go. Without serious competition, AMD will just do the exact same thing Intel did before Ryzen dropped. The problem I see now is that if Intel gets into a situation as horrible as the one AMD was in, there are not as many revolutionary concepts out there anymore that would get them out of that hole.
As the other commenter already said, it’s an abundance of caution. Gitea is already moving in the direction of SaaS, and an easily self-hostable solution runs counter to that plan (Gitea is already offering a managed cloud, so this is not a hypothetical). One thing that has already happened is Gitea introducing a Contributor License Agreement, effectively allowing them to change the license of the code at any time.
I’ll be that guy: use Forgejo instead; its main contributor is a non-profit, compared to Gitea’s for-profit owners.
I feel like we’ll get weekly updates just from countries reaching the threshold.
I think you have the wrong idea about what I was referencing. I’m not talking about Cloudflare Tunnels but about their Encrypted Client Hello. While Cloudflare could intercept the initial ClientHello, the rest of the HTTPS traffic is still encrypted between client and server, not between client and Cloudflare. In that sense they have not turned into more of a MitM than they (or any other DNS nameserver) already were anyway. So unless governments decide to completely dismantle the trust chain the internet works on, they won’t be forced to fuck with ECH for anti-piracy either.
But ultimately, anything going over a public DNS server is susceptible to being compromised. We simply trust that the providers don’t do that.
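You can see the ECH part of this for yourself, by the way: the ECH key is published in the domain’s HTTPS DNS record, so a reasonably recent dig can show it (the domain is a placeholder; this only works for sites that actually publish an ECH config):

```
# query the HTTPS resource record; an "ech=..." field means clients can encrypt their ClientHello
dig +short example.com HTTPS
```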
I’m sure this is definitely going to go how the regulator thinks it will go, what with Cloudflare being one of the driving factors behind end-to-end encrypting more and more of the HTTP stack, making it ever harder for ISPs and other third parties to see inside the traffic.
That certainly didn’t help either