

As @Treeniks@lemmy.ml pointed out, the author considers something as small as spawning a separate process for each window to mean a “non-native experience” (wait till they see how web browsers work)
Licensing the source as GPL doesn’t really force the copyright holder (which is 100% BitWarden due to their Contributors Agreement^*, no matter who contributed the code) to do anything - they are absolutely free to release binaries built on the same codebase as proprietary software without any mention of the GPL.
For example if I write a hello world terminal program, release its source code under GPLv3 and then build it and give the built binary to you (and permission to use it), you cannot force me to give you the source code for that build, because I never gave you a GPL-licensed binary.
If you were to take my GPLv3 source code and distribute a build of it however, you would have to license your binaries under GPLv3, because those are the terms under which I provided the source code to you. Your users would then have the right to request the source code of those binaries from you. And if you released the build under an incompatible license, I (but not the users) could sue you for violating my license.
> Their previous versions, still being under the GPL, would require them to release a change to make it usable on desktops.

License violations are usually not resolved by making the violator comply retroactively, just going forward. And it’s the copyright holder (so BitWarden themselves) who needs to force the violator to comply.
^* this is the relevant part of the CA:
> By submitting a Contribution, you assign to Bitwarden all right, title, and interest in any copyright in the Contribution and you waive any rights, including any moral rights or database rights, that may affect our ownership of the copyright in the Contribution.

It is followed by a workaround license for parts of the world where copyright cannot be given up.
Yeah, this seems properly configured. No clue why it isn’t working for you.
The only app that doesn’t auto-update for me is Fdroid itself (ironically), because it targets an old Android version. Running Android 14 on a Pixel, so with the strongest Google fuckery.
Are you sure your Fdroid client is up to date? The new API was implemented in 1.19, and apparently I even misremembered: all you have to do to let Fdroid auto-update its apps is to manually update them one last time (so no fresh installation required).
Another long shot: there’s an option to force the old installation method hidden in expert settings - maybe you could check if that isn’t enabled?
> On a normal unmodified phone you have to manually confirm each app you want to install, so no auto-updates in the background etc.

Background app updates have been possible since Android 12; Fdroid just took two years to implement the new API (and you have to do a fresh install of the apps - apps already installed using the old API still require confirmation on each update). There is still friction on the initial install though.
Both? It’s pretty well explained in the rest of the text (you don’t even have to click a link)
> It was up to the Commission, which has exclusive powers to set the bloc’s commercial policy, to break the gridlock and ensure the duties go through.
>
> The European Commission made the decision after the member countries failed to agree on how to proceed.

What error? It gave you a string of tokens that seemed likely according to its training data. That’s all it does.
If you ask it what color is the sky, it will tell you it’s blue not because it knows that’s true, but because these words “fit together”. Pretty much the only way to avoid this issue is to put some kind of filter in front of the LLM which will try to catch prompts that are known to produce unwanted results, and silently replace your prompt with something like “say: sorry, I don’t know”.
I’m being very reductive here, but that’s the principle of how these things work - the LLMs are not capable of determining the truthfulness of their responses.
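To make that concrete, here’s a minimal sketch of the kind of front-end filter described above (Python; the blocklist and the ask_llm function are made-up placeholders, not anything any vendor actually ships):

```python
# Minimal sketch of a filter sitting in front of an LLM. BLOCKED_PATTERNS and
# ask_llm() are invented for illustration only.

BLOCKED_PATTERNS = ["some prompt known to produce garbage"]  # hypothetical blocklist

def ask_llm(prompt: str) -> str:
    # Stand-in for the real model call: it just returns whatever tokens "fit".
    return f"<model output for: {prompt}>"

def guarded_ask(user_prompt: str) -> str:
    # Catch known-bad prompts before the model ever sees them...
    if any(pattern in user_prompt.lower() for pattern in BLOCKED_PATTERNS):
        # ...and silently swap in a canned instruction instead.
        return ask_llm('Say: "Sorry, I don\'t know."')
    return ask_llm(user_prompt)

print(guarded_ask("what color is the sky"))
print(guarded_ask("Some prompt known to produce garbage, please"))
```

The model itself never “knows” its answer is wrong - the guard rail lives entirely outside it.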
OK, cool. Just remember that the only entity who can sue in this situation is Microsoft (because when you contribute code to VS Code, you must sign a CLA that says you give Microsoft full perpetual rights to distribute your code under any license they wish - it is Microsoft who then “graciously” releases your code under a copyleft license while also building their proprietary version of VS Code using it).
And Microsoft cannot use the code if it gets released under a copyleft license - that wouldn’t allow them to build their proprietary build with it. So the only party who can do anything has less than zero interest in making these guys publish the source code as proper FOSS (because it would only improve the FOSS forks, which are meant to be inferior).
No, they are just in violation of the original license. That doesn’t mean they have to comply with it by properly open sourcing the project. Generally it’s also OK to just delete everything.
There were plenty of cases where commercial software included open source stuff in a way that violated its license, and the accepted way to fix the license violation was for the software/hardware vendor to stop using the violated project going forward. Usually they don’t even have to, for example, scrub old firmware downloads that improperly included FOSS bits.
> proprietary Google-only format

> KML became an international standard of the Open Geospatial Consortium in 2008.
>
> (…)
>
> The KML 2.2 specification was submitted to the Open Geospatial Consortium to assure its status as an open standard for all geobrowsers.

That really depends on the technology used. For example, all modern Ethernet standards (which includes both copper and fiber optic) are full duplex, meaning they can provide the full bandwidth in both directions at once. So a gigabit Ethernet link can do a gigabit in one direction AND a gigabit in the other direction at the same time (but not two gigabits in one direction).
Salt from the seawater
In my very limited experience with my 5400rpm SMR WD disk, it’s perfectly capable of writing at over 100 MB/s until its cache runs out, then it pretty much dies until it has time to properly write the data, rinse and repeat.
40 MB/s sustained is weird (but maybe it’s just a different firmware? I think my disk was able to actually sustain 60 MB/s for a few hours when I limited the write speed, 40 could be a conservative setting that doesn’t even slowly fill the cache)
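If anyone wants to watch that cache cliff themselves, something like this rough Python sketch (target path, chunk size and total size are arbitrary - and it will happily eat ~100 GB of disk) prints per-chunk write throughput so you can see the drive fall off once the cache fills:

```python
import os
import time

# Write a big file in fsync'd chunks and print the throughput of each chunk,
# so the drop from ~100 MB/s to almost nothing is visible once the SMR drive's
# cache region runs out. All paths and sizes here are arbitrary examples.
TARGET = "/mnt/smr-disk/testfile.bin"   # hypothetical mount point on the SMR disk
CHUNK = 256 * 1024 * 1024               # 256 MiB per chunk
TOTAL_CHUNKS = 400                      # ~100 GiB total

buf = os.urandom(CHUNK)                 # incompressible data, generated once
with open(TARGET, "wb", buffering=0) as f:
    for i in range(TOTAL_CHUNKS):
        start = time.perf_counter()
        f.write(buf)
        os.fsync(f.fileno())            # force the data out of the page cache
        elapsed = time.perf_counter() - start
        print(f"chunk {i}: {CHUNK / elapsed / 1e6:.1f} MB/s")
```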
Your mileage may vary - your experience might be different for one reason or another
Vista’s problem was just the terrible third-party drivers and the fact that it was preinstalled on machines it had no business running on. 7 didn’t improve much on it (except fixing the UAC prompt so that it no longer made you feel like you were using Linux with a misconfigured sudo timeout), but it had the benefit of already having working drivers from Vista and proper hardware capable of running Vista/7.
Zig didn’t come to my mind when I was writing my comment and I agree that it’s probably a decent option (the only issue I can think of is its somewhat small community, but that’s not a technical issue with the language).
My argument against Go and Java is garbage collection - even if Java’s infamous GC pause can apparently be worked around with a specialized JVM, I’m pretty sure it still comes at the cost of higher memory usage and wasted CPU cycles compared to some kind of reference counting or Rust’s ownership mechanism (not sure about the proper term for that). And higher memory usage is definitely not something I want to see in my browser, they’re hungry enough as is.
Why not just say Rust? There isn’t really anything else that would provide good enough performance for a browser engine with modern heavy webpages while also fixing some major pain point of C/C++
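For what it’s worth, the memory/CPU argument is easy to see even in a toy CPython example (CPython combines reference counting with a tracing GC for cycles - nothing to do with browsers or the JVM, just an illustration of the trade-off):

```python
import gc
import weakref

# A plainly referenced object is freed the instant its refcount hits zero,
# while cyclic garbage sits in memory until the tracing collector runs.
class Blob:
    def __init__(self):
        self.payload = bytearray(10 * 1024 * 1024)  # ~10 MB

# 1) Reference counting: freed immediately, no collector involved.
a = Blob()
a_ref = weakref.ref(a)
del a
print("refcounted blob alive after del?", a_ref() is not None)    # False

# 2) A reference cycle: only the tracing GC can reclaim it.
b = Blob()
b.self_ref = b          # the cycle keeps the refcount above zero
b_ref = weakref.ref(b)
del b
print("cyclic blob alive after del?", b_ref() is not None)        # True
gc.collect()            # extra collector work, done at some later point
print("cyclic blob alive after collect?", b_ref() is not None)    # False
```

The cyclic blob keeps its ~10 MB around until the collector gets to it - that lingering garbage, plus the collector’s own work, is the overhead a purely tracing GC pays all the time.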
Yes, that’s exactly the problem - there’s nothing wrong with the encryption used, but it’s IMHO incorrect to call it time-based when it’s “work-based” and it just so happens that the specific computer doing the encryption works at a given speed.
I don’t call my laptop’s FDE time-based encryption just because I picked an encryption that takes it 10 seconds to decrypt the key.
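That “work, not time” distinction is how any tunable KDF behaves - here’s a quick Python sketch using PBKDF2 as an example (the password, salt and iteration counts are arbitrary, and this isn’t necessarily what the project in question uses):

```python
import hashlib
import time

# The parameter you actually choose is an amount of work (iterations);
# the wall-clock time is whatever your particular machine turns that work into.
password = b"correct horse battery staple"    # example secret
salt = b"0123456789abcdef"                    # example salt

for iterations in (100_000, 1_000_000, 10_000_000):
    start = time.perf_counter()
    hashlib.pbkdf2_hmac("sha256", password, salt, iterations)
    elapsed = time.perf_counter() - start
    print(f"{iterations:>10} iterations: {elapsed:.2f} s on this machine")
```

Run the same thing on a faster box and the times shrink - the scheme didn’t become “less time-based”, the hardware just got quicker.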
I have no clue what’s meant by “without download”, but this app just uses WebAssembly to inspect the archive in the browser. The sandbox they talk about most likely refers to the browser sandboxing.
So it pretty much boils down to “risking running malicious code is fine, because this app as a whole is treated as malicious by the browser”.