• 0 Posts
  • 5 Comments
Joined 3 months ago
Cake day: April 2nd, 2024

  • JS, or really anything you’d build a web app in (I use Rust with something like Dioxus/Yew/Leptos/Tauri); C#/.NET (I use F# because OO-style languages are an ugly, hot mess, especially C# and Java); Java/JVM (I use Scala whenever I can and Kotlin otherwise); or C++ with GTK or Qt. There are a lot of options, but obviously anything that isn’t C++ or web is going to give you a lackluster experience (though I have a thing against web apps and will jump through a lot of hoops to have my application use a native interface).
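
    For a rough idea of what the Rust route looks like, here’s a minimal desktop counter sketch, assuming a Dioxus 0.6-style API with the “desktop” feature enabled; the entry point and macros shift between versions, so treat it as illustrative rather than definitive:

    ```rust
    // Minimal Dioxus desktop sketch (assumes dioxus 0.6 with the "desktop"
    // feature enabled in Cargo.toml; API details differ across versions).
    use dioxus::prelude::*;

    fn main() {
        // Opens a desktop window (Dioxus desktop renders via the system webview).
        dioxus::launch(App);
    }

    #[component]
    fn App() -> Element {
        // A small piece of state to show event handling.
        let mut count = use_signal(|| 0);

        rsx! {
            h1 { "Count: {count}" }
            button { onclick: move |_| count += 1, "Increment" }
        }
    }
    ```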


  • I can’t imagine most Nvidia employees don’t make enough to become millionaires within like 5-10 years, if they aren’t already. Their entry-level software engineering positions have a base pay of $147K and total compensation of $180K. The lowest-paying level of senior engineers gets more like $300K… Even the ones who leave before then are highly likely to get a job with comparable pay or benefits, considering they have Nvidia on their résumé.

    Now, tens-of-millionaires? I don’t think most employees get there.


  • Sounds like you’re mixing up AI with AGI and have no idea what you’re talking about, like 99% of the people on the internet who suddenly act like they’re data science experts. This article is just taking advantage of the fact that people like you don’t know what “AI” means to get clicks by misdirecting you with improperly worded claims. “True AI” doesn’t mean anything.

    Also, the term “AI” to describe complex algorithms existed long before the technology ever hit the capitalist market. You literally just made that part up. One of the guys who coined it, John McCarthy, was one of the most important computer scientists of all time and also a cognitive scientist; he’s the same guy who invented garbage collection and Lisp. Another of the people who coined the term was Claude Shannon, who is widely considered the father of information theory and laid the foundation for the Information Age. The others who participated in coining it include the person who wrote the first assembler and designed the first mass-produced computer, and the guy who proposed the theory of bounded rationality. The people who coined “AI” and founded the field were pretty much Turing’s successors, not people looking to “sell you shit”.


  • For at least 1440p 144 fps on high or ultra, depending on your budget: preferably an RX 6950 XT, RX 6900 XT, or RX 6800 XT, or less preferably an RX 6800. You could also get by with an RX 6700 XT or RX 6750 XT for many games, and get the same performance in more demanding newer games if you turn the graphics down a little, although I’d recommend those more for 1080p.

    For more expensive options, the RTX 4090, RTX 4080 Super, RX 7900 XTX, RTX 4070 Ti Super, RX 7900 GRE, and RX 7800 XT (probably the best value-for-price GPU) are pretty much your choices.

    I think the 6950 XT and 7800 XT are the most worthwhile for most people to upgrade to if they’re not looking to spend an absurd amount of money on high-performance cards. But obviously an overpriced 4090 or something is going to be significantly better and more future-proof in any scenario.


  • Definitely not a third. A $500 USD Xbox Series X or PS5 has maybe about the same performance as a ~$600-650 PC in the current market. They sell at a small loss (or used to) because the manufacturers intend to get significantly more back from you via subscription payments. Most people want to actually be able to play the games they paid for online, or use basic online services, so after like 5 years you’ve already spent another $300 (Xbox) to $500 (PlayStation), assuming you buy the cheapest option annually.

    On console you also have significantly less choice in peripherals and pay more for games, which adds up to a lot of extra money for most people. With a PC you can spend way less to get the functionality you need.
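
    Just to put rough numbers on the subscription point, here’s a quick back-of-the-envelope sketch using the ballpark figures above (not current prices, and ignoring games and peripherals):

    ```rust
    // Rough 5-year cost-of-ownership sketch; the hardware and subscription
    // figures are the ballpark numbers from this comment, not real quotes.
    fn five_year_cost(hardware: f64, yearly_online_sub: f64) -> f64 {
        hardware + yearly_online_sub * 5.0
    }

    fn main() {
        println!("Xbox Series X: ~${}", five_year_cost(500.0, 60.0));  // ~$800
        println!("PS5:           ~${}", five_year_cost(500.0, 100.0)); // ~$1000
        println!("Comparable PC: ~${}", five_year_cost(650.0, 0.0));   // ~$650, no online fee
    }
    ```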

    Plus, if you like pirating, you can count that as a few hundred dollars in savings on games… seeing as you don’t pay for them and all.