• DdCno1@beehaw.org · 22 hours ago

    He’s an ass-kisser, but the company is doing excellently under his watch, and it also treats its employees quite a lot better than most of Silicon Valley. Bad Linux drivers alone don’t make a company bad.

    • Lucy :3@feddit.org · 22 hours ago

      CES didn’t cast a good light on their marketing, and the whole sector is a ripoff.

      • DdCno1@beehaw.org · 20 hours ago

        Nvidia is active in more than just one sector, and love them or hate them, they are dominant in consumer graphics cards (because they are by far the best option there, with both competitors tripping over their own shoes at nearly every turn), professional graphics cards (ditto), automotive electronics (ditto) and AI accelerators (ditto). The company made a number of very correct and far-reaching bets on the future of GPU-accelerated computing a few decades ago, which are now all paying off big time. While I am critical of many if not most aspects of the current AI boom, I would not blame them for selling shovels during a gold rush. If there is one company in the world that has gotten a business model built around AI right, it’s them. Even if, say, the whole LLM bubble bursts tomorrow, they’ve made enough money to continue their dominance in other fields.

        A few of their other bets were correct too: building actually productive and long-lasting relationships with game developers, spending far more on decent drivers than anyone else, and correctly predicting two industry trends very early on that are now both out in full force, by making sure their silicon puts a heavy emphasis on supporting both ray tracing and upscaling. They were earlier than AMD and Intel, invested more resources into these hardware features while also providing better software support - and crucially, they encouraged developers to make use of these features, which is exactly the right approach. Yes, it would have been nicer of them to open-source e.g. DLSS like AMD did with FSR, but the economic incentives for that approach unfortunately aren’t there.

        The marketing claim that the 5070 can keep up with the 4090 is misleading, but there’s a method to the madness. The three synthetic frames the GPU now creates (instead of just one) are not fully equivalent to natively rendered frames, but the frame generation looks far better than it has in the past, to the point that most people will probably not notice it. Thanks to motion reprojection similar to tech previously found on VR headsets, but with the screen edges now being AI-generated, it has also reached a point where it can have a positive impact on input latency instead of merely making games appear more fluid. Still, it would have been more honest to say that the “low-end” (at $600 - thanks, scalpers!) model of the new lineup is more realistically half as fast as the previous flagship, but I guess they felt that wasn’t bombastic enough. Huang isn’t just an ass-kisser, he’s also too boastful for his own good. The headlines wrote themselves though, which is likely why they were fine with bending the truth a little.
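        The marketing math is easy to reproduce on the back of an envelope. A minimal sketch, with entirely made-up illustrative frame rates (not benchmarks): a card that renders natively half as fast can still display the same number of frames per second if it generates three frames per rendered frame while the older card generates only one.

```python
# Illustrative sketch of the "5070 keeps up with the 4090" claim.
# All frame rates below are hypothetical assumptions, not measurements.

def displayed_fps(rendered_fps: float, generated_per_rendered: int) -> float:
    """Frames shown per second when each natively rendered frame is
    followed by N AI-generated frames (1 rendered + N generated)."""
    return rendered_fps * (1 + generated_per_rendered)

# Assumed native rendering rates (hypothetical):
old_flagship_native = 60.0  # renders twice as fast, 2x frame generation
new_card_native = 30.0      # renders half as fast, 4x frame generation

print(displayed_fps(old_flagship_native, 1))  # 120.0 displayed fps
print(displayed_fps(new_card_native, 3))      # 120.0 displayed fps: "keeps up"
```

        Same displayed frame rate, half the natively rendered frames - which is exactly why the comparison is both defensible on paper and misleading in practice.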

        Yes, their prices are high, but if there’s one thing they learned during COVID, it’s that there are more than enough people willing and able to pay through the nose for anything that outputs an image. If you can sell the same number of units at $600 as at half the price, it makes no sense to sell them for less. Hell, it could even be legally dangerous for a company with this much market share.

        I know this kind of upscaling and frame generation tech is unpopular with a vocal subset of the gaming community, but if there is one actually useful application of AI image generation, it’s using these approaches to make games run as well as they should. It’s not like overworked game developers can magically materialize more frames otherwise - more realistically, we would be back to frame rates in the low 20s, like during the early Xbox 360 and PS3 era, rather than having everything run at 4K/120 natively. This tech is here to stay; it’s downright needed to get around the diminishing returns that have been plaguing the games industry for a while, where every small advance in visual fidelity has to be paid for with a high cost in processing power.

        I know, YOU don’t need fancy graphics, but as expensive and increasingly unsustainable as they are, they have been a main draw of the industry for almost as long as it has existed. Developers have always tried to make their games look as impressive as they possibly could on the available hardware - hell, many have even created hardware specifically for the games they wanted to make (that’s one way to sum up much of the history of arcade cabinets). Upscaling and frame generation are perhaps a stepping stone towards finally cracking, once and for all, the elusive photorealism barrier developers have been chasing for decades.

        The usual disclaimer before people accuse me of being a mindless corporate shill: I’m using AMD CPUs in most of my PCs, I’m currently planning two builds with AMD CPUs, and the Steam Deck shows just how great an option even current AMD GPUs can be. I was on AMD GPUs for most of my gaming history until I switched to Nvidia when the PC version of GTA V came out, because back then, it was Nvidia offering more VRAM at competitive prices - and I wanted to try out doing stuff with CUDA, which is how they have managed to hold me captive ever since. My current GPU is an RTX 2080 (which I got used for a pittance - they haven’t seen any money from me directly since I bought a new GTX 960 for GTA V), and they can hype up the 50 series as much as they want with more or less misleading performance graphs; the ol’ 2080 is still doing more than fine at 1440p, so I won’t be upgrading for many years to come.