Sam “wrong side of FOSS history” Altman must be pissing himself.
Direct Nitter Link:
https://nitter.lucabased.xyz/jiayi_pirate/status/1882839370505621655
yeah, and it’d be a pretty fucking immense undertaking, since it’d mean rewriting the driver and the application code and everything in between (scheduling, etc etc). again, it’s not impossible, and there’s been significant headway across multiple parts of industry to make this kind of thing more achievable… but it’s also an extremely niche, extremely focused, hard-to-port piece of work, and I suspect that if they had actually done this they’d be shouting about it loudly in every possible PR outlet
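(for scale: even one hand-placed instruction means dropping below CUDA C++ into inline PTX. here’s a minimal sketch — entirely hypothetical, my own toy kernel and not anything from the claims being made — of a single fused multiply-add written as PTX. now multiply this by scheduling, memory movement, and an entire training stack.)

```cuda
// hypothetical toy example: one instruction hand-written in PTX
#include <cstdio>
#include <cuda_runtime.h>

__global__ void fma_ptx(const float* a, const float* b, const float* c,
                        float* out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    float r;
    // fused multiply-add emitted directly as PTX; nvcc would generate the
    // same instruction from `a[i] * b[i] + c[i]` anyway -- which is the
    // point: beating the compiler at this layer takes sustained effort
    asm("fma.rn.f32 %0, %1, %2, %3;"
        : "=f"(r)
        : "f"(a[i]), "f"(b[i]), "f"(c[i]));
    out[i] = r;
}

int main() {
    const int n = 1 << 20;
    float *a, *b, *c, *out;
    cudaMallocManaged((void**)&a, n * sizeof(float));
    cudaMallocManaged((void**)&b, n * sizeof(float));
    cudaMallocManaged((void**)&c, n * sizeof(float));
    cudaMallocManaged((void**)&out, n * sizeof(float));
    for (int i = 0; i < n; i++) { a[i] = 1.0f; b[i] = 2.0f; c[i] = 3.0f; }

    fma_ptx<<<(n + 255) / 256, 256>>>(a, b, c, out, n);
    cudaDeviceSynchronize();
    printf("out[0] = %f\n", out[0]);  // expect 5.0

    cudaFree(a); cudaFree(b); cudaFree(c); cudaFree(out);
    return 0;
}
```

and that’s the trivial case, where the compiler already does the right thing for free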
a look at every other high-optimisation field, from the mechanical sympathy crowd that came out of HFT all the way through to where that’s gotten to in modern usage of FPGAs in high-perf runtime environments, also gives a good backgrounder on the kind of effort cost involved in this shit, and thus gives me some extra reasons to doubt the claims kicking around (along with the fact that everyone seems to just be making shit up)
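(for a concrete taste of what “mechanical sympathy” means on GPUs specifically — again a hypothetical sketch of mine, nothing to do with anyone’s actual claims — compare a coalesced copy against a strided one. the strided version is dramatically slower on basically any NVIDIA part, and the discipline is hunting down thousands of these across a codebase.)

```cuda
// hypothetical illustration: coalesced vs strided global-memory access
#include <cstdio>
#include <cuda_runtime.h>

__global__ void copy_coalesced(const float* in, float* out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) out[i] = in[i];  // adjacent threads read adjacent addresses
}

__global__ void copy_strided(const float* in, float* out, int n, int stride) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    // scattered reads defeat coalescing; each warp touches many cache lines
    if (i < n) out[i] = in[(1LL * i * stride) % n];
}

int main() {
    const int n = 1 << 24;
    float *in, *out;
    cudaMalloc((void**)&in, n * sizeof(float));
    cudaMalloc((void**)&out, n * sizeof(float));
    cudaMemset(in, 0, n * sizeof(float));

    dim3 block(256), grid((n + 255) / 256);

    // warm-up launch so context/init cost doesn't pollute the timing
    copy_coalesced<<<grid, block>>>(in, out, n);
    cudaDeviceSynchronize();

    cudaEvent_t t0, t1;
    cudaEventCreate(&t0); cudaEventCreate(&t1);
    float ms;

    cudaEventRecord(t0);
    copy_coalesced<<<grid, block>>>(in, out, n);
    cudaEventRecord(t1);
    cudaEventSynchronize(t1);
    cudaEventElapsedTime(&ms, t0, t1);
    printf("coalesced: %.3f ms\n", ms);

    cudaEventRecord(t0);
    copy_strided<<<grid, block>>>(in, out, n, 4097);
    cudaEventRecord(t1);
    cudaEventSynchronize(t1);
    cudaEventElapsedTime(&ms, t0, t1);
    printf("strided:   %.3f ms\n", ms);

    cudaEventDestroy(t0); cudaEventDestroy(t1);
    cudaFree(in); cudaFree(out);
    return 0;
}
```

both kernels move the same bytes; the only difference is access pattern, and that’s the whole game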
yeah, would you look at this https://www.tomshardware.com/tech-industry/artificial-intelligence/deepseek-might-not-be-as-disruptive-as-claimed-firm-reportedly-has-50-000-nvidia-gpus-and-spent-usd1-6-billion-on-buildouts
yep, a completely normal amount of non-specialist hardware that basically everyone has in their back shed. you just don’t turn it on all the time because the neighbours keep complaining about the fan noise. practically anyone could do this!