

we didn’t ask for LLM slop, thx
under no circumstances is “my favorite stochastic parrot got it right / I reran the prompt and it worked this time hmmmm wonder why” an interesting post
I knew you were a lying promptfondler the instant you came into the thread, but I didn’t expect you to start acting like a gymbro trying to justify their black market steroid habit. new type of AI booster unlocked!
now fuck off
my god imagine being like this
this one was definitely my pleasure
“how can you fools not see that Wikipedia’s utterly inaccurate summary LLM is exactly like digital art, 3D art, and CGI, which are all the same thing and are/were universally hated(???)” is a take that only gets more wild the more you think on it too, and that’s one they’ve been pulling out for at least two years
I didn’t catch much else from their posts, cause it’s almost all smarm and absolutely no substance, but fortunately they formatted it like paragraph soup so it slid right off my eyeballs anyway
why would anyone want to play as an attractive Puerto Rican when peak sexiness has already been achieved
god I looked at your post history and it’s just all this. 2 years of AI boosterism while cosplaying as a leftist, but the costume keeps slipping
are you not exhausted? you keep posting paragraphs and paragraphs and paragraphs but you’re still just a cosplay leftist arguing for the taste of the boot. don’t you get tired of being like this?
holy shit I’m upgrading you to a site-wide ban
so many paragraphs and my eyes don’t want any of them
no problem at all! I don’t think the duplicate’s too much of an issue, and this way the article gets more circulation on both Mastodon and Lemmy.
E: ah, this is from mastodon. I don’t know how federation etc. works.
yep! any mastodon post whose first line looks like a subject line and which tags the community is treated by Lemmy as a new thread in that community. now, you might think that’s an awful mechanism in that it’s very hard to get right on purpose but very easy to accidentally activate if you’re linking and properly citing an article in the format that’s most natural on mastodon. and you’d be correct!
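roughly, as a sketch (this is just an illustration of the behavior, not Lemmy's actual code — the function name and the length cutoff are my own placeholders):

```python
# rough illustration only -- NOT Lemmy's real implementation.
# a mastodon post becomes a new Lemmy thread when it tags the community
# and its first line passes as a subject line.
def as_lemmy_thread(post_text: str, mentions: list[str], community: str):
    """Return (title, body) if the post would spawn a thread, else None."""
    if community not in mentions:
        return None  # community wasn't tagged, so it stays a plain mention

    first_line, _, rest = post_text.partition("\n")
    first_line = first_line.strip()

    # "looks like a subject line": non-empty, reasonably short, not just a link
    if not first_line or len(first_line) > 200 or first_line.startswith("http"):
        return None

    return first_line, rest.strip()
```

which is exactly why citing an article the way that's natural on mastodon (title on the first line, link and commentary underneath) trips it by accident.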
I feel like this article might deserve its own post, because I think it’s the first time I’ve ever seen an attempted counter-sneer. it’s written like someone’s idea of what a sneer is (tpacek swears sometimes and says he doesn’t give a shit! so many paragraphs into giving a shit!) but all the content is awful bootlicking and points that don’t stand up to even mild scrutiny? and now I’m wondering if tpacek’s been reading us and that’s why he’s upset, or if this is what an LLM shits out if you ask it to write critihype in the tone of a sneer
congrats, that’s awesome news!
it’s not pseudoscience unless it’s from the “literally studying ghosts” region of crankery, otherwise it’s just sparkling… actually I don’t know what your point is with all this
I agree, you are fucking done. good job showing up 12 days late to the thread expecting strangers to humor your weird fucking obsession with using LLMs for something existing software does better
imagine if you read the article at all instead of posting 6 paragraphs about an impossible game you’re fantasizing about, that LLMs do nothing to enable because they’re stochastic chatbots and don’t understand game systems (just like you!)
you know it’s weird
I looked for established reviews of Suck Up!, the perfect local LLM game that isn’t local and is barely a game, and I couldn’t find any
all of the hype for this piece of shit that came out in 2023 and made zero impact was from paid influencers and the game’s dev Gabriel spamming reddit on a regular basis
so I guess what I’m trying to say is: fuck off with this shit, we’re not buying
Weird that you’re downvoting me already. Lol
weird that you’re complaining
The game Suck Up! is the perfect example save for the part where the developers chose to run it server-side on release
the perfect example. yeah, this is barely a game and they couldn’t even make it run locally. all of this shit is just an awful tech demo for an expensive gimmick. none of it is fun, nobody plays it. why in fuck are you even here pumping it?
the one that nvidia’s currently pumping as AI is the frame generation one, I believe. upscaling predates the current bubble and is mostly fine — I usually don’t like it outside of very limited use on my steam deck, but that’s personal preference
for an LLM? it’s a heavy GPU-bound workload that’ll tank performance for anything else using the GPU
if you’re considering pasting the output of an LLM into this thread in order to fail to make a point: reconsider