Want to wade into the snowy surf of the abyss? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid.
Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.
Any awful.systems sub may be subsneered in this subthread, techtakes or no.
If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.
The post-Xitter web has spawned so many “esoteric” right-wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality-challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be).
Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.
(Credit and/or blame to David Gerard for starting this.)


A lesswronger asks: are rationalfic protagonists the baddies? https://www.lesswrong.com/posts/FuGfR3jL3sw6r8kB4/richard-ngo-s-shortform?commentId=uDuzmfMEvEqpyApLhtldr; rationalfic has a very common trend of the protagonist gaining and using overwhelming power to radically reform the world. This is almost always (with a few notable exceptions) portrayed as a clearly, unambiguously good thing.
My take: Don’t get me wrong, the Wizarding World (for example), as canonically portrayed, needs some very strong reforms if not an entire revolution. But rationalfic almost never portrays the slow, hard work of building support networks and alliances and developing a materialist theoretical understanding of how to reform society; instead, a lone rationalist hero (or small friend group) finds some overwhelming magical or technological advantage they can use to single-handedly take control and unilaterally fix everything with their rationalist intellect. Part of it is the normal disconnect between fiction and the real world, where it is more narratively satisfying (and easier to write) to have a central protagonist who solves the major problems or is at least directly involved with them; rationalfic just gives that protagonist even more agency than they canonically have. The problem is that rationalists take this attitude back into real life, and so end up idolizing mythologized techbro billionaires, venture capitalists, or the myth of the lone genius scientist/inventor.
Also, quality sneer in the replies, “rational” teletubbies: https://tomasbjartur.bearblog.dev/rational-teletubbies/
there’s a lot i want to pull out from this comment by ngo
first, shorn of context, i don’t know that this sort of power fantasy reflects so poorly on the rationalists. or perhaps it does, and in that case also reflects poorly on me, since it’s my preferred power fantasy. the world sucks and it would be nice to magically make it better. EDIT: ok im noticing that the naruto fanfiction excerpt is just straightforwardly jerking off about doing a fascist coup
second, we must remember that rat stories are implicitly either recipes for social change or warnings that society ought to stay away from particular demons. rationalism is in large part a political movement with what they believe to be practical aims
third, if ngo’s marxist fiction from the 1800s all ended with communist revolutions, the worrying thing for a member of the movement would not be a fantasy of triumph or a sense of certainty of triumph, but rather an inability to connect triumphant outcomes to action under the present conditions. as you highlighted, the fantastical element of these stories is in conflict with the practicality of their aims
fourth, as far as i can tell, that is not ngo’s objection at all. what he seems to be concerned about is the possibility that rationalists will make serious progress on actually taking over the world and make terrible things happen once they do. i don’t take this possibility seriously at all. fundamentally, rationalists are lapdogs, forever licking the negligently outstretched hands of billionaires. they cause real harm as lackeys of the ultrawealthy and vectors for the diseases of racism, eugenics, etc, but to take ngo’s concerns seriously i would have to buy into the same fantasy of magical omnipotence he’s pointing to, because there seems to be no other path from here to rationalist dictatorship.
Your first point is true… with the key words being “shorn of context”. When you look at how many ratfics go in that direction your second and fourth points become problems.
As to your fourth point… the techbro billionaires like Elon or Peter Thiel do like referencing fiction (often in hamfisted or ignorant ways that make me think a bit of fandom gatekeeping actually is good sometimes… i.e. naming your surveillance company Palantir, or naming one of your kids a nonsensical WH40K reference). So I wouldn’t entirely discount the possibility of rationalists providing a bit of inspiration to the billionaires in between the bootlicking. And although there may not be a magical “coup the government” power in real life, the influence they are trying to focus on themselves and harness is still worrying.
Part of what makes the RatFic version of this so weird imo is that despite being ostensibly rooted in relatively low-hanging fruit (e.g. what if we industrialized this pre-modern setting, what if we rationally looked at the rules of this magic system, etc.) nobody other than the protagonist has ever thought about these things, and even once the protagonist starts demonstrating some real world-conquering results (benevolently, of course) nobody ever really seems to want to copy their successes. Part of what made the actual industrial revolution unfold the way it did was the ensuing arms race: in addition to making the lines on various economists’ charts go nearly vertical, it also basically culminated in the first world war, which seems like the kind of event they should be aware of. But of course in RatFic it seems like anyone who can’t be talked around to joining up with our protagonist is too weak or woke or stupid to actually pose a threat to the Glorious March of Rational Progress.
Once you commit to the idea that only your main characters have ever tried to study magic scientifically, you’re locked in to making all the rest of the magical world into dullards. (Really, no other eleven-year-olds were ever into computer programming, chemistry sets, exotic marine animals, outer space, or dinosaurs?) Or, to look at it another way, the only way you can find the premise plausible is if you’re already inclined to dismiss most of humanity as “NPCs”.
Being the kind of writer I am, whenever this comes up I am tempted to suggest ways it could have been done better. But, first, I am not glazing the work of Rowling, even indirectly, no way, no how. Fuck her for all the pain she has wrought, and fuck the whole LessWrong crew for tacitly accepting it. Second, HPMoR was cult shit all along, not meant to teach science but to sow distrust of scientists under the glossy sheen of being able to name the six quarks.
The premise kind of does work in a setting like Harry Potter: the wizarding world is insular enough that a clever kid could bring in some new ideas. The problem is Eliezer wanted to throw in too many shortcuts. It’s not enough for creativity with transmutation to give the protagonist a small edge; transmutation is made into the ultimate all-purpose spell so the protagonist can exploit it more easily. The protagonist isn’t just moderately better at the Patronus thanks to some muggle psychology; his patronus can kill dementors. And the philosopher’s stone is changed into some ancient Atlantean super-magic, because fuck wizards ever inventing anything, and instead of the moderate rate of its typical mythological powers it does super-transmutation.
Rowling went mask-off transphobe in 2018; HPMOR finished in 2015. So I won’t blame Eliezer for not picking a different fandom at the time. Eliezer has actually made moderately supportive comments, including about using people’s preferred pronouns (we’ve mocked another lesswronger for writing long screeds complaining about this). In general, I think the average lesswrong attitude towards trans people is better than the average American’s attitude… but that is because the bar is in hell. But yeah, I’ve seen plenty of shitty takes towards trans people on lesswrong.
Yep. And it didn’t even stick to its premise of “try to do science to magic and compare muggle scientifically-gained knowledge to magic”; instead it went into some Ender’s Game pastiche, followed by Death Note-style “I know you know I know” plotting, and then Harry gets all the magical power handed to him at the end of the story thanks to Dumbledore following some insane combination of prophecies.
I was never on twitter. Can you point me to a concise list of what Rowling said and when she said it?
Most social media posts and online news stories just talk around what she said.
This article is where I was getting the 2018 date from: https://theweek.com/feature/1020838/jk-rowlings-transphobia-controversy-a-complete-timeline
And since in that 2018 incident Rowling was trying to backpedal/downplay it, I assume that before then she was keeping the mask firmly on.
I forgot that she published novels alluding to anti-trans tropes under a male penname! Sorry Joanne “Robert Galbraith” Rowling, facts don’t care about your feelings.
The thread for collecting HPMoR sneers linked to this timeline, but it’s paywalled now:
https://www.vox.com/culture/23622610/jk-rowling-transphobic-statements-timeline-history-controversy
To be clear, I don’t care about Yud picking the fandom he did at the time (apart from the cheapness of “playing on easy mode” and the blatant attempt to ride popularity for propagating his cult shit). What strikes me is the silence during the time when other people are most definitely reacting:
https://awful.systems/post/5169331/8248381
Yeah. And it is not like Eliezer usually holds himself back from throwing out hot takes or inserting himself into conversations he is tangentially relevant to, so the silence is conspicuous in this case.
I have also occasionally been tempted to try and get a Goncharov thing going, where everyone collectively recalls that Tommy Berry and the Forevernight Forest got them into reading.
That’s all I wrote in the thread that prompted me to take a stab. Oh, I think I had decided that the girl’s name is Elfriede? And the principal of magic school is nonbinary.
Fuck it, I’m good to Gonch this out.
I forget, did we ever actually learn who the killer was in Murder at Wizard University? I remember it kept coming up through the first book as a kind of motif for how this new world wasn’t necessarily as safe and clean as Tommy expected, but I think that whole business with the Thoughtknot ended up overshadowing it before the actual killer was revealed. Like, I get it thematically or whatever but it just stuck in my head as a loose thread and has bugged me for years.
Subscribed
They also don’t want to believe in chaos theory. This post tries to explain it to them, but check out Gwern in the comments being skewered by a book Freeman Dyson wrote around the time Gwern was born. They want the future to be perfectly predictable (even though Yud says that 1 and 0 are not probabilities), and they don’t like game theory, repeated games, or non-zero-sum games, because those reward sustaining trust over building trust and then violating it.
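If you want the repeated-games point made concrete, here’s a minimal Python sketch; the payoffs are the standard prisoner’s dilemma numbers, and the strategies are my own toy examples, not anything from the linked post:

```python
# Toy iterated prisoner's dilemma: repeated games reward sustained trust,
# and "build trust, then betray" buys almost nothing. Standard payoffs:
# mutual cooperation 3, mutual defection 1, sucker 0, temptation 5.

PAYOFF = {  # (my move, their move) -> my score; C = cooperate, D = defect
    ("C", "C"): 3, ("C", "D"): 0,
    ("D", "C"): 5, ("D", "D"): 1,
}

def tit_for_tat(opponent_history):
    """Cooperate first, then mirror the opponent's previous move."""
    return opponent_history[-1] if opponent_history else "C"

def trust_then_betray(opponent_history, rounds=200):
    """Cooperate to build trust, then defect for the last 10 rounds."""
    return "D" if len(opponent_history) >= rounds - 10 else "C"

def play(strat_a, strat_b, rounds=200):
    hist_a, hist_b = [], []  # each strategy sees the *other's* past moves
    score_a = score_b = 0
    for _ in range(rounds):
        a, b = strat_a(hist_b), strat_b(hist_a)
        score_a += PAYOFF[(a, b)]
        score_b += PAYOFF[(b, a)]
        hist_a.append(a)
        hist_b.append(b)
    return score_a, score_b

print(play(tit_for_tat, tit_for_tat))        # (600, 600): mutual trust pays best
print(play(tit_for_tat, trust_then_betray))  # (579, 584): one 5-point windfall,
                                             # then retaliation drags both below 600
```

The betrayer “wins” the head-to-head by five points and still finishes 16 points poorer than it would have by staying trustworthy the whole way, which is exactly the part the manipulation fantasy skips.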
Gwern’s been updating those comments! This was in 2023, and in 2025 he was still so mad about it that he wrote a list of ways to cheat at pinball and edited the comment to add a link.
The attitude that you can substitute a bunch of cheap tricks and hacks for engaging with fundamentally difficult problems reminds me of the techbro attitude that leads to stuff like pushing fundamentally non-viable technologies (like Theranos or the LLM boosters), or of DOGE asking an LLM how to cut DEI.
I feel so sad because so many of his examples are ways to make people think you won. And we are about to see what happens when you take 20-30% of global fossil fuel and helium production offline for six months to a few years. Public relations and cooking the books can’t change that.
It’s easy to make people believe you are wise and know the future. But there is no way to predict the weather one month out much better than we can now, and if you plant your crops and the sun scorches them, those crops are dead and you have to wait until next season to replant.
On a purely rhetorical point, it seems like the whole counterargument from Gwern is just an argument-by-disorganization or something to that effect. He doesn’t actually challenge the factual information presented, but does shift how those facts are framed and what the actual contention is in the background, and then avoids actually engaging with the new contention from the bottom up.
In a lot of discussions with singularity cultists (both pros and antis) they assume that a true superintelligence would render the whole universe deterministically predictable to a sufficient degree to allow it to basically do magic. This is how the specifics of “how and why does the AI kill all humans again?” tend to be elided, for example. This same kind of thinking is also at the heart of their obsession with “superpredictors” who can, it is assumed, use some kind of trick to beat this kind of mathematical limit in certainty (this is the part where I say something about survivorship bias). In the context of that discussion, the fact that a relatively simple arrangement of components following relatively simple, deterministic rules is still not meaningfully predictable past a dozen or so sequential events, due to the magnification of the inevitable error in our understanding of the initial circumstances, is a logical knockout.
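To see that error magnification for yourself, here’s a minimal sketch: the textbook logistic map rather than an actual pinball table, but the mechanism (exponential growth of any initial measurement error under a fully deterministic rule) is the same one the linked post describes:

```python
# Chaos from one line of deterministic arithmetic: the logistic map at r=4.
# Two trajectories starting one part in a billion apart decorrelate within
# a few dozen steps, because the gap roughly doubles each iteration
# (Lyapunov exponent ln 2). No randomness anywhere.

def logistic(x, r=4.0):
    return r * x * (1.0 - x)

x, y = 0.4, 0.4 + 1e-9  # a "perfect" measurement vs one with a 1e-9 error
for step in range(1, 51):
    x, y = logistic(x), logistic(y)
    if step % 10 == 0:
        print(f"step {step:2d}: |x - y| = {abs(x - y):.3e}")

# The gap climbs roughly 1e-9 -> 1e-6 -> 1e-3 -> order 1: after ~30 steps
# the "prediction" y tells you nothing about the true trajectory x.
```

A pinball machine is the same arithmetic with bigger constants: every bounce off a round bumper multiplies the angular error, so the dozen-or-so-collision horizon falls straight out of the error growth rate, no matter how carefully you measured the launch.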
Rather than engage with this, however, Gwern and his compatriots in the thread focus on the tangent about how high-level pinball players are able to control for that uncertainty by avoiding the region of the board where those error-magnifying parts are. But this is not the same argument, and it raises the question of whether those high-chaos areas are always as avoidable as they are in a pinball machine. Rather than engage with that question, Gwern doubles down on the pinball analogy, shifting the question even further from “how well can we predict the deterministic motion of a ball given the inevitable uncertainty of our initial state” to “how many ways can we convince a third party we’ve gotten a high score on a pinball machine”. At this point we’re not just moving the goalposts, we’ve moved the entire stadium into low earth orbit and gotten real cute about whether we’re playing 🏈 or ⚽ football.
And given the conversation surrounding the thread and these topics on LW I’m not even going to assume that such a wild shift is the result of bad faith instead of simple disorganization and sloppiness of rhetoric. This is what happens to a community that conflates “it makes me feel smart” with “it actually communicates the point effectively”.
Gwern’s turn to “what if I just make people believe I won at pinball?” also comes back to their idea that the smartest being is the best manipulator, even though some excellent manipulators like Trump don’t have a lot of logical-analytical intelligence, and some brilliant thinkers like John Nash got into a fight with the inside of their own head and lost. It also reminds me of how they love markets in theory but are not interested in starting a business which would compete with other businesses.
Ironically, I think the fact that these types of intelligence aren’t often correlated has been discussed most frequently within Rationalist circles. I’m not going to chase down links right now because doing an SSC archive exploration requires more mental fortitude than I currently possess, but I distinctly remember that a recurring theme was “if nerds are so smart why don’t they rule the world?” In my less cynical days I had assumed that his confusion on this point was largely rhetorical, intended to illustrate some part of whatever point was buried in the beigeness. Now it seems like I was falling victim to the essay’s ability to let you project whatever tangentially-related thesis you want onto it and find supporting arguments, because of how badly it’s written.