I’m noticing that people who criticize him on that subreddit are being downvoted, while he’s being upvoted.
I wouldn’t be surprised if, as part of his prodigious self-promotion of this overlong and tendentious screed, he’s steered some of his more sympathetic followers to some of these forums.
Actually, it’s the Wikipedia subreddit thread I meant to refer to.
As a longtime listener to Tech Won’t Save Us, I was pleasantly surprised by my phone’s notification about this week’s episode. David was charming and interesting in equal measure. I mostly knew Jack Dorsey as the absentee CEO of Twitter who let the site stagnate under his watch, but there were a lot of little details about his moderation-phobia and fash-adjacency that I wasn’t aware of.
By the way, I highly recommend the podcast to the TechTakes crowd. They cover many of the same topics from a similar perspective.
For me it gives off huge Dr. Evil vibes.
If you ever get tired of searching for pics, you could always go the lazy route and fall back on AI-generated images. But then you’d have to accept the reality that in a few years your posts would have the analog of a GeoCities webring stamped on them.
Trace seems a bit… emotional. You ok, Trace?
But will my insurance cover a visit to Dr. Spicy Autocomplete?
So now Steve Sailer has shown up in this essay’s comments, complaining about how Wikipedia has been unfairly stifling scientific racism.
Birds of a feather and all that, I guess.
what is the entire point of singling out Gerard for this?
He’s playing to his audience, which includes a substantial number of people with lifetime subscriptions to the Unz Review, Taki’s crapazine and Mankind Quarterly.
why it has to be quite that long
Welcome to the rationalist-sphere.
Scott Alexander, by far the most popular rationalist writer besides perhaps Yudkowsky himself, had written the most comprehensive rebuttal of neoreactionary claims on the internet.
Hey Trace, since you’re undoubtedly reading this thread, I’d like to make a plea. I know Scott Alexander Siskind is one of your personal heroes, but maybe you should consider digging up some dirt in his direction too. You might learn a thing or two.
Please touch grass.
The next AI winter can’t come too soon. They’re spinning up coal-fired power plants to supply the energy required to build these LLMs.
Until a month ago, TW was the long-time researcher for “Blocked and Reported”, the podcast hosted by Katie ‘TERF’ Herzog and relentless sealion Jesse Singal.
I wouldn’t know anything about the thread, as it’s impossible for me to read without a twitter account. Yet another reason why the site is trash.
But by all means, go on generating content for a bigoted fascist who will use everything you write to increase engagement with his platform and give it undeserved credibility and $$$.
(And no, I won’t block you.)
Sorry for the off-topic rant, but WTF is Emile Torres doing on twitter? Anytime I see someone creating content for that Nazi hellsite, I start looking at them differently.
I’d really like to know the back story on this interview too. I realize weirdness isn’t exactly distinctive when it comes to rationalists, but Zack is in a league of his own.
I’m probably not saying anything you didn’t already know, but Vox’s “Future Perfect” section, of which this article is a part, was explicitly founded as a booster for effective altruism. They’ve also memory-holed the fact that it was funded in large part by FTX. Anything by one of its regular writers (particularly Dylan Matthews or Kelsey Piper) should be mentally filed into the rationalist propaganda folder. I mean, this article throws in an off-hand remark by Scott Alexander as if it’s just taken for granted that he’s some kind of visionary genius.
Non-paywall link.
You know the doom cult is having an effect when it starts popping up in previously unlikely places. Last month the socialist magazine Jacobin ran an extremely long cover feature on AI doom, which it bought into completely. The author is an effective altruist who interviewed, and took seriously, people like Katja Grace, Dan Hendrycks and Eliezer Yudkowsky.
I used to be more sanguine about people’s ability to see through this bullshit, but eschatological nonsense seems to tickle something fundamentally flawed in the human psyche. This LessWrong post is a perfect example.
As anyone who’s been paying attention already knows, LLMs are merely mimics that provide the “illusion of understanding”.