convexer’s dumpster site

Hi, my name is not convexer and this is my garbage site. I created this site because I wanted a place where I could be my full & terrible self without worrying too hard about making a positive impression.

Topics of interest include personal shit, gender politics, regular politics, and the modern workplace. I don’t really proofread my posts, so let me know if I say anything that’s just wrong.



“If I have peed farther, it is by standing on the shoulders of giants.”

Is there such a thing as a “reliable source” online anymore?

In school, we all learned about reliable sources online using a sort of pyramid metaphor, with .gov and .edu websites at the top, followed by mainstream newspapers and magazines, with random blogs and bullshit at the bottom. If your teacher was bad, Wikipedia belonged to the bottom category; if your teacher was good, they would point out that it can be a good place to get an overview, but that you should then click through the citations and read the actual sources.

Everything was so neat and tidy then, right? It was very easy to follow up the warning of “You can’t trust everything you read online” with the reassurance of “But there is great information if you know where to look.” I don’t think this is true anymore.

Oddly, these days the quickest and most reliable source for information of the form “just give me a basic overview of what this thing is” is often an LLM itself. They give you concise bullets, no frills, and you control the prompt: you can ask for a best-effort summary of both sides of a debate, unlike the LLM summaries you encounter embedded in websites, which are skewed toward a particular point of view.

Unfortunately, that won’t last long, because all the LLM chatbots are now wired back into Google search. Instead of hallucinating average information (which is at least sometimes useful), they will start reporting authentic, misleading information as found on Google, which has been overrun with LLM bullshit. You know what I’m saying? LLMs stepped in to fill a gap that Google stopped serving around 2015 or so, but now they are undermining their main value proposition precisely by being so cheap. Soon, asking GPT for laptop recommendations will be equivalent to asking an LLM to summarize the top clickbait article on Google with some dumb title like “top 10 laptops in 2025,” and it won’t take long after that for the AI companies to just cut explicit marketing deals to put product placement in LLM responses.

The only websites I can sort of trust these days for reliable information about what’s happening in the world are basically legacy news websites. But even those have particular topics they like to showcase more than others. And it creates an epistemological problem when you have no independent reference point by which to judge whether the newspapers themselves are credible. Even the phrase “mainstream media” carries positive or negative associations depending on your political beliefs.