I Let AI Build My Blog From Telegram Posts

I crap-coded, sorry, vibe-coded – that is what they call it now.
As an experiment to see whether a top Codex-like model can build something complete and working on top of an existing codebase, I decided to give it a task that was not too critical but still interesting: to build my blog.
I want to write, but I do not want to commit to writing regularly, so I needed a workaround. That is how the whole system appeared:
1. On a schedule, all posts from my Telegram channel (this one) are imported automatically.
2. They are passed through GPT-5.1, which translates each post, generates a title, and creates tags.
3. The translation is stored in the database.
4. Images (if any) are uploaded to S3.
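The steps above can be sketched roughly like this. Everything here is a hypothetical placeholder (`Post`, `process_post`, the injected `translate`/`store`/`upload_image` callables) standing in for the real Telegram import, LLM call, database write, and S3 upload — a minimal shape of the pipeline, not the actual implementation:

```python
from dataclasses import dataclass, field

@dataclass
class Post:
    """One imported Telegram post (hypothetical structure)."""
    text: str
    images: list = field(default_factory=list)

def process_post(post, translate, store, upload_image):
    """Run a single post through the pipeline:
    translate + title + tags (step 2), persist (step 3), upload images (step 4).
    The three callables are stand-ins for the LLM, the database, and S3."""
    # Step 2: one LLM pass returns the translation, a title, and tags.
    translated = translate(post.text)
    # Step 3: store the translated post, get back a record id.
    record_id = store(translated)
    # Step 4: upload any attached images, collect their public URLs.
    urls = [upload_image(img) for img in post.images]
    return record_id, urls
```

With stubs plugged in, a post with one image flows through all three steps and comes back as a record id plus image URLs — the real system would just swap the stubs for API clients.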
You can see the result here: https://tobishua.com/blog
My contribution there: a few edits (simply because I cannot stand bad code architecture) and lots and lots of prompts.