About
Pascal Cuoq - 27th Sep 2011

It struck me, writing both the last and the next posts, that I mostly assume that the reader — you — has already read and remembers the gist of previous posts and/or the documentation. I am pretty sure this would be considered a bad habit in some circles. In fact, there must be entire manuals and classes on the topic of avoiding exactly that. Perhaps reassuringly, it is not a new trend: Virgile's first serious post (after the mandatory "hello world" post) was about arcane loop annotation minutiae. My first post was the continuation of an unfinished tutorial. If you did not unsubscribe after these two, you knew what you were in for.
In truth, whenever I throw in an opinion or a casual post, I am always worried that I will diminish the technical value of the entire feed. Perhaps there should be a "norant" tag that all technical posts would carry. It is very simple to subscribe to a tag (if you are reading this, you can figure it out), but to the best of my knowledge excluding a tag from one's subscription requires client-side work, so one cannot simply avoid posts tagged "rant" and keep the others.
The main reason I have for not providing the larger picture in every post — or indeed in any post — is that I find it boring to repeat myself, and the larger picture does not change so quickly. Before this blog existed, I would answer questions on the mailing list; it was always the same thing, and I was getting really tired of it.
We solved the "mailing list questions" problem accidentally when, for technical reasons, we decided not to distribute a Windows binary version. That took complete care of that.
It must have happened to you: you stumble on someone's blog (or any sort of episodic writing), find the first samples you read so great that you can't stop, and after a few hours overdose on too much of the same "big picture". The last time it happened to me was with this one (perfectly healthy, and recommended in small quantities, though). I am trying to avoid that too; after all, even people who are obviously much better than me at expressing their thoughts fall into that trap.
By the same token, there won't be any more analyses of numerical programs for a while. I don't care if the program is shaped like an egg and animates a flying swan in ASCII art: I am not analyzing it.