Introduction
Argh! Time for another rant, methinks.
I’m trying to write a simple networking library in Java. (I have a few ongoing projects that require networking, so I think a library that abstracts some of the functionality makes sense, especially considering point 1 of my horror story.)
Point 1 of my horror story
You’d think that Java would avoid forcing checked Exceptions on you at every turn, since that can quickly turn neat code into a spaghetti bowl of try/catch blocks and if/else statements. Nope: apparently they make an Exception to the “handle every Exception” rule for IOException, which practically every I/O method declares, as if they want to use it like some sort of high-ROF code munger to put people off writing network code.
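To see what I mean, here’s a minimal sketch (the class and helper names are mine, purely for illustration): even reading a single byte from an InputStream drags IOException into your signature, because InputStream.read() declares it as a checked exception, so every caller either wraps the call in try/catch or declares the exception itself, all the way up the stack.

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;

public class CheckedDemo {
    // Even this trivial helper must deal with IOException, because
    // InputStream.read() declares it as a checked exception.
    static int readOneByte(InputStream in) {
        try {
            return in.read();   // throws checked IOException
        } catch (IOException e) {
            return -1;          // swallow it, wrap it, or rethrow: your only choices
        }
    }

    public static void main(String[] args) {
        // No real network needed to trigger the problem; any InputStream will do.
        InputStream in = new ByteArrayInputStream(new byte[] { 42 });
        System.out.println(readOneByte(in)); // prints 42
    }
}
```

The same applies to sockets: every read, write, connect and close on java.net.Socket declares IOException, so real network code multiplies this pattern everywhere.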
This is annoying, but it’s nothing compared to…
Point 2 of my horror story
Unsigned types. Anybody familiar with Java will already have facepalmed as they realise they’re about to read yet another furious network coder’s rant about this problem. The justification, it appears, is that “many C developers are fucking idiots.” Well, duh. I guess that’s why you stole my macro preprocessor too, huh? Thing is, I don’t mind people writing tools for idiots. I mind them writing tools for idiots and branding them as tools for serious developers. And I mind serious developers actually using these tools.

I mean, the tool is good. It abstracts away memory management and other gumph that most developers couldn’t care less about and used to see as a tiring necessary evil (mind you, there are some benefits to managing your own memory). However, this sort of language feature isn’t idiot-proofing the language, it’s genius-proofing it. Genius-proofing is good, idiot-proofing is bad. Genius-proofing means making the language so simple that even a genius can use it (in much the same way that idiot-proofing is making the language so simple that even an idiot can use it). The difference lies in the fact that geniuses use very different methodologies from idiots. A genius will not want to do mundane tasks like allocating and freeing memory when they can be automated, whereas an idiot won’t want to be able to accidentally refer to an unsigned int in a signed context, or vice versa.
The biggest problem with this is that it’s inherently the wrong way around. Treating everything as signed because idiots don’t know how unsigned works is somewhat akin to welding all the knives into their sheaths so that the knives can’t hurt anyone. It’s far more sensible to treat everything as unsigned, so that people get used to converting to signed themselves, and to write abstraction layers that let them use signed stuff where they want to shoot themselves in the foot. Especially as you then please the networking camp, because everyone will then be using their kind of code!
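For anyone who hasn’t hit this yet, here’s a sketch of what the lack of unsigned types costs you in practice (class and method names are mine, for illustration): a byte arriving off the wire as 0xFF is the unsigned value 255, but Java sign-extends it to -1 the moment you widen it, so network code ends up littered with the `& 0xFF` masking idiom.

```java
public class UnsignedDemo {
    // The ubiquitous masking idiom: widen to int and strip the sign extension,
    // recovering the unsigned 0..255 value the wire actually carried.
    static int toUnsigned(byte b) {
        return b & 0xFF;
    }

    // Reassemble a 16-bit unsigned value (e.g. a port number) from two
    // network-order wire bytes; without the masks, a high bit in either
    // byte would sign-extend and corrupt the result.
    static int u16(byte hi, byte lo) {
        return (toUnsigned(hi) << 8) | toUnsigned(lo);
    }

    public static void main(String[] args) {
        byte b = (byte) 0xFF;              // unsigned 255 on the wire...
        System.out.println(b);             // ...prints -1: Java sign-extends
        System.out.println(toUnsigned(b)); // prints 255
        // 0x1F90 == 8080, a typical port number.
        System.out.println(u16((byte) 0x1F, (byte) 0x90)); // prints 8080
    }
}
```

(Later Java versions added helpers like Byte.toUnsignedInt, but they’re just this same mask dressed up as a method call; the type system still has no unsigned integers.)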
Just one more reason to write my own language, I think…