Java networking code

Introduction

Argh!  Time for another rant, methinks.

I’m trying to write a simple networking library in Java (I have a few ongoing projects that require networking, so I think a library that abstracts some functionality makes sense, especially considering point 1 of my horror story).

Point 1 of my horror story

You’d think that Java would avoid forcing checked Exceptions on you as much as possible, since having to handle every one quickly turns neat code into a spaghetti bowl of try/catch blocks and if/else statements.  Nope: practically everything in the networking API throws IOException, which it seems they want to use like some sort of high-ROF code munger, presumably to put people off writing network code.
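
To illustrate, here’s roughly what even a trivial “connect and read one line” helper ends up looking like once the compiler has been appeased (a sketch of my own; the class and method names are mine, purely for illustration):

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.net.Socket;
import java.net.UnknownHostException;

public class ReadOneLine {
    // Even this trivial helper needs a catch (or a throws) for every call in
    // the chain: the Socket constructor, getInputStream() and readLine()
    // all declare IOException.
    public static String readLine(String host, int port) {
        Socket socket = null;
        try {
            socket = new Socket(host, port);
            BufferedReader in = new BufferedReader(
                    new InputStreamReader(socket.getInputStream()));
            return in.readLine();
        } catch (UnknownHostException e) {
            return null; // DNS lookup failed
        } catch (IOException e) {
            return null; // anything else that went wrong on the wire
        } finally {
            if (socket != null) {
                try {
                    socket.close();
                } catch (IOException e) {
                    // yes, even close() throws IOException
                }
            }
        }
    }
}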

This is annoying, but it’s nothing compared to…

Point 2 of my horror story

Unsigned types.  Anybody familiar with Java will already have facepalmed as they realise they’re about to read yet another furious network coder’s ranting about this problem.  The justification, it appears, is that “many C developers are fucking idiots.”  Well, duh.  I guess that’s why you stole my macro preprocessor too, huh?  Thing is, I don’t mind people writing tools for idiots.  I mind them writing tools for idiots and branding them as tools for serious developers.  And I mind serious developers actually using these tools.  I mean, the tool is good.  It abstracts away memory management and other gumph that most developers couldn’t care less about and used to see as a tiring necessary evil (mind you, there are some benefits to managing your own memory).  However, this sort of language feature isn’t idiot-proofing the language, it’s genius-proofing it.  Genius-proofing is good, idiot-proofing is bad.  Genius-proofing means making the language so simple that even a genius can use it (in much the same way that idiot-proofing is making the language so simple that even an idiot can use it).  The difference lies in the fact that geniuses use very different methodologies to idiots.  A genius will not want to have to do mundane tasks like allocating and freeing memory when it can be automated, whereas an idiot wouldn’t want to be able to accidentally refer to an unsigned int in a signed context, or vice-versa.

The biggest problem with this is the fact that it’s inherently the wrong way around.  Treating everything as signed because idiots don’t know how unsigned works is somewhat akin to welding all the knives into sheaths so that the knives can’t hurt anyone.  It’s far more sensible to treat everything as unsigned, so that people get used to converting to signed themselves, and write abstraction layers that allow them to use signed stuff where they want to shoot themselves in the foot.  Especially as you then please the networking camp, because everyone will then be using their kind of code!
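
For the unfamiliar: Java has no unsigned integer types at all, so network code ends up doing a masking dance every time it touches a byte off the wire.  A minimal sketch (the masking idiom is standard; the class and method names are mine):

public class Unsigned {
    // A byte holding 0xFF reads back as -1 in Java; mask it into an int
    // to recover the unsigned value 255.
    static int unsignedByte(byte b) {
        return b & 0xFF;
    }

    // A 16-bit big-endian unsigned value (a port number, say) has to be
    // widened into an int, because short tops out at 32,767.
    static int unsignedShort(byte hi, byte lo) {
        return ((hi & 0xFF) << 8) | (lo & 0xFF);
    }

    public static void main(String[] args) {
        byte raw = (byte) 0xFF;
        System.out.println(raw);                // prints -1
        System.out.println(unsignedByte(raw));  // prints 255
    }
}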

Just one more reason to write my own language, I think…


Ubuntu and Firewalls

Why Ubuntu’s firewall is open by default

Please, please, please, if you don’t know what a port is (literally, not metaphorically: if you think it’s a “doorway to your computer” then you fall into this category) do not administer a Unix firewall.

The number of people I see carting around fallacies and idiotic mantras like “Ubuntu comes with its firewall locked down by default, so it’s nice and safe” and “always close every port you aren’t using!!!” is staggering.

The simple fact of the matter is that firewalls are barely even necessary for a home user behind a NAT gateway.  But we’re going to discuss why it doesn’t matter even if you fall into the category of users who connect their computer directly to the internet (via cable modem or some such non-firewalling device).

A “port” is basically an area in memory on a computer that is reserved to allow programs to interact with the network in a sane way.

Think of it as more of a pigeon-hole than a “gateway”.  You put stuff into it (from the network) and the appropriate program, which has asked to be “bound” to the given port, takes stuff out of it and uses it in a sane way.
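
In code, “binding” is just a program asking the OS for one of those pigeon-holes.  A minimal Java sketch (the port number is an arbitrary choice of mine):

import java.io.IOException;
import java.net.ServerSocket;
import java.net.Socket;

public class PigeonHole {
    public static void main(String[] args) throws IOException {
        // Ask the OS to bind this program to port 4242.  From now on,
        // anything the network drops into that pigeon-hole gets handed
        // to this program and nobody else.
        ServerSocket pigeonHole = new ServerSocket(4242);
        Socket visitor = pigeonHole.accept(); // block until someone posts something
        System.out.println("Connection from " + visitor.getInetAddress());
        visitor.close();
        pigeonHole.close();
    }
}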

Before we had ports, only one service could run on a computer at once, otherwise the services wouldn’t know which packet was for which service.  Incidentally, that’s part of the history of why “www” prefixes pretty much every web address online, but that’s a whole other story.

So, we have these marvellous things called ports that allow services to interact with the network data that only that service wants.  In fact we have 65,535 of the little buggers.  So what happens if one of those 65,535 ports is “open”?  Well, nothing.  Nothing, that is, unless a service is running on the machine that wants to use that particular port.  So, unless you specifically tell the computer to run a service that will listen on that port, there’s no disadvantage to leaving it open.  There is, however, a disadvantage to closing it (which we’ll get to in a minute).
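
You can see this for yourself: connect to a port nobody has bound, and the OS simply reports that nobody’s home.  A quick sketch (port again picked arbitrarily):

import java.io.IOException;
import java.net.ConnectException;
import java.net.Socket;

public class Knock {
    public static void main(String[] args) {
        try (Socket s = new Socket("localhost", 54321)) {
            System.out.println("Connected - something was listening after all");
        } catch (ConnectException e) {
            // No service bound to the port: an "open" port with no listener
            // just refuses the connection.  No harm done.
            System.out.println("Connection refused - nobody home");
        } catch (IOException e) {
            System.out.println("Some other network error: " + e.getMessage());
        }
    }
}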

“Ah,” I hear you say, “but what if somebody else installs a service onto my machine which uses one of these ports to send all my keystrokes to the web?  Surely then, a Firewall would have protected me?”  The answer is yes, but frankly you’re too stupid to be allowed to own a computer if that scenario ever occurs.  It also probably wouldn’t protect you from whatever the virus decides to do if it can’t get an active network connection, which, knowing the way crackers tend to think, is probably “nuke the f* drive, b* won’t let me steal his data, I can at least destroy it!”  Firewalls do not stop people accessing your computer; they simply stop you using ports you probably want to use!  Which leads me nicely on to…

Why you probably don’t want to lock the firewall down yourself

Let’s say you’re clever enough to lock down the firewall yourself but (obviously) too stupid to realise that there’s probably a reason it’s not locked down when it’s shipped to you.  What’s the worst that could happen?

Well, let’s say you decide (at some point in the future) to download a software package that allows you to play multiplayer games online with all your Linux buddies.

Oh no!  You can’t open any connections!  Why not?  The software said it doesn’t need to be run as root, it doesn’t work even if you do run it as root, your friend is definitely running the software, you’ve forwarded all the necessary ports from your router (if applicable) but still no connection!  Is this software that’s been made by someone too thick to test it before they ship?  Or is something else afoot?

Well, it’s quite possible this software uses 2-way communication between you and The Internet.  You’ve locked down all your incoming ports (all the ones you weren’t using at the time, anyway) and so this new piece of software is going to be broken until you realise your mistake and open the port for it.

Even if you kept a written record of all the ports you’ve got locked and unlocked, even if you were the most conscientious firewall admin in the history of firewall admins, you can’t deny that it would’ve been simpler if you’d just let all the network traffic flow in the first instance.

So if firewalls are this useless, why do they even exist?

Firewalls are not there for extra security for average joe home user.  They’re there for specific networking equipment and devices that require an extra level of security and configurability.  Firewalls like iptables are used in NAT gateways to forward ports (yes, the firewall actually allows network traffic to flow where it otherwise wouldn’t).  They’re also used in servers to stop users behind a subnet sending data on certain protocols; maybe you don’t want any of your ISP customers to spam people, so you block port 25.  Maybe you don’t want anybody at the office to run a web server, so you block incoming port 80.  Maybe you don’t want your employees surfing the web when they should be working, so you rather foolishly block outbound port 80.  These are trivial, pointless examples but they are the legitimate uses of firewalls.  Blanketing your computer in a shroud of “nobody can connect to my non-existent services! haha!” is idiotic at best and creates problems for you at worst.

If you don’t understand what a “port” is, don’t administer a firewall.


Caveat: Windows

Whilst the same is mostly true for Windows, the Windows Firewall comes locked down by default.  This is mostly down to the same misconception that makes people think Ubuntu’s is (that “locking down” a firewall improves your security), but it’s also partly to do with the fact that most Windows users very quickly download themselves viruses because they’re so braindead they don’t think “Frep0rn4liefnocreditc4rdn33dfromthai.exe” could possibly be a virus.  I mean, seriously.  Just, seriously.

BluRay and DRM

I’ve always had a strained relationship with DRM, but up until now it has been a purely ethical loathing of the restrictions placed on technology I’m supposed to own.  Tonight, after buying a BluRay Disc and trying to play it in my computer’s BluRay Drive, my anger at DRM has taken on a technical justification.

Previously, anecdotal evidence had told me that DRM does suffer from technical difficulties – massive ones at that – but I’d never come across anything serious in my own experience.  Occasionally, playing certain formats from certain companies required some tweaking and hacking but nothing has ever been rendered completely unplayable.  Until now.

I have legitimately bought a BluRay Player.  I have legitimately bought a BluRay Disc.  I have legitimately put the BD into the BD-ROM drive.  I have legitimately installed a legitimate operating system on my legitimately owned computer.  I have not broken the law at any stage of the process.  Yet the assumption that I wish to break, or already have broken, copyright law precludes me from playing this now very-expensive coaster (although in actuality it wasn’t that much more than I’d expect to pay for a decent coaster, as it was pre-owned from Blockbuster during a sale).

Here comes the hugely ironic part.  I want to watch the film now, not after hacking and tweaking and fucking around for four hours to get the fucking thing to play on my legitimate device (did I mention that the device is 100% shop-bought, prefabricated, manufacturer-warrantied, legal shit?).

Ok, ready for irony?

I’m seriously considering torrenting The Men Who Stare At Goats because of the DRM software on The Men Who Stare At Goats’ BD.

Yup, you read correctly.  The DRM software is actually ENCOURAGING me to pirate the film.  ENCOURAGING.  Its entire reason for existence is to DISCOURAGE this behaviour, and yet it fails miserably.

Breakdown of reasons:

Pirated BD rips can be played on any device, any OS, anywhere.
BDs can only be played on computers with the right software installed, or official BluRay players.
Pirated BD rips do not need to “load” for 5-10 seconds before they do anything.
BDs have loading screens.  Loading screens for a frigging film.
Pirated BD rips are compressed to a lower quality so they can be downloaded, but they’re still much higher quality than DVD rips – and even DVD rips are generally acceptable quality.
BDs are very high quality but, due mainly to DRM, the performance overheads are enormous, meaning BDs will struggle to maintain smooth playback on low-end hardware.  This isn’t a direct concern for me personally, as my hardware can cope, but I worry for people who can’t afford to upgrade their entire PC just to play back BDs (considering such an upgrade could cost upwards of £500, compared to the £30 I spent buying a BD-ROM drive).

So, there you have it.  Big media companies wonder why we keep pirating their shit, but then make their product technologically limited and generally WORSE than the FREE alternative.  It’s like selling sweets for £30 a time, then arresting people who make similar sweets that taste better and are free, and getting all uppity and self-righteous about it at the same time.  And then you start telling people who buy sweets from you, rather than getting them for free, that they can’t share their sweets with their friends or family, and they can’t give their sweets away or sell them, and then you start watching them eat them so you can be sure.  And all the while wondering, why are my customers breaking the law by going to the copycat sweet makers?

Random.org

Q1.3: Can I download the generator software and run it on my own computer?

No. It’s not just the software you’d need, but also three radios (or one, at any rate), which must be carefully adjusted to pick up atmospheric noise at the right volume. It’s not completely trivial to set up.

What a pompous ass.  It IS completely trivial to obtain random entropy.  Three radios?  Give me a fecking break.  This guy clearly sees himself as some sort of God of Random.  Evidence suggests otherwise.  A pure random distribution of bits can be found in every single computer in the world via simple and indeed TRIVIAL programming.  All you need do is:

Query the RTC or, if there is no RTC in hardware, ask the software for one.

Depending on the RTC value, read a bit of memory.

Truly random.  100% guaranteed to produce a bit that could be either 1 or 0 and is entirely unpredictable.

Sure, it requires kernel-level code, and sure, it’s a tiny bit predictable, but if you think that is any less random than using atmospheric noise, you’re in for a shocking treat, sonny.  Randomness is EVERYWHERE.  Not just at random.org.

Christ, it’s like this guy’s literally walked out of some kind of weird-ass steampunk machine and said “WOW I SEE RANDOM”.


EDIT: In the clear light of day, one obvious flaw is apparent; the procedure for grabbing random entropy I outlined would give a tendency towards zero.  However, it is indeed entirely trivial to program an RNG (not a PRNG) using hardware available to every computer system on the planet.  Even if you don’t have a Real Time Clock in hardware, you can quite easily program a recursive function whose randomness increases with each recursion:

#define RECURSION 1
#define RECURSION_LEVEL 3

int current_level = 0;

int randomInt(int seed) {
    int n;
    /* use inline assembly to fill n with whatever is currently in an
       arbitrarily chosen register (GCC syntax; ebx picked at whim) */
    __asm__("movl %%ebx, %0" : "=r"(n));
    seed ^= n; /* actually mix the register noise into the seed */
    if (RECURSION && current_level++ <= RECURSION_LEVEL)
        seed = randomInt(seed); /* and use the recursive result */
    return seed;
}

Sure, it’s messy, but it’s a two-minute proof of concept.

Linux Memory Usage

I recently became concerned about the performance of a particular application (cmatrix on the 1080p screen – it looks beautiful but it’s quite choppy) so I did some investigating and turned up some surprising statistics.

Uptime: 12:53; Chrome has been running for most of that time, and we all know web browsers are notorious for leaking all over your RAM.  Chrome is better than Firefox but still pretty leaky.  On Windows XP, it was not uncommon for Chrome to be using more than half a GB of RAM.

Now, the surprising thing is, Linux has managed to gobble up all my RAM.  But has it?  Watch carefully and I’ll explain why Linux has -not- gobbled up all of my RAM (despite what ‘top’ tells me), whereas Windows, given the same constraints, would have.

The way Linux handles memory usage differs extraordinarily from the way Windows does.  If something leaks in Windows, that RAM is unrecoverable, especially after the application has quit.  However, in Linux, if something leaks, once the application quits, all the memory Linux allocated it is returned to Linux in the form of ‘cached’ memory.  In other words, the memory doesn’t get -deleted-, so if the application comes back and wants it again it’s entitled to the same memory (efficiency++), but if another application wants to use the memory that application -was- using, Linux isn’t gonna say no.  In other words, the memory is still ‘in use’ but it’s also still ‘available’ to new applications that request it.

This presents a difficult situation when you’re measuring RAM usage – do you count the memory as ‘in use’ or as ‘available’?  Well, ‘top’ (and some other tools) see it as ‘in use’, and ‘free’ will also tell you (on the Mem line) that it’s ‘in use’.  However, ‘free’ also has a “-/+ buffers/cache” line which takes the buffered and cached memory into account, giving you a fuller picture of how much RAM is available to new applications.  So when you’re measuring RAM, you should use -that- line in your statistics, not the raw figure above, which counts unreferenced (‘dead’) memory as ‘used’.
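
To make that concrete, here’s the shape of the old-style ‘free’ output, with illustrative made-up numbers (in kB) rather than figures from my machine:

             total       used       free     shared    buffers     cached
Mem:       2055456    1990128      65328          0     150124    1278636
-/+ buffers/cache:     561368    1494088

The raw ‘Mem:’ line claims only 65328 kB is free, but add the buffers and cache back in (65328 + 150124 + 1278636) and the ‘-/+ buffers/cache’ line shows 1494088 kB genuinely available to new applications.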

Windows XP, on the other hand (but don’t quote me on Vista or 7; I’m unfamiliar with the new kernel, though I’m pretty sure it’s the same story in terms of memory allocation), trusts applications to free memory.  If they don’t, Windows will never reclaim the lost memory (not until a reboot, anyway).  I’m unsure exactly how the kernel works (only Microsoft know) to produce this behaviour, but I’m not the only one to have made that observation, and I can assure you Windows XP, if left to its own devices, will start swapping out pages like there’s no tomorrow after a couple of days (depending how much RAM you have).

Suffice it to say, if you run WinXP you’re gonna get slowdown after a while; if you run Linux you can (in theory) continue running it forever.  Note that kernel leaks (which are rare but do happen) and certain other situations (eg video driver bugs) can force a reboot due to memory loss, but the situation is far rarer in the Linux world than in Windows.

It is mainly for this reason (and the kernel’s greater stability, thanks to a better design and fewer crash-inducing bugs) that Linux servers are able to run for…

chris@w1zard:~$ uptime
20:54:58 up 125 days,  4:24,  2 users,  load average: 0.00, 0.00, 0.00

Yup, w1zard (my server) has now been running for 125 days, 4 hours and 24 minutes without a reboot.  The load averages of 0.00 are down to the quad core processor; the system doesn’t strain under the minute amount of traffic it’s required to handle (mail only at the moment, too).  I’m very proud.

Have fun hacking!

Ubuntu and HDMI audio

Ok, so I got the video part of the HDMI output working with a couple of deft mouseclicks and the help of my previous workings with nvidia control panel, but when it came to audio…

First off, I go internetting, ask the mighty Google how the heck it is to be done.  I happened upon a ubuntuforums.org post detailing steps to get a GT240 to work, whereupon I hit snag after snag, eventually crippling my entire audio setup.

Eventually, I found that my saving grace was “alsa-utils reset”, but the internet did not tell me this – rather, my brother pointed it out (him being an audiophile and a closet Ubuntu geek).  I managed to get my speakers back and, seeing as I wanted to watch a film and had spent a good few hours troubleshooting audio, I decided HDMI audio can wait until it works ‘out of the box’.  Especially considering, as I said earlier, it’s actually a downgrade in speaker quality.

That said, it appears my tweeters have blown (muddy, muffled sounds coming from the hifi at the moment; my brother diagnosed it as blown tweeters, but he didn’t properly examine the patient so I don’t think he’d want to be quoted on that).

Those poised to criticize Ubuntu for not supporting HDMI ought to remember that it is an immature technology (HDMI, that is, not Ubuntu) and is steeped in DRM.  Aside from the ethical problems I have with DRM, it also creates lots of technical hassle for projects wishing to use technologies that rely on it.  Especially if the company responsible for the DRM is uncooperative or demands financial recompense for its support in making technology work with it.  The mind boggles at why consumers put up with this kind of treatment, but I guess conglomerates preclude us from having much choice.

Have fun hacking!

Ubuntu dual-screen

Ubuntu doesn’t detect my second monitor automatically, but all it takes is a click of the ‘Detect monitors’ button.  Windows XP requires multiple reboots and a painfully slow Nvidia Control Panel in order to get anywhere near dual-headed graphics.  Nor does it provide me with a full 1080p picture (without going too much into the ins and outs of it, Windows XP’s best resolution for the Panasonic TV was 1200×950-ish, whereas Ubuntu can give me the full 1920×1080 with a little bit of tweaking).

Admittedly, my experience in the past with Ubuntu has been slightly different when it comes to dual monitors, but for Ubuntu to reach this level of desktop usability within 6 years of its first release (compared to the 16 years Windows took to get from Win 1.0 to WinXP) – I’d say that’s pretty gosh-darned impressive.

What’s more, Ubuntu doesn’t route audio through my TV unless I ask it to.  Windows detects the TV’s speakers as an audio device and then PREFERS them to my hifi.  I’d rather have MSN go through the hifi, and only films and whatnot – which I’ll be watching with MSN turned off – go through the TV.  If that, ’cause frankly my hifi’s speakers are gonna be better than my TV’s – they’re just in slightly the wrong place, is all.

Every day I get happier and happier that I’m home.  ❤ ubuntu.  So much.

Have fun hacking!