A lot happened on 1st January 1970


If you’ve spent any time playing with code and dates, you will at some point have come across the date of 1st January 1970.

In fact, even if you’ve never touched any code, you’ll probably have come across it. I came across it today when I was looking at the stats on WordPress:


Bizarrely, the WordPress hit counter starts in 1970. Not so bizarrely, no one read my blog that day. But then they were probably all so excited by Charles “Chub” Feeney becoming president of baseball’s National League. Or something.

Most likely, this is caused by the Unix Timestamp, a number I wrote about the other day. As I said, time is a real faff, but numbers are great, so computers sometimes store time as numbers. Specifically, the number of seconds since midnight on the 1st January 1970. It’s a real oddity when you first encounter it, but it makes a lot of sense.

It’s not, though, the only way of storing time. Microsoft, typically, do it a different way, and use a value that’s affectionately known as Integer8, which is an even bigger number. This is the number of 100-nanosecond intervals since midnight on January 1st, 1601.

With both of these, you need to do a calculation along the lines of:

January 1st 1970 + number of seconds

To turn the number into a date. Of course, this means that if you report the Timestamp as 0, the computer adds 0 to January 1st 1970, and gets January 1st, 1970.
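You can watch this happen in JavaScript (a quick sketch; note that the Date constructor counts in milliseconds, and the output shown is UTC):

```javascript
// A Timestamp of 0 means "0 seconds after midnight on 1st January 1970 (UTC)",
// so constructing a Date from it lands exactly on the epoch.
const epoch = new Date(0);
console.log(epoch.toISOString()); // "1970-01-01T00:00:00.000Z"
```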

Presumably, it’s something along these lines that has resulted in WordPress reporting my hit stats from 1970. According to computers, a lot of things happened on 1st January 1970.


The Unix Timestamp

Time and Dates

Time is a bit of a faff really.

If you want to add 20 minutes to 6:50, you get 7:10. Not, as you would with normal maths, 6:70. I remember at school putting calculators into “clock mode” to add times.

I also remember spending a surprising amount of time during my sound engineering days adding and subtracting times. There was a brief period when I was convinced that we needed to decimalise time; change the system so there are 100 seconds in a minute and 100 minutes in an hour. It will be a bit of a faff for everyone, but if it saves me from having to do slightly tricky mental arithmetic occasionally, then I’m all for it.

It turns out that computers, similarly, have difficulty with time. Which must be partly why many computer systems use the Unix Timestamp.

The Unix Timestamp is a number. A really long number, like 1369634836 or something. But, ultimately, just a number. And this means that adding and subtracting from it is easy.

The number corresponds to the number of seconds since midnight on 1st January 1970. 1369634836, for example, corresponds to seven minutes and sixteen seconds past 1 AM on 27th May 2013.
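You can check that for yourself; multiply by 1,000 first, because JavaScript counts milliseconds, and note that the hour you see depends on your timezone (in UTC it comes out as just past 6 AM):

```javascript
// Turn a seconds-based Unix Timestamp into a readable date.
const stamp = 1369634836;            // seconds since the epoch
const date = new Date(stamp * 1000); // the Date constructor wants milliseconds
console.log(date.toISOString()); // "2013-05-27T06:07:16.000Z"
```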

Funnily enough, though, the Unix Timestamp wasn’t invented until 1971, meaning that the first few thousand numbers only ever occurred in the past. Dates before January 1970 are recorded as negative numbers, so theoretically it can go back as far as you want.

These days, the Unix Timestamp is used in loads of, if not all, computer languages.

In JavaScript, you can generate it with this code:

new Date().getTime();

(Strictly speaking, getTime() gives you milliseconds rather than seconds, so divide by 1,000 and round down to get the classic Unix Timestamp.)



And so on.

Now, I don’t mean to alarm anyone, but there is an apocalypse coming. Well, I say apocalypse, it’s more just an expensive and inconvenient problem. But on January 19th, 2038, we’re going to have another “Millennium Bug” situation, when the Unix Timestamp gets so big that it can no longer fit in the signed 32-bit integers that many computer systems use to store it.
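The exact moment of doom is easy to calculate: a signed 32-bit integer maxes out at 2³¹ − 1, or 2,147,483,647, and that many seconds after the epoch lands here (a sketch, in UTC):

```javascript
// The largest number of seconds a signed 32-bit integer can hold.
const max32 = 2 ** 31 - 1; // 2147483647
console.log(new Date(max32 * 1000).toISOString()); // "2038-01-19T03:14:07.000Z"
```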

Before this time, we’re either going to have to switch to 64-bit systems, which support much bigger numbers, or use a different method for storing dates. That is, assuming any of today’s 32-bit computer systems are still running by then.

Even when we switch to 64-bit systems, though, we’re only prolonging the problem, not solving it indefinitely. At 15:30:08 on Sunday, 4 December 292,277,026,596, we will again run out of numbers. Luckily, though, as Wikipedia notes “This is not anticipated to pose a problem, as this is considerably longer than the time it would take the Sun to expand to a red giant and swallow the earth.”
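For the curious, here’s the back-of-the-envelope sum behind that date; BigInt is needed because these values are far too big for ordinary JavaScript numbers:

```javascript
// Roughly how long a signed 64-bit seconds counter lasts before overflowing.
const maxSeconds = 2n ** 63n - 1n; // 9223372036854775807
const secondsPerYear = 31556952n;  // average Gregorian year (365.2425 days)
console.log(maxSeconds / secondsPerYear); // roughly 292 billion years
```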

Which is slightly more of an apocalypse than the dates not displaying correctly on websites really. But I’m still more worried about the date thing than the sun thing.

Coding is hard

Coding is easy

“Coding”, to misquote Douglas Adams, “is hard. Really hard. You just won’t believe how vastly, hugely, mindbogglingly hard it is. I mean, you may think it’s difficult to walk down the road to the chemist’s, but that’s just peanuts to code, listen…”

Except it’s not. Not really. And that bit about the chemist’s makes no sense. Thanks for nothing, Simon.

Writing a little bit of code is easy. So easy, that I’ll do some now. Here:

<b>Make this bold</b>

See, that wasn’t so hard, was it? That’s one line of HTML. HTML is what’s called “markup”, code for styling text on a website. The angled brackets indicate that what is within them is a command. This command is sent to the “code interpreter”, which reads the code and turns it into:

Make this bold

At the end, there is another set of angled brackets, with a forward slash to indicate that you can stop making it bold now, thank you very much. So to the interpreter, it reads:

Start making the next bit bold until I tell you to stop

This is the specific text that I want you to print out, so just print it out

Stop making it bold now

That’s quite straightforward, but using essentially that principle (and a few others), you can start building most websites in the world. And end up with code that looks like this:

[Screenshot: a dense wall of HTML code]

A little bit of code is easy. A lot of code is hard.

The way computers “think” is just very different to the way humans do. Densely written text like that is almost impossible for a human to read. But for computers, this isn’t a problem. They just start at the beginning, and work their way through. Very quickly. In milliseconds. It would take us humans with our monkey brains much, much longer to go through it all.

The upshot of this is that once you go beyond writing a couple of simple lines of code, managing the code becomes a bigger job than actually writing the damn stuff. And when you throw some other coders into the mix, with their own way of coding, and their own idea of how code should be indented (it’s tabs, for Christ’s sake, not spaces!), you start to see why coding is so hard, and why so many software projects fail.

In this blog, I’m going to take a walk through coding, development and software, looking at what goes right (and wrong) and why coding is the way it is, and talking about some of the concepts involved in coding and development. You won’t learn anything useful from this site (it’s not Coding for Dummies or anything like that). But you will learn lots of useless things. And, if I’m honest, I’ve always preferred useless information anyway.