Saving the day

When you think about it, the idea of “saving” your work is quite a strange one.

In the real world (you know, that annoying place where ctrl+F doesn’t work when you’ve lost your keys), you never have to “save”. If you pick up a pad of paper and write something down, you don’t have to then do anything to keep it. It’s written down; it’s permanent. You just put the paper in a drawer and next time you open the drawer, it’s still there.

On a computer, the equivalent action (pressing File => New and typing) doesn’t keep your writing unless you “save” your “changes”. Both of these are strange concepts. When I take a blank sheet of paper and write on it, I don’t consider that a “change” to the blank paper; I consider that “my angry letter to the newspaper” or “my shopping list”.

Similarly, when picking options in real life, I just set my oven to 200ºC and walk away. I don’t have to click “Apply” to change it from 180.


We’ve become so used to saving now (or at least, I have. My parents haven’t and regularly wonder where their things have gone) that we do it without thinking.

But saving is a faff. Over the course of my life, I’ve lost a huge amount of work because someone, once, years ago, made the decision that after spending all day typing a document, I probably don’t want to keep it. On the computer, saving is an afterthought. The alert box that pops up when you leave a document says (I’m paraphrasing, but if you read between the lines it sort of says this), “Oh, by the way, you didn’t want to keep this, did you? I’ll just chuck it out, shall I?”.

There are a couple of weird things here. The concept of “saving” is an abstraction, added on top of the computer. After all, when you type a character onto the screen, the computer receives that character and stores it in a temporary place. This is probably RAM, but why not put it straight into a permanent place? Some applications, depending on what they do, even keep temporary copies of the file on the actual hard drive, so when you leave the application without saving they then have to delete the records of the file you’ve been working on. Since Office 2003, auto-save writes a copy of the file you’re working on to a temp folder every few minutes, but still assumes that keeping that file is the exception, rather than the standard thing to want.

That’s bizarre. I’m much more likely to want to keep what I’ve done than throw it away. But as far as the computer is concerned, saving is the odd thing, the exception to the standard workflow. Surely it would make more sense to save by default, and have me specifically say “please throw this away”, in much the same way that once I’ve written a page of text by hand, I have to choose to screw it up and throw it in a bin.

Some applications, of course, prompt you to create a file before you start working. When you open Adobe InDesign, for example, you have to choose whether to create a new Document, Template or Book. In typical Adobe form, you need to go on a course before you can work out what it is you want to do. What’s the difference between a document and a book? If I want to make a three page flyer, is that a book? I don’t know. It’s impossible to tell without going on a course.

And even with InDesign, where I begin by creating a file and naming it, once I’m done making my changes and editing it, InDesign pops up with a box saying, “oh, you want to save now, do you?” as if that isn’t the expected behaviour. While this is better, it still isn’t automatically saving what I’m doing, which is the most likely thing I’m going to want to do. WHY DOES MY COMPUTER SEEM TO THINK I’M WEIRD FOR WANTING TO KEEP MY WORK?

Thankfully, online, this is beginning to change. Google Docs automatically saves my work as I type. It’s almost like using paper. Increasingly, web applications act as soon as I choose an option, rather than waiting for me to click “Apply”. On the iPad, options take effect as soon as you press them. There is no concept of “Apply”, “OK” or “Cancel” on option screens.
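
Out of curiosity, here’s roughly what that “just keep it” pattern looks like in code. This is a toy JavaScript sketch, not how Google Docs actually does it, and the #editor box and the use of localStorage are just stand-ins for illustration:

// every keystroke is persisted straight away, so there is nothing to "save"
const editor = document.querySelector("#editor");   // a hypothetical text box
editor.addEventListener("input", () => {
  localStorage.setItem("draft", editor.value);      // write-through on every change
});

// when the page loads, the draft is simply there, like paper in a drawer
editor.value = localStorage.getItem("draft") ?? "";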

But the idea of “saving” is difficult to shake. In Google Docs, I keep wanting to click save, and am momentarily confused when there is no “save” button. Strangely, over the last twenty years or so of computer usage, we’ve managed to train people to think “saving” is a special activity.

Maybe I’m being unreasonable about this (I am); after all, we’re all used to saving now, and I almost never lose work (despite the odd power cut). But I’m reminded of a quote from George Bernard Shaw:

The reasonable man adapts himself to the world; the unreasonable one persists in trying to adapt the world to himself. Therefore, all progress depends on the unreasonable man.

Once we stop being reasonable about saving and start building standard systems where you don’t choose to save but choose to throw away, we won’t miss the little floppy disc icon in the top left-hand corner. Or, as ten-year-olds today must think of it, the weird square thing that old people used to use when there were dinosaurs around.

Apps and Websites

I have mixed feelings about apps. There’s an XKCD comic that pretty much sums up my experience:

[Image: XKCD comic about apps]

It’s quite common to come across apps that are just content from the website but in a more limited container. You can’t interact with them as you would with webpage content. Sometimes you can’t copy and paste from them or use features that have been in webpages as standard for twenty years.

People who are familiar with computers tend to forget that most normal users are far less comfortable with them than we assume. I once told a story about a project to build an appstore at work:

A colleague of mine had to give a presentation about a Corporate iPhone AppStore that we’re building. Halfway through, he realised the audience weren’t feeling it, and so said, “Who here knows what an appstore is?” About four people put their hands up.

The excellent web app Forecast.io has a blog post that talks about the confusion people get into when trying to pin its web clip to their home screen:

I’m fairly certain none of them will ever know that Forecast is actually a web app. To them, it’s just an app you install from the web.

Users don’t really understand what they’re doing, but they do get frustrated and confused when things don’t behave as they expect. The user who wrote in to Forecast.io was confused that he couldn’t download the web app from the Apple App Store.

Apps are too much like 1990s CD-ROMs and not enough like the Web. I feel like I’m always updating my apps. Every time I pick up my phone there’s a little red box next to the App Store, telling me I have more updates to download.

Perhaps it’s my OCD coming out, but I just can’t leave the updates sitting there. I have to download them all. But if I ever look at the reason for the updates, I see it fixes an issue like “error with Japanese timezone settings for people living in Iceland” or “fixes an issue when you plug an iPhone 1 into a particular model of ten-year-old HP TV” that I’ll never encounter. Sometimes they change apps so they no longer work in the way I’ve got used to.

It’s probably worth adding: I’m not complaining that they’re fixing problems. Someone in Iceland is probably on Japanese time. And even if they’re not, I’m idealistic enough to think it needs fixing just because it’s wrong. The problem is the nature of apps. I never have to update Amazon.com before I buy a book, or update Facebook before I poke someone. Websites don’t need updating. They are always the latest version.

There are, of course, some good things about apps. Jeff Atwood wrote about how much better the eBay app is than the website. And he’s right. It’s slicker, simpler, easier to use:

Above all else, simplify! But why stop there? If building the mobile and tablet apps first for a web property produces a better user experience – why do we need the website, again?

But maybe the solution here is to build a better website.

Of course, some apps carry out functions on the device, or display static data. And it makes sense for them to be native apps. On my iPhone, I have a torch app (it forces the flash on my camera to remain on), and I have a tube map app (it essentially shows me a picture of the tube map). One of these interacts with the base firmware, so that one has to be a native app. The other displays static data. It would be unnecessary to connect to the Internet to pull the map down every time I want to look at it.

But other apps, like Facebook or LinkedIn, are just a native wrapper around the website.

So, what’s the solution?

Of course, there isn’t one. It’s a compromise. At the moment we’re in an era obsessed with native apps. All companies have to have an “app”, if only to show that they’re up to date.

I was in the pub the other day and accidentally got chatting to someone. He told me that his company had just released an app. “What does it do?” I asked him. “No idea,” he said, “but you’ve got to have an app.” He was the managing director of the company.

Hopefully, when we’ve got over the novelty of the technology, we can start using apps for what they’re good at, rather than having apps just for the sake of having them.

A lot happened on 1st January 1970


If you’ve spent any time playing with code and dates, you will at some point have come across the date 1st January 1970.

In fact, even if you’ve never touched any code, you’ll have probably come across it. I came across it today when I was looking at the stats on WordPress:

[Image: WordPress hit stats]

Bizarrely, the WordPress hit counter starts in 1970. Not so bizarrely, no one read my blog that day. But then they were probably all so excited by Charles “Chub” Feeney becoming president of baseball’s National League. Or something.

Most likely, this is caused by the Unix Timestamp, a number I wrote about the other day. As I said, time is a real faff, but numbers are great, so computers sometimes store time as numbers. Specifically, the number of seconds since midnight on the 1st January 1970. It’s a real oddity when you first encounter it, but it makes a lot of sense.

It’s not, though, the only way of storing time. Microsoft, typically, do it a different way, and use a value that’s affectionately known as Integer8, which is an even bigger number. This is the number of 100-nanosecond intervals since midnight on January 1st, 1601.
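
For the curious, here’s a rough JavaScript sketch of converting between the two (this isn’t Microsoft’s own code; the function name, the constant name and the example value are all mine). The only real trick is the 11,644,473,600 seconds between the 1601 and 1970 epochs:

// Integer8/FILETIME counts 100-nanosecond ticks since 1st January 1601 (UTC)
const EPOCH_GAP_SECONDS = 11644473600n;                    // seconds between 1601 and 1970

function fileTimeToUnixSeconds(fileTime) {                 // fileTime: BigInt of 100ns ticks
  return fileTime / 10000000n - EPOCH_GAP_SECONDS;         // ticks to seconds, then shift epochs
}

const unixSeconds = fileTimeToUnixSeconds(130000000000000000n);   // a made-up example value
console.log(new Date(Number(unixSeconds) * 1000).toUTCString());  // "Fri, 14 Dec 2012 23:06:40 GMT"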

With both of these formats, to turn the number into a date, you need to do a calculation along the lines of:

January 1st 1970 + number of seconds

Of course, this means that if you report the Timestamp as 0, the computer adds 0 to January 1st 1970, and gets January 1st 1970.
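
In JavaScript, that zero-timestamp case looks something like this (the 0 is just my stand-in for whatever empty value WordPress had):

// zero seconds after the epoch is midnight on 1st January 1970
const hitDate = new Date(0 * 1000);       // JavaScript's Date counts in milliseconds
console.log(hitDate.toUTCString());       // "Thu, 01 Jan 1970 00:00:00 GMT"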

Presumably, it’s something along these lines that has resulted in WordPress reporting my hit stats from 1970. According to computers, a lot of things happened on 1st January 1970.

The Unix Timestamp


Time is a bit of a faff really.

If you want to add 20 minutes to 6:50, you get 7:10. Not, as you would with normal maths, 6:70. I remember at school putting calculators into “clock mode” to add times.
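
If you wanted to do that sum in code, it might look something like this. A toy JavaScript sketch; the addMinutes function is my own invention, not anything built in:

// clock arithmetic: work in total minutes, then split back into hours and minutes
function addMinutes(hours, minutes, extra) {
  const total = hours * 60 + minutes + extra;
  return { hours: Math.floor(total / 60) % 24, minutes: total % 60 };
}

console.log(addMinutes(6, 50, 20));       // { hours: 7, minutes: 10 }, not 6:70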

I also remember spending a surprising amount of time during my sound engineering days adding and subtracting times. There was a brief period when I was convinced that we needed to decimalise time; change the system so there are 100 seconds in a minute and 100 minutes in an hour. It would be a bit of a faff for everyone, but if it saves me from having to do slightly tricky mental arithmetic occasionally, then I’m all for it.

It turns out that computers, similarly, have difficulty with time. Which must be partly why many computer systems use the Unix Timestamp.

The Unix Timestamp is a number. A really long number, like 1369634836 or something. But, ultimately, just a number. And this means that adding and subtracting from it is easy.

The number corresponds to the number of seconds since midnight on 1st January 1970. 1369634836, for example, corresponds to seven minutes and sixteen seconds past six in the morning (UTC) on 27th May 2013.
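
And because it’s just a number, date maths becomes ordinary arithmetic. Here’s a quick JavaScript sketch using that same value (the variable names are mine):

const timestamp = 1369634836;                 // seconds since 1st January 1970 (UTC)
const tomorrow = timestamp + 24 * 60 * 60;    // add a day by adding 86,400 seconds

// JavaScript's Date counts in milliseconds, so multiply by 1000
console.log(new Date(timestamp * 1000).toUTCString());   // "Mon, 27 May 2013 06:07:16 GMT"
console.log(new Date(tomorrow * 1000).toUTCString());    // "Tue, 28 May 2013 06:07:16 GMT"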

Funnily enough, though, the Unix Timestamp wasn’t invented until 1971, meaning that the first year or so of numbers refer to times before it even existed. Dates before January 1970 are recorded as negative numbers, so theoretically it can go back as far as you want.

These days, the Unix Timestamp is used in loads of, if not all, computer languages.

In JavaScript, which counts milliseconds rather than seconds, you can generate it with this code:

Math.floor(new Date().getTime() / 1000);

In PHP:

time();

And so on.

Now, I don’t mean to alarm anyone, but there is an apocalypse coming. Well, I say apocalypse; it’s more just an expensive and inconvenient problem. But on January 19th, 2038, we’re going to have another “Millennium Bug” situation, when the Unix Timestamp gets so big that it can no longer be stored in the signed 32-bit integers that many computer systems use to hold it.
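
If you want to see where that limit comes from, here’s a quick JavaScript sketch, assuming the timestamp is held in a signed 32-bit integer (which is the usual culprit):

const maxInt32 = 2 ** 31 - 1;                             // 2,147,483,647 seconds
console.log(new Date(maxInt32 * 1000).toUTCString());     // "Tue, 19 Jan 2038 03:14:07 GMT"
// one second later, a signed 32-bit counter wraps around and goes negative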

Before this time, we’re either going to have to switch to using 64-bit systems, which support much bigger numbers, or use a different method for storing dates. That is, assuming any of today’s 32-bit computer systems are still running by then.

Even when we switch to 64-bit systems, though, we’re only prolonging the problem, not solving it indefinitely. At 15:30:08 on Sunday, 4 December 292,277,026,596, we will again run out of numbers. Luckily, though, as Wikipedia notes “This is not anticipated to pose a problem, as this is considerably longer than the time it would take the Sun to expand to a red giant and swallow the earth.”

Which is slightly more of an apocalypse than the dates not displaying correctly on websites really. But I’m still more worried about the date thing than the sun thing.

Coding is hard


“Coding”, to misquote Douglas Adams, “is hard. Really hard. You just won’t believe how vastly, hugely, mindbogglingly hard it is. I mean, you may think it’s difficult to walk down the road to the chemist’s, but that’s just peanuts to code, listen…”

Except it’s not. Not really. And that bit about the chemist’s makes no sense. Thanks for nothing, Simon.

Writing a little bit of code is easy. So easy, that I’ll do some now. Here:

<b>Make this bold</b>

See, that wasn’t so hard, was it? That’s one line of HTML. HTML is what’s called “markup”, code for styling text on a website. The angled brackets indicate that what is within them is a command. This command is sent to the “code interpreter”, which reads the code and turns it into:

Make this bold

At the end, there is another set of angled brackets, with a forward slash to indicate that you can stop making it bold now, thank you very much. So to the interpreter, it reads:

Start making the next bit bold until I tell you to stop

This is the specific text that I want you to print out, so just print it out

Stop making it bold now

That’s quite straightforward, but using essentially that principle (and a few others), you can start building most websites in the world. And end up with code that looks like this:

[Image: a dense page of HTML code]

A little bit of code is easy. A lot of code is hard.

The way computers “think” is just very different to the way humans do. Densely written text like that is almost impossible for a human to read. But for computers, this isn’t a problem. They just start at the beginning, and work their way through. Very quickly. In milliseconds. It would take us humans, with our monkey brains, much, much longer to go through it all.

The upshot of this is that once you go beyond writing a couple of simple lines of code, managing the code becomes a bigger job than actually writing the damn stuff. And when you throw some other coders into the mix, with their own way of coding, and their own idea of how code should be indented (it’s tabs, for Christ’s sake, not spaces!), you start to see why coding is so hard, and why so many software projects fail.

In this blog, I’m going to take a walk through coding, development and software, looking at what goes right (and wrong), why coding is the way it is, and some of the concepts involved in coding and development. You won’t learn anything useful from this site (it’s not Coding for Dummies or anything like that). But you will learn lots of useless things. And, if I’m honest, I’ve always preferred useless information anyway.