I recently visited Conwy Castle in Wales, one of King Edward I’s great Welsh castles.  Conwy Castle has stood for over 700 years, and it makes you reflect on what we will leave behind that will still be standing in 700 years’ time.  The last time I visited the castle was 15 years ago, which to the castle, of course, is just a tiny drop in an ocean of time.

In those 15 years, though, technology has evolved very quickly, leaving us all in a constant state of retraining.  Today you can buy a mobile phone that is a computer, with a GPS chip for location, a high-speed connection to the Internet, and an MP3 player, all fitting into a shirt pocket.  Back in the 1990s, GSM was still in its infancy (never mind 3G); computers sat firmly on desks; GPS units were large and primitive; the World Wide Web was just being born; flash memory cost thousands of dollars; and in music, a lot of people had not yet made the transition from records to CDs.

Looking at software rather than hardware, you could argue that the whole of software engineering history fits into the last 60 years, with the first general-purpose electronic computers appearing in the 1950s (although historians cite Ada Lovelace as the first programmer, having written a ‘program’ for Charles Babbage’s Analytical Engine in the 1840s, which would make the history of software another 100 years older).

This makes the “C” language an “old-timer”, since it was invented in the early 1970s, grew in popularity through the 1980s and 1990s, and is of course still going today.  It is probably the most successful programming language ever, having been adopted for embedded applications (even “C” on a chip) as well as general-purpose programming.  But will programmers still be programming in “C” in 700 years?  Will there actually still be “programming” as a human activity?  It’s hard to imagine that span of time, since so much of the technology that we know today was created during our lifetimes.
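As a small illustration of that longevity (a minimal sketch of my own, not taken from any particular codebase): the famous first program from Kernighan and Ritchie’s 1978 book still compiles and runs on any modern C compiler, needing only the now-standard form of main.

    #include <stdio.h>

    /* Essentially the 1978 "hello, world" from Kernighan and Ritchie.
       Decades on, a modern C compiler still accepts it unchanged,
       apart from the now-standard prototype for main. */
    int main(void)
    {
        printf("hello, world\n");
        return 0;
    }

Whether it will still compile in 700 years is, of course, anyone’s guess.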