Brian Armstrong

Greater Levels of Abstraction

At the risk of oversimplifying, the entire history of computer programming could be summarized as a series of higher and higher levels of abstraction.

Early computers just had individual bits you could flip on and off.

Then came assembly languages, which let you work with numbers and named instructions instead of raw bits.

C made strings, arrays, functions, and all sorts of other higher-level constructs possible.

Further evolutions came with C++, Java, Ruby, etc.

With each iteration the languages produced less efficient code at the machine level, but programmers could build better software because it was easier to think at higher levels.  Processors kept getting faster, so this made up for the slower code.
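
As a toy illustration of that trade-off (just standard C, nothing fancy): joining two strings in C means sizing and managing the buffer yourself, while in a language like Ruby the same thing is a one-liner like first + " " + second.

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    int main(void) {
        const char *first = "higher";
        const char *second = "abstraction";

        /* In C you size and free the buffer yourself:
           +2 covers the separating space and the trailing '\0'. */
        char *joined = malloc(strlen(first) + strlen(second) + 2);
        if (joined == NULL)
            return 1;

        strcpy(joined, first);
        strcat(joined, " ");
        strcat(joined, second);

        printf("%s\n", joined);   /* prints "higher abstraction" */
        free(joined);
        return 0;
    }

The hand-written version gives you control over every byte, but the one-liner lets you stay focused on the idea - and that ease of thinking is the thing that keeps winning.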

It’s sort of funny because debates still occur constantly in the tech community about the merits of various languages.  Usually one nerd will say “real programmers use X technology - it’s just so much more efficient.  Sure, Y is newer, but it’s just a toy; it doesn’t scale in the real world” while another nerd will say “not true - so-and-so is using Y now, it’s web scale and the way of the future!”

It’s important to keep history in perspective.  With Moore’s law continuing for the foreseeable future, it’s hard to imagine programming NOT continuing to move toward higher and higher levels of abstraction.

As with all trends, you have to time it right, so not every technology is the right choice today.  But if you had to pick one to win over time, I’d pick higher abstraction over speed of code every time.

When C came out, there was probably some hardcore assembly programmer who said “bah - C is a nice idea, but it will never be fast enough on real machines”.

And in 10 years, there will probably be some kid who says “you still use Ruby?  that’s so low-level!”
