Cleantech and history of software

I’m on my way to Linz, Austria, to participate in a panel at an event called the Cleantech Venture Forum. We’ve spent much of the last two years looking at ways to save power in data centers and mini-data centers (which may be a term nobody else uses). If we look at the history of software, the key thing to optimize has changed over time:
- Alan Turing Era – minimize processor time and memory use (machines are few and very expensive)
- Gene Amdahl/Seymour Cray Era – maximize throughput (big batch jobs in expensive machines)
- Gordon Bell Era – maximize responsiveness (suddenly we have terminals and impatient people screaming at them)
- Bill Gates Era (client-server era) – minimize programmer time and maximize functionality. This is the era marked by three inspiring principles:
  - Can’t we get these 4,000 poorly educated, poorly treated code monkeys to produce something valuable?
  - No matter how crappy this is, it will work better with more memory and more processors.
  - Get it out the door and let the customers debug it (or “Ready, Fire, Aim,” as a former boss used to put it before our company went bankrupt!).
- Al Gore Era – power use per unit of functionality becomes important.
Something like that.