In 1977, Ken Olsen, founder and CEO of DEC, made an oft-misapplied statement: “there is no reason for any individual to have a computer in his home”. A favourite of introductory college computing modules, supposedly highlighting the difficulty of keeping pace with rapid change in computing technology, it appears foolish coming at a time when personal computers were already under development – including in his own laboratories. The quote is out of context, of course: it refers to Olsen’s scepticism regarding fully-automated assistive home technology systems (climate control, security, cooking food, etc.). However, as precisely these technologies gain traction, there can be little doubt that, if he stands unfairly accused of being wrong in one respect, time will inevitably prove him so in another.
So, in short:
- Yes, he did say that
- No, that wasn’t quite what he meant
- He wasn’t (obviously) wrong at the time
- He is now, or soon will be
The ‘Just because we’ll be able to, will we? Should we? Must we?’ discussion, revisited in the light of the news that CCTV cameras will soon be compulsory in English abattoirs. Again, a reminder that technological ethics has more to do with ethics than with technology.
But we’ll start with a different topic: one that isn’t as controversial as it used to be …
There was a time when smoking wasn’t just seen as socially acceptable but positively beneficial …
All that healthy smoke filling your lungs, seeping its goodness into your bloodstream was just the boost your body needed. You’d live years longer if you smoked … and it was SO COOL.
The concept of a ‘reasonable amount of time’ figures a fair bit in abstract computational complexity theory; but what is a ‘reasonable amount of time’ in practice? This post outlines the problem of balancing the two competing ideals of determinism and adaptability, and offers a flexible working definition. (Not to be taken too seriously: it’s summer vacation time.)
A standard text on combinatorial problems and optimisation algorithms – perhaps discussing the TSP, for example – might read something like:
“… so we tend not to be as interested in particular complexity values for individual problem instances as how these complexities change as the input problem size (n) increases. Suppose then, that for a given problem, we can solve a problem instance of size n = L in a reasonable amount of time. What then happens if we increase the size of the problem from n = L to n = L+1? How much harder does the …?”
or, filling in a few gaps:
“… so we tend not to be as interested in particular complexity values for individual problem instances as how these complexities change as the input problem size (n) increases. Suppose then that we can solve a TSP of 20 cities on a standard desktop PC in a reasonable amount of time. What then happens if we increase the number of cities from 20 to 21? How much longer does …?”
All good stuff, and sensible enough, but what’s this ‘reasonable amount of time’?
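Before trying to pin that down, it’s worth seeing why the jump from n to n+1 matters at all. A short Python sketch (the function name `tour_count` is hypothetical, just for illustration) counts the distinct tours a naive exhaustive search would have to examine in a symmetric TSP:

```python
from math import factorial

def tour_count(n):
    """Distinct tours in a symmetric TSP with n cities.

    Fixing the starting city and identifying a tour with its
    reversal leaves (n-1)!/2 candidates for brute force to check.
    """
    return factorial(n - 1) // 2

for n in (20, 21):
    print(f"{n} cities: {tour_count(n):,} tours")

# Going from 20 to 21 cities multiplies the brute-force work by 20:
print(tour_count(21) // tour_count(20))  # → 20
```

So whatever a ‘reasonable amount of time’ turns out to mean for the 20-city instance, the 21-city one needs twenty times as much of it – which is exactly why the question is posed in terms of growth rather than individual instances.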