Clear as Maths!

This post follows on (loosely) from a previous discussion on maths and computing and asks what it really means to ‘prove’ something in each discipline.

An apocryphal story has an Oxbridge maths don lecturing to a group of undergraduates … After spending some time completely filling a huge blackboard with heavy calculus (with accompanying commentary), he turns to the class and casually notes, “So then, it’s clear that …” (the exact claim isn’t important). As he turns to resume his chalk-work, a particularly bold student enquires, “Excuse me, Professor; but is that really ‘clear’?” The don steps back and surveys his work; studying the entire board from top-left to bottom-right, with numerous head and eye movements to-and-fro – even some pointing – to cross-check various parts with each other. After a full five minutes of silent contemplation, he turns back to the students, smiles, announces, “Yes!”, and carries on as before.

So who’s defining ‘clear’ here?

Continue reading


It’s a ‘Full Stop’. Period!

The Americans say ‘period’; the British say ‘full stop’.  So, which is it?  And what does this have to do with Computing?

The USA gets a lot of stick for ‘American English’ (AE) and it’s fundamentally unfair.  Both AE and BE (‘British English’) have their origins in multiple phases of British history over the past few hundred years and both have evolved and expanded since.  Both are very different from the language of the UK in the 17th and 18th centuries and there’s even some argument for saying that AE is the closer match, in that it’s stayed truer to the original principles.  Bill Bryson is well worth reading on this.  In fact, general comparison is almost impossible but a more focused approach might yield something useful, so let’s try …
Take the ‘.’ character used by both versions of English at the end of a sentence.  What is it?  Is it a ‘full stop’ (BE) or a ‘period’ (AE)?  Surely, it’s just a matter of taste?  There can’t be a right or wrong, can there?  Well, let’s see …
And here comes the Computer Science: more particularly, the programming …
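The programming details sit behind the link below, but as a taster of why this isn’t just a matter of taste, here’s a minimal Python sketch (the examples are my own, not taken from the post) of some of the jobs the ‘.’ character does in code – and notice that none of them is the sentence-ending role that ‘full stop’ implies:

```python
# Three distinct roles for the '.' character in a typical language
# (illustrative examples only -- not the post's own argument):
value = 3.14              # a decimal point inside a float literal
sentence = "The end."     # just an ordinary character inside a string
shout = sentence.upper()  # member/method access on an object
print(value, shout)
```

(One language in which ‘.’ genuinely does act as a terminator is Prolog, where every clause ends with one – and Prolog folk do tend to read it aloud as ‘full stop’.)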

Continue reading


Is Inequality the Fault of Technology?

On May Day, and with the UK General Election just a few days away, some thoughts on the role technology plays (or could play) in making the world a better (or worse) place.

Owen Jones in his Guardian piece, Don’t Blame Rising Inequality on Technological Change (Wednesday 8th April, 2015) makes short work of dismissing the ridiculous arguments that the widening gap between rich and poor is an unavoidable product of technological advancement. As he points out …

“Professor Anthony Atkinson is a pioneer of the study of the economics of poverty and inequality. His latest work, Inequality: What can be done?, is an uncomfortable affront to our reigning triumphalists. His premise is straightforward: inequality is not unavoidable, a fact of life like the weather, but the product of conscious human behaviour. The explosion of inequality as a result of intentional policy decisions has been rather spectacular. Take the US, which became steadily more equal from the end of the second world war to the late 1970s. By 2012, the top 1% had more than doubled the share of national income they enjoyed in 1979, and now receive a fifth of gross US income.”

Continue reading


Alan Turing’s Notebook to be Lost?

An interesting read from The Telegraph

http://www.telegraph.co.uk/news/worldnews/northamerica/usa/11530156/How-Alan-Turings-secret-notebook-could-disappear-forever.html

“He was one of Britain’s finest minds, the father of modern computing, but his troubled life and classified wartime work meant his legacy was neglected for decades. As the world finally recognises his brilliance, Rob Crilly reports on how a hidden manuscript is to go on auction amid fears it may be snapped up by a private collector.”

Isn’t Capitalism wonderful?