Category Archives: Computer Science

‘Things’ Ain’t What They Used to Be

(or “Is ‘Everything’ Going to be OK?”)

A very brief note, this one, along the lines of, “Why do we always over-hype ideas?  Even the good ones?”

So is it the ‘Internet of Things’ (IoT) or the ‘Internet of Everything’ (IoE)?  Or are they different things?  If so, what’s the difference?

Well, we’ve been talking about the IoT for some time now.  And it certainly seems to some that the IoE is just a better-sounding name for it.  Cisco, though, seem to have other ideas.  Here, “Cisco Senior Vice President Rob Soderbery explains how technology transitions like the Internet of Things are enabling the Internet of Everything to revolutionize industries and create value.”  Any idea what that actually means?  Nope, thought not.


The Algorithm of Evolution

(Does nature ‘run programs’?  And, if it does, are the ‘algorithms’ any good?)

So, what is an algorithm?

Algorithm

No, not that; this is a far more fundamental question.  What IS an algorithm?  Animal?  Vegetable?  Mineral?  Ah …

OK, start with something simpler; often the first class discussion on a pure maths degree … What IS a number?  How would you answer that?  Now, if you’ve pointed at something, perhaps even pictured something, you’re wrong.


A Christmas Cracker Algorithm!

’Tis the season to be jolly … and silly.  So here’s a seasonal and jolly silly example of why it’s hard to implement high-level languages efficiently.  Liberties are taken with the hardware/software relationship in some parts of the analogy but, hey, it’s Christmas!

Let’s write an algorithm for pulling a Christmas cracker …

  • Buy a box of crackers and bring it home
  • Take a cracker out of the box
  • Get two people to hold an end each
  • Pull in opposite directions
  • Have fun with what’s inside
  • Clear up the mess

That’s probably going to be enough detail for most people (and more than enough for some).  However, if you’re the one that’s been entrusted with the initial purchase or the child told to do the clearing up, you might want a bit more to go on; what’s actually involved in that bit?  And who are the ‘two people’ anyway?  OK then, if needed, we can easily expand this a touch …
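The steps above can be sketched as a (deliberately silly) high-level routine.  All the names, the data structures and the tie-breaking logic here are invented purely for illustration — the point is only that each one-line step hides detail that a fuller ‘expansion’ would have to spell out:

```python
def pull_cracker(box, person_a, person_b):
    """One round of the cracker 'algorithm' (illustrative names only)."""
    # Take a cracker out of the box
    cracker = box.pop()
    # Get two people to hold an end each, then pull in opposite directions;
    # assume (arbitrarily) that person_a ends up with the larger half
    winner = person_a
    # Have fun with what's inside
    contents = cracker["contents"]
    # Clear up the mess
    mess_cleared = True
    return winner, contents, mess_cleared

# Buy a box of crackers and bring it home
box = [{"contents": ["paper hat", "joke", "plastic toy"]} for _ in range(6)]
winner, contents, cleared = pull_cracker(box, "Alice", "Bob")
```

Even this ‘expanded’ version quietly glosses over who wins the pull and what clearing up actually involves — exactly the sort of detail the full analogy has to unpack.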



“To the Finish” … and Beyond?

Can we continue to make computing devices smaller and/or faster?  Can we do this without limit?  If so, how?  What’s the next generation?

Microchip designers use a wonderful armoury of terminology, most of it (deliberately, one suspects) impenetrable to outsiders.  However, one of the – on the surface at least – least alarming, and certainly most charming, is the phrase “To the finish”.  It’s an intriguing term and behind it is the spirit of an admirable intention.  The only problem is no one really seems to know exactly what it means.

“To the finish”, in its broadest sense, is some mythological-technological future in which logic circuits have shrunk to such an extent that individual components are measured on the atomic scale.  On one level, although in nominally different research fields, this is comparable to the “intelligent dust” predictions of the most enthusiastic Internet of Things proponents.