Computer Science Writer of the Year

April 19, 2007

On Friday I had a rather exciting email from the Engineering and Physical Sciences Research Council. For those unfamiliar with the EPSRC, it is one of the seven government-funded bodies that coordinate research and allocate grants to UK universities. Apparently, I have been awarded their Computer Science Writer of the Year prize for a piece I wrote on new developments in computing that help people suffering from dementia.

Obviously, I’m very pleased about this award but, so far, there has been no mention of a prize-giving lunch.

Starter for ten: what would a post-Google search engine look like?

April 16, 2007

Bamber Gascoigne, the original presenter of University Challenge, is back in the news this weekend. Since 1994, when he turned down the opportunity to present the new series of the quiz show, Bamber has been working on a history-based search engine. All has finally been revealed, and Timesearch has just been released.

The tool is an aggregator: it allows users to collate information from various sources (Google, Wikipedia etc.) by selecting different search criteria, such as geographical location and date. I’ve not had a chance to play with it at length, but there does seem to be a large number of options.

The tool is billed on their website as an “early example of the post-Google generation of online search tools, capable of being more finely tuned to the individual needs of the user.” This may be the beginning of a new trend, as people look for more focused ways of finding information. And who better to trust with building the next generation of Web search engines than the man who knew the answers to all of University Challenge’s ‘starter for ten’ questions?

Did Morse invent the mobile phone?

April 5, 2007

iPods, mobile phones and BlackBerrys are all symbols of our modern era. Or are they? Not according to the Museum of Lost Interaction, a collection of recently found technologies from the earlier part of the last century. The museum, which is based at the University of Dundee, features classic artefacts like the 1952 Zenith Radio Hat (a combined trilby and walking cane) and the 1900 Richophone, a multi-player role-playing game based on a series of hotel telephone booths. My favourite is the mobile Morse Code device.

At first I thought this was an elaborate April Fool, but it turns out to be genuine archaeological work by staff and students in the computing and design departments at the University. Or is it?

Happy Easter.

What cost, software innovation?

April 4, 2007

Much fun has been had of late by the anti-Microsoft brigade as the new $9 billion Windows operating system release, Vista, has hit the desktop. There have been concerns about the cost of the UK licence, worries about general uptake within the business community, and reports of incompatibilities with software drivers.

A particular point being raised a great deal is that Vista requires some serious ‘beef’ when it comes to hardware, and many users will need to upgrade. Indeed, environmental campaigners have flagged the potentially unnecessary dumping of old computers (old as in last year’s) as an issue.

What’s the response from Microsoft? Well, Andrew Herbert, speaking at last week’s jubilee event (see previous blog entries), made an interesting point. He said that new software and operating systems are planned so that they can still be in distribution in six or so years’ time, which forces system designers to think carefully about what future hardware will be capable of. This is why new operating systems are often quite ‘clunky’ when first released: they are pushing the technical limits of current PCs (processor, memory etc.) in the knowledge that Moore’s Law (of ever-increasing computer power) will deliver the goods in a few months’ or years’ time.
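
To put some rough numbers on that reasoning, here’s a back-of-the-envelope sketch (my own illustration in Python, not figures from the talk), assuming computing power doubles every 18 to 24 months:

```python
# Rough illustration only: how much extra 'beef' does Moore's Law promise
# over a six-year operating system lifecycle?

def moores_law_multiple(years: float, doubling_months: float) -> float:
    """Approximate growth in computing power after `years`,
    assuming one doubling every `doubling_months` months."""
    return 2 ** (years * 12 / doubling_months)

for doubling_months in (18, 24):
    multiple = moores_law_multiple(6, doubling_months)
    print(f"Doubling every {doubling_months} months: roughly {multiple:.0f}x over six years")

# Doubling every 18 months: roughly 16x over six years
# Doubling every 24 months: roughly 8x over six years
```

In other words, an operating system that strains today’s hardware can expect the machines it runs on to be somewhere between eight and sixteen times more capable by the end of its planned lifetime.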

I’m sure this is scientifically and technically true, and it’s a view that fits with the history of the personal computer. But I think it raises a question: after thirty years, should the computer industry continue to prioritise software innovation over making better use of previous generations of hardware? The question is all the more pressing as more and more applications are delivered as services over the Web. It sets operating system designers a challenge: can they design more backwards-compatible systems that work really well on new kit but are still adequate on older machines?

Hexadecimal Beer

April 3, 2007

On a lighter note, I should just mention the lunch at the Leeds University jubilee and give credit to the caterers. The starter and main courses were very good, but the real praise has to be reserved for the dessert, a rather fine crème brûlée. Apart from the fact that crème brûlée is a favourite of mine, this was a truly magnificent specimen. It was accompanied by a tangy rhubarb compote sitting in its own little chocolate basket, three shortbread biscuits and a six-inch white chocolate straw. Obviously there was some debate on my table as to the exact etiquette surrounding the use of a chocolate straw at a formal lunch but, since the Vidal Pinot Noir 2005 had been flowing liberally during the previous two courses, we decided to forgo any unnecessary concern over formalities and either tucked or sucked in.

Computer scientists can be a competitive lot in their own little way, and the department was not to be outdone by some of Gordon Ramsay’s little helpers. In the late afternoon we were offered a free (yes – for once free as in beer, not as in software) bar and a lavish buffet. The good burghers of Leeds had stumped up for the brewing of an exclusive range of real ales, created by Elland Brewery and named after the different departmental mainframe computers: Eldon, Lucifer, Amdahl and KDF-9. They also provided take-home bottles of the premium beer, Lucifer 0x32 (see photo). Each bottle had a unique identifier from the hexadecimal numbering system (mine was 0xC5 of 0x3FF). I could’ve wept tears of joy into my wispy beard (if I still had one)…
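
For anyone who fancies checking the label arithmetic, here’s a tiny sketch (Python, purely for illustration) of what the hexadecimal numbering works out to in everyday decimal:

```python
# The bottle labels are ordinary counting, just written in base 16.
bottle, run = 0xC5, 0x3FF                # Python reads hex literals directly
print(bottle, "of", run)                 # 197 of 1023 in decimal
print(f"0x{bottle:X} of 0x{run:X}")      # and back again: 0xC5 of 0x3FF
```

So mine was bottle 197 of a run of 1,023.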

Director of Microsoft Research, Cambridge, gets his revenge

April 2, 2007

At Leeds on Friday (see yesterday’s entry for more background) the main keynote speech was delivered by Dr Andrew Herbert, Director of Microsoft Research, Cambridge, who was a student at Leeds in the early 1970s. His talk was entitled “Why everything I learned at Leeds in 1972 is no longer true”, and he declared it to be, in part, an opportunity to have his revenge on his lecturers of thirty-five years ago. Continuing this blog’s interest in all things lunch-oriented, Andrew noted that revenge is a dish best eaten cold, and that in this case, given the thirty-five-year wait, it was “glacial”.

His main point was that there have been enormous technological changes, and that much of what was being taught in the ’70s was focused on “overcoming the limitation of the machine”, e.g. the speed of the processor, the size of memory, poor-quality displays, the use of punch cards and so on. Some of Dr Herbert’s points were more obvious. First, the day of the stand-alone computer is over: we live in the age of the networked computer. Second, software is much more complex than it used to be, so testing code by simple ‘desk’ review (people who work in publishing can think of this as a bit like proofreading) is no longer sufficient. Third, we need new ways of organising and visualising the large amounts of information on our systems – hierarchical file systems have had their day.

Other things were more complex. For example, the questions affecting Artificial Intelligence research have changed – the issue now is how far we are prepared to put our faith in computer algorithms that demonstrate ‘intelligence’. What if computers can’t explain WHY they’ve made a decision?

He also made the point that we are still in the process of defining the subject of computer science, and that there is a need to make it clear to people today that computing is a definite discipline, not just a group of techies playing around with gadgets. He said: “we have some way to go in persuading other disciplines that we have a theoretical underpinning”. This was very interesting to me, as I feel quite strongly that we need to keep the ‘science’ in computer science. By that I mean we need to hold on to the importance of ideas and the spirit of exploration and investigation, rather than letting the subject turn into a purely vocational course of study.

Mike Wells and the JANET April Fool

April 1, 2007

I was up in Leeds on Friday, helping to celebrate 50 years of computing at my old university. The keynote speech was delivered by Dr Andrew Herbert, a Leeds alumnus (1975) and now Director of Microsoft Research in Cambridge (UK). He mentioned one of his lecturers, Mike Wells, and his views on networks. Back in the early 1970s Mike was adamant that stand-alone computers would not remain stand-alone for much longer. Apparently, during one of his lectures, Dr Wells had revealed: “there’s this thing called ARPANET in the United States which could be interesting”. ARPANET was, of course, the forerunner of the Internet.

In another of Friday’s talks, Dave Holdsworth, an ex-member of staff, spoke on the history of computing at the university. He mentioned that by the mid seventies a diverse and pretty incoherent collection of networks had sprung up between self-selecting groups of universities and research agencies. In 1975, Professor Wells was instrumental in producing what has become known as the Wells Report, which led to the creation of JANET (the Joint Academic NETwork). This backbone network successfully linked the growing jumble of university inter-networks into one powerful national system. This was pioneering work in those days and, as I have already outlined in an earlier blog entry about Tom Loosemore, it provided a skeleton for the later development of the public Internet in the UK.

Professor Mike Wells was therefore not only a lecturer with an early grip on the importance of linking computers together, but also a leading figure both in the university computing service and on the national networking scene. And since JANET was formally launched 23 years ago, on 1st April 1984, it seems fair to say that he probably also had a rather wry sense of humour.

A modelling assignment in Birmingham

March 27, 2007

I spent an interesting day last week in Birmingham (or Brum, as it is affectionately known) at a workshop on Business Process Modelling given by Balbir Barn of Thames Valley University. Birmingham has certainly changed since my youth there, and the workshop was held in the completely redeveloped Brindley Place area of the city. This used to be a decaying network of stinking canals, collapsing Victorian warehouses and rat-infested walkways. It is now home to flash hotels, bars, offices and a series of conference venues, including Austin Court, where the workshop took place. Lunch included chocolate-covered strawberries, which were extremely tasty, although probably ethically dubious given that it is March.

The workshop itself covered Business Process Modelling Notation (BPMN), a diagrammatic notation for representing workflows in the business environment. This probably sounds fairly dull, but it is quite interesting in that it was aimed at higher education and is a sign that universities are becoming more aware of the methods used in the commercial sector. Towards the end of the day, one subject of debate was the likely uptake and impact of such workflow tools in higher education settings. It can certainly be argued that there are parts of the university system that are akin to the bureaucratic functions of a business (HR, payroll, student registration, course validation). But what of less traditional areas like library repositories or e-learning systems? Delegates were certainly interested in debating the potential return on investment for groups of developers within the education community who have spent time learning and mastering these kinds of workflow tools.

Will Freeview be able to provide High Definition TV to the public?

March 23, 2007

Wednesday was budget day. Thankfully lunch was not taxed, but one little-noticed item could cause serious debate amongst technology types.

Digital TV delivered by the Freeview system makes use of a portion of the radio spectrum. With the switchover from analogue to digital TV there is an opportunity to re-jig the way the radio spectrum is used and, in the process, release some spare capacity. This spare spectrum is known in technology circles as the ‘digital dividend’.

So why is this important? Well, buried on page 151 of the Budget Report, the Government notes that, through its agency Ofcom, it is consulting on a proposal that the “spectrum released by switchover should be auctioned on an open basis during 2008-9”.

Herein lies the rub. Note the word ‘open’ in the Budget report. The Government is suggesting that this ‘digital dividend’ could be auctioned off in a process similar to the radio spectrum auction that took place a few years ago, when the Government made billions auctioning spectrum to phone companies for 3G mobile phone capacity. Yet it is this ‘spare’ capacity that is partly needed if Freeview is going to be able to deliver High Definition TV as a free-to-view service.

Such an auction might end up with prices that no public sector broadcaster could compete with, and would therefore effectively freeze Freeview out of the next generation of spectrum capacity. This could be a problem for the millions of people who, anticipating the great switchover, have invested in a nice, shiny, new HD-ready TV. If an open sell-off happens, there’s a good chance that they won’t be able to get HD TV pictures over free-to-view services.

Second Life and the 3-D Web

March 21, 2007

I spent yesterday afternoon at the eBusiness Expo 2007, which was held on my home turf of Nottingham. Lunch was a buy-your-own-sandwiches affair, which I’m afraid doesn’t win any prizes for originality. The main talk of the afternoon was from Danny Meadows-Klue of the Digital Training Academy, on how to use Web 2.0 technologies for marketing purposes. On the whole, the talk was a fairly reasonable trot through the different areas (social networking, RSS, podcasting, mash-ups etc.).

There was one point, though, where Danny and I parted company. He doesn’t think Second Life is worth much attention, and I’m afraid I have to disagree. If you look at what is likely to be the next step in the development of the Web, it is advanced 3-D graphics. IBM are investing money in a project to take the visual ideas behind Second Life and transplant them to the Web, and Tim Berners-Lee, speaking at last year’s WWW2006 conference, indicated that he thought advanced graphics were the next stage in the Web’s development. Second Life may only be a ‘dry run’ for a more visually arresting Web, but if you’re interested in where the Web is going, then it is currently the easiest way to explore the implications of 3-D at an early stage.