Archive for the ‘Technology’ Category

A level playing field for open source?

May 28, 2008

A few weeks ago I mentioned a conference on the issues surrounding the procurement of open source software, hosted at the University of Oxford’s OSS Watch service. I was there to write a report on the main events of the day, so I thought you might be interested to know that it’s just been published.

For those of you who just want the edited highlights: the key question was whether or not open source software solutions get a fair shout when procurement managers (particularly in the public sector) start to think about bringing in new systems or upgrading existing ones (they don’t!).

For me, the most thought-provoking comment came from Boris Devouge of Red Hat, who argued that the most important question anyone should ask about a new system is whether or not it supports open standards.

Boris said: “One of the very first questions when using public money should be: ‘Are you using open standards? Is my data safe?’ You need to know that [with] the solution you are advocating now, in ten years’ time it’s not going to cost forty times as much to migrate the data somewhere.”

By this he means that if you’re bringing in a new system you need to make sure that you will be able to take your data out and ‘migrate’ it to another system (if you so wish) easily and at minimal cost. This is not necessarily about open source software per se: you can have closed source software that adheres to open standards for data exchange, and you can have standards that describe themselves as open when they’re not really very open at all. If that sounds confusing, don’t worry. The important thing is to focus on the data and how easily you can transfer it to other systems. I think this is going to be one of the big issues over the next few years, as ordinary people start to feel the effects of being ‘locked in’ to things like the everyday Web services they use.
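
To make the data question concrete, here’s a minimal sketch (my own illustration, with invented records and field names) of what an ‘exit strategy’ looks like in practice: if a system can dump its records into open, well-documented formats such as CSV and JSON, whatever replaces it can read them without anyone paying for the privilege.

    import csv
    import json

    # Imaginary records pulled out of the old system's database.
    records = [
        {"id": 1, "title": "Annual report", "owner": "Finance"},
        {"id": 2, "title": "Staff handbook", "owner": "HR"},
    ]

    # Dump the same data to two open, well-documented formats; either
    # can be read by any future system, with no licence fee attached.
    with open("export.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["id", "title", "owner"])
        writer.writeheader()
        writer.writerows(records)

    with open("export.json", "w") as f:
        json.dump(records, f, indent=2)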

Here comes the flood – the curator is as important as the creator

May 15, 2008

He once sang ‘Here Comes the Flood’, and he now seems to have taken the message to heart. Former Genesis frontman Peter Gabriel is working on a project called The Filter, which will help people navigate the ever-expanding ocean of online information and digital assets.

In a world offering a bewildering range of choice, Gabriel argues (in a Reuters video clip) that the curator is as important as the creator. What he seems to be suggesting is that sharing our collections, playlists and other selections of digital content can help us find new, interesting and relevant material.

The Filter tool joins a growing band of personalisation services that help people make sense of the huge choice of music, video, films and other media that is now available online thanks to the Long Tail. These services track your personal preferences, make sense of your online purchases and keep an eye on the stuff that you browse. In the case of the Filter, the ‘engine’ that drives it is a complex algorithm based on a branch of maths called Bayesian statistics. It works out patterns of interest and makes suggestions for related materials. The real power will come when these mathematical pattern profiles can be shared through social networking websites.
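
The Filter’s actual algorithm is private, so purely by way of illustration, here’s a toy naive-Bayes sketch in Python (all tracks and tags are invented): it estimates how likely a listener is to enjoy a candidate track from the tags on tracks they’ve previously liked or skipped.

    from collections import Counter

    # Toy listening history: tracks the listener liked or skipped,
    # each described by a set of tags.
    liked = [{"prog", "rock"}, {"ambient", "electronic"}, {"prog", "electronic"}]
    skipped = [{"pop", "dance"}, {"pop", "rock"}]

    def tag_counts(tracks):
        counts = Counter()
        for tags in tracks:
            counts.update(tags)
        return counts

    liked_counts, skipped_counts = tag_counts(liked), tag_counts(skipped)
    vocab = set(liked_counts) | set(skipped_counts)

    def score(tags):
        # Bayes' rule: priors proportional to how often each outcome
        # occurs, with add-one smoothing so an unfamiliar tag doesn't
        # zero out the whole estimate.
        p_like, p_skip = len(liked), len(skipped)
        for tag in tags & vocab:
            p_like *= (liked_counts[tag] + 1) / (len(liked) + 2)
            p_skip *= (skipped_counts[tag] + 1) / (len(skipped) + 2)
        return p_like / (p_like + p_skip)

    # Rank candidate tracks by the probability the listener likes them.
    candidates = {"new single": {"prog", "ambient"}, "chart hit": {"pop", "dance"}}
    for name in sorted(candidates, key=lambda n: -score(candidates[n])):
        print(name, round(score(candidates[name]), 2))

Scale those tag sets up to millions of listeners, and let friends swap the resulting preference profiles, and you get something like what Gabriel seems to be describing.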

A public launch is promised next month, although I hear that his server was stolen over the recent bank holiday.

The preaSOAic era

April 18, 2008

I came across a new computer-related term the other day: the “preaSOAic” era. SOA stands for Service Oriented Architecture; it and Enterprise Architecture (EA) are the two hottest buzzwords in the business computing world.

The SOA ideology envisages recasting the myriad software applications of a company or public sector institution into a series of services that are open to each other via the Web and have formalised methods for exchanging messages and data. By turning software applications into services, all the different business processes and databases of an institution should be able to co-operate merrily with each other. It is hoped that this will avoid the usual situation that most companies find themselves in, where there are many applications spread across dozens of departments, all with their own databases, most of which are extremely reluctant to talk to each other or use each other’s data. In this “preaSOAic” era there is the potential for isolated pockets of information (the infamous ‘data silos’), massive data duplication and general muddle. It is generally portrayed as a period when large amounts of staff time are spent simply taking data from one computer application and [manually] entering it into another.
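
As a purely illustrative sketch (the ‘staff records’ data and URL layout are my own invention, not anything from the SOA literature), here’s roughly what ‘turning an application into a service’ means: a departmental system exposes its records over HTTP in a standard format, so other systems can query it instead of re-keying the data.

    import json
    from http.server import BaseHTTPRequestHandler, HTTPServer

    # A toy 'staff records' application. Pre-SOA, this data would sit
    # in a departmental database that nothing else could talk to.
    STAFF = {"1001": {"name": "A. Turing", "dept": "Computing"}}

    class StaffService(BaseHTTPRequestHandler):
        def do_GET(self):
            # Expose records at URLs of the form /staff/1001
            parts = self.path.strip("/").split("/")
            record = STAFF.get(parts[1]) if len(parts) == 2 and parts[0] == "staff" else None
            if record is None:
                self.send_error(404)
                return
            body = json.dumps(record).encode("utf-8")
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        # Any other system can now fetch http://localhost:8000/staff/1001
        # rather than re-keying the data by hand.
        HTTPServer(("localhost", 8000), StaffService).serve_forever()

In the SOA world proper the contract would be pinned down far more formally (WSDL descriptions, SOAP messages and so on), but the principle is the same: a published interface and a standard data format rather than a locked cupboard.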

SOA is potentially a huge paradigm shift for an organisation, not only for the computer development team but also for the business processes that link departments and functions. The recognition of the potential for large-scale ‘reordering’ of the way information is handled within an organisation has led to increasing interest in the second concept: Enterprise Architecture. This involves a formal process of analysing and articulating a company’s fundamental organising business logic (i.e. what it actually does on a day-to-day basis) and activities, and then working out how the ICT infrastructure should model this. Frankly, it’s big brain stuff, but research by Harvard Business School suggests that organisations that get it right can lower their ICT costs and be more effective and efficient in their day-to-day activities.

The commercial world has been pretty heavily engaged with this over the last few years and the education community is now starting to take notice. JISC is starting to articulate the ideas of SOA and EA to its community of higher and further education institutions and has started to fund a series of pilots. As part of this work, I’ve been commissioned to help out by providing technical reporting and editorial support for these activities, and I’m off to Glasgow next week to learn more at The Open Group’s annual Enterprise Architecture Practitioners conference. As you’ve probably gathered by now, this is all ‘adult material’ and so I’ll probably require some light relief: I’ll be on the hunt for a vegetarian haggis or two and perhaps a wee dram.

Vikings predicted our renewable future

April 16, 2008

There was an interesting piece in the FT yesterday about the potential for tidal power to be used to generate renewable energy in the Orkney islands. Scientists estimate that the Pentland Firth, the strip of ocean which separates the islands from the mainland, could supply a whopping 10% of the energy needs of the whole of the UK.

As a technologist with a deep interest in environmental issues, I’ve always thought it plain daft that sea-bound UK is not storming ahead with wave and tidal power systems. It’s good to see that there are trials going on around Orkney and that £15m in grants has been ploughed into exploring the practical realities, but this seems peanuts compared to what’s being invested in other energy sources.

It seems to me that the ancient Vikings actually had the right idea for where the future of the islands lay – according to the article in the FT, the Icelandic meaning of Orkney is “energy islands”!

The Future of Libraries

April 11, 2008

The Web is having a profound impact on the role and function of libraries. This goes way beyond ‘the demise of the book’, which is, quite frankly, a very simplistic way of looking at things. It’s actually more about having a vision for the future and how you realise that vision. For example, one of the problems facing librarians is how to create high quality ‘digital objects’, as they are called. If you think about a book, you might judge its quality in terms of the jacket design or the type of paper used or whether or not you can see guillotine marks on the edge of the pages. You probably wouldn’t think about some of the very obvious quality factors unless they were missing. If you opened a book and, say, the pictures were missing or all the pages were in the wrong order, you’d probably want your money back.

The problem for librarians is that when you are creating things like e-books, you have to think about a different set of ‘quality’ criteria, because these digital objects will not be used in the same way that physical books are. They will need to be designed so that they can be searched, for example, or delivered as separate pages. For the average library user, accessing information that spans multiple digital sources is increasingly a messy process, and for those who are used to search tools such as Google and Yahoo this new and highly fluid environment can be a considerable barrier to accessing information from digital libraries and online collections. What is concerning is that, unless we are careful, people will increasingly see the search results thrown up by Google, Yahoo and the like as the be-all and end-all of a particular area of interest or subject. There is no doubt that the library and information community recognises this problem.

One of the ways of helping to ease these problems is covered in a technical report just published by JISC Technology and Standards Watch, for which I am the technical editor. The report is by Richard Gartner, the man who brought the Internet into Oxford University’s Bodleian Library, who argues that rectifying this problem requires accepting the importance (and standardisation) of what’s called metadata.

Metadata is information about the information contained within a digital object. It can range from a simple tag which says who the author is to a complex layer of additional information about digital rights (who is allowed to access the object, or how much you might have to pay). There are different ways of approaching this problem – the more sober Digital Library is being usurped a little at the moment by the ‘hipper’ Library 2.0 – but it’s a hot topic, and even though it’s a technical subject, the report should be quite readable for a tech-curious audience.
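
By way of a concrete (and entirely invented) example, here’s a minimal record built in Python using Dublin Core, one of the simpler real metadata standards; the schemes discussed in the report are richer, but the principle is the same: machine-readable facts that travel with the digital object.

    import xml.etree.ElementTree as ET

    # Dublin Core's 'elements' namespace; the element names below
    # (title, creator, date, rights) are part of that standard.
    DC = "http://purl.org/dc/elements/1.1/"
    ET.register_namespace("dc", DC)

    record = ET.Element("record")
    for element, value in [
        ("title", "A Christmas Carol"),
        ("creator", "Dickens, Charles"),
        ("date", "1843"),
        ("rights", "Public domain"),
    ]:
        ET.SubElement(record, "{%s}%s" % (DC, element)).text = value

    # A search tool can now find the book by author or date without
    # ever opening the scanned pages themselves.
    print(ET.tostring(record, encoding="unicode"))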

This is part of an ongoing debate about the future of libraries, and it will be one of the key themes of JISC’s annual conference in Birmingham next week, which I’ll be attending for TechWatch.

World’s first computer animation?

April 10, 2008

I was at a computer conference the other day where this YouTube clip was shown. It shows “The Kitte”, a 1967 animation by the Russian Nikolai Konstantinov, and is described as the “first animated sequence using a computer”.

What’s interesting is that it is not quite clear how this animation was generated. It was posted by UnFathomable42, who says that as far as he is aware it was entirely generated by computer. However, if you read the comments that follow the video, several people argue that what actually happened is that the computer printed out a series of pictures of the cat onto paper, and these were then animated in the traditional fashion by filming each printout to form each frame.

See what you think. It’s not quite Roobarb and Custard, but the cat walking along is fairly impressive. But is it genuine? I’d love to know more about this clip’s history if anyone has further information.

Twitter – time for a killer app

March 25, 2008

The technology du jour seems to be Twitter, the increasingly popular micro-blogging service that allows you to post bite-sized online updates on what you are doing at any given moment. These 140-character texts are then circulated to groups of registered friends or, if you choose, placed on public display. There’s a YouTube video that provides a basic introduction.
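
For the technically curious, posting a tweet is about as simple as Web programming gets. Here’s a rough Python sketch of how the service’s public API works at the time of writing – a single ‘status’ parameter POSTed over HTTP Basic authentication – with placeholder account details; check Twitter’s own documentation before relying on any of this.

    import urllib.parse
    import urllib.request

    USER, PASSWORD = "example_user", "example_password"  # placeholders

    def tweet(message):
        # The service enforces a 140-character limit, so trim to fit.
        data = urllib.parse.urlencode({"status": message[:140]}).encode("utf-8")
        passwords = urllib.request.HTTPPasswordMgrWithDefaultRealm()
        passwords.add_password(None, "http://twitter.com/", USER, PASSWORD)
        opener = urllib.request.build_opener(
            urllib.request.HTTPBasicAuthHandler(passwords))
        # POSTing the form data publishes the update to your timeline.
        return opener.open("http://twitter.com/statuses/update.json", data)

    tweet("In the tea shop enjoying lemon drizzle cake")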

Twitter was originally envisaged as a tool for exchanging simple messages (or “tweets”) with friends about what you’re doing at any particular moment – “In the tea shop enjoying lemon drizzle cake” – but it seems to be morphing into more of a conversational tool which supports highly fluid, spontaneously forming online discussions. Those who love this new form of communication – the “twitterati” – seem to be revelling in it. The TweetVolume tool even lets you gauge what people are particularly interested in at any one time (try entering Obama and Clinton).

If you think about what it actually does, Twitter and services like it (such as Pownce) provide a kind of device-agnostic form of paging. But is there a killer app for Twitter, beyond facilitating conversations? LunchoverIP has some material on how traffic news is being streamed through Twitter in St. Louis, and Howard Rheingold has a page of links and news items, including information on how protesters use it for co-ordinating meetings, but none of these really fits the bill.

An alternative view is provided by David Tebbutt of Information World Review who recently wrote: “If ego-driven, time wasting blog postings are being shrunk and shifted to Twitter, then what’s left ought to be a better, more thoughtful, blogosphere.”

Ouch…

Or should I say:

Squawk!

Why is the UK so bad at using open source software?

March 19, 2008

As the economy suffers and tax revenues start to fall, bearing down on spending within the public sector is becoming increasingly important. As just one example, the UK Government is looking for half a billion pounds of savings in the education sector’s total procurement costs. One would’ve thought, then, that open source software solutions such as Linux and OpenOffice, which have no licence fees associated with them, would be seeing an increase in take-up.

Apparently not. At the Risk Management in Open Source Procurement conference in Oxford yesterday, speaker after speaker gave examples of other European countries with large-scale, public sector, open source procurement strategies. Notable examples included a 120,000-desktop Linux installation in schools across Macedonia and the outfitting of the French Parliament with open source desktop systems. But in the UK, we’re still lagging behind.

There are several reasons for this, but one of the most important is the number of barriers present in the procurement process. It seems that open source software suppliers are not being offered a level playing field when it comes to the bureaucratic procedures and checklists involved in making procurement decisions within public sector bodies. A high-profile example involves Becta, the schools’ technology agency, and its recent decision not to include the popular open source package Moodle as a potential e-learning platform.

The good news is that, judging from the level of interest at the conference, there seems to be a growing willingness on the part of the public sector to work on this, alongside moves amongst open source developers to work together through consortia.

If this is something you’re interested in, watch this space. I’ve been commissioned to write up the main findings of the conference (in an interesting way!) so there will be more coming out on this in a few weeks’ time.

Babel TV – a set-top box or a Linux PC?

March 13, 2008

The announcements of ultra-cheap, Linux-based PCs, which I wrote about last week, reminded me of Peter Dawe’s Babel TV, which was launched back in November. This is also based on Linux and combines a computer and Internet access device, running common open source software tools such as OpenOffice, with a personal video recorder (PVR) and a Freeview TV set-top box.

Dawe is widely respected as a technologist and is credited with being one of the founding fathers of the mainstream Internet in the UK, having set up Pipex, the UK’s first commercial Internet service provider. During a round-table discussion at the 2005 PACT Content Lab conference in Birmingham, Dawe announced that the day was rapidly approaching when a basic PC with Internet access and IPTV facilities could be given away with a box of cornflakes. And he wasn’t joking: he claimed the costs would be recouped either through advertising and sponsorship or through charging for online services, along the lines of the mobile phone handset model (handsets are heavily subsidised by the telecoms networks).

Babel TV is not free (it costs £295) and there’s also a monthly charge for online back-up storage, so, given his comments in Birmingham, one can perhaps see where this might be heading.

Web 2.0 keeps you busy

March 6, 2008

I said at the beginning of the year that social networks and other Web 2.0 activities might start to tail off, partly because of the considerable amount of time involved in keeping everything up to date and tracking all of your friends’ content. Well, it looks like I’ve been backed up by the Liberal Democrat MP Steve Webb, who told the Empowering Citizens symposium the other day:

“Anyone who thinks they can do Web 2.0 in their spare time can forget it. If you go down this avenue be prepared to spend some time on it, or pay someone to spend time on it.”

Sadly, not all of us can get our hands on public funds in order to keep our Facebook accounts spick and span.