Archive for the ‘Technology’ Category

Semantic Robots

December 2, 2008

One of the sessions at last week’s European ICT 2008 conference looked at the area of robotics research. It was pointed out that we are starting to see robotic applications move beyond their traditional use in high-end automobile manufacturing (remember the Picasso car advert?). There is a push to put manufacturing robots like these into smaller companies, and a lot of work is also going on for applications in the health and service industries. Of most interest, though, I thought, were discussions about plans to integrate robotics research with the semantic Web to deliver knowledge-based robotics.

You can read a bit more about these semantic robots on the JISC TechWatch blog, which I will be contributing to over the coming months.

ICT 2008

November 26, 2008

Red Pizza Man Statue, Lyon

You know you’ve arrived at ICT 2008, the EU’s biggest research technology conference, when you round a corner and narrowly miss being knocked off your feet by a suited and booted Segway rider. There are further techie references as you cross the courtyard of the Centre de Congrès in Lyon. The thirty-foot-high plastic pizza delivery boy, for example, complete with plastic scooter, proffers an appropriately scaled pizza box. This is either something to do with the contemporary arts building next door, or a municipal French tribute to that staple of the late-night, carbohydrate-fuelled code-fest.

I’m here as part of my work for JISC TechWatch. I’ll be blogging some of the more technical stuff at TechWatch’s newly launched blog – Notes from the Future – but for the foodie stuff, stay tuned. There’s more to come on cakes, caffeine, and the comedy of European manners.

Philips launches electronic pill

November 14, 2008
Philips Research’s intelligent pill (iPill) for electronically controlled drug delivery, measuring 11 x 26 mm

Nowadays it seems just about anything can have an ‘i’ added as a prefix, and now Philips have added their two penn’orth, this time in healthcare.

The company has announced that it will launch its ‘iPill’ next week at a science conference. This is an electronic pill that can pass through the digestive system, gauging its position by measuring the acidity of its surroundings and releasing medicine in a programmable pattern as required. The company envisages its use both as a drug research tool and as a therapeutic device.

This would also seem to have wide-ranging application in the software development industry, say for the intravenous delivery of pizza, but it did cause me a small problem with this blog: I wasn’t quite sure whether to file the story under the ‘technology’ or the ‘lunch’ category.

Open Source CRM

November 6, 2008

Many people equate Customer Relationship Management (CRM) software with the hard-nosed world of sales and marketing. However, setting up and maintaining ongoing relationships with people, both within an organisation and outside it, is something we all need to do, whether we work in the public or the private sector.

I was recently commissioned by Oxford University’s open source advisory service (OSS Watch) to produce a piece about their experience of introducing a CRM package into their daily work. They are a non-profit service working in the education sector, and the word ‘sales’ is pretty much anathema to them. So their experience of using CRM was not straightforward, and they were not afraid to say so. The piece has plenty of lessons learned for those considering a similar undertaking in the public sector.

Open Source Oxford

October 23, 2008

I’ve just spent an enjoyable couple of days in Oxford at the Community and Open Source development workshop. A diverse mixture of researchers, software developers and open source experts gathered to debate how to build the all-important community of users and developers that drives successful open source projects. Many people think that open source is just about developing some code, sticking an open source licence on it and posting the lot to your website. Unfortunately, successful projects – those with long-term sustainability – need far more nurturing than this. The workshop explored what that nurturing actually entails and how to go about it.

My role was to act as the technical reporter, and there will be a full report on the workshop in due course. In the meantime, you can view some of the slides at Ross Gardler’s SlideShare space.

My other mission was to interview Gianugo Rabellino, CEO of Sourcesense, a leading European open source services company. His company made headlines a few months ago when it agreed to partner with Microsoft on an open source file reader for the controversial OOXML office document format. It was an extremely interesting interview, and the results will feature in a couple of pieces I’ve been commissioned to produce in the near future. In the meantime, I can reveal that Gianugo trained as a lawyer, and that his mother was somewhat dismayed when he told her he was abandoning a highly lucrative legal career to, as she understood it, “give software away for free”.

Open Office for the Mac

October 14, 2008

Yesterday Open Office 3.0 was officially launched. This is a major new release of the free, open source office productivity suite that provides functions such as word processing – territory that has historically been pretty much the sole preserve of Microsoft’s Office package. Demand for the new software has been so strong that the Open Office website has been down for the last day or so.

I was excited at first, as the release promises a native version for Mac OS X. In the past you could only run Open Office on the Mac through an X11 window (which basically meant downloading and running additional software, and slowed things down considerably).

However, I gather that the new version has been built only for the newer, Intel-based Macs and, unfortunately, we are still using PowerPC systems in our office. To my knowledge this is the first major piece of software that has had this restriction. Perhaps the day has come for an office re-fit.

G1 spotted in UK

September 29, 2008
Google G1 phone snapped in London

Whilst enjoying a quick coffee in a public place in central London the other day, I noticed that the chap next to me was sporting a Google jacket and playing with a familiar-looking black mobile phone with a slide-out keypad.

Putting two and two together, I realised that this was possibly the Google G1 phone, launched in the USA last week and subject to much press excitement over the last few days. This was confirmed when I rudely interrupted his surfing. It turned out that he was an employee of ‘the big G’ and the phone was an early prototype that had been issued to staff for testing in the UK. He kindly let me have a very quick play and snap a photo.

I was quite impressed. The screen is bright, clear and easy to read, and the slide-out keypad keys were reasonably functional for one-fingered typing. Movement around the screen is via a tiny trackball rather than the touch-screen approach that Apple’s iPhone employs. The phone has handy red and green keys (rather than touch-screen pads) for starting and ending a call, and I liked the way you can hit the ‘home’ key at any point to get back to the main menus. I thought it displayed the Web well, although I only had a few seconds of experimentation. On the downside, it seemed a little heavy, certainly in comparison to my standard-issue Nokia.

Although it’s early days, it is probably fair to say the phone doesn’t stand up that well when directly compared with the iPhone, especially on the design side. But as the guy said, the real issue is the open software platform, Android, which runs the phone. Google are hoping that thousands of developers will see the opportunity and get busy coding up a wondrous array of applications for the phone and its subsequent versions.

As ever in this industry the market will decide. But this chance encounter perked up the start of my day.

Vint Cerf in London

September 25, 2008

Vint Cerf, often described as the ‘father’ of the Internet, was the keynote speaker at the Visions of Computer Science conference, which I attended yesterday. Although he doesn’t like this moniker (it implies he did it single-handedly, and he’s always keen to stress that he was part of a team), the reason for it is that he co-invented the basic protocol of the Net (TCP/IP) and was there in the early days of the ARPANET, the forerunner of today’s Internet. He is now employed as Google’s Chief Internet Evangelist.

Vint pointed out the enormous growth of the Internet, remarking that there are now half a million computer servers (i.e. hosts that provide some kind of service such as Web or email routing) on the system and a couple of billion ‘terminators’ at the ‘edge’ – the end user devices such as a home PC or a mobile phone.

This enormous growth presents huge challenges, and he argued that the next few months are likely to be “dramatic” in the world of the Internet. He then went on to elaborate on some of the issues that are coming to the fore, including the problem of network addressing.

Network addressing currently uses something called IPv4. This is the coded address given to every single device on the network (even your home PC). When he was helping to create the original designs for the Internet, he specified that this address should use 32 bits of data, which limits the number of devices that can be on the Net to around 4 billion (2 to the power of 32). He admits that at the time he didn’t think this limit would ever be reached, but we are fast approaching it; Vint speculated that we would hit it by mid-2010, if not before. The answer is a new addressing system called IPv6, which offers vastly more potential addresses. Internet Service Providers (ISPs), network operators and the rest of us need to start moving to IPv6, and he mentioned Google’s efforts in this regard (see ipv6.google.com).
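
For the back-of-the-envelope minded, the arithmetic is easy to check – a quick, purely illustrative bit of Python (nothing specific to Vint’s talk, just the raw numbers):

    # Illustrative address-space arithmetic for IPv4 versus IPv6.
    ipv4_addresses = 2 ** 32     # 32-bit addresses: the limit described above
    ipv6_addresses = 2 ** 128    # 128-bit addresses in IPv6

    print(f"IPv4: {ipv4_addresses:,} addresses")     # 4,294,967,296 (about 4 billion)
    print(f"IPv6: {ipv6_addresses:.2e} addresses")   # roughly 3.4 x 10^38
    print(f"IPv6 offers {ipv6_addresses // ipv4_addresses:,} addresses for every IPv4 one")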

During the question-and-answer session I asked him about the capacity of the existing Internet to cope with heavy data use such as video. There have been many recent reports in the UK press about the Net being close to capacity. Vint agreed that this was an issue, but said he was not overly concerned. The main backbone of the Internet will be fine, since the fibre optics involved have plenty of spare capacity. It is nearer the end user – the ‘last mile’ problem – that there may well be an issue. Vint argued that researchers and Internet companies need to rethink the process of distributing video over the Net, relying less on streaming and more on storing and caching content locally, nearer the actual users. He called this process ‘edge storage’. As I know that Google and Microsoft have been rolling out plans to distribute their data centres nearer to users, I suspect we will hear a lot more about this in the coming months.
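
To make the edge storage idea a little more concrete, here is a minimal, purely hypothetical sketch (in Python) of the principle: serve a clip from a nearby store if it is already there, and only go back to the distant origin server when it isn’t. The names are my own invention, not anything Google or Microsoft have actually described:

    # Hypothetical sketch of 'edge storage': keep copies of popular videos
    # near the viewers so repeat requests never cross the long-haul backbone.
    class EdgeCache:
        def __init__(self, fetch_from_origin):
            self.fetch_from_origin = fetch_from_origin  # pulls a clip from the origin
            self.store = {}                             # clip_id -> bytes held locally

        def get(self, clip_id):
            if clip_id in self.store:                   # hit: served from the edge
                return self.store[clip_id]
            data = self.fetch_from_origin(clip_id)      # miss: one trip to the origin
            self.store[clip_id] = data                  # keep it for the next viewer
            return data

    cache = EdgeCache(fetch_from_origin=lambda clip_id: b"...bytes from far away...")
    cache.get("clip-42")   # fetched across the backbone
    cache.get("clip-42")   # served locally this time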

Liquid lunch 2.0

September 23, 2008
Tipping the iPhone pint

The key thing about anything bearing the ‘2.0’ moniker is that it’s not real. And so it is with the iPhone’s pint of lager.

In order to get the pint you have to play a virtual game of skittles, a viral advert courtesy of a well-known UK lager company. If you score enough points you ‘win’ a virtual pint.

The trouble is, thanks to the quality of the visuals, the pint looks remarkably realistic. And because of the iPhone’s built-in accelerometer, which senses how the handset is being moved and tilted, you can actually nurse your pint and watch it disappear as you tip the iPhone as though it were a glass.
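
I have no idea how the developers actually built it, but the principle is simple enough: read the accelerometer, work out how far the handset is tipped from vertical, and drain the virtual glass once you pass a drinking angle. A purely hypothetical sketch, in Python with made-up numbers:

    import math

    # Hypothetical sketch: work out the tilt of the 'glass' from a raw
    # accelerometer reading (in g) and drain the virtual pint past a threshold.
    def tilt_degrees(ax, ay, az):
        # Angle between the handset's screen normal and straight up.
        return math.degrees(math.atan2(math.hypot(ax, ay), az))

    pint_ml = 568.0                      # a full imperial pint
    ax, ay, az = 0.5, 0.0, 0.86          # pretend sensor sample
    angle = tilt_degrees(ax, ay, az)
    if angle > 30:                       # past 'drinking angle': pour some away
        pint_ml = max(0.0, pint_ml - 25.0)
    print(f"Tilted {angle:.0f} degrees, {pint_ml:.0f} ml left in the glass")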

The idea is very clever, and it does leave you wondering what better use could be made of the bright sparks who came up with it. It goes without saying, however, that as a liquid lunch it leaves much to be desired.

Broadband: there is another way

September 15, 2008

There has been a recent flurry of media interest in the ‘last mile’ broadband problem. This has been an ongoing issue for some time now, but interest has been rekindled by the launch of a report by the Broadband Stakeholder Group – the government’s advisory group on broadband – on the cost of deploying fibre-based, next-generation broadband in the UK.

The problem is how you get very fast broadband Internet speeds down to individual households. Although there are all sorts of high-speed technologies that can deliver very high bandwidth across the Internet and down to your local telephone exchange, the ‘last mile’ – down the street to the front door – remains a major technical issue.

Traditionally, houses have been connected to the telephone exchange through a cabinet, tucked away at the end of the street, that each house links to via a simple copper wire. Although technology has improved, there is still a limit to what can be carried down a metal wire. The new report is all about fibre optics, which can provide astonishingly fast speeds of up to (theoretically) 2.5 Gb/s. The problem, of course, is the cost of digging up the streets to lay the fibre to each house – a cost that balloons rapidly when you take in rural areas. The report estimates a cost of £28 billion to lay fibre directly to every home in the UK, and £5 billion to lay fibre to each street-level cabinet while retaining the existing copper wire for the last dozen yards or so. The latter option is much cheaper but limits speeds to a theoretical maximum of around 100 Mb/s.
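
To put those headline speeds in perspective, here is a rough, purely illustrative calculation of my own: how long a 4 GB high-definition film would take to arrive at each of the quoted (theoretical) rates, with a nominal 8 Mb/s ADSL line thrown in for comparison.

    # Rough illustration: download time for a 4 GB film at various (theoretical) rates.
    film_bits = 4 * 8 * 10**9    # 4 GB expressed in bits

    rates_mbps = {
        "Fibre to the home (2.5 Gb/s)": 2500,
        "Fibre to the cabinet (100 Mb/s)": 100,
        "A good ADSL line today (8 Mb/s)": 8,
    }

    for label, mbps in rates_mbps.items():
        seconds = film_bits / (mbps * 10**6)
        print(f"{label}: roughly {seconds:.0f} seconds")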

There is, though, as they say, another way. The alternative to laying cable is to use the growing range of wireless networking technologies. Most people have come across WiFi, which is widely used by laptops for access to the Internet from hot spots such as railway stations and coffee shops. Less well known is the emerging WiMax standard.

By coincidence, on the day after the broadband report came out, my pals at Third Sector Media alerted me to a technology trial of WiMax taking place just down the road from our office here in Nottingham. Intel are conducting a trial called Forest in an area just north of the city.

The beauty of WiMax is that it can operate from a single mast over ranges measured in kilometres, as opposed to WiFi, whose effective range is measured in metres. It trumps fibre in the sense that there is no need to dig up the streets. According to the WiMax Forum, which oversees its development, the technology can deliver ‘last mile’ broadband at speeds of around 1-5 Mb/s, and a later version of the standard is likely to increase this by a factor of around seven.

The BSG report makes no mention of WiMax – it is purely focused on the costs of fibre – and fibre does offer higher speeds. But technology is constantly changing. All this leaves me with one question: will the massive investment in digging up the streets be undertaken before technology and standards move on again and deliver even higher speeds wirelessly?