Archive for the ‘My Published Work’ Category

Looking back to IWMW 2003

May 9, 2016

I was recently approached by Brian Kelly, who runs UK Web Focus, about writing a blog post on a workshop I ran at the Institutional Web Management Workshop (IWMW) in 2003. This would form part of a series of posts about the event, which has been running for almost twenty years.

After a few days of dredging memories from deep within my brain, the blog post is written and is now published on the IWMW website.

The 2003 workshop was about Web-based research expertise systems and, in particular, work I had been leading at the University of Nottingham on the East Midlands Research Expertise Database (EMRED).

It was a long time ago, and a lot has changed, both in terms of technology and in the way universities now handle their engagements with business. So it’s been a fascinating exercise to pause for a moment and look back.

Web 2.0 at Nottingham Festival of Words

February 12, 2013
Map of the Internet (courtesy of the Opte Project, CC BY 2.5)

The inaugural Nottingham Festival of Words has officially started and is building up to the main events over the weekend of 16th/17th February.

I’ll be speaking on Sunday afternoon, presenting some of the future-facing material from my recent Web 2.0 book and looking ahead to the development of a global brain.

If you are interested in the future of the Internet, Web Science, artificial intelligence and the wisdom of crowds, then why not pop along?

There are still tickets: http://www.nottwords.org.uk/homeIndex.html

First review of the book

September 17, 2012

“Web 2.0 and beyond: Principles and technologies explains Web 2.0 and its wider context in an accessible and engaging style, helping readers, especially beginners, understand every aspect of Web 2.0 without difficulty.”

The first formal review of my new book has been published in the highly respected Internet journal First Monday. The reviewer, Yijun Gao, an Assistant Professor in library and information science, gives a generally very favourable view of the book, particularly emphasising its suitability for undergraduates with little formal academic knowledge of Web 2.0 and social media.

You can read the full review in First Monday’s September issue: http://www.firstmonday.org/htbin/cgiwrap/bin/ojs/index.php/fm/article/view/4227/3314

Nottingham’s new Literary Festival

September 10, 2012

On Wednesday evening I’ll be at Antenna media centre in Nottingham, doing a quick spot at the official launch of the inaugural Nottingham Festival of Words. It’s a taster for my full session, which takes place in February 2013, where I will be talking about the new science of the Web and exploring some of the stuff there wasn’t space for in Web 2.0 and Beyond. Unfortunately I gather that there are no tickets left, so I can’t invite anyone, but I’ll post a summary later this week.

Web 2.0 and Beyond is published

May 18, 2012

A couple of years ago I was approached by an American publisher about the possibility of writing a general reference/textbook that covered Web 2.0 and Social Media. It followed on from the success of a report I wrote for JISC in 2007, which was written for both technical and non-technical readers, and the publishers wanted something similar, but more of it.

Well yesterday a friend rang to ask if I knew that the ‘buy’ link had been activated on Amazon, so I guess I can say that my book, Web 2.0 and Beyond (published by Chapman & Hall/CRC, a computer science imprint of Taylor & Francis), is well and truly published.

The remit was challenging – CRC were developing a new series, aimed at reinventing the textbook format. Their point was that, increasingly, it is students from business studies, economics, law, media studies, psychology etc. who want to understand what CompSci is up to but who don’t necessarily have the deep technical knowledge to really understand how the technology came to be or what the implications of it are. However, as CRC is primarily a computer science imprint they also didn’t want to compromise on the requirements of their primary audience.

I was particularly interested in this idea because studying social media is increasingly becoming an interdisciplinary melting pot. Also, having taught computer science I was keen for students to have a well-rounded sense of the discipline – that they should have a sense of context rather than just learn how to write code. I could also see parallels with Web Science, the study of the Web as the world’s largest and most complex engineered environment (which at the time was only just starting to emerge), and I thought that if ever there was going to be a moment when it was possible to bring all this together in one book, it would be now.

The tricky thing, of course, was getting it all to come together. With the help of some extremely skilful editing, I think what we’ve done is obey three golden rules: only tell readers what they need to know at that point; use narrative techniques that engage readers and allow them to read through the filter of their own discipline; and keep highly specialised information (hard-core technical detail, overviews of research and so on) in separate sections and chapters.

The framework for all of this is the ‘iceberg model’, which tackles Web 2.0 using a layered approach. The premise of the book is that if you understand the iceberg model you will be better equipped to understand how the Web is likely to evolve in the future. There are, of course, a few pointers as to what that might look like.

In the spirit of Web 2.0 there are also various information sources associated with the book. There’s a YouTube channel where I post information about relevant videos, and you can find out about these if you subscribe to the book’s Twitter feed (@web2andbeyond) where I also post other snippets of relevant information that help to keep the book fresh. More detailed information is on the book’s Facebook page (www.facebook.com/web2andbeyond), which also includes notes and excerpts to give a taste of the narrative style of writing I mentioned earlier.

It has been a while in the making and part of me still can’t believe that it’s actually here, but it is, so now all I need is for people to buy it. Hint hint.

Curating Scholar

November 30, 2011

Thanks to a heads up from Brian Kelly, I’ve been having a look at the latest improvements to Google Scholar, a search engine for academic papers that served me well whilst writing the Web 2.0 book. The thing that caught my eye was that the site now allows authors to curate a collection of their papers and calculate the number of citations each one has had.

The citation figures for my authorial output follow the classic ‘long tail’ distribution, in which one or two papers receive a large or moderately large number of citations and the rest each receive a handful. I was pleased (and a little surprised) to see that the Web 2.0 report I wrote for JISC back in 2007 has received almost 600 citations in the intervening years. I knew that the report had consistently been the most downloaded document on the website (over 100,000 downloads in the first three years), but I’d assumed that a lot of this traffic was due to students preparing coursework, particularly as the stats rose during term time. However, it seems researchers have also picked up on some of the ideas, which is rather reassuring, as when I was writing the book I had to fight my corner to get a detailed look at the state of the art in research included. Let’s hope this bodes well for sales.
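For readers unfamiliar with the shape of a ‘long tail’, here is a purely illustrative sketch (the figures are synthetic, not my actual citation counts): drawing counts from a heavy-tailed distribution shows how a handful of items can dominate the total.

```python
import random

random.seed(42)

# Synthetic 'citation counts' drawn from a Pareto (power-law-like)
# distribution, sorted from most- to least-cited. This is the classic
# long-tail shape: a few items with many citations, most with a handful.
papers = sorted((int(random.paretovariate(1.2)) for _ in range(50)),
                reverse=True)

top_five_share = sum(papers[:5]) / sum(papers)
print(f"Top 5 of 50 papers account for {top_five_share:.0%} of all citations")
```

The exponent (1.2 here) is arbitrary; smaller values make the head of the distribution even more dominant.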

Beyond Web 2.0

November 7, 2011

It has been an awfully long time since my last blog posting.

For those who don’t Twitter me, I’ve been writing a book. It’s called Web 2.0 and beyond: principles and technologies and it’s going to be published in May by CRC Press, the computer science imprint of Taylor & Francis.

I should say that it’s not your usual comp. sci. textbook. My brief was to ‘reinvent the textbook format’ and while that’s quite an exciting thing to do, it’s been a huge undertaking. The underlying premise is that understanding the Web is too big a job for computer scientists alone, and the book looks at where understanding the technical infrastructure behind Web 2.0 intersects a range of other subject areas such as business studies, economics, information science, law, media studies, psychology, social informatics and sociology.

This was not my idea. It was first put forward by Tim Berners-Lee and Nigel Shadbolt in an article for Scientific American in 2008. Since then Web Science, a new, interdisciplinary research area, has emerged. However, using this as a template for a textbook has been hard work: as well as linking to aspects of many different subject areas I’ve had to write the book so that non-engineers can not only understand it, but also find it interesting. So I’ve included some of the history of the Web, both for colour and context, and on the basis that a picture paints a thousand words I’ve developed and refined my ‘iceberg’ model of Web 2.0 (read the original description of the iceberg model in a 2007 JISC TSW report).

Finally, of course, there’s a section on the future (the beyond bit) – or rather, potential futures. By the time the reader gets to this part of the book they should have learned enough to be able to form their own ideas about Web 2.0 and to have an informed opinion on what might come next.

So, a huge undertaking. I’m still a bit dazed – can’t quite get used to the idea that when I get up I have a choice of what to do – but I have it on the highest authority that there is life beyond Web 2.0. All I can say is that there’d better be some pretty good lunches.

Microsoft opens up for the next ten years

January 5, 2010

For many Microsoft watchers 2009 was the year of Windows 7 – the latest version of their market-dominating operating system – and Bing, the company’s latest salvo in its continuing battle with Google over Web search. But in the background, and little noticed by the media, the company has also been rethinking how it makes software, and these changes are likely to have a longer-term impact on the computing world over the next decade.

Bill Gates has moved on. Ray Ozzie is the new Chief Software Architect, and he has brought new ideas about cloud computing, software as a service and, whisper it quietly, open source software. It is the latter that has caused most surprise, and in the process it has split the free and open source development community. A number of moves made by the company in the last six months have left some in that community talking about a ‘sea change’ and others accusing it of simply using small forays into open source as another form of PR. With this in mind, Oxford University’s OSS Watch service commissioned me to research and write an in-depth article, which has just been published on their website. If you’ve got half an hour to spare before things get too hectic again, why not have a look.

Low Carbon Computing

November 24, 2009

What will computers be like in 2020, or even 2050? Given the rapid pace of innovation, predicting the future of technology is notoriously difficult, but one thing we can be sure of is that it will use less energy. Thanks to rising concern about climate change, there has been an astonishing level of interest over the last year or so in ways to develop computers, displays, printers, data centres and other technology that uses less energy. And the pace of innovation will only increase.

I’ve been busy in the last few months as a co-author on a major new report for JISC. The “Low Carbon Computing” report, published today, looks at how ICT can be made more energy efficient. The report takes as its premise the UK’s Climate Change Act and maps out a future for computing framed by the CCA’s targets, processes and frameworks. By 2020 the public sector will be expected to have reduced its carbon emissions to 30% below 1990 levels. It is a ‘big ask’, and ICT will have a major role to play.

How to achieve these kinds of cuts? The sociologist Anthony Giddens is quoted in the report as saying we have to “season policy with a dash of utopian thinking”. In this spirit the report covers a very wide range of emerging ideas and technologies, ranging from simple behaviour changes (switch off your PC when you’re not using it, for god’s sake!) to radical suggestions such as switching data centre equipment to run on DC power alone (which is more efficient when running from renewable sources). Along the way the report takes in a wide variety of interesting new ideas such as thermal energy harvesting, hydrogen fuel cells and nano data centres. We’ve deliberately looked at a long time period, and the report presents a first attempt at a Low Carbon ICT roadmap up to 2020.

The full report’s a bit of a beast at nigh on 80 pages, but there is a 5-page executive summary for the lightweights among you.

Bacon for lunch

November 9, 2009

Last week I interviewed Jono Bacon about his new O’Reilly book, The Art of Community. Jono is the open source community manager for Ubuntu, a popular version of GNU/Linux, and as such he has a wealth of experience in setting up and running a virtual software development community. His key argument is that community is essential to the development and sustainability of open source software projects, and that to achieve it you need to foster a sense of belonging. His book outlines the practical realities of doing just that. You can read the interview on the OSS Watch website.