Posts Tagged ‘JISC’

Web 2.0 report: ten years on

January 23, 2018

My Web 2.0 report, written for JISC way back in 2007, is fast approaching its tenth anniversary.

I had a quick look at Google Scholar to see how the citations were doing and was pleased to see that the report has now passed the 2,000 mark. Surprisingly, it received 90 citations last year alone, which suggests that, despite its age and how much the technology has changed, some of the core concepts are still relevant.

Reading through it again, the change that stands out most is that Facebook is only mentioned a couple of times. One comment, though, perhaps pointed to the future:

As one lecturer recently found out, it is easier to join with the herd and discuss this week’s coursework online within FaceBook (a popular social networking site) than to try and get the students to move across to the institutional VLE.

The other huge difference is that the term ‘Web 2.0’ is rarely used these days; everyone talks about social media instead.



August 7, 2012

How many people might get to see a particular university lecture? The biggest academic halls hold a few hundred students. If the talk is repeated more or less every year, this might tot up to a handful of thousands over a lecturer’s entire career. What, then, about reaching a million?

The flattening of the globe so powerfully explored by Thomas Friedman in his book The World is Flat is coming to the fusty old world of the university lecture. Leading institutions are scrabbling to get on board the latest educational technology vehicle: massive open online courses, or MOOCs.

Using the power of Internet video technology, these services offer university-level lectures to anyone with a computer and a broadband connection, anywhere in the world. The leading proponent is Coursera, a US for-profit social enterprise which provides free online courses in a range of subjects, from the human genome to algorithm design.

The service launched in April with a small but blue-chip selection of US universities, but the University of Edinburgh announced last week that it is joining the scheme, offering courses including an introduction to Astrobiology and Extra-terrestrial Life.

It is part of a wider move to what are being called open educational practices, offering as Edinburgh’s Jeff Haywood describes it: “ways to flex and bend the constraints that much of our traditional HE formats impose on us, and on our learners.” Or as Tim Berners-Lee put it at the Olympics opening ceremony – “This is for everyone.”

Curating Scholar

November 30, 2011

Thanks to a heads up from Brian Kelly, I’ve been having a look at the latest improvements to Google Scholar, a search engine for academic papers that served me well whilst writing the Web 2.0 book. The thing that caught my eye was that the site now allows authors to curate a collection of their papers and calculate the number of citations each one has had.

The citation figures for my authorial output follow the classic ‘long tail’ distribution: one or two papers receive a large or moderately large number of citations and the rest each receive a handful. I was pleased (and a little surprised) to see that the Web 2.0 report I wrote for JISC back in 2006 has received almost 600 citations in the intervening years. I knew that the report had consistently been the most downloaded document on the website (over 100,000 downloads in the first three years), but I’d assumed that a lot of this traffic was due to students preparing coursework, particularly as the stats rose during term time. However, it seems researchers have also picked up on some of the ideas, which is rather reassuring, as when I was writing the book I had to fight my corner to get a detailed look at the state of the art in research included. Let’s hope this bodes well for sales.
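The ‘long tail’ shape described above can be made concrete with a quick sketch. The citation counts here are hypothetical (chosen only to echo the roughly 600 citations mentioned for the top paper); the point is just that the head of the distribution dwarfs the tail:

```python
# Hypothetical citation counts for a single author's papers, illustrating a
# 'long tail' distribution: one or two heavily cited papers, the rest each
# receiving a handful of citations.
citations = [580, 112, 9, 7, 5, 4, 3, 2, 2, 1]

head = sum(citations[:2])   # the two most-cited papers
tail = sum(citations[2:])   # everything else
total = sum(citations)

print(f"head share: {head / total:.0%}")  # → head share: 95%
print(f"tail share: {tail / total:.0%}")  # → tail share: 5%
```

Even with ten papers, the top two account for the overwhelming majority of all citations, which is exactly the pattern the post describes.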

Beyond Web 2.0

November 7, 2011

It has been an awfully long time since my last blog posting.

For those who don’t follow me on Twitter: I’ve been writing a book. It’s called Web 2.0 and Beyond: Principles and Technologies and it’s going to be published in May by CRC Press, the computer science imprint of Taylor & Francis.

I should say that it’s not your usual comp. sci. textbook. My brief was to ‘reinvent the textbook format’ and, while that’s quite an exciting thing to do, it’s been a huge undertaking. The underlying premise is that understanding the Web is too big a job for computer scientists alone, and the book looks at how the technical infrastructure behind Web 2.0 intersects with a range of other subject areas such as business studies, economics, information science, law, media studies, psychology, social informatics and sociology.

This was not my idea. It was first put forward by Tim Berners-Lee and Nigel Shadbolt in an article for Scientific American in 2008, and since then Web Science, a new, interdisciplinary research area, has emerged. However, using this as a template for a textbook has been hard work: as well as linking to aspects of many different subject areas, I’ve had to write the book so that non-engineers can not only understand it but also find it interesting. So I’ve included some of the history of the Web, both for colour and context, and, on the basis that a picture paints a thousand words, I’ve developed and refined my ‘iceberg’ model of Web 2.0 (read the original description of the iceberg model in a 2007 JISC TSW report).

Finally, of course, there’s a section on the future (the beyond bit) – or rather, potential futures. By the time the reader gets to this part of the book they should have learned enough to be able to form their own ideas about Web 2.0 and to have an informed opinion on what might come next.

So, a huge undertaking. I’m still a bit dazed – can’t quite get used to the idea that when I get up I have a choice of what to do – but I have it on the highest authority that there is life beyond Web 2.0. All I can say is that there’d better be some pretty good lunches.

Display technologies

June 17, 2011

Back in 2005, I wrote a fairly long report for JISC on the future of display technologies, covering the likes of 3-D TV and holographic imaging. Two of the peer reviewers, Mark Fihn and Wayne Cranton, were particularly helpful.

It appears they’ve not been idle in the intervening years, as they’ve just announced the publication of a new book, the Handbook of Visual Display Technology. Weighing in at a mighty 2,000 pages across two volumes, it is not for the faint-hearted, but it looks to be a substantial summary of this important area of electronics.

Data mash-ups and the future of mapping

September 7, 2010

I’m pulling my head out of the latest research on social networking (for my textbook!) to pass on the latest from TechWatch. There’s a new report out called Data mash-ups and the future of mapping, and it’s quite exciting. If you’ve ever worried about data being left on trains or unencrypted disks going missing in the post, this will really make your eyes water.

Pioneering a low carbon future with Enterprise Architecture

August 8, 2009

It’s been a busy August so far, putting the finishing touches to a report on Enterprise Architecture (EA) which has just been published by JISC TechWatch.

EA is a strategic management technique which aims to align business strategies and goals with information systems. The process involves mapping out the current situation within an organisation, what’s termed the ‘as is’, and then laying out a vision for the future, the ‘to be’.
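The ‘as is’ / ‘to be’ mapping can be sketched very simply. The capabilities and systems below are entirely hypothetical (the shared data centre example echoes the low-carbon discussion later in the post); the gap is just the set of capabilities where the current state differs from the target:

```python
# A minimal sketch of EA gap analysis: map the current estate ('as is'),
# state the target ('to be'), and derive the gap between the two views.
# All capability and system names here are hypothetical.
as_is = {
    "student records": "legacy VLE",
    "email": "on-site servers",
    "data centre": "per-institution machine rooms",
}
to_be = {
    "student records": "shared service",
    "email": "on-site servers",              # no change planned
    "data centre": "shared, low-carbon facility",
}

# The gap: every capability whose current state differs from its target.
gap = {c: (as_is[c], to_be[c]) for c in as_is if as_is[c] != to_be[c]}

for capability, (current, target) in gap.items():
    print(f"{capability}: {current!r} -> {target!r}")
```

Real EA methods such as TOGAF elaborate this idea across business, data, application and technology layers, but the underlying comparison of current state against target state is the same.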

It has been in use in the commercial world for a decade or so, although it is new to the education sector. The report synthesises the results of a year-long pilot project by a group of pioneers who looked into the day-to-day practicalities of introducing the technique into higher education institutions. In particular, they looked at the use of The Open Group’s TOGAF method for this kind of work.

The report comes to a number of conclusions, but I think the most interesting relates to the potential for the technique to help the sector move to a low-carbon future. As the report makes clear, low-carbon ICT is an area of activity that is well suited to the EA approach, as it needs long-term planning within, and possibly between, institutions. Work is already underway in the sector on the feasibility of shared data centres, and the introduction of EA can only help these initiatives.

The report’s called Unleashing Enterprise Architecture and you can have a look at a PDF of the report on the JISC TechWatch website.

Using Twitter

April 15, 2009

Regular readers will know that I have been a little sceptical about Twitter, saying, for example, that it needs some kind of killer app in order to really take off. Recently, though, I have begun to mellow, as I have been making serious use of my Twitter account. There is, as you might expect, a vibrant community of users across the JISC universe that I inhabit. By following tweets I’ve been able to keep up with how enterprise architecture (EA) is shaping up across the higher education sector – particularly useful at the moment, as I am in the process of preparing a synthesis report on EA.

Twitter started out as a quick way of saying ‘What I’m doing at the moment is…’. It’s still used a lot like this, but what I’ve noticed is that it is also increasingly taking the same form as the earliest blog postings: very short statements along the lines of ‘Oh, have you seen this interesting thing’, followed by a link. Perhaps we are coming full circle? How long before Twitter expands to allow more than 140 characters?

The Future of Libraries

April 11, 2008

The Web is having a profound impact on the role and function of libraries. This goes way beyond ‘the demise of the book’, which is, quite frankly, a very simplistic way of looking at things. It’s actually more about having a vision for the future and how you realise that vision. For example, one of the problems facing librarians is how to create high quality ‘digital objects’, as they are called. If you think about a book, you might judge its quality in terms of the jacket design or the type of paper used or whether or not you can see guillotine marks on the edge of the pages. You probably wouldn’t think about some of the very obvious quality factors unless they were missing. If you opened a book and, say, the pictures were missing or all the pages were in the wrong order, you’d probably want your money back.

The problem for librarians is that when you are creating things like e-books, you have to think about a different set of ‘quality’ criteria, because these digital objects will not be used in the same way that physical books are. They will need to be designed so that they can be searched, for example, or delivered as separate pages. For the average library user, accessing information that spans multiple digital sources is increasingly a messy process, and for those who are used to search tools such as Google and Yahoo this new and highly fluid environment can be a considerable barrier to accessing information from digital libraries and online collections. What is concerning is that, unless we are careful, people will increasingly see the search results thrown up by Google, Yahoo etc. as the be-all and end-all of a particular area of interest or subject. There is no doubt that the library and information community recognises this problem.

One of the ways of helping to ease these problems is covered in a technical report just published by JISC Technology and Standards Watch, for which I am the technical editor. The report is by Richard Gartner, the man who brought the Internet into Oxford University’s Bodleian Library, who argues that rectifying this problem requires the acceptance of the importance (and standardization) of what’s called metadata.

Metadata is information about the information contained within a digital object. It can be as simple as a tag saying who the author is, or as complex as a layer of additional information about digital rights (who is allowed to access the object, or how much you might have to pay to do so). There are different ways of approaching this problem – the more sober Digital Library is being usurped a little at the moment by the ‘hipper’ Library 2.0 – but it’s a hot topic, and even though it’s a technical subject, the report should be quite readable for a tech-curious audience.
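To make the idea concrete, here is a hypothetical metadata record for a digital object, in the spirit of the description above: a simple descriptive tag (the author) sitting alongside a more complex rights layer. The field names are purely illustrative and not drawn from any particular metadata standard:

```python
# A hypothetical metadata record for a digitised document. The simple
# descriptive fields say what the object is and who made it; the nested
# 'rights' layer carries the more complex access and payment information.
metadata = {
    "title": "Minutes of the 1911 Senate meeting",
    "creator": "University Archives",        # the simple 'who made it' tag
    "format": "application/pdf",
    "rights": {                              # the more complex rights layer
        "access": "members of the institution only",
        "fee_gbp": 0.0,
    },
}

print(metadata["creator"])
print(metadata["rights"]["access"])
```

Standardization matters precisely because every field here invites a local convention: unless libraries agree on the names, values and structure of such records, searching across collections becomes the messy process described above.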

This is part of an ongoing debate about the future of libraries and will be one of the key themes of JISC’s annual conference in Birmingham next week, which I’ll be attending for TechWatch.