Feasibility study into the potential for covering live events utilising multimodal user generated content (UGC)
There has been a massive increase in the use of UGC to cover breaking news stories in recent years. From the video footage of Concorde’s crash near Paris and the first photos (uploaded to Twitter) of US Airways Flight 1549’s crash landing in the Hudson River, to the early coverage of the recent earthquake in Chile, UGC is rapidly becoming the default source for breaking news. This project looks at how UGC, and in particular content from mobile devices, can be used to provide coverage of a live event. It is expected to use live audio/video streaming and pictures from mobile devices, linked by social media platforms, to “broadcast” live and to provide archive material.
There it is: the title and description of my Masters-level dissertation. I picked it because I found several aspects of it really interesting. It is clear to me that the Internet is the perfect tool for sharing instantaneous information, news, events and even entertainment; with the explosion in popularity of smartphones and other mobile devices, this information and content is accessible wherever you are, at any time. What is even more important is that UGC can be created and shared with similar ease. As the examples in the description above show, this has already led to several groundbreaking cases of UGC providing information and content that was previously unobtainable, or at least shared at a rate and scale previously unimaginable, thanks to the Internet and social media. What is the overall potential, and what are the implications, of UGC moving even further to the forefront of live event coverage?
Up-to-date blog posts related to the project are available at http://alexmcc.wordpress.com
This topic fascinates me; I would love to read the final thing. I may have mentioned this piece I wrote about UGC: http://www.edinteractive.co.uk/af_creative_tech/media_cultural/