ProBlogger lists 8 reasons why new media is growing
Darren Rowse, a professional blogger, has made a list of 8 reasons why new media is growing.
(Photo from Darren Rowse's blog.)
First he mentions participation as an important factor. The old one-way communication style, where news stories are simply presented to the public, has become more or less obsolete now that effective feedback and publishing channels are available. Internet-based information channels like blogs, podcasts and online news pages are growing. According to an annual study of the Swedish people's media habits done by Nordicom (Mediebarometern 2005), in 2005 Swedish people spent more time on the Internet than reading the daily newspaper. The new media is far more participative, giving the reader the ability to comment on and discuss the news. Newspapers are also involving readers in creating the material. Several Swedish newspapers have started blog portals, where highlights from the blogs are presented by an editor. For example, on Aftonbladet's Läsarbladet the readers choose debate topics, and currently popular blog subjects are presented like regular news stories. Aftonbladet also invites people to provide their own photographs and tips about news stories. Any material that gets published earns revenue for its provider.
As Rowse put it: "People want to interact with news and with the rise of technology they want an opportunity to even take part in it’s creation and reporting."
He also mentions suspicion of institution as one possible reason why people tend to turn their backs on more traditional sources of information. In many countries, including Sweden, the traditional media is controlled by a small number of players, which creates suspicion and fertile ground for conspiracy theories. Playfulness is another factor: there is more room for humour and irony in forums like blogs and podcasts than in the traditional one-way media. The more informal language may also be one reason that the new media seems to be a breeding ground for social networks, something Rowse describes as relationality. By participating in discussion forums and commenting on blogs and news pages, people are more involved and act more or less as they do in real-life social situations, with the distinction that it is a lot easier to find people who share the same interests on the Internet, where geographical distance is no longer an issue.
Holism, the integration of our lifestyle with a little help from new Web 2.0 tools, is another aspect, although I believe it is a bit far-fetched. He also thinks that people are more and more willing to hear different opinions and arguments, which he calls juxtaposition. Perhaps the world is not so black and white anymore. DIY (Do It Yourself) is a strong force in several areas today, and Rowse argues that it is a growing phenomenon. With the Internet you can do almost anything you want with only your computer, your Internet connection and the time you are willing to invest. Finally, immediacy is something we expect in our modern lives. It has become more and more common that news stories are first told on blogs and discussion forums.
Web links
Aftonbladet: Läsarbladet. (May 18, 2006). http://www.aftonbladet.se/ettor/webb/2927_normal.html
Nordicom: Aktuellt resultat från Mediebarometern 2005. (May 9, 2006). http://www.nordicom.gu.se/?portal=mt&main=press_mediebarometer_2006.php&menu=menu_sve&me=8
ProBlogger: About Darren. (January 6, 2005). http://www.problogger.net/archives/2005/01/06/about-darren/
ProBlogger: 8 Reasons Why New Media is Growing. (May 10, 2006). http://www.problogger.net/archives/2006/05/10/why-new-media-is-growing/
Tags: blogging, blogs, new media, Darren Rowse, user-created, participation, suspicion of institution, playfulness, relationality, holism, juxtaposition, DIY, immediacy
Using the Global Brain
In the article CUSTOMER-MADE, Trendwatching.com looks at the increasingly popular trend of involving customers in the process of creating products. According to the article, the motivation for people to put their effort into this involvement is:
- Status: people love to be seen, love to show off their creative skills and thinking.
- Bespoke lifestyle: something consumers have been personally involved in should guarantee goods, services and experiences that are tailored to their needs.
- Cold hard cash: getting a well deserved reward or even a profit cut for helping a company develop The Next Big Thing is irresistible.
- Employment: in an almost ironic twist, CUSTOMER-MADE is turning out to be a great vehicle for finding employment, as it helps companies recruit their next in-house designer, guerrilla advertising agency or brilliant strategist.
- Fun and involvement: there's pleasure and satisfaction to be derived from making and creating, especially if co-creating with brands one loves, likes or at least feels empathy for?
Not surprisingly, it sounds pretty much the same as the reasons why people contribute to the Open Source Software community or why they blog. Even though the "cold hard cash" may be hard to gain, it is still there as a distant possibility. People have a lot of good ideas, and that is what the "global brain" is. In product development, customer involvement often takes place through competitions, which bring the company a lot of ideas while it pays for only a fraction of them.
The authors point out that "it's often the brands that already have a strong competence in design or product development", making those brands even stronger.
A Nokia wrist-band mobile phone, which was the winning concept in a design competition. Photo from Trendwatching.com.
A good example of how to motivate people comes from the Austrian flavoured-milk manufacturer Frenkenberger. In a competition to find a new flavour, they are offering the winner a royalty of 1 eurocent for each bottle sold of the new taste.
Why is this happening now? At Trendwatching they believe that people are now more demanding, but also more creative. They refer to Generation C as part of the explanation. I think another explanation lies in the increased possibilities of communication that come with the Internet. You can reach millions of people at almost no cost, and even if only a fraction of them give you some kind of feedback (the 1:100 rule?), you are still going to get a lot of responses.
Web Links
Trendwatching.com: CUSTOMER-MADE. (May 18, 2006). http://www.trendwatching.com/trends/customer-made.htm
Trendwatching.com: Generation C. (May 18, 2006). http://www.trendwatching.com/trends/GENERATION_C.htm
Blog comments analysed
Mishne & Glance (2006) have researched how blog comments relate to the importance of a blog and what kind of information the comments hold. They automatically collected data about posts and comments from 36,044 blogs over a period of 20 days.
From a random sample of 500 of these blogs, 80% allowed comments but only 28% had any. They estimate the comments to constitute between 10% and 20% of the total text content of the blogosphere. The number of comments follows a typical power-law distribution: most blogs have few, if any, comments per post, while a few very influential blogs have more than 100 comments per post.
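To get a feel for what such a heavy-tailed distribution looks like, here is a small sketch in Python. The numbers are synthetic, not the authors' data, and the Pareto shape parameter is an arbitrary choice for illustration.

```python
import random

# Illustrative only: draw comment counts from a Pareto distribution,
# which produces many small values and a few very large ones, the
# characteristic shape of a power law.
random.seed(42)
comments_per_blog = [int(random.paretovariate(1.1)) for _ in range(10_000)]

comments_per_blog.sort()
median = comments_per_blog[len(comments_per_blog) // 2]
top = comments_per_blog[-1]

print("median comments:", median)  # typical blog: very few comments
print("max comments:   ", top)     # the rare influential blog: very many
```

The median stays tiny while the maximum runs into the thousands, mirroring the pattern Mishne & Glance describe.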
Comparing the number of comments on a blog to other popularity metrics, they concluded that the amount of comments is an indication of a blog's popularity and influence level. When a blog has more comments than expected, it is often a personal blog where the writer's friends use it more or less as a chat forum, or a blog that attracts people with lower technological skills, such as a fashion or celebrity blog. To me, it seems possible that this can also be related to the gender and social skills of the readers: the "non-tech" blogs attract more female readers, who according to recent reports (I need a reference to that British report I read recently) are more social than males on the Internet. Conversely, fewer comments than expected is often the result of some kind of comment moderation on popular blogs, due to massive spamming.
Not surprisingly, "The posts that are most insightful or controversial get the most comments."
References
Mishne, G., Glance, N. (2006). Leave a Reply: An Analysis of Weblog Comments. WWW2006, May 22-26, 2006, Edinburgh, UK.
SvD's review of Swedish online encyclopedias
On March 30, 2006, the Swedish newspaper Svenska Dagbladet (SvD) published a comparative review of three online encyclopedias: Nationalencyklopedin, the Swedish-language version of Wikipedia, and Susning.nu. SvD used nine experts to review and rank nine different articles within their fields of expertise.
Nationalencyklopedin (NE) is the largest regular Swedish encyclopedia. Its articles are written by field experts and reviewed by NE's editorial staff; the online version reviewed is a pay service. Wikipedia is a wiki-based online encyclopedia available in 200 different languages and maintained by its users. Susning is also wiki-based, but no longer open to the public to edit, due to vandalism; since April 2004 it has been maintained by a number of already registered enthusiasts.
Nine article subjects were chosen for the study, among them Stella McCartney, Scientology, Slobodan Milosevic, Bear Quartet, reality shows, Muhammedteckningarna (the Mohammed cartoons) and Homeros (Homer). The articles were ranked by the expert on each subject from 1 to 3, where 1 was the best.
The results can be seen in the table below, where the numbers show how many times each encyclopedia received the ranking of that column:
NE is generally ranked highest, due to its higher degree of objectivity, but it lacks information on recent events and pop culture. Wikipedia holds a high standard and is surprisingly well updated, but does not always cover all important details and also contains some erroneous information. Susning is hardly comparable to the others; it lacks objectivity and does not cover recent events as well as Wikipedia.
Another aspect of the test was to inject erroneous information into Wikipedia to analyse how quickly it would be corrected by its users. SvD added entries about two fictitious authors and also put false information about their own newspaper in the article on Svenska Dagbladet. Both authors were deleted within three days, but the false information about the newspaper was still unmodified after two weeks.
Discussion
What does this kind of research show us about collaborative material on the Internet? First of all, there has been a lot of criticism of the second part of the evaluation. Altering information in the way the reporters did is seen by many as a form of sabotage. Stefan Rimm asks on his blog (translated from Swedish): "In what other area is sabotage accepted as a journalistic method?", arguing that it would not be seen as good journalism to hide library books, or to pull the emergency brake on a train just to see if it stops. His point is that this reflects many people's view of Wikipedia and similar online sources: they are places where you can do that kind of thing simply because it is possible, forgetting about the people who actually use these information sources in their daily lives. But although it is unethical to perform this kind of information-disturbance experiment, we can never prevent it from being done. Others have done it before, like Alexander Halavais and his Isuzu experiment, and others will do it in the future. It lies in the wiki concept itself that it can be done and it will be done, but it will also, sooner or later, be undone.
On another blog, Suburbia, the writer criticises the fact that the reporters are comparing not only the content of the encyclopedias but also the different technical platforms, pointing out that it is a bigger problem that errors cannot be immediately fixed in NE than that anyone can alter Wikipedia. Regardless of the source, you should always read with a critical eye.
Web links
Alex Halavais: The Isuzu Experiment. (August 29, 2004). http://alex.halavais.net/news/index.php?p=794
Loci.se (Stefan Rimm's blog): SvD, Wikipedia och sabotage som journalistisk metod. (March 30, 2006). http://loci.se/2006/03/30/svd-wikipedia-och-sabotage-som-journalistisk-metod/
Nationalencyklopedin. (May 8, 2006). http://www.ne.se/
Suburbia: SvD och Wikipedia. (March 31, 2006).
Susning.nu. (May 8, 2006). http://susning.nu/
Svenska Dagbladet: Gratis nätlexikon får bra betyg. (March 30, 2006). http://www.svd.se/dynamiskt/kultur/did_12245366.asp
Svenska Dagbladet: Experterna föredrar NE. (March 30, 2006). http://www.svd.se/dynamiskt/kultur/did_12245317.asp
Svenska Wikipedia. (May 8, 2006). http://sv.wikipedia.org/wiki/Huvudsida
The Neutral Point of View in Practice
A neutral point of view (NPOV) is a fundamental policy of Wikipedia, meant to maintain the objectivity of the articles. From Wikipedia: Neutral point of view:
"All Wikipedia articles must be written from a neutral point of view, representing views fairly and without bias. This includes reader-facing templates, categories and portals."
When anyone on the Internet can edit an article instantaneously, it is of course impossible to guarantee this. Nevertheless, the Wikipedia material holds a high standard, as several examinations have shown (see for example Nature's comparison between Wikipedia and Britannica). How can this be?
In theory, Wikipedia is self-regulating, based on the fact that it is both easier and faster to revert to an old version of an article than to vandalise it (see for example IBM's History Flow results). Information and tools like recent changes, related changes, page histories and user contribution lists are used by the Wikipedia community to maintain it (Wales, 2004). For most of the Wikipedia content this is enough, but some controversial subjects need other regulating mechanisms.
One recent example is the Wikipedia page on Wal-Mart, the world's largest retail company. According to Demsyn (2006), the wiki page has been edited by Wal-Mart lobbyists trying to hide criticism of the company. Among the comments to the article, some commenters argue that Demsyn's criticism is unfair while others agree with him, and it is still debated whether Wal-Mart officials really were involved in the editing. What the discussion shows, though, is that a neutral point of view is very far from easy to maintain, if not impossible. To prevent vandalism and to inform readers about the current status, the Wal-Mart page is currently flagged as possibly violating the neutral point of view, and editing is restricted to registered users (see picture below).
Another well-known controversy is that of US government officials editing several senator biographies as well as other US government-related articles (BBC News, February 9, 2006). As a result of "inappropriate contributions", the Congressional computer network's IP numbers were banned from editing for some short periods of time. Banning users and IP numbers is another way for the Wikipedia community to handle vandalism from known sources. Although one can argue that it is unethical to edit pages where one holds a vested interest, it can also be beneficial in terms of information availability.
References
Demsyn, R. (2006). Wal-Mart's Wikipedia War. Whitedust Security. http://www.whitedust.net/article/55/Wal-marts_Wikipedia_War/
Giles, J. (2005). Special report: Internet encyclopaedias go head to head. Nature 438, 900-901 (15 December 2005).
Wales, J. (2004). Wikipedia Sociographics. Talk at the 21C3 conference, December 27, 2004, Berlin. http://www.questia.com/PM.qst?a=o&se=gglsc&d=5008697163
Web links
BBC News: Congress 'made Wikipedia changes'. (February 9, 2006). http://news.bbc.co.uk/2/hi/technology/4695376.stm
IBM Collaborative User Experience Research Group: History Flow: results. (2002). http://researchweb.watson.ibm.com/history/results.htm
Wikipedia: Wal-Mart. (May 5, 2006). http://en.wikipedia.org/wiki/Walmart
Wikipedia: Neutral point of view. (May 5, 2006). http://en.wikipedia.org/wiki/Wikipedia:Neutral_point_of_view
Open Source Software Quality - Coverity's Scans
The use of open source software (OSS) is increasing, and so are worries about code quality. Commissioned by the U.S. Department of Homeland Security, Coverity is now performing nightly scans of the most commonly used OSS packages. Currently, 50 projects are automatically analyzed by the Coverity software, and the detailed reports are made available to the open source developers within their projects.
Code quality concerns
As the U.S. government is funding the source code scanning project (C|NET, January 10, 2006), there is obviously a real concern. Without all the OSS around, the Internet as we know it today would not exist; a large portion of the world's web servers run the popular LAMP (Linux, Apache, MySQL, Perl/PHP/Python) setup. In most (if not all) cases, there is no party taking legal responsibility for an OSS package, and thus you have very limited possibilities to sue anyone when things go wrong. By using OSS you take a calculated risk. If you can measure a software package's code quality, you can at least make a qualified assumption about how big that risk is.
Bugs are not always shallow
In his classic work The Cathedral and the Bazaar (1999), Eric S. Raymond stated what he calls Linus's Law (after Linus Torvalds, the creator of Linux): "given enough eyeballs, all bugs are shallow". In the real world, though, practically all software has bugs, only more or less. Viega (2000) shows that the availability of the source code "can also lull people into a false sense of security", because "eyes that look do not always see". He argues that most people reviewing a project's source code will only see a small fraction of the code. It is also likely to be the same fraction of the code, because some parts are naturally more useful to modify than others.
How it works
The Coverity software uses a concept called metacompilation to find bugs using user-specified rules. A good rule finds as many bugs as possible without producing many false positives (Bradley et al., 2004). It performs a static analysis on the program package's source code, which means that the actual program does not need to be executed. Hallem et al. (2003) can be used as a resource for more information on how this is done technically.
Results
The initial scans were performed on March 6, 2006. Within the chosen 32 OSS projects, an average rate of 0.434 bugs per 1,000 lines of code was found, ranging from 0.051 (XMMS) to 1.237 (Amanda) (ZDNet, April 4, 2006). Spokesmen for Coverity reported that the OSS community was fast in producing fixes for the discovered bugs: in two weeks, more than 900 bugs were fixed, which made some packages entirely bug free.
The scanning project has since grown to 50 OSS projects, all of which are scanned nightly. As of today (April 23, 2006), over 2,900 bugs have been fixed in the scanned projects, which means an average of over 60 fixed bugs per day. This is of course an improvement for the selected software projects.
Criticism & Discussion
Just because a certain piece of software contains zero bugs according to Coverity's measurement, that does not mean it is entirely bug free. As Viega (2000) reported, it is easy to feel safe under false premises. As ZDNet (March 6, 2006) put it:
"Coverity's analysis looked for 40 of the most critical security vulnerabilities and coding mistakes in software code. The company did not give details on the scope of the flaws it found."
There are of course other programming errors than the 40 selected kinds. Those flaws are invisible to the system, so this kind of automatic scanning can only be an indication of code quality. It is also important to remember that other forms of errors exist at higher levels of abstraction. As the authors of the Coverity software state (Hallem et al., 2003):
"It is impossible to completely automate tests for such software features as user interaction and business logic."
It is also easy to be critical of the fact that Coverity does not release its results to the public; instead, you have to register as a developer for a certain project to obtain the scanning data. In a standardized email sent to several project mailing lists (for example the email sent to GDB developers), Coverity defends the decision:
"(1) We think that you, as developers of gdb, should have the chance to look at the defects we find to patch them before random other folks get to see what we found and
(2) From a support perspective, we want to make sure that we have the appropriate time to engage with those who want to use the results to fix the code."
Whatever the objective is, it clearly differs from the general OSS idea that the information should be public. In the C|NET (January 10, 2006) news story, Ben Laurie, involved in both the Apache web server project and OpenSSL, comments:
"It is regrettable that DHS [the Department of Homeland Security, the funder of the project] has decided once more to ensure that private enterprise profits from the funding, while the open-source developers are left to beg for the scraps from the table."
"Why does the DHS think it is worthwhile to pay for bugs to be found, but has made no provision to pay for them to be fixed?"
He means that it would be better if the government provided the OSS developers with the bug-searching tools instead of paying for the scan service. Regardless of the financing situation, though, the tool seems to do its work.
* When I was working for an information security company, a similar rule-based technique was used to detect attacks, by matching rules for known exploits against the customers' data traffic. As these rules are constantly being improved, the approach performs quite well.
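The rule idea behind both Coverity's scans and the intrusion detection mentioned above can be illustrated with a toy example. This is only a sketch: real static analysis like Coverity's parses and analyses the program properly, whereas the pattern rules below are invented for illustration and simply match text.

```python
import re

# Each rule is a pattern plus a message. Matching a rule against source
# text yields a finding; a good rule catches real defects without many
# false positives.
RULES = [
    (re.compile(r"\bgets\s*\("), "use of gets(): no bounds checking"),
    (re.compile(r"\bstrcpy\s*\("), "strcpy(): possible buffer overflow"),
]

def scan(source: str) -> list[tuple[int, str]]:
    """Return (line number, message) for every rule match in the source."""
    findings = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        for pattern, message in RULES:
            if pattern.search(line):
                findings.append((lineno, message))
    return findings

c_code = 'int main(void) {\n    char buf[8];\n    gets(buf);\n    return 0;\n}\n'
for lineno, message in scan(c_code):
    print(f"line {lineno}: {message}")  # flags the gets() call on line 3
```

Like Coverity's checks, the quality of the output stands or falls with the quality of the rules; anything outside the rule set stays invisible.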
References
Bradley, A. R., Sipma, H. B., Solter, S., Manna, Z. (2004). Integrating tools for practical software analysis. Proceedings of the 2004 CUE Workshop.
Hallem, S., Park, D., Engler, D. (2003). Uprooting Software Defects at the Source. Queue 1(8):64-71.
Raymond, E. S. (1999). The Cathedral and the Bazaar. (Also available in printed form: O'Reilly. ISBN 1565927249, October 1999.)
Viega, J. (2000). The Myth of Open Source Security. Available online at: http://www.developer.com/tech/article.php/626641
Web Resources
C|NET News.com: Homeland Security helps secure open-source code. (January 10, 2006). http://news.com.com/Homeland+Security+helps+secure+open-source+code/2100-1002_3-6025579.html
scan.coverity.com: Accelerating Open Source Software Quality. (Accessed April 23, 2006). http://scan.coverity.com/
Mailing list archive of firstname.lastname@example.org. Ben Chelf: Coverity Open Source Defect Scan of gdb. (April 5, 2006). http://sourceware.org/ml/gdb/2006-04/msg00045.html
The U.S. Department of Homeland Security. (Accessed April 23, 2006). http://www.dhs.gov/
Wikipedia: Linus's Law. (April 21, 2006). http://en.wikipedia.org/wiki/Linus%27s_law
Wikipedia: Linus Torvalds. (April 23, 2006). http://en.wikipedia.org/wiki/Linus_Torvalds
ZDNet: Developers fast to fix open-source bugs. (April 4, 2006). http://news.zdnet.com/2100-1009_22-6057669.html
ZDNet: LAMP lights the way in open-source security. (March 6, 2006). http://news.zdnet.com/2100-1009_22-6046475.html
Short background history on Wikipedia
As Wikipedia is one of my research objects, I think a short background might be a good idea. Since most of the information below comes from Wikipedia itself, it should be noted that:
"Wikipedia is a wiki—a collaborative, open-source medium. Articles are never "complete and final". Just as human knowledge evolves, so does our wiki coverage of it. Wiki articles are continually edited and improved over time, and in general this results in an upward trend of quality, and a growing consensus over a fair balanced representation of information. It will tend to gain citations, new sections, and so forth. Dubious statements tend to be removed over time, but they may have a long life before they are removed."
(From Wikipedia: Researching with Wikipedia.)
Wikipedia evolved from the Nupedia project, both of them founded by Jimmy Wales. Nupedia started as a freely available, web-based encyclopedia, with the content written by experts and all articles peer reviewed in the same manner as scientific publications (Wikipedia: Nupedia). All of the material was licensed under the Nupedia Open Content License, but later changed to the GNU Free Documentation License:
"The purpose of this License is to make a manual, textbook, or other functional and useful document "free" in the sense of freedom: to assure everyone the effective freedom to copy and redistribute it, with or without modifying it, either commercially or noncommercially. Secondarily, this License preserves for the author and publisher a way to get credit for their work, while not being considered responsible for modifications made by others.
This License is a kind of "copyleft", which means that derivative works of the document must themselves be free in the same sense. It complements the GNU General Public License, which is a copyleft license designed for free software."
Wikipedia was started in 2001 as a side project to Nupedia, to make it possible to collaborate on articles before they entered Nupedia's review stage (Wikipedia: Nupedia). It was controversial among Nupedia's editors and reviewers to use a wiki, a web page that anyone can edit, as a collaboration tool (Wikipedia: History of Wikipedia). The project soon outgrew its predecessor, though. During the first year about 20,000 articles were written, and eventually the Nupedia project was abandoned and to some extent its articles were copied into Wikipedia (Wikipedia: Nupedia and Wikipedia).
Localized versions of Wikipedia have been available since May 2001. Right now (April 19, 2006) there are over 3.8 million articles written in more than 100 languages, with more than one million articles available in English alone (Wikipedia: About).
This is my new blog, which I will use as a writing tool for my Ph.D. thesis in human-computer interaction. I write about user-created information and tools, like open source software, photo sharing, discussion forums and so on.
My aim is to post at least one entry every day from now on.