In a comment on an earlier post, Dmitry Epstein asked about the quality and impact of ICT4D research. I’ve already blogged about what constitutes good quality ICT4D research, so here I will just add a few data snippets on these topics.
Quality of ICT4D Research
The good news is that there is good quality ICT4D research around:
– It makes its way into the top journals: there’s the 2007 special issue of MIS Quarterly (top info. systems journal) on IS in developing countries; and I count at least four articles on ICTs and developing countries in World Development (top dev. studies journal) during 2009.
– Inclusion in ISI’s Web of Knowledge is a rather rough quality benchmark but it’s a benchmark nonetheless and, as noted in a previous post, there are a few hundred papers per year within the boundaries of development informatics recorded on WoK.
– On the assumption that citation rates are linked to quality, some individual ICT4D items score well. The article “Information Systems in Developing Countries: Failure, Success and Local Improvisations” gets 44 citations on WoK; 217 on Google Scholar. (Modesty, be damned: if you don’t believe in the quality of your own work, you shouldn’t be writing.)
But then there’s the bad news about ICT4D research quality:
– Of the eleven specialist journals in the field only one, Information Development, is seen as worthy of inclusion in the WoK.
– My subjective but honest opinion based on reading all 250 papers submitted to the ICTD2009 conference: the general quality of the big, fat, long tail of work in our field is pretty poor.
And lastly, there’s the middling news, conveyed in the next section: citation evidence (if taken as a proxy for quality) suggests ICT4D research is neither better nor worse than other research sub-fields.
Impact of ICT4D Research
Citation evidence is only one part of the impact story but . . . I’ve already noted that some individual ICT4D papers are highly cited. A bit of background on citation: work in early 2009 looked at citation rates for papers published during 1998-2008.
In computer sciences, the average number of citations per paper was 3.06, ranging from 7.06 for papers published in 1998 to 0.10 for papers published in 2008 (papers take time to pick up citations). Equivalent figures are: economics and business – 5.02 average (from 10.19 to 0.13 for 1998 – 2008); and social sciences – 4.06 average (from 7.48 to 0.16).
Note these are narrow citation rates for WoK-type databases; Google Scholar citation rates are much higher because they record a much wider range of published material: a 1:4 ratio of WoK:Google Scholar citation numbers looks about standard. Note also that even for articles in peer-reviewed journals, somewhere between 25% and 75% of articles – varying by discipline – are never cited in other peer-reviewed articles.
Some data bits:
– Comparing the total number of WoK citations for the four articles in MISQ’s special issue on IS in developing countries (vol.31, issue 2), with those for the first four articles in the next issue, shows no difference: both sets get 14 citations.
– Taking the 15 journal articles published in 2006 that come up on WoK using a search for ‘ict*’ and ‘developing countr*’ (and excluding off-topic items), they are cited 34 times: an average of 2.27 citations each. Comparing with the 2005 figures from the work reported above (that work appeared about one year ago, hence the need to move back one year for a fair comparison), that 2.27 figure is somewhat better than the average for computer science (1.85) but a bit worse than those for social sciences (2.99) and for economics and business (3.08). [For comparison, substituting ‘agricultur*’ for ‘ict*’ and taking articles listed in the public administration subject area, the 15 on-topic articles were cited 25 times: a 1.67 average.]
– Taking the same type of journal articles over the period 2004-2007, there are 46 in all, of which 22 (48%) are uncited; the average citation rate across all items is 1.67 cites per article. [4 of the 15 agriculture articles in 2006 were uncited, compared to 3 of the 15 ICT4D articles.]
– Performance in open access journals outside WoK is lower. Both EJISDC and ITID journals had four issues in 2007. EJISDC has so far scored 11 citations in WoK from 26 papers (0.42 citations per paper average); 18 (69%) papers uncited. ITID has so far scored 23 citations from 19 papers (1.21 average); 8 (42%) papers uncited. [For comparison, I took four issues of EJEG from late 2006-2007: 20 citations from 32 papers (0.63 citations per paper average); 24 (75%) of papers uncited.]
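The per-journal figures above are simple ratios: total citations divided by papers, and uncited papers as a share of the total. A minimal sketch of the arithmetic (the counts are those quoted above; the helper name is my own):

```python
def citation_stats(total_cites, n_papers, n_uncited):
    """Average cites per paper and percentage of papers never cited."""
    return total_cites / n_papers, 100 * n_uncited / n_papers

# Counts quoted in the text for the 2007 issues examined:
for name, cites, papers, uncited in [("EJISDC", 11, 26, 18), ("ITID", 23, 19, 8)]:
    avg, pct = citation_stats(cites, papers, uncited)
    print(f"{name}: {avg:.2f} cites/paper, {pct:.0f}% uncited")
# prints: EJISDC: 0.42 cites/paper, 69% uncited
#         ITID: 1.21 cites/paper, 42% uncited
```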
– And finally some Alexa comparisons of traffic and links for the two main online ICT4D journals and those in other information systems sub-areas (note ITID has only very recently gone online):
| Journal | Alexa Traffic Rank | Sites Linking In | Online Since |
|---|---|---|---|
| Electronic Journal of Information Systems in Developing Countries | 1,422,355 | 120 | 2001 |
| Information Technologies and International Development | 4,527,166 | 35 | 2009 |
| Electronic Journal of e-Government | 1,785,669 | 72 | 2002 |
| Electronic Journal of e-Learning | 2,092,075 | 126 | 2002 |
| Electronic Journal of Knowledge Management | 2,286,374 | 96 | 2002 |
| Journal of Community Informatics | 4,535,331 | 129 | 2004 |
My conclusion from all this: in terms of impact, ICT4D looks like a pretty standard research sub-field. We’re not punching above our weight but neither are we left out in the cold.
The sample sizes here are small, so conclusions are indicative rather than definitive:
– Where ICT4D research is good enough to get into peer-reviewed WoK-covered journals it is of similar quality and with similar impact to other sub-fields. Where it is good enough to get into peer-reviewed open access journals, the same is true. And where it’s not good enough for that, it’s probably pretty bad but, again, the same is no doubt true of other sub-fields.
– If you want your work to have maximum impact then, on average, your best bet is to publish in a traditional journal. But open access, non-WoK-listed journals should not be regarded as “no go” areas: they are read and used.
– The average (modal) ICT4D article is never formally cited, but that’s true of many research fields.
– If you want to feel better about the citation impact of your work, stick to Google Scholar.
– Finally, what about conferences? Of the roughly 120 ICT4D papers published during 2003-2008 that are covered by WoK and listed as originating from conferences, six went on to journal publication and are cited there; three are cited direct from their proceedings. The remaining 90%+, whether in journals or proceedings, are uncited. Conclusion: conferences may be good for networking, learning, and scenery; they are not good if you want your work to have a citation impact.
How big is the ICT4D research field? And is it growing or shrinking?
The first question is the harder to answer. I’ll offer an estimate based on conferences (which attract only a sub-set of the field, e.g. 350 attendees at ICTD2009 and 500 authors submitting papers) and my own contact lists. On that basis, I estimate that, worldwide, several hundred academics and several thousand PhD researchers are working specifically on ICT4D topics. They work alongside thousands of staff in donor agencies, national governments and private firms who occasionally contribute to research outputs. Several thousand more academic staff, particularly in business/management and informatics, undertake occasional research in the field.
What about trends? One way to measure is through the ISI Web of Knowledge, which records books, all papers in a large number of journals, and some conference proceedings. Searching for the term ‘ict4d’ produces only 25 results, almost all during 2007-2009; too few for any real analysis. Searching for ‘ict*’ and ‘developing countr*’ produces 395 results. A manual review suggests the great majority are ICT4D-relevant publications, and analysis shows the following trend (note results for 2009 are incomplete as papers are still being entered):
This shows dramatic growth in ICT4D research during the “noughties”: a nearly 2000% increase from 1999 to 2008; an average 39% annual growth rate.
In part, this might be an “ICT effect” reflecting greater use of the term. But it does also seem to reflect more general research growth in the area of development informatics. Publications using ‘info*’ and ‘developing countr*’ grew by 80% from 1999-2008 (7% annual average); and the narrower band of publications using ‘information technolog*’ and ‘developing countr*’ grew by 153% from 1999-2008 (from 83 to 210; an 11% annual average).
Just to check there wasn’t a general research growth effect, a cross-check with ‘political’ and ‘developing countr*’ showed a couple of hundred items per year published, but only a 14% growth in the literature from 1999-2008 (1% annual average). A better cross-check was with just ‘developing countr*’ which showed a 57% growth (5% annual); and just ‘ict*’ which showed a 126% growth (9% annual) during the same period.
From this data, then, ICT4D research publication is growing significantly faster than cognate research areas.
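The annual rates quoted above are compound growth rates derived from the start and end counts. A quick sketch of the calculation (the 83 → 210 counts are the ‘information technolog*’ figures quoted above; the function name is my own):

```python
def growth_rates(start, end, years):
    """Return (% total growth, % compound annual growth) over `years` intervals."""
    total = 100 * (end / start - 1)
    annual = 100 * ((end / start) ** (1 / years) - 1)
    return total, annual

# 1999 to 2008 is nine year-on-year steps:
total, annual = growth_rates(83, 210, 9)
print(f"total: {total:.0f}%, annual: {annual:.0f}%")  # prints: total: 153%, annual: 11%
```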
We can draw a similar conclusion of high growth by looking at ICT4D-specific journals. In 1999, these produced 33 articles. In 2009, they produced 182 articles: a 450% rise. This rise was very much related to growth in the number of journals: two in 1999 and eleven today (though one has yet to produce its 2009 edition, and one will only produce its first edition in 2010; only one of these journals is covered by the ISI Web of Knowledge).
Allowing for continued growth, and for items not covered by these two methods, this suggests at least 300 ICT4D journal articles will be published in 2010, and likely several hundred more under the general banner of development informatics in its broad sense. Plus, of course, all of the books, reports and conference papers, not to mention blogs, wikis and the like. [See more consideration and detail on this in the comments.]
That’s one hell of a change from 1987 when I first started academic work in the field and when, as I never tire of saying, the entire historical academic output on IT and development would fit on a single shelf of my bookcase. And it indicates ICT4D research as a fast-growing field with all the pros (greater audience, more jobs, more collaborators, more new ideas, more impact) and cons (more to read, greater competition) that brings.