In a comment on an earlier post, Dmitry Epstein asked about the quality and impact of ICT4D research. I’ve already blogged about what constitutes good quality ICT4D research, so here I will just add a few data snippets on these topics.
Quality of ICT4D Research
The good news is that there is good quality ICT4D research around:
– It makes its way into the top journals: there’s the 2007 special issue of MIS Quarterly (top info. systems journal) on IS in developing countries; and I count at least four articles on ICTs and developing countries in World Development (top dev. studies journal) during 2009.
– Inclusion in ISI’s Web of Knowledge is a rather rough quality benchmark but it’s a benchmark nonetheless and, as noted in a previous post, there are a few hundred papers per year within the boundaries of development informatics recorded on WoK.
– On the assumption that citation rates are linked to quality, some individual ICT4D items score well. The article “Information Systems in Developing Countries: Failure, Success and Local Improvisations” gets 44 citations on WoK and 217 on Google Scholar. (Modesty be damned: if you don’t believe in the quality of your own work, you shouldn’t be writing.)
But then there’s the bad news about ICT4D research quality:
– Of the eleven specialist journals in the field only one, Information Development, is seen as worthy of inclusion in WoK.
– My subjective but honest opinion based on reading all 250 papers submitted to the ICTD2009 conference: the general quality of the big, fat, long tail of work in our field is pretty poor.
And lastly, there’s the average news, conveyed below: citation evidence (if taken as a proxy for quality) suggests ICT4D research is no better and no worse than other research sub-fields.
Impact of ICT4D Research
Citation evidence is only one part of the impact story but… I’ve already noted that some individual ICT4D papers get highly cited. I should also add a bit of background on citation rates: work in early 2009 looked at citation rates for papers published during 1998-2008.
In computer sciences, the average number of citations per paper was 3.06, ranging from 7.06 for papers published in 1998 to 0.10 for papers published in 2008 (papers take time to pick up citations). Equivalent figures are: economics and business – 5.02 average (from 10.19 to 0.13 for 1998 – 2008); and social sciences – 4.06 average (from 7.48 to 0.16).
Note these are narrow citation rates from WoK-type databases; Google Scholar citation rates are much higher because it records a much wider range of published material: a 1:4 ratio of WoK to Google Scholar citation numbers looks about standard. Note also that even for articles in peer-reviewed journals, somewhere between 25% and 75% of articles – varying by discipline – are never cited in other peer-reviewed articles.
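As a concrete illustration, here is a minimal sketch of that rule-of-thumb arithmetic (the 1:4 multiplier is the rough benchmark noted above, not a law; actual ratios vary by field and by paper):

```python
# Rough conversion between WoK and Google Scholar citation counts,
# using the approximate 1:4 ratio noted above. Illustrative only.

WOK_TO_GSCHOLAR = 4  # rough rule-of-thumb multiplier, an assumption

def estimated_gscholar_cites(wok_cites: int) -> int:
    """Estimate Google Scholar citations from a WoK count."""
    return wok_cites * WOK_TO_GSCHOLAR

def average_cites_per_paper(citation_counts: list[int]) -> float:
    """Average citations per paper, counting uncited (zero) papers."""
    return sum(citation_counts) / len(citation_counts)

# Example: the 'Failure, Success and Local Improvisations' article has
# 44 WoK citations; the rule of thumb predicts ~176 on Google Scholar
# (the actual figure quoted earlier is 217, in the right ballpark).
print(estimated_gscholar_cites(44))  # -> 176
```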
Some data bits:
– Comparing the total number of WoK citations for the four articles in MISQ’s special issue on IS in developing countries (vol. 31, issue 2) with those for the first four articles in the next issue shows no difference: both sets get 14 citations.
– Taking the (15) journal articles published in 2006 that come up on WoK from a search on ‘ict*’ and ‘developing countr*’ (excluding off-topic items), they are cited 34 times: an average of 2.27 citations each. Comparing with the 2005 figures from the study reported above (it appeared about a year ago, hence moving back one year for a like-for-like comparison), that 2.27 figure is somewhat better than the average for computer science (1.85) but a bit worse than those for social sciences (2.99) and economics and business (3.08). [For comparison, substituting ‘agricultur*’ for ‘ict*’ and taking items listed in the public administration subject area, the 15 on-topic articles were cited 25 times: a 1.67 average.]
– Taking the same type of journal articles over the period 2004-2007, there are 46 in all, of which 22 (48%) are uncited; the average citation rate across all items is 1.67 cites per article. [4 of the 15 agriculture articles in 2006 were uncited, compared with 3 of the 15 ICT4D articles.]
– Performance in open access journals outside WoK is lower (the short sketch after the table below reproduces this arithmetic). Both EJISDC and ITID had four issues in 2007. EJISDC has so far scored 11 WoK citations from 26 papers (an average of 0.42 citations per paper), with 18 papers (69%) uncited. ITID has so far scored 23 citations from 19 papers (a 1.21 average), with 8 papers (42%) uncited. [For comparison, I took four issues of EJEG from late 2006-2007: 20 citations from 32 papers (a 0.63 average), with 24 papers (75%) uncited.]
– And finally some Alexa comparisons of traffic and links for the two main online ICT4D journals and those in other information systems sub-areas (note ITID has only very recently gone online):
| Journal | Alexa Traffic Rank | Sites Linking In | Online Since |
|---|---|---|---|
| Electronic Journal of Information Systems in Developing Countries | 1,422,355 | 120 | 2001 |
| Information Technologies and International Development | 4,527,166 | 35 | 2009 |
| Electronic Journal of e-Government | 1,785,669 | 72 | 2002 |
| Electronic Journal of e-Learning | 2,092,075 | 126 | 2002 |
| Electronic Journal of Knowledge Management | 2,286,374 | 96 | 2002 |
| Journal of Community Informatics | 4,535,331 | 129 | 2004 |
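As a quick check on the open access citation figures a few bullets up, here is the arithmetic behind those averages and uncited percentages (all inputs are numbers already quoted in the post; nothing new is added):

```python
# (WoK citations, papers in the 2007 issues, uncited papers),
# as quoted in the open access comparison above.
journals = {
    "EJISDC": (11, 26, 18),
    "ITID":   (23, 19, 8),
    "EJEG":   (20, 32, 24),
}

for name, (cites, papers, uncited) in journals.items():
    avg = cites / papers                    # cites per paper
    pct_uncited = 100 * uncited / papers    # share never cited
    print(f"{name}: {avg:.2f} cites/paper, {pct_uncited:.0f}% uncited")

# EJISDC: 0.42 cites/paper, 69% uncited
# ITID:   1.21 cites/paper, 42% uncited
# EJEG:   0.62 cites/paper, 75% uncited (0.63 when rounded half-up)
```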
My conclusion from all this: in terms of impact, ICT4D looks like a pretty standard research sub-field. We’re not punching above our weight but neither are we left out in the cold.
Overall Conclusion
The sample sizes here are small, so conclusions are indicative rather than definitive:
– Where ICT4D research is good enough to get into peer-reviewed, WoK-covered journals, it is of similar quality and similar impact to that of other sub-fields. Where it is good enough to get into peer-reviewed open access journals, the same is true. And where it’s not good enough for that, it’s probably pretty bad; but, again, the same is no doubt true of other sub-fields.
– If you want your work to have maximum impact then, on average, your best bet is to publish in a traditional journal. But open access, non-WoK-listed journals should not be regarded as “no go” areas: they are read and used.
– The average (modal) ICT4D article is never formally cited, but that’s true of many research fields.
– If you want to feel better about the citation impact of your work, stick to Google Scholar.
– Finally, what about conferences? Of the roughly 120 ICT4D papers published during 2003-2008 that are covered by WoK and listed as originating from conferences, six are cited from later journal versions and three from their proceedings. The remaining 90%+ are uncited, whether in journals or proceedings. Conclusion: conferences may be good for networking, learning and scenery; they are not good if you want your work to have citation impact.
Just a postscript: part of my verdict on at least the more technical papers submitted for ICTD2009 could reflect my techie-turned-social-scientist background. My view of much technical work in the area is wonderfully summarised by Mike Best, reflecting on those papers which did make the cut for the conference: “…social scientists complained that the technical work lacked sophistication, was weak in evaluation, and was not grounded in the needs and realities of the users. A common story that I would hear from the social scientists is the computer scientists would get up and say “I decided to build this thing. So I worked on this thing. Then I worked a bit more on the thing, then I adjusted the thing, and then the thing was done. Then I took my thing to Ghana and asked ten people whether they liked my thing. Nine people liked my thing. Hoorah for my thing.””
Though I will just add a rider: somewhat to my surprise, I think I’ve cited the technical papers from the conference as many times as the social science papers in my subsequent writing. My explanation would be that technical ICT4D research is on average less good than social science ICT4D research but often more interesting, more exciting and more innovative.
Thank you for this post, which builds nicely on the previous one. I am glad to have been the trigger 🙂
I understand that focusing on citations is limiting. At the same time, if citations are the focus, I think the raw number of citations is only part of the story. I wonder what the map would look like if you factored in the impact factors of the journals where ICT4D research is getting published?
Thank you again.
Good question. One difficulty in answering it is that the specialist ICT4D journals are not listed in WoK, so I don’t think they have published impact factors. However, since impact factor is related to the average number of citations per article, the blog provides a (very) little bit of data on that.
For example, the eight MISQ articles discussed above show an average of 3.5 citations per paper (28 citations across the 8 articles). You can then compare this “impact factor” with those for the ITID journal (1.2) and EJISDC (0.4).
Although not identifying individual journals, my next blog posting will be comparing the citation impact factor of different types of ICT4D publication.
Thank you! I will be looking for your next post.
I am not a specialist in the way impact factors are calculated, but I think it is a bit more than the mere number of citations. At the least it should be some kind of proportion of cited articles to the total of published articles. At a more sophisticated level it would also take into account the impact factor of the journals that cite the articles (rather in the way the Google algorithm works). I am sure there must be literature on that. Just thinking out loud…
Impact factor (at least as calculated by WoK) is as stated: average number of citations per article; that of course does incorporate the fact that some articles are uncited.
I just checked for MISQ. The two-year impact factor (the average number of citations received in Year X by papers published in Years X-1 and X-2) for MISQ is 5.2. The five-year impact factor is 11.6.
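Written out, the standard two-year impact factor for Year X is:

$$\mathrm{IF}_X = \frac{C_X(P_{X-1}) + C_X(P_{X-2})}{|P_{X-1}| + |P_{X-2}|}$$

where $C_X(P_Y)$ denotes the citations received in Year X by papers published in Year Y, and $|P_Y|$ is the number of citable papers published in Year Y; uncited papers count in the denominator but add nothing to the numerator.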
But MISQ is exceptional – the median two-year impact factor for all Info. and Library Science journals (the closest category to ICT4D, including many IS journals) is 0.8.
The figure I quoted for ITID (1.2) is for all citations of the listed articles. Given the 2007 publication date, the two-year impact factor might be about half that – 0.6 – suggesting ITID is not far off the average for WoK-listed journal impact factor.
On the second point, yes you could look at the secondary-effect data on impact factor of sources in which ICT4D papers are cited but, as noted, many of those sources are specialist ICT4D journals which don’t have published impact factors. So you’d have to create those for each journal. And for all other non-WoK-listed sources. And to do it properly (since impact factors vary each year), you’d need to do it for each year of each source. And the only way to do it would be manually, one cite at a time. Unless I’ve missed a shortcut, you’d be talking weeks if not months of work. That’s for a WoK-based approach. If you used Google Scholar, then there might be a quicker way using link-counting. But it would still be time-consuming. And I’m not quite clear what the valuable question is that justifies the time required to answer it.
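For what it’s worth, the recursive weighting Dmitry describes, where a citation counts for more when it comes from a higher-standing journal, is essentially PageRank applied to the journal citation graph (this is what Eigenfactor-type scores do). A minimal sketch with entirely made-up numbers, just to show the shape of the computation; the hard part, as noted above, is assembling the real citation matrix:

```python
# PageRank-style journal weighting over a tiny, made-up citation
# matrix: cites[i][j] = citations from journal i to journal j.
# Purely illustrative; real Eigenfactor-type scores need the full
# citation data, which is exactly the manual work described above.
cites = [
    [0, 3, 1],  # journal A cites B three times, C once
    [2, 0, 4],  # journal B cites A twice, C four times
    [1, 1, 0],  # journal C cites A once, B once
]
n = len(cites)
damping = 0.85            # standard PageRank damping factor
scores = [1.0 / n] * n    # start from equal weights

for _ in range(50):       # power iteration to convergence
    new_scores = []
    for j in range(n):
        # weight each incoming citation by the citing journal's
        # current score, normalised by its total outgoing citations
        inflow = sum(
            scores[i] * cites[i][j] / sum(cites[i])
            for i in range(n)
            if sum(cites[i]) > 0
        )
        new_scores.append((1 - damping) / n + damping * inflow)
    scores = new_scores

print([round(s, 3) for s in scores])  # relative journal weights
```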
Thank you for this posting. My views about modern metrics are not fit for polite society, whether in economics, financial analysis and accounting, or the academic world. On the other hand, I appreciate very much what science and technology have accomplished over the last 50 years, and wish that some of this could have been applied to give us meaningful metrics for a smart society.
I am an advocate for management information … that is the least amount of information that enables good decisions to be made reliably, and then confirms that decisions were good. This is not a statistical construct or numbers that have no meaning … this is a best pragmatic approximation to cause and effect, and with this approach good things can have great impact.
In my experience, ICT for anything has been way less successful than it should have been based on the theoretical possibilities. For years there was the “PC Paradox” where investment in PCs ended up reducing office productivity. Much of modern mobile technology is producing way less than what it could in terms of impact. The metrics being used to understand this are clumsy … to be polite.
With technology perhaps a million times more powerful than when I started my career … I wonder why the metrics of society have improved so little?
Peter Burgess
Community Analytics (CA)
As I recall, there are (at least) four possible answers to that last question:
– Technology does improve things; it’s just that we’re measuring the wrong things, or measuring things wrong (“Please fund my research project on impact measurement”)
– Technology can improve things; but only if we get a lot of other things right (“Please fund my research project on implementation best practice”)
– Technology will improve things; but only if we get some better technology (“Please fund my research project to develop this new hardware/software”)
– Technology does not improve things (“Please fund my research project on Proust”)
The Information Technology for Development journal has had an impact on ICT4D research through citations in other journals such as MIS Quarterly. Journals citing Information Technology for Development (data analysed from Thomson Reuters’ Web of Science™; years analysed: 1998-2008; only journals with 2 or more citing articles shown):
| Journal | Articles Citing Jnl | % Articles Citing Jnl |
|---|---|---|
| GOVERNMENT INFORMATION QUARTERLY | 4 | 6.35% |
| INFORMATION SOCIETY | 4 | 6.35% |
| MIS QUARTERLY | 4 | 6.35% |
| ELECTRONIC LIBRARY | 3 | 4.76% |
| INTERNATIONAL JOURNAL OF TECHNOLOGY MANAGEMENT | 3 | 4.76% |
| JOURNAL OF INFORMATION TECHNOLOGY | 3 | 4.76% |
| EUROPEAN JOURNAL OF INFORMATION SYSTEMS | 2 | 3.17% |
| INFORMATION SYSTEMS FRONTIERS | 2 | 3.17% |
| INFORMATION SYSTEMS JOURNAL | 2 | 3.17% |
| JOURNAL OF GLOBAL INFORMATION MANAGEMENT | 2 | 3.17% |
| JOURNAL OF WORLD BUSINESS | 2 | 3.17% |
I’m not sure this offers much of a guide – no doubt many journals will find themselves cited a few times over a ten-year period.
Plus it’s not clear what “% Articles Citing Jnl” means – it certainly doesn’t mean the percentage of articles in the target journal that cite IT for Devel journal articles.
An appropriate measure of the relative value of this journal would be the average citation rate of all articles published in 2007, as done for ITID and EJISDC journals.
I see what you mean. If you want to compare ITDJ with the online journals in this area, the online access stats for 2007 are as follows:
– Abstracts: 18,758
– Full text: 7,495
Based on a selection of highly cited papers reported by the publisher in 2008, ITDJ papers were cited 3,598 times in 2007.
In terms of an average citation rate and ranking in other journals for that year, I would have to request additional information from the publisher. I know that citations of the ITD Journal in the fields of economics and government have been high. So it seems that our own MIS colleagues may not cite ITDJ papers as often as scholars in those fields.
In 2007, ITDJ had around 70-80 papers available at its web site. That suggests around 250 abstract downloads per paper on average (18,758 across roughly 75 papers), and about 100 full text downloads per paper.
Download figures for the Electronic Journal of Information Systems in Developing Countries appear to be around five times greater.
However, it looks as if citation rates for ITDJ are higher; something I will publish on at some point.