Development Studies Journal Ranking Table

The following represents a first attempt at a “league table” for development studies journals.

Rank Journal Citation Score
1 World Development 6.04
2 Journal of Development Studies 4.90
3 Oxford Development Studies 4.06
4 Development Policy Review 3.20
5 Studies in Comparative International Development 2.40
6 Sustainable Development 2.39
7 European Journal of Development Research 1.90
8 Development and Change 1.89
9 Information Technology for Development 1.58
10 Information Technologies and International Development 1.55
11 Journal of International Development 1.46
12 Development 1.33
13 Third World Quarterly 1.30
14 Public Administration and Development 1.21
15 Development in Practice 1.03
16 Progress in Development Studies 0.88
17 Electronic Journal of Information Systems in Developing Countries 0.81
18 African Development Review 0.79
19 Gender and Development 0.58
20 Enterprise Development and Microfinance 0.45
21 Canadian Journal of Development Studies 0.45
22 IDS Bulletin 0.40
23 Information Development 0.37
24 Forum for Development Studies 0.17
25 Journal of Third World Studies 0.11

Comparator Journals

Journal of Development Economics 10.90
Human-Computer Interaction 4.06
Environment and Planning D 3.42
Information Systems Journal 2.89
The Information Society 1.64
Mountain Research and Development 0.91

Basis

– Journals were selected on the basis of appearing in various other development studies journal tables or lists.  However, development economics journals (including Economic Development and Cultural Change, Journal of Development Economics, Review of Development Economics, and The Developing Economies) were not included.  If you have suggestions for additions (or deletions), then let me know.

– Citation score is calculated by taking the papers published in each journal in 2008 and identifying how many times each paper has been cited in Google Scholar.  The average number of citations per paper was then divided by the average number of years since publication.  Very roughly, then, the score equates to the average number of Google Scholar citations per paper per year (sketched in code after this list).

– All papers published in 2008 were used if fewer than 20 were published; if more than 20 were published, a sample of at least 20 papers was taken, building outwards from the mid-year issues (also covered in that sketch).

– One anomalous paper, with over 10 times the citations of any other (a pattern not seen in any other journal), was omitted from African Development Review.  Had it been included, ADR would have placed seventh.

– This exercise will be repeated and expanded in future years.  What is presented here should only be seen as a first, fairly rough-and-ready set of figures.  The original data used for the calculations can be found here.
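
As a minimal sketch of the sampling and scoring steps above (all figures are invented; the real citation counts were gathered by hand from Google Scholar, and the exact order in which issues were added to the sample is my assumption):

```python
from statistics import mean

def sample_papers(issues, minimum=20):
    """Take whole issues, working outwards from mid-year, until the sample
    holds at least `minimum` papers.  `issues` is a chronologically ordered
    list of lists of per-paper citation counts, one inner list per 2008
    issue.  All papers are used if fewer than `minimum` were published."""
    if sum(len(issue) for issue in issues) < minimum:
        return [cites for issue in issues for cites in issue]
    mid = len(issues) // 2
    sample = []
    # Visit issues in order of their distance from the middle of the year.
    for i in sorted(range(len(issues)), key=lambda i: abs(i - mid)):
        sample.extend(issues[i])
        if len(sample) >= minimum:
            break
    return sample

def citation_score(citations, years_since_publication):
    """Average cites per paper divided by average years since publication:
    roughly, Google Scholar citations per paper per year."""
    return mean(citations) / mean(years_since_publication)

# Hypothetical journal: six 2008 issues, citation counts per paper.
issues = [[3, 9], [12, 0], [7, 23], [5, 9], [1, 14], [2, 6]]
sample = sample_papers(issues, minimum=10)  # small minimum for the example
print(round(citation_score(sample, [2.0] * len(sample)), 2))
```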

Notes

– The raw figures shown here should not be compared directly with the impact factor scores provided under Planning and Development in ISI’s Journal Citation Reports, since the two are calculated on different bases.  The relative rankings, however, can be compared (see the sketch after these notes).

– Different disciplines have different citation habits and norms.  Specifically, if economists cite more heavily, then those development studies journals that include a greater proportion of development economics papers may gain a higher overall citation score.

– Conversely – and requiring further investigation – in compiling the figures I got some sense that papers in special issues tend to receive fewer citations.  Journals that publish a lot of special issues may therefore receive a lower overall citation score.
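
As an illustration of that kind of ranking comparison, a rank correlation can be computed for the journals appearing in both lists.  A minimal sketch, assuming scipy is installed and using invented rank vectors rather than the real ones:

```python
from scipy.stats import spearmanr

# Invented ranks for six journals that appear in both lists.
heeks_ranks = [1, 2, 3, 4, 5, 6]
isi_jcr_ranks = [2, 1, 6, 3, 5, 4]

# Spearman's rho measures agreement between the two orderings
# (+1 = identical ranking, -1 = exactly reversed).
rho, p_value = spearmanr(heeks_ranks, isi_jcr_ranks)
print(f"Spearman rho = {rho:.2f} (p = {p_value:.2f})")
```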

Reflections

– These average figures provide no guidance on whether your individual paper would be cited more highly if published in one journal or another.  However, the rankings could be used to provide guidance or evidence on the general impact of a selected journal.  (Of course recognising that overall impact is about more than just citations.)

– The figures suggest that, beyond the obvious top two of JDS and World Development, there may be some mismatch between previous subjective ratings and actual impact.  For example, Oxford Development Studies and Development Policy Review rank 3rd and 4th here, yet are unrated by most other journal rating schemes.

– There is a moderate mismatch with the ISI JCR 2008 impact factor ranking.  Most notably, four of the top ten journals here do not appear at all in the ISI list, including the two top-cited ICT-for-development journals.

Other Data

– The table below gives details of other ranking and rating data on development studies and some development economics journals.

Scales run from high to low, as indicated in each column header.

| Journal | Aston 2008 (4->0) | CNRS 2008 (1*->4) | Ideas 2010 (/731) | SJR 2010 (/118) | WoK 2010 (/43) | ABDC 2010 (A*->C) | ABS 2010 (4->1) | SoM 2010 (4->1) | Heeks 2010 (/25) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| African Development Review | | | | 65 | 43 | | | 2 | 18 |
| Canadian Journal of Development Studies | | | | 78 | 42 | | | | 21 |
| Development | | | 666 | 28 | | | | | 12 |
| Development and Change | 2 | 2 | | 15 | 19 | B | 2 | | 8 |
| Development in Practice | | | | 32 | | | | | 15 |
| Development Policy Review | | | 270 | 10 | 8 | | | | 4 |
| Economic Development and Cultural Change | | | 117 | | 24 | A | 3 | 4 | |
| Electronic Journal of Information Systems in Developing Countries | | | | | | | | | 17 |
| Enterprise Development and Microfinance | | | | | | | | | 20 |
| European Journal of Development Research | | | 438 | 48 | | | | | 7 |
| Forum for Development Studies | | | | | | | | | 24 |
| Gender and Development | | | | 73 | | | | | 19 |
| IDS Bulletin | | | | 70 | 37 | | | | 22 |
| Information Development | | | | | | | | | 23 |
| Information Technologies and International Development | | | | | | | | | 10 |
| Information Technology for Development | | | | | | | | | 9 |
| Journal of Development Economics | | | 43 | | 36 | A* | 3 | 4 | |
| Journal of Development Studies | 2 | 2 | 152 | 2 | 26 | A | 3 | 4 | 2 |
| Journal of International Development | 1 | 3 | 292 | 22 | | B | 1 | 1 | 11 |
| Journal of Third World Studies | | | | 86 | | | | | 25 |
| Oxford Development Studies | | | 192 | 58 | | | | 1 | 3 |
| Progress in Development Studies | | | | 30 | | | | | 16 |
| Public Administration and Development | | | | 62 | 39 | A | 2 | 2 | 14 |
| Review of Development Economics | | | 129 | 26 | 32 | | | 1 | |
| Studies in Comparative International Development | | | | 23 | 31 | A | | | 5 |
| Sustainable Development | | | | 9 | 11 | | | | 6 |
| The Developing Economies | | | 474 | | 35 | B | | | |
| Third World Quarterly | | 2 | | 29 | 30 | A | 2 | | 13 |
| World Development | 3 | 1 | 134 | | 9 | A | 3 | 3 | 1 |


Key
 
– ABS – UK Association of Business Schools: http://www.the-abs.org.uk/?id=257

– Ideas – citation data from the RePEc project of downloadable research papers: http://ideas.repec.org/top/top.journals.simple.html (economics and finance research)

– SJR – Scopus-based citation ranking: http://www.scimagojr.com/journalrank.php?category=3303&area=0&year=2008&country=&order=sjr&min=0&min_type=cd (development journals)

– SoM – Cranfield School of Management: https://www.som.cranfield.ac.uk/som/dinamic-content/media/SOM%20Journal%20Rankings%202010%20-%20alphabetical.pdf

– WoK – 2008 impact factor in ISI’s Journal Citation Reports under Planning and Development

– Heeks – the citation-score ranking presented at the top of this post

– All other data from Harzing’s Journal Quality List: http://www.harzing.com/jql.htm


Why ERP Systems in Developing Countries Fail

Enterprise resource planning (ERP) systems are increasingly being used in business organisations in developing countries, and also in the public and NGO sectors.  ERP promises to integrate data systems – financials, logistics, HR, etc. – across the organisation, thus saving money and improving decision-making.  But the failure rate for ERP implementations is high, with particular problems found in developing-country organisations.

A new research paper from the University of Manchester’s Centre for Development Informatics analyses why ERP systems in developing countries fail: https://www.gdi.manchester.ac.uk/research/publications/di/di-wp45/

It draws evidence from an in-depth Middle East case study, and first uses an analytical model based on DeLone & McLean’s work.  This model gathers evidence on the success or failure of any ICT project against five evaluation criteria: system quality, information quality, use and user satisfaction, individual impact, and organisational impact.  It provides an objective basis for identifying the case study ERP system as an almost-complete failure.

A second analytical model – the design-reality gap framework – was then used to explain why this ERP implementation failed.  Using rating-scale evidence gathered on seven ‘ITPOSMO’ dimensions, it shows there was a large gap between ERP system design expectations and case organisation realities prior to implementation.
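
As a rough sketch of how such ratings can be tallied (the seven dimension names follow the ITPOSMO checklist; the 0-10 scale and the scores themselves are my assumptions for illustration, not figures from the paper):

```python
# Hypothetical design-reality gap ratings, one per ITPOSMO dimension:
# 0 = no gap between system design and current reality, 10 = complete gap.
gaps = {
    "Information": 7,
    "Technology": 4,
    "Processes": 8,
    "Objectives and values": 9,
    "Staffing and skills": 6,
    "Management systems and structures": 7,
    "Other resources": 5,
}

total = sum(gaps.values())
print(f"Total design-reality gap: {total}/{10 * len(gaps)}")
# The larger the total gap, the greater the risk of failure; for a
# successful system these gaps need to close during implementation.
```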

This is often true of ERP systems, since they seek to make significant changes within client organisations.  However, the design-reality gap analysis was repeated later in the implementation, showing that the gaps did not close, as they need to do for a successful system.

Practical recommendations for risk identification and mitigation are outlined, based both on closing specific design-reality gaps during ERP implementation and on a set of generic gap-closure techniques, such as the development and use of ‘hybrid’ professionals.

In research terms, the case demonstrates the value of the DeLone/McLean model for categorising ERP and other information system project outcomes, and the value of the design-reality gap model for analysing project implementation and explaining why project outcomes occur.

A revised version of the paper has been published in the Journal of Enterprise Information Management: http://www.emeraldinsight.com/10.1108/17410391011019741

Other experiences of ERP or similar enterprise system implementations in developing countries would be welcome as comments.