European Ph.D. on Social Representations and Communication

Multimedia and Distance Learning System

 

29th International Lab Meeting

of the

European/International Joint Ph.D. in Social Representations and Communication

 

 

DIDACTIC AND COMPLEMENTARY MATERIALS

 

"Advanced Training in the meta-theoretical analysis of the specialised literature on Social Representations and Communication"

 

at the European/International Joint Ph.D. in Social Representations & Communication Research Center and Multimedia LAB, P.za Cavalieri di Malta 2, Rome, Italy

 

24th - 27th January 2016

 

DIDACTIC MATERIALS FOR THE META-THEORETICAL ANALYSIS

Meta-Theoretical Analysis of Literature on Social Representations: Basic Guidelines for European PhD Research Trainees

Meta-Theoretical Analysis of Literature on Social Representations: Basic Bibliography on Social Representations

de Rosa, A.S. (2014). “Geo-mapping” Tutorial for visualizing statistical data on geographical maps. In A.S. de Rosa (2014c), Guidelines for the development of the So.Re.Com. “A.S. de Rosa”@-library. Unpublished University Report.

 

COMPLEMENTARY DOCUMENTS REGARDING BIBLIOMETRIC TOOLS

Sources for bibliometric tools, together with documents on the controversial debate concerning the evaluation of scientific and publishing quality from different points of view and from different cultural, geographical, and institutional contexts (U.S.A., France, Italy, the European Commission, the Fédération Française des Psychologues et de Psychologie, the Italian Society of Psychology, the National Agency for the Evaluation of Universities and Research Institutes, The Carnegie Foundation for the Advancement of Teaching...).

1. Sources for Bibliometric Tools:

 

 

2. Bibliometrics and beyond: Literature on the controversial debate about its application for the quality research assessment

Adam, D. (2002). Citation analysis: The counting house. Nature, 415(6893): 726-729.

Allen, L., Jones, C., Dolby, K., Lynn, D., Walport, M. (2009). Looking for Landmarks: The Role of Expert Review and Bibliometric Analysis in Evaluating Scientific Publication Outputs. PLoS ONE.

Baccini A. (2010), Valutare la ricerca scientifica. Uso e abuso degli indicatori bibliometrici. Bologna: Il Mulino.

Bar-Ilan, J. (2008). Which h-index? A comparison of WoS, Scopus and Google Scholar. Scientometrics, 74: 257-271.

Batista, P.D., Campiteli, M.G., Kinouchi, O. (2006). Is it possible to compare researchers with different scientific interests? Scientometrics, 68(1): 179-189.

Burgelman, J.C., Osimo, D., Bogdanowicz, M. (2010). Science 2.0 (change will happen…). First Monday, 15(7), 5 July 2010.

De Bellis, N. (2009). Bibliometrics and Citation Analysis: From the Science Citation Index to Cybermetrics. Lanham, MD: Scarecrow Press.

De Bellis, N. (2014). Introduzione alla bibliometria: dalla teoria alla pratica, Milano: Associazione Italiana Biblioteche.

Chimes, C. (2014). Altmetrics. 7th Unica Scholarly Communication Seminar: Visibility, Visibility, Visibility, Rome, 27th-28th November 2014.

Cronin, B. (2005). The Hand of Science: Academic Writing and Its Rewards. Lanham, MD: Scarecrow Press.

Cronin, B., Sugimoto, C. (2014). Beyond bibliometrics: Harnessing multidimensional indicators of scholarly impact. Cambridge, MA: MIT Press.

CRUI Commissione Biblioteche, Gruppo di lavoro sull’Open Access (2012). Linee guida per i metadati degli archivi istituzionali.

de Rosa, A.S. (2014). Article, Book format, or both? Shared criteria adopted for the double doctoral thesis format and language in a European/International joint networked Ph.D. program. EUA-CDE 7th workshop on “The Outcomes of Doctoral Education”, University of Izmir, Turkey, 23-24 January 2014.

DORA (2012). San Francisco Declaration on Research Assessment: Putting Science into the Assessment of Research.

Ernst, R.E. (2010). The follies of citation indices and academic ranking lists: A brief commentary to ‘Bibliometrics as weapons of mass citation’. Chimia, 64(1/2): 90.

European Bulletin of Social Psychology (EASP) (November 30, 2013). Opinions and Perspectives by the Participants of the Small Group Meeting ‘Developing Diversity in EASP’, Lausanne, June 12-14, 2013: Developing Diversity in EASP as a means to achieve a vibrant and relevant social psychology, 5(2): 5-9.

Fanelli, D. (2010). Do pressures to publish increase scientists' bias? An empirical support from US states data. PLoS ONE, 5(4), published online 21 April. DOI: 10.1371/journal.pone.0010271.

Figà-Talamanca, A. (2000). L'Impact Factor nella valutazione della ricerca e nello sviluppo dell'editoria scientifica. IV Seminario Sistema Informativo Nazionale per la Matematica SINM 2000: un modello di sistema informativo nazionale per aree disciplinari (Lecce, 2 ottobre 2000).

Galteron, I., Williams, G. (2014). The French experience of evaluation of research in humanities and social sciences. Workshop Internazionale “La valutazione della ricerca nelle Humanities and Social Science”, ANVUR (National Agency for the Evaluation of Universities and Research Institutes), Rome, 17 November 2014.

Garfield, E. (1955). Citation indexes for science: A new dimension in documentation through association of ideas. Science, 122(3159): 108-111.

Garfield, E. (2006). The history and meaning of the journal impact factor. JAMA, 295: 90-93.

Gimenez Toledo, E. (2014). Indicators for SSH’s book publishers: recent development in Spain. Workshop Internazionale “La valutazione della ricerca nelle Humanities and Social Science”, ANVUR (National Agency for the Evaluation of Universities and Research Institutes), Rome, 17 November 2014.

Glaser, J. (2014). Definitions of quality and internationalisation in Humanities. Workshop Internazionale “La valutazione della ricerca nelle Humanities and Social Science”, ANVUR (National Agency for the Evaluation of Universities and Research Institutes), Rome, 17 November 2014.

Halevi, G., Moed, H.F. (2014). 10 years of research impact: top cited papers in Scopus 2001-2011. Research Trends, 38.

Higher Education Funding Council for England (HEFCE) (2015). Metrics cannot replace peer review in the next REF.

Hug, S.E., Ochsner, M. (2014). Quality criteria and indicators in the light of humanities scholars’ notions of quality: a bottom-up approach. Workshop Internazionale “La valutazione della ricerca nelle Humanities and Social Science”, ANVUR (National Agency for the Evaluation of Universities and Research Institutes), Rome, 17 November 2014.

Kurtz, M., Bollen, J. (2010). Usage bibliometrics. Annual Review of Information Science and Technology, 44(1): 1-64.

Labini, S. (2013). Ricerca scientifica: dati falsificati, meritocrazia e competizione. Blog post, 21 November 2013.

Moed, H.F. (2002). The impact-factors debate: the ISI's uses and limits (correspondence). Nature, 415, 14 February: 731-732.

Moed, H.F. (2005). Citation analysis in research evaluation. Dordrecht: Springer.

Moed, H.F. (2010). Measuring contextual citation impact of scientific journals. Journal of Informetrics, 4(3): 265-277.

Moed, H.F. (2014a). Uso degli indicatori bibliometrici nella valutazione della ricerca. Teaching course at the Scuola Superiore di Studi Avanzati, Sapienza University of Rome, 20 October 2014.

Moed, H.F. (2014b). New development in the evaluation of HSS research. Workshop Internazionale “La valutazione della ricerca nelle Humanities and Social Science”, ANVUR (National Agency for the Evaluation of Universities and Research Institutes), Rome, 17 November 2014.

Molinié, A., Bodenhausen, G. (2010). Bibliometrics as Weapons of Mass Citation. CHIMIA International Journal for Chemistry, 64(1/2): 78-89.

Moskovkin, V.M., Bocharova, E.A., Balashova, O.V. (2014). Journal benchmarking for strategic publication management and for improving journal positioning in the world ranking systems. Campus-Wide Information Systems, 31(2/3): 82-99.

Plume, A., van Weijen, D. (2014). Publish or perish? The rise of the fractional author… Originally published in the Elsevier newsletter “Research Trends”, Issue 38. SciELO in Perspective. [Viewed 9 November 2014].

Priem, J., Piwowar, H. (2012). The launch of ImpactStory: using altmetrics to tell data-driven stories. Impact of Social Sciences blog.

Priem, J., Taraborelli, D., Groth, P., Neylon, C. (2010). Altmetrics: A manifesto.

Ségalat, L. (2010). La scienza malata. Milano: Cortina.

Seglen, P.O. (1997). Why the impact factor of journals should not be used for evaluating research. BMJ, 314: 498-502.

Torres-Salinas, D., Cabezas-Clavijo, A., Jiménez-Contreras, E. (2013). Altmetrics: New indicators for scientific communication in Web 2.0. Comunicar, 21(41): 53-60.

Van Leeuwen, T. (2014). The meaning of referencing in the humanities and social sciences and its interpretation in an evaluative context. Workshop Internazionale “La valutazione della ricerca nelle Humanities and Social Science”, ANVUR (National Agency for the Evaluation of Universities and Research Institutes), Rome, 17 November 2014.

Van Raan, A.F.J. (2005). Fatal attraction: Conceptual and methodological problems in the ranking of universities by bibliometric methods. Scientometrics, 62(1): 133-143.

Vincent, N. (2014). Outputs and access: Two issues in the evaluation of HSS research. Workshop Internazionale “La valutazione della ricerca nelle Humanities and Social Science”, ANVUR (National Agency for the Evaluation of Universities and Research Institutes), Rome, 17 November 2014.

Zuccala, A. (2014). Importance of books and development of book citation indices for evaluating the humanities. Workshop Internazionale “La valutazione della ricerca nelle Humanities and Social Science”, ANVUR (National Agency for the Evaluation of Universities and Research Institutes), Rome, 17 November 2014.

 

3. Big Data, Meta-data, Big Science, Open Access, Open Data, Academic Social Networking

Association for Psychological Science (APS) (2014). Big Data, Big Sciences. 26th Annual Convention, San Francisco, May 22-25, 2014.

Aventurier, P. (2014). Academic social networks: challenges and opportunities. 7th Unica Scholarly Communication Seminar: Visibility, Visibility, Visibility, Rome, 27th-28th November 2014.

CRUI Commissione Biblioteche, Gruppo di lavoro sull’Open Access (2009). L’Open Access e la valutazione sui prodotti della ricerca scientifica: Raccomandazioni, luglio 2009.

Delle Donne, R. (2010). Open access e pratiche della comunicazione scientifica: La politica della CRUI. In Guerrini, M., Gli archivi istituzionali: Open access, valutazione della ricerca e diritto d’autore. Milano: Editrice Bibliografica, 125-150.

European Union (2014). The Data Harvest: How sharing research data can yield knowledge, jobs and growth. An RDA Europe Report, Brussels.

Lourenço, J., Borrell-Damian, L. (2014). Open Access to Research Publications: Looking Ahead. EUA briefing paper, Brussels: European University Association (available on http://www.eua.be).

Nielsen, M. (2012). Reinventing discovery: The new era of networked science. Princeton, N.J.: Princeton University Press.

Scheliga, K., Friesike, S. (2014). Putting open science into practice: A social dilemma? First Monday, [S.l.], August 2014. ISSN 1396-0466. DOI: 10.5210/fm.v19i9.5381.

Tapscott, D., Williams, A.D. (2008). Wikinomics: How Mass Collaboration Changes Everything. New York: Penguin Group (USA).

Van Noorden, R. (2014). Online collaboration: Scientists and the social network. Nature, 512(7513): 126-129.