Four types of research in the humanities: Setting the stage for research quality criteria in the humanities


Michael Ochsner, Sven E. Hug, Hans-Dieter Daniel, Four types of research in the humanities: Setting the stage for research quality criteria in the humanities, Research Evaluation, Volume 22, Issue 2, June 2013, Pages 79–92, https://doi.org/10.1093/reseval/rvs039


This study presents humanities scholars’ conceptions of research and subjective notions of quality in the three disciplines German literature studies, English literature studies, and art history, captured using 21 Repertory Grid interviews. We identified three dimensions that structure the scholars’ conceptions of research: quality, time, and success. Further, the results revealed four types of research in the humanities: positively connoted ‘traditional’ research (characterized as individual, discipline-oriented, and ground-breaking research), positively connoted ‘modern’ research (cooperative, interdisciplinary, and socially relevant), negatively connoted ‘traditional’ research (isolated, reproductive, and conservative), and negatively connoted ‘modern’ research (career oriented, epigonal, calculated). In addition, 15 quality criteria for research in the three disciplines German literature studies, English literature studies, and art history were derived from the Repertory Grid interviews.


Ethics in the Humanities

  • Living reference work entry
  • First Online: 01 January 2015


  • Cheryl K. Stenmark
  • Nicolette A. Winn


Ethical behavior is critical in both academic and professional life. Because most professionals and academics work collaboratively with other people, it is important for them to behave ethically in order to develop quality collaborative relationships built on mutual trust. Because of the importance of ethical behavior in academic and professional settings, research and training programs aimed at improving ethical behavior and the cognitive processes underlying it are becoming increasingly widespread (National Institutes of Health 2002; Steneck 2002).

These research and training efforts have largely focused on professionals in the sciences and business. Ethical behavior, however, is important in any endeavor that involves multiple people working together. The Humanities have largely been ignored in explorations of ethical issues, particularly with regard to research ethics. This chapter argues that extending knowledge of ethical issues into the Humanities domain is important in order to identify the ethical problems faced by individuals in the Humanities, so that research and training tailored to these types of situations can help these individuals deal with such problems.


References

AAUP. (2009). Professional ethics. http://www.aaup.org/issues/professional-ethics. Retrieved 28 August 2014.

AHA. (2011). Statement on standards of professional conduct. http://www.historians.org/jobs-and-professional-development/statements-and-standards-of-the-profession/statement-on-standards-of-professional-conduct. Retrieved 28 August 2014.

AIA. (2012). 2010 code of ethics and professional conduct. http://www.aia.org/aiaucmp/groups/aia/documents/pdf/aiap074122.pdf. Retrieved 28 August 2014.

ASMP. (1993). Photographer’s code of conduct. http://ethics.iit.edu/ecodes/node/3666. Retrieved 28 August 2014.

CAA. (2011). Standard and guidelines professional practices for artists. http://www.collegeart.org/guidelines/practice. Retrieved 28 August 2014.

De Vries, R., Anderson, M. S., & Martinson, B. C. (2006). Normal misbehavior: Scientists talk about the ethics of research. Journal of Empirical Research on Human Research Ethics, 1, 43–50.

Helton-Fauth, W., Gaddis, B., Scott, G., Mumford, M., Devenport, L., Connelly, S., & Brown, R. (2003). A new approach to assessing ethical conduct in scientific work. Accountability in Research, 10, 205–228.

Kligyte, V., Marcy, R. T., Sevier, S. T., Godfrey, E. S., & Mumford, M. D. (2008). A qualitative approach to responsible conduct of research (RCR) training development: Identification of metacognitive strategies. Science and Engineering Ethics, 14, 3–31.

Kuta, S. (2014). Philosophers call for profession-wide code of conduct. http://www.dailycamera.com/cu-news/ci_25331862/philosophers-call-profession-wide-code-conduct. Retrieved 28 August 2014.

Lewis, M. (2002). Doris Kearns Goodwin and the credibility gap. Forbes. http://www.forbes.com. Retrieved 28 August 2014.

MLA. (n.d.). Statement of professional ethics. http://www.mla.org/repview_profethics. Retrieved 28 August 2014.

Mumford, M. D., Connelly, M. S., Brown, R. P., Murphy, S. T., Hill, J. H., Antes, A. L., Waples, E. P., & Devenport, L. D. (2008). A sensemaking approach to ethics training for scientists: Preliminary evidence of training effectiveness. Ethics and Behavior, 18(4), 315–399.

Mumford, M. D., Antes, A. L., Beeler, C., & Caughron, J. (2009). On the corruptions of scientists: The influence of field, environment, and personality. In R. J. Burke & C. L. Cooper (Eds.), Research companion to corruption in organization (pp. 145–170). Cheltenham: Edward Elgar.

National Institutes of Health. (2002). Summary of the FY2010 President’s budget. http://officeofbudget.od.nih.gov/UI/2010/Summary%20of%20FY%202010%20President%27s%20Budget.pdf. Retrieved 3 June 2009.

New York Times. (2004). Ethical journalism: A handbook of values and practices for the news and editorial departments. http://www.nytco.com/wp-content/uploads/NYT_Ethical_Journalism_0904-1.pdf. Retrieved 28 August 2014.

Ohiri, I. C. (2012). Promoting theatre business through good contacts and theatre business ethics. Insights to a Changing World Journal, 2, 43–54.

Resnik, D. B. (2003). From Baltimore to Bell Labs: Reflections on two decades of debate about scientific misconduct. Accountability in Research, 10, 123–135.

Sadri, H. (2012). Professional ethics in architecture and responsibilities of architects toward humanity. Turkish Journal of Business Ethics, 5(9), 86.

Sekerka, L. E. (2009). Organizational ethics education and training: A review of best practices and their application. International Journal of Training and Development, 13(2), 77–95.

Stanford Humanities Center. (2015). http://shc.stanford.edu/what-are-the-humanities. Retrieved 28 August 2014.

Steneck, N. H. (2002). ORI introduction to the responsible conduct of research. Washington, DC: U.S. Government Printing Office.

Stenmark, C. K., Antes, A. L., Martin, L. E., Bagdasarov, Z., Johnson, J. F., Devenport, L. D., & Mumford, M. D. (2010). Ethics in the Humanities: Findings from focus groups. Journal of Academic Ethics, 8, 285–300.

Thielke, J. (2009). A 1945 code of ethics for theatre workers emerges. http://lastagetimes.com/2009/08/a-1945-code-of-ethics-for-theatre-workers-surfaces/. Retrieved 28 August 2014.

Waples, E. P., Antes, A. L., Murphy, S. T., Connelly, S., & Mumford, M. D. (2009). A meta-analytic investigation of business ethics instruction. Journal of Business Ethics, 87(1), 133–151.


Author information

Authors and affiliations.

Department of Psychology, Sociology, and Social Work, Angelo State University, 2601 W Avenue N, San Angelo, TX 76903, USA

Cheryl K. Stenmark & Nicolette A. Winn


Corresponding author

Correspondence to Cheryl K. Stenmark .

Editor information

Editors and affiliations.

University of South Australia, Adelaide, South Australia, Australia

Tracey Ann Bretag


Copyright information

© 2015 Springer Science+Business Media Singapore

About this entry

Cite this entry.

Stenmark, C. K., Winn, N. A. (2015). Ethics in the Humanities. In: Bretag, T. (Ed.), Handbook of Academic Integrity. Springer, Singapore. https://doi.org/10.1007/978-981-287-079-7_43-1

DOI: https://doi.org/10.1007/978-981-287-079-7_43-1

Received: 08 October 2014

Accepted: 22 April 2015

Published: 24 June 2015

Publisher Name: Springer, Singapore

Online ISBN: 978-981-287-079-7

eBook Packages: Springer Reference, Education, Reference Module Humanities and Social Sciences, Reference Module Education

What is Humanities Research?

UW English majors at the 2013 Rutgers English Diversity Institute

Research in the humanities is frequently misunderstood. When we think of research, what immediately comes to mind for many of us is a laboratory setting, with white-coated scientists hunched over microscopes. Because research in the humanities is often a rather solitary activity, it can be difficult for newcomers to gain a sense of what research looks like within the scope of English Studies. (For examples, see Student Research Profiles.)

A common misconception about research is reinforced when we view it solely in terms of the discovery of things previously unknown (such as a new species or an archaeological artifact) rather than as a process that includes the reinterpretation or rediscovery of known artifacts (such as texts and other cultural products) from a critical or creative perspective to generate innovative art or new analyses. Fundamental to the concept of research is precisely this creation of something new. In the humanities, this might consist of literary authorship, which creates new knowledge in the form of art, or scholarly research, which adds new knowledge by examining texts and other cultural artifacts in the pursuit of particular lines of scholarly inquiry.

Research is often narrowly construed as an activity that will eventually result in a tangible product aimed at solving a world or social problem. Instead, research has many aims and outcomes and is a discipline-specific process, based upon the methods, conventions, and critical frameworks inherent in particular academic areas. In the humanities, the products of research are predominantly intellectual and intangible, with the results contributing to an academic discipline and also informing other disciplines, a process which often effects individual or social change over time.

The University of Washington Undergraduate Research Program provides this basic definition of research:

"Very generally speaking, most research is characterized by the evidence-based exploration of a question or hypothesis that is important to those in the discipline in which the work is being done. Students, then, must know something about the research methodology of a discipline (what constitutes "evidence" and how do you obtain it) and how to decide if a question or line of inquiry that is interesting to that student is also important to the discipline, to be able to embark on a research project."

While individual research remains the most prevalent form in the humanities, collaborative and cross-disciplinary research does occur. One example is the "Modern Girl Around the World" project, in which a group of six primary UW researchers from various humanities and social sciences disciplines explored the international emergence of the figure of the Modern Girl in the early 20th century. Examples of other research clusters are "The Race/Knowledge Project: Anti-Racist Praxis in the Global University," "The Asian American Studies Research Cluster," "The Queer + Public + Performance Project," and "The Moving Images Research Group," to name a few.

English Studies comprises, or contains elements of, many subdisciplines. A few examples of areas in which our faculty and students engage are Textual Studies, Digital Humanities, American Studies, Language and Rhetoric, Cultural Studies, Critical Theory, and Medieval Studies. Each UW English professor engages in research in one or more specialty areas. You can read about English faculty specializations, research, and publications in the English Department Profiles to gain a sense of the breadth of current work being performed by Department researchers.

Undergraduates embarking on an independent research project work under the mentorship of one or more faculty members. Quite often this occurs when an advanced student completes an upper-division class and becomes fascinated by a particular, more specific line of inquiry, leading to additional investigation beyond the classroom. This also occurs when students complete the English Honors Program, which culminates in a guided research-based thesis. For a faculty member to agree to mentor a student, the project proposal must introduce specific approaches and lines of inquiry, and must be sufficiently well defined and original to contribute to the discipline. If a faculty member in English has agreed to support your project proposal and serve as your mentor, credit is available through ENGL 499.

Beyond English Department resources, another source of information is the UW Undergraduate Research Program, which sponsors the annual Undergraduate Research Symposium. They also offer a one-credit course called Research Exposed (GEN ST 391), in which a variety of faculty speakers discuss their research and provide information about research methods. Another great campus resource is the Simpson Center for the Humanities, which supports interdisciplinary study. A number of our students have also been awarded Mary Gates Research Scholarships.

Each year, undergraduate English majors participate in the UW's Annual Undergraduate Research Symposium as well as other symposia around the nation. Here are some research abstracts from the symposia proceedings archive by recent English-major participants.

UW English Majors Recently Presenting at the UW's Annual Undergraduate Research Symposium

For additional examples, see Student Profiles and Past Honors Students' Thesis Projects.


Rethinking Research Ethics in the Humanities: Principles and Recommendations


2023, Rethinking Research Ethics in the Humanities: Principles and Recommendations

This AHRC-funded report is designed to stimulate reflection and discussion about ethical issues that could arise in qualitative, Humanities-based research designs that might be considered ‘risky’. The report can be used at project meetings; by University Research Ethics Committees (URECs), College Research Ethics Committees, and other governance bodies; and in discussions with project stakeholders. It is also designed to help postgraduate and early-career researchers and PhD supervisors navigate key issues pertaining to risky qualitative research, and to provide additional readings and precedents for developing applications for ethical review.

The report is organised thematically and proposes a series of principles for reforming ethical review in this space, as well as recommendations for URECs, governance bodies, and funders. The themes arising may not be applicable to all qualitative research designs, and the specific methods and context of the research will need to be reflected upon when using this report. Different types of methodologies, participants, stakeholders, and local contexts will require different ethical-approval processes that use disparate forms and procedures. The reflection that this report intends to stimulate should be promoted by and among all those involved in the design and conduct of the research, including, wherever possible, participants and their communities.

How to cite this report: Kasstan, Jonathan R., Pearson, Geoff & Victoria Brooks (2023): Rethinking Research Ethics in the Humanities: Principles and Recommendations. doi.org/10.34737/w36yq.


WashU Libraries

Conducting Research


Researching in the Humanities

Recommended Research Guides

  • Research in the Social Sciences
  • Research in the Sciences

Subject Librarians

The staff at the Reference Help Desk can also help you find appropriate sources for background information. You are also welcome to contact a Subject Librarian for assistance:

Subject Librarians, alphabetical listing

Subject Librarians, subject listing

The fields in the Humanities discipline generally include the visual and performing arts, philosophy, literature, religion, history, languages, art history, and classics.  Although research methods differ among the Humanities, the Social Sciences, and the Sciences, any research project in any discipline starts with curiosity and a hypothesis.  Often research topics are interdisciplinary and may include multiple subject areas and methods from more than just one discipline.

When beginning a research project in the humanities, you must develop a deep knowledge base in a subject area, choose original sources to examine, locate and evaluate sources that also explore your areas of interest, and then come to your own original conclusions. Libraries can help you find the material you need to get started. The research guides listed below are created by WUSTL Subject Librarians, who have expertise in both searching techniques and academic fields. Their online guides suggest resources for the different phases of the research process:

  • Use background sources to establish your knowledge base. These could be subject encyclopedias, key works in a field, bibliographies, etc.
  • Select original sources, commonly referred to as primary sources, for your analysis. Primary sources are simply original works, e.g., novels, photographs, diaries, correspondence, advertisements, eyewitness accounts.
  • Find articles, reviews, and books that analyze primary sources. These are known as secondary sources.

Then, synthesize all this information with your own thinking and draw your original conclusions, thus creating new research in the field.

  • A Guide to African and African-American Studies by Rudolph Clay
  • A Guide to American History by Michael Schaefer
  • A Guide to Comparative Literature by Walter Schlect
  • A Guide to East Asian Studies by Joan Wang
  • A Guide to English and American Literature by Kris Helbling
  • A Guide to Germanic Studies by Walter Schlect
  • A Guide to Islamic and Middle Eastern Studies by AJ Robinson
  • A Guide to Philosophy Resources by Michael Schaefer
  • A Guide to Philosophy-Neuroscience-Psychology (PNP) Resources by Sam Lindgren
  • A Guide to Religious Studies by Michael Schaefer
  • Romance Languages & Literatures Subject Guide by Lino Mioni
  • A Guide to Women, Gender & Sexuality Studies by AJ Robinson
  • A Guide to World History by Michael Schaefer

Humanities Research – Types, Methods and Examples

Definition:

Humanities research is a systematic and critical investigation of human culture, values, beliefs, and practices, including the study of literature, philosophy, history, art, languages, religion, and other aspects of human experience.

Types of Humanities Research

Types of Humanities Research are as follows:

Historical Research

This type of research involves studying the past to understand how societies and cultures have evolved over time. Historical research may involve examining primary sources such as documents, artifacts, and other cultural products, as well as secondary sources such as scholarly articles and books.

Cultural Studies

This type of research involves examining the cultural expressions and practices of a particular society or community. Cultural studies may involve analyzing literature, art, music, film, and other forms of cultural production to understand their social and cultural significance.

Linguistics Research

This type of research involves studying language and its role in shaping cultural and social practices. Linguistics research may involve analyzing the structure and use of language, as well as its historical development and cultural variations.

Anthropological Research

This type of research involves studying human cultures and societies from a comparative and cross-cultural perspective. Anthropological research may involve ethnographic fieldwork, participant observation, interviews, and other qualitative research methods.

Philosophy Research

This type of research involves examining fundamental questions about the nature of reality, knowledge, morality, and other philosophical concepts. Philosophy research may involve analyzing philosophical texts, conducting thought experiments, and engaging in philosophical discourse.

Art History Research

This type of research involves studying the history and significance of art and visual culture. Art history research may involve analyzing the formal and aesthetic qualities of art, as well as its historical context and cultural significance.

Literary Studies Research

This type of research involves analyzing literature and other forms of written expression. Literary studies research may involve examining the formal and structural qualities of literature, as well as its historical and cultural context.

Digital Humanities Research

This type of research involves using digital technologies to study and analyze cultural artifacts and practices. Digital humanities research may involve analyzing large datasets, creating digital archives, and using computational methods to study cultural phenomena.
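
Computational methods of the kind described above are often as simple as counting features of a text at scale. The following sketch illustrates one of the most basic digital-humanities operations, a word-frequency tally; the sample passage (the opening of Dickens's A Tale of Two Cities) is used purely for illustration.

```python
# A toy illustration of computational text analysis in the digital
# humanities: tokenizing a passage and counting word frequencies.
from collections import Counter
import re

text = """It was the best of times, it was the worst of times,
it was the age of wisdom, it was the age of foolishness"""

# Tokenize: lowercase the text and keep runs of alphabetic characters
words = re.findall(r"[a-z]+", text.lower())

# Tally frequencies and inspect the most common words
freq = Counter(words)
print(freq.most_common(4))
```

At a larger scale, the same tokenize-and-count pattern underlies corpus-wide analyses of vocabulary, authorship, and theme across digitized archives.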

Data Collection Methods

Data Collection Methods in Humanities Research are as follows:

  • Interviews : This method involves conducting face-to-face, phone or virtual interviews with individuals who are knowledgeable about the research topic. Interviews may be structured, semi-structured, or unstructured, depending on the research questions and objectives. Interviews are often used in qualitative research to gain in-depth insights and perspectives.
  • Surveys : This method involves distributing questionnaires or surveys to a sample of individuals or groups. Surveys may be conducted in person, through the mail, or online. Surveys are often used in quantitative research to collect data on attitudes, behaviors, and other characteristics of a population.
  • Observations : This method involves observing and recording behavior or events in a natural or controlled setting. Observations may be structured or unstructured, and may involve the use of audio or video recording equipment. Observations are often used in qualitative research to collect data on social practices and behaviors.
  • Archival Research: This method involves collecting data from historical documents, artifacts, and other cultural products. Archival research may involve accessing physical archives or online databases. Archival research is often used in historical and cultural studies to study the past.
  • Case Studies : This method involves examining a single case or a small number of cases in depth. Case studies may involve collecting data through interviews, observations, and archival research. Case studies are often used in cultural studies, anthropology, and sociology to understand specific social or cultural phenomena.
  • Focus Groups : This method involves bringing together a small group of individuals to discuss a particular topic or issue. Focus groups may be conducted in person or online, and are often used in qualitative research to gain insights into social and cultural practices and attitudes.
  • Participatory Action Research : This method involves engaging with individuals or communities in the research process, with the goal of promoting social change or addressing a specific social problem. Participatory action research may involve conducting focus groups, interviews, or surveys, as well as involving participants in data analysis and interpretation.
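
As an illustration of what collected data can look like, the sketch below tallies answers to a single hypothetical Likert-style survey question; the responses and the question are invented for illustration, and a real study would read them from a file or survey tool.

```python
# Tabulating invented survey responses to one hypothetical question,
# e.g. "How often do you visit a museum?"
from collections import Counter

responses = ["never", "monthly", "weekly", "monthly", "never", "monthly"]

tally = Counter(responses)
total = len(responses)

# Report each answer's count and share of all responses.
for answer, count in tally.most_common():
    share = count / total
    print(f"{answer}: {count} ({share:.0%})")
```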

Data Analysis Methods

Some common data analysis methods used in humanities research are as follows:

  • Content Analysis : This method involves analyzing the content of texts or cultural artifacts to identify patterns, themes, and meanings. Content analysis is often used in literary studies, media studies, and cultural studies to analyze the meanings and representations conveyed in cultural products.
  • Discourse Analysis: This method involves analyzing the use of language and discourse to understand social and cultural practices and identities. Discourse analysis may involve analyzing the structure, meaning, and power dynamics of language and discourse in different social contexts.
  • Narrative Analysis: This method involves analyzing the structure, content, and meaning of narratives in different cultural contexts. Narrative analysis may involve analyzing the themes, symbols, and narrative devices used in literary texts or other cultural products.
  • Ethnographic Analysis : This method involves analyzing ethnographic data collected through participant observation, interviews, and other qualitative methods. Ethnographic analysis may involve identifying patterns and themes in the data, as well as interpreting the meaning and significance of social and cultural practices.
  • Statistical Analysis: This method involves using statistical methods to analyze quantitative data collected through surveys or other quantitative methods. Statistical analysis may involve using descriptive statistics to describe the characteristics of the data, or inferential statistics to test hypotheses and make inferences about a population.
  • Network Analysis: This method involves analyzing the structure and dynamics of social networks to understand social and cultural practices and relationships. Network analysis may involve analyzing patterns of social interaction, communication, and influence.
  • Visual Analysis : This method involves analyzing visual data, such as images, photographs, and art, to understand their cultural and social significance. Visual analysis may involve analyzing the formal and aesthetic qualities of visual products, as well as their historical and cultural context.
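
A simple way to see how keyword-based content analysis can work in practice is the following sketch. The themes, keywords, and passages are invented for illustration; real codebooks are far richer and are developed iteratively by the researcher.

```python
# Keyword-based content analysis: count theme-related terms per passage.
themes = {
    "nature": {"river", "forest", "mountain"},
    "memory": {"childhood", "remember", "past"},
}

passages = [
    "She would remember the river from her childhood.",
    "The forest stretched toward the mountain in the past light.",
]

def code_passage(text, themes):
    """Return how many keywords of each theme appear in the passage."""
    words = set(text.lower().replace(".", "").split())
    return {theme: len(words & keywords) for theme, keywords in themes.items()}

for passage in passages:
    print(code_passage(passage, themes))
```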

Examples of Humanities Research

Some examples of humanities research are as follows:

  • Literary research on diversity and representation: Scholars of literature are exploring the representation of different groups in literature and how those representations have changed over time. They are also studying how literature can promote empathy and understanding across different cultures and communities.
  • Philosophical research on ethics and technology: Philosophers are examining the ethical implications of emerging technologies, such as artificial intelligence and biotechnology. They are asking questions about what it means to be human in a world where technology is becoming increasingly advanced.
  • Anthropological research on cultural identity: Anthropologists are studying the ways in which culture shapes individual and collective identities. They are exploring how cultural practices and beliefs can shape social and political systems, as well as how individuals and communities resist or adapt to dominant cultural norms.
  • Linguistic research on language and communication: Linguists are studying the ways in which language use and communication can impact social and political power dynamics. They are exploring how language can reinforce or challenge social hierarchies and how language use can reflect cultural values and norms.

How to Conduct Humanities Research

Conducting humanities research involves a number of steps, including:

  • Define your research question or topic : Identify a question or topic that you want to explore in-depth. This can be a broad or narrow topic, depending on the scope of your research project.
  • Conduct a literature review: Before beginning your research, read extensively on your topic. This will help you understand the existing scholarship and identify gaps in the literature that your research can address.
  • Develop a research methodology: Determine the methods you will use to collect and analyze data, such as interviews, surveys, archival research, or textual analysis. Your methodology should be appropriate to your research question and topic.
  • Collect data: Collect data using the methods you have chosen. This may involve conducting interviews, surveys, or archival research, or analyzing primary or secondary sources.
  • Analyze data: Once you have collected data, analyze it using appropriate methods. This may involve coding, categorizing, or comparing data, or interpreting texts or other sources.
  • Draw conclusions: Based on your analysis, draw conclusions about your research question or topic. These conclusions should be supported by your data and should contribute to existing scholarship.
  • Communicate your findings : Communicate your findings through writing, presentations, or other forms of dissemination. Your work should be clearly written and accessible to a broad audience.
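
The coding and categorizing mentioned in the analysis step can be sketched as follows; the codes and interview excerpts are invented for illustration.

```python
# Grouping coded interview excerpts by category so they can be compared.
from collections import defaultdict

# Each pair is (assigned code, excerpt from an interview transcript).
coded_excerpts = [
    ("identity", "I grew up speaking two languages at home."),
    ("community", "The festival brings the whole town together."),
    ("identity", "Moving abroad changed how I saw myself."),
]

by_code = defaultdict(list)
for code, excerpt in coded_excerpts:
    by_code[code].append(excerpt)

# Summarize how much material supports each code.
for code, excerpts in sorted(by_code.items()):
    print(f"{code} ({len(excerpts)} excerpts)")
```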

Applications of Humanities Research

Humanities research has many practical applications in various fields, including:

  • Policy-making: Humanities research can inform policy-making by providing insights into social, cultural, and historical contexts. It can help policymakers understand the impact of policies on communities and identify potential unintended consequences.
  • Education: Humanities research can inform curriculum development and pedagogy. It can provide insights into how to teach critical thinking, cross-cultural understanding, and communication skills.
  • Cultural heritage preservation: Humanities research can help to preserve cultural heritage by documenting and analyzing cultural practices, traditions, and artifacts. It can also help to promote cultural tourism and support local economies.
  • Business and industry: Humanities research can provide insights into consumer behavior, cultural preferences, and historical trends that can inform marketing, branding, and product design.
  • Healthcare : Humanities research can contribute to the development of patient-centered healthcare by exploring the impact of social and cultural factors on health and illness. It can also help to promote cross-cultural understanding and empathy in healthcare settings.
  • Social justice: Humanities research can contribute to social justice by providing insights into the experiences of marginalized communities, documenting historical injustices, and promoting cross-cultural understanding.

Purpose of Humanities Research

The purpose of humanities research is to deepen our understanding of human experience, culture, and history. Humanities research aims to explore the human condition and to provide insights into the diversity of human perspectives, values, and beliefs.

Humanities research can contribute to knowledge in various fields, including history, literature, philosophy, anthropology, and more. It can help us to understand how societies and cultures have evolved over time, how they have been shaped by various factors, and how they continue to change.

Humanities research also aims to promote critical thinking and creativity. It encourages us to question assumptions, to challenge dominant narratives, and to seek out new perspectives. Humanities research can help us to develop empathy and understanding for different cultures and communities, and to appreciate the richness and complexity of human experience.

Overall, the purpose of humanities research is to contribute to a deeper understanding of ourselves, our communities, and our world. It helps us to grapple with fundamental questions about the human experience and to develop the skills and insights needed to address the challenges of the future.

When to use Humanities Research

Humanities research can be used in various contexts where a deeper understanding of human experience, culture, and history is required. Here are some examples of when humanities research may be appropriate:

  • Exploring social and cultural phenomena: Humanities research can be used to explore social and cultural phenomena such as art, literature, religion, and politics. It can help to understand how these phenomena have evolved over time and how they relate to broader social, cultural, and historical contexts.
  • Understanding historical events: Humanities research can be used to understand historical events such as wars, revolutions, and social movements. It can provide insights into the motivations, experiences, and perspectives of the people involved, and help to contextualize these events within broader historical trends.
  • Promoting cultural understanding : Humanities research can be used to promote cross-cultural understanding and to challenge stereotypes and biases. It can provide insights into the diversity of human experiences, values, and beliefs, and help to build empathy and mutual respect across different cultures and communities.
  • Informing policy-making: Humanities research can be used to inform policy-making by providing insights into social, cultural, and historical contexts. It can help policymakers understand the impact of policies on communities and identify potential unintended consequences.
  • Promoting innovation and creativity : Humanities research can be used to promote innovation and creativity in various fields. It can help to generate new ideas, perspectives, and approaches to complex problems, and to challenge conventional thinking and assumptions.

Characteristics of Humanities Research

Some of the key characteristics of humanities research:

  • Focus on human experience: Humanities research focuses on the study of human experience, culture, and history. It aims to understand the human condition, explore human values and beliefs, and analyze the ways in which societies and cultures have evolved over time.
  • Interpretive approach: Humanities research takes an interpretive approach to data analysis. It seeks to understand the meaning behind texts, artifacts, and cultural practices, and to explore the multiple perspectives and contexts that shape human experience.
  • Contextualization : Humanities research emphasizes the importance of contextualization. It seeks to understand how social, cultural, and historical factors shape human experience, and to place individual phenomena within broader cultural and historical contexts.
  • Subjectivity : Humanities research recognizes the subjective nature of human experience. It acknowledges that human values, beliefs, and experiences are shaped by individual perspectives, and that these perspectives can vary across cultures, communities, and time periods.
  • Narrative analysis : Humanities research often uses narrative analysis to explore the stories, myths, and cultural narratives that shape human experience. It seeks to understand how these narratives are constructed, how they evolve over time, and how they influence individual and collective identity.
  • Multi-disciplinary: Humanities research is often interdisciplinary, drawing on a range of disciplines such as history, literature, philosophy, anthropology, and more. It seeks to bring together different perspectives and approaches to understand complex human phenomena.

Advantages of Humanities Research

Some of the key advantages of humanities research:

  • Promotes critical thinking: Humanities research encourages critical thinking by challenging assumptions and exploring different perspectives. It requires researchers to analyze and interpret complex texts, artifacts, and cultural practices, and to make connections between different phenomena.
  • Enhances cultural understanding : Humanities research promotes cross-cultural understanding by exploring the diversity of human experiences, values, and beliefs. It helps to challenge stereotypes and biases and to build empathy and mutual respect across different cultures and communities.
  • Builds historical awareness: Humanities research helps us to understand the historical context of current events and social issues. It provides insights into how societies and cultures have evolved over time and how they have been shaped by various factors, and helps us to contextualize current social, political, and cultural trends.
  • Contributes to public discourse: Humanities research contributes to public discourse by providing insights into complex social, cultural, and historical phenomena. It helps to inform public policy and public debate by providing evidence-based analysis and insights into social issues and problems.
  • Promotes creativity and innovation: Humanities research promotes creativity and innovation by challenging conventional thinking and assumptions. It encourages researchers to generate new ideas and perspectives and to explore alternative ways of understanding and addressing complex problems.
  • Builds communication skills: Humanities research requires strong communication skills, including the ability to analyze and interpret complex texts, artifacts, and cultural practices, and to communicate findings and insights in a clear and compelling way.

Limitations of Humanities Research

Some of the key limitations of humanities research:

  • Subjectivity: Humanities research relies heavily on interpretation and analysis, which are inherently subjective. Researchers bring their own perspectives, biases, and values to the analysis, which can affect the conclusions they draw.
  • Lack of generalizability: Humanities research often focuses on specific texts, artifacts, or cultural practices, which can limit the generalizability of findings to other contexts. Drawing broad conclusions from such limited samples is difficult.
  • Limited quantitative data : Humanities research often relies on qualitative data, such as texts, images, and cultural practices, which can be difficult to quantify. This can make it difficult to conduct statistical analyses or to draw quantitative conclusions.
  • Limited replicability: Humanities research often involves in-depth analysis of specific texts, artifacts, or cultural practices, which can make it difficult to replicate studies. This can make it challenging to test the validity of findings or to compare results across studies.
  • Limited funding: Humanities research may not always receive the same level of funding as other types of research. This can make it challenging for researchers to conduct large-scale studies or to have access to the same resources as other researchers in different fields.
  • Limited impact : Humanities research may not always have the same level of impact as research in other fields, particularly in terms of policy and practical applications. This can make it challenging for researchers to demonstrate the relevance and impact of their work.

About the author


Muhammad Hassan

Researcher, Academic Writer, Web developer


  • Published: 21 March 2017

The future of research assessment in the humanities: bottom-up assessment procedures

  • Michael Ochsner,
  • Sven Hug &
  • Ioana Galleron

Palgrave Communications, volume 3, Article number: 17020 (2017)


  • Subjects: Science, technology and society

Research assessment in the social sciences and humanities (SSH) is delicate. Assessment procedures meet strong criticism from SSH scholars, and bibliometric research shows that the methods that are usually applied are ill-adapted to SSH research. While research on assessment in the SSH disciplines until recently focused on the deficiencies of current assessment methods, we present some European initiatives that take a bottom-up approach. They focus on research practices in the SSH and reflect on how to assess SSH research with approaches of its own, instead of applying and adjusting the methods developed for and in the natural and life sciences. This is an important development because we can learn from previous evaluation exercises that whenever scholars felt that assessment procedures were imposed in a top-down manner without proper adjustments to SSH research, the result was boycotts or resistance. Applying adequate evaluation methods not only helps foster a better valorization of SSH research within the research community, among policymakers and colleagues from the natural sciences, but will also help society to better understand the SSH’s contributions to solving major societal challenges. Therefore, taking the time to encourage bottom-up evaluation initiatives should make it easier to confront the main challenges facing modern society. This article is published as part of a collection on the future of research assessment.


Introduction

While there is more than 100 years of scientific inquiry on research and dissemination practices in the natural and life sciences, until recently bibliometric and social studies on science and technology research neglected the SSH ( Hemlin, 1996 ). Therefore, there are methods for research assessment in the natural and life sciences that relate to the practices in these fields and are accepted by the community (even though there are more and more critical voices, see for example, Lawrence, 2002 ; Molinié and Bodenhausen, 2010 ), and whose measurement properties are tested by bibliometric research. Knowledge of research and dissemination practices in the SSH, by contrast, remains scarce, while research assessment did not stop at the gate of the SSH disciplines ( Guillory, 2005 ; Burrows, 2012 ). The growing pressure of accountability, prevailing government practices based on New Public Management and the availability of quantitative data led to the implementation of (quantitative) research assessments also in the SSH during recent decades ( Kekäle, 2002 ; Hammarfelt and de Rijcke, 2015 ; Hamann, 2016 ). The creation of the European Research Area (ERA) increased the importance of research evaluation: the initial communication “Towards a European Research Area” listed under the first theme of action the “mapping of European centres of excellence” and a “financing plan for centres of excellence on the basis of competition” ( Commission of the European Communities, 2000 ); 15 years later, the ERA Roadmap listed the following as the first among the Roadmap’s priorities: “Strengthening the evaluation of research and innovation policies and seeking complementarities between, and rationalization of, instruments at EU and national levels” ( European Research Area and Innovation Committee, 2015 : 5). The vast majority of research assessments, however, were implemented in a top-down manner by either governments or university administrators.
In addition, research assessment procedures usually apply bibliometric and scientometric methods developed for the natural and life sciences that do not reflect SSH research and disseminations practices. Bibliometric research shows that these methods cannot readily be used for the SSH ( Hicks, 2004 ; Lariviere et al., 2006 ; Nederhof, 2006 ). Therefore, research assessment procedures (and oftentimes research evaluation in general) meet strong opposition in the scholarly communities of the SSH.

In the last decade, a number of projects were initiated in Europe to explore research assessment procedures that adequately reflect SSH research practices. These projects did not arise from within the disciplines in the sense of self-regulation or discontent with the quality or standing of a discipline. Rather, they are a reaction to how research is assessed through procedures linked not to the functioning of the disciplines themselves but to top-down decisions on how research is to be evaluated. Moreover, with the ERA Roadmap in place, the discussion could no longer be whether research should be subject to systematic assessments but rather how to assess it. With a few exceptions, however, these bottom-up initiatives do not receive the attention from research evaluators and policymakers that they deserve.

In this article, we give an overview of selected European initiatives that genuinely reflect SSH research practices and were initiated or developed by scholars with an SSH background. Owing to space restrictions, we do not report how SSH research is assessed in unitary evaluation procedures, that is, exercises that apply the same basic procedure to all disciplines (to science, technology, engineering and mathematics (STEM) as well as to SSH disciplines) and allow only for small adaptations to SSH research practices (for example, use of bibliometrics or not, or types of eligible outputs). For this reason, we do not report how SSH research is evaluated in the RAE and REF in the United Kingdom or the RQF and ERA in Australia, as these are clearly top-down (see for example, Kwok, 2013 ), follow a unitary approach, and the SSH do not have a major impact on the design of the exercise. Furthermore, the RAE/REF and RQF/ERA procedures are well-documented in the literature. For the SSH in the RAE/REF, see for example Arts and Humanities Research Council (2006 , 2009) ; Butler and McAllister (2009) ; Hamann (2016) ; Johnston (2008) ; Norris and Oppenheim (2003) ; Oppenheim and Summers (2008) . For the RAE/REF in general, see for example Barker (2007) and Hicks (2012) . For SSH-related matters in the Australian RQF/ERA, see for example Butler (2008) , Butler and Visser (2006) , Council for the Humanities, Arts and Social Sciences (2009) , Genoni and Haddow (2009) , Kwok (2013) , Redden (2008) . Because there is a wealth of such SSH initiatives in Europe, we also restrict our review to European initiatives and do not report other initiatives such as the Australian ERA and the Humanities Indicators project in the United States ( www.humanitiesindicators.org ).

In what follows, we first present the issues of research assessment in the SSH, such as the methodological issues and the SSH scholars’ critique of the assessment procedures. We then move on to present several bottom-up initiatives taken up in (mainly continental) Europe by concerned SSH scholars. These initiatives set out at different levels and with different scope, from simply improving the situation of SSH data availability and accuracy to complex evaluation procedures involving a broad range of quality criteria and indicators. Some initiatives take place at a local level, others at a national level; and there are even European initiatives concerned with bottom-up research evaluation in the SSH. We conclude with some recommendations for future research evaluation in the humanities.

Research assessment in the SSH

To describe the current situation of research assessment in the SSH, we analyse it from two perspectives. First, we take the perspective of bibliometricians and scientometricians and focus on what they say regarding the adequacy of their methods for SSH research. Second, we analyse the critiques of SSH scholars regarding those methods, which gives us hints on how to design adequate methods for research assessment in the SSH.

Bibliometrics and scientometrics in SSH research assessments

The application of bibliometric methods to the SSH has proved problematic and yielded unsatisfying results, so that even bibliometricians caution against applying bibliometric methods to SSH disciplines (see for example, Nederhof et al., 1989 ; Glänzel, 1996 ; Lariviere et al., 2006 ). This is for several reasons, which we summarize under two main headings: coverage issues and methodological issues.

Coverage issues arise for several reasons. First, in the SSH, chapters in books and monographs are more frequently used as publication channels and get cited more often than journal articles ( Hicks, 2004 ; Nederhof, 2006 ). This leads to severe coverage issues in the most important databases for bibliometric analyses, which are mainly or exclusively based on scholarly journals ( van Leeuwen, 2013 ). Furthermore, even internationally oriented European journals are not covered well in the relevant databases compared with American journals ( Nederhof, 2006 ).

Second, some SSH disciplines are characterized by a more pronounced national and regional orientation ( Nederhof, 2006 ). Nederhof states in his review of bibliometric monitoring in the SSH: “Societies differ, and therefore results from humanities or social science studies obtained in one country may not always be very useful to researchers in other countries” ( Nederhof, 2006 : 83). Thus, even though the topics might be internationally relevant, this kind of output is less visible, as it is often written in national languages, seldom covered in bibliometric databases (see for example, Chi, 2012 ), or published in channels that are not covered at all (for example, reports and other publications directed at a national or regional readership).

Third, SSH scholars write not only for scholarly readers but also for the lay public ( Hicks, 2004 ). This type of literature is usually not taken into consideration in evaluations and is certainly not included in the databases used for bibliometric analyses. However, non-scholarly publications are an important part of SSH research and its societal impact.

Methodological issues arise, among other things, from the fact that citation behaviour differs in the SSH disciplines. The age of references is remarkably high. Glänzel, for example, noted in his 1996 analysis that a 3-year citation window is too short. Given the distribution of citations over time, almost a 10-year citation window would have to be applied, leading to an obsolete publication set for evaluation purposes ( Glänzel, 1996 ). Furthermore, the citation culture is different ( Hellqvist, 2010 ; Hammarfelt, 2012 ; Bunia, 2016 ). Hicks (2004) also notes that SSH journals are usually more transdisciplinary, which leads to methodological problems, for example with field normalization.
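
The citation-window problem can be illustrated numerically. The citation ages below are invented to mimic the slow citation accumulation described above; they are not empirical data.

```python
# Age (in years after publication) at which each citation arrived.
citation_ages = [1, 2, 2, 3, 4, 5, 6, 7, 8, 9]

def share_within_window(ages, window):
    """Fraction of all citations that fall inside the citation window."""
    return sum(1 for age in ages if age <= window) / len(ages)

print(share_within_window(citation_ages, 3))   # short window
print(share_within_window(citation_ages, 10))  # near-complete window
```

With citations spread out like this, a 3-year window captures only a minority of the citations that eventually arrive, which is the undercounting problem described above.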

While this is not a comprehensive analysis of methodological issues of quantitative assessments, it shows that there are several problems with the application of bibliometric indicators in research assessments in the humanities. Importantly, it makes evident that today’s bibliometric methods do not reflect SSH scholarship.

SSH scholars’ critique of quantitative research assessments

If research assessment procedures are to be accepted, and if the tools and methods are to help determine the quantity and quality of humanities research without significant delays, refusal or boycott by the scholarly community, the criticisms put forward by humanities scholars become an important issue. We have analysed SSH scholars’ critique of (quantitative) research assessments elsewhere and summarized it into four main reservations ( Hug and Ochsner, 2014 ). We will only briefly summarize our findings here, as relevant for the purpose of this article.

The first reservation relates to the section above: the methods were developed for, and reflect the research practices in, the natural and life sciences ( Vec, 2009 ). This means not only that the assessment practices do not account for SSH dissemination practices (monographs, diverse languages, local orientation, individual scholarship) as noted in the section above, but also that the assessment practices follow the natural sciences’ linear understanding of progress, while SSH scholars share the notion of the “coexistence of competing ideas” ( Lack, 2008 : 14), that is, an ever-increasing knowledge base. This conception of knowledge as diverse and not dying out is not reflected in most evaluation practices.

Second, SSH scholars have strong reservations about quantification. A joint letter by 24 international philosophers to the Australian government as a reaction to the journal ranking in the Excellence in Research for Australia (ERA) exercise points to this issue: “The problem is not that judgments of quality in research cannot currently be made, but rather that in disciplines like Philosophy, those standards cannot be given simple, mechanical, or quantitative expression” ( Academics Australia, 2008 ). Other scholars argue that research does not produce products or goods in a free market, in which value can be defined according to the products’ economic value or efficiency ( Plumpe, 2010 ; Palumbo and Pennisi, 2015 ). Thus, many SSH scholars fear that the intrinsic benefits of the arts and humanities will be neglected or even lost because of the focus on quantitative measures. The report for the Humanities and Social Sciences Federation of Canada says for example that “some efforts soar and others sink, but it is not the measurable success that matters, rather the effort” ( Fisher et al., 2000 , “The Value of a Liberal Education”, para. 18; see also the report for the RAND corporation McCarthy et al., 2004 ).

The third reservation is the fear of negative steering effects of indicators. SSH scholars anticipate many dysfunctional effects, such as mainstreaming or conservative effects of indicators, a loss of diversity of research topics or disciplines due to selection effects introduced by the use of indicators, or an emphasis on spectacular research findings that leads to unethical reporting of findings ( Fisher et al., 2000 ; Andersen et al., 2009 ; Hose, 2009 ; Burrows, 2012 ). More and more, such negative steering effects of indicators are also being observed in the natural sciences ( Butler, 2003 , 2007 ; Mojon-Azzi et al., 2003 ; Moonesinghe et al., 2007 ; Unreliable research. Trouble at the lab, 2013 ). Such findings support the fear of negative steering effects in the SSH.

Fourth, the SSH are characterized by a heterogeneity of research topics, methods and paradigms. Finding shared quality criteria or standards for research assessments becomes an intricate task when there is no consensus on research questions, on the suitability of the methods applied, or even on the definition of disciplines and sub-disciplines (Herbert and Kaube, 2008; van Gestel et al., 2012; Hornung et al., 2016). Where criteria can be found, they are usually informal, refer to a single (sub-)discipline and cannot easily be transferred to other sub-disciplines or evaluation situations (Herbert and Kaube, 2008).

Bottom-up procedures for research assessment in the humanities

Despite these critiques from bibliometricians and scientometricians on the one hand and SSH scholars on the other, more and more research assessments in the SSH are being implemented. Usually, the procedures for research assessments are implemented in a top-down manner that does not take the situation at the coal face of research into account. However, there are several initiatives that reflect the characteristics of SSH research. In the following, we focus on initiatives that come from within the SSH research communities or were at least developed by scholars from SSH disciplines, genuinely taking SSH research practices into account in their approaches (Footnote 2). All of them address at least one of the issues mentioned in the previous section. While these bottom-up initiatives are more likely to be accepted by SSH scholars, some of them still face strong opposition or are boycotted.

Improving the databases

Considering that typical SSH publications (for example, books, proceedings, publications in local languages) are poorly represented in current databases (Footnote 2), efforts have been made in several countries to improve coverage, especially in countries with a performance-based funding model, such as Spain, Norway, Denmark, Belgium (Flanders) and Finland (Giménez-Toledo et al., 2016). There was also an attempt to create a full-coverage bibliographic/bibliometric database for Europe, but it did not result in the implementation of a Europe-wide database or standard (Martin et al., 2010). In parallel, the ERIH project intended to create a European journal list for the SSH to overcome the under-representation of (European) SSH journals in the main bibliometric databases; however, the project faced strong opposition (Andersen et al., 2009), had to be remodelled (see Lauer, 2016) and was relaunched under the name ERIH PLUS (Footnote 4).

Attempts to create publication databases suitable for the humanities have sometimes also been organized at the level of disciplines. The EERQI project included such a database for the educational sciences at the European level; it also investigated methods for using the data in research evaluations in a meaningful way (Gogolin et al., 2014; Gogolin, 2016). The database allows scholars to search for publications using keywords in one language while retrieving results in all four languages covered by the database. Beyond evaluative purposes, therefore, centralized and systematic coverage of SSH production appears to be an endeavour with multiple potential benefits, such as improving information retrieval for scholars and widening access to publications in multiple languages.

In all cases, awareness is growing of the need to compile complete and interoperable databases of SSH scholarly and non-scholarly outputs, so as to gain accurate knowledge about productivity and publication behaviour in these very diverse disciplines. At the same time, the creation of such databases should go hand in hand with the development of standards regarding their use, including standards on how not to use them.

An SSH approach towards bibliometrics and scientometrics

Bibliometric analyses face many problems when applied to SSH disciplines (Nederhof et al., 1989; Archambault et al., 2006; Nederhof, 2006; van Leeuwen, 2013). However, Hammarfelt (2016: 115) observes a shift from investigating coverage issues towards studying the characteristics of SSH publication practices and developing bibliometric approaches sensitive to the organization of SSH research fields. This includes, but is not limited to, extending bibliometric analyses to non-source items (Butler and Visser, 2006; Chi, 2014) or the relatively new Book Citation Index (Gorraiz et al., 2013); using other databases such as Google Scholar (Kousha and Thelwall, 2009) or data from social media services, the so-called altmetrics (Holmberg and Thelwall, 2014; Mohammadi and Thelwall, 2014; Zuccala et al., 2015; Zuccala and Cornacchia, 2016); analysing inclusion in library catalogues (White et al., 2009); exploring national databases with full coverage (Giménez-Toledo et al., 2016); extending data to references in research grant proposals (Hammarfelt, 2013) or to book reviews (Zuccala and van Leeuwen, 2011; Zuccala et al., 2015); and exploring collaboration (Ossenblok and Engels, 2015) and publication patterns (Chi, 2012; Ossenblok et al., 2012; Verleysen and Weeren, 2016). From a more pragmatic point of view, attempts are being made to “weight” the various outputs in the SSH, such as journals or books, in a manner similar to the journal impact factor commonly used in the sciences (Giménez-Toledo, 2016).

While most of this research is done by bibliometricians and scientometricians, a growing number of SSH scholars, while still pursuing their SSH careers, are investigating research practices in their own disciplines, such as citation practices (Drabek et al., 2015; Bunia, 2016), the influence of databases (Lauer, 2016), the relation of bibliometric indicators to research practices (Gogolin, 2016) or career building and dissemination (Williams and Galleron, 2016). More methodological analyses are also being conducted by SSH scholars, such as investigations of the inter-rater reliability of research assessment procedures (Riordan et al., 2011; Plag, 2016) or of the correlation between bibliometric and expert-based procedures (Ferrara and Bonaccorsi, 2016). While Hammarfelt calls for building a “bibliometrics for the humanities” (Hammarfelt, 2016: 115), Zuccala (2016: 149) goes further and demands that bibliometricians find ways to teach bibliometrics to humanities students so that a “new breed of humanistic bibliometrician can emerge successfully”.

Bunia (2016), a German literature scholar, argues that the limited applicability of citation analyses might, beyond coverage and technical issues, also be intrinsic to the field of literary studies: literature scholars seem not to read the work of their colleagues in the same field, or at least they do not use or cite it in their own publications. He advocates using bibliometric analyses to study the citation behaviour of literary scholars, since this is also important knowledge for the scholarly community in the field. The use of bibliometric methods in research assessment will not be possible until light is shed on this issue.

To summarize the situation of bibliometrics and scientometrics in the SSH: bibliometric methods cannot be readily used for research assessment in the SSH. But bibliometrics adapted to the SSH can help to study research practices, publication and citation practices, and other practices important for knowledge production in the SSH. A thorough look at citation habits can also broach some delicate issues in research practice. Applied with some care, some quantitative indicators can also be used to complement peer review if they are defined bottom-up, that is, from within the disciplines.

Funding SSH research grants

Third-party funding is becoming more and more important because, first, a growing share of the research budget in most countries is distributed competitively through funding organizations (van den Akker, 2016) and, second, the amount of third-party funding acquired is used in most assessment procedures at least as an informational criterion (Ochsner et al., 2012). Grant allocation is gaining importance especially for the careers of young scholars: on the one hand, the job opportunities of young researchers are increasingly characterized by short-term contracts based on external funding (van Arensbergen et al., 2014b); on the other hand, allocated grants serve as proof of excellence in talent selection decisions (van Arensbergen et al., 2014a).

Third-party funding implies ex ante research assessment, that is, research is assessed before it has been conducted. While most ex ante assessments are based on peer review, many of them use bibliometric data to inform the peers. These processes have been in place for some time, largely unnoticed by most SSH scholars, for whom research grants are less important because they do not need expensive infrastructure for their research (Krull and Tepperwien, 2016). The growing importance of grants in science policy at the national and international level, however, has drawn the attention of SSH scholars to the processes of distributing research grants, because there are huge differences between the STEM and SSH disciplines in the distribution of grants (Krull and Tepperwien, 2016), not to mention in the amounts awarded.

The lower success rates and the lower amounts of third-party funding acquired have their roots in the epistemic differences in research practices between the STEM and SSH disciplines, as well as in a different disciplinary organization and divergent practices of research evaluation. Only a minority of SSH scholars need expensive instruments to conduct experiments; the basic needs SSH scholars usually express are a computer, access to archives, travel expenses and research time (Krull and Tepperwien, 2016). Therefore, third-party funding long played no role in most SSH disciplines, and grants are usually of comparably low amounts.

Second, the way SSH scholars appraise the research output of colleagues is quite different from how STEM researchers do. SSH scholars are much more critical: they criticize even work they value as excellent, and a meticulous point-by-point examination is considered a proof of appreciation. In interdisciplinary panels, STEM researchers will not agree to fund research that is heavily criticized. Because SSH scholars always criticize the work of their colleagues, irrespective of the quality of the research, SSH scholars are often discriminated against in interdisciplinary granting schemes (Krull and Tepperwien, 2016), even though this practice of criticizing works well within SSH disciplines (König, 2016; Krull and Tepperwien, 2016).

Third, in the STEM disciplines, paradigmatic issues are usually disputed internally while a coherent front is presented to the outside. The SSH disciplines, however, do not resolve such issues but allow for diversity within their fields (van den Akker, 2016). Of course, this is rooted in a different understanding of scholarly work (linear progress in the STEM disciplines versus an increase of the knowledge base in the SSH disciplines; Lack, 2008), but it is also the result of a lack of organization. This leads to further marginalization, as the SSH disciplines do not stand together to voice unified criticism of the short-sighted focus on the linear progress of science (van den Akker, 2016) and to demand, with a powerful united voice, funding schemes adequate for SSH research.

At the same time, some funders are frustrated that their schemes do not attract more proposals from SSH disciplines (König, 2016), perhaps because SSH scholars do not take the risk of writing a proposal when past experience suggests it is likely to be turned down. The Fritz Thyssen Stiftung and the VolkswagenStiftung have therefore created a funding programme adapted to the needs of humanities scholars, entitled “Focus on Humanities”, which includes the Opus Magnum grant intended to bridge the gap between the humanist way of doing research and a competitive funding component. In addition, the VolkswagenStiftung (2014) has established bottom-up guidelines on how to recognize intellectual quality in the humanities, collected in a workshop with renowned senior scholars and young scholars.

SSH research practices and criteria for research quality

To assess research performance, there should be an explicit understanding of what “good” research is, since any assessment singles out “high quality” research or tries to judge which research is “better” (Butler, 2007). However, not much is known about what research quality actually means (see e.g. Kekäle, 2002), especially in the SSH. The literature on research assessment actively avoids this topic, and existing tools and procedures of research assessment do not include an explicit understanding of research quality (Glänzel et al., 2016). Rather, authors revert to “impact”, which is easier to measure but not congruent with “quality” (Gumpenberger et al., 2016) (Footnote 3). Therefore, if SSH research is to be assessed appropriately, there must be knowledge of what research quality actually means in these disciplines, and assessment procedures must relate to the conceptions of research quality of the assessed scholars. To get a grasp of what guides judgements of good or bad research, we need empirical knowledge of research practices and of the notions of quality that humanities scholars use to interpret, structure and evaluate the events and entities in their research activities.

Over the last hundred years, scholars have analysed the research practices of the STEM disciplines, especially the natural sciences, in detail; however, the newly emerging field of social studies of science neglected its own (SSH) disciplines until recently (Hemlin, 1996: 53; Hammarfelt, 2012: 164). The literature so far describes the characteristics of SSH research as follows: a) SSH research is interpretative, that is, humanities research is mainly text- and theory-driven and social science research is more concept-driven, while the natural sciences set up their studies to answer specific questions and are progress-driven (MacDonald, 1994; Guetzkow et al., 2004; Lamont, 2009); b) it is reflective and introduces new perspectives into academia by fostering discursive controversy and competing visions (Fisher et al., 2000; Hellqvist, 2010); with regard to society, it makes a decisive contribution to the training of critical thinking as a prerequisite for democracy (Nussbaum, 2010) and to the critical examination of modern trends, such as technologization (Luckmann, 2004); c) it is mainly individual (Finkenstaedt, 1990; Weingart et al., 1991), few publications are co-authored (Hemlin, 1996; Hellqvist, 2010) and research is often connected to the person conducting it (Hemlin and Gustafsson, 1996; Guetzkow et al., 2004); d) productivity is not that important for research performance in the SSH (Hemlin, 1993; Fisher et al., 2000; Hug et al., 2013); e) societal orientation is important, that is, research is meant to influence society, and direct interaction with society is part of SSH research (Weingart et al., 1991; Hellqvist, 2010; Hug et al., 2013); but f) the influence of society or other stakeholders outside academia, such as external funding, on SSH research is evaluated negatively (Hemlin, 1993; Hug et al., 2013; Ochsner et al., 2013).

These characteristics must be considered when assessing SSH research. Accordingly, several bottom-up projects by SSH scholars analyse how quality is perceived in the SSH disciplines. The European Educational Research Quality Indicators (EERQI) project (Gogolin et al., 2014) started from discontent with the current assessment practices applied to educational research (Gogolin, 2016: 105–106). The project lasted from 2008 to 2011 and aimed at the development of a set of tools (as opposed to a ranking, a rating or a single indicator) to detect research quality (for a summary of the project and its tools, see Gogolin, 2016). The project differentiates between extrinsic quality indicators, that is, quality indicators that are not inherent to the text (such as number of citations, webometrics, authorships), and intrinsic quality indicators, that is, indicators that are inherent to the text (such as rigour, stringency). Part of this set of tools was a peer review questionnaire that included five intrinsic quality criteria for educational research: rigour, originality, significance, style and integrity. The criteria were developed in collaboration with experts in the field, mainly organized within national associations (Gogolin and Stumm, 2014). The project also included an exploratory natural language processing system to highlight the most important sentences in an article. The idea behind the tool was to help reviewers judge an article’s quality by guiding their attention to its most important parts (Sandor and Vorndran, 2014a). Tests with the tool showed that while texts in STEM disciplines follow a clear structure and thus offer a high potential for automated highlighting, articles in SSH disciplines do not follow such a standard structure. Using keywords and different categories of sentences (for example, problem, summary), the authors argue that highlighting might considerably reduce the time needed for reviewing an article.
However, highlighting did not cover two criteria appropriately, namely integrity and rigour; thus, reviewers using highlighted versions of an article did not always rate those criteria. Furthermore, the accuracy of the highlighting differs between (sub-)disciplines, and the agreement between automated summaries and reviewers’ summaries differed between languages (Sandor and Vorndran, 2014a: 50–52). While the authors argue that automatic highlighting works to a certain degree and that a highlighting tool is a promising aid for easing peer review workload, the results also suggest that there are severe limits to its usefulness for the assessment of SSH manuscripts, especially with regard to the quality criteria: two out of five criteria (i.e. integrity and rigour) tend to be overlooked, and language and (sub-)discipline affect the results, in that summaries by English experts are closer to the sentences highlighted by the tool than the summaries of the French experts, and the error rate of the highlighting tool is higher for psychological articles than for sociological or historical ones. However, the authors also used this tool in the multilingual search engine for the EERQI database and found that it can enhance the search results (Sandor and Vorndran, 2014b).

Also for educational research, Oancea and Furlong (2007) developed criteria for research performance. They define educational research as practice-based and state that such research is not confined to scientificity (that is, the discovery of universal findings or even laws), impact or economic efficiency but also encompasses, amongst others, methodological and theoretical rigour, dialogue, deliberation, participation, ethics and personal growth. They argue that the evaluation of practice-based research has to cope with the entanglement of research and practice, which means that evaluation still has to reflect reasoning and knowledge but must also open up to more experimental modes of knowledge coming from within a context of concrete situations and first-person action. While they do not aim to set standards of good research practice, they conclude that research assessment needs to re-integrate a cultural and philosophical dimension that has been lost in the current discourse on research assessment (Oancea and Furlong, 2007).

A more descriptive approach was chosen by Guetzkow, Lamont and Mallard (2004). They analysed interviews with peer review panellists from multidisciplinary fellowship competitions and found that originality was the most frequently mentioned criterion for judging applications. They therefore focused on analysing originality and found that it is defined differently across disciplines: humanists often referred to originality of data and approach, whereas social scientists emphasized originality of methods. Besides originality, however, there were also other important criteria, for example clarity, social relevance, interdisciplinarity, feasibility and importance. Note that these are not necessarily criteria for judging research quality but criteria for judging fellowship proposals. Because the authors focused on originality for a more thorough analysis, we do not learn whether there were also disciplinary differences in the salience of those other criteria and in the meanings given to them. Given the results regarding originality, however, it is likely that such differences do exist.

The project “Developing and Testing Quality Criteria for Research in the Humanities” (Ochsner et al., 2016) applied a strict bottom-up approach and developed a framework for the exploration and development of quality criteria for SSH research (Hug and Ochsner, 2014) that consists of four pillars: adopting an inside-out approach (adequate representation of the scholarly community, including young scholars, in the development process; discipline-specific criteria); applying a sound measurement approach (linking indicators to quality criteria derived from the scholars’ notions of quality); making the notions of quality explicit (applying methods that can elicit criteria from the scholars’ tacit knowledge of research quality to draw a comprehensive picture of what research quality is in a given discipline, and making transparent which quality aspects are measured or included in the assessment and which are not); and striving for consensus (methods and especially criteria to be applied in research assessment have to be accepted by the community). This framework was applied to three humanities disciplines known to be difficult to assess with scientometric methods: German literature studies, English literature studies and art history. In a first step, the scholars’ implicit knowledge about research activities was investigated, made explicit and summarized into different conceptions of research using Repertory Grid interviews (Ochsner et al., 2013). The results showed that two conceptions of research exist, a modern and a traditional one. This differentiation is not connected to quality: both modern and traditional research can be of excellent or poor quality. Remarkably, the results also reveal that many commonly used indicators for research assessment, such as interdisciplinarity, internationality, cooperation and social impact, are in fact indicators of the modern conception of research and are not related to quality (Ochsner et al., 2013).
Besides these observations about scholars’ conceptions of research, quality criteria were extracted from the scholars’ notions of quality. In a second step, these quality criteria were completed and rated by all scholars in the three disciplines at the Swiss and LERU universities (League of European Research Universities), thus identifying consensual quality criteria for research using the Delphi method (Hug et al., 2013). In accordance with the measurement approach, indicators were identified for the consensual quality criteria (Ochsner et al., 2012) and also rated by the scholars. The results of the project indicate that there are many quality criteria for research in the humanities to consider in research assessments. Many criteria are common to all three disciplines, but there are also some discipline-specific criteria. Furthermore, there is a mismatch between the humanities scholars’ quality criteria and the criteria applied in evaluation procedures (Hug et al., 2013). Importantly, only about 50% of the relevant quality criteria can be measured with quantitative indicators. Humanities scholars will therefore be critical of research assessments by means of indicators. Concerning research assessment by means of quality criteria, the studies show that a broad range of quality criteria must be applied and disciplinary differences have to be taken into account. With a certain amount of care, research indicators linked to the relevant criteria can be used to support the experts in research assessments (informed peer review). The project shows that humanities scholars are ready to take part in the development of quality criteria for research assessment if a strict bottom-up approach is followed and transparency is assured (Ochsner et al., 2014).

In the context of a broad examination of research assessment in law studies, Lienhard et al. (2016) present quality criteria for research in law studies, drawing on the first findings of the project described above (Hug et al., 2013) and complementing them with discipline-specific criteria from law studies. Because law is closely connected to a profession, the authors also included professionals (lawyers) in their analysis and found differences between professors and lawyers in their preferences for quality criteria: originality, reflexivity and theoretical soundness were emphasized much more by professors than by lawyers, while clear language and correctness were more important to lawyers. Besides differentiating evaluations by different stakeholders, for example professors, lawyers or funders, they also differentiate between different assessment situations, for example research evaluation, the assessment of dissertations and habilitations, or the assessment of scholarly journals (Lienhard et al., 2016: 177).

In France, the Maison des Sciences de l’Homme en Bretagne (MSHB) supported two bottom-up projects related to research assessment in the humanities (for an overview, see Williams and Galleron, 2016). The first project, IMPRESHS, was designed to investigate the dissemination practices and impact paths of research conducted by Breton scholars from various SSH disciplines (see https://www.mshb.fr/projets_mshb/impreshs/2314/). Through focus group interviews and a thorough analysis of CVs, the project tried to identify publications with potential impact outside academia, as well as non-academic stakeholders of SSH researchers. The goal of the project was to understand what kind of relations SSH scholars build with these stakeholders and to what extent one finds practices of co-creation of knowledge in France, such as those described within the European project SIAMPI (http://www.siampi.eu). One of the major outcomes of the project was to uncover that many SSH scholars exercise a form of self-censorship when it comes to declaring forms of research or outputs aimed at a broader, non-scholarly readership, these not being included in institutional forms of reporting or in CVs. This finding drew the project team’s attention to the problems French scholars face when declaring their work, since the available fields in templates from AERES (the national agency for evaluation of higher education and research) and the metadata structure in national repositories (such as HAL, Hyper Articles en Ligne) do not do justice to the large variety of outputs SSH research produces beyond the well-known books traditionally associated with the field. The project ultimately produced a more refined typology of outputs, which supported the creation of a pilot database designed to cope more appropriately with the wealth and variety of SSH research.

The second project, QualiSHS, looked at how evaluative reports produced by AERES reflect disciplinary representations of quality. All evaluative reports produced in 2010–2011 about the activity of all research units in history and law in two French regions (Bretagne and Rhône-Alpes) were scrutinized using methods and tools from corpus linguistics, in search of formulations that reveal how peer experts conceptualize and perceive quality in the activities and outputs they evaluate. Interviews conducted in parallel confirmed that experts from the two investigated fields diverge in their perceptions of quality, a finding in line with what other studies have pointed out about the diversity of SSH disciplines when it comes to the conceptualization of research quality (see, for example, Hug et al., 2013; Gogolin and Stumm, 2014; Lienhard et al., 2016). Yet the reports do not echo these specificities adequately, since the main criteria they put forward are invariably the coherence of the research conducted in the evaluated unit and its productivity. It is not surprising, therefore, that the French SSH community found the evaluation conducted by AERES unsatisfactory on the whole and called for a radical modification of the exercise, a demand that was only very partially answered through the evolution of AERES into HCERES (Footnote 6).

National research evaluation practices and the SSH

There are several projects at the national level that approach (national) research assessment in the SSH from a bottom-up perspective or whose models were designed to reflect SSH specificities. The SSH are included to different degrees, ranging from the implementation in Norway of a performance-based funding model under the lead of an SSH scholar, which thus accounted for SSH research practices from the beginning (some even say the system gives the SSH an advantage; see Aagaard et al., 2015) (Sivertsen, 2016), to a purely bottom-up approach in Switzerland based on research on SSH research practices and their implications for evaluation methods (Loprieno et al., 2016).

The so-called “Norwegian model” (Schneider, 2009) has attracted considerable attention in recent years, and similar models have been implemented in several countries (Flanders in Belgium, Denmark, Finland and Portugal). The Norwegian model is a performance-based funding model intended to “represent all areas of research equally and properly” (Sivertsen, 2016: 80). The design of the model is a “simple pragmatic compromise” (Sivertsen, 2016: 80): one bibliometric indicator that covers all areas of research comprehensively, rather than several representations of publication practices for individual disciplines. It consists of three components: a national database that fully covers peer-reviewed scholarly output from all disciplines, including books; a simple publication indicator dividing publications into level 1 and level 2 publications, with a system of weights that makes discipline-specific publication traditions comparable at the level of institutions; and a performance-based funding model that reallocates a small fraction of the yearly funding according to the results of the indicator (Sivertsen, 2016: 79). Of course, the Norwegian model would also work without the third component (performance-based funding).

The indicator separates non-academic from academic publications by channel (books: publishers; journal articles: journals). Non-academic publications are not eligible for the performance indicator, while academic publications are further divided into level 1 and level 2 publications; level 2 channels cannot represent more than 20% of the world’s publications in a field. The government selects renowned scholars (deans, representatives from learned societies) from all major areas of research to be involved in assigning publishers and journals to the levels, resulting in discipline-specific lists of channels.
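The arithmetic behind such a publication indicator can be sketched in a few lines of code. This is an illustrative sketch, not the official algorithm: the level weights shown (articles 1/3, book chapters 0.7/1, monographs 5/8) follow values commonly cited for the Norwegian model, but the data structure and the simple author-share fractionalization are assumptions for illustration (the Norwegian system has, for instance, revised its fractionalization rule over time).

```python
# Illustrative sketch of a Norwegian-style publication indicator.
# Weights per (publication kind, channel level); the values below are
# the commonly cited level 1 / level 2 weights, used here as assumptions.
WEIGHTS = {
    ("article", 1): 1.0, ("article", 2): 3.0,
    ("chapter", 1): 0.7, ("chapter", 2): 1.0,
    ("monograph", 1): 5.0, ("monograph", 2): 8.0,
}

def publication_points(pubs):
    """Sum weighted, author-fractionalized points for one institution.

    Each publication is a dict with keys: kind, level, n_authors, and
    n_local_authors (authors affiliated with the institution).
    """
    total = 0.0
    for p in pubs:
        weight = WEIGHTS[(p["kind"], p["level"])]
        # Simple author-share fractionalization (an assumption here).
        fraction = p["n_local_authors"] / p["n_authors"]
        total += weight * fraction
    return total

pubs = [
    {"kind": "monograph", "level": 2, "n_authors": 1, "n_local_authors": 1},
    {"kind": "article", "level": 1, "n_authors": 4, "n_local_authors": 2},
]
print(publication_points(pubs))  # 8.0 + 1.0 * 0.5 = 8.5
```

Under this scheme a sole-authored level 2 monograph counts far more than a fractional share of a co-authored journal article, which is one way the weights make book-oriented SSH output comparable to article-oriented STEM output at the institutional level.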

The system receives more attention from SSH scholars than from scholars in other areas. While the initial reaction was negative, because the system turns scholarly output into measures and is not designed to cover all scholarly activity but only academic publications, the evaluation of the system showed no major discontent among scholars (Aagaard et al., 2015). This may well be because the indicator showed high productivity in the SSH disciplines. In addition, while the main effect of the system has been an increase in publication activity, publication patterns did not change: book publishing, international publishing and language use remained stable. Of course, the evaluation also revealed some issues with the funding system: the fractionalization of authorships favours the SSH, the appointment of experts to define the publication levels is not transparent, and there is unintended use of the system at the individual level (Aagaard et al., 2015).

In the Netherlands, the Royal Academy of Arts and Sciences criticized the predominance of methods for (and from) the natural and life sciences in assessment practices in a 2005 report entitled “Judging Research on its Merits” and called for specific methods for evaluating the SSH disciplines ( Royal Netherlands Academy of Arts and Sciences, 2005 ). In 2009, the Committee on the National Plan for the Future of the Humanities stated that the existing assessment tools are inadequate for judging the quality of humanities research and advised the Academy to develop a simple, clear, and effective system of indicators for the humanities ( Committee on the National Plan for the Future of the Humanities, 2009 ). The Academy therefore installed a Committee on Quality Indicators in the Humanities, whose report was published in 2011 ( Royal Netherlands Academy of Arts and Sciences, 2011 ). The committee summarizes the situation of research assessment in the humanities as follows: some policy makers have overly high expectations of a simple, purely metric system for comparing research performance between research groups and even disciplines; on the other hand, within the humanities disciplines there is an excessive aversion to “measuring” research quality and to management tools in general. The committee therefore suggests a middle way and promotes an informed peer review process for SSH research assessments. Peer reviewers assess research along two dimensions, scholarly quality and societal quality. Each dimension is assessed using three criteria: scholarly/societal publications or output, scholarly/societal use of output, and evidence of scholarly/societal recognition. Each of these criteria can be measured by quantitative indicators that support the peers in their decision making (for a schematic overview, see Royal Netherlands Academy of Arts and Sciences, 2011 : 47).
This should add some inter-subjectivity to the peer review process while recognizing that the quantitative indicators themselves are usually grounded in peer review in the first place ( Royal Netherlands Academy of Arts and Sciences, 2011 : 11).

The German Council of Science and Humanities ( Wissenschaftsrat ) reacted in 2004 to the growing importance of university rankings, whose methodology and validity it criticized, by issuing recommendations on research rankings ( Wissenschaftsrat, 2004 ). It set up a comprehensive pilot study to develop and test a national research rating in the disciplines of chemistry and sociology. While such exercises rarely provoke strong reactions in the natural and life sciences, they are more controversial in the SSH disciplines. The research rating in sociology nevertheless worked well, but it also met with criticism: in particular, the lack of transparency of the plenary discussions in the panel, which undermined the independence of the judgements of the two peers per research unit, was identified as a danger to the objectivity and validity of the rating ( Riordan et al., 2011 ). In 2008, the Wissenschaftsrat decided that pilot studies in other disciplines were to be conducted to improve the procedure ( Mair, 2016 ). History was selected for the pilot study in the humanities. However, the rating for history provoked strong resistance and ended with a boycott by the Association of German Historians ( Plumpe, 2009 ). Mair (2016) suggests that the historians’ resistance was mainly due to miscommunication by the Wissenschaftsrat, which led to the perception of a top-down-imposed assessment. To make the bottom-up intentions more explicit, a working group was created that developed modifications to adapt the procedure to the characteristics of humanities research ( Wissenschaftsrat, 2010 : 203–205). In 2012, a pilot study in the humanities was eventually conducted. While still opposed to the notion of quantifying research performance, the associations of English and American Studies decided to take part in the exercise ( Stierstorfer and Schneck, 2016 ).
The Wissenschaftsrat deemed the exercise a success, showing that such a rating is possible in the humanities; the humanities scholars involved acknowledged the Wissenschaftsrat’s effort to adapt the procedure to the humanities but also identified some negative aspects and consequences of the exercise, such as a division into different sub-disciplines instead of a focus on commonalities ( Hornung et al., 2016 ).

In Switzerland, the Rectors’ Conference of the Swiss Universities (CRUS, called swissuniversities since 1 January 2016) published a position paper on research assessment in 2008 entitled “The Swiss Way to University Quality”, which includes ten recommendations for quality monitoring ( CRUS, 2008 ). According to the CRUS, each Swiss university has its own specialization; quality assurance therefore has to be tailored to the mission of each university, and a national assessment procedure would not make much sense. Instead, each university should build its own quality assurance system. An analysis of the potential of bibliometric indicators for research monitoring showed that these procedures are not suited for use in the SSH. Therefore, a project entitled “Mesurer les performances de la recherche” was initiated that focused on the diversity of SSH research, because research “includes a wide array of aspects, from the discovery of new knowledge and promoting young researchers to potential impacts on the scientific community and society” ( Loprieno et al., 2016 : 14). Since the relevance of these aspects differs between disciplines and university missions, the project paid particular attention to such differences and to the particularities of the disciplines. The project lasted from 2008 to 2012 and was followed by a second project from 2013 to 2016.
In these two projects, several bottom-up initiatives were funded that researched such diverse topics as (for a complete overview of the projects, see Loprieno et al., 2016 ) profiling in communication sciences ( Probst et al., 2011 ), cooperation of research teams with university partners as well as external stakeholders ( Perret et al., 2011 ), notions of quality among literature studies and art history scholars ( Ochsner et al., 2016 ), evaluation procedures and quality conceptions in law studies ( Lienhard et al., 2016 ), and academic reputation and networks in economics ( Hoffmann et al., 2015 ).

At the same time, the Swiss Academy of Humanities and Social Sciences (SAGW) started a bottom-up initiative to reflect on research assessment in the SSH disciplines. Following a conference on the broader topic entitled “For a New Culture in the Humanities” ( SAGW, 2012b ), the SAGW published a position paper on new developments in the humanities, including recommendations on assessment practices ( SAGW, 2012a : 32–36), that emphasizes the importance of bottom-up definitions of quality criteria and methods. The SAGW subsequently funded projects within its member associations to develop recommendations or standards for research assessment in their disciplines. The resulting report features statements from Asian and Oriental studies, area studies, cultural and social anthropology, peace research, political sciences, art history, and environmental humanities, accompanied by a synthesis report by the SAGW ( Iseli, 2016 ).

Bottom-up initiatives at the European level

The different assessment procedures applied at the university or national level, the initial exclusion of SSH research from the ERC grant schemes, and initial concerns about severe cut-backs for the SSH in the Horizon 2020 programme ( König, 2016 : 154–155) have increased SSH scholars’ interest in the topic of research assessment. As the sections above show, SSH research on research assessment and evaluation is on the rise, leading to sessions or even tracks dedicated to SSH research assessment at international scientometric conferences such as ISSI 2015 ( www.issi2015.org ) and STI 2016 (sti2016.org), and to an international conference dedicated exclusively to SSH research evaluation, RESSH 2015 ( www.ressh.eu ). Even more importantly, SSH scholars have teamed up with scientometricians concerned about the state of SSH research assessment (often SSH scholars themselves) in a European association, the EvalHum Initiative ( www.evalhum.eu ). EvalHum sets out to motivate and support bottom-up work on research evaluation in the SSH and encourages best practices in SSH research evaluation that ensure adequate assessment procedures for the respective disciplines. EvalHum also serves as a forum on this topic and strives for proper recognition of SSH research at the European level.

Currently, a COST Action entitled “European Network for Research Evaluation in the Social Sciences and Humanities (ENRESSH)” (CA-15137) brings together SSH scholars from 30 European countries working to improve assessment procedures in and for the SSH ( http://www.cost.eu/COST_Actions/ca/CA15137 ). The idea behind the Action is “evaluating to valorize”, because applying ill-adapted methods leads to an under-valuation of SSH research. Participants in the Action share data about SSH research and compare methodologies, resulting in co-authored publications as well as policy briefs, collections of best practices, and, ultimately, guidelines for SSH research evaluation. ENRESSH also seeks to involve the different stakeholders that have a say in assessment principles and processes, in order to progress towards adequate frameworks and practices for assessing SSH research. The Action consists of four Work Groups. The first Work Group focuses on conceptual frameworks for SSH research assessment and studies SSH knowledge production processes and strategies as a basis for developing adequate assessment procedures that reflect SSH research practices. It investigates SSH scholars’ perceptions of research quality, peer review practices, and national assessment practices. The second Work Group addresses the societal impact and relevance of SSH research. It examines the structural requirements for a smooth transfer of SSH research to society and national policies on transfer to socio-economic or NGO partners, and proposes procedures for collecting data on engagement with society as well as measures to better value the SSH. The third Work Group concerns databases and the use of data for understanding SSH research. It builds standards for the interoperability of, and methods for integrating data from, current research information systems and repositories dedicated to the SSH, to allow comparison of SSH publishing practices across countries. It analyses the characteristics of SSH dissemination channels, develops common rules for building databases, designs a roadmap for a European bibliometric database, and develops alternative metrics for the SSH. The fourth Work Group is concerned with disseminating the results of the Action. It compiles a list of relevant European stakeholders in SSH research assessment, interacts actively with them, and organizes conferences.

The future of research assessment in the humanities

While research on assessment in the SSH until recently focused on the deficiencies of current assessment methods, such as bibliometrics and scientometrics, much research now takes a bottom-up approach: it focuses on research practices in the SSH and reflects on how to assess SSH research with its own methods instead of applying and adjusting methods developed for and in the natural and life sciences (see also Hammarfelt, 2016 : 115). This is an important development, because the examples in the sections above show that whenever scholars felt that assessment procedures were imposed top-down without proper adjustment to SSH research, the result was boycott or resistance (see, for example, Academics Australia, 2008 ; Andersen et al., 2009 ; Mair, 2016 ).

The projects presented in this article furthermore show that if assessment procedures adequately reflect SSH research practices, scholars are ready to collaborate (for example, Giménez-Toledo et al., 2013 ; Ochsner et al., 2014 ) and accept research assessment more readily, as in the Norwegian and German cases ( Aagaard et al., 2015 ; Sivertsen, 2016 ; Stierstorfer and Schneck, 2016 ). Full-coverage databases including all relevant document types are of value for scholarly work ( Gogolin, 2016 ; Sandor and Vorndran, 2014a , b ) and increase the visibility of humanities research production ( Aagaard et al., 2015 ). While there is some degree of convergence between some countries’ databases ( Giménez-Toledo et al., 2016 ), the conditions for full interoperability have yet to be discussed. It also has to be borne in mind that universities fulfil different missions and countries face diverse challenges. Criteria and procedures for research evaluation should be adapted to the missions of the universities and to the specific aims of the evaluation ( Loprieno et al., 2016 ).

The future of research assessment in the humanities therefore lies in bottom-up procedures that are based on the research practices of the respective disciplines. However, the projects presented in this article show that more research on research practices in the humanities is needed. Such research has only just begun. If bottom-up approaches are to be followed, more knowledge is needed on how research is conducted and disseminated, as well as on how it is used by different stakeholders, including the SSH researchers themselves.

Combining the approaches and the insights on SSH research production presented in this article, we propose the following recommendations for research assessment in the humanities (these recommendations draw on Ochsner et al., 2015 ):

The preferred method of evaluation is informed peer review : peer review is accepted among scholars as an assessment procedure. However, it has several drawbacks, such as poor inter-subjectivity and low reliability due to dependency on panel composition ( Bornmann, 2011 ; Riordan et al., 2011 ; Royal Netherlands Academy of Arts and Sciences, 2011 ). Scientific and political measures can, however, be taken to mitigate these drawbacks, such as applying a fair evaluation process that grants the evaluated scholars the opportunity to comment on the process and its results.

A broad range of quality criteria has to be taken into account. The quality criteria must be developed bottom-up and reflect the notions of quality of the assessed scholars ( Hug et al., 2013 ; Ochsner et al., 2013 ), as they alone can judge what quality in the discipline actually is, and they see research quality predominantly as academic quality ( Kekäle, 2002 ). To ensure that all paradigms and research traditions, as well as new ways of thinking, are included, the quality criteria should be developed by surveying all scholars to be evaluated.

For the quality criteria that reach consensus among the scholars, indicators can be identified. The scholars should rate the indicators with regard to how adequately they measure the respective criterion.

From the quality criteria and indicators that reach consensus among the scholars, an evaluation sheet is to be created. The evaluation sheet thus includes both criteria that can be measured with indicators and criteria that cannot be measured ( Ochsner et al., 2012 ).

Other stakeholders’ criteria for research performance can be included in the evaluation sheet to take into account goals of research other than academic quality ( Royal Netherlands Academy of Arts and Sciences, 2011 ). While not developed specifically for the humanities, the “Evaluating Research in Context” project could serve as an example, as it allows a bottom-up approach to societal impact ( Spaapen et al., 2007 ). The criteria and indicators from other stakeholders should be marked as such to ensure transparency for the researchers and to make visible what is important from an academic point of view and what is important from other stakeholders’ perspectives.

The peers must rate every criterion on its own. This is in line with the insights of Thorngate et al. (2009) , who summarize the findings of their comprehensive research on decision making as follows: overall judgements are usually inconsistent and inadequate for judging merit, whereas judging separately according to specified criteria yields more reliable results ( Thorngate et al., 2009 : 26). The amount of reading required of the peers should be kept within reasonable limits.

Rankings or ratings with an overall measure should not be published. Instead, the results for every single criterion should be provided. If overall ratings are produced, the weighting procedure has to be made transparent. However, it should be kept in mind that research units have different missions to fulfil; an overall rating might therefore favour some missions over others, leading to structural discrimination against some research units.
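The recommendations above can be illustrated with a minimal sketch of such an evaluation sheet: peers rate each criterion separately, the ratings are aggregated per criterion (here with the median), and deliberately no overall score is produced. The criterion names and indicators are invented for illustration; in practice they would be developed bottom-up with the scholars being assessed.

```python
from statistics import median

# Hypothetical evaluation sheet: some criteria have supporting
# quantitative indicators, others can only be judged by peers.
EVALUATION_SHEET = {
    "rigour":           [],                         # no adequate indicator
    "scholarly impact": ["citations", "reviews"],   # indicator-supported
    "societal use":     ["uptake by practitioners"],
}

def per_criterion_report(ratings):
    """Aggregate peer ratings criterion by criterion (median),
    deliberately producing no overall score."""
    return {criterion: median(scores) for criterion, scores in ratings.items()}

# Three peers rate each criterion on a 1-5 scale.
ratings = {
    "rigour": [4, 5, 4],
    "scholarly impact": [3, 3, 4],
    "societal use": [2, 3, 3],
}
print(per_criterion_report(ratings))
# {'rigour': 4, 'scholarly impact': 3, 'societal use': 3}
```

Publishing only the per-criterion results, as here, avoids the hidden weighting that an overall rating would impose on units with different missions.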

Many important issues of our times are global in nature, and society places high hopes in technical solutions. The SSH, and specifically the humanities, are therefore not at the centre of public discourse. The critical questions the SSH disciplines ask, in particular, are not high on the political agenda. However, complex global issues such as global warming, the migration crisis, ageing, or HIV cannot be sufficiently resolved without the knowledge of the SSH disciplines. The critical questions challenging blind technological faith in overcoming such problems are crucial. Not being at the top of the political agenda, however, does not mean giving in to the mainstream neo-positivist notion of a parametrically steered research policy. Nor does it mean that SSH scholars should frown upon all requests for accountability. Instead, the SSH disciplines should step forward and self-confidently and openly question truisms and blind technological faith and propose alternatives to simple but misleading practices. This article presents many bottom-up actions of SSH scholars taking research assessment into their own hands. These bottom-up procedures will certainly lead to a more adequate assessment of SSH research, but they might also help foster a better valorization of SSH research among policy makers and colleagues from the natural sciences. Eventually, some scientists may even find these approaches fruitful for their own disciplines. At the same time, an adequate evaluation and valorization of SSH research will help society to better understand what the SSH can contribute to solving major societal challenges. Taking the time to encourage bottom-up evaluation initiatives should therefore result in better solutions to modern societies’ problems.

Data availability

Data sharing is not applicable to this article as no datasets were analysed or generated.

Additional information

How to cite this article : Ochsner M et al. (2017) The future of research assessment in the humanities: bottom-up assessment procedures. Palgrave Communications . 3:17020 doi: 10.1057/palcomms.2017.20.

Because it comes from SSH scholars and is clearly bottom-up in nature, we include, however, the initiative by Oancea and Furlong (2007) that was motivated by, but did not have a visible impact on, the RAE in the United Kingdom.

Despite the inclusion of (some) books in the commercial databases in recent years as well as the rise of networking sites also promoting bibliographic data, the under-coverage of certain disciplines and languages remains, while technical challenges arise and issues of transparency persist ( Gorraiz et al., 2013 ; Murray, 2014 ; Zuccala and Cornacchia, 2016 ).

Others argue that there is a difference between performance-based funding and research evaluation: the former is used to distribute scarce funds and need not be related to quality, while the latter is formative in nature and includes an understanding of quality. However, while this might be true from the evaluator’s perspective, it is misleading with regard to the effect on scholars’ behaviour. If scholars are assessed by indicators, they perceive these not only as incentives but also as indications of what is expected from them (as well as of what is valued as ‘good’ research), and they will adjust their behaviour accordingly (see, for example, Hammarfelt and de Rijcke, 2015 ; Williams and Galleron, 2016 ).

As mentioned in the introduction, we focus on European initiatives for coherence reasons and because of restrictions of space.

See the official website https://dbh.nsd.uib.no/publiseringskanaler/erihplus/

The Evaluation Agency for Research and Higher Education (Agence d'évaluation de la recherche et de l'enseignement supérieur, AERES) was replaced by the High Council for the Evaluation of Research and Higher Education (Haut Conseil de l'évaluation de la recherche et de l'enseignement supérieur, HCERES) on 17 November 2014.

Aagaard K, Bloch C and Schneider JW (2015) Impacts of performance-based research funding systems: The case of the Norwegian Publication Indicator. Research Evaluation ; 24 (2): 106–117.

Academics Australia. (2008) Letter to Senator the Honourable Kim Carr, Minister for Innovation, Industry, Science and Research , https://web.archive.org/web/20091221195149/http://www.academics-australia.org/AA/ERA/era.pdf , accessed 8 February 2017.

van den Akker W (2016) Yes we should; research assessment in the humanities. In: Ochsner M, Hug SE and Daniel H-D (eds). Research Assessment in the Humanities. Towards Criteria and Procedures . Springer International Publishing: Cham, Switzerland, pp 23–29.

Andersen H et al. (2009) Editorial: Journals under threat: A joint response from history of science, technology and medicine editors. Social Studies of Science ; 39 (1): 6–9.

Archambault É, Vignola-Gagne E, Cote G, Lariviere V and Gingras Y (2006) Benchmarking scientific output in the social sciences and humanities: The limits of existing databases. Scientometrics ; 68 (3): 329–342.

van Arensbergen P, van der Weijden I and van den Besselaar P (2014a) Different views on scholarly talent: What are the talents we are looking for in science? Research Evaluation ; 23 (4): 273–284.

van Arensbergen P, van der Weijden I and van den Besselaar P (2014b) The selection of talent as a group process. A literature review on the social dynamics of decision making in grant panels. Research Evaluation ; 23 (4): 298–311.

Arts and Humanities Research Council. (2006) Use of research metrics in the arts and humanities ; Report of the expert group set up jointly by the Arts and Humanities Research Council and the Higher Education Funding Council for England AHRC: Bristol, UK.

Arts and Humanities Research Council. (2009) Leading the World. The economic Impact of UK Arts and Humanities Research . AHRC: Bristol, UK.

Barker K (2007) The UK research assessment exercise: The evolution of a national research evaluation system. Research Evaluation ; 16 (1): 3–12.

Bornmann L (2011) Scientific peer review. Annual Review of Information Science and Technology ; 45 (1): 197–245.

Bunia R (2016) Quotation statistics and culture in literature and in other humanist disciplines. In: Ochsner M, Hug SE and Daniel H-D (eds). Research Assessment in the Humanities. Towards Criteria and Procedures . Springer International Publishing: Cham, Switzerland, pp 133–148.

Burrows R (2012) Living with the h-index? Metric assemblages in the contemporary academy. The Sociological Review ; 60 (2): 355–372.

Butler L (2003) Explaining Australia’s increased share of ISI publications—The effects of a funding formula based on publication counts. Research Policy ; 32 (1): 143–155.

Butler L (2007) Assessing university research: A plea for a balanced approach. Science and Public Policy ; 34 (8): 565–574.

Butler L (2008) Using a balanced approach to bibliometrics: Quantitative performance measures in the Australian research quality framework. Ethics in Science and Environmental Politics ; 8 (1): 83–92.

Butler L and McAllister I (2009) Metrics or peer review? Evaluating the 2001 UK research assessment exercise in political science. Political Studies Review ; 7 (1): 3–17.

Butler L and Visser MS (2006) Extending citation analysis to non-source items. Scientometrics ; 66 (2): 327–343.

Chi P-S (2012) Bibliometric characteristics of political science research in Germany. Proceedings of the American Society for Information Science and Technology ; 49 (1): 1–6.

Chi P-S (2014) Which role do non-source items play in the social sciences? A case study in political science in Germany. Scientometrics ; 101 (2): 1195–1213.

Committee on the National Plan for the Future of the Humanities. (2009) Sustainable Humanities: Report from the National Committee on the Future of the Humanities in the Netherlands . Amsterdam University Press: Amsterdam, The Netherlands.

Commission of the European Communities. (2000) Communication from the Commission to the Council, the European Parliament, the Economic and Social Committee and the Committee of the Regions: Towards a European research area . Commission of the European Communities: Brussels, Belgium.

Council for the Humanities, Arts and Social Sciences. (2009) Humanities and Creative Arts: Recognising Esteem Factors and Non- traditional Publication in Excellence in Research for Australia (ERA) Initiative ; CHASS Papers. Council for the Humanities Arts and Social Sciences: Canberra, Australia.

CRUS. (2008) The Swiss Way to University Quality . Rectors’ Conference of the Swiss Universities (CRUS): Bern, Switzerland.

Drabek A, Rozkosz EA, Hołowiecki M and Kulczycki E (2015) Polski Współczynnik Wpływu a kultury cytowań w humanistyce. Nauka i Szkolnictwo Wyższe ; 46 (2): 121–138.

European Research Area and Innovation Committee. (2015) European Research Area (ERA) Roadmap 2015–2020 . European Research Area and Innovation Committee: Brussels, Belgium.

Ferrara A and Bonaccorsi A (2016) How robust is journal rating in Humanities and Social Sciences? Evidence from a large-scale, multi-method exercise. Research Evaluation ; 25 (3): 279–291.

Finkenstaedt T (1990) Measuring research performance in the humanities. Scientometrics ; 19 (5): 409–417.

Fisher D, Rubenson K, Rockwell K, Grosjean G and Atkinson-Grosjean J (2000) Performance Indicators and the Humanities and Social Sciences . Centre for Policy Studies in Higher Education and Training: Vancouver, BC.

Genoni P and Haddow G (2009) ERA and the ranking of Australian humanities journals. Australian Humanities Review ; 46 , 5–24.

van Gestel R, Micklitz H-W and Poiares MM (2012) Methodology in the new legal world. EUI Working Papers LAW 2012/13. doi: 10.2139/ssrn.2069872.

Giménez-Toledo E (2016) Assessment of journal & book publishers in the humanities and social sciences in Spain. In: Ochsner M, Hug SE and Daniel H-D (eds). Research Assessment in the Humanities. Towards Criteria and Procedures . Springer International Publishing: Cham, Switzerland, pp 91–102.

Giménez-Toledo E et al. (2016) Taking scholarly books into account: Current developments in five European countries. Scientometrics ; 107 (2): 1–15.

Giménez-Toledo E, Tejada-Artigas C and Mañana-Rodríguez J (2013) Evaluation of scientific books’ publishers in social sciences and humanities: Results of a survey. Research Evaluation ; 22 (1): 64–77.

Glänzel W (1996) A bibliometric approach to social sciences. National research performances in 6 selected social science areas, 1990–1992. Scientometrics ; 35 (3): 291–307.

Glänzel W, Thijs B and Debackere K (2016) Productivity, performance, efficiency, impact—What do we measure anyway? Journal of Informetrics ; 10 (2): 658–660.

Gogolin I (2016) European educational research quality indicators (EERQI): An experiment. In: Ochsner M, Hug SE and Daniel H-D (eds). Research Assessment in the Humanities. Towards Criteria and Procedures . Springer International Publishing: Cham, Switzerland, pp 103–111.

Gogolin I, Astrom F and Hansen A (eds) (2014) Assessing Quality in European Educational Research . Springer VS: Wiesbaden, Germany.

Gogolin I, Stumm V (2014) The EERQI peer review questionnaire. In: Gogolin I, Astrom F and Hansen A (eds). Assessing Quality in European Educational Research . Springer VS: Wiesbaden, Germany, pp 107–120.

Gorraiz J, Purnell PJ and Glänzel W (2013) Opportunities for and limitations of the Book Citation Index. Journal of The American Society For Information Science and Technology ; 64 (7): 1388–1398.

Guetzkow J, Lamont M and Mallard G (2004) What Is originality in the humanities and the social sciences? American Sociological Review ; 69 (2): 190–212.

Guillory J (2005) Valuing the humanities, evaluating scholarship. Profession (MLA) ; 11 , 28–38.

Gumpenberger C, Glänzel W and Gorraiz J (2016) The ecstasy and the agony of the altmetric score. Scientometrics ; 108 (2): 977–982.

Hamann J (2016) The visible hand of research performance assessment. Higher Education ; 72 (6): 761–779.

Hammarfelt B (2012) Following the footnotes: A bibliometric analysis of citation patterns in literary studies . Doctoral dissertation. Skrifter utgivna vid institutionen för ABM vid Uppsala Universitet (Vol. 5). Uppsala Universitet: Uppsala, http://www.diva-portal.org/smash/get/diva2:511996/FULLTEXT01.pdf .

Hammarfelt B (2013) Harvesting footnotes in a rural field: Citation patterns in Swedish literary studies. Journal of Documentation ; 68 (4): 536–558.

Hammarfelt B (2016) Beyond coverage: Toward a bibliometrics for the humanities. In: Ochsner M, Hug SE and Daniel H-D (eds). Research Assessment in the Humanities. Towards Criteria and Procedures . Springer International Publishing: Cham, Switzerland, pp 115–131.


Acknowledgements

This article is based upon work from COST Action CA 15137 ‘European Network for Research Evaluation in the SSH (ENRESSH)’, supported by COST (European Cooperation in Science and Technology). Michael Ochsner and Sven E. Hug would like to thank swissuniversities for their grant for the project “Application of Bottom-up Criteria in the Assessment of Grant Proposals of Junior Researchers” within the “Programme P-3 Performances de la recherche en sciences humaines et sociales”. Matching funds were provided by the University of Zurich.

Author information

Authors and affiliations

ETH Zürich, Zürich, Switzerland

Michael Ochsner & Sven Hug

FORS, Lausanne, Switzerland

Michael Ochsner

University of Zürich, Zürich, Switzerland

Université Grenoble Alpes, Grenoble, France

Ioana Galleron


Corresponding author

Correspondence to Michael Ochsner.

Ethics declarations

Competing interests

The authors declare that they have no competing financial interests.

Rights and permissions

This work is licensed under a Creative Commons Attribution 4.0 International License. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in the credit line; if the material is not included under the Creative Commons license, users will need to obtain permission from the license holder to reproduce the material. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/


About this article

Cite this article

Ochsner, M., Hug, S. & Galleron, I. The future of research assessment in the humanities: bottom-up assessment procedures. Palgrave Commun 3, 17020 (2017). https://doi.org/10.1057/palcomms.2017.20


Received: 22 August 2016

Accepted: 27 February 2017

Published: 21 March 2017

DOI: https://doi.org/10.1057/palcomms.2017.20


This article is cited by

Philosophers’ appraisals of bibliometric indicators and their use in evaluation: from recognition to knee-jerk rejection.

  • Ramón A. Feenstra
  • Emilio Delgado López-Cózar

Scientometrics (2022)

Toward a classification of Spanish scholarly journals in social sciences and humanities considering their impact and visibility

  • Daniela De Filippo
  • Rafael Aleixandre-Benavent
  • Elías Sanz-Casado

Scientometrics (2020)


Organizing Research for Arts and Humanities Papers and Theses


Critical Engagement

Note: these recommendations are geared toward researchers in the arts or humanities.

Developing a research topic is an iterative process, even for a short paper. It emerges in stages and requires critical (but not criticizing) engagement with the evidence, literature, and prior research. The evidence can be an object, an artifact, a historic event, an idea, a theoretical framework, or existing interpretations.

Ultimately, you want to be able to pose a research question that you will then investigate in your paper.

If you are writing a paper for a course, the initial critical ideas and theoretical frameworks may come from your course readings. Pay attention to footnotes and bibliographies in your readings, because they can help you identify other potential sources of information.

As you are thinking about your topic, consider what, if anything, has already been written. If a lot of literature exists on your topic, you will need to narrow it down and decide how to make it interesting for your reader. Regurgitating or synthesizing what has already been said is unlikely to be exciting either for you or for those who will be reading your work. If there is little or no literature on your topic, you will need to think about how to frame it so as to take advantage of existing theories in the discipline. You may also be able to take advantage of existing scholarship on related topics.

Types of Research Papers

There are two common types of research papers in the arts and humanities: expository and argumentative . In an expository paper you develop an idea or critical "reading" of something, and then support your idea or "reading" with evidence. In an argumentative essay you propose an argument or framework that engages in dialogue with, and refutes, an existing interpretation, and you provide evidence both to support your own argument/interpretation and to counter the existing one. For further elaboration on expository and argumentative papers, as well as for examples of both types of essays, see the book titled The Art of Writing About Art , co-authored by Suzanne Hudson and Nancy Noonan-Morrissey, originally published in 2001. Note that particular disciplines in the arts or humanities may have other specialized frameworks for research.

Also, remember that a research paper is not "merely an elaborately footnoted presentation of what a dozen scholars have already said about a topic; it is a thoughtful evaluation of the available evidence , and so is, finally, an expression of what the author [i.e., you] thinks the evidence adds up to." [Sylvan Barnet, A Short Guide to Writing About Art (New York: Pearson/Longman, 2005), 238-239.]

If you select a broad topic

If a lot has been written on your topic, perhaps you should consider the following:

  • why are you interested in this topic?
  • is there something specific you want to address?
  • can you offer a different or a more nuanced interpretation?
  • is there a specific theoretical or methodological perspective that you would like to apply?
  • can you shed more light on specific evidence or detail(s)?
  • review scholarship cited in the footnotes/bibliographies of your readings and see if there are lacunae you can address.

If you stick with a broad topic, you run the risk of over-generalizing or summarizing existing scholarship, both of which have limited value in contemporary arts and humanities research papers. Summarizing is generally useful for providing background information, as well as for literature reviews; however, it should not constitute the bulk of your paper.

If you select a narrow or a very new topic

If you are interested in something very specific or very new, you may find that little has been written about it. You might even find that the same information gets repeated everywhere, because nothing else is available. Consider this an opportunity for you to do unique research, and think of the following:

  • is there a broader or a related topic that you can investigate in order to circle back and home in on your chosen topic?
  • can your topic be critically examined within an existing theoretical or methodological framework?
  • are you able to draw on another field of study to investigate your topic?
  • review scholarship cited in the footnotes/bibliographies of the readings; in other words, engage in citation chaining.
  • if the pertinent readings you find are not scholarly (this is not necessarily a bad thing), evaluate how you can use them to develop a more scholarly and critical context for investigating your topic.

Citing sources

Remember to keep track of your sources, regardless of the stage of your research. The USC Libraries have an excellent guide to citation styles and to citation management software.


Vernacular knowledge about contraception: an interdisciplinary perspective on myths, misperceptions and lived experience

Victoria Louise Newton (http://orcid.org/0000-0003-4622-6286)

School of Health Wellbeing and Social Care, Faculty of Wellbeing, Education and Language Studies, The Open University, Milton Keynes, UK

Correspondence to Dr Victoria Louise Newton, School of Health Wellbeing and Social Care, Faculty of Wellbeing, Education and Language Studies, The Open University, Milton Keynes, UK; victoria.newton{at}open.ac.uk

https://doi.org/10.1136/bmjsrh-2024-202355


  • contraception behavior
  • hormonal contraception
  • long-acting reversible contraception

Recent research points to social media as a driver in the rise of contraceptive misconceptions and misinformation. An attitudinal change towards hormonal contraception has also been documented, coupled with rising distrust of clinicians in favour of peer influencers, 1 a narrative that has been echoed in mainstream media. 2 However, the circulation of contraceptive narratives is not a new phenomenon, and not confined to social media.

I recently led a study, in collaboration with Public Health England and The Folklore Society, which explored communication about contraception between friends and family networks. 3 The ‘Reproductive Bodylore’ project (May 2020–December 2023) was a qualitative interdisciplinary study which straddled the fields of folklore and health and was funded by the Arts and Humanities Research Council (AHRC). It explored the role of vernacular (informal) knowledge in contraceptive decision-making. A short video abstract of the research and exhibition is available online. 4

Through the research we found that participants recognised the complexities and contradictions in the contraceptive information they come across both online and offline. They said that they needed to sift through and make sense of a lot of information, and that different sources of information served different functions. Medical information was gained from the National Health Service (NHS) website or from conversations with clinicians, whereas more embodied knowledge (what a contraceptive might actually be like) was provided by friends and family.

Participants’ key concerns about contraception included the impact of long-term hormonal contraceptive use on future fertility, and the day-to-day impact of unwanted side effects. Misunderstandings about fertility and contraception were common; in particular, we noticed a concern that emergency hormonal contraception can only be taken three times in a lifetime, otherwise it would have a detrimental effect on fertility.

Sometimes, the things people heard about contraception were interpreted within the context of lived experience. For example, a participant who struggled to conceive put this down to ‘chemicals’ being in her body following long-term use of the pill. This convergence of belief and experience suggests that framing informal knowledge as ‘myths and misperceptions’ may be unhelpful and does not account for individual interpretations of such knowledge.

Drawing on the interdisciplinary origins of this project, informal contraceptive knowledge and lived experience should be viewed holistically as ‘vernacular knowledge’, a term borrowed from folklore studies. 5 ‘Vernacular’ relates to everyday informal culture, and ‘knowledge’ includes ‘awareness’ or ‘knowledge of’ something, rather than solely something factual or believed. Vernacular knowledge is the unofficial, informal and everyday culture of a group, including the communication of medical knowledge in lay terms.

Vernacular knowledge is a useful term as it encompasses how people understand and interpret the functioning of their bodies within a broader sociocultural frame. Studying such knowledge can reveal how people conventionally understand the ways in which medical interventions act on the body, incorporating and articulating fears of risk to the body, reflecting cultural norms and understandings, and the histories, values and concerns or anxieties of a group or community.

At present, vernacular knowledge as a component of patients’ contraceptive literacy is overlooked in research, and inadequately addressed in contraceptive consultations. The circulation of contraceptive beliefs demonstrates that patients are interested and involved in their healthcare decisions. Participant concerns about future fertility highlight the importance of greater education around fertility awareness in relation to pregnancy planning and avoidance. Being open to patient experiences and vernacular knowledge would also align with future directions in best practice, such as those stated in the UK Faculty of Sexual & Reproductive Healthcare Hatfield Vision. 6

Ethics statements

Patient consent for publication

Not applicable.

Ethics approval

Acknowledgments

The Reproductive Bodylore project was led by The Open University with Project Partners: The Folklore Society and Public Health England. Thank you to all volunteer researchers and their interviewees without whom the project would not have been a success. Thanks also to expert advisory group members, Prof. Richard Jenkins, Co-I Prof. Lesley Hoggart and project Research Associate Dr. Mari Greenfield.

  • Schneider-Kamp A ,
  • The FSRH Hatfield Vision: A Framework to Improve Women and Girls’ Reproductive Health Outcomes. The Faculty of Sexual & Reproductive Healthcare (FSRH), 2022. Available: https://www.fsrh.org/news-and-advocacy/the-fsrh-hatfield-vision/

Contributors Sole-authored letter.

Funding This work was supported by the Arts and Humanities Research Council (AH/S011587/1).

Competing interests None declared.

Provenance and peer review Not commissioned; internally peer reviewed.



  13. (PDF) Rethinking Research Ethics in the Humanities: Principles and

    Rethinking Research Ethics: Principles and Recommendations 5 Rationale and methodology The purpose of the AHRC RRE-organised Seminar and Workshop #1 was to identify: (a) key questions and problems contributing to perturbing ethically-challenging qualitative research in the Humanities; and (b) the areas that the RRE network was to focus on.

  14. PDF 6 Research Papers 1n the Humanities

    6 Research Papers 1n the Humanities The term paper, or the library research paper, is a standard assignment of liberal arts courses. It is liberating in the sense that more than most assignments in the classroom, it allows you considerable flexibility in choosing your topic or your approach to your topic. ...

  15. Research in the Humanities

    The fields in the Humanities discipline generally include the visual and performing arts, philosophy, literature, religion, history, languages, art history, and classics. Although research methods differ among the Humanities, the Social Sciences, and the Sciences, any research project in any discipline starts with curiosity and a hypothesis.

  16. PDF Writing Research Proposals for Social Sciences and Humanities in a

    Figure 7-3 Research proposal progress signals 144 Figure 10-1 One thought, two ideas, four proposals 227 Boxes Box 3-1 From research area to research topic 28 Box 3-2 From research topic to general research question 29 Box 3-3 Examples of general research questions 30 Box 3-4 From general research question to specific research question

  17. Humanities Research

    Humanities research has many practical applications in various fields, including: Policy-making: Humanities research can inform policy-making by providing insights into social, cultural, and historical contexts. It can help policymakers understand the impact of policies on communities and identify potential unintended consequences.

  18. PDF Basic Research in the Humanities

    Basic Research in the Humanities (Definition): 1. The discovery, preservation, and communication of the historical and present record of human society; 2. The invention of methods for studying and interpreting that record; 3. The investigation and preservation of the languages and linguistic skills increasingly needed

  19. (PDF) Key concepts in humanities and social sciences

    PDF | This book is a collection of foundational concepts in the humanities and social sciences and will be of particular use to undergraduate students.... | Find, read and cite all the research ...

  20. The future of research assessment in the humanities: bottom-up

    Research assessment in the social sciences and humanities (SSH) is delicate. Assessment procedures meet strong criticisms from SSH scholars and bibliometric research shows that the methods that ...

  21. Organizing Research for Arts and Humanities Papers and Theses

    Note: these recommendations are geared toward researchers in the arts or humanities. Developing a research topic is an iterative process, even for a short paper. This is a process that emerges in stages, and one which requires critical (but not criticizing) engagement with the evidence, literature, and prior research. The evidence can be an ...

  22. (PDF) The Value of the Humanities

    Innovation: the humanities deal with questions of motivation, organi-. sation and action, which are essential components of creativity. and entrepreneurship, and so the humanities promote a ...

  23. Research humanities 4 .pdf

    View Research humanities (4).pdf from HU 345-102-MQ at Dawson College. Viewpoints The Role of Women in Conflict Resolution Galia Golan Prof. Galia Golan is a lecturer at the Lauder School

  24. Vernacular knowledge about contraception: an interdisciplinary

    Recent research points to social media as a driver in the rise of contraceptive misconceptions and misinformation. An attitudinal change towards hormonal contraception has also been documented, coupled with rising distrust of clinicians in favour of peer influencers,1 a narrative that has been echoed in mainstream media.2 However, the circulation of contraceptive narratives is not a new ...

  25. (PDF) Research issues in the humanities and social sciences in Africa

    Conducting research in humanities sciences in Iran is faces with different barriers, including the small number of researchers, financial resources shortage, weak research infrastructure, etc ...

  26. Journal of Measurement and Evaluation in Education and Humanities

    Request PDF | JOURNAL OF MEASUREMENT AND EVALUATION IN EDUCATION AND HUMANITIES | Abstract This paper examines the relationship between internal and external assessment of educational achievement ...