In This Article: Methodologies for Conducting Education Research

  • Introduction
  • General Overviews
  • Experimental Research
  • Quasi-Experimental Research
  • Hierarchical Linear Modeling
  • Survey Research
  • Assessment and Measurement
  • Qualitative Research Methodologies
  • Program Evaluation
  • Research Syntheses
  • Implementation

Related Articles

  • Action Research in Education
  • Data Collection in Educational Research
  • Educational Assessment
  • Educational Research Approaches: A Comparison
  • Educational Statistics for Longitudinal Research
  • Grounded Theory
  • Literature Reviews
  • Meta-Analysis and Research Synthesis in Education
  • Mixed Methods Research
  • Multivariate Research Methodology
  • Narrative Research in Education
  • Performance Objectives and Measurement
  • Performance-based Research Assessment in Higher Education
  • Qualitative Research Design
  • Quantitative Research Designs in Educational Research
  • Single-Subject Research Design
  • Social Network Analysis
  • Social Science and Education Research
  • Statistical Assumptions


Methodologies for Conducting Education Research
by Marisa Cannata
Last reviewed: 15 December 2011
Last modified: 15 December 2011
DOI: 10.1093/obo/9780199756810-0061

Education is a diverse field, and the methodologies used in education research are necessarily diverse as well. The reasons for this methodological diversity are many, including the fact that the field of education draws on a multitude of disciplines and that there are tensions between basic and applied research. For example, accepted methods of systematic inquiry in history, sociology, economics, and psychology vary, yet all of these disciplines help answer important questions posed in education. This methodological diversity has led to debates about the quality of education research and to the perception of shifting standards for quality research. The citations selected for inclusion in this article provide a broad overview of methodologies and of discussions of quality research standards across the different types of questions posed in educational research. The citations include summaries of ongoing debates, articles or books that have had a significant influence on education research, and guides for those who wish to implement particular methodologies. Most of the sections focus on specific methodologies and provide advice or examples for studies employing them.

The interdisciplinary nature of education has important implications for education research. There is no single best research design for all of the questions that guide education research. Throughout many, often heated, debates about methodology, the common strand is that research designs should follow from the research questions. The following works offer an introduction to the debates, divides, and difficulties of education research. Schoenfeld 1999, Mitchell and Haro 1999, and Shulman 1988 provide perspectives on diversity within the field of education and the implications of this diversity for debates about education research and for the difficulties of conducting such research. National Research Council 2002 outlines the principles of scientific inquiry and how they apply to education. Published around the time No Child Left Behind required education policies to be based on scientific research, this book laid the foundation for much of the current emphasis on experimental and quasi-experimental research in education. For another perspective on defining good education research, readers may turn to Hostetler 2005. Readers who want a general overview of various methodologies in education research and direction on how to choose among them should read Creswell 2009 and Green, et al. 2006. The American Educational Research Association (AERA), the main professional association focused on education research, has developed standards for how to report methods and findings in empirical studies; those wishing to follow those standards should consult American Educational Research Association 2006.

American Educational Research Association. 2006. Standards for reporting on empirical social science research in AERA publications. Educational Researcher 35.6: 33–40.

DOI: 10.3102/0013189X035006033

The American Educational Research Association is the professional association for researchers in education. Publications by AERA are a well-regarded source of research. This article outlines the requirements for reporting original research in AERA publications.

Creswell, J. W. 2009. Research design: Qualitative, quantitative, and mixed methods approaches. 3d ed. Los Angeles: SAGE.

Presents an overview of qualitative, quantitative and mixed-methods research designs, including how to choose the design based on the research question. This book is particularly helpful for those who want to design mixed-methods studies.

Green, J. L., G. Camilli, and P. B. Elmore. 2006. Handbook of complementary methods for research in education. Mahwah, NJ: Lawrence Erlbaum.

Provides a broad overview of several methods of educational research. The first part provides an overview of issues that cut across specific methodologies, and subsequent chapters delve into particular research approaches.

Hostetler, K. 2005. What is “good” education research? Educational Researcher 34.6: 16–21.

DOI: 10.3102/0013189X034006016

Goes beyond methodological concerns to argue that “good” educational research should also consider the conception of human well-being. By using a philosophical lens on debates about quality education research, this article is useful for moving beyond qualitative-quantitative divides.

Mitchell, T. R., and A. Haro. 1999. Poles apart: Reconciling the dichotomies in education research. In Issues in education research. Edited by E. C. Lagemann and L. S. Shulman, 42–62. San Francisco: Jossey-Bass.

Chapter outlines several dichotomies in education research, including the tension between applied research and basic research and between understanding the purposes of education and the processes of education.

National Research Council. 2002. Scientific research in education. Edited by R. J. Shavelson and L. Towne. Committee on Scientific Principles for Education Research. Center for Education. Division of Behavioral and Social Sciences and Education. Washington, DC: National Academy Press.

This book was released around the time the No Child Left Behind law directed that policy decisions should be guided by scientific research. It is credited with starting the current debate about methods in educational research and the preference for experimental studies.

Schoenfeld, A. H. 1999. The core, the canon, and the development of research skills: Issues in the preparation of education researchers. In Issues in education research. Edited by E. C. Lagemann and L. S. Shulman, 166–202. San Francisco: Jossey-Bass.

Describes difficulties in preparing educational researchers due to the lack of a core and a canon in education. While the focus is on preparing researchers, it provides valuable insight into why debates over education research persist.

Shulman, L. S. 1988. Disciplines of inquiry in education: An overview. In Complementary methods for research in education. Edited by R. M. Jaeger, 3–17. Washington, DC: American Educational Research Association.

Outlines what distinguishes research from other modes of disciplined inquiry and the relationship between academic disciplines, guiding questions, and methods of inquiry.



Introduction to Education Research

Sharon K. Park, Khanh-Van Le-Bucklin & Julie Youm

First Online: 29 November 2023

Educators rely on the discovery of new knowledge about teaching practices and frameworks to improve and evolve education for trainees. An important consideration when embarking on a career in education research is finding a scholarship niche. The education researcher can then develop a conceptual framework that describes the state of knowledge, identify gaps in understanding of the phenomenon or problem, and outline the methodological underpinnings of the research project. In response to Ernest Boyer's seminal report, Priorities of the Professoriate, research was conducted on the criteria and decision processes for grants and publications. Six standards, known as Glassick's criteria, provide a tangible measure by which educators can assess the quality and structure of their education research: clear goals, adequate preparation, appropriate methods, significant results, effective presentation, and reflective critique. Ultimately, the promise of education research is to realize advances and innovations for learners that are informed by evidence-based knowledge and practices.


References

Boyer EL. Scholarship reconsidered: priorities of the professoriate. Princeton: Carnegie Foundation for the Advancement of Teaching; 1990.


Munoz-Najar Galvez S, Heiberger R, McFarland D. Paradigm wars revisited: a cartography of graduate research in the field of education (1980–2010). Am Educ Res J. 2020;57(2):612–52.


Ringsted C, Hodges B, Scherpbier A. ‘The research compass’: an introduction to research in medical education: AMEE Guide no. 56. Med Teach. 2011;33(9):695–709.


Bordage G. Conceptual frameworks to illuminate and magnify. Med Educ. 2009;43(4):312–9.

Varpio L, Paradis E, Uijtdehaage S, Young M. The distinctions between theory, theoretical framework, and conceptual framework. Acad Med. 2020;95(7):989–94.

Ravitch SM, Riggins M. Reason & Rigor: how conceptual frameworks guide research. Thousand Oaks: Sage Publications; 2017.

Park YS, Zaidi Z, O'Brien BC. RIME foreword: what constitutes science in educational research? Applying rigor in our research approaches. Acad Med. 2020;95(11S):S1–5.

National Institute of Allergy and Infectious Diseases. Writing a winning application—find your niche. 2020a. https://www.niaid.nih.gov/grants-contracts/find-your-niche. Accessed 23 Jan 2022.

National Institute of Allergy and Infectious Diseases. Writing a winning application—conduct a self-assessment. 2020b. https://www.niaid.nih.gov/grants-contracts/winning-app-self-assessment. Accessed 23 Jan 2022.

Glassick CE, Huber MT, Maeroff GI. Scholarship assessed: evaluation of the professoriate. San Francisco: Jossey Bass; 1997.

Simpson D, Meurer L, Braza D. Meeting the scholarly project requirement-application of scholarship criteria beyond research. J Grad Med Educ. 2012;4(1):111–2. https://doi.org/10.4300/JGME-D-11-00310.1.


Fincher RME, Simpson DE, Mennin SP, Rosenfeld GC, Rothman A, McGrew MC et al. The council of academic societies task force on scholarship. Scholarship in teaching: an imperative for the 21st century. Academic Medicine. 2000;75(9):887–94.

Hutchings P, Shulman LS. The scholarship of teaching: new elaborations and developments. Change. 1999;11–5.


Author information

Authors and Affiliations

School of Pharmacy, Notre Dame of Maryland University, Baltimore, MD, USA

Sharon K. Park

University of California, Irvine School of Medicine, Irvine, CA, USA

Khanh-Van Le-Bucklin & Julie Youm


Corresponding author

Correspondence to Sharon K. Park.

Editor information

Editors and Affiliations

Johns Hopkins University School of Medicine, Baltimore, MD, USA

April S. Fitzgerald

Johns Hopkins Bloomberg School of Public Health, Baltimore, MD, USA

Gundula Bosch


Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this chapter

Park, S.K., Le-Bucklin, KV., Youm, J. (2023). Introduction to Education Research. In: Fitzgerald, A.S., Bosch, G. (eds) Education Scholarship in Healthcare. Springer, Cham. https://doi.org/10.1007/978-3-031-38534-6_2


DOI: https://doi.org/10.1007/978-3-031-38534-6_2

Published: 29 November 2023

Publisher Name: Springer, Cham

Print ISBN: 978-3-031-38533-9

Online ISBN: 978-3-031-38534-6



Research Methods in Education, 8th Edition


Description

This thoroughly updated and extended eighth edition of the long-running bestseller Research Methods in Education covers the whole range of methods employed by educational research at all stages. Its five main parts cover: the context of educational research; research design; methodologies for educational research; methods of data collection; and data analysis and reporting. It continues to be the go-to text for students, academics and researchers who are undertaking, understanding and using educational research, and has been translated into several languages. It offers plentiful and rich practical advice, underpinned by clear theoretical foundations, research evidence and up-to-date references, and it raises key issues and questions for researchers planning, conducting, reporting and evaluating research.

This edition contains new chapters on:

  • Mixed methods research
  • The role of theory in educational research
  • Ethics in Internet research
  • Research questions and hypotheses
  • Internet surveys
  • Virtual worlds, social network software and netography in educational research
  • Using secondary data in educational research
  • Statistical significance, effect size and statistical power
  • Beyond mixed methods: using Qualitative Comparative Analysis (QCA) to integrate cross-case and within-case analyses

Research Methods in Education is essential reading for both the professional researcher and anyone involved in educational and social research. The book is supported by a wealth of online materials, including PowerPoint slides, useful weblinks, practice data sets, downloadable tables and figures from the book, and a virtual, interactive, self-paced training programme in research methods. These resources can be found at: www.routledge.com/cw/cohen.
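Among the new chapter topics above are statistical significance, effect size, and statistical power. Purely as an illustration of those three ideas (a minimal sketch, not an example from the book), the Python snippet below compares two invented groups of test scores with a t-test, computes Cohen's d, and estimates how many participants per group would be needed to detect that effect at 80% power; it assumes NumPy, SciPy, and statsmodels are installed.

```python
# Illustrative only: invented scores; shows significance, effect size, and power in one pass.
import numpy as np
from scipy import stats
from statsmodels.stats.power import TTestIndPower

rng = np.random.default_rng(42)
control = rng.normal(loc=70, scale=10, size=40)    # hypothetical scores, usual teaching
treatment = rng.normal(loc=75, scale=10, size=40)  # hypothetical scores, new method

# Statistical significance: independent-samples t-test
t_stat, p_value = stats.ttest_ind(treatment, control)

# Effect size: Cohen's d with a pooled standard deviation (equal group sizes assumed)
pooled_sd = np.sqrt((control.var(ddof=1) + treatment.var(ddof=1)) / 2)
cohens_d = (treatment.mean() - control.mean()) / pooled_sd

# Statistical power: sample size per group needed to detect this effect with 80% power
n_per_group = TTestIndPower().solve_power(effect_size=cohens_d, alpha=0.05, power=0.80)

print(f"t = {t_stat:.2f}, p = {p_value:.3f}, d = {cohens_d:.2f}, "
      f"required n per group ~ {n_per_group:.0f}")
```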

About the Authors

Louis Cohen is Emeritus Professor of Education at Loughborough University, UK. Lawrence Manion was Principal Lecturer in Music at Manchester Metropolitan University, UK. Keith Morrison is Professor and Advisor for Institutional Development at Macau University of Science and Technology, China.

Critics' Reviews

"Very much still the key text for 'all' education students and researchers. Cohen et al continue to update Research Methods in Education, with new theoretical, ethical, virtual and mixed methods information. It's worth noting the impressive web page and links to materials for all chapters which is still the benchmark when looking at the competition for books in this area of social and education research." (Dr. Richard Race, Senior Lecturer in Education, Roehampton University, UK)

"A clear enhancement on the already well established text. The new edition addresses an important need to explain research design and question setting in more detail, helping guide the newcomer through the research process from inception through analysis to reporting." (David Lundie, Associate Professor of Education, University of St Mark & St John, UK)

"Research Methods in Education is a unique book for everybody who has to undertake educational research projects. The book gives an in depth understanding of quantitative and qualitative research designs and offers a practical guide for data collection and data analysis. It is an essential 'friend' for teacher students from various disciplines who are not familiar with social science research." (Dr Ellen P.W.A. Jansen, Associate Professor Teacher Education, University of Groningen, The Netherlands)

"Research Methods in Education continues to offer an excellent route map, a well-structured and inspiring travel guide, for students engaging in research. It works across levels, and while it provides clarity for the beginning researcher there is plenty here to aid the seasoned researcher with an open mind to new approaches and emerging practices. A superb text that provides guidance for my own research as well as for students and partners in research projects." (Peter Shukie, Lecturer in Education Studies and Academic Lead in Digital Innovation, University Centre at Blackburn College, UK)

"Research Methods in Education is, besides being my personal favorite research methods book, a deep as well as a broad handbook useful both for undergraduate teacher education students as well as researchers and PhD students within educational sciences. In this new edition, new chapters are added emphasising both quantitative and qualitative methods in combination with thought-through discussions about how to mix them. The book can be used when planning a project and then throughout the whole research process and is therefore a complete methods book." (Karolina Broman, Senior Lecturer in Chemistry Education, Umeå University, Sweden)

"Comprehensive, well written and relevant: the 8th edition of Research Methods in Education offers the background for methods courses at different levels. The new edition keeps the strong focus on education studies. Excellent extensions will make the book an even more popular basis for classes on both qualitative and quantitative methods." (Felix Weiss, Assistant Professor for Sociology of Education, Aarhus University, Denmark)

"Research Methods in Education, 8th Edition is an up-to-date, one-stop shop, taking education research students from conceptualization to presentation. With this book on your library shelf, you are good to go." (Dr Fiona McGarry, Lecturer in Research Methods, University of Dundee, UK)

"The 8th edition of Research Methods in Education contains a wealth of up-to-the-minute information and guidance on educational research which will be of immense value to researchers at all stages of their careers and across the education domain from early years settings to higher education. As research and education move into increasingly fluid and complex dimensions, Research Methods in Education will support students, researchers and practitioners in charting a course through these changing waters as they seek to create new knowledge about effective teaching and deepen our understanding of how learners learn." (Julia Flutter, A Director of the Cambridge Primary Review Trust, Faculty of Education, University of Cambridge, UK)

"As a doctoral supervisor I know that my students routinely return to Research Methods in Education as they develop their own research projects. This text has always been a mainstay on our reading lists but this new edition now features additional research topics and new perspectives on a wider range of research methods. As with previous editions this book is clearly organised and well written and appeals to a wide audience of experienced and novice researchers alike." (Dr Val Poultney, Associate Professor, University of Derby, UK)


Research Methods and Methodologies in Education

  • Robert Coe - Durham University, UK
  • Michael Waring - University of Leeds, UK 
  • Larry V Hedges - Northwestern University, USA
  • Laura Day Ashley - University of Birmingham, UK
Description

The #1 resource for carrying out educational research as part of postgraduate study.

High-quality educational research requires careful consideration of every aspect of the process. This all-encompassing textbook written by leading international experts gives students and early career researchers a considered overview of principles that underpin research, and key qualitative, quantitative and mixed methods for research design, data collection and analysis.

This third edition includes four new chapters:

  • Disseminating your research
  • Data science and computational research methods
  • Observational methods
  • Analysis of variance (ANOVA)

Plus a new Research essentials feature that highlights key ‘must-haves’ or misconceptions relating to each methodological approach, research design or analytical tool discussed.

This is essential reading for postgraduate students on education courses and early career researchers looking to sharpen their research practice.
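One of the new chapters listed in the description above covers analysis of variance (ANOVA). As a rough illustration of what that technique computes, and not as material from the book itself, here is a minimal one-way ANOVA in Python on three invented groups of scores (SciPy assumed available).

```python
# Illustrative only: three invented "teaching condition" groups of exam scores.
from scipy import stats

lecture = [62, 70, 68, 74, 66, 71]
flipped = [75, 80, 72, 78, 74, 77]
project = [70, 73, 69, 76, 72, 75]

# One-way ANOVA tests whether the three group means differ more than chance would allow.
f_stat, p_value = stats.f_oneway(lecture, flipped, project)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")

# A significant F only says the means differ somewhere; post-hoc tests
# (e.g. Tukey's HSD) would be needed to say which pairs differ.
```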

Logically structured, clear and informative. This book has a significant chapter on academic ethics which will be essential not only for BA students but also for PhD students and early career researchers.

This book covers a wide range of issues in relation to educational research - there is something in there for all my students. Each chapter is short and to the point.

Research Methods and Methodologies in Education is a book I have used repeatedly since teaching research on Foundation degree and BA top-up.

This book is beneficial for doctoral students because it provides clear and concise details on the steps to performing scholarly research designs. I am highly recommending this book for my courses.


A Practical Guide to Teaching Research Methods in Education brings together more than 60 faculty experts. The contributors share detailed lesson plans about selected research concepts or skills in education and related disciplines, as well as discussions of the intellectual preparation needed to effectively teach the lesson.

Grounded in the wisdom of practice from exemplary and award-winning faculty from diverse institution types, career stages, and demographic backgrounds, this book draws on both the practical and cognitive elements of teaching educational (and related) research to students in higher education today. The book is divided into eight sections, covering the following key elements within education (and related) research: problems and research questions, literature reviews and theoretical frameworks, research design, quantitative methods, qualitative methods, mixed methods, findings and discussions, and special topics, such as student identity development, community and policy engaged research, and research dissemination. Within each section, individual chapters specifically focus on skills and perspectives needed to navigate the complexities of educational research. The concluding chapter reflects on how teachers of research also need to be learners of research, as faculty continuously strive for mastery, identity, and creativity in how they guide our next generation of knowledge producers through the research process.

Undergraduate and graduate professors of education (and related) research courses, dissertation chairs/committee members, faculty development staff members, and graduate students would all benefit from the lessons and expert commentary contained in this book.

Table of Contents

Introduction

Section I: Topics, Problems, and Research Questions
  • Chapter 1: Introduction to Section I
  • Chapter 2: From Personal Passion to Hot Topics
  • Chapter 3: Articulating a Research Problem and Its Rationale
  • Chapter 4

Section II: Literature Review and Theoretical/Conceptual Framework
  • Chapter 5: Introduction to Section II
  • Chapter 6: Connecting Pieces to the Puzzle
  • Chapter 7: The Candy Sort
  • Chapter 8: Theoretical and Conceptual Frameworks

Section III: Research Design
  • Chapter 9: Introduction to Section III
  • Chapter 10: Visualize Your Research Design
  • Chapter 11: Let's Road Trip
  • Chapter 12: The Self and Research
  • Chapter 13: Trustworthiness and Ethics in Research

Section IV: Quantitative Methods
  • Chapter 14: Introduction to Section IV
  • Chapter 15: Making Sense of Multivariate Analysis
  • Chapter 16: Linear Regression
  • Chapter 17: Hands-On Application of Exploratory Factor Analysis in Educational Research
  • Chapter 18: Trending Topic

Section V: Qualitative Methods
  • Chapter 19: Introduction to Section V
  • Chapter 20: Listening Deeply
  • Chapter 21: Write What You See, Not What You Know
  • Chapter 22: On the Recovery of Black Life
  • Chapter 23: Emerging Approaches
  • Chapter 24: Exploring How Epistemologies Guide the Process of Coding Data and Developing Themes

Section VI: Mixed Methods
  • Chapter 25: Introduction to Section VI
  • Chapter 26: Low Hanging Fruit, Ripe for Inquiry
  • Chapter 27: Creating Your Masterpiece
  • Chapter 28: Presenting and Visualizing a Mixed Methods Study

Section VII: Findings and Discussion
  • Chapter 29: Introduction to Section VII
  • Chapter 30: An Introduction to Regression Using Critical Quantitative Thinking
  • Chapter 31: Show the Story
  • Chapter 32: Block by Block
  • Chapter 33: Making the Theoretical Practical
  • Chapter 34: The Donut Memo

Section VIII: Special Topics
  • Chapter 35: Introduction to Section VIII
  • Chapter 36: Scholarly Identity Development of Undergraduate Researchers
  • Chapter 37: Developing Students' Cultural Competence through Video Interviews
  • Chapter 38: Preparing Students for Community-Engaged Scholarship
  • Chapter 39: Teaching Policy Implications
  • Chapter 40: Introducing Scholars to Public Writing

Closing Words


Research Methods – Types, Examples and Guide

Definition:

Research Methods refer to the techniques, procedures, and processes used by researchers to collect, analyze, and interpret data in order to answer research questions or test hypotheses. The methods used in research can vary depending on the research questions, the type of data that is being collected, and the research design.

Types of Research Methods

Types of Research Methods are as follows:

Qualitative Research Method

Qualitative research methods are used to collect and analyze non-numerical data. This type of research is useful when the objective is to explore the meaning of phenomena, understand the experiences of individuals, or gain insights into complex social processes. Qualitative research methods include interviews, focus groups, ethnography, and content analysis.
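As a toy illustration of the content-analysis idea mentioned above, the Python sketch below tallies analyst-defined codes across a few invented interview excerpts. Real qualitative analysis is interpretive and iterative; this only shows how a simple keyword-based frequency count might be automated, and the excerpts, codes, and keywords are all made up.

```python
# Illustrative only: a minimal keyword-based coding pass over invented excerpts.
from collections import Counter

excerpts = [
    "I felt supported by my family during treatment",
    "The waiting times made me anxious",
    "My nurse explained everything, which reduced my anxiety",
]

# Hypothetical coding scheme: each code is triggered by any of its keywords.
codes = {
    "support": ["supported", "support", "family", "nurse"],
    "anxiety": ["anxious", "anxiety", "worried"],
}

tally = Counter()
for text in excerpts:
    lowered = text.lower()
    for code, keywords in codes.items():
        if any(word in lowered for word in keywords):
            tally[code] += 1

print(dict(tally))  # counts of excerpts touching each code
```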

Quantitative Research Method

Quantitative research methods are used to collect and analyze numerical data. This type of research is useful when the objective is to test a hypothesis, determine cause-and-effect relationships, and measure the prevalence of certain phenomena. Quantitative research methods include surveys, experiments, and secondary data analysis.
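For a concrete, simplified sense of what quantitative analysis of survey data can look like, the sketch below summarizes an invented 1-to-5 rating item and tests whether the ratings correlate with reported weekly study hours. The variables and numbers are illustrative only; NumPy and SciPy are assumed.

```python
# Illustrative only: invented survey responses from ten respondents.
import numpy as np
from scipy import stats

ratings = np.array([4, 5, 3, 4, 2, 5, 4, 3, 4, 5])       # satisfaction ratings (1-5)
study_hours = np.array([6, 8, 4, 5, 3, 9, 7, 4, 6, 8])   # self-reported weekly study hours

# Descriptive statistics for the rating item
print(f"mean rating = {ratings.mean():.2f}, sd = {ratings.std(ddof=1):.2f}")

# Does satisfaction co-vary with study hours? Pearson correlation as one simple check.
r, p = stats.pearsonr(ratings, study_hours)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")
```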

Mixed Method Research

Mixed Method Research refers to the combination of both qualitative and quantitative research methods in a single study. This approach aims to overcome the limitations of each individual method and to provide a more comprehensive understanding of the research topic. This approach allows researchers to gather both quantitative data, which is often used to test hypotheses and make generalizations about a population, and qualitative data, which provides a more in-depth understanding of the experiences and perspectives of individuals.

Key Differences Between Research Methods

The following table shows the key differences between quantitative, qualitative, and mixed methods research (row labels inferred from the cell contents):

Research Method | Quantitative | Qualitative | Mixed Methods
Purpose | To measure and quantify variables | To understand the meaning and complexity of phenomena | To integrate both quantitative and qualitative approaches
Focus | Typically focused on testing hypotheses and determining cause-and-effect relationships | Typically exploratory and focused on understanding the subjective experiences and perspectives of participants | Can be either, depending on the research design
Data collection | Usually involves standardized measures or surveys administered to large samples | Often involves in-depth interviews, observations, or analysis of texts or other forms of data | Usually involves a combination of quantitative and qualitative methods
Data analysis | Typically involves statistical analysis to identify patterns and relationships in the data | Typically involves thematic analysis or other qualitative methods to identify themes and patterns in the data | Usually involves both quantitative and qualitative analysis
Strengths | Can provide precise, objective data that can be generalized to a larger population | Can provide rich, detailed data that can help understand complex phenomena in depth | Can combine the strengths of both quantitative and qualitative approaches
Limitations | May not capture the full complexity of phenomena, and may be limited by the quality of the measures used | May be subjective and may not be generalizable to larger populations | Can be time-consuming and resource-intensive, and may require specialized skills
Typical examples | Surveys, experiments, correlational studies | Interviews, focus groups, ethnography | Sequential explanatory design, convergent parallel design, explanatory sequential design

Examples of Research Methods

Examples of Research Methods are as follows:

Qualitative Research Example:

A researcher wants to study the experience of cancer patients during their treatment. They conduct in-depth interviews with patients to gather data on their emotional state, coping mechanisms, and support systems.

Quantitative Research Example:

A company wants to determine the effectiveness of a new advertisement campaign. They survey a large group of people, asking them to rate their awareness of the product and their likelihood of purchasing it.

Mixed Research Example:

A university wants to evaluate the effectiveness of a new teaching method in improving student performance. They collect both quantitative data (such as test scores) and qualitative data (such as feedback from students and teachers) to get a complete picture of the impact of the new method.
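One possible way to organize that kind of mixed-methods evaluation in code is sketched below: a quantitative comparison of invented test scores alongside a rough tally of themes in invented student feedback. It is a simplification for illustration, not a recipe; the scores, comments, and theme keywords are all made up.

```python
# Illustrative only: pairing a score comparison with a crude qualitative tally.
from collections import Counter
from scipy import stats

old_method_scores = [68, 72, 65, 70, 74, 69]
new_method_scores = [75, 78, 73, 80, 77, 76]
t, p = stats.ttest_ind(new_method_scores, old_method_scores)  # quantitative strand

feedback = [
    "The new activities kept me engaged",
    "I understood the material better but wanted more practice time",
    "Group work helped, although pacing felt fast",
]
themes = Counter()
for comment in feedback:            # qualitative strand, reduced to keyword matching
    text = comment.lower()
    if "engag" in text:
        themes["engagement"] += 1
    if "understood" in text or "helped" in text:
        themes["perceived learning"] += 1
    if "time" in text or "pacing" in text or "fast" in text:
        themes["pacing concerns"] += 1

print(f"t = {t:.2f}, p = {p:.3f}; themes = {dict(themes)}")
```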

Applications of Research Methods

Research methods are used in various fields to investigate, analyze, and answer research questions. Here are some examples of how research methods are applied in different fields:

  • Psychology: Research methods are widely used in psychology to study human behavior, emotions, and mental processes. For example, researchers may use experiments, surveys, and observational studies to understand how people behave in different situations, how they respond to different stimuli, and how their brains process information.
  • Sociology: Sociologists use research methods to study social phenomena, such as social inequality, social change, and social relationships. Researchers may use surveys, interviews, and observational studies to collect data on social attitudes, beliefs, and behaviors.
  • Medicine: Research methods are essential in medical research to study diseases, test new treatments, and evaluate their effectiveness. Researchers may use clinical trials, case studies, and laboratory experiments to collect data on the efficacy and safety of different medical treatments.
  • Education: Research methods are used in education to understand how students learn, how teachers teach, and how educational policies affect student outcomes. Researchers may use surveys, experiments, and observational studies to collect data on student performance, teacher effectiveness, and educational programs.
  • Business: Research methods are used in business to understand consumer behavior, market trends, and business strategies. Researchers may use surveys, focus groups, and observational studies to collect data on consumer preferences, market trends, and industry competition.
  • Environmental science: Research methods are used in environmental science to study the natural world and its ecosystems. Researchers may use field studies, laboratory experiments, and observational studies to collect data on environmental factors, such as air and water quality, and the impact of human activities on the environment.
  • Political science: Research methods are used in political science to study political systems, institutions, and behavior. Researchers may use surveys, experiments, and observational studies to collect data on political attitudes, voting behavior, and the impact of policies on society.

Purpose of Research Methods

Research methods serve several purposes, including:

  • Identify research problems: Research methods are used to identify research problems or questions that need to be addressed through empirical investigation.
  • Develop hypotheses: Research methods help researchers develop hypotheses, which are tentative explanations for the observed phenomenon or relationship.
  • Collect data: Research methods enable researchers to collect data in a systematic and objective way, which is necessary to test hypotheses and draw meaningful conclusions.
  • Analyze data: Research methods provide tools and techniques for analyzing data, such as statistical analysis, content analysis, and discourse analysis.
  • Test hypotheses: Research methods allow researchers to test hypotheses by examining the relationships between variables in a systematic and controlled manner.
  • Draw conclusions: Research methods facilitate the drawing of conclusions based on empirical evidence and help researchers make generalizations about a population based on their sample data.
  • Enhance understanding: Research methods contribute to the development of knowledge and enhance our understanding of various phenomena and relationships, which can inform policy, practice, and theory.

When to Use Research Methods

Research methods are used when you need to gather information or data to answer a question or to gain insights into a particular phenomenon.

Here are some situations when research methods may be appropriate:

  • To investigate a problem: Research methods can be used to investigate a problem or a research question in a particular field. This can help in identifying the root cause of the problem and developing solutions.
  • To gather data: Research methods can be used to collect data on a particular subject. This can be done through surveys, interviews, observations, experiments, and more.
  • To evaluate programs: Research methods can be used to evaluate the effectiveness of a program, intervention, or policy. This can help in determining whether the program is meeting its goals and objectives.
  • To explore new areas: Research methods can be used to explore new areas of inquiry or to test new hypotheses. This can help in advancing knowledge in a particular field.
  • To make informed decisions: Research methods can be used to gather information and data to support informed decision-making. This can be useful in various fields such as healthcare, business, and education.

Advantages of Research Methods

Research methods provide several advantages, including:

  • Objectivity: Research methods enable researchers to gather data in a systematic and objective manner, minimizing personal biases and subjectivity. This leads to more reliable and valid results.
  • Replicability: A key advantage of research methods is that they allow for replication of studies by other researchers. This helps to confirm the validity of the findings and ensures that the results are not specific to the particular research team.
  • Generalizability: Research methods enable researchers to gather data from a representative sample of the population, allowing for generalizability of the findings to a larger population. This increases the external validity of the research.
  • Precision: Research methods enable researchers to gather data using standardized procedures, ensuring that the data is accurate and precise. This allows researchers to make accurate predictions and draw meaningful conclusions.
  • Efficiency: Research methods enable researchers to gather data efficiently, saving time and resources. This is especially important when studying large populations or complex phenomena.
  • Innovation: Research methods enable researchers to develop new techniques and tools for data collection and analysis, leading to innovation and advancement in the field.

About the author


Muhammad Hassan

Researcher, Academic Writer, Web developer


American Psychological Association

Methods for Quantitative Research in Psychology

August 2023

This seven-hour course provides a comprehensive exploration of research methodologies, beginning with the foundational steps of the scientific method. Students will learn about hypotheses, experimental design, data collection, and the analysis of results. Emphasis is placed on defining variables accurately, distinguishing between independent, dependent, and controlled variables, and understanding their roles in research.

The course delves into major research designs, including experimental, correlational, and observational studies. Students will compare and contrast these designs, evaluating their strengths and weaknesses in various contexts. This comparison extends to the types of research questions scientists pose, highlighting how different designs are suited to different inquiries.

A critical component of the course is developing the ability to judge the quality of sources for literature reviews. Students will learn criteria for evaluating the credibility, relevance, and reliability of sources, ensuring that their understanding of the research literature is built on a solid foundation.

Reliability and validity are key concepts addressed in the course. Students will explore what it means for an observation to be reliable, focusing on consistency and repeatability. They will also compare and contrast different forms of validity, such as internal, external, construct, and criterion validity, and how these apply to various research designs.
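As a concrete, hedged illustration of the reliability concept discussed above, one widely used index of internal consistency is Cronbach's alpha. The Python sketch below computes it from an invented matrix of item responses; it is not course material, just a worked example of the formula.

```python
# Illustrative only: Cronbach's alpha for an invented 8-respondent x 5-item scale.
import numpy as np

scores = np.array([        # rows = respondents, columns = scale items (1-5 responses)
    [4, 5, 4, 4, 5],
    [3, 3, 4, 3, 3],
    [5, 5, 5, 4, 5],
    [2, 3, 2, 3, 2],
    [4, 4, 3, 4, 4],
    [3, 2, 3, 3, 3],
    [5, 4, 5, 5, 4],
    [4, 4, 4, 5, 4],
])

k = scores.shape[1]                               # number of items
item_variances = scores.var(axis=0, ddof=1)       # variance of each item
total_variance = scores.sum(axis=1).var(ddof=1)   # variance of the total score
alpha = (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

print(f"Cronbach's alpha = {alpha:.2f}")  # values around 0.8+ are commonly read as acceptable
```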

The course concepts are thoroughly couched in examples drawn from the psychological research literature. By the end of the course, students will be equipped with the skills to design robust research studies, critically evaluate sources, and understand the nuances of reliability and validity in scientific research. This knowledge will be essential for conducting high-quality research and contributing to the scientific community.

Learning objectives

  • Describe the steps of the scientific method.
  • Specify how variables are defined.
  • Compare and contrast the major research designs.
  • Explain how to judge the quality of a source for a literature review.
  • Compare and contrast the kinds of research questions scientists ask.
  • Explain what it means for an observation to be reliable.
  • Compare and contrast forms of validity as they apply to the major research designs.

This program does not offer CE credit.


  • Open access
  • Published: 28 August 2024

The design, implementation, and evaluation of a blended (in-person and virtual) Clinical Competency Examination for final-year nursing students

  • Rita Mojtahedzadeh 1 ,
  • Tahereh Toulabi 2 , 3 &
  • Aeen Mohammadi 1  

BMC Medical Education, volume 24, Article number: 936 (2024)

Abstract

Studies have reported mixed results for different methods of evaluating clinical competency. Therefore, this study aimed to design, implement, and evaluate a blended (in-person and virtual) Clinical Competency Examination (CCE) for final-year nursing students.

This interventional study was conducted in two semesters of 2020–2021 using an educational action research method in the nursing and midwifery faculty. Thirteen faculty members and 84 final-year nursing students were included in the study using a census method. Eight programs and related activities were designed and conducted during the examination process. Students completed the Spielberger Anxiety Inventory before the examination, and both faculty members and students completed the Acceptance and Satisfaction questionnaire.

Analysis of the focus group discussions and reflections indicated that the virtual CCE could not adequately assess clinical skills, so it was decided that the CCE for final-year nursing students would be conducted using a blended method. The activities required for the examination were designed and implemented based on action plans, and anxiety and satisfaction were evaluated as study outcomes. There was no statistically significant difference in overt, covert, or overall anxiety scores between the in-person and virtual sections of the examination (p > 0.05). The mean (SD) acceptance and satisfaction scores for students in the virtual, in-person, and blended sections were 25.49 (4.73), 27.60 (4.70), and 25.57 (4.97) out of 30 points, respectively, with significantly higher scores in the in-person section than in the other sections (p = 0.008). The mean (SD) acceptance and satisfaction scores for faculty members were 30.31 (4.47) in the virtual, 29.86 (3.94) in the in-person, and 30.00 (4.16) in the blended section, out of 33, with no significant difference between the three sections (p = 0.864).

The blended evaluation of nursing students' clinical competency was successfully implemented and removed the obstacle to students' graduation. It is therefore suggested that the blended method be used instead of traditional in-person or entirely virtual exams during epidemics, or depending on conditions, facilities, and human resources. The use of patient simulation and virtual reality, and the development of the necessary virtual and in-person training infrastructure, are also recommended for future research. Furthermore, given that students' acceptance of traditional in-person exams is higher, virtual teaching strategies need further development.

Introduction

The primary mission of the nursing profession is to educate competent, capable, and qualified nurses with the knowledge and skills needed to provide quality nursing care that preserves and improves community health [1]. Clinical education is one of the most essential components of nursing education, in which students gain clinical experience by interacting with actual patients and addressing real problems; assessing clinical skills is therefore very challenging. The main goal of educational evaluation is to ensure and enhance the quality of the academic program. In this regard, evaluating learners' performance is one of the most critical and sensitive aspects of the teaching and learning process and a fundamental element of any educational program [2]. This study falls within the area of educational evaluation.

Various methods are used to evaluate nursing students. The Objective Structured Clinical Examination (OSCE) is a valid and reliable method for assessing clinical competence [1, 2]. In the last twenty years, the use of the OSCE has increased significantly in evaluating medical and paramedical students to overcome the limitations of traditional practical evaluation systems [3, 4]. Its advantages include rapid feedback, uniform conditions for all examinees, and conditions close to reality; its disadvantages include its time-consuming nature and the substantial personnel and equipment it requires [5, 6]. Additionally, some studies have shown that the method is anxiety-provoking for some students, and that time constraints, being observed by the evaluator, and other factors can cause dissatisfaction [7, 8].

However, other studies have reported that this method is not associated with high levels of stress among students [9] and even yields higher satisfaction than traditional evaluation methods [4]. In addition, during the COVID-19 pandemic, problems such as overcrowding and student quarantine during the exam arose. Therefore, reducing time and costs, eliminating or shortening the tiring quarantine period, optimizing the exam, using all available facilities to simulate the clinical environment, adopting innovative methods of administration, reducing stress, increasing satisfaction, and ultimately preventing the transmission of COVID-19 are significant needs that warrant further investigation.

Studies show that the need to use virtual environments as an alternative is strongly felt [10, 11, 12]. In the fall of 2009, following the H1N1 outbreak, educational classes in the United States were held virtually [13]. Similarly, in 2005, during Hurricane Katrina, 27 universities along the Gulf Coast used emergency virtual education and evaluation [14].

One of the challenges faced by healthcare providers in Iran, as in most countries, especially during the COVID-19 outbreak, was the shortage of nursing staff [15, 16]. In addition, conducting the CCE for final-year students, who would soon be job seekers, in the Clinical Skills Center involved problems such as student overcrowding and the need for quarantine during the OSCE. This problem has been reported not only at our institution but also in other countries [17]. The intelligent use of technology can solve many of these problems, and almost all educational institutions quickly began shifting their policy paradigms to introduce online teaching and evaluation methods [18, 19].

During the COVID-19 pandemic, for the first time, this exam was held virtually in our school. However, feedback from professors and students and the experiences of researchers have shown that the virtual exam can only partially evaluate clinical and practical skills in some stations, such as basic skills, resuscitation, and pediatrics [ 20 ].

Additionally, using the OSCE in skills assessment facilitates the evaluation of psychomotor skills, knowledge, and attitudes and helps identify strengths and weaknesses [21]. Clinical competency is a combination of theoretical knowledge and clinical skills. An effective blended method focused on the quality and safety of healthcare, one that measures students' clinical skills and theoretical expertise more accurately across in-person and virtual environments, is therefore essential. Achieving this goal required the participation of students, professors, managers, education and training staff, and the Clinical Skills Center. Accordingly, the Clinical Competency Examination (CCE) for nursing students in our nursing and midwifery school was organized as an educational action research process to design, implement, and evaluate a blended method. Carrying out this process during the COVID-19 pandemic, when a fully in-person exam was impossible, helped improve the quality of the exam and address its limitations while providing the evaluation students needed.

The innovation of this research lies in evaluating the clinical competency of final-year nursing students using a blended method that focuses on clinical and practical aspects. In our searches, only a few studies on virtual exams and simulations were found, and no comparable study using a blended method was identified.

The research investigates the scientific and clinical abilities of nursing students through the clinical competency exam. This exam, traditionally administered in person, is a crucial milestone for final-year nursing students, marking their readiness for graduation. However, the unforeseen circumstances of the COVID-19 pandemic and the resulting restrictions rendered in-person exams impractical in 2020. This necessitated a swift and significant transition to an online format, a decision that has profound implications for the future of nursing education. While the adoption of online assessment was a necessary step to ensure student graduation and address the nursing workforce shortage during the pandemic, it was not without its challenges. The accurate assessment of clinical skills, such as dressing and CPR, proved to be a significant hurdle. This underscored the urgent need for a change in the exam format, prompting a deeper exploration of innovative solutions.

To address these problems, the research was conducted collaboratively with stakeholders, considering the context and necessity for change in exam administration. Employing an Action Research (AR) approach, a blend of online and in-person exam modalities was adopted. Necessary changes were implemented through a cyclic process involving problem identification, program design, implementation, reflection, and continuous evaluation.

The research began by posing the following questions:

What are the problems of conducting the CCE for final-year nursing students during COVID-19?

How can these problems be addressed?

What are the solutions and suggestions from the involved stakeholders?

How can the CCE be designed, implemented, and evaluated?

What is the impact of exam type on student anxiety and satisfaction?

These questions guided the research in exploring the complexities of administering the CCE amidst the COVID-19 pandemic and in devising practical solutions to ensure the validity and reliability of the assessment while meeting stakeholders’ needs.

Materials and methods

Research setting, expert panel members, job analysis, and role delineation.

This action research was conducted at the Nursing and Midwifery School of Lorestan University of Medical Sciences, with a history of approximately 40 years. The school accommodates 500 undergraduate and graduate nursing students across six specialized fields, with 84 students enrolled in their final year of undergraduate studies. Additionally, the school employs 26 full-time faculty members in nursing education departments.

An expert panel was assembled, consisting of faculty members specializing in various areas, including medical-surgical nursing, psychiatric nursing, community health nursing, pediatric nursing, and intensive care nursing. The panel also included educational department managers and the examination department supervisor. Through focused group discussions, the panel identified and examined issues regarding the exam format, and members proposed various solutions. Subsequently, after analyzing the proposed solutions and drawing upon the panel members’ experiences, specific roles for each member were delineated.

Sampling and participant selection

Given the nature of the research, purposive sampling was employed, ensuring that all individuals involved in the design, implementation, and evaluation of the exam participated in this study.

The participants in this study included final-year nursing students, faculty members, clinical skills center experts, the dean of the school, the educational deputy, group managers, and the exam department head. In the outcome evaluation phase, 13 faculty members (3 men and 10 women; 2 instructors and 11 assistant professors) completed the questionnaires for both the in-person and virtual sections (26 responses in total), and 84 final-year nursing students (37 women and 47 men), enrolled by census over the two semesters of 2020–2021, completed the questionnaires.

Data collection tools

In order to enhance the validity and credibility of the study and thoroughly examine the results, this study utilized a triangulation method consisting of demographic information, focus group discussions, the Spielberger Anxiety Scale questionnaire, and an Acceptance and Satisfaction Questionnaire.

Demographic information

A questionnaire was used to gather demographic information from both students and faculty members. For students, this included age, gender, and place of residence, while for faculty members, it included age, gender, field of study, and employment status.

Focus group discussion

Multiple focus group discussions were conducted with professors, administrators, experts, and students. These discussions were held via platforms such as WhatsApp and Skype and in in-person meetings that adhered to health protocols. The researcher guided the talks toward the research objectives and raised fundamental questions, such as describing the strengths and weaknesses of the previous exam, determining how to conduct the CCE given the COVID-19 situation, deciding which stations would be virtual or in-person, specifying the evaluation checklists for the stations, and explaining the weighting and scoring of each station.

Spielberger anxiety scale questionnaire

This study used the Spielberger Anxiety Questionnaire to measure students' overt and covert anxiety levels. This questionnaire is an internationally standardized tool known as the STAI that measures both overt (state) and covert (trait) anxiety [22]. The state anxiety scale (Form Y-1 of the STAI) comprises twenty statements that assess the individual's feelings at the moment of responding. The trait anxiety scale (Form Y-2 of the STAI) also includes twenty statements that measure individuals' general and typical feelings. In the current study, scores on each of the two scales ranged from 20 to 80. Previously reported reliability coefficients (Cronbach's alpha) for the overt and covert anxiety scales were 0.9084 and 0.9025, respectively [23, 24]. In the present study, Cronbach's alpha values for the total anxiety questionnaire and the overt and covert anxiety scales were 0.935, 0.921, and 0.760, respectively.
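
As a rough illustration of how STAI-style subscales yield totals between 20 and 80, the sketch below scores twenty 4-point items per subscale. The item wording, the reverse-keyed item positions, and the responses are assumptions for illustration only; the published instrument is not reproduced here.

```python
# Sketch of STAI-style subscale scoring (illustrative only; the actual STAI
# items and scoring key are not reproduced). Each subscale has 20 items rated
# 1-4, so subscale totals range from 20 to 80.
def score_subscale(responses, reverse_items=()):
    """responses: 20 ratings (1-4); reverse_items: 0-based indexes of
    reverse-keyed items (an assumed set, not the published key)."""
    assert len(responses) == 20 and all(1 <= r <= 4 for r in responses)
    return sum(5 - r if i in reverse_items else r
               for i, r in enumerate(responses))

state = score_subscale([2] * 20, reverse_items={0, 5, 9})  # Form Y-1 analogue
trait = score_subscale([3] * 20)                           # Form Y-2 analogue
print(state, trait)  # each total lies between 20 and 80
```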

Acceptance and satisfaction questionnaire

The Acceptability and Satisfaction Questionnaire for the Clinical Competency Test was developed by Farajpour et al. (2012). The student version consists of ten questions and the faculty version of eleven questions, each rated on a four-point Likert scale. Experts have confirmed the validity of these questionnaires, and Cronbach's alpha coefficients of 0.85 and 0.87 have been reported for the faculty and student questionnaires, respectively [6]. In the current study, ten medical education experts again confirmed the validity of the questionnaires. Regarding internal consistency, Cronbach's alpha coefficients for the student satisfaction questionnaire were 0.76 for the virtual section and 0.87 for the in-person section; for the faculty satisfaction questionnaire, they were 0.84 and 0.87, respectively. An online platform was used to collect data for the virtual exam.
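
The internal-consistency figures quoted above are Cronbach's alpha values. A minimal sketch of that standard computation on synthetic Likert data (not the study's data) is shown below.

```python
# Sketch: Cronbach's alpha for a Likert questionnaire (synthetic data).
# alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))
import numpy as np

def cronbach_alpha(items):
    """items: 2-D array, rows = respondents, columns = questionnaire items."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

rng = np.random.default_rng(1)
fake = rng.integers(1, 5, size=(84, 10))  # 84 respondents, 10 items, 4-point scale
# Random items are uncorrelated, so alpha will be near zero here; real
# questionnaires with correlated items give values like the 0.76-0.87 reported.
print(round(cronbach_alpha(fake), 2))
```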

Data analysis and rigor of study

Qualitative data analysis was conducted using the method proposed by Graneheim and Lundman. Additionally, the criteria established by Lincoln and Guba (1985) were employed to confirm the rigor and validity of the data, including credibility, transferability, dependability, and confirmability [ 26 ].

In this research, data synthesis was performed by combining the collected data with various tools and methods. The findings of this study were reviewed and confirmed by participants, supervisors, mentors, and experts in qualitative research, reflecting their opinions on the alignment of findings with their experiences and perspectives on clinical competence examinations. Therefore, the member check method was used to validate credibility.

Moreover, efforts were made in this study to provide a comprehensive description of the research steps, create a suitable context for implementation, assess the views of others, and ensure the transferability of the results.

Furthermore, confirmability was supported by the researchers' long-term engagement of over 25 years with the setting and its stakeholders, their interest in identifying and describing the problems, and their reflection on the design, implementation, and evaluation of clinical competence examinations, together with the continual solicitation and consideration of stakeholders' opinions, ideas, and views.

To support dependability, the results were fed back to the participants and revised by the researchers; the problems and proposed solutions were clarified; and the operational programs were designed, implemented, and evaluated with continuous stakeholder participation, all of which helped to keep the researchers' biases, assumptions, and hypotheses in check.

Data analysis was performed with SPSS version 21 using descriptive statistics (absolute and relative frequency, mean, and standard deviation) and inferential tests (paired t-test, independent t-test, and analysis of variance). The significance level was set at 0.05. Parametric tests were used because the data were normally distributed according to the Kolmogorov-Smirnov test.
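
For readers without SPSS, the following sketch shows a comparable workflow with SciPy on synthetic data; the variable names and numbers are illustrative assumptions, not the study's dataset.

```python
# Hedged sketch of the reported analysis workflow using SciPy instead of SPSS
# (synthetic scores; names and values are illustrative only).
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
anxiety_in_person = rng.normal(46, 12, 84)
anxiety_virtual   = rng.normal(48, 11, 84)

# Simplified normality check (the paper reports Kolmogorov-Smirnov before
# using parametric tests; SPSS applies its own correction).
print(stats.kstest(stats.zscore(anxiety_in_person), "norm").pvalue)

# Paired t-test: the same students in the in-person vs. virtual section.
print(stats.ttest_rel(anxiety_in_person, anxiety_virtual).pvalue)

# One-way ANOVA: satisfaction across virtual, in-person, and blended sections.
sat_virtual  = rng.normal(25.5, 4.7, 84)
sat_inperson = rng.normal(27.6, 4.7, 84)
sat_blended  = rng.normal(25.6, 5.0, 84)
print(stats.f_oneway(sat_virtual, sat_inperson, sat_blended).pvalue)
```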

Given that conducting the CCE for final-year nursing students required the active participation of managers, faculty members, staff, and students, and to answer the research question “How can the CCE for final-year nursing students be conducted?” and achieve the research objective of “designing, implementing, and evaluating the clinical competency exam,” the action research method was employed.

The present study was based on the Dickens and Watkins model. The cyclical action research process has four primary stages (Fig. 1): reflect, plan, act, and observe, after which reflection begins the next cycle [27].

Figure 1. The cyclical process of action research [27]

Stage 1: Reflection

Identification of the problem.

According to the educational regulations, final semester nursing students must complete the clinical competency exam. However, due to the COVID-19 pandemic and the critical situation in most provinces, inter-city travel restrictions, and insufficient dormitory space, conducting the CCE in-person was not feasible.

This exam was conducted virtually at our institution. However, based on the reflections from experts, researchers have found that virtual exams can only partially assess clinical and practical skills in certain stations, such as basic skills, resuscitation, and pediatrics. Furthermore, utilizing Objective Structured Clinical Examination (OSCE) in skills assessment facilitates the evaluation of psychomotor skills, knowledge, and attitudes, aiding in identifying strengths and weaknesses.

P3, “Due to the COVID-19 pandemic and the critical situation in most provinces, inter-city travel restrictions, and insufficient dormitory space, conducting the CCE in-person is not feasible.”

Stage 2: Planning

Based on the reflections gathered from the participants, the exam was designed using a blended approach (combining in-person and virtual components) as per the schedule outlined in Fig.  2 . All planned activities for the blended CCE for final-year nursing students were executed over two semesters.

P5, “Taking the exam virtually might seem easier for us and the students, but in my opinion, it’s not realistic. For instance, performing wound dressing or airway management is very practical, and it’s not possible to assess students with a virtual scenario. We need to see them in person.”

P6"I believe it’s better to conduct those activities that are highly practical in person, but for those involving communication skills like report writing, professional ethics, etc., we can opt for virtual assessment.”

Figure 2. Design and implementation of the blended CCE

Stage 3: Act

CCE implementation steps

The CCE was conducted based on the flowchart in Fig.  3 and the following steps:

Figure 3. Steps for conducting the CCE for final-year nursing students using a blended method

Step 1: Designing the framework for conducting the blended Clinical Competency Examination

The panelists were guided to design the blended exam in focused group sessions and virtual panels based on the ADDIE (Analysis, Design, Development, Implementation, Evaluation) model [ 28 ]. Initially, needs assessment and opinion polling were conducted, followed by the operational planning of the exam, including the design of the blueprint table (Table  1 ), determination of station types (in-person or virtual), designing question stems in the form of scenarios, creating checklists and station procedure guides by expert panel groups based on participant analysis, and the development of exam implementation guidelines with participant input [ 27 ]. The design, execution, and evaluation were as follows:

In-person and virtual meetings with professors were held to determine the exam schedule and the deadlines for submitting checklists, to decide whether each station would be virtual or in-person based on the type of skill (practical or communication), and to present problems and solutions. Based on these decisions, the basic skills station and the cardiac and pediatric resuscitation stations were held in person, while the health, nursing ethics, nursing report, nursing diagnosis, physical examination, and psychiatric nursing stations were held virtually, as summarized in the sketch below.
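
As a simple configuration sketch, the station-to-modality allocation just described can be written as a lookup table. The station labels follow the text; the blueprint weights of Table 1 are not reproduced because they are not given here.

```python
# Station-to-modality allocation as described in the text (a configuration
# sketch only; Table 1's blueprint weights are not reproduced).
STATIONS = {
    "basic skills (dressing, injections)": "in-person",
    "adult CPR": "in-person",
    "pediatric resuscitation": "in-person",
    "community health": "virtual",
    "nursing ethics": "virtual",
    "nursing report": "virtual",
    "nursing diagnosis": "virtual",
    "physical examination": "virtual",
    "psychiatric nursing": "virtual",
}

in_person = [s for s, mode in STATIONS.items() if mode == "in-person"]
print(f"{len(in_person)} in-person stations, {len(STATIONS) - len(in_person)} virtual")
```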

News about the exam was communicated to students through the college website and text messages. An online orientation session was then held with students on Skype, covering the needs assessment for pre-exam educational workshops, virtual and in-person exam standards, how to use the exam software, how the virtual exam would be conducted, the infrastructure students needed in order to participate, completion of the anxiety and satisfaction questionnaires, rules and regulations, procedures for students who did not pass, and a trial run of the exam with questions and answers. A pre-exam in-person orientation session was also held.

To keep students informed about the entire educational process, the resources and educational content recommended by the professors, including PDF files, photos, videos, instructions, and links, were shared through a virtual group on a social media messenger; scientific information was also posted, and questions were asked and answered through this platform.

Correspondence and necessary coordination were made with the university clinical skills center to conduct in-person workshops and exams.

Following a test-centered approach, the modified Angoff method [29, 30] was used by the panelists tasked with assigning scores to set the passing standard for each station.

Additionally, in setting standards for this blended CCE, which fourth-year nursing students had to pass as a prerequisite for graduation, the panelists were involved as experienced clinical educators familiar with these students' performance and future roles and with the blended assessment method [29, 30] (Table 1).
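
The following sketch illustrates how a modified Angoff cut score is typically derived: each panelist estimates the probability that a borderline (just-competent) student would complete each checklist item, and the average of those estimates becomes the station's passing standard. The ratings below are invented; the panel's actual judgments are not reported in the text.

```python
# Sketch of a modified Angoff cut score (illustrative panelist ratings, not
# the panel's actual judgments). Each value is the estimated probability that
# a borderline student completes that checklist item correctly.
import numpy as np

# rows = panelists, columns = checklist items of one station
ratings = np.array([
    [0.7, 0.6, 0.8, 0.5, 0.9],
    [0.6, 0.6, 0.7, 0.6, 0.8],
    [0.8, 0.5, 0.9, 0.5, 0.7],
])

cut_score_pct = ratings.mean() * 100
print(f"station cut score is about {cut_score_pct:.0f}% of checklist marks")
```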

Step 2: Preparing the necessary infrastructure for conducting the exam

Software infrastructure.

The pre- and post-virtual exam questions, scenarios, and questionnaires were uploaded using online software.

The exam was run on a trial basis in multiple sessions with several faculty members, and any issues identified were addressed. The questions for each station were designed and entered into the software by the respective station instructors and the examination coordinator, who facilitated the exam. Questions were formatted as clinical scenarios, images, descriptive questions, and multiple-choice questions, with an emphasis on clinical and practical aspects. The software supported different exam types and question formats (multiple-choice, descriptive, scenario-based, image-based, video-based, and matching), Excel output, and graphical and descriptive statistical analyses. It also offered automatic questionnaire completion, notification emails, the addition of scores to questionnaires, prevention of multiple answer submissions, and file uploads of up to 4 gigabytes. Student authentication was based on national identification numbers and student IDs, which served as user IDs and passwords; students entered the exam environment using their email and multi-level verification of personal information. If the information did not match, the individual could not access the exam environment.
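
A hypothetical sketch of the multi-level login check described above follows; the field names and sample values are assumptions for illustration, not the software's actual implementation.

```python
# Hypothetical sketch of the multi-level login check described for the exam
# software: national ID and student ID act as username/password, and a second
# step verifies personal details before the exam opens.
def can_enter_exam(record, national_id, student_id, email, birth_year):
    """record: the registered student's data (a made-up structure)."""
    credentials_ok = (record["national_id"] == national_id
                      and record["student_id"] == student_id)
    identity_ok = (record["email"] == email
                   and record["birth_year"] == birth_year)
    return credentials_ok and identity_ok

student = {"national_id": "0011223344", "student_id": "97123456",
           "email": "student@example.edu", "birth_year": 1999}
print(can_enter_exam(student, "0011223344", "97123456",
                     "student@example.edu", 1999))  # True only if all fields match
```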

Checklists and questionnaires

A student list was prepared, and checklists for the in-person exam and anxiety and satisfaction questionnaires were reproduced.

Empowerment workshops for professors and education staff

The educational needs of faculty members and academic staff, addressed through empowerment workshops, included conducting clinical competency exams using the OSCE method; simulating and evaluating OSCE exams; designing standardized questions, checklists, and scenarios; innovative approaches to clinical evaluation; designing physical spaces and setting up stations; and assessing ethics and professional commitment in clinical competency exams.

Student empowerment programs

Based on the students' needs assessment results, in-person workshops were held on cardiopulmonary resuscitation and airway management, and online workshops on health, pediatrics, cardiopulmonary resuscitation, ethics, nursing diagnosis, and report writing were held via Skype. In addition, instructors recorded educational files on vaccination, psychiatric nursing, clinical examinations, and basic skills and made them available to students through the virtual groups.

Step 3: CCE implementation

The CCE was held in two parts, in-person and virtual.

In-person exam

The OSCE method was used for this section of the exam. The basic skills station exam included dressing and injections, and the CPR and pediatrics stations were conducted in person. The students were divided into two groups of 21 each semester, and the exam was held in two shifts. While adhering to quarantine protocols, the students performed the procedures for seven minutes at each station, and instructors evaluated them using a checklist. An additional minute was allotted for transitioning to the next station.
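
A quick back-of-envelope calculation, using the timing stated above and assuming the three in-person stations named in the text, gives the length of one full circuit per student.

```python
# Back-of-envelope timing for one in-person circuit, using the figures above:
# 7 minutes of work per station plus 1 minute to move on. The station count
# (basic skills, CPR, pediatrics = 3) is an assumption based on the text.
STATION_MIN, TRANSITION_MIN, N_STATIONS = 7, 1, 3

minutes_per_circuit = N_STATIONS * (STATION_MIN + TRANSITION_MIN)
print(f"one full circuit takes about {minutes_per_circuit} minutes per student")  # 24
```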

Virtual exam

The professional ethics, nursing diagnosis, nursing report, health, psychiatric nursing, and physical examination stations were conducted virtually after the in-person exam. The exam was made available to students at the scheduled time via a primary and a secondary link in the virtual environment. Students were first authenticated; once the allotted time elapsed, the questions became inactive and the answers already submitted were recorded. Full support was provided by the examination center throughout the exam.

The examination coordinator managed the entire virtual exam process, and the results were announced 48 h after the exam. A passing grade was defined as a score higher than 60% at every station. Students who failed individual stations were given the opportunity for remediation based on faculty feedback, either through additional study or participation in educational workshops, and retake exams were held one week after the initial exam. It was stipulated that students who failed more than half of the stations would be re-evaluated the following semester, and that if a student failed a station in more than three sessions, the faculty's educational council would decide how to proceed. In practice, no student fell into either situation.
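
The pass/fail and remediation rules stated above can be summarized in a short decision sketch; the station names and scores are made up, and the one-week retake and educational-council rules are noted only in comments.

```python
# Sketch of the pass/fail and remediation rules stated in the text
# (station names and scores are invented for illustration).
def station_outcome(scores, pass_mark=0.60):
    """scores: dict of station -> fraction of marks earned.

    Rules from the text: pass requires > 60% at every station; failing more
    than half the stations defers evaluation to the next semester; otherwise
    failed stations are remediated and retaken (one week later). Referral to
    the educational council after more than three failed sessions is omitted.
    """
    failed = [s for s, v in scores.items() if v <= pass_mark]
    if not failed:
        return "pass"
    if len(failed) > len(scores) / 2:
        return "re-evaluate next semester"
    return "remediation then retake: " + ", ".join(failed)

print(station_outcome({"basic skills": 0.72, "CPR": 0.55, "ethics": 0.81,
                       "nursing report": 0.64, "health": 0.70}))
```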

Step 4: Evaluation

The evaluation of the exam was conducted by examiners using a checklist, and the results were announced as pass or fail.

Stage 4: Observation / evaluation

In this study, both process and outcome evaluations were conducted:

Process evaluation

All programs and activities implemented during the test design and administration process were evaluated in the process evaluation. This evaluation was based on operational program control and reflections received from participants through group discussion sessions and virtual groups.

Sample reflections received from faculty members, managers, experts, and students through group discussions and social messaging platforms after the changes:

P7: “The implementation of the blended virtual exam, in the conditions of the COVID-19 crisis where the possibility of holding in-person exams was not fully available, in my opinion, was able to improve the quality of exam administration and address the limitations and weaknesses of the exam entirely virtually.”

P5: “In my opinion, this blended method was able to better evaluate students in terms of clinical readiness for entering clinical practice.”

Outcomes evaluation

The study outcomes were student anxiety, student acceptance and satisfaction, and faculty acceptance and satisfaction. Before the start of the in-person and virtual exams, the Spielberger Anxiety Questionnaire was provided to students. Additionally, immediately after the exam, students and instructors completed the acceptance and satisfaction questionnaire for the relevant section. After the exam, students and instructors completed the acceptance and satisfaction questionnaire again for the entire exam process, including feasibility, satisfaction with its implementation, and educational impact.

Results

Design framework and implementation for the blended Clinical Competency Examination

The exam was planned using a blended method (part in-person, part virtual) according to the Fig.  2 schedule, and all planned programs for the blended CCE for final-year nursing students were implemented in two semesters.

Evaluation results

In this study, 84 final-year nursing students participated, including 37 females (44.05%) and 47 males (55.95%). Among them, 28 (33.3%) were dormitory residents, and 56 (66.7%) were non-dormitory residents.

In this study, both process and outcome evaluations were conducted.

All programs and activities implemented during the test design and administration process were evaluated in the process evaluation (Table  2 ). This evaluation was based on operational program control and reflections received from participants through group discussion sessions and virtual groups on social media.

Anxiety and satisfaction were examined and evaluated as study outcomes, and the results are presented below.

The paired t-test results in Table 3 showed no statistically significant difference in overt anxiety (p = 0.56), covert anxiety (p = 0.13), or total anxiety scores (p = 0.167) between the in-person and virtual sections before the blended Clinical Competency Examination.

However, the mean (SD) overt anxiety score in the in-person section was 49.27 (11.16) for males and 43.63 (13.60) for females, a statistically significant difference (p = 0.03). In the virtual section, the mean (SD) overt anxiety score was 45.70 (11.88) for males and 51.00 (9.51) for females, also a significant difference (p = 0.03). There was no significant difference between males and females in covert anxiety in either the in-person (p = 0.94) or the virtual (p = 0.60) section. In addition, the highest percentages of moderate-to-high overt anxiety were observed among women in the virtual section (15.40%) and among men in the in-person section (21.28%).

According to Table 4, one-way analysis of variance showed a significant difference in acceptance and satisfaction scores between the virtual, in-person, and blended sections.

The mean (SD) acceptance and satisfaction scores of nursing students for the CCE in the virtual, in-person, and blended sections were 25.49 (4.73), 27.60 (4.70), and 25.57 (4.97) out of 30, respectively, a significant difference between the three sections (p = 0.008).

In addition, 3 (23.08%) male and 10 (76.92%) female faculty members participated in this study; of these, 2 (15.38%) were instructors and 11 (84.62%) were assistant professors. They were between 29 and 50 years old, with a mean (SD) age of 41.37 (6.27) years, and had 4 to 20 years of work experience, with a mean (SD) of 13.22 (4.43) years.

The results of the analysis of variance showed that the mean (SD) acceptance and satisfaction scores of faculty members for the CCE in the virtual, in-person, and blended sections were 30.31 (4.47), 29.86 (3.94), and 30.00 (4.16) out of 33, respectively. There was no significant difference between the three sections (p = 0.864).

Discussion

This action research study showed that the blended CCE for nursing students is feasible and, depending on the conditions and objectives, evaluation stations can be designed and implemented virtually or in person.

The blended exam, combining in-person and virtual elements, addressed some of the weaknesses of the entirely virtual exams held in previous terms because of the COVID-19 pandemic. Under pandemic conditions, holding all stations in person was not feasible, given the risk of students and evaluators contracting the virus and the need for prolonged quarantine; at the same time, nursing students needed to graduate to meet the staffing needs of hospitals. By implementing the blended exam and conducting in-person evaluations at the clinical stations, the assessment of nursing students' clinical competence was brought closer to reality than in the entirely virtual method.

Furthermore, the need for human resources, station setup costs, and time spent was less than the entirely in-person method. Therefore, in pandemics or conditions where sufficient financial resources and human resources are not available, the blended approach can be utilized.

Additionally, the evaluation results showed that students’ total and overt anxiety in both virtual and in-person sections of the blended CCE did not differ significantly. However, the overt anxiety of female students in the virtual section and male students in the in-person section was considerably higher. Nevertheless, students’ covert anxiety related to personal characteristics did not differ in virtual and in-person exam sections. However, students’ acceptance and satisfaction in the in-person section were higher than in the virtual and blended sections, with a significant difference. The acceptance and satisfaction of faculty members from the CCE in in-person, virtual, and blended sections were the same and relatively high.

No blended nursing CCE was found in the literature review. Recent studies, especially during the COVID-19 pandemic, have designed and implemented this exam as a virtual OSCE; previously, the CCE was held in person using traditional OSCE methods.

During the COVID-19 pandemic, nursing schools worldwide faced difficulties administering clinical competency exams. In the United States, virtual simulation, including standard videos, home videos, and clinical scenarios, was used to evaluate clinical competency and develop nursing students' clinical skills. In Hong Kong, an online virtual simulation program was designed to assess the clinical competency of senior nursing students as a potential alternative to traditional clinical training [31].

A traditional in-person OSCE was also redesigned and developed through a virtual conferencing platform for nursing students at the University of Texas Medical Branch in Galveston. Survey findings showed that most professors and students considered virtual OSCE a highly effective tool for evaluating communication skills, obtaining a medical history, making differential diagnoses, and managing patients. However, professors noted that evaluating examination techniques in a virtual environment is challenging [ 32 ].

However, Beiranvand reported that fewer than half of the nursing students considered the in-person OSCE stressful [33], whereas another study found that 96.2% of nursing students perceived the exam as anxiety-provoking [1]. Students attribute the stress of this exam mainly to the exam time, its complexity, the execution of techniques, and confusion about exam procedures [7]. In contrast, a study conducted in Egypt found that 75% of students reported the OSCE to be less stressful than other examination methods [9]. There is thus no consensus across studies on the causes and extent of anxiety in the OSCE. One study found that, in addition to the factors mentioned above, the evaluator's presence could be a source of stress [34]. Another survey showed that students perceived the OSCE as more stressful than the traditional method, mainly because of the large number of stations, the number of exam items, and time constraints [7]. A further study in Egypt, which designed a two-stage OSCE for 75 nursing students, found that 65.6% of students considered the second-stage exam stressful because of the problem-solving station, whereas only 38.9% considered the first-stage exam stressful [35].

Given that various studies have reported anxiety as a disadvantage of the OSCE, one of the outcomes evaluated in this study was the anxiety of final-year nursing students. There was no significant difference in total or overt anxiety between the in-person and virtual sections of the blended Clinical Competency Examination. Overt anxiety was higher among male students in the in-person section and among female students in the virtual section, which may reflect personality traits, although further research is needed to confirm this. Since students' total and overt anxiety were the same in the in-person and virtual sections, the blended CCE is suggested as a suitable alternative to the traditional OSCE when resources and staff are short or during pandemics. For the results to be generalizable, however, future studies should consider three intervention groups, with all OSCE stations conducted virtually in the first group, in person in the second, and in a blend of the two in the third.

Furthermore, Rafati et al. showed that a clinical competency exam using the OSCE method is acceptable, valid, and reliable for assessing nursing skills: 50% of the students were highly satisfied and 34.6% were relatively satisfied with the exam, and 57.7% believed it revealed their learning weaknesses [1]. Another survey showed that, despite higher anxiety about the OSCE, students thought the exam provides equal opportunities for everyone, is less complicated than the traditional method, and encourages active participation [7]. In a study on maternal and infant care, 95% of the students believed the traditional exam evaluates only memory or practical skills, whereas the OSCE assesses knowledge, understanding, cognitive and analytical skills, communication, and emotional skills. They regarded explicit evaluation goals, appropriate implementation guidelines, appropriate scheduling, wearing uniforms, a well-equipped workroom, the evaluation of many skills, and fast feedback as advantages of this exam [36]. Moreover, in another survey, most students were satisfied with the clinical environment offered by the OSCE-based CCE, which is close to reality and uses a hypothetical patient in relevant situations that increase work safety; on the other hand, factors such as the scheduling of stations and time constraints led to dissatisfaction [37].

Furthermore, another study showed that virtual simulations effectively improve students' skills in tracheostomy suctioning, triage, assessment, life-saving interventions, clinical reasoning, clinical judgment, intravenous catheterization, role-based nursing care, individual readiness, and critical thinking, while reducing anxiety and increasing confidence in the laboratory, clinical nursing education, interactive communication, and health evaluation skills. Beyond knowledge and skills, newer findings indicate that virtual simulations can increase confidence, change attitudes and behaviors, and offer an innovative, flexible, and promising approach for new nurses and nursing students [38].

Various studies have evaluated the satisfaction of students and faculty members with the OSCE-based Clinical Competency Examination. In this study, one of the evaluated outcomes was the acceptability of, and satisfaction with, the blended, virtual, and in-person sections of the CCE among students and faculty members, which was relatively high and consistent with other studies. One crucial factor behind this satisfaction was the provision of virtual orientation sessions for students and coordination sessions with faculty members. Social messaging groups were formed through virtual and in-person communication, instructions were explained, expectations and tasks were clarified, and questions were answered. Students and faculty members could access the required information with minimal presence at medical education centers and with little time and cost, and the blended evaluation made the researcher's communication with participants easier. The written guidelines and the uploaded educational content of the workshops enabled students to save the relevant topics and review them later if needed. Students had easy access to up-to-date scientific information, and the use of social messengers and Skype allowed photos and videos to be sent, workshops to be conducted, and questions to be asked and answered. The clinical workshops and examinations, however, were held in person to ensure accuracy.

The virtual part of the examination was conducted through online software, with questions focused on the clinical and practical aspects of each station; students answered multiple-choice, descriptive, scenario, picture, and puzzle questions within a specified time. The blended examination evaluated clinical competency without delaying graduates' entry into the job market, and, during the severe human resource shortage faced by the healthcare system, it allowed a number of nurses to enter the country's healthcare system. The blended examination can substitute for in-person examinations in pandemic and non-pandemic situations, saving facilities, equipment, and human resources. The results of this study can also serve as a model for other nursing departments, which will require appropriate planning and arrangements for conducting clinical competency examinations in blended formats. The examination can likewise be extended to evaluate students' clinical performance.

One practical limitation of the study was the possibility that participants might not complete the questionnaires accurately or might be concerned about losing marks. Therefore, in a virtual session before the in-person exam, the objectives and importance of the study were explained, and participants were assured that their responses would not affect their evaluation and that they need not worry about losing marks. Additionally, implementing this plan required the active participation of all nursing students, faculty members, and staff, which was achieved through prior coordination, virtual meetings, the formation of virtual groups, and the continuous feedback of results, creating the motivation for continued collaboration and participation.

Other limitations of this study include the use of the Spielberger Anxiety Questionnaire to measure students' anxiety; future studies could use an instrument designed specifically for measuring pre-exam anxiety. Another limitation is that the study was carried out in a single nursing and midwifery faculty, so similar studies are recommended in the nursing and midwifery faculties of other universities, in related fields, and over multiple consecutive semesters. For a more precise assessment of effectiveness, intervention studies with three separate groups, virtual, in-person, and hybrid, using electronic checklists are proposed. Furthermore, it is recommended that students be evaluated on other dimensions and variables, such as awareness, clinical skill acquisition, self-confidence, and self-efficacy.

Conclusion

Conducting an in-person Clinical Competency Examination (CCE) during critical situations such as the COVID-19 pandemic is challenging. Rather than entirely virtual exams, blended evaluation is a feasible approach that overcomes the shortcomings of virtual exams and closely mimics in-person conditions. Using a blended method in pandemics or resource shortages, it is possible to design, implement, and evaluate stations that assess basic and advanced clinical skills in an in-person section, alongside scenario-based stations covering communication, reporting, nursing diagnosis, professional ethics, mental health, and community health in a virtual section, and thereby replace the traditional OSCE. This approach reduced the need for physical space for in-person exams, supported participant quarantine and health safety, and provided a more accurate assessment of nursing students' practical abilities than a solely virtual exam. Furthermore, the use of patient simulators, virtual reality, virtual practice, and the development of virtual and in-person training infrastructure are recommended to improve the quality of clinical education and evaluation and to help students attain the necessary clinical competencies. Since few studies have used the blended method, future research should employ three intervention groups, run over multiple semesters, be based on clinical evaluation models, and examine other outcomes such as awareness, clinical skill acquisition, self-efficacy, confidence, grades obtained, and the material and human resource costs involved.

Data availability

The datasets generated and analyzed during the current study are available on request from the corresponding author.

Rafati F, Pilevarzade M, Kiani A. Designing, implementing and evaluating OSCE to assess nursing students’ clinical competence in Jiroft faculty of nursing and midwifery. Nurs Midwifery J. 2020;18(2):118–28.

Sadeghi T, Ravari A, Shahabinejad M, Hallakoei M, Shafiee M, Khodadadi H. Performing of OSCE method in nursing students of Rafsanjan University of Medical science before entering the clinical field in the year 2010: a process for quality improvement. Community Health J. 2012;6(1):1–8.

Ali GA, Mehdi AY, Ali HA. Objective structured clinical examination (OSCE) as an assessment tool for clinical skills in Sohag University: nursing students’ perspective. J Environ Stud. 2012;8(1):59–69.

Bolourchifard F, Neishabouri M, Ashktorab T, Nasrollahzadeh S. Satisfaction of nursing students with two clinical evaluation methods: objective structured clinical examination (OSCE) and practical examination of clinical competence. Adv Nurs Midwifery. 2010;19(66):38–42.

Noohi E, Motesadi M, Haghdoost A. Clinical teachers’ viewpoints towards Objective Structured Clinical examination in Kerman University of Medical Science. Iran J Med Educ. 2008;8(1):113–20.

Reza Masouleh S, Zare A, Chehrzad M, Atrkarruoshan Z. Comparing two methods of evaluation, objective structured practical examination and traditional examination, on the satisfaction of students in Shahid Beheshti faculty of nursing and midwifery. J Holist Nurs Midwifery. 2008;18(1):22–30.

Bagheri M, Sadeghineajad Forotagheh M, Shaghayee Fallah M. The comparison of stressors in the assessment of basic clinical skills with traditional method and OSCE in nursing students. Life Sci J. 2012;9(4):1748–52.

Eldarir SH, El Sebaae HA, El Feky HA, Hussein HA, El Fadil NA, El Shaeer IH. An introduction of OSCE versus the traditional method in nursing education: Faculty capacity building and students’ perspectives. J Am Sci. 2010;6(12):1002–14.

Al-Zeftawy AM, Khaton SE. Student evaluation of an OSCE in Community Health nursing clinical course at Faculty of nursing, Tanta University. J Nurs Health Sci. 2016;5(4):68–76.

Hayter M, Jackson D. Pre-registration undergraduate nurses and the COVID-19 pandemic: students or workers? J Clin Nurs. 2020;29(17–18):3115–6.

Bayham J, Fenichel EP. Impact of school closures for COVID-19 on the US health-care workforce and net mortality: a modeling study. Lancet Public Health. 2020;5(5):e271–8.

Murphy MPA. COVID-19 and emergency eLearning: consequences of the securitization of higher education for post-pandemic pedagogy. Contemp Secur Policy. 2020;41(3):492–505.

Allen IE, Seaman J. Learning on demand: Online education in the United States, 2009.

Meyer KA, Wilson JL. The role of Online Learning in the emergency plans of Flagship Institutions. Online J Distance Learn Adm. 2011;14(1):110–8.

Kursumovic E, Lennane S, Cook TM. Deaths in healthcare workers due to COVID-19: the need for robust data and analysis. Anaesthesia. 2020;75(8):989–92.

Malekshahi Beiranvand F, Hatami Varzaneh A. Health care workers challenges during coronavirus outbreak: the qualitative study. J Res Behav Sci. 2020;18(2):180–90.

Boursicot K, Kemp S, Ong TH, Wijaya L, Goh SH, Freeman K, Curran I. Conducting a high-stakes OSCE in a COVID-19 environment. MedEdPublish. 2020;9:285–89.

Atwa H, Shehata MH, Al-Ansari A, Kumar A, Jaradat A, Ahmed J, Deifalla A. Online, face-to-face, or blended learning? Faculty and medical students’ perceptions during the COVID-19 pandemic: a mixed-method study. Front Med. 2022;9:791352.

Chan MMK, Yu DS, Lam VS, Wong JY. Online clinical training in the COVID-19 pandemic. Clin Teach. 2020;17(4):445–6.

Toulabi T, Yarahmadi S. Conducting a clinical competency test for nursing students in a virtual method during the Covid-19 pandemic: a case study. J Nurs Educ. 2021;9(5):33–42.

Meskell P, Burke E, Kropmans TJB, Byrne E, Setyonugroho W, Kennedy KM. Back to the future: an online OSCE Management Information System for nursing OSCEs. Nurse Educ Today. 2015;35(11):1091–6.

Lichtenberg PA. Handbook of assessment in clinical gerontology. 2nd ed. Academic Press; 2010. https://doi.org/10.1016/B978-0-12-374961-1.10030-2

Gholami Booreng F, Mahram B, Kareshki H. Construction and validation of a scale of research anxiety for students. IJPCP. 2017;23(1):78–93.

Esmaili M. A survey of the influence of Murita therapy on reducing the rate of anxiety in clients of counseling centers. Res Clin Psychol Couns. 2011;1(1):15–30.

Farajpour A, Amini M, Pishbin E, Arshadi H, Sanjarmusavi N, Yousefi J, Sarafrazyazdi M. Teachers’ and students’ satisfaction with DOPS Examination in Islamic Azad University of Mashhad, a study in Year 2012. Iran J Med Educ. 2014;14(2):165–73.

Strauss AC, Corbin JM. Basics of qualitative research: grounded theory procedures and techniques. 2nd ed. Newbury Park: Sage; 1998.

Dickens L, Watkins K. Action research: rethinking Lewin. Manage Learn. 1999;30(2):127–40.

Rezaeerad M, Nadri Kh, Mohammadi Etergoleh R. The effect of ADDIE (analysis, design, development, implementation, evaluation) designing method with emphasizing on mobile learning on students’ self-conception, development motivation and academic development in English course. Educational Adm Res Q. 2013;4(15):15–32.

Ben-David MF. AMEE Guide 18: standard setting in student assessment. Med Teach. 2000;22(2):120–30.

McKinley DW, Norcini JJ. How to set standards on performance-based examinations: AMEE Guide 85. Med Teach. 2014;36(2):97–110.

Fung JTC, Zhang W, Yeung MN, Pang MTH, Lam VSF, Chan BKY, Wong JYH. Evaluation of students’ perceived clinical competence and learning needs following an online virtual simulation education programme with debriefing during the COVID-19 pandemic. Nurs Open. 2021;8(6):3045–54.

Luke S, Petitt E, Tombrella J, McGoff E. Virtual evaluation of clinical competence in nurse practitioner students. Med Sci Educ. 2021;31:1267–71.

Beiranvand SH, Hosseinabadi R, Ghasemi F, Anbari KH. An assessment of nursing and midwifery students’ viewpoint, performance, and feedback with an objective structured clinical examination. J Nurs Educ. 2017;6(1):63–7.

Sheikh Abumasoudi R, Moghimian M, Hashemi M, Kashani F, Karimi T, Atashi V. Comparison of the Effect of Objective Structured Clinical evaluation (OSCE) with Direct and Indirect Supervision on nursing student’s test anxiety. J Nurs Educ. 2015;4(2):1–8.

Zahran EM, Taha EE. Students’ feedback on Objective Structured Clinical examinations (OSCEs) experience in emergency nursing. J High Inst Public Health. 2009;39(2):370–87.

Na A-G. Assessment of Students’ knowledge, clinical performance and satisfaction with objective structured clinical exam. Med J Cairo Univ. 2009;77(4):287–93.

Adib-Hajbaghery M, Yazdani M. Effects of OSCE on learning, satisfaction and test anxiety of nursing students: a review study. Iran J Med Educ. 2018;18:70–83.

Purwanti LE, Sukartini T, Kurniawati ND, Nursalam N, Susilowati T. Virtual Simulation in clinical nursing education to improve knowledge and clinical skills: Literature Review. Open Access Maced J Med Sci. 2022;10(F):396–404.

Acknowledgements

We want to thank the Research and Technology deputy of Smart University of Medical Sciences, Tehran, Iran, the faculty members, staff, and officials of the School of Nursing and Midwifery, Lorestan University of Medical Sciences, Khorramabad, Iran, and all individuals who participated in this study.

All steps of the study, including study design and data collection, analysis, interpretation, and manuscript drafting, were supported by the Deputy of Research of Smart University of Medical Sciences.

Author information

Authors and affiliations.

Department of E-Learning in Medical Education, Center of Excellence for E-learning in Medical Education, School of Medicine, Tehran University of Medical Sciences, Tehran, Iran

Rita Mojtahedzadeh & Aeen Mohammadi

Department of Medical Education, Smart University of Medical Sciences, Tehran, Iran

Tahereh Toulabi

Cardiovascular Research Center, School of Nursing and Midwifery, Lorestan University of Medical Sciences, Khorramabad, Iran

Contributions

RM: participated in study design, accrual of study participants, review of the manuscript, and critical revisions for important intellectual content. TT: the investigator; participated in study design, data collection, accrual of study participants, and writing and reviewing the manuscript. AM: participated in study design, data analysis, accrual of study participants, and reviewing the manuscript. All authors read and approved the final version of the manuscript.

Corresponding author

Correspondence to Tahereh Toulabi .

Ethics declarations

Ethics approval and consent to participate.

This action research was conducted following the participatory method. All methods were performed according to the relevant guidelines and regulations in the Declaration of Helsinki (ethics approval and consent to participate). The study’s aims and procedures were explained to all participants, and necessary assurance was given to them for the anonymity and confidentiality of their information. The results were continuously provided as feedback to the participants. Informed consent (explaining the goals and methods of the study) was obtained from participants. The Smart University of Medical Sciences Ethics Committee approved the study protocol (IR.VUMS.REC.1400.011).

Consent for publication

Not applicable.

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s note.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .


About this article

Cite this article.

Mojtahedzadeh, R., Toulabi, T. & Mohammadi, A. The design, implementation, and evaluation of a blended (in-person and virtual) Clinical Competency Examination for final-year nursing students. BMC Med Educ 24, 936 (2024). https://doi.org/10.1186/s12909-024-05935-9


Received: 21 July 2023

Accepted: 20 August 2024

Published: 28 August 2024

DOI: https://doi.org/10.1186/s12909-024-05935-9


Keywords

  • Clinical Competency Examination (CCE)
  • Objective Structured Clinical Examination (OSCE)
  • Blended method
  • Satisfaction



World University Rankings 2024: methodology

The methodology for the 20th edition of the World University Rankings has been significantly updated to reflect the outputs of the diverse range of research-intensive universities across the world.


Browse the full results of the World University Rankings 2024

The Times Higher Education World University Rankings are the only global performance tables that judge research-intensive universities across all their core missions: teaching, research, knowledge transfer and international outlook.

This year’s methodology, for the 20th edition of the World University Rankings, has been significantly updated, so that it continues to reflect the outputs of the diverse range of research-intensive universities across the world, now and in the future.

We have moved from 13 to 18 carefully calibrated performance indicators to provide the most comprehensive and balanced comparisons, trusted by students, academics, university leaders, industry and governments. One of the metrics (study abroad) currently has zero weight but will be counted in future (see below).

The performance indicators are still grouped into five areas, although the names of these have been tweaked: Teaching (the learning environment); Research environment (volume, income and reputation); Research quality (citation impact, research strength, research excellence and research influence); International outlook (staff, students and research); and Industry (income and patents).

The full methodology is published in the file at the bottom of this page.


Teaching (the learning environment): 29.5%

  • Teaching reputation: 15%
  • Staff-to-student ratio: 4.5%
  • Doctorate-to-bachelor’s ratio: 2%
  • Doctorates-awarded-to-academic-staff ratio: 5.5%
  • Institutional income: 2.5%

The most recent Academic Reputation Survey (run annually, this year conducted by THE) that underpins this category was carried out between October 2022 and January 2023. We have run the survey to ensure a balanced spread of responses across disciplines and countries. Where disciplines or countries were over- or under-represented, THE’s data team weighted the responses to fully reflect the global distribution of scholars. The 2023 data are combined with the results of the 2022 survey, giving more than 500,000 votes to universities in 166 countries. The votes come from more than 68,000 cited academics.
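
The survey weighting procedure is not spelled out in the article, but a simple post-stratification scheme illustrates the idea: scale each country's (or discipline's) responses so that its weighted share matches its share of the global scholarly population. The sketch below is only that illustration; the function name, response counts and target shares are all hypothetical.

```python
# A minimal sketch of post-stratification weighting for survey responses.
# The response counts and target shares are invented; THE's actual procedure
# is not described here, so treat this purely as an illustration.

def reweight_responses(responses_by_country, target_share_by_country):
    """Return a per-country weight so that weighted response shares match target shares."""
    total = sum(responses_by_country.values())
    weights = {}
    for country, n_responses in responses_by_country.items():
        observed_share = n_responses / total
        weights[country] = target_share_by_country[country] / observed_share
    return weights

responses = {"US": 20000, "CN": 5000, "BR": 1000}   # hypothetical response counts
target = {"US": 0.25, "CN": 0.30, "BR": 0.45}       # hypothetical global shares of scholars
print(reweight_responses(responses, target))
```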

As well as giving a sense of how committed an institution is to nurturing the next generation of academics, a high proportion of postgraduate research students suggests teaching at the highest level, the kind that attracts graduates and is effective at developing them. This indicator is normalised to take account of a university’s unique subject mix, reflecting the fact that the volume of doctoral awards varies by discipline.

Institutional income is scaled against academic staff numbers and normalised for purchasing-power parity (PPP). It indicates an institution’s general status and gives a broad sense of the infrastructure and facilities available to students and staff.
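
As a rough illustration of how an income indicator of this kind can be computed, the sketch below converts income into PPP terms and scales it by academic staff headcount. All figures and the PPP conversion factor are hypothetical, and any further normalisation is omitted.

```python
# Hedged sketch: institutional income per academic staff member in
# purchasing-power-parity (PPP) terms. All inputs are hypothetical.

def income_per_staff_ppp(income_local_currency, ppp_conversion_factor, academic_staff):
    """Convert income to international dollars, then scale by staff headcount."""
    income_ppp = income_local_currency / ppp_conversion_factor
    return income_ppp / academic_staff

# e.g. 500 million local currency units, a PPP factor of 18.2 local units
# per international dollar, and 1,200 academic staff
print(income_per_staff_ppp(5.0e8, 18.2, 1200))
```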

Research environment: 29%

  • Research reputation: 18%
  • Research income: 5.5%
  • Research productivity: 5.5%

The most prominent indicator in this category looks at a university’s reputation for research excellence among its peers, based on the responses to our annual Academic Reputation Survey (see above).

Research income is scaled against academic staff numbers and adjusted for purchasing-power parity (PPP). This is a controversial indicator because it can be influenced by national policy and economic circumstances. But income is crucial to the development of world-class research, and because much of it is subject to competition and judged by peer review, our experts suggested that it was a valid measure. This indicator is fully normalised to take account of each university’s distinct subject profile, reflecting the fact that research grants in science subjects are often bigger than those awarded for the highest-quality social science, arts and humanities research.

To measure productivity, we count the number of papers published in academic journals indexed by Elsevier’s Scopus database per scholar, scaled for institutional size and normalised by subject. This gives a sense of a university’s ability to get papers published in quality peer-reviewed journals. From the 2018 rankings, we devised a method to give credit for papers that are published in subjects where a university declares no staff.
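
One way to read that description is as a papers-per-staff ratio compared against a world baseline for each subject and then averaged across subjects. The sketch below follows that reading; the baselines, counts and the simple averaging are assumptions for illustration, not THE's published formula.

```python
# Sketch of subject-normalised productivity: papers per academic staff member,
# divided by a hypothetical world baseline for the same subject so that
# high-output fields do not dominate. All figures are invented.

def normalised_productivity(papers_by_subject, staff_by_subject, world_papers_per_staff):
    """Average each subject's papers-per-staff ratio relative to a world baseline."""
    scores = []
    for subject, papers in papers_by_subject.items():
        per_staff = papers / staff_by_subject[subject]
        scores.append(per_staff / world_papers_per_staff[subject])
    return sum(scores) / len(scores)

papers = {"medicine": 900, "history": 60}
staff = {"medicine": 300, "history": 40}
baseline = {"medicine": 2.5, "history": 0.8}   # hypothetical world averages
print(normalised_productivity(papers, staff, baseline))
```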

Research quality: 30%

  • Citation impact: 15%
  • Research strength: 5%
  • Research excellence: 5%
  • Research influence: 5%

Our research quality pillar looks at universities’ role in spreading new knowledge and ideas.

We examine citation impact by capturing the average number of times a university’s published work is cited by scholars globally. This year, our bibliometric data supplier Elsevier examined more than 134 million citations to 16.5 million journal articles, article reviews, conference proceedings, books and book chapters published over five years. The data include more than 27,950 active peer-reviewed journals indexed by Elsevier’s Scopus database and all indexed publications between 2018 and 2022. Citations to these publications made in the six years from 2018 to 2023 are also collected.

The citations help to show us how much each university is contributing to the sum of human knowledge: they tell us whose research has stood out, has been picked up and built on by other scholars and, most importantly, has been shared around the global scholarly community to expand the boundaries of our understanding, irrespective of discipline.

The data are normalised to reflect variations in citation volume between different subject areas. This means that institutions with high levels of research activity in subjects with traditionally high citation counts do not gain an unfair advantage.

We blend equal measures of a country-adjusted and a non-country-adjusted citation score.
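
Taken literally, that blend is a 50/50 average of the two citation measures, as in the sketch below; the scores are hypothetical.

```python
# Equal-weight blend of a country-adjusted and a non-country-adjusted
# citation score, as described above. The input scores are hypothetical.

def blended_citation_score(raw_score, country_adjusted_score):
    return 0.5 * raw_score + 0.5 * country_adjusted_score

print(blended_citation_score(raw_score=62.0, country_adjusted_score=71.5))   # 66.75
```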

Three new research quality measures have been added in 2023. Research strength calculates the 75th percentile of field-weighted citation impact – a very robust guide to how strong typical research is.

Research excellence looks at the number of research publications in the top 10 per cent for field-weighted citation impact worldwide – a guide to the amount of world-leading research at an institution. It is normalised by year, subject and staff numbers.

Research influence helps us to understand when research is recognised in turn by the most influential research in the world – a broader look at excellence. The idea behind the metric is that the value of citations is not equal: a citation from an “important” paper is more significant than a citation from an “unimportant” one. We use an iterative method to measure the importance of a paper by not only counting the number of citations but also taking into account the importance of the citing papers. We also consider the subject of the research, as different disciplines have different citation patterns.
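
The sketch below illustrates the first two of these measures on a list of field-weighted citation impact (FWCI) values for one institution's papers; the year, subject and staff normalisations are omitted and the top-10 per cent threshold is hypothetical. Research influence is not sketched because it needs an iterative, eigenvector-style calculation over the citation graph rather than a per-paper summary.

```python
# Hedged sketches of "research strength" (75th percentile of FWCI) and
# "research excellence" (papers above a worldwide top-10% FWCI threshold).
# The FWCI values and the threshold below are invented.

def research_strength(fwci_values):
    """75th percentile of FWCI, with linear interpolation between ranked values."""
    ranked = sorted(fwci_values)
    position = 0.75 * (len(ranked) - 1)
    lower = int(position)
    upper = min(lower + 1, len(ranked) - 1)
    fraction = position - lower
    return ranked[lower] + fraction * (ranked[upper] - ranked[lower])

def research_excellence_count(fwci_values, top10_threshold):
    """Number of papers at or above the (hypothetical) top-10% FWCI threshold."""
    return sum(1 for value in fwci_values if value >= top10_threshold)

fwci = [0.2, 0.7, 0.9, 1.0, 1.1, 1.4, 2.3, 3.8]   # hypothetical per-paper FWCI values
print(research_strength(fwci))                     # 1.625
print(research_excellence_count(fwci, top10_threshold=2.0))   # 2
```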

International outlook: 7.5%

  • Proportion of international students: 2.5%
  • Proportion of international staff: 2.5%
  • International collaboration: 2.5%

The ability of a university to attract undergraduates, postgraduates and faculty from all over the planet is key to its success on the world stage. In the third international indicator, we calculate the proportion of a university’s total relevant publications that have at least one international co-author and reward higher volumes. This indicator is normalised to account for a university’s subject mix and uses the same five-year window as the “Research quality” category.
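
As a simple illustration of that proportion (ignoring the subject normalisation), the sketch below counts papers whose author list spans more than one country. The records are invented, and each set of country codes is assumed to include the home institution's own country.

```python
# Sketch of the international collaboration indicator: the share of an
# institution's papers with at least one international co-author.
# Hypothetical records; each set includes the home country ("IR" here).

def international_collaboration_share(papers_author_countries):
    """papers_author_countries: list of sets of ISO country codes, one set per paper."""
    international = sum(1 for countries in papers_author_countries if len(countries) > 1)
    return international / len(papers_author_countries)

papers = [{"IR"}, {"IR", "DE"}, {"IR", "US", "CN"}, {"IR"}]
print(international_collaboration_share(papers))   # 0.5
```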

Large countries have been disadvantaged compared to small countries in our international metrics, in that it is “easier” for staff and students in small countries to work or study abroad. This has led us to change our normalisation approach for the three measures in 2023, henceforth taking into consideration the population of a country when evaluating these metrics.

A study abroad metric – assessing the provision of international learning opportunities for domestic students – complements the International Outlook pillar, but is currently given a weight of 0%. The zero weight is a temporary provision due to the impact of Covid-19 on international travel. 

Industry: 4%

  • Industry income: 2%
  • Patents: 2%

A university’s ability to help industry with innovations, inventions and consultancy has become a core mission of the contemporary global academy. The industry income metric seeks to capture such knowledge-transfer activity by looking at how much research income an institution earns from industry (adjusted for PPP), scaled against the number of academic staff it employs.

The metric suggests the extent to which businesses are willing to pay for research and a university’s ability to attract funding in the commercial marketplace – useful indicators of institutional quality.

But the extent to which universities are supporting their national economies through technology transfer is an area that deserves greater recognition. The patents metric, introduced in 2023, is defined as the number of patents from any source that cite research conducted by the university.

The data are provided by Elsevier and relate to patents published between 2018 and 2022 (not research published between these dates). Patents are sourced from the World Intellectual Property Organisation, the European Patent Office, and the patent offices of the US, the UK and Japan.

This measure is subject-weighted to avoid penalising universities producing research in fields low in patents, and scaled for institutional size.

Universities can be excluded from the World University Rankings if they do not teach undergraduates, or if their research output amounted to fewer than 1,000 relevant publications between 2018 and 2022 (with a minimum of 150 a year). Universities can also be excluded if 80 per cent or more of their research output is exclusively in one of our 11 subject areas.
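
Those eligibility rules lend themselves to a simple check, sketched below against hypothetical publication counts; the subject labels are invented and the rules are applied exactly as stated above.

```python
# Sketch of the stated eligibility rules: undergraduate teaching, at least
# 1,000 relevant publications over 2018-2022 with a minimum of 150 per year,
# and less than 80 per cent of output in any single subject area.

def is_eligible(teaches_undergraduates, pubs_by_year, pubs_by_subject):
    if not teaches_undergraduates:
        return False
    if sum(pubs_by_year.values()) < 1000:
        return False
    if any(count < 150 for count in pubs_by_year.values()):
        return False
    total = sum(pubs_by_subject.values())
    if max(pubs_by_subject.values()) / total >= 0.8:
        return False
    return True

pubs_by_year = {2018: 180, 2019: 200, 2020: 220, 2021: 210, 2022: 230}                  # hypothetical
pubs_by_subject = {"clinical health": 400, "engineering": 300, "social sciences": 340}  # hypothetical
print(is_eligible(True, pubs_by_year, pubs_by_subject))   # True
```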

Universities at the bottom of the table that are listed as having “reporter” status provided data but did not meet our eligibility criteria to receive a rank.

Data collection

Institutions provide and sign off their institutional data for use in the rankings. On the rare occasions when a particular data point at a subject level is not provided, we use an estimate calculated from the overall data point and any available subject-level data point. If a metric score cannot be calculated because of missing data points, it is imputed using a conservative estimate. By doing this, we avoid penalising an institution too harshly with a “zero” value for data that it overlooks or does not provide, but we do not reward it for withholding them.
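
The article does not say what the conservative estimate is. Purely to illustrate imputing low rather than at zero, the sketch below takes a low percentile of the values that were reported; the choice of percentile is an assumption, not THE's actual rule.

```python
# Illustrative stand-in for a "conservative estimate" when a metric score is
# missing: use a low percentile of the reported values. The percentile and
# the scores are assumptions for illustration only.

def conservative_impute(reported_scores, percentile=0.25):
    ranked = sorted(reported_scores)
    index = int(percentile * (len(ranked) - 1))
    return ranked[index]

scores = [12.0, 30.5, 44.1, 58.9, 61.2, 73.4, 80.0]   # hypothetical reported scores
print(conservative_impute(scores))   # 30.5
```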

Getting to the final result

Moving from a series of specific data points to indicators, and finally to a total score for an institution, requires us to match values that represent fundamentally different data. To do this, we use a standardisation approach for each indicator, and then combine the indicators in the proportions we detail above.

The standardisation approach we use is based on the distribution of data within a particular indicator, where we calculate a cumulative probability function, and evaluate where a particular institution’s indicator sits within that function.

For most metrics, we calculate the cumulative probability function using a version of Z-scoring. The distribution of data in the metrics on teaching reputation, research reputation, research excellence, research influence and patents requires us to use an exponential component.
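
A minimal sketch of that step, assuming the Z-score variant, is given below: a raw indicator value is converted to a Z-score across all institutions, mapped through the standard normal cumulative distribution function and rescaled to 0-100, and the standardised scores are then combined using the weights listed above. The exponential variant used for the five metrics named above is not reproduced, and all raw values and pillar scores are hypothetical.

```python
# Sketch of cumulative-probability standardisation via Z-scoring, followed by
# a weighted combination of pillar scores. All numeric inputs are hypothetical.

import math

def standardise(value, all_values):
    """Map a raw indicator value to a 0-100 cumulative-probability score."""
    mean = sum(all_values) / len(all_values)
    variance = sum((v - mean) ** 2 for v in all_values) / len(all_values)
    z = (value - mean) / math.sqrt(variance)
    cumulative_probability = 0.5 * (1 + math.erf(z / math.sqrt(2)))   # standard normal CDF
    return 100 * cumulative_probability

def overall_score(pillar_scores, pillar_weights):
    """Weighted combination of standardised pillar scores (weights as listed above)."""
    return sum(pillar_scores[name] * pillar_weights[name] for name in pillar_weights)

raw_values = [42.0, 55.0, 61.0, 48.0, 70.0]   # hypothetical raw indicator values
print(standardise(61.0, raw_values))          # roughly 72

pillar_scores = {"teaching": 72.3, "research_environment": 65.0, "research_quality": 80.1,
                 "international_outlook": 90.0, "industry": 55.0}
pillar_weights = {"teaching": 0.295, "research_environment": 0.29, "research_quality": 0.30,
                  "international_outlook": 0.075, "industry": 0.04}
print(overall_score(pillar_scores, pillar_weights))
```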

Related files

The World University Rankings 2024 methodology (PDF)




BITS Pilani invites applications for Professor of Practice positions: Direct link

BITS Pilani has started inviting experienced professionals to apply for Professor of Practice positions, aiming to blend academic learning with industry expertise. This opportunity allows experts to contribute to cutting-edge research and innovative teaching methods.


Birla Institute of Technology & Science (BITS) Pilani is seeking experienced industry professionals for the role of Professor of Practice (PoP). The position is designed to integrate real-world experience with academic learning, enhancing students' practical skills and understanding. Interested and eligible applicants are advised to apply on or before the last date, September 10, 2024, on the official website, bits-pilani.ac.in.

According to BITS Pilani, Professors of Practice will play a crucial role in advancing the institution's research and educational initiatives. Their responsibilities will include developing new facilities and labs, designing innovative courses, collaborating with students and faculty, and driving industry-linked applied research that could lead to publications and patents.

ELIGIBILITY CRITERIA

Candidates must have substantial industry experience and should have held significant positions such as CEO, CTO, Vice President, Principal Scientist, or Senior Research Scientist.

STEPS TO APPLY

Your response will be submitted automatically.

Direct link to apply for the above positions

