Secondary Research: Definition, Methods, and Examples


In the world of research, there are two main types of data sources: primary and secondary. While primary research involves collecting new data directly from individuals or sources, secondary research involves analyzing existing data already collected by someone else. Today we’ll discuss secondary research.

One common source of this research is published research reports and other documents. These materials can often be found in public libraries, on websites, or even as data extracted from previously conducted surveys. In addition, many government and non-government agencies maintain extensive data repositories that can be accessed for research purposes.


While secondary research may not offer the same level of control as primary research, it can be a highly valuable tool for gaining insights and identifying trends. Researchers can save time and resources by leveraging existing data sources while still uncovering important information.

What is Secondary Research: Definition

Secondary research is a research method that involves using already existing data. Existing data is summarized and collated to increase the overall effectiveness of the research.

One of the key advantages of secondary research is that it allows us to gain insights and draw conclusions without having to collect new data ourselves. This can save time and resources and also allow us to build upon existing knowledge and expertise.

When conducting secondary research, it’s important to be thorough and thoughtful in our approach. This means carefully selecting the sources and ensuring that the data we’re analyzing is reliable and relevant to the research question. It also means being critical and analytical in the analysis and recognizing any potential biases or limitations in the data.


Secondary research is much more cost-effective than primary research because it uses data that already exists. In primary research, by contrast, organizations and businesses collect data firsthand or employ a third party to collect it on their behalf.


Secondary Research Methods with Examples

Secondary research is cost-effective, which is one of the reasons it is a popular choice among businesses and organizations; not every organization can afford to spend large sums on conducting research and gathering data. Secondary research is also aptly termed “desk research,” since the data can be retrieved without leaving one’s desk.


The following are popularly used secondary research methods and examples:

1. Data Available on The Internet

The internet is one of the most popular sources of secondary data. Data is readily available online and can be downloaded at the click of a button.

This data is practically free of cost, or may require only a negligible payment to download. Websites hold a great deal of information that businesses and organizations can use to suit their research needs. However, organizations should collect information only from authentic and trusted websites.

2. Government and Non-Government Agencies

Data for secondary research can also be collected from government and non-government agencies. For example, the US Government Printing Office, the US Census Bureau, and Small Business Development Centers hold valuable and relevant data that businesses and organizations can use.

A fee may apply to download or use data from these agencies, but the data obtained is authentic and trustworthy.

3. Public Libraries

Public libraries are another good source of data for this research. They hold copies of important research conducted earlier and are a storehouse of important information and documents from which information can be extracted.

The services provided vary from one library to another. Most often, libraries have large collections of government publications with market statistics, business directories, and newsletters.

4. Educational Institutions

The importance of collecting data from educational institutions for secondary research is often overlooked; however, more research is conducted in colleges and universities than in any other sector.

The data collected by universities is mainly for primary research. However, businesses and organizations can approach educational institutions and request data from them.

5. Commercial Information Sources

Local newspapers, journals, magazines, and radio and TV stations are great sources of data for secondary research. These commercial information sources carry first-hand information on economic developments, political agendas, market research, demographic segmentation, and similar subjects.

Businesses and organizations can request the data most relevant to their study. Through these sources, they can not only identify prospective clients but also learn about avenues for promoting their products or services, since these outlets have a wide reach.

Key Differences between Primary Research and Secondary Research

Understanding the distinction between primary research and secondary research is essential in determining which research method is best for your project. These are the two main types of research methods, each with advantages and disadvantages. In this section, we will explore the critical differences between the two and when it is appropriate to use them.

How to Conduct Secondary Research?

We have already learned about the differences between primary and secondary research. Now, let’s take a closer look at how to conduct it.

Secondary research is an important tool for gathering information already collected and analyzed by others. It can help us save time and money and allow us to gain insights into the subject we are researching. So, in this section, we will discuss some common methods and tips for conducting it effectively.

Here are the steps involved in conducting secondary research:

1. Identify the topic of research: Before beginning secondary research, identify the topic that needs research. Once that’s done, list the research attributes and the purpose of the research.

2. Identify research sources: Next, narrow down the information sources that will provide the most relevant data and information for your research.

3. Collect existing data: Once the data collection sources are narrowed down, check for existing data that is closely related to the topic. Such data can be obtained from sources like newspapers, public libraries, and government and non-government agencies.

4. Combine and compare: Once data is collected, combine and compare it to remove any duplication and assemble it into a usable format (see the sketch after these steps). Make sure to collect data from authentic sources; incorrect data can hamper research severely.

5. Analyze data: Analyze the collected data and check whether all your questions are answered. If not, repeat the process and delve further into actionable insights.
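As a rough illustration of steps 4 and 5, the sketch below combines two already-collected data sets, removes duplicates, and flags disagreements between sources. It is a minimal Python/pandas example; the file names and columns (region, year, population) are hypothetical placeholders, not part of the original article.

```python
# Minimal sketch of step 4 (combine and compare) using pandas.
# File names and column names are hypothetical placeholders.
import pandas as pd

# Load two existing data sets gathered from different secondary sources.
census_extract = pd.read_csv("census_extract.csv")        # assumed columns: region, year, population
library_report = pd.read_csv("library_market_report.csv") # assumed columns: region, year, population

# Combine into a single table and normalize obvious formatting differences.
combined = pd.concat([census_extract, library_report], ignore_index=True)
combined["region"] = combined["region"].str.strip().str.title()

# Drop exact duplicates that appear in both sources.
deduplicated = combined.drop_duplicates(subset=["region", "year"])

# Flag region/year pairs where the two sources disagree, so they can be
# checked against the original (authentic) source before analysis.
conflicts = (
    combined.groupby(["region", "year"])["population"]
    .nunique()
    .loc[lambda counts: counts > 1]
)
print(f"{len(deduplicated)} unique rows, {len(conflicts)} region/year conflicts to review")
```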

Advantages of Secondary Research

Secondary research offers a number of advantages to researchers, including efficiency, the ability to build upon existing knowledge, and the ability to conduct research in situations where primary research may not be possible or ethical. By carefully selecting their sources and being thoughtful in their approach, researchers can leverage secondary research to drive impact and advance the field. Some key advantages are the following:

1. Most of the information used in this research is readily available. There are many sources from which relevant data can be collected and used, unlike primary research, where data needs to be collected from scratch.

2. It is a less expensive and less time-consuming process, as the required data is easily available and doesn’t cost much if extracted from authentic sources. Only minimal expenditure is needed to obtain the data.

3. The data collected through secondary research gives organizations and businesses an idea of the likely effectiveness of primary research. Based on it, they can form a hypothesis and evaluate the cost of conducting primary research.

4. Secondary research is quicker to conduct because of the availability of data. It can be completed within a few weeks, depending on the business objective and the scale of data needed.

As we can see, this research is the process of analyzing data already collected by someone else, and it can offer a number of benefits to researchers.

Disadvantages of Secondary Research

On the other hand, there are some disadvantages to conducting secondary research. Some of the most notable are the following:

1. Although data is readily available, credibility evaluation must be performed to understand the authenticity of the information available.

2. Not all secondary data resources offer the latest reports and statistics. Even when the data is accurate, it may not be recent enough to fit current timelines.

3. Secondary research draws its conclusions from primary research data collected by others. The success of your research will depend, to a great extent, on the quality of the primary research already conducted.


In conclusion, secondary research is an important tool for researchers exploring various topics. By leveraging existing data sources, researchers can save time and resources, build upon existing knowledge, and conduct research in situations where primary research may not be feasible.

There are a variety of methods and examples of secondary research, from analyzing public data sets to reviewing previously published research papers. As students and aspiring researchers, it’s important to understand the benefits and limitations of this research and to approach it thoughtfully and critically. By doing so, we can continue to advance our understanding of the world around us and contribute to meaningful research that positively impacts society.

QuestionPro can be a useful tool for conducting secondary research in a variety of ways. You can create online surveys that target a specific population, collecting data that can be analyzed to gain insights into consumer behavior, attitudes, and preferences; analyze existing data sets that you have obtained through other means or benchmark your organization against others in your industry or against industry standards. The software provides a range of benchmarking tools that can help you compare your performance on key metrics, such as customer satisfaction, with that of your peers.

Using QuestionPro thoughtfully and strategically allows you to gain valuable insights that inform decision-making and drive business success.



Using an existing data set to answer new research questions: a methodological review

  • Affiliation: University of San Francisco, California, USA
  • PMID: 19769213
  • DOI: 10.1891/1541-6577.23.3.203

The vast majority of the research methods literature assumes that the researcher designs the study subsequent to determining research questions. This assumption is not met for the many researchers involved in secondary data analysis. Researchers doing secondary data analysis need not only understand research concepts related to designing a new study, but additionally must be aware of challenges specific to conducting research using an existing data set. Techniques are discussed to determine if secondary data analysis is appropriate. Suggestions are offered on how to best identify, obtain, and evaluate a data set; refine research questions; manage data; calculate power; and report results. Examples from nursing research are provided. If an existing data set is suitable for answering a new research question, then a secondary analysis is preferable since it can be completed in less time, for less money, and with far lower risks to subjects. The researcher must carefully consider if the existing data set's available power and data quality are adequate to answer the proposed research questions.


Secondary research: definition, methods, & examples.

This ultimate guide to secondary research helps you understand changes in market trends, customers’ buying patterns, and your competition using existing data sources.

In situations where you’re not involved in the data-gathering process (primary research), you have to rely on existing information and data to arrive at specific research conclusions or outcomes. This approach is known as secondary research.

In this article, we’re going to explain what secondary research is, how it works, and share some examples of it in practice.


What is secondary research?

Secondary research, also known as desk research, is a research method that involves compiling existing data sourced from a variety of channels. This includes internal sources (e.g. in-house research) or, more commonly, external sources (such as government statistics, organizational bodies, and the internet).

Secondary research comes in several formats, such as published datasets, reports, and survey responses, and can also be sourced from websites, libraries, and museums.

The information is usually free — or available at a limited access cost — and gathered using surveys, telephone interviews, observation, face-to-face interviews, and more.

When using secondary research, researchers collect, verify, and analyze the existing data, then incorporate it to help them confirm their research goals for the research period.

As well as the above, it can be used to review previous research into an area of interest. Researchers can look for patterns across data spanning several years and identify trends — or use it to verify early hypothesis statements and establish whether it’s worth continuing research into a prospective area.

How to conduct secondary research

There are five key steps to conducting secondary research effectively and efficiently:

1.    Identify and define the research topic

First, understand what you will be researching and define the topic by thinking about the research questions you want to be answered.

Ask yourself: What is the point of conducting this research? Then, ask: What do we want to achieve?

This may indicate an exploratory reason (why something happened) or confirm a hypothesis. The answers may indicate ideas that need primary or secondary research (or a combination) to investigate them.

2.    Find research and existing data sources

If secondary research is needed, think about where you might find the information. This helps you narrow down your secondary sources to those that help you answer your questions. What keywords do you need to use?

Which organizations are closely working on this topic already? Are there any competitors that you need to be aware of?

Create a list of the data sources, information, and people that could help you with your work.

3.    Begin searching and collecting the existing data

Now that you have the list of data sources, start accessing the data and collect the information into an organized system. This may mean you start setting up research journal accounts or making telephone calls to book meetings with third-party research teams to verify the details around data results.

As you search and access information, remember to check the data’s date, the credibility of the source, the relevance of the material to your research topic, and the methodology used by the third-party researchers. Start small; as you gain results, investigate further in the areas that support your research aims.
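One lightweight way to keep these checks in view is a simple source log. The sketch below is an illustrative assumption rather than part of the original guide; the fields and the two-year staleness threshold are placeholders you would adapt to your own project.

```python
# Illustrative source log for the checks described above (date, credibility,
# relevance, methodology). Field names and the two-year threshold are assumptions.
from dataclasses import dataclass
from datetime import date

@dataclass
class SecondarySource:
    name: str
    url: str
    published: date
    methodology: str   # brief note on how the third party collected the data
    credibility: str   # e.g. "government", "peer-reviewed", "trade press"

sources = [
    SecondarySource("Census demographic extract", "https://example.gov/census",
                    date(2023, 6, 1), "national survey", "government"),
    SecondarySource("Trade blog market sizing", "https://example.com/post",
                    date(2017, 3, 15), "unstated", "trade press"),
]

# Flag anything older than roughly two years or with an unstated methodology
# for a closer look before it is used in the research.
for src in sources:
    stale = (date.today() - src.published).days > 365 * 2
    if stale or src.methodology == "unstated":
        print(f"Review before use: {src.name} ({src.published}, {src.methodology})")
```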

4.    Combine the data and compare the results

When you have your data in one place, you need to understand, filter, order, and combine it intelligently. Data may come in different formats; some of it may be unusable, while other information may need to be deleted.

After this, you can start to look at different data sets to see what they tell you. You may find that you need to compare the same datasets over different periods for changes over time or compare different datasets to notice overlaps or trends. Ask yourself: What does this data mean to my research? Does it help or hinder my research?
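As a minimal sketch of this step, the Python/pandas snippet below compares the same secondary data set across periods to surface changes over time. The file, columns (year, segment, revenue), and the 10% threshold are assumptions for illustration only.

```python
# Minimal sketch: comparing the same secondary data set across periods to
# spot changes over time. Column names are hypothetical.
import pandas as pd

sales = pd.read_csv("industry_sales_report.csv")  # assumed columns: year, segment, revenue

# Total revenue per segment per year, then year-over-year change per segment.
by_year = (
    sales.groupby(["segment", "year"], as_index=False)["revenue"].sum()
         .sort_values(["segment", "year"])
)
by_year["yoy_change_pct"] = by_year.groupby("segment")["revenue"].pct_change() * 100

# Segments whose latest year-over-year change exceeds 10% may signal a trend
# worth following up with primary research.
latest = by_year.groupby("segment").tail(1)
print(latest[latest["yoy_change_pct"].abs() > 10])
```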

5.    Analyze your data and explore further

In this last stage of the process, look at the information you have and ask yourself if this answers your original questions for your research. Are there any gaps? Do you understand the information you’ve found? If you feel there is more to cover, repeat the steps and delve deeper into the topic so that you can get all the information you need.

If secondary research can’t provide these answers, consider supplementing your results with data gained from primary research. As you explore further, add to your knowledge and update your findings. This will help you present clear, credible information.

Primary vs secondary research

Unlike secondary research, primary research involves creating data first-hand by directly working with interviewees, target users, or a target market. Primary research focuses on the method for carrying out research, asking questions, and collecting data using approaches such as:

  • Interviews (panel, face-to-face or over the phone)
  • Questionnaires or surveys
  • Focus groups

Using these methods, researchers can get in-depth, targeted responses to questions, making results more accurate and specific to their research goals. However, primary research takes time to plan and administer.

Unlike primary research, secondary research uses existing data, which also includes published results from primary research. Researchers summarize the existing research and use the results to support their research goals.

Both primary and secondary research have their places. Primary research can support the findings found through secondary research (and fill knowledge gaps), while secondary research can be a starting point for further primary research. Because of this, these research methods are often combined for optimal research results that are accurate at both the micro and macro level.

Sources of Secondary Research

There are two types of secondary research sources: internal and external. Internal data refers to in-house data that can be gathered from the researcher’s organization. External data refers to data published outside of and not owned by the researcher’s organization.

Internal data

Internal data is a good first port of call for insights and knowledge, as you may already have relevant information stored in your systems. Because you own this information — and it won’t be available to other researchers — it can give you a competitive edge. Examples of internal data include:

  • Database information on sales history and business goal conversions
  • Information from website applications and mobile site data
  • Customer-generated data on product and service efficiency and use
  • Previous research results or supplemental research areas
  • Previous campaign results

External data

External data is useful when you: 1) need information on a new topic, 2) want to fill in gaps in your knowledge, or 3) want data that breaks down a population or market for trend and pattern analysis. Examples of external data include:

  • Government, non-government agencies, and trade body statistics
  • Company reports and research
  • Competitor research
  • Public library collections
  • Textbooks and research journals
  • Media stories in newspapers
  • Online journals and research sites

Three examples of secondary research methods in action

How and why might you conduct secondary research? Let’s look at a few examples:

1.    Collecting factual information from the internet on a specific topic or market

There are plenty of sites that hold data for people to view and use in their research. For example, Google Scholar, ResearchGate, or Wiley Online Library all provide previous research on a particular topic. Researchers can create free accounts and use the search facilities to look into a topic by keyword, before following the instructions to download or export results for further analysis.

This can be useful for exploring a new market that your organization wants to consider entering. For instance, by viewing the U.S. Census Bureau demographic data for that area, you can see what the demographics of your target audience are, and create compelling marketing campaigns accordingly.

2.    Finding out the views of your target audience on a particular topic

If you’re interested in seeing the historical views on a particular topic, for example, attitudes to women’s rights in the US, you can turn to secondary sources.

Textbooks, news articles, reviews, and journal entries can all provide qualitative reports and interviews covering how people discussed women’s rights. There may be multimedia elements like video or documented posters of propaganda showing biased language usage.

By gathering this information, synthesizing it, and evaluating the language, who created it and when it was shared, you can create a timeline of how a topic was discussed over time.

3.    When you want to know the latest thinking on a topic

Educational institutions, such as schools and colleges, create a lot of research-based reports on younger audiences or their academic specialisms. Dissertations from students can also be submitted to research journals, making these journals useful places to see the latest insights from a new generation of academics.

Information can be requested — and sometimes academic institutions may want to collaborate and conduct research on your behalf. This can provide key primary data in areas that you want to research, as well as secondary data sources for your research.

Advantages of secondary research

There are several benefits of using secondary research, which we’ve outlined below:

  • Easily and readily available data – There is an abundance of readily accessible data sources that have been pre-collected for use, in person at local libraries and online using the internet. This data is usually sorted by filters or can be exported into spreadsheet format, meaning that little technical expertise is needed to access and use the data.
  • Faster research speeds – Since the data is already published and in the public arena, you don’t need to collect this information through primary research. This can make the research easier to do and faster, as you can get started with the data quickly.
  • Low financial and time costs – Most secondary data sources can be accessed for free or at a small cost to the researcher, so the overall research costs are kept low. In addition, by saving on preliminary research, the time costs for the researcher are kept down as well.
  • Secondary data can drive additional research actions – The insights gained can support future research activities (like conducting a follow-up survey or specifying future detailed research topics) or help add value to these activities.
  • Secondary data can be useful pre-research insights – Secondary source data can provide pre-research insights and information on effects that can help resolve whether research should be conducted. It can also help highlight knowledge gaps, so subsequent research can consider this.
  • Ability to scale up results – Secondary sources can include large datasets (like Census data results across several states) so research results can be scaled up quickly using large secondary data sources.

Disadvantages of secondary research

The disadvantages of secondary research are worth considering in advance of conducting research:

  • Secondary research data can be out of date – Secondary sources can be updated regularly, but if you’re exploring the data between two updates, the data can be out of date. Researchers will need to consider whether the data available provides the right research coverage dates, so that insights are accurate and timely, or if the data needs to be updated. Also, fast-moving markets may find secondary data expires very quickly.
  • Secondary research needs to be verified and interpreted – Where there’s a lot of data from one source, a researcher needs to review and analyze it. The data may need to be verified against other data sets or your hypotheses for accuracy and to ensure you’re using the right data for your research.
  • The researcher has had no control over the secondary research – As the researcher has not been involved in the secondary research, invalid data can affect the results. It’s therefore vital that the methodology and controls are closely reviewed so that the data is collected in a systematic and error-free way.
  • Secondary research data is not exclusive – As data sets are commonly available, there is no exclusivity and many researchers can use the same data. This can be problematic where researchers want to have exclusive rights over the research results and risk duplication of research in the future.

When do we conduct secondary research?

Now that you know the basics of secondary research, when do researchers normally conduct secondary research?

It’s often used at the beginning of research, when the researcher is trying to understand the current landscape . In addition, if the research area is new to the researcher, it can form crucial background context to help them understand what information exists already. This can plug knowledge gaps, supplement the researcher’s own learning or add to the research.

Secondary research can also be used in conjunction with primary research. Secondary research can become the formative research that helps pinpoint where further primary research is needed to find out specific information. It can also support or verify the findings from primary research.

You can use secondary research where high levels of control aren’t needed by the researcher, but a lot of knowledge on a topic is required from different angles.

Secondary research should not be used in place of primary research as both are very different and are used for various circumstances.

Questions to ask before conducting secondary research

Before you start your secondary research, ask yourself these questions:

  • Is there similar internal data that we have created for a similar area in the past?

If your organization has past research, it’s best to review this work before starting a new project. The older work may provide you with the answers, and give you a starting dataset and context of how your organization approached the research before. However, be mindful that the work is probably out of date and view it with that note in mind. Read through and look for where this helps your research goals or where more work is needed.

  • What am I trying to achieve with this research?

When you have clear goals and understand what you need to achieve, you can look for the right type of secondary or primary research to support those aims. Different secondary research data will provide you with different information; for example, news stories that give you a breakdown of your market’s buying patterns won’t be as useful as internal or external e-commerce and sales data sources.

  • How credible will my research be?

If you are looking for credibility, you want to consider how accurate the research results will need to be, and if you can sacrifice credibility for speed by using secondary sources to get you started. Bear in mind which sources you choose — low-credibility data sites, like political party websites that are highly biased to favor their own party, would skew your results.

  • What is the date of the secondary research?

When you’re looking to conduct research, you want the results to be as useful as possible, so using data that is 10 years old won’t be as accurate as using data that was created a year ago. Since a lot can change in a few years, note the date of your research and look for the most recent data sets, which can give you an up-to-date picture of results. One caveat to this is using data collected over a long-term period for comparisons with earlier periods, which can tell you about the rate and direction of change.

  • Can the data sources be verified? Does the information you have check out?

If you can’t verify the data by looking at the research methodology, speaking to the original team or cross-checking the facts with other research, it could be hard to be sure that the data is accurate. Think about whether you can use another source, or if it’s worth doing some supplementary primary research to replicate and verify results to help with this issue.




Secondary Use of Existing Data

Studies involving the secondary use of existing (archival) data include all of the following:

  • Data that are collected for non-research purposes (e.g., student records) or collected for a research study other than the proposed study (e.g., another study’s data set)
  • The proposed study plans to use the existing data as opposed to gathering new data (or possibly in conjunction with newly gathered data)
  • Data contains information that can be linked to individuals (though not necessarily to the individual’s identity)
  • Data are the primary source (versus a secondary source where the data was analyzed for another publication)
  • While data usually exists prior to the protocol’s approval, there are instances in which the data can continue to accumulate; however, the researcher cannot be engaged in gathering the data. For example, students can continue to add content to their student records that can be accessed by the researcher (with appropriate permission) but the researcher is not engaged in contacting the students directly for this information.

In order for the Board to assess the risks to the participants through the use of existing data sources and make recommendations for ethical use of the data, they will need to know the following:

  • How did you obtain access to the data? The Board will need to know if the data are publicly available or if there are restrictions on accessing the data. If the latter, the Board will need to know how you obtained permission to access the data.
  • What do the data consist of? The Board will need to know if you are using data sets, video tapes, audio tapes, journal entries, transcripts, etc. If you are using data sets, they will need to know what data fields you will use.
  • How many records will you access? Will the data be combined with other data sources? How easy is it to deduce the identities of the participants? The Board needs to understand the complete picture of the data and the potential to deduce identity, which could compromise confidentiality.
  • Can the participants be linked to their data? The Board will need to know in what form you will receive the data. Can the data be de-identified? Are the data linked and stripped of identifiers? Who prepared the data for you? Will you merge multiple data sets? (A minimal de-identification sketch follows this list.)
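The de-identification sketch below is purely illustrative and assumes a tabular data set with hypothetical columns; it is not an IRB-endorsed procedure. Note that removing direct identifiers does not by itself guarantee that participants cannot be re-identified.

```python
# Minimal de-identification sketch (hypothetical column names). Dropping direct
# identifiers and replacing them with a study code does not by itself guarantee
# anonymity; quasi-identifiers such as birth date and zip code may still allow
# re-identification when data sets are combined.
import hashlib
import pandas as pd

records = pd.read_csv("student_records.csv")  # assumed columns: name, email, dob, zip, gpa

def study_code(value: str, salt: str = "project-salt") -> str:
    """Replace an identifier with a one-way hashed study code."""
    return hashlib.sha256((salt + value).encode("utf-8")).hexdigest()[:10]

deidentified = records.assign(code=records["email"].map(study_code)).drop(
    columns=["name", "email"]
)

# Generalize remaining quasi-identifiers before sharing with the analysis team.
deidentified["birth_year"] = pd.to_datetime(deidentified["dob"]).dt.year
deidentified = deidentified.drop(columns=["dob"])
```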

For suggestions on how to create a "Secondary Data" iProtocol, see Creating and Submitting a New iProtocol . 

Exempt studies are not under the same obligation to obtain consent from participants (though the Board often asks researchers to provide information about the study to participants using a Study Notification). The federal regulations allow the Board to exempt research involving the secondary use of existing data if either of the following is true:

  • the identifiable private information is publicly available 
  • the information is recorded by the investigator in such a manner that the identity of the participant cannot be readily ascertained directly or through identifiers linked to the subject, the investigator does not contact the subjects, and the investigator will not re-identify the subjects.

In addition, data sets that specifically target prisoners cannot be exempted. If a prisoner is incidentally included in a data set, the data set can still be exempted.

The Board evaluates the existing data source (i.e. public or private) and whether the data can identify the participants in order to determine exemption. If the protocol qualifies for exemption, the Board does not require researchers to obtain consent from participants. If the protocol does not qualify for exemption, the Board may consider waiving consent, or it may require that the researcher obtain consent from participants.

The data sets listed below do not require IRB review except in the case where the data sets are merged with other data or if the data archive requires IRB review:

  • Inter-University Consortium for Political and Social Research (ICPSR)
  • National Center for Health Statistics
  • National Center for Education Statistics
  • National Election Studies
  • U.S. Bureau of the Census

Additional data sets and archives may qualify for inclusion on this list. Investigators who wish to have a specific data set or data archive considered for inclusion should submit the following information to [email protected]:

  • In the email subject, please add the following: IRB-SBS archival data set review
  • The name of the data set or data archive;
  • The URL for the data set/archive or other specific information on how to obtain the data set; and
  • An abstract that describes the content and potential uses of the data set/archive.

Various government agencies and academic institutions make their data available to the public for research purposes. Any data set that is made available to the public and does not require special permission to access is considered a publicly available data set. Publicly available data sets are exempt.

Private data sets may include (but are not limited to): data collected previously by another researcher for another study, data collected by another agency for evaluative or research purposes, or your own data that you collected for a previous study. Private data sets generally require permission to access the data, and the Board will need to know that you will obtain (or have already obtained) proper permission from the appropriate entity.

Private records are data that were not collected with the intent to conduct research, but instead exist for the purpose of keeping information about individuals for the individuals’ own sake. For example, student records, medical records, and credit histories are private records that are maintained by agencies other than the individual but contain personal information about the individual. Some of these records are collected by government agencies and by law are accessible to the public; these fall under the publicly available data sets category. Private records can be governed by privacy laws and regulations, thus requiring special permission to access the records as well as additional safeguards for using the data. Some researchers may have access to private records as part of their professional role; for example, you may be able to access student records as a professor, but you will still need to obtain permission to access those records as a researcher (particularly because these records are also protected by FERPA regulations). These records can still qualify for exemption if the data are received stripped of identifiers.

Student Records and Classroom Data: Please see Education: Student Records.

Medical Records and HIPAA:

The IRB-SBS does not review studies where a medical record is used; these studies are reviewed by the IRB-HSR. If you have any questions regarding which IRB should review your study, check out the HSR/SBS decision algorithm. If this doesn’t answer your question, please contact our office (or the HSR) before completing our protocol form, as each IRB has separate submission procedures.

Combining data sets can provide interesting insights into behavior and provide rich information for statistical models. However, combining data can also increase the ability to identify individuals in de-identified data sets. From the OHRP website:

“A subset of “big data research” uses ongoing and constantly replenished and revised data systems, with analysis updated in real time as new information becomes available.  In some instances these may be ongoing “longitudinal” studies; may involve Bayesian designs for data collection and analysis; and can involve “adaptive” study designs that change as new information becomes available and is added to the data being analyzed.  Increasingly in the social and behavioral research context, longitudinal data systems link multiple ongoing data streams (e.g., student records, employment, social welfare services, health records, police encounters, arrest records), and these study designs can, over time, create risks of re-identification and misuse that are not present in studies using static data sets.” 
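To make the re-identification risk described above concrete, the sketch below shows how merging a “de-identified” survey extract with a public file on a few quasi-identifiers can single out individuals. The file and column names are hypothetical; the point is the linkage pattern, not any specific data source.

```python
# Illustrative sketch (hypothetical columns): merging two individually
# "de-identified" data sets on quasi-identifiers can single out individuals.
import pandas as pd

survey = pd.read_csv("deidentified_survey.csv")    # assumed: zip, birth_year, gender, income
voter_file = pd.read_csv("public_voter_file.csv")  # assumed: zip, birth_year, gender, name

linked = survey.merge(voter_file, on=["zip", "birth_year", "gender"], how="inner")

# Combinations of quasi-identifiers that match exactly one person in the public
# file effectively re-identify that survey respondent.
match_counts = linked.groupby(["zip", "birth_year", "gender"]).size()
unique_matches = match_counts[match_counts == 1]
print(f"{len(unique_matches)} respondents could be re-identified by this linkage")
```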

The IRB regulations require that researchers obtain IRB approval or exemption prior to collecting any data. The Board cannot retroactively approve the collection of data that falls under our definition of research. However, the regulations recognize that there are instances where data is collected without the intention to conduct human subjects research, and this data could prove to be valuable information in a later study. For example, information collected in a pilot study to test the feasibility of conducting a full study may be viable data to include in the full study. A pilot study doesn’t necessarily qualify as “research” according to the IRB regulations. The same could be true for a class project where data was collected for a brief paper submitted to a professor but later provided necessary information for a full dissertation or thesis project.

This should not be considered a loophole for avoiding IRB review, however. In order to approve the use of this data, the IRB will review the collection of the data and hold it to the same standards required for any collection of data. If the IRB finds that the data was not collected according to our ethical guidelines and regulations, the Board will not allow the data to be used. For example, if you collect sensitive information that can be linked to an individual but the participant did not consent to the collection of this data, the Board may not approve the use of this data because of the manner in which it was collected. To avoid this scenario, we recommend that you contact our office for further guidance regarding data collection. Depending on the project, we may advise that you submit a protocol for a pilot study or class project, which will help you avoid any question about the viability of your data. If you don’t need to submit a protocol at this time, we can provide suggestions and recommendations for collecting your data so that it can be approved at a later date if you decide to use it.

  • Describe research involving the secondary use of existing data by creating a Data Source in the Data Source section.
  • Upload any additional resources that describe the existing data in the Data Source Upload .
  • Upload any files that document permission to access data in the Permissions section.
  • If you have more than one Data Source and the sources are linked, the Associate Data Sources with Data Sources is the section where you can demonstrate and describe this relationship.
  • The Associate Data Sources with Participant Groups is the section where you can demonstrate the relationship between Participant Groups and Data Sources (if you have more than one of both).


What Is Secondary Data? A Complete Guide


What is secondary data, and why is it important? Find out in this post.

Within data analytics, there are many ways of categorizing data. A common distinction, for instance, is that between qualitative and quantitative data . In addition, you might also distinguish your data based on factors like sensitivity. For example, is it publicly available or is it highly confidential?  

Probably the most fundamental distinction between different types of data is their source. Namely, are they primary, secondary, or third-party data? Each of these vital data sources supports the data analytics process in its own way. In this post, we’ll focus specifically on secondary data. We’ll look at its main characteristics, provide some examples, and highlight the main pros and cons of using secondary data in your analysis.  

We’ll cover the following topics:  

  • What is secondary data?
  • What’s the difference between primary, secondary, and third-party data?
  • What are some examples of secondary data?
  • How to analyse secondary data
  • Advantages of secondary data
  • Disadvantages of secondary data
  • Wrap-up and further reading

Ready to learn all about secondary data? Then let’s go.

1. What is secondary data?

Secondary data (also known as second-party data) refers to any dataset collected by any person other than the one using it.  

Secondary data sources are extremely useful. They allow researchers and data analysts to build large, high-quality databases that help solve business problems. By expanding their datasets with secondary data, analysts can enhance the quality and accuracy of their insights. Most secondary data comes from external organizations. However, secondary data also refers to that collected within an organization and then repurposed.

Secondary data has various benefits and drawbacks, which we’ll explore in detail in section four. First, though, it’s essential to contextualize secondary data by understanding its relationship to two other sources of data: primary and third-party data. We’ll look at these next.

2. What’s the difference between primary, secondary, and third-party data?

To best understand secondary data, we need to know how it relates to the other main data sources: primary and third-party data.

What is primary data?

‘Primary data’ (also known as first-party data) are those directly collected or obtained by the organization or individual that intends to use them. Primary data are always collected for a specific purpose. This could be to inform a defined goal or objective or to address a particular business problem. 

For example, a real estate organization might want to analyze current housing market trends. This might involve conducting interviews, collecting facts and figures through surveys and focus groups, or capturing data via electronic forms. Focusing only on the data required to complete the task at hand ensures that primary data remain highly relevant. They’re also well-structured and of high quality.

As explained, ‘secondary data’ describes those collected for a purpose other than the task at hand. Secondary data can come from within an organization but more commonly originate from an external source. If it helps to make the distinction, secondary data is essentially just another organization’s primary data. 

Secondary data sources are so numerous that they’ve started playing an increasingly vital role in research and analytics. They are easier to source than primary data and can be repurposed to solve many different problems. While secondary data may be less relevant for a given task than primary data, they are generally still well-structured and highly reliable.

What is third-party data?

‘Third-party data’ (sometimes referred to as tertiary data) refers to data collected and aggregated from numerous discrete sources by third-party organizations. Because third-party data combine data from numerous sources and aren’t collected with a specific goal in mind, the quality can be lower. 

Third-party data also tend to be largely unstructured. This means that they’re often beset by errors, duplicates, and so on, and require more processing to get them into a usable format. Nevertheless, used appropriately, third-party data are still a useful data analytics resource. You can learn more about structured vs unstructured data here . 

OK, now that we’ve placed secondary data in context, let’s explore some common sources and types of secondary data.

3. What are some examples of secondary data?

External secondary data

Before we get to examples of secondary data, we first need to understand the types of organizations that generally provide them. Frequent sources of secondary data include:  

  • Government departments
  • Public sector organizations
  • Industry associations
  • Trade and industry bodies
  • Educational institutions
  • Private companies
  • Market research providers

While all these organizations provide secondary data, government sources are perhaps the most freely accessible. They are legally obliged to keep records when registering people, providing services, and so on. This type of secondary data is known as administrative data. It’s especially useful for creating detailed segment profiles, where analysts home in on a particular region, trend, market, or other demographic (a minimal loading sketch follows the list of examples below).

Types of secondary data vary. Popular examples of secondary data include:

  • Tax records and social security data
  • Census data (the U.S. Census Bureau is oft-referenced, as well as our favorite, the U.S. Bureau of Labor Statistics )
  • Electoral statistics
  • Health records
  • Books, journals, or other print media
  • Social media monitoring, internet searches, and other online data
  • Sales figures or other reports from third-party companies
  • Libraries and electronic filing systems
  • App data, e.g. location data, GPS data, timestamp data, etc.
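As a minimal illustration of working with administrative data such as census extracts, the sketch below loads a published file into pandas and builds a quick segment profile. The file name, columns, and the state filter are placeholders rather than a reference to a real data product.

```python
# Minimal sketch: loading a published (secondary) data set and profiling a
# segment. The file name and columns are placeholders, not a real data source.
import pandas as pd

population = pd.read_csv("acs_county_estimates.csv")  # assumed: state, county, age_group, count

# Build a quick segment profile for one state from the administrative data.
segment = (
    population[population["state"] == "Oregon"]
    .groupby("age_group", as_index=False)["count"].sum()
    .assign(share=lambda df: df["count"] / df["count"].sum())
)
print(segment.sort_values("share", ascending=False))
```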

Internal secondary data 

As mentioned, secondary data is not limited to that from a different organization. It can also come from within an organization itself.  

Sources of internal secondary data might include:

  • Sales reports
  • Annual accounts
  • Quarterly sales figures
  • Customer relationship management systems
  • Emails and metadata
  • Website cookies

In the right context, we can define practically any type of data as secondary data. The key takeaway is that the term ‘secondary data’ doesn’t refer to any inherent quality of the data themselves, but to how they are used. Any data source (external or internal) used for a task other than that for which it was originally collected can be described as secondary data.

4. How to analyse secondary data

The process of analysing secondary data can be performed either quantitatively or qualitatively, depending on the kind of data the researcher is dealing with. The quantitative method is used on numerical data, which is analyzed mathematically; the qualitative method uses words to provide in-depth information about the data.

There are different stages of secondary data analysis, which involve events before, during, and after data collection. These stages include:

  • Statement of purpose: Before collecting secondary data, you need to know your statement of purpose. This means you should have a clear awareness of the goal of the research work and how this data will help achieve it. This will guide you in collecting the right data and then choosing the best data source and method of analysis.
  • Research design: This is a plan for how the research activities will be carried out. It describes the kind of data to be collected, the sources of data collection, the method of data collection, the tools used, and the method of analysis. Once the purpose of the research has been identified, the researcher should design a research process that will guide the data analysis.
  • Developing the research questions: Once you’ve identified the research purpose, you should also prepare research questions to help identify relevant secondary data. For example, a researcher looking to learn why working adults are increasingly interested in the “gig economy” as opposed to full-time work might ask, “What are the main factors that influence adults’ decisions to engage in freelance work?” or, “Does education level have an effect on how people engage in freelance work?”
  • Identifying secondary data: Using the research questions as a guide, researchers then begin to identify relevant data from the sources available. If the kind of data to be collected is qualitative, for example, a researcher can filter their search to qualitative sources.
  • Evaluating secondary data: Once relevant data has been identified and collated, it is evaluated to ensure it fulfills the criteria of the research topic. It is then analyzed using either the quantitative or the qualitative method, depending on the type of data (a minimal quantitative sketch follows this list).
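Continuing the gig-economy example, the sketch below shows what the quantitative evaluation and analysis stages might look like in Python/pandas. The file and column names are hypothetical assumptions, not part of the original post.

```python
# Minimal quantitative sketch for the evaluation/analysis stages, using the
# gig-economy example above. The file and column names are hypothetical.
import pandas as pd

freelance = pd.read_csv("labour_force_extract.csv")  # assumed: education, does_freelance (0/1), age

# Descriptive statistics for the numeric columns.
print(freelance[["age", "does_freelance"]].describe())

# Does education level have an effect on how people engage in freelance work?
# A simple cross-tabulation of freelancing rates by education level.
rate_by_education = (
    freelance.groupby("education")["does_freelance"].mean().sort_values(ascending=False)
)
print(rate_by_education)
```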

You can learn more about secondary data analysis in this post .  

5. Advantages of secondary data

Secondary data is suitable for any number of analytics activities. The only limitation is a dataset’s format, structure, and whether or not it relates to the topic or problem at hand. 

When analyzing secondary data, the process has some minor differences, mainly in the preparation phase. Otherwise, it follows much the same path as any traditional data analytics project. 

More broadly, though, what are the advantages and disadvantages of using secondary data? Let’s take a look.

Advantages of using secondary data

It’s an economic use of time and resources: Because secondary data have already been collected, cleaned, and stored, this saves analysts much of the hard work that comes from collecting these data firsthand. For instance, for qualitative data, the complex tasks of deciding on appropriate research questions or how best to record the answers have already been completed. Secondary data saves data analysts and data scientists from having to start from scratch.  

It provides a unique, detailed picture of a population: Certain types of secondary data, especially government administrative data, can provide access to levels of detail that it would otherwise be extremely difficult (or impossible) for organizations to collect on their own. Data from public sources, for instance, can provide organizations and individuals with a far greater level of population detail than they could ever hope to gather in-house. You can also obtain data over larger intervals if you need it; for example, stock market data provides decades’ worth of information.

Secondary data can build useful relationships: Acquiring secondary data usually involves making connections with organizations and analysts in fields that share some common ground with your own. This opens the door to a cross-pollination of disciplinary knowledge. You never know what nuggets of information or additional data resources you might find by building these relationships.

Secondary data tend to be high-quality: Unlike some data sources, e.g. third-party data, secondary data tends to be in excellent shape. In general, secondary datasets have already been validated and therefore require minimal checking. Often, such as in the case of government data, datasets are also gathered and quality-assured by organizations with much more time and resources available. This further benefits the data quality , while benefiting smaller organizations that don’t have endless resources available.

It’s excellent for both data enrichment and informing primary data collection: Another benefit of secondary data is that they can be used to enhance and expand existing datasets. Secondary data can also inform primary data collection strategies. They can provide analysts or researchers with initial insights into the type of data they might want to collect themselves further down the line.

6. Disadvantages of secondary data

They aren’t always free: Sometimes, it’s unavoidable—you may have to pay for access to secondary data. However, while this can be a financial burden, in reality, the cost of purchasing a secondary dataset is usually far lower than the cost of having to plan for and collect the data firsthand.

The data isn’t always suited to the problem at hand: While secondary data may tick many boxes concerning its relevance to a business problem, this is not always true. For instance, secondary data collection might have been in a geographical location or time period ill-suited to your analysis. Because analysts were not present when the data were initially collected, this may also limit the insights they can extract.

The data may not be in the preferred format: Even when a dataset provides the necessary information, that doesn’t mean it’s appropriately stored. A basic example: numbers might be stored as categorical data rather than numerical data. Another issue is that there may be gaps in the data. Categories that are too vague may limit the information you can glean. For instance, a dataset of people’s hair color that is limited to ‘brown, blonde and other’ will tell you very little about people with auburn, black, white, or gray hair.  
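As an illustration of these format problems, the following minimal Python sketch shows two common clean-up steps: converting numbers stored as text into a numeric type, and flagging gaps and vague categories. The column names and values are invented for illustration.

```python
# A minimal sketch of two common clean-up steps for secondary data, assuming a
# hypothetical DataFrame with an "income" column stored as text and a sparsely
# coded "hair_color" column.
import pandas as pd

df = pd.DataFrame({
    "income": ["42,000", "55,500", None, "38,250"],      # numbers stored as text
    "hair_color": ["brown", "other", "blonde", "other"], # vague category coding
})

# Convert text-encoded numbers to a numeric type; unparseable values become NaN.
df["income"] = pd.to_numeric(df["income"].str.replace(",", ""), errors="coerce")

# Make gaps explicit so they can be reported or imputed rather than ignored.
print(df["income"].isna().sum(), "missing income values")

# Vague categories ("other") can only be flagged, not recovered.
print((df["hair_color"] == "other").mean(), "share of records with uninformative coding")
```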

You can’t be sure how the data were collected: A structured, well-ordered secondary dataset may appear to be in good shape. However, it’s not always possible to know what issues might have occurred during data collection that will impact their quality. For instance, poor response rates will provide a limited view. While issues relating to data collection are sometimes made available alongside the datasets (e.g. for government data) this isn’t always the case. You should therefore treat secondary data with a reasonable degree of caution.

Being aware of these disadvantages is the first step towards mitigating them. While you should be aware of the risks associated with using secondary datasets, in general, the benefits far outweigh the drawbacks.

7. Wrap-up and further reading

In this post we’ve explored secondary data in detail. As we’ve seen, it’s not so different from other forms of data. What defines data as secondary data is how it is used rather than an inherent characteristic of the data themselves. 

To learn more about data analytics, check out this free, five-day introductory data analytics short course . You can also check out these articles to learn more about the data analytics process:

  • What is data cleaning and why is it important?
  • What is data visualization? A complete introductory guide
  • 10 Great places to find free datasets for your next project


Research Data – Types, Methods and Examples


Research Data

Research data refers to any information or evidence gathered through systematic investigation or experimentation to support or refute a hypothesis or answer a research question.

It includes both primary and secondary data, and can be in various formats such as numerical, textual, audiovisual, or visual. Research data plays a critical role in scientific inquiry and is often subject to rigorous analysis, interpretation, and dissemination to advance knowledge and inform decision-making.

Types of Research Data

There are generally four types of research data:

Quantitative Data

This type of data involves the collection and analysis of numerical data. It is often gathered through surveys, experiments, or other types of structured data collection methods. Quantitative data can be analyzed using statistical techniques to identify patterns or relationships in the data.

Qualitative Data

This type of data is non-numerical and often involves the collection and analysis of words, images, or sounds. It is often gathered through methods such as interviews, focus groups, or observation. Qualitative data can be analyzed using techniques such as content analysis, thematic analysis, or discourse analysis.

Primary Data

This type of data is collected by the researcher directly from the source. It can include data gathered through surveys, experiments, interviews, or observation. Primary data is often used to answer specific research questions or to test hypotheses.

Secondary Data

This type of data is collected by someone other than the researcher. It can include data from sources such as government reports, academic journals, or industry publications. Secondary data is often used to supplement or support primary data or to provide context for a research project.

Research Data Formats

There are several formats in which research data can be collected and stored. Some common formats include:

  • Text : This format includes any type of written data, such as interview transcripts, survey responses, or open-ended questionnaire answers.
  • Numeric : This format includes any data that can be expressed as numerical values, such as measurements or counts.
  • Audio : This format includes any recorded data in an audio form, such as interviews or focus group discussions.
  • Video : This format includes any recorded data in a video form, such as observations of behavior or experimental procedures.
  • Images : This format includes any visual data, such as photographs, drawings, or scans of documents.
  • Mixed media: This format includes any combination of the above formats, such as a survey response that includes both text and numeric data, or an observation study that includes both video and audio recordings.
  • Sensor Data: This format includes data collected from various sensors or devices, such as GPS, accelerometers, or heart rate monitors.
  • Social Media Data: This format includes data collected from social media platforms, such as tweets, posts, or comments.
  • Geographic Information System (GIS) Data: This format includes data with a spatial component, such as maps or satellite imagery.
  • Machine-Readable Data : This format includes data that can be read and processed by machines, such as data in XML or JSON format.
  • Metadata: This format includes data that describes other data, such as information about the source, format, or content of a dataset (see the sketch after this list).
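To illustrate the machine-readable and metadata formats listed above, here is a minimal Python sketch that writes and reads a metadata record as JSON. The dataset description shown is hypothetical.

```python
# A minimal sketch of machine-readable metadata using Python's built-in json
# module. The dataset description below is hypothetical.
import json

metadata = {
    "title": "Regional Household Travel Survey",
    "source": "Hypothetical Department of Transportation",
    "format": "CSV",
    "collected": "2021-03",
    "variables": ["household_id", "trip_mode", "trip_distance_km"],
    "license": "Open Data Commons",
}

# Serialize the metadata record so other machines (and researchers) can parse it.
with open("travel_survey_metadata.json", "w") as f:
    json.dump(metadata, f, indent=2)

# Reading it back illustrates why machine-readable formats such as JSON or XML
# make research data easier to discover and reuse.
with open("travel_survey_metadata.json") as f:
    record = json.load(f)
print(record["title"], "-", record["collected"])
```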

Data Collection Methods

Some common research data collection methods include:

  • Surveys : Surveys involve asking participants to answer a series of questions about a particular topic. Surveys can be conducted online, over the phone, or in person.
  • Interviews : Interviews involve asking participants a series of open-ended questions in order to gather detailed information about their experiences or perspectives. Interviews can be conducted in person, over the phone, or via video conferencing.
  • Focus groups: Focus groups involve bringing together a small group of participants to discuss a particular topic or issue in depth. The group is typically led by a moderator who asks questions and encourages discussion among the participants.
  • Observations : Observations involve watching and recording behaviors or events as they naturally occur. Observations can be conducted in person or through the use of video or audio recordings.
  • Experiments : Experiments involve manipulating one or more variables in order to measure the effect on an outcome of interest. Experiments can be conducted in a laboratory or in the field.
  • Case studies: Case studies involve conducting an in-depth analysis of a particular individual, group, or organization. Case studies typically involve gathering data from multiple sources, including interviews, observations, and document analysis.
  • Secondary data analysis: Secondary data analysis involves analyzing existing data that was collected for another purpose. Examples of secondary data sources include government records, academic research studies, and market research reports.

Analysis Methods

Some common research data analysis methods include:

  • Descriptive statistics: Descriptive statistics involve summarizing and describing the main features of a dataset, such as the mean, median, and standard deviation. Descriptive statistics are often used to provide an initial overview of the data.
  • Inferential statistics: Inferential statistics involve using statistical techniques to draw conclusions about a population based on a sample of data. Inferential statistics are often used to test hypotheses and determine the statistical significance of relationships between variables (see the code sketch after this list).
  • Content analysis : Content analysis involves analyzing the content of text, audio, or video data to identify patterns, themes, or other meaningful features. Content analysis is often used in qualitative research to analyze open-ended survey responses, interviews, or other types of text data.
  • Discourse analysis: Discourse analysis involves analyzing the language used in text, audio, or video data to understand how meaning is constructed and communicated. Discourse analysis is often used in qualitative research to analyze interviews, focus group discussions, or other types of text data.
  • Grounded theory : Grounded theory involves developing a theory or model based on an analysis of qualitative data. Grounded theory is often used in exploratory research to generate new insights and hypotheses.
  • Network analysis: Network analysis involves analyzing the relationships between entities, such as individuals or organizations, in a network. Network analysis is often used in social network analysis to understand the structure and dynamics of social networks.
  • Structural equation modeling: Structural equation modeling involves using statistical techniques to test complex models that include multiple variables and relationships. Structural equation modeling is often used in social science research to test theories about the relationships between variables.
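As a simple illustration of the first two methods above, the following Python sketch computes descriptive statistics for two groups and runs an independent-samples t-test as an inferential check. The groups and scores are invented for illustration, and the appropriate test in practice depends on the study design.

```python
# A minimal sketch of descriptive and inferential statistics on hypothetical data,
# using pandas and SciPy. The two groups and their scores are invented.
import pandas as pd
from scipy import stats

scores = pd.DataFrame({
    "group": ["control"] * 5 + ["treatment"] * 5,
    "score": [62, 70, 68, 65, 71, 74, 79, 77, 72, 80],
})

# Descriptive statistics: summarize each group's central tendency and spread.
print(scores.groupby("group")["score"].agg(["mean", "median", "std"]))

# Inferential statistics: test whether the group means differ more than chance
# alone would suggest (independent-samples t-test).
control = scores.loc[scores["group"] == "control", "score"]
treatment = scores.loc[scores["group"] == "treatment", "score"]
t_stat, p_value = stats.ttest_ind(treatment, control)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```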

Purpose of Research Data

Research data serves several important purposes, including:

  • Supporting scientific discoveries : Research data provides the basis for scientific discoveries and innovations. Researchers use data to test hypotheses, develop new theories, and advance scientific knowledge in their field.
  • Validating research findings: Research data provides the evidence necessary to validate research findings. By analyzing and interpreting data, researchers can determine the statistical significance of relationships between variables and draw conclusions about the research question.
  • Informing policy decisions: Research data can be used to inform policy decisions by providing evidence about the effectiveness of different policies or interventions. Policymakers can use data to make informed decisions about how to allocate resources and address social or economic challenges.
  • Promoting transparency and accountability: Research data promotes transparency and accountability by allowing other researchers to verify and replicate research findings. Data sharing also promotes transparency by allowing others to examine the methods used to collect and analyze data.
  • Supporting education and training: Research data can be used to support education and training by providing examples of research methods, data analysis techniques, and research findings. Students and researchers can use data to learn new research skills and to develop their own research projects.

Applications of Research Data

Research data has numerous applications across various fields, including social sciences, natural sciences, engineering, and health sciences. The applications of research data can be broadly classified into the following categories:

  • Academic research: Research data is widely used in academic research to test hypotheses, develop new theories, and advance scientific knowledge. Researchers use data to explore complex relationships between variables, identify patterns, and make predictions.
  • Business and industry: Research data is used in business and industry to make informed decisions about product development, marketing, and customer engagement. Data analysis techniques such as market research, customer analytics, and financial analysis are widely used to gain insights and inform strategic decision-making.
  • Healthcare: Research data is used in healthcare to improve patient outcomes, develop new treatments, and identify health risks. Researchers use data to analyze health trends, track disease outbreaks, and develop evidence-based treatment protocols.
  • Education : Research data is used in education to improve teaching and learning outcomes. Data analysis techniques such as assessments, surveys, and evaluations are used to measure student progress, evaluate program effectiveness, and inform policy decisions.
  • Government and public policy: Research data is used in government and public policy to inform decision-making and policy development. Data analysis techniques such as demographic analysis, cost-benefit analysis, and impact evaluation are widely used to evaluate policy effectiveness, identify social or economic challenges, and develop evidence-based policy solutions.
  • Environmental management: Research data is used in environmental management to monitor environmental conditions, track changes, and identify emerging threats. Data analysis techniques such as spatial analysis, remote sensing, and modeling are used to map environmental features, monitor ecosystem health, and inform policy decisions.

Advantages of Research Data

Research data has numerous advantages, including:

  • Empirical evidence: Research data provides empirical evidence that can be used to support or refute theories, test hypotheses, and inform decision-making. This evidence-based approach helps to ensure that decisions are based on objective, measurable data rather than subjective opinions or assumptions.
  • Accuracy and reliability : Research data is typically collected using rigorous scientific methods and protocols, which helps to ensure its accuracy and reliability. Data can be validated and verified using statistical methods, which further enhances its credibility.
  • Replicability: Research data can be replicated and validated by other researchers, which helps to promote transparency and accountability in research. By making data available for others to analyze and interpret, researchers can ensure that their findings are robust and reliable.
  • Insights and discoveries : Research data can provide insights into complex relationships between variables, identify patterns and trends, and reveal new discoveries. These insights can lead to the development of new theories, treatments, and interventions that can improve outcomes in various fields.
  • Informed decision-making: Research data can inform decision-making in a range of fields, including healthcare, business, education, and public policy. Data analysis techniques can be used to identify trends, evaluate the effectiveness of interventions, and inform policy decisions.
  • Efficiency and cost-effectiveness: Research data can help to improve efficiency and cost-effectiveness by identifying areas where resources can be directed most effectively. By using data to identify the most promising approaches or interventions, researchers can optimize the use of resources and improve outcomes.

Limitations of Research Data

Research data has several limitations that researchers should be aware of, including:

  • Bias and subjectivity: Research data can be influenced by biases and subjectivity, which can affect the accuracy and reliability of the data. Researchers must take steps to minimize bias and subjectivity in data collection and analysis.
  • Incomplete data : Research data can be incomplete or missing, which can affect the validity of the findings. Researchers must ensure that data is complete and representative to ensure that their findings are reliable.
  • Limited scope: Research data may be limited in scope, which can limit the generalizability of the findings. Researchers must carefully consider the scope of their research and ensure that their findings are applicable to the broader population.
  • Data quality: Research data can be affected by issues such as measurement error, data entry errors, and missing data, which can affect the quality of the data. Researchers must ensure that data is collected and analyzed using rigorous methods to minimize these issues.
  • Ethical concerns: Research data can raise ethical concerns, particularly when it involves human subjects. Researchers must ensure that their research complies with ethical standards and protects the rights and privacy of human subjects.
  • Data security: Research data must be protected to prevent unauthorized access or use. Researchers must ensure that data is stored and transmitted securely to protect the confidentiality and integrity of the data.


Data Module #3 - Finding & Collecting Data for Your Research


Using Existing Data in Your Research


When choosing existing data, it is vital that you understand how the data was collected. For example, if you use polling data, you may want to know the collection method, sample size, and demographics of the people surveyed. Sometimes, term definitions change over time, making comparisons difficult. For example, the United States federal government's definition of unemployment has changed more than once during the time it has collected that data. To find information about your data, look for metadata and documentation accompanying it. You can also look at other studies that have used the data to find possible critiques and limitations.

Exploring Potential Data Sources

Where can you find existing data available to use in your research? Is the best answer to "just use Google"? Sometimes! Google is a great resource to use. There are a lot of websites that make data available. Often, these sites include their own tools for finding data that allow you to be more focused in your searching. Think about who has a stake in providing the data you need or is an advocate for the topic. Check to see if they collect or publish any data that might be helpful to your research.


Look for research publications (books, articles, websites, etc.) on your topic to discover what data sources other researchers have used. Their data may be just what you need. If the data are not easily obtainable, either in the publication or elsewhere, try contacting the researcher directly.


Governments all over the world collect lots of data. The United States government, along with many state and local agencies, provides open (free) data. Identify the government agencies that have a stake in tracking or regulating the topic of your research and check to see what data they make available.


There are a wide range of international organizations, non-profit research centers, foundations, trade associations, and advocacy groups that collect data and make it available. Check to see if there is an organization that focuses on your research topic.  


Data repositories are curated spaces for storing research data. Contributors may include individual researchers, organizations, and government agencies. The benefits of using a repository are that the data are findable, reusable, citable, and preserved. There are several general and subject-specific data repositories. Look for the data repositories available in your broad subject area.

Finding Data on the Web

The more you know about the data you are looking for, the easier searching will be: for instance, who produced the data, who published it, whether it has a title (such as the American Community Survey), and when it was created. Basically, more information is good! Here are some tips for searching the web for data:

  • Use keywords.   Try words from the title of the data, the agency that produced it, etc. For example, "American community survey census bureau."  
  • Use the word "data" in your search. For example, you want data on automobile thefts in Minnesota. You might search for: "automobile thefts Minnesota data."  This tends to focus the results to actual data versus description of an issue or subject.  
  • Broaden your search. If you don’t get good results, try broadening your search terms. Instead of "automobile thefts..." try "crime theft Minnesota data."   
  • Try synonyms or related terms. For example, “automobile theft data” doesn’t work as well as "motor vehicle theft data." Pay attention to different ways your topic is described in order to find additional search terms.

Some types of data often not found freely online:

  • Older data (pre-2000).
  • Proprietary data (company-based, anything that people will pay for).


Research Using Existing or Secondary Data

Research that involves the use of existing data, documents, records, or specimens from living individuals usually must be reviewed by the IRB in advance of the investigator receiving or analyzing the data. If the data contain individual identifiers, the research may be eligible for an expedited review. If the data are recorded so that participants cannot be identified, either directly or through identifiers linked to the subject, the research may be reviewed by the IRB through exempt procedures. For procedural guidance on these types of projects, see  Which Projects Require Review  for a discussion of direct versus indirect identifiers as well as which secondary data projects require review (most do).

Consent issues require special attention in secondary data projects. Ordinarily, when a person uses data collected by someone else for another purpose, the consent of the participants must be sought again. For example, if researcher A has interviewed a number of persons for project A, the interview cannot be released later to researcher B for project B. The participant who consented for his or her data to be used in project A might disapprove heartily of project B and might not have provided consent for the latter activity. The wording of the original consent form is critical. If a participant consented to allow his or her blood sample to be available to persons studying blood diseases, his or her sample could be shared with many researchers without additional consent. The original researchers who received consent can re-work the data without new consent provided it is for a related purpose and the original consent form informed participants of this possibility. 

Having said that, if the data from the original research are truly anonymous or the data are pooled in a form ensuring anonymity, then consent for secondary use may be waived by the IRB. It is also possible to request that the IRB waive consent when it will be impossible or highly impractical to go back and obtain consent. Issues pertaining to waivers of consent and documentation of consent are further discussed in the page on  Consent .

Publicly-available datasets are defined by the IRB as data that have already been compiled and are now available in a library or on the internet, and no permissions are required to access them. Privately-held datasets are those for which some type of permission is necessary to access the data and studies involving these typically require IRB review. (The IRB has made an exception for a few well-known, commonly used repositories of secondary data, such as the InterUniversity Consortium for Political and Social Research-ICPSR.) Research involving data that is both completely de-identified and publicly available without permission or application is not reviewed by the IRB.  

Studies using only existing or secondary data must be submitted for IRB review using the Cayuse IRB-Human Ethics application system. More information on this system can be found by visiting Cayuse IRB.

Human Research Protection Program and Institutional Review Board


How do I describe the use of pre-existing data in the application materials?

Initial Application

Risk/Benefit Assessment for adults and/or children: Evaluate the risk to the participants within the data set.

Participant age range: Age range of participants within the data set. If the exact age range is unknown, please indicate whether participants will be adults and/or children.

Target population(s): Specify whether any of the populations are excluded or targeted. If they will not be excluded or targeted, select permitted.

Description of research: Identify the purpose for which the data was collected.

External Research or Recruitment Site(s): If the data was collected at an external site, include information in this section. If the data originated from another institution or organization, documentation of permission for use of the data from the appropriate authority may be required (e.g., data use agreement).

Subject Population: A description of the characteristics of the participants whose data will be included. A total enrollment number is not required for studies of existing datasets.

Consent Process:

  • Waiving consent: If the study meets the criteria for a waiver of consent [ §46.116(d) ]: An explanation as to why it would be impossible to obtain consent from everyone in the data set or how obtaining consent would increase the risks to those individuals.
  • Obtaining consent: Describe how consent will be obtained for the new use of the data. This section should not explain how consent was obtained for the original data collection, but should indicate whether a reasonable participant would find that the current project was described to them in the original consent language.

Assent Process: The above instructions for addressing the issue of consent apply.

Methods and Procedures: Describe the methods used for the collection and analysis of the pre-existing data, not the original data. This section should not describe the methods for the original data collection, but should instead include a brief explanation of the purpose of the original collection (e.g. the data was originally collected as part of a class project; medical record; Medicare prescription drug surveillance; etc.).

Risks and Benefits: Describe the risks and benefits of the new analysis, not the original data collection.

Consent Document, when applicable

Purpose: Identify the purpose of the secondary data analysis.

Activities and Time: Clarify that participating will involve no additional time, as the data was previously collected. This section should briefly identify the purpose of the original data collection. Sample sentence: We are asking your permission to use the information collected as part of your class project in a new research study. Your participation would be limited to providing permission for the use of the existing data; it will not involve additional time on your part.

Risks: Identify the risks of the secondary data analysis, not the original data collection procedures. The risk of a breach of confidentiality may apply if individual identifiers exist.

Benefits: Identify the benefits of the secondary data analysis, not the original data collection procedures.

Recruitment Materials, when applicable

There are no unique requirements for the recruitment materials. However, please consider using a statement similar to the following: We are asking you to give permission for us to use your data collected for ____ in our research project. Participation will involve no additional time.


  • Open access
  • Published: 19 April 2024

A scoping review of continuous quality improvement in healthcare system: conceptualization, models and tools, barriers and facilitators, and impact

  • Aklilu Endalamaw 1 , 2 ,
  • Resham B Khatri 1 , 3 ,
  • Tesfaye Setegn Mengistu 1 , 2 ,
  • Daniel Erku 1 , 4 , 5 ,
  • Eskinder Wolka 6 ,
  • Anteneh Zewdie 6 &
  • Yibeltal Assefa 1  

BMC Health Services Research volume 24, Article number: 487 (2024)


The growing adoption of continuous quality improvement (CQI) initiatives in healthcare has generated a surge in research interest to gain a deeper understanding of CQI. However, comprehensive evidence regarding the diverse facets of CQI in healthcare has been limited. Our review sought to comprehensively grasp the conceptualization and principles of CQI, explore existing models and tools, analyze barriers and facilitators, and investigate its overall impacts.

This qualitative scoping review was conducted using Arksey and O’Malley’s methodological framework. We searched articles in PubMed, Web of Science, Scopus, and EMBASE databases. In addition, we accessed articles from Google Scholar. We used mixed-method analysis, including qualitative content analysis and quantitative descriptive analysis, to summarize the findings, and the PRISMA extension for scoping reviews (PRISMA-ScR) framework to report the overall work.

A total of 87 articles, which covered 14 CQI models, were included in the review. While 19 tools were used across CQI models and initiatives, the Plan-Do-Study/Check-Act cycle was the most commonly employed model for understanding the CQI implementation process. The main reported purposes of using CQI, reflecting its positive impact, are to improve the structure of the health system (e.g., leadership, health workforce, health technology use, supplies, and costs), enhance healthcare delivery processes and outputs (e.g., care coordination and linkages, satisfaction, accessibility, continuity of care, safety, and efficiency), and improve treatment outcomes (reduced morbidity and mortality). The implementation of CQI is not without challenges. Cultural (i.e., resistance or reluctance toward a quality-focused culture and fear of blame or punishment), technical, structural (related to organizational structure, processes, and systems), and strategic (inadequate planning and inappropriate goals) barriers were commonly reported during the implementation of CQI.

Conclusions

Implementing CQI initiatives necessitates a thorough understanding of key principles such as teamwork and timelines. To address challenges effectively, it is crucial to identify obstacles and implement optimal interventions proactively. Healthcare professionals and leaders need to be prepared for, and cognizant of, the significant role CQI initiatives play in achieving quality of care.


Continuous quality improvement (CQI) is a crucial initiative aimed at enhancing quality in the health system and has gradually been adopted in the healthcare industry. In the early 20th century, Shewhart laid the foundation for quality improvement by describing three essential steps for process improvement: specification, production, and inspection [ 1 , 2 ]. Then, Deming expanded Shewhart’s three-step model into the ‘plan, do, study/check, and act’ (PDSA or PDCA) cycle, which was applied to management practices in Japan in the 1950s [ 3 ] and was gradually translated into the health system. In 1991, Kuperman applied a CQI approach to healthcare, comprising selecting a process to be improved, assembling a team of expert clinicians that understands the process and the outcomes, determining key steps in the process and expected outcomes, collecting data that measure the key process steps and outcomes, and providing data feedback to the practitioners [ 4 ]. These philosophies have served as the baseline for the foundation of principles for continuous improvement [ 5 ].

Continuous quality improvement fosters a culture of continuous learning, innovation, and improvement. It encourages proactive identification and resolution of problems, promotes employee engagement and empowerment, encourages trust and respect, and aims for better quality of care [ 6 , 7 ]. These characteristics drive the interaction of CQI with other quality improvement projects, such as quality assurance and total quality management [ 8 ]. Quality assurance primarily focuses on identifying deviations or errors through inspections, audits, and formal reviews, often settling for what is considered ‘good enough’, rather than pursuing the highest possible standards [ 9 , 10 ], while total quality management is implemented as the management philosophy and system to improve all aspects of an organization continuously [ 11 ].

Continuous quality improvement has been implemented to provide quality care. However, providing effective healthcare is a complicated and complex task when it comes to achieving the desired health outcomes and the overall well-being of individuals and populations. It necessitates tackling long-standing issues, including access, patient safety, medical advances, care coordination, patient-centered care, and quality monitoring [ 12 , 13 ]. The history of quality improvement in healthcare is generally assumed to have started in 1854, when Florence Nightingale introduced quality improvement documentation [ 14 ]. Over the passing decades, Donabedian introduced structure, processes, and outcomes as quality of care components in 1966 [ 15 ]. More comprehensively, the Institute of Medicine in the United States of America (USA) has identified effectiveness, efficiency, equity, patient-centredness, safety, and timeliness as the components of quality of care [ 16 ]. Moreover, quality of care has recently been considered an integral part of universal health coverage (UHC) [ 17 ], which requires initiatives to mobilise essential inputs [ 18 ].

While the overall objective of CQI in the health system is to enhance the quality of care, it is important to note that the purposes and principles of CQI can vary across different contexts [ 19 , 20 ]. This variation has sparked growing research interest. For instance, a review of CQI approaches for capacity building addressed its role in health workforce development [ 21 ]. Another systematic review, based on randomized controlled designs, assessed the effectiveness of CQI using training as an intervention and the PDSA model [ 22 ]. As a research gap, the former review was not directly related to the comprehensive elements of quality of care, while the latter focused solely on the impact of training using the PDSA model, among other potential models. Additionally, a review conducted in 2015 aimed to identify barriers and facilitators of CQI in Canadian contexts [ 23 ]. However, all these reviews presented different perspectives and investigated distinct outcomes. This suggests that there is still much to explore in terms of comprehensively understanding the various aspects of CQI initiatives in healthcare.

As a result, we conducted a scoping review to address several aspects of CQI. Scoping reviews serve as a valuable tool for systematically mapping the existing literature on a specific topic. They are instrumental when dealing with heterogeneous or complex bodies of research. Scoping reviews provide a comprehensive overview by summarizing and disseminating findings across multiple studies, even when evidence varies significantly [ 24 ]. In our specific scoping review, we included various types of literature, including systematic reviews, to enhance our understanding of CQI.

This scoping review examined how CQI is conceptualized and measured and investigated models and tools for its application while identifying implementation challenges and facilitators. It also analyzed the purposes and impact of CQI on the health systems, providing valuable insights for enhancing healthcare quality.

Protocol registration and results reporting

Protocol registration for this scoping review was not conducted. Arksey and O’Malley’s methodological framework was utilized to conduct this scoping review [ 25 ]. The scoping review procedures start by defining the research questions, identifying relevant literature, selecting articles, extracting data, and summarizing the results. The review findings are reported using the PRISMA extension for a scoping review (PRISMA-ScR) [ 26 ]. McGowan and colleagues also advised researchers to report findings from scoping reviews using PRISMA-ScR [ 27 ].

Defining the research problems

This review aims to comprehensively explore the conceptualization, models, tools, barriers, facilitators, and impacts of CQI within the healthcare system worldwide. Specifically, we address the following research questions: (1) How has CQI been defined across various contexts? (2) What are the diverse approaches to implementing CQI in healthcare settings? (3) Which tools are commonly employed for CQI implementation? (4) What barriers hinder and what facilitators support successful CQI initiatives? and (5) What effects do CQI initiatives have on overall care quality?

Information source and search strategy

We conducted the search in PubMed, Web of Science, Scopus, and EMBASE databases, and the Google Scholar search engine. The search terms were selected based on three main distinct concepts. One group was CQI-related terms. The second group included terms related to the purpose for which CQI has been implemented, and the third group included processes and impact. These terms were selected based on the Donabedian framework of structure, process, and outcome [ 28 ]. Additionally, detailed keywords were drawn from the primary health framework, which describes lists of dimensions under process, output, outcome, and health system goals of any intervention for health [ 29 ]. The detailed search strategy is presented in the Supplementary file 1 (Search strategy). The search for articles was initiated on August 12, 2023, and the last search was conducted on September 01, 2023.

Eligibility criteria and article selection

Based on the scoping review’s population, concept, and context frameworks [ 30 ], the population included any patients or clients. Additionally, the concepts explored in the review encompassed definitions, implementation, models, tools, barriers, facilitators, and impacts of CQI. Furthermore, the review considered contexts at any level of health systems. We included articles if they reported results of qualitative or quantitative empirical study, case studies, analytic or descriptive synthesis, any review, and other written documents, were published in peer-reviewed journals, and were designed to address at least one of the identified research questions or one of the identified implementation outcomes or their synonymous taxonomy as described in the search strategy. Based on additional contexts, we included articles published in English without geographic and time limitations. We excluded articles with abstracts only, conference abstracts, letters to editors, commentators, and corrections.

We exported all citations to EndNote x20 to remove duplicates and screen relevant articles. The article selection process included automatic duplicate removal using EndNote x20, unmatched title and abstract removal, citation and abstract-only materials removal, and full-text assessment. The article selection process was mainly conducted by the first author (AE) and reported to the team during the weekly meetings. When the first author encountered papers whose inclusion was unclear, they were discussed with the last author (YA) before a final decision was made. Whenever disagreements happened, they were resolved by discussion and reconsideration of the review questions in relation to the written documents of the article. Further statistical analysis, such as calculating Kappa, was not performed to determine article inclusion or exclusion.

Data extraction and data items

We extracted first author, publication year, country, settings, health problem, the purpose of the study, study design, types of intervention if applicable, CQI approaches/steps if applicable, CQI tools and procedures if applicable, and main findings using a customized Microsoft Excel form.

Summarizing and reporting the results

The main findings were summarized and described based on the main themes, including concepts under conceptualizing, principles, teams, timelines, models, tools, barriers, facilitators, and impacts of CQI. Results-based convergent synthesis, achieved through mixed-method analysis, involved content analysis to identify the thematic presentation of findings. Additionally, a narrative description was used for quantitative findings, aligning them with the appropriate theme. The authors meticulously reviewed the primary findings from each included material and contextualized these findings concerning the main themes. This approach provides a comprehensive understanding of complex interventions and health systems, acknowledging quantitative and qualitative evidence.

Search results

A total of 11,251 documents were identified from various databases: SCOPUS ( n  = 4,339), PubMed ( n  = 2,893), Web of Science ( n  = 225), EMBASE ( n  = 3,651), and Google Scholar ( n  = 143). After removing duplicates ( n  = 5,061), 6,190 articles were evaluated by title and abstract. Subsequently, 208 articles were assessed for full-text eligibility. Following the eligibility criteria, 121 articles were excluded, leaving 87 included in the current review (Fig.  1 ).

figure 1

Article selection process

Operationalizing continuous quality improvement

Continuous Quality Improvement (CQI) is operationalized as a cyclic process that requires commitment to implementation, teamwork, time allocation, and celebrating successes and failures.

CQI is an ongoing cyclic process that follows reflexive, analytical and iterative steps, including identifying gaps, generating data, developing and implementing action plans, evaluating performance, providing feedback to implementers and leaders, and proposing necessary adjustments [ 31 , 32 , 33 , 34 , 35 , 36 , 37 , 38 ].

CQI requires committing to the philosophy of continuous improvement [ 19 , 38 ], establishing a mission statement [ 37 ], and understanding the definition of quality [ 19 ].

CQI involves a wide range of patient-oriented measures and performance indicators, specifically satisfying internal and external customers, developing quality assurance, adopting common quality measures, and selecting process measures [ 8 , 19 , 35 , 36 , 37 , 39 , 40 ].

CQI requires celebrating success and failure without personalization, leading each team member to develop error-free attitudes [ 19 ]. Success and failure are related to underlying organizational processes and systems as causes of failure rather than blaming individuals [ 8 ] because CQI is process-focused based on collaborative, data-driven, responsive, rigorous and problem-solving statistical analysis [ 8 , 19 , 38 ]. Furthermore, a gap or failure opens another opportunity for establishing a data-driven learning organization [ 41 ].

CQI cannot be implemented without a CQI team [ 8 , 19 , 37 , 39 , 42 , 43 , 44 , 45 , 46 ]. A CQI team comprises individuals from various disciplines, often including a team leader, a subject matter expert (physician or other healthcare provider), a data analyst, a facilitator, frontline staff, and stakeholders [ 39 , 43 , 47 , 48 , 49 ]. It is also important to note that inviting stakeholders or partners as part of the CQI support intervention is crucial [ 19 , 38 , 48 ].

The timeline is another distinct feature of CQI because the results of CQI vary based on the implementation duration of each cycle [ 35 ]. There is no specific time limit for CQI implementation, although there is a general consensus that a cycle of CQI should be relatively short [ 35 ]. For instance, CQI implementations took 2 months [ 42 ], 4 months [ 50 ], 9 months [ 51 , 52 ], 12 months [ 53 , 54 , 55 ], and one year and 5 months [ 49 ] to achieve the desired positive outcome, while bi-weekly [ 47 ] and monthly data reviews and analyses [ 44 , 48 , 56 ], and activities over 3 months [ 57 ], have also resulted in a positive outcome.

Continuous quality improvement models and tools

Several models have been utilized. The Plan-Do-Study/Check-Act cycle is a stepwise process involving project initiation, situation analysis, root cause identification, solution generation and selection, implementation, result evaluation, standardization, and future planning [ 7 , 36 , 37 , 45 , 47 , 48 , 49 , 50 , 51 , 53 , 56 , 57 , 58 , 59 , 60 , 61 , 62 , 63 , 64 , 65 , 66 , 67 , 68 , 69 , 70 ]. The FOCUS-PDCA cycle enhances the PDCA process by adding steps to find and improve a process (F), organize a knowledgeable team (O), clarify the process (C), understand variations (U), and select improvements (S) [ 55 , 71 , 72 , 73 ]. The FADE cycle involves identifying a problem (Focus), understanding it through data analysis (Analyze), devising solutions (Develop), and implementing the plan (Execute) [ 74 ]. The Logic Framework involves brainstorming to identify improvement areas, conducting root cause analysis to develop a problem tree, logically reasoning to create an objective tree, formulating the framework, and executing improvement projects [ 75 ]. The Breakthrough Series approach requires CQI teams to meet in quarterly collaborative learning sessions, share learning experiences, and continue discussion by telephone and cross-site visits to strengthen learning and idea exchange [ 47 ]. Another CQI model is the Lean approach, which has been conducted with Kaizen principles [ 52 ], 5 S principles, and the Six Sigma model. The 5 S (Sort, Set/Straighten, Shine, Standardize, Sustain) systematically organises and improves the workplace, focusing on sorting, setting order, shining, standardizing, and sustaining the improvement [ 54 , 76 ]. Kaizen principles guide CQI by advocating for continuous improvement, valuing all ideas, solving problems, focusing on practical, low-cost improvements, using data to drive change, acknowledging process defects, reducing variability and waste, recognizing every interaction as a customer-supplier relationship, empowering workers, responding to all ideas, and maintaining a disciplined workplace [ 77 ]. Lean Six Sigma, a CQI model, applies the DMAIC methodology, which involves defining (D) and measuring the problem (M), analyzing root causes (A), improving by finding solutions (I), and controlling by assessing process stability (C) [ 78 , 79 ]. The 5 C-cyclic model (consultation, collection, consideration, collaboration, and celebration), the first CQI framework for volunteer dental services in Aboriginal communities, ensures quality care based on community needs [ 80 ]. One study used meetings involving activities such as reviewing objectives, assigning roles, discussing the agenda, completing tasks, retaining key outputs, planning future steps, and evaluating the meeting’s effectiveness [ 81 ].

Various tools are involved in the implementation or evaluation of CQI initiatives: checklists [ 53 , 82 ], flowcharts [ 81 , 82 , 83 ], cause-and-effect diagrams (fishbone or Ishikawa diagrams) [ 60 , 62 , 79 , 81 , 82 ], fuzzy Pareto diagrams [ 82 ], process maps [ 60 ], time series charts [ 48 ], why-why analysis [ 79 ], affinity diagrams and multivoting [ 81 ], run charts [ 47 , 48 , 51 , 60 , 84 ], and others mentioned in the table (Table 1).
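As an illustration of one of these tools, the sketch below draws a simple run chart with matplotlib. The monthly waiting-time values and the centre line are hypothetical and are not taken from any of the studies cited above.

```python
# A minimal sketch of a run chart, one of the CQI tools mentioned above.
# The monthly indicator values below are hypothetical.
import matplotlib.pyplot as plt

months = list(range(1, 13))
waiting_time = [76, 74, 70, 66, 60, 55, 48, 40, 35, 30, 26, 22]  # minutes
median_baseline = 51.5  # median of the observed series, drawn as the centre line

plt.plot(months, waiting_time, marker="o", label="Discharge waiting time")
plt.axhline(median_baseline, linestyle="--", label="Centre line (median)")
plt.xlabel("Month of CQI cycle")
plt.ylabel("Waiting time (minutes)")
plt.title("Run chart for a hypothetical PDSA cycle")
plt.legend()
plt.show()
```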

Barriers and facilitators of continuous quality improvement implementation

Implementing CQI initiatives is determined by various barriers and facilitators, which can be thematized into four dimensions. These dimensions are cultural, technical, structural, and strategic dimensions.

Continuous quality improvement initiatives face various cultural, strategic, technical, and structural barriers. Cultural dimension barriers involve resistance to change (e.g., not accepting online technology), lack of quality-focused culture, staff reporting apprehensiveness, and fear of blame or punishment [ 36 , 41 , 85 , 86 ]. The technical dimension barriers of CQI can include various factors that hinder the effective implementation and execution of CQI processes [ 36 , 86 , 87 , 88 , 89 ]. Structural dimension barriers of CQI arise from the organization structure, process, and systems that can impede the effective implementation and sustainability of CQI [ 36 , 85 , 86 , 87 , 88 ]. Strategic dimension barriers are, for example, the inability to select proper CQI goals and failure to integrate CQI into organizational planning and goals [ 36 , 85 , 86 , 87 , 88 , 90 ].

Facilitators are also grouped into cultural, structural, technical, and strategic dimensions to provide solutions to CQI barriers. Cultural challenges were addressed by developing a group culture oriented to CQI and other rewards [ 39 , 41 , 80 , 85 , 86 , 87 , 90 , 91 , 92 ]. Technical facilitators are pivotal to overcoming technical barriers [ 39 , 42 , 53 , 69 , 86 , 90 , 91 ]. Structural facilitators relate to improving communication, infrastructure, and systems [ 86 , 92 , 93 ]. Strategic dimension facilitators include strengthening leadership and improving decision-making skills [ 43 , 53 , 67 , 86 , 87 , 92 , 94 , 95 ] (Table 2).

Impact of continuous quality improvement

Continuous quality improvement initiatives can significantly impact the quality of healthcare in a wide range of health areas, focusing on improving structure and the health service delivery process, improving client wellbeing, and reducing mortality.

Structure components

These are health leadership, financing, workforce, technology, and equipment and supplies. CQI has improved planning, monitoring and evaluation [ 48 , 53 ], and leadership and planning [ 48 ], indicating improvement in leadership perspectives. Implementing CQI in primary health care (PHC) settings has shown potential for maintaining or reducing operation costs [ 67 ]. Findings from another study indicate that the costs associated with implementing CQI interventions per facility ranged from approximately $2,000 to $10,500 per year, with an average cost of approximately $10 to $60 per admitted client [ 57 ]. However, based on model predictions, the average cost savings after implementing CQI were estimated to be $5430 [ 31 ]. CQI can also be applied to health workforce development [ 32 ]. CQI in the institutional system improved medical education [ 66 , 96 , 97 ], human resources management [ 53 ], motivated staff [ 76 ], and increased staff health awareness [ 69 ], while concerns were raised about CQI impartiality, independence, and public accountability [ 96 ]. Regarding health technology, CQI also improved registration and documentation [ 48 , 53 , 98 ]. Furthermore, the CQI initiatives increased cleanliness [ 54 ] and improved logistics, supplies, and equipment [ 48 , 53 , 68 ].

Process and output components

The process component focuses on the activities and actions involved in delivering healthcare services.

Service delivery

CQI interventions improved service delivery [ 53 , 56 , 99 ], particularly a significant 18% increase in the overall quality of service performance [ 48 ], improved patient counselling, adherence to appropriate procedures, and infection prevention [ 48 , 68 ], and optimised workflow [ 52 ].

Coordination and collaboration

CQI initiatives improved coordination and collaboration through collecting and analysing data, onsite technical support, training, supportive supervision [ 53 ] and facilitating linkages between work processes and a quality control group [ 65 ].

Patient satisfaction

The CQI initiatives increased patient satisfaction and improved quality of life by optimizing care quality management, improving the quality of clinical nursing, reducing nursing defects and enhancing the wellbeing of clients [ 54 , 76 , 100 ], although CQI was not associated with changes in adolescent and young adults’ satisfaction [ 51 ].

CQI initiatives reduced medication error reports from 16 to 6 [ 101 ], and it significantly reduced the administration of inappropriate prophylactic antibiotics [ 44 ], decreased errors in inpatient care [ 52 ], decreased the overall episiotomy rate from 44.5 to 33.3% [ 83 ], reduced the overall incidence of unplanned endotracheal extubation [ 102 ], improving appropriate use of computed tomography angiography [ 103 ], and appropriate diagnosis and treatment selection [ 47 ].

Continuity of care

CQI initiatives effectively improve continuity of care by improving client and physician interaction. For instance, provider continuity levels showed a 64% increase [ 55 ]. Modifying electronic medical record templates, scheduling, staff and parental education, standardization of work processes, and birth to 1-year age-specific incentives in post-natal follow-up care increased continuity of care to 74% in 2018 compared to baseline 13% in 2012 [ 84 ].

The CQI initiative yielded enhanced efficiency in the cardiac catheterization laboratory, as evidenced by improved punctuality in procedure starts and increased efficiency in manual sheath-pulls inside [ 78 ].

Accessibility

CQI initiatives were effective in improving accessibility in terms of increasing service coverage and utilization rates. Examples include improved screening for cigarette use, nutrition counselling, folate prescription, maternal care, and immunization coverage [ 53 , 81 , 104 , 105 ]; a reduction in the percentage of patients not attending surgery from 3.9% at baseline to 0.9% [ 43 ]; an increase in Chlamydia screening rates from 29 to 60% [ 45 ]; increased HIV care continuum coverage [ 51 , 59 , 60 ]; an increase in the uptake of postpartum long-acting reversible contraception from 6.9% at baseline to 25.4% [ 42 ]; an increase in post-caesarean section prophylaxis from 36 to 89% [ 62 ]; a 31% increase in kangaroo care practice [ 50 ]; and increased follow-up [ 65 ]. Similarly, QI interventions increased the quality of antenatal care by 29.3%, correct partograph use by 51.7%, and correct active third-stage labour management by 19.6% from baseline, but were not significantly associated with improvement in contraceptive service uptake [ 61 ].

Timely access

CQI interventions improved the timeliness of care provision [ 52 ] and reduced waiting times [ 62 , 74 , 76 , 106 ]. For instance, the discharge process waiting time in the emergency department decreased from 76 min to 22 min [ 79 ], and the mean post-procedural length of stay fell from 2.8 days to 2.0 days [ 31 ].

Acceptability

Acceptability of CQI by healthcare providers was satisfactory. For instance, 88% of the faculty, 64% of the residents, and 82% of the staff believed CQI to be useful in the healthcare clinic [ 107 ].

Outcome components

Morbidity and mortality

CQI efforts have demonstrated better management outcomes among diabetic patients [ 40 ], patients with oral mucositis [ 71 ], and anaemic patients [ 72 ]. They have also reduced infection rates after caesarean section [ 62 ], reduced peritonitis following peritoneal dialysis [ 49 , 108 ], and prevented pressure ulcers [ 70 ]. For example, peritonitis incidence fell from once every 40.1 patient-months at baseline to once every 70.8 patient-months after CQI [ 49 ], and pressure ulcer prevalence fell by 63% between 2008 and 2010 [ 70 ]. Furthermore, CQI initiatives significantly reduced in-hospital deaths [ 31 ] and increased patient survival rates [ 108 ]. Figure 2 displays the overall process of CQI implementation.

Figure 2. The overall mechanisms of continuous quality improvement implementation
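
As a brief aside on the incidence figures reported under morbidity and mortality above, the sketch below shows how "one episode every N patient-months" converts into rates and a relative reduction. The two interval values are taken from the peritonitis result [ 49 ]; the conversion itself is generic.

```python
# Convert "one episode every N patient-months" into episode rates and compare
# baseline vs. post-CQI peritonitis incidence (interval values from [49]).

baseline_interval = 40.1   # patient-months per episode at baseline
post_cqi_interval = 70.8   # patient-months per episode after CQI

rate_baseline = 1 / baseline_interval    # episodes per patient-month
rate_post = 1 / post_cqi_interval

rate_ratio = rate_post / rate_baseline
relative_reduction = 1 - rate_ratio

print(f"baseline rate:      {rate_baseline:.4f} episodes/patient-month")
print(f"post-CQI rate:      {rate_post:.4f} episodes/patient-month")
print(f"relative reduction: {relative_reduction:.1%}")   # roughly 43% fewer episodes
```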

Discussion

In this review, we examined the fundamental concepts and principles underlying CQI, the factors that hinder or assist its successful application and implementation, and the role of CQI in enhancing quality of care across various health issues.

Our findings have brought attention to the application and implementation of CQI, emphasizing its underlying concepts and principles, as evident in the existing literature [ 31 , 32 , 33 , 34 , 35 , 36 , 39 , 40 , 43 , 45 , 46 ]. Continuous quality improvement shares the principles of continuous improvement, such as a customer-driven focus, effective leadership, active participation of individuals, a process-oriented approach, systematic implementation, emphasis on design improvement and prevention, evidence-based decision-making, and fostering partnership [ 5 ]. Moreover, Deming's 14 principles laid the foundation for CQI principles [ 109 ]. These principles have been adapted and put into practice in various ways: ten [ 19 ] and five [ 38 ] principles in hospitals, five principles for capacity building [ 38 ], and two principles for medication error prevention [ 41 ]. In application, CQI can be process-focused [ 8 , 19 ] or impact-focused [ 38 ]. Impact-focused CQI concentrates on achieving specific outcomes or impacts, whereas process-focused CQI prioritizes improving the underlying processes and systems. These approaches complement each other and can be used according to the objectives of a quality improvement initiative in a given healthcare setting. Overall, CQI is an ongoing educational process that requires top management's involvement, demands coordination across departments, encourages the incorporation of views beyond the clinical area, and provides non-judgemental evidence based on objective data [ 110 ].

The current review also recognized that CQI is not easy to implement. It requires judicious use of various models and tools, and the application of each tool varies with the health problem studied and the purpose of the CQI initiative [ 111 ], as well as with context, content, structure, and usability [ 112 ]. Implementation also requires overcoming cultural, technical, structural, and strategic barriers, which have emerged from the perspectives of clinical staff, managers, and health systems. Among the cultural obstacles, staff non-involvement, resistance to change, and reluctance to report errors were staff-related, whereas others, such as the absence of celebration of success and hierarchical and rational cultures, may require both staff and manager involvement. Staff members may be reluctant to report errors because of various cultural factors, including lack of trust, hierarchical structures, fear of retribution, and a blame-oriented culture. These challenges pose obstacles to implementing standardized CQI practices, as observed, for instance, in community pharmacy settings [ 85 ]. A hierarchical culture, characterized by clearly defined levels of power, authority, and decision-making, posed challenges to implementing CQI initiatives in public health [ 41 , 86 ]. Although a rational culture, a type of organizational culture, emphasizes logical thinking and rational decision-making, it can also create challenges for CQI implementation [ 41 , 86 ]: hierarchical and rational cultures, which emphasize bureaucratic norms and narrow definitions of achievement, were found to act as barriers to CQI implementation [ 86 ]. These barriers can be addressed by developing a shared mindset and collective commitment, establishing a shared purpose, developing group norms, and cultivating psychological preparedness among staff, managers, and clients to implement and sustain CQI initiatives. Furthermore, reversing culture-related barriers necessitates culture-related solutions: developing an organizational and group culture oriented to CQI [ 41 , 86 ], a positive comprehensive perception [ 91 ], commitment [ 85 ], involving patients, families, leaders, and staff [ 39 , 92 ], collaborating toward a common goal [ 80 , 86 ], effective teamwork [ 86 , 87 ], and rewarding and celebrating successes [ 80 , 90 ].

Technical barriers to CQI include inadequate capitalization of a project and insufficient support for CQI facilitators and data entry managers [ 36 ], immature electronic medical records or poor information systems [ 36 , 86 ], and a lack of training and skills [ 86 , 87 , 88 ]. These challenges may cause the CQI team to rely on outdated information and technologies. Technical barriers can also undermine the foundations of CQI expertise among staff: the ability to recognize opportunities for improvement, a comprehensive understanding of how services are produced and delivered, and the routine use of expertise in daily work. Addressing these barriers requires knowledge-creation activities (training, seminars, and education) [ 39 , 42 , 53 , 69 , 86 , 90 , 91 ], availability of quality data [ 86 ], reliable information [ 92 ], and a manual-online hybrid reporting system [ 85 ].

Structural barriers to CQI include inadequate communication channels and a lack of standardized processes, specifically weak physician-to-physician synergies [ 36 ], and a lack of mechanisms for disseminating knowledge and limited use of communication mechanisms [ 86 ]. A lack of communication mechanisms hinders the sharing of ideas and feedback among CQI teams, leading to misunderstandings, misinterpretations, limited participation, and a lack of learning [ 113 ]. Knowledge translation facilitates the co-production of research, the subsequent diffusion of knowledge, and the development of stakeholders' capacity and skills [ 114 ]. Thus, the absence of a knowledge translation mechanism may cause missed opportunities for learning, inefficient problem-solving, and limited creativity. To overcome these challenges, organizations should establish effective communication and information systems [ 86 , 93 ] and learning systems [ 92 ]. Although CQI and knowledge translation interact, it is essential to recognize that they are distinct. CQI focuses on process improvement within health care systems, aiming to optimize existing processes, reduce errors, and enhance efficiency.

In contrast, knowledge translation bridges the gap between research evidence and clinical practice, translating research findings into actionable knowledge for practitioners. While both CQI and knowledge translation aim to enhance health care quality and patient outcomes, they employ different strategies: CQI utilizes tools such as Plan-Do-Study-Act cycles and statistical process control, whereas knowledge translation involves knowledge synthesis and dissemination. Knowledge translation can also serve as a strategy to enhance CQI, and both share the underlying principle of continuous improvement. Therefore, effective strategies on the structural dimension may build efficient steering councils, information systems, and structures that diffuse learning throughout the organization.

Strategic factors, such as goals, planning, funds, and resources, determine the overall purpose of CQI initiatives. Specific barriers were improper goals and poor planning [ 36 , 86 , 88 ], fragmentation of quality assurance policies [ 87 ], inadequate reinforcement to staff [ 36 , 90 ], time constraints [ 85 , 86 ], resource inadequacy [ 86 ], and work overload [ 86 ]. These barriers can be addressed through strengthening leadership [ 86 , 87 ], CQI-based mentoring [ 94 ], periodic monitoring, supportive supervision and coaching [ 43 , 53 , 87 , 92 , 95 ], participation, empowerment, and accountability [ 67 ], involving all stakeholders in decision-making [ 86 , 87 ], a provider-payer partnership [ 64 ], and compensating staff for after-hours meetings on CQI [ 85 ]. The strategic dimension, characterized by a strategic plan and integrated CQI efforts, is devoted to processes that are central to achieving strategic priorities. Roles and responsibilities are defined in terms of integrated strategic and quality-related goals [ 115 ].

The ultimate goal of CQI is to improve the quality of care, which is usually assessed in terms of structure, process, and outcome. Once challenges are resolved and tools and models are used effectively, the benefits of CQI become apparent. First, effectively implemented CQI initiatives can improve leadership, health financing, health workforce development, health information technology, and the availability of supplies as the building blocks of a health system [ 31 , 48 , 53 , 68 , 98 ]. Second, effectively implemented CQI initiatives improved the care delivery process (counselling, adherence to standards, coordination, collaboration, and linkages) [ 48 , 53 , 65 , 68 ]. Third, CQI can improve the outputs of healthcare delivery, such as satisfaction, accessibility (timely access, utilization), continuity of care, safety, efficiency, and acceptability [ 52 , 54 , 55 , 76 , 78 ]. Finally, the effectiveness of CQI initiatives has been tested in responses related to HIV, maternal and child health, non-communicable disease control, and other areas (e.g., surgery and peritonitis). However, CQI initiatives have not always been effective. For instance, CQI using a two- to nine-cycle audit model with systems assessment tools did not significantly improve syphilis testing performance [ 116 ]. That study was conducted in primary health care settings serving Aboriginal and Torres Strait Islander people, and, notably, 'the clinics may not have consistently prioritized syphilis testing performance in their improvement strategies, as facilitated by the CQI program' [ 116 ]. Additionally, CQI-based mentoring did not significantly improve the uptake of facility-based interventions, although it was effective in increasing community health worker visits during pregnancy and the postnatal period, knowledge about maternal and child health, exclusive breastfeeding practice, and HIV status disclosure [ 117 ]. That study, conducted in South Africa, found no significant association between the coverage of facility-based interventions and CQI implementation, a result attributed to the already high antenatal and postnatal attendance rates in both control and intervention groups at baseline, which left little room for improvement; the coverage of HIV interventions also remained consistently high throughout the study period [ 117 ].

Regarding health care and policy implications, CQI has played a vital role in advancing PHC and fostering the realization of UHC goals worldwide. The indicators in Donabedian's framework that are positively influenced by CQI efforts are comparable to those included in the Primary Health Care Performance Initiative's conceptual framework [ 29 , 118 , 119 ], and PHC has been described as the roadmap to realizing the vision of UHC [ 120 , 121 ]. Given these circumstances, implementing CQI can contribute to the achievement of PHC principles and the objectives of UHC. For instance, by implementing CQI methods, countries have enhanced the accessibility, affordability, and quality of PHC services, leading to better health outcomes for their populations. CQI has facilitated identifying and resolving healthcare gaps and inefficiencies, enabling countries to optimize resource allocation and deliver more effective and patient-centered care. However, it is crucial to recognize that successful implementation of CQI requires optimizing the duration of each cycle and understanding challenges and barriers that extend beyond the health system and its settings; its effectiveness may be compromised if these challenges are not adequately addressed.

Despite abundant literature, there are still gaps regarding the relationship between CQI and other dimensions within the healthcare system. No studies have examined the impact of CQI initiatives on catastrophic health expenditure, effective service coverage, patient-centredness, comprehensiveness, equity, health security, and responsiveness.

Limitations

This review has some limitations. Firstly, only articles published in English were included, which may have excluded relevant non-English articles. Additionally, as this review follows a scoping methodology, the focus is on synthesising available evidence rather than critically appraising or scoring the quality of the included articles.

Conclusions

Continuous quality improvement is investigated as a continuous and ongoing intervention, with implementation times that vary across cycles. The CQI team and implementation timelines were critical elements of CQI across the different models. Among the commonly used approaches, PDSA or PDCA cycles are the most frequently employed, and a wide range of tools, nineteen in total, are commonly utilized to support the improvement process. Cultural, technical, structural, and strategic barriers and facilitators are significant in implementing CQI initiatives. CQI initiatives aim to improve health system building blocks, enhance the health service delivery process and outputs, and ultimately prevent morbidity and reduce mortality. Because CQI is a context-dependent approach, future researchers should consider scale-up implementation research on catastrophic health expenditure, effective service coverage, patient-centredness, comprehensiveness, equity, health security, and responsiveness across various settings and health issues.

Availability of data and materials

The data used and/or analyzed during the current study are available in this manuscript and/or the supplementary file.

Shewhart WA, Deming WE. Memoriam: Walter A. Shewhart, 1891–1967. Am Stat. 1967;21(2):39–40.

Shewhart WA. Statistical method from the viewpoint of quality control. New York: Dover; 1986. ISBN 978-0486652320. OCLC 13822053. Reprint. Originally published: Washington, DC: Graduate School of the Department of Agriculture, 1939.

Moen R, editor. Foundation and history of the PDSA cycle. Asian Network for Quality Conference, Tokyo. 2009. https://www.deming.org/sites/default/files/pdf/2015/PDSA_History_Ron_MoenPdf .

Kuperman G, James B, Jacobsen J, Gardner RM. Continuous quality improvement applied to medical care: experiences at LDS hospital. Med Decis Making. 1991;11(4suppl):S60–65.

Singh J, Singh H. Continuous improvement philosophy–literature review and directions. Benchmarking: An International Journal. 2015;22(1):75–119.

Goldstone J. Presidential address: Sony, Porsche, and vascular surgery in the 21st century. J Vasc Surg. 1997;25(2):201–10.

Radawski D. Continuous quality improvement: origins, concepts, problems, and applications. J Physician Assistant Educ. 1999;10(1):12–6.

Shortell SM, O’Brien JL, Carman JM, Foster RW, Hughes E, Boerstler H, et al. Assessing the impact of continuous quality improvement/total quality management: concept versus implementation. Health Serv Res. 1995;30(2):377.

Lohr K. Quality of health care: an introduction to critical definitions, concepts, principles, and practicalities. Striving for quality in health care. 1991.

Berwick DM. The clinical process and the quality process. Qual Manage Healthc. 1992;1(1):1–8.

Gift B. On the road to TQM. Food Manage. 1992;27(4):88–9.

Greiner A, Knebel E. The core competencies needed for health care professionals. health professions education: A bridge to quality. 2003:45–73.

McCalman J, Bailie R, Bainbridge R, McPhail-Bell K, Percival N, Askew D et al. Continuous quality improvement and comprehensive primary health care: a systems framework to improve service quality and health outcomes. Front Public Health. 2018:6 (76):1–6.

Sheingold BH, Hahn JA. The history of healthcare quality: the first 100 years 1860–1960. Int J Afr Nurs Sci. 2014;1:18–22.

Donabedian A. Evaluating the quality of medical care. Milbank Q. 1966;44(3):166–206.

Institute of Medicine (US) Committee on Quality of Health Care in America. Crossing the Quality Chasm: A New Health System for the 21st Century. Washington (DC): National Academies Press (US). 2001. 2, Improving the 21st-century Health Care System. Available from: https://www.ncbi.nlm.nih.gov/books/NBK222265/ .

Rubinstein A, Barani M, Lopez AS. Quality first for effective universal health coverage in low-income and middle-income countries. Lancet Global Health. 2018;6(11):e1142–1143.

Agency for Healthcare Research and Quality. Quality improvement and monitoring at your fingertips. USA: Agency for Healthcare Research and Quality; 2022. Available from: https://qualityindicators.ahrq.gov/ .

Anderson CA, Cassidy B, Rivenburgh P. Implementing continuous quality improvement (CQI) in hospitals: lessons learned from the International Quality Study. Qual Assur Health Care. 1991;3(3):141–6.

Gardner K, Mazza D. Quality in general practice - definitions and frameworks. Aust Fam Physician. 2012;41(3):151–4.

Loper AC, Jensen TM, Farley AB, Morgan JD, Metz AJ. A systematic review of approaches for continuous quality improvement capacity-building. J Public Health Manage Pract. 2022;28(2):E354.

Hill JE, Stephani A-M, Sapple P, Clegg AJ. The effectiveness of continuous quality improvement for developing professional practice and improving health care outcomes: a systematic review. Implement Sci. 2020;15(1):1–14.

Candas B, Jobin G, Dubé C, Tousignant M, Abdeljelil AB, Grenier S, et al. Barriers and facilitators to implementing continuous quality improvement programs in colonoscopy services: a mixed methods systematic review. Endoscopy Int Open. 2016;4(02):E118–133.

Peters MD, Marnie C, Colquhoun H, Garritty CM, Hempel S, Horsley T, et al. Scoping reviews: reinforcing and advancing the methodology and application. Syst Reviews. 2021;10(1):1–6.

Arksey H, O’Malley L. Scoping studies: towards a methodological framework. Int J Soc Res Methodol. 2005;8(1):19–32.

Tricco AC, Lillie E, Zarin W, O’Brien KK, Colquhoun H, Levac D, et al. PRISMA extension for scoping reviews (PRISMA-ScR): checklist and explanation. Ann Intern Med. 2018;169(7):467–73.

McGowan J, Straus S, Moher D, Langlois EV, O’Brien KK, Horsley T, et al. Reporting scoping reviews—PRISMA ScR extension. J Clin Epidemiol. 2020;123:177–9.

Donabedian A. Explorations in quality assessment and monitoring: the definition of quality and approaches to its assessment. Health Administration Press, Ann Arbor. 1980;1.

World Health Organization. Operational framework for primary health care: transforming vision into action. Geneva: World Health Organization and the United Nations Children’s Fund (UNICEF); 2020 [updated 14 December 2020; cited 2023 Nov Oct 17]. Available from: https://www.who.int/publications/i/item/9789240017832 .

The Joanna Briggs Institute. The Joanna Briggs Institute Reviewers’ Manual :2014 edition. Australia: The Joanna Briggs Institute. 2014:88–91.

Rihal CS, Kamath CC, Holmes DR Jr, Reller MK, Anderson SS, McMurtry EK, et al. Economic and clinical outcomes of a physician-led continuous quality improvement intervention in the delivery of percutaneous coronary intervention. Am J Manag Care. 2006;12(8):445–52.

Ade-Oshifogun JB, Dufelmeier T. Prevention and Management of Do not return notices: a quality improvement process for Supplemental staffing nursing agencies. Nurs Forum. 2012;47(2):106–12.

Rubenstein L, Khodyakov D, Hempel S, Danz M, Salem-Schatz S, Foy R, et al. How can we recognize continuous quality improvement? Int J Qual Health Care. 2014;26(1):6–15.

O’Neill SM, Hempel S, Lim YW, Danz MS, Foy R, Suttorp MJ, et al. Identifying continuous quality improvement publications: what makes an improvement intervention ‘CQI’? BMJ Qual Saf. 2011;20(12):1011–9.

Sibthorpe B, Gardner K, McAullay D. Furthering the quality agenda in Aboriginal community controlled health services: understanding the relationship between accreditation, continuous quality improvement and national key performance indicator reporting. Aust J Prim Health. 2016;22(4):270–5.

Bennett CL, Crane JM. Quality improvement efforts in oncology: are we ready to begin? Cancer Invest. 2001;19(1):86–95.

VanValkenburgh DA. Implementing continuous quality improvement at the facility level. Adv Ren Replace Ther. 2001;8(2):104–13.

Loper AC, Jensen TM, Farley AB, Morgan JD, Metz AJ. A systematic review of approaches for continuous quality improvement capacity-building. J Public Health Manage Practice. 2022;28(2):E354–361.

Ryan M. Achieving and sustaining quality in healthcare. Front Health Serv Manag. 2004;20(3):3–11.

Nicolucci A, Allotta G, Allegra G, Cordaro G, D’Agati F, Di Benedetto A, et al. Five-year impact of a continuous quality improvement effort implemented by a network of diabetes outpatient clinics. Diabetes Care. 2008;31(1):57–62.

Wakefield BJ, Blegen MA, Uden-Holman T, Vaughn T, Chrischilles E, Wakefield DS. Organizational culture, continuous quality improvement, and medication administration error reporting. Am J Med Qual. 2001;16(4):128–34.

Sori DA, Debelew GT, Degefa LS, Asefa Z. Continuous quality improvement strategy for increasing immediate postpartum long-acting reversible contraceptive use at Jimma University Medical Center, Jimma, Ethiopia. BMJ Open Qual. 2023;12(1):e002051.

Roche B, Robin C, Deleaval PJ, Marti MC. Continuous quality improvement in ambulatory surgery: the non-attending patient. Ambul Surg. 1998;6(2):97–100.

O’Connor JB, Sondhi SS, Mullen KD, McCullough AJ. A continuous quality improvement initiative reduces inappropriate prescribing of prophylactic antibiotics for endoscopic procedures. Am J Gastroenterol. 1999;94(8):2115–21.

Ursu A, Greenberg G, McKee M. Continuous quality improvement methodology: a case study on multidisciplinary collaboration to improve chlamydia screening. Fam Med Community Health. 2019;7(2):e000085.

Quick B, Nordstrom S, Johnson K. Using continuous quality improvement to implement evidence-based medicine. Lippincotts Case Manag. 2006;11(6):305–15 ( quiz 16 – 7 ).

Oyeledun B, Phillips A, Oronsaye F, Alo OD, Shaffer N, Osibo B, et al. The effect of a continuous quality improvement intervention on retention-in-care at 6 months postpartum in a PMTCT Program in Northern Nigeria: results of a cluster randomized controlled study. J Acquir Immune Defic Syndr. 2017;75(Suppl 2):S156–164.

Nyengerai T, Phohole M, Iqaba N, Kinge CW, Gori E, Moyo K, et al. Quality of service and continuous quality improvement in voluntary medical male circumcision programme across four provinces in South Africa: longitudinal and cross-sectional programme data. PLoS ONE. 2021;16(8):e0254850.

Wang J, Zhang H, Liu J, Zhang K, Yi B, Liu Y, et al. Implementation of a continuous quality improvement program reduces the occurrence of peritonitis in PD. Ren Fail. 2014;36(7):1029–32.

Stikes R, Barbier D. Applying the plan-do-study-act model to increase the use of kangaroo care. J Nurs Manag. 2013;21(1):70–8.

Wagner AD, Mugo C, Bluemer-Miroite S, Mutiti PM, Wamalwa DC, Bukusi D, et al. Continuous quality improvement intervention for adolescent and young adult HIV testing services in Kenya improves HIV knowledge. AIDS. 2017;31(Suppl 3):S243–252.

Le RD, Melanson SE, Santos KS, Paredes JD, Baum JM, Goonan EM, et al. Using lean principles to optimise inpatient phlebotomy services. J Clin Pathol. 2014;67(8):724–30.

Manyazewal T, Mekonnen A, Demelew T, Mengestu S, Abdu Y, Mammo D, et al. Improving immunization capacity in Ethiopia through continuous quality improvement interventions: a prospective quasi-experimental study. Infect Dis Poverty. 2018;7:7.

Kamiya Y, Ishijma H, Hagiwara A, Takahashi S, Ngonyani HAM, Samky E. Evaluating the impact of continuous quality improvement methods at hospitals in Tanzania: a cluster-randomized trial. Int J Qual Health Care. 2017;29(1):32–9.

Kibbe DC, Bentz E, McLaughlin CP. Continuous quality improvement for continuity of care. J Fam Pract. 1993;36(3):304–8.

Adrawa N, Ongiro S, Lotee K, Seret J, Adeke M, Izudi J. Use of a context-specific package to increase sputum smear monitoring among people with pulmonary tuberculosis in Uganda: a quality improvement study. BMJ Open Qual. 2023;12(3):1–6.

Hunt P, Hunter SB, Levan D. Continuous quality improvement in substance abuse treatment facilities: how much does it cost? J Subst Abuse Treat. 2017;77:133–40.

Azadeh A, Ameli M, Alisoltani N, Motevali Haghighi S. A unique fuzzy multi-control approach for continuous quality improvement in a radio therapy department. Qual Quantity. 2016;50(6):2469–93.

Memiah P, Tlale J, Shimabale M, Nzyoka S, Komba P, Sebeza J, et al. Continuous quality improvement (CQI) institutionalization to reach 95:95:95 HIV targets: a multicountry experience from the Global South. BMC Health Serv Res. 2021;21(1):711.

Yapa HM, De Neve JW, Chetty T, Herbst C, Post FA, Jiamsakul A, et al. The impact of continuous quality improvement on coverage of antenatal HIV care tests in rural South Africa: results of a stepped-wedge cluster-randomised controlled implementation trial. PLoS Med. 2020;17(10):e1003150.

Dadi TL, Abebo TA, Yeshitla A, Abera Y, Tadesse D, Tsegaye S, et al. Impact of quality improvement interventions on facility readiness, quality and uptake of maternal and child health services in developing regions of Ethiopia: a secondary analysis of programme data. BMJ Open Qual. 2023;12(4):e002140.

Weinberg M, Fuentes JM, Ruiz AI, Lozano FW, Angel E, Gaitan H, et al. Reducing infections among women undergoing cesarean section in Colombia by means of continuous quality improvement methods. Arch Intern Med. 2001;161(19):2357–65.

Andreoni V, Bilak Y, Bukumira M, Halfer D, Lynch-Stapleton P, Perez C. Project management: putting continuous quality improvement theory into practice. J Nurs Care Qual. 1995;9(3):29–37.

Balfour ME, Zinn TE, Cason K, Fox J, Morales M, Berdeja C, et al. Provider-payer partnerships as an engine for continuous quality improvement. Psychiatric Serv. 2018;69(6):623–5.

Agurto I, Sandoval J, De La Rosa M, Guardado ME. Improving cervical cancer prevention in a developing country. Int J Qual Health Care. 2006;18(2):81–6.

Anderson CI, Basson MD, Ali M, Davis AT, Osmer RL, McLeod MK, et al. Comprehensive multicenter graduate surgical education initiative incorporating entrustable professional activities, continuous quality improvement cycles, and a web-based platform to enhance teaching and learning. J Am Coll Surg. 2018;227(1):64–76.

Benjamin S, Seaman M. Applying continuous quality improvement and human performance technology to primary health care in Bahrain. Health Care Superv. 1998;17(1):62–71.

Byabagambi J, Marks P, Megere H, Karamagi E, Byakika S, Opio A, et al. Improving the quality of voluntary medical male circumcision through use of the continuous quality improvement approach: a pilot in 30 PEPFAR-Supported sites in Uganda. PLoS ONE. 2015;10(7):e0133369.

Hogg S, Roe Y, Mills R. Implementing evidence-based continuous quality improvement strategies in an urban Aboriginal Community Controlled Health Service in South East Queensland: a best practice implementation pilot. JBI Database Syst Rev Implement Rep. 2017;15(1):178–87.

Hopper MB, Morgan S. Continuous quality improvement initiative for pressure ulcer prevention. J Wound Ostomy Cont Nurs. 2014;41(2):178–80.

Ji J, Jiang DD, Xu Z, Yang YQ, Qian KY, Zhang MX. Continuous quality improvement of nutrition management during radiotherapy in patients with nasopharyngeal carcinoma. Nurs Open. 2021;8(6):3261–70.

Chen M, Deng JH, Zhou FD, Wang M, Wang HY. Improving the management of anemia in hemodialysis patients by implementing the continuous quality improvement program. Blood Purif. 2006;24(3):282–6.

Reeves S, Matney K, Crane V. Continuous quality improvement as an ideal in hospital practice. Health Care Superv. 1995;13(4):1–12.

Barton AJ, Danek G, Johns P, Coons M. Improving patient outcomes through CQI: vascular access planning. J Nurs Care Qual. 1998;13(2):77–85.

Buttigieg SC, Gauci D, Dey P. Continuous quality improvement in a Maltese hospital using logical framework analysis. J Health Organ Manag. 2016;30(7):1026–46.

Take N, Byakika S, Tasei H, Yoshikawa T. The effect of 5S-continuous quality improvement-total quality management approach on staff motivation, patients’ waiting time and patient satisfaction with services at hospitals in Uganda. J Public Health Afr. 2015;6(1):486.

Jacobson GH, McCoin NS, Lescallette R, Russ S, Slovis CM. Kaizen: a method of process improvement in the emergency department. Acad Emerg Med. 2009;16(12):1341–9.

Agarwal S, Gallo J, Parashar A, Agarwal K, Ellis S, Khot U, et al. Impact of lean six sigma process improvement methodology on cardiac catheterization laboratory efficiency. Catheter Cardiovasc Interv. 2015;85:S119.

Rahul G, Samanta AK, Varaprasad G. A Lean Six Sigma approach to reduce overcrowding of patients and improving the discharge process in a super-specialty hospital. In: 2020 International Conference on System, Computation, Automation and Networking (ICSCAN); 2020 Jul 3. p. 1–6. IEEE.

Patel J, Nattabi B, Long R, Durey A, Naoum S, Kruger E, et al. The 5 C model: A proposed continuous quality improvement framework for volunteer dental services in remote Australian Aboriginal communities. Community Dent Oral Epidemiol. 2023;51(6):1150–8.

Van Acker B, McIntosh G, Gudes M. Continuous quality improvement techniques enhance HMO members’ immunization rates. J Healthc Qual. 1998;20(2):36–41.

Horine PD, Pohjala ED, Luecke RW. Healthcare financial managers and CQI. Healthc Financ Manage. 1993;47(9):34.

Reynolds JL. Reducing the frequency of episiotomies through a continuous quality improvement program. CMAJ. 1995;153(3):275–82.

Bunik M, Galloway K, Maughlin M, Hyman D. First five quality improvement program increases adherence and continuity with well-child care. Pediatr Qual Saf. 2021;6(6):e484.

Boyle TA, MacKinnon NJ, Mahaffey T, Duggan K, Dow N. Challenges of standardized continuous quality improvement programs in community pharmacies: the case of SafetyNET-Rx. Res Social Adm Pharm. 2012;8(6):499–508.

Price A, Schwartz R, Cohen J, Manson H, Scott F. Assessing continuous quality improvement in public health: adapting lessons from healthcare. Healthc Policy. 2017;12(3):34–49.

Gage AD, Gotsadze T, Seid E, Mutasa R, Friedman J. The influence of continuous quality improvement on healthcare quality: a mixed-methods study from Zimbabwe. Soc Sci Med. 2022;298:114831.

Chan YC, Ho SJ. Continuous quality improvement: a survey of American and Canadian healthcare executives. Hosp Health Serv Adm. 1997;42(4):525–44.

Balas EA, Puryear J, Mitchell JA, Barter B. How to structure clinical practice guidelines for continuous quality improvement? J Med Syst. 1994;18(5):289–97.

ElChamaa R, Seely AJE, Jeong D, Kitto S. Barriers and facilitators to the implementation and adoption of a continuous quality improvement program in surgery: a case study. J Contin Educ Health Prof. 2022;42(4):227–35.

Candas B, Jobin G, Dubé C, Tousignant M, Abdeljelil A, Grenier S, et al. Barriers and facilitators to implementing continuous quality improvement programs in colonoscopy services: a mixed methods systematic review. Endoscopy Int Open. 2016;4(2):E118–133.

Brandrud AS, Schreiner A, Hjortdahl P, Helljesen GS, Nyen B, Nelson EC. Three success factors for continual improvement in healthcare: an analysis of the reports of improvement team members. BMJ Qual Saf. 2011;20(3):251–9.

Lee S, Choi KS, Kang HY, Cho W, Chae YM. Assessing the factors influencing continuous quality improvement implementation: experience in Korean hospitals. Int J Qual Health Care. 2002;14(5):383–91.

Horwood C, Butler L, Barker P, Phakathi S, Haskins L, Grant M, et al. A continuous quality improvement intervention to improve the effectiveness of community health workers providing care to mothers and children: a cluster randomised controlled trial in South Africa. Hum Resour Health. 2017;15(1):39.

Hyrkäs K, Lehti K. Continuous quality improvement through team supervision supported by continuous self-monitoring of work and systematic patient feedback. J Nurs Manag. 2003;11(3):177–88.

Akdemir N, Peterson LN, Campbell CM, Scheele F. Evaluation of continuous quality improvement in accreditation for medical education. BMC Med Educ. 2020;20(Suppl 1):308.

Barzansky B, Hunt D, Moineau G, Ahn D, Lai CW, Humphrey H, et al. Continuous quality improvement in an accreditation system for undergraduate medical education: benefits and challenges. Med Teach. 2015;37(11):1032–8.

Gaylis F, Nasseri R, Salmasi A, Anderson C, Mohedin S, Prime R, et al. Implementing continuous quality improvement in an integrated community urology practice: lessons learned. Urology. 2021;153:139–46.

Gaga S, Mqoqi N, Chimatira R, Moko S, Igumbor JO. Continuous quality improvement in HIV and TB services at selected healthcare facilities in South Africa. South Afr J HIV Med. 2021;22(1):1202.

Wang F, Yao D. Application effect of continuous quality improvement measures on patient satisfaction and quality of life in gynecological nursing. Am J Transl Res. 2021;13(6):6391–8.

Lee SB, Lee LL, Yeung RS, Chan J. A continuous quality improvement project to reduce medication error in the emergency department. World J Emerg Med. 2013;4(3):179–82.

Chiang AA, Lee KC, Lee JC, Wei CH. Effectiveness of a continuous quality improvement program aiming to reduce unplanned extubation: a prospective study. Intensive Care Med. 1996;22(11):1269–71.

Chinnaiyan K, Al-Mallah M, Goraya T, Patel S, Kazerooni E, Poopat C, et al. Impact of a continuous quality improvement initiative on appropriate use of coronary CT angiography: results from a multicenter, statewide registry, the advanced cardiovascular imaging consortium (ACIC). J Cardiovasc Comput Tomogr. 2011;5(4):S29–30.

Gibson-Helm M, Rumbold A, Teede H, Ranasinha S, Bailie R, Boyle J. A continuous quality improvement initiative: improving the provision of pregnancy care for Aboriginal and Torres Strait Islander women. BJOG: Int J Obstet Gynecol. 2015;122:400–1.

Bennett IM, Coco A, Anderson J, Horst M, Gambler AS, Barr WB, et al. Improving maternal care with a continuous quality improvement strategy: a report from the interventions to minimize preterm and low birth weight infants through continuous improvement techniques (IMPLICIT) network. J Am Board Fam Med. 2009;22(4):380–6.

Krall SP, Iv CLR, Donahue L. Effect of continuous quality improvement methods on reducing triage to thrombolytic interval for Acute myocardial infarction. Acad Emerg Med. 1995;2(7):603–9.

Swanson TK, Eilers GM. Physician and staff acceptance of continuous quality improvement. Fam Med. 1994;26(9):583–6.

Yu Y, Zhou Y, Wang H, Zhou T, Li Q, Li T, et al. Impact of continuous quality improvement initiatives on clinical outcomes in peritoneal dialysis. Perit Dial Int. 2014;34(Suppl 2):S43–48.

Schiff GD, Goldfield NI. Deming meets Braverman: toward a progressive analysis of the continuous quality improvement paradigm. Int J Health Serv. 1994;24(4):655–73.

American Hospital Association Division of Quality Resources Chicago, IL: The role of hospital leadership in the continuous improvement of patient care quality. American Hospital Association. J Healthc Qual. 1992;14(5):8–14,22.

Scriven M. The Logic and Methodology of checklists [dissertation]. Western Michigan University; 2000.

Hales B, Terblanche M, Fowler R, Sibbald W. Development of medical checklists for improved quality of patient care. Int J Qual Health Care. 2008;20(1):22–30.

Vermeir P, Vandijck D, Degroote S, Peleman R, Verhaeghe R, Mortier E, et al. Communication in healthcare: a narrative review of the literature and practical recommendations. Int J Clin Pract. 2015;69(11):1257–67.

Eljiz K, Greenfield D, Hogden A, Taylor R, Siddiqui N, Agaliotis M, et al. Improving knowledge translation for increased engagement and impact in healthcare. BMJ open Qual. 2020;9(3):e000983.

O’Brien JL, Shortell SM, Hughes EF, Foster RW, Carman JM, Boerstler H, et al. An integrative model for organization-wide quality improvement: lessons from the field. Qual Manage Healthc. 1995;3(4):19–30.

Adily A, Girgis S, D’Este C, Matthews V, Ward JE. Syphilis testing performance in Aboriginal primary health care: exploring impact of continuous quality improvement over time. Aust J Prim Health. 2020;26(2):178–83.

Horwood C, Butler L, Barker P, Phakathi S, Haskins L, Grant M, et al. A continuous quality improvement intervention to improve the effectiveness of community health workers providing care to mothers and children: a cluster randomised controlled trial in South Africa. Hum Resour Health. 2017;15:1–11.

Veillard J, Cowling K, Bitton A, Ratcliffe H, Kimball M, Barkley S, et al. Better measurement for performance improvement in low- and middle-income countries: the primary Health Care Performance Initiative (PHCPI) experience of conceptual framework development and indicator selection. Milbank Q. 2017;95(4):836–83.

Barbazza E, Kringos D, Kruse I, Klazinga NS, Tello JE. Creating performance intelligence for primary health care strengthening in Europe. BMC Health Serv Res. 2019;19(1):1006.

Assefa Y, Hill PS, Gilks CF, Admassu M, Tesfaye D, Van Damme W. Primary health care contributions to universal health coverage, Ethiopia. Bull World Health Organ. 2020;98(12):894.

Van Weel C, Kidd MR. Why strengthening primary health care is essential to achieving universal health coverage. CMAJ. 2018;190(15):E463–466.

Acknowledgements

Not applicable.

Funding

The authors received no funding.

Author information

Authors and Affiliations

School of Public Health, The University of Queensland, Brisbane, Australia

Aklilu Endalamaw, Resham B Khatri, Tesfaye Setegn Mengistu, Daniel Erku & Yibeltal Assefa

College of Medicine and Health Sciences, Bahir Dar University, Bahir Dar, Ethiopia

Aklilu Endalamaw & Tesfaye Setegn Mengistu

Health Social Science and Development Research Institute, Kathmandu, Nepal

Resham B Khatri

Centre for Applied Health Economics, School of Medicine, Griffith University, Brisbane, Australia

Daniel Erku

Menzies Health Institute Queensland, Griffith University, Brisbane, Australia

International Institute for Primary Health Care in Ethiopia, Addis Ababa, Ethiopia

Eskinder Wolka & Anteneh Zewdie

Contributions

AE conceptualized the study, developed the first draft of the manuscript, and managed feedback from co-authors. YA conceptualized the study, provided feedback, and supervised the whole process. RBK, TSM, DE, EW, and AZ provided feedback throughout. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Aklilu Endalamaw .

Ethics declarations

Ethics approval and consent to participate

Not applicable because this research is based on publicly available articles.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Supplementary Material 1.

Supplementary Material 2.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Endalamaw, A., Khatri, R.B., Mengistu, T.S. et al. A scoping review of continuous quality improvement in healthcare system: conceptualization, models and tools, barriers and facilitators, and impact. BMC Health Serv Res 24 , 487 (2024). https://doi.org/10.1186/s12913-024-10828-0

Received : 27 December 2023

Accepted : 05 March 2024

Published : 19 April 2024

DOI : https://doi.org/10.1186/s12913-024-10828-0

Keywords

  • Continuous quality improvement
  • Quality of care


NIH OBSSR

A New Model for Studying Social Isolation and Health in People with Serious Mental Illnesses

Researchers have developed a promising new framework for studying the link between social disconnection and poor physical health in people living with serious mental illnesses (SMI). Drawing on published research from animal models and data from the general population, this framework builds on existing social isolation and loneliness models by integrating insights from evolutionary and cognitive theories. This research was supported by the Office of Behavioral and Social Sciences Research and the National Institute of Mental Health.

What were the researchers studying and why?

One of the most challenging aspects of living with SMI involves difficulties with social perception, motivation, and social behavior. These difficulties can lead to social withdrawal and loneliness, outcomes that can contribute to poor heart health and early death. However, researchers have an incomplete understanding of how differences in brain function among people living with SMI affect the connection between their social perception and their self-reported, lived experience of social withdrawal, isolation, or loneliness.

How did the researchers conduct the study?

Researchers from Boston University and Harvard Medical School conducted a selective narrative review of studies addressing social withdrawal, isolation, loneliness, and health in SMI.

Their review highlighted evidence indicating differences in brain activity between people experiencing loneliness and those who are not, particularly in regions associated with social cognitive processes. Additionally, neuroimaging studies have shown increased activation in brain areas responsible for risk assessment among lonely individuals.

Furthermore, the researchers discussed findings suggesting that individuals experiencing loneliness, who perceive others negatively and exhibit signs of psychopathology, may misinterpret social cues, leading to social disconnection. Over time, this social disconnection can prompt a defensive response to social situations, further reducing motivation for social interaction.

What did the study results show?

Based on a synthesis of recent findings indicating a causal relationship between loneliness and nervous system responses that promote inflammation and reduce immunity, the authors developed a testable model of the psychological and neural mechanisms of social disconnection in SMI. They hypothesize that people living with SMI are more likely to experience high levels of chronic psychological stress and are therefore more likely to experience persistently high levels of physiological inflammation. Stress and inflammation biomarkers could thus serve as indicators of an unmet need for social connection, which health providers and caregivers could use to offer social support and connection to those experiencing this need.

What is the potential impact of these findings?

The authors suggest that once their hypothesis has been rigorously tested and verified, new methods to improve health outcomes for people living with SMI may be developed, including potential “just-in-time” digital interventions through mobile devices. The authors also suggest that people living with SMI and experiencing loneliness can receive interventions that address any potential negative beliefs they hold about rejection, thus interrupting the cycle of social isolation.

Citation: Fulford D, Holt DJ. Social Withdrawal, Loneliness, and Health in Schizophrenia: Psychological and Neural Mechanisms . Schizophr Bull. 2023 Sep 7;49(5):1138-1149. doi: 10.1093/schbul/sbad099. PMID: 37419082; PMCID: PMC10483452.

America’s Abortion Quandary

Methodology

The American Trends Panel survey methodology

The American Trends Panel (ATP), created by Pew Research Center, is a nationally representative panel of randomly selected U.S. adults. Panelists participate via self-administered web surveys. Panelists who do not have internet access at home are provided with a tablet and wireless internet connection. Interviews are conducted in both English and Spanish. The panel is being managed by Ipsos.

Data in this report is drawn from the panel wave conducted March 7-13, 2022. A total of 10,441 panelists responded out of 11,687 who were sampled, for a response rate of 89%. The cumulative response rate accounting for nonresponse to the recruitment surveys and attrition is 3%. The break-off rate among panelists who logged on to the survey and completed at least one item is 1%. The margin of sampling error for the full sample of 10,441 respondents is plus or minus 1.5 percentage points. 
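
For readers who want to reproduce these headline figures, a minimal sketch follows. The response-rate arithmetic uses the counts reported above; the design effect, however, is an illustrative assumption chosen so that the simple proportion formula lands near the published plus or minus 1.5 points, since the methodology statement reports the margin directly rather than the design effect.

```python
import math

# Wave-level response rate from the counts reported above.
sampled, responded = 11_687, 10_441
print(f"wave response rate: {responded / sampled:.0%}")   # ~89%

# 95% margin of error for a proportion near 0.5, inflated by a design effect
# from weighting. deff = 2.4 is an assumed illustrative value, not a figure
# published in this methodology statement.
deff = 2.4
moe = 1.96 * math.sqrt(0.25 / responded) * math.sqrt(deff)
print(f"margin of error: +/-{100 * moe:.1f} percentage points")   # ~1.5
```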

Panel recruitment

American Trends Panel recruitment surveys

The ATP was created in 2014, with the first cohort of panelists invited to join the panel at the end of a large, national, landline and cellphone random-digit-dial survey that was conducted in both English and Spanish. Two additional recruitments were conducted using the same method in 2015 and 2017, respectively. Across these three surveys, a total of 19,718 adults were invited to join the ATP, of whom 9,942 (50%) agreed to participate.

In August 2018, the ATP switched from telephone to address-based recruitment. Invitations were sent to a stratified, random sample of households selected from the U.S. Postal Service’s Delivery Sequence File. Sampled households receive mailings asking a randomly selected adult to complete a survey online. A question at the end of the survey asks if the respondent is willing to join the ATP. Starting in 2020, another stage was added to the recruitment. Households that do not respond to the online survey are sent a paper version of the questionnaire, $5 and a postage-paid return envelope. A subset of the adults returning the paper version of the survey are invited to join the ATP. This subset of adults receive a follow-up mailing with a $10 pre-incentive and invitation to join the ATP.

Across the four address-based recruitments, a total of 19,822 adults were invited to join the ATP, of whom 17,472 agreed to join the panel and completed an initial profile survey. In each household, the adult with the next birthday was asked to go online to complete a survey, at the end of which they were invited to join the panel. Of the 27,414 individuals who have ever joined the ATP, 11,687 remained active panelists and continued to receive survey invitations at the time this survey was conducted.

The U.S. Postal Service’s Delivery Sequence File has been estimated to cover as much as 98% of the population, although some studies suggest that the coverage could be in the low 90% range. 2 The American Trends Panel never uses breakout routers or chains that direct respondents to additional surveys.

Sample design

The overall target population for this survey was non-institutionalized persons ages 18 and older, living in the U.S., including Alaska and Hawaii. 

Questionnaire development and testing

The questionnaire was developed by Pew Research Center in consultation with Ipsos. The web program was rigorously tested on both PC and mobile devices by the Ipsos project management team and Pew Research Center researchers. The Ipsos project management team also populated test data that was analyzed in SPSS to ensure the logic and randomizations were working as intended before launching the survey. 

All respondents were offered a post-paid incentive for their participation. Respondents could choose to receive the post-paid incentive in the form of a check or a gift code to Amazon.com or could choose to decline the incentive. Incentive amounts ranged from $5 to $20 depending on whether the respondent belongs to a part of the population that is harder or easier to reach. Differential incentive amounts were designed to increase panel survey participation among groups that traditionally have low survey response propensities.

Data collection protocol

The data collection field period for this survey was March 7-13, 2022. Postcard notifications were mailed to all ATP panelists with a known residential address on March 7, 2022.  

Invitations were sent out in two separate launches: Soft Launch and Full Launch. Sixty panelists were included in the soft launch, which began with an initial invitation sent on March 7, 2022. The ATP panelists chosen for the initial soft launch were known responders who had completed previous ATP surveys within one day of receiving their invitation. All remaining English- and Spanish-speaking panelists were included in the full launch and were sent an invitation on March 8, 2022.

All panelists with an email address received an email invitation and up to two email reminders if they did not respond to the survey. All ATP panelists that consented to SMS messages received an SMS invitation and up to two SMS reminders. 

Data quality checks

To ensure high-quality data, the Center’s researchers performed data quality checks to identify any respondents showing clear patterns of satisficing. This includes checking for very high rates of leaving questions blank, as well as always selecting the first or last answer presented. As a result of this checking, three ATP respondents were removed from the survey dataset prior to weighting and analysis. 
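
A minimal sketch of this kind of screen is shown below: it flags respondents with very high item nonresponse or who select the first (or last) response option on every item they answer. The column names, response codes, and thresholds are hypothetical stand-ins, not the Center's documented rules.

```python
import pandas as pd

def flag_satisficers(df: pd.DataFrame, item_cols: list[str],
                     first_code: int = 1, last_code: int = 4,
                     blank_cutoff: float = 0.5) -> pd.Series:
    """Return a boolean Series marking likely satisficers (hypothetical rules)."""
    items = df[item_cols]

    # Very high rates of leaving questions blank.
    too_blank = items.isna().mean(axis=1) > blank_cutoff

    # Straight-lining: every answered item carries the first (or last) code.
    has_answers = items.notna().any(axis=1)
    all_first = (items.eq(first_code) | items.isna()).all(axis=1)
    all_last = (items.eq(last_code) | items.isna()).all(axis=1)
    straight_lining = has_answers & (all_first | all_last)

    return too_blank | straight_lining

# Example use: drop flagged respondents before weighting and analysis.
# survey = survey.loc[~flag_satisficers(survey, item_cols=QUESTION_COLS)]
```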

The ATP data is weighted in a multistep process that accounts for multiple stages of sampling and nonresponse that occur at different points in the survey process. First, each panelist begins with a base weight that reflects their probability of selection for their initial recruitment survey.  The base weights for panelists recruited in different years are scaled to be proportionate to the effective sample size for all active panelists in their cohort and then calibrated to align with the population benchmarks in the accompanying table to correct for nonresponse to recruitment surveys and panel attrition. If only a subsample of panelists was invited to participate in the wave, this weight is adjusted to account for any differential probabilities of selection.

Among the panelists who completed the survey, this weight is then calibrated again to align with the population benchmarks identified in the accompanying table and trimmed at the 1st and 99th percentiles to reduce the loss in precision stemming from variance in the weights. Sampling errors and tests of statistical significance take into account the effect of weighting.
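
The trimming step lends itself to a short sketch: clip the calibrated weights at their 1st and 99th percentiles and check the effect on an approximate (Kish) design effect, which is what drives the loss in precision mentioned above. The lognormal weights below are simulated purely for illustration; the base-weighting and calibration steps themselves are not reproduced here.

```python
import numpy as np

# Simulated stand-in for calibrated panel weights (real ATP weights come from
# base weights plus calibration to population benchmarks).
rng = np.random.default_rng(0)
calibrated = rng.lognormal(mean=0.0, sigma=0.6, size=10_441)

# Trim at the 1st and 99th percentiles, as described above.
lo, hi = np.percentile(calibrated, [1, 99])
trimmed = np.clip(calibrated, lo, hi)

def kish_deff(w: np.ndarray) -> float:
    """Kish's approximate design effect from unequal weighting: 1 + CV^2."""
    return 1.0 + (w.std() / w.mean()) ** 2

print(f"design effect before trimming: {kish_deff(calibrated):.2f}")
print(f"design effect after trimming:  {kish_deff(trimmed):.2f}")
```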

Some of the population benchmarks used for weighting come from surveys conducted prior to the coronavirus outbreak that began in February 2020. However, the weighting variables for panelists recruited in 2021 were measured at the time they were recruited to the panel. Likewise, the profile variables for existing panelists were updated from panel surveys conducted in July or August 2021.

This does not pose a problem for most of the variables used in the weighting, which are quite stable at both the population and individual levels. However, volunteerism may have changed over the intervening period in ways that made the 2021 measurements incompatible with the available (pre-pandemic) benchmarks. To address this, volunteerism is weighted using the profile variables that were measured in 2020. For all other weighting dimensions, the more recent panelist measurements from 2021 are used.

Weighting dimensions

For panelists recruited in 2021, plausible values were imputed using the 2020 volunteerism values from existing panelists with similar characteristics. This ensures that any patterns of change that were observed in the existing panelists were also reflected in the new recruits when the weighting was performed.
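The report does not describe the imputation algorithm itself; one common way to draw plausible values from donors “with similar characteristics” is hot-deck imputation, sketched below with hypothetical matching variables.

```python
import numpy as np
import pandas as pd

def hot_deck_impute(recipients: pd.DataFrame, donors: pd.DataFrame,
                    match_vars: list, target: str,
                    rng: np.random.Generator) -> pd.Series:
    """Impute `target` for recipients by sampling from matching donors.

    Donors are existing panelists with an observed 2020 value; recipients
    are new recruits. `match_vars` (e.g. ["age_group", "education"]) are
    hypothetical matching characteristics.
    """
    imputed = []
    for _, row in recipients.iterrows():
        pool = donors
        for var in match_vars:            # keep donors who match on every variable
            pool = pool[pool[var] == row[var]]
        if pool.empty:                    # fall back to the full donor pool
            pool = donors
        imputed.append(rng.choice(pool[target].to_numpy()))
    return pd.Series(imputed, index=recipients.index)

# Hypothetical usage:
# rng = np.random.default_rng(2021)
# new_recruits["volunteer_2020"] = hot_deck_impute(
#     new_recruits, existing_panelists, ["age_group", "education"],
#     "volunteer_2020", rng)
```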

The following table shows the unweighted sample sizes and the error attributable to sampling that would be expected at the 95% level of confidence for different groups in the survey. 


Sample sizes and sampling errors for other subgroups are available upon request. In addition to sampling error, one should bear in mind that question wording and practical difficulties in conducting surveys can introduce error or bias into the findings of opinion polls.

Dispositions and response rates


© Pew Research Center, 2022



Students Gather Data On Campus Squirrels As Part Of Urban Wildlife Research Effort

a photo of a squirrel perched on a low oak branch looking at the camera

A longtime favorite of students and frequent subject of fascination across social media, Texas A&M University’s sizeable campus squirrel population is getting even more time in the spotlight this year as student researchers seek to learn more about the animals’ behaviors.

Beginning this month, students in the  Texas A&M College of Agriculture and Life Sciences   Department of Rangeland, Wildlife and Fisheries Management (RWFM) will use various field techniques and statistical analyses to provide quantitative insight into the world of these bushy-tailed campus rodents.

Campus Wildlife Provides Unique Learning Opportunities

Of the eight squirrel species that call Texas home, the highly adaptable Eastern fox squirrel has seemingly found its niche in the open, park-like environments of universities across much of the state. With a variety of habitat options, an abundant food supply, and relatively few predators, it’s no surprise that these campus squirrels flourish.

Dr. Ty Werdel, RWFM assistant professor, said this provides the perfect opportunity to integrate accessible, field-based monitoring with academic coursework.

“The presence of urban wildlife on campus enables our students to conduct research and practice key technical skills in their own backyard,” Werdel said.

A man in a black shirt (Ty Werdel, Ph.D., assistant professor in the Texas A&M University Department of Rangeland, Wildlife and Fisheries Management) helps a student in a blue shirt and hat try to locate squirrels

Project Launches This Spring

Led by Werdel and RWFM graduate students, undergraduates enrolled in Techniques in Wildlife Management will set traps located in trees to capture 12 squirrels across campus. Once the animals are captured, students will collect data such as sex and weight and equip the squirrels with micro-GPS collars.

Werdel said these GPS collars, like very small pet collars, have no detrimental effect on the daily activities of the wildlife wearing them and will provide researchers with fine-scale spatial insight into their movement and behavior.

To mitigate stress on the animal and ensure human safety, only trained personnel will be allowed to handle the squirrels; however, wildlife students will assist in the process. Further, the trapping will take place only in the early morning or late evening hours to avoid the hottest portions of the day.

“It’s important for us to ensure this process results in the least amount of stress possible for the squirrels,” Werdel said. “Prior to even starting this work, we obtained a research permit from the  Texas Parks and Wildlife Department  and approval from the  Texas A&M Division of Research  Animal Use Protocol.”

An Eastern Fox squirrel maneuvers through tree branches after it was fitted with a small radio frequency collar

Information Collected Sheds Light On The Campus Squirrels

A man (Ty Werdel, Ph.D., assistant professor in the Texas A&M University Department of Rangeland, Wildlife and Fisheries Management) helps students with radio frequency gear

In addition to monitoring general movement patterns, the GPS collars, other survey methods and statistical modeling will enable students enrolled in two additional undergraduate courses — Wildlife Population Dynamics and Urban Wildlife and Fisheries — to determine the campus squirrel population, as well as home ranges and habitat preferences across the landscape.

For example, thanks to an existing geographic information system (GIS) database of campus trees, along with data on building density and roads, students can correlate squirrel activity and density with particular landscape features. This helps researchers better understand which campus elements squirrels most prefer or avoid.
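The article does not describe the analysis itself; purely as an illustration of the idea, relating collar fixes to the campus tree layer could look like the sketch below, in which the file names, projected x/y coordinates and the 10-metre threshold are all assumptions.

```python
import numpy as np
import pandas as pd

# Hypothetical inputs: GPS fixes and the campus tree layer, both with
# projected coordinates in metres (e.g. UTM), exported from the GIS.
fixes = pd.read_csv("squirrel_gps_fixes.csv")   # columns: squirrel_id, x, y
trees = pd.read_csv("campus_trees.csv")         # columns: tree_id, x, y, species

tree_xy = trees[["x", "y"]].to_numpy()

def distance_to_nearest_tree(x: float, y: float) -> float:
    """Euclidean distance (m) from one GPS fix to the closest mapped tree."""
    return np.sqrt(((tree_xy - np.array([x, y])) ** 2).sum(axis=1)).min()

fixes["dist_to_tree_m"] = [
    distance_to_nearest_tree(x, y) for x, y in zip(fixes["x"], fixes["y"])
]

# Share of each squirrel's fixes within 10 m of a mapped tree: one crude
# summary of how strongly activity clusters around trees.
print(fixes.groupby("squirrel_id")["dist_to_tree_m"].apply(lambda d: (d < 10).mean()))
```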

Further, students will monitor and assess squirrel mortality, locate and monitor nesting dens, and estimate squirrel populations on campus based on surveys.

“This project will enable students to learn and implement an array of basic wildlife techniques including radio telemetry, census methodology, trapping, GIS and statistical analyses,” Werdel said. “Beyond gaining an understanding of the population dynamics of urban squirrels, this project is really about equipping our students with the skills needed to successfully enter the career field of wildlife management.”

Building On Existing Campus Data

This isn’t the first time Texas A&M students have investigated the behavior of campus squirrels. Twenty-five years ago, Dr. Roel Lopez, head of the Department of Rangeland, Wildlife and Fisheries Management and director of the  Texas A&M Natural Resources Institute , launched a similar project.

“At the time, urban squirrels had never been studied in Texas and had rarely been studied in the U.S.,” said Lopez, then an assistant professor.

This project spanned six years and led to numerous findings, including that male squirrels on the Texas A&M campus are more likely than females to suffer highway-related deaths.

Department faculty were able to revive this project thanks in part to financial support from an alumnus of Texas A&M’s wildlife program.

“College Station and the Texas A&M campus have urbanized and changed drastically since the initial study,” Lopez said. “This will provide an amazing opportunity to see if these changes have affected how these animals use the landscape.”

Werdel said understanding the impact of urbanization on all wildlife species is extremely relevant as cities continue their outward expansion and overlap with wildlife habitat.

“The project’s primary objective is to prepare our students with the technical knowledge and skills needed to conserve and manage any number of wildlife species in an evolving environment,” Werdel said. “While this research is specific to our campus, students will be able to apply what they learn through this project to future wildlife management endeavors.”

This article by Sarah Fuller originally appeared on AgriLife Today.


EU Health Data Space: more efficient treatments and life-saving research  

  • Citizens will have access across the EU to an electronic health record containing prescriptions, imagery and lab tests  
  • Anonymised health data to be shared for research e.g. into rare diseases  
  • Strong privacy safeguards governing how and for what purpose sensitive data are shared  

MEPs approved the creation of a European Health Data Space, improving citizens’ access to their personal health data and boosting secure sharing in the public interest.

On Wednesday, MEPs voted with 445 in favour and 142 against (39 abstentions) to approve the inter-institutional agreement on establishing a European Health Data Space. It will empower patients to access their health data in an electronic format, including from a different member state to the one in which they live, and allow health professionals to consult their patients’ files with their consent (so-called primary use), also from other EU countries. These electronic health records (EHR) would include patient summaries, electronic prescriptions, medical imagery and laboratory results.

The law will make it possible to transfer health data safely to health professionals in other EU countries (based on MyHealth@EU infrastructure), for example when citizens move to another state. It will be possible to download the health record free of charge.

Data-sharing for the common good with safeguards

Additionally, the Health Data Space would unleash the research potential of health data in an anonymised or pseudonymised format. Data including health records, clinical trials, pathogens, health claims and reimbursements, genetic data, public health registry information, wellness data and information on healthcare resources, expenditure and financing, could be processed for public interest purposes, including research, statistics and policy-making (so-called secondary use). Data could, for example, be used to find treatments for rare diseases, where small datasets and fragmentation currently prevent advances in treatments.

Secondary use will not be allowed for commercial purposes including advertising, assessing insurance requests or lending conditions or making job market decisions. Access decisions will be made by national data access bodies.

Robust privacy safeguards

The law ensures people will have a say in how their data are used and accessed. Patients will be able to refuse their health data being accessed by practitioners (except where this is necessary for protecting the vital interests of the data subject or another person) or processed for research purposes, apart from certain public-interest, policy-making or statistical purposes. Patients will also have to be informed each time their data are accessed, and will have the right to request corrections to incorrect data.

Tomislav Sokol (EPP, Croatia), Environment Committee co-rapporteur, said: "The Health Data Space can help us to leverage the data we have in a safe and secure manner, giving vital research into new treatments a major boost. It will prevent gaps in treatment by making sure health professionals can access their patients’ records across borders. At the same time, opt-outs will ensure that patients have a say, and that the system is trustworthy. It will be a major step forward for digital healthcare in the EU."

Annalisa Tardino (ID, Italy), Civil Liberties Committee co-rapporteur, said: “The Health Data Space will boost everyone's access to healthcare. In future, doctors can be authorised to access their patients’ health records and laboratory results in other regions, or even other EU member states, saving money, resources and providing better cures. We also secured opt-outs to ensure that patients have a say in how their data are used. Although we would have preferred even stronger measures, we were able to find a position that can be accepted by a majority."

The provisional agreement still needs to be formally approved by the Council. Once published in the EU’s Official Journal, it will enter into force twenty days later. It will apply two years after entry into force, with certain exceptions, including some categories of data for primary and secondary use, which will apply four to six years later, depending on the category.

By adopting the law, Parliament is responding to the demands of citizens put forward in the conclusions of the Conference on the Future of Europe. These include proposal 8(1), which explicitly recommended the creation of a health data space to facilitate exchanges, and proposals 35(7) and 35(8) on data and artificial intelligence.



How to use and assess qualitative research methods

Loraine Busetto

1 Department of Neurology, Heidelberg University Hospital, Im Neuenheimer Feld 400, 69120 Heidelberg, Germany

Wolfgang Wick

2 Clinical Cooperation Unit Neuro-Oncology, German Cancer Research Center, Heidelberg, Germany

Christoph Gumbinger

Associated data.

Not applicable.

This paper aims to provide an overview of the use and assessment of qualitative research methods in the health sciences. Qualitative research can be defined as the study of the nature of phenomena and is especially appropriate for answering questions of why something is (not) observed, assessing complex multi-component interventions, and focussing on intervention improvement. The most common methods of data collection are document study, (non-) participant observations, semi-structured interviews and focus groups. For data analysis, field-notes and audio-recordings are transcribed into protocols and transcripts, and coded using qualitative data management software. Criteria such as checklists, reflexivity, sampling strategies, piloting, co-coding, member-checking and stakeholder involvement can be used to enhance and assess the quality of the research conducted. Using qualitative in addition to quantitative designs will equip us with better tools to address a greater range of research problems, and to fill in blind spots in current neurological research and practice.

The aim of this paper is to provide an overview of qualitative research methods, including hands-on information on how they can be used, reported and assessed. This article is intended for beginning qualitative researchers in the health sciences as well as experienced quantitative researchers who wish to broaden their understanding of qualitative research.

What is qualitative research?

Qualitative research is defined as “the study of the nature of phenomena”, including “their quality, different manifestations, the context in which they appear or the perspectives from which they can be perceived” , but excluding “their range, frequency and place in an objectively determined chain of cause and effect” [ 1 ]. This formal definition can be complemented with a more pragmatic rule of thumb: qualitative research generally includes data in form of words rather than numbers [ 2 ].

Why conduct qualitative research?

Because some research questions cannot be answered using (only) quantitative methods. For example, one Australian study addressed the issue of why patients from Aboriginal communities often present late or not at all to specialist services offered by tertiary care hospitals. Using qualitative interviews with patients and staff, it found one of the most significant access barriers to be transportation problems, including some towns and communities simply not having a bus service to the hospital [ 3 ]. A quantitative study could have measured the number of patients over time or even looked at possible explanatory factors – but only those previously known or suspected to be of relevance. To discover reasons for observed patterns, especially the invisible or surprising ones, qualitative designs are needed.

While qualitative research is common in other fields, it is still relatively underrepresented in health services research. The latter field is more traditionally rooted in the evidence-based-medicine paradigm, as seen in " research that involves testing the effectiveness of various strategies to achieve changes in clinical practice, preferably applying randomised controlled trial study designs (...) " [ 4 ]. This focus on quantitative research and specifically randomised controlled trials (RCT) is visible in the idea of a hierarchy of research evidence which assumes that some research designs are objectively better than others, and that choosing a "lesser" design is only acceptable when the better ones are not practically or ethically feasible [ 5 , 6 ]. Others, however, argue that an objective hierarchy does not exist, and that, instead, the research design and methods should be chosen to fit the specific research question at hand – "questions before methods" [ 2 , 7 – 9 ]. This means that even when an RCT is possible, some research problems require a different design that is better suited to addressing them. Arguing in JAMA, Berwick uses the example of rapid response teams in hospitals, which he describes as " a complex, multicomponent intervention – essentially a process of social change" susceptible to a range of different context factors including leadership or organisation history. According to him, "[in] such complex terrain, the RCT is an impoverished way to learn. Critics who use it as a truth standard in this context are incorrect" [ 8 ] . Instead of limiting oneself to RCTs, Berwick recommends embracing a wider range of methods , including qualitative ones, which for "these specific applications, (...) are not compromises in learning how to improve; they are superior" [ 8 ].

Research problems that can be approached particularly well using qualitative methods include assessing complex multi-component interventions or systems (of change), addressing questions beyond “what works”, towards “what works for whom when, how and why”, and focussing on intervention improvement rather than accreditation [ 7 , 9 – 12 ]. Using qualitative methods can also help shed light on the “softer” side of medical treatment. For example, while quantitative trials can measure the costs and benefits of neuro-oncological treatment in terms of survival rates or adverse effects, qualitative research can help provide a better understanding of patient or caregiver stress, visibility of illness or out-of-pocket expenses.

How to conduct qualitative research?

Given that qualitative research is characterised by flexibility, openness and responsivity to context, the steps of data collection and analysis are not as separate and consecutive as they tend to be in quantitative research [ 13 , 14 ]. As Fossey puts it : “sampling, data collection, analysis and interpretation are related to each other in a cyclical (iterative) manner, rather than following one after another in a stepwise approach” [ 15 ]. The researcher can make educated decisions with regard to the choice of method, how they are implemented, and to which and how many units they are applied [ 13 ]. As shown in Fig.  1 , this can involve several back-and-forth steps between data collection and analysis where new insights and experiences can lead to adaption and expansion of the original plan. Some insights may also necessitate a revision of the research question and/or the research design as a whole. The process ends when saturation is achieved, i.e. when no relevant new information can be found (see also below: sampling and saturation). For reasons of transparency, it is essential for all decisions as well as the underlying reasoning to be well-documented.

Fig. 1: Iterative research process

While it is not always explicitly addressed, qualitative methods reflect a different underlying research paradigm than quantitative research (e.g. constructivism or interpretivism as opposed to positivism). The choice of methods can be based on the respective underlying substantive theory or theoretical framework used by the researcher [ 2 ].

Data collection

The methods of qualitative data collection most commonly used in health research are document study, observations, semi-structured interviews and focus groups [ 1 , 14 , 16 , 17 ].

Document study

Document study (also called document analysis) refers to the review by the researcher of written materials [ 14 ]. These can include personal and non-personal documents such as archives, annual reports, guidelines, policy documents, diaries or letters.

Observations

Observations are particularly useful to gain insights into a certain setting and actual behaviour – as opposed to reported behaviour or opinions [ 13 ]. Qualitative observations can be either participant or non-participant in nature. In participant observations, the observer is part of the observed setting, for example a nurse working in an intensive care unit [ 18 ]. In non-participant observations, the observer is “on the outside looking in”, i.e. present in but not part of the situation, trying not to influence the setting by their presence. Observations can be planned (e.g. for 3 h during the day or night shift) or ad hoc (e.g. as soon as a stroke patient arrives at the emergency room). During the observation, the observer takes notes on everything or certain pre-determined parts of what is happening around them, for example focusing on physician-patient interactions or communication between different professional groups. Written notes can be taken during or after the observations, depending on feasibility (which is usually lower during participant observations) and acceptability (e.g. when the observer is perceived to be judging the observed). Afterwards, these field notes are transcribed into observation protocols. If more than one observer was involved, field notes are taken independently, but notes can be consolidated into one protocol after discussions. Advantages of conducting observations include minimising the distance between the researcher and the researched, the potential discovery of topics that the researcher did not realise were relevant and gaining deeper insights into the real-world dimensions of the research problem at hand [ 18 ].

Semi-structured interviews

Hijmans & Kuyper describe qualitative interviews as “an exchange with an informal character, a conversation with a goal” [ 19 ]. Interviews are used to gain insights into a person’s subjective experiences, opinions and motivations – as opposed to facts or behaviours [ 13 ]. Interviews can be distinguished by the degree to which they are structured (i.e. a questionnaire), open (e.g. free conversation or autobiographical interviews) or semi-structured [ 2 , 13 ]. Semi-structured interviews are characterized by open-ended questions and the use of an interview guide (or topic guide/list) in which the broad areas of interest, sometimes including sub-questions, are defined [ 19 ]. The pre-defined topics in the interview guide can be derived from the literature, previous research or a preliminary method of data collection, e.g. document study or observations. The topic list is usually adapted and improved at the start of the data collection process as the interviewer learns more about the field [ 20 ]. Across interviews the focus on the different (blocks of) questions may differ and some questions may be skipped altogether (e.g. if the interviewee is not able or willing to answer the questions or for concerns about the total length of the interview) [ 20 ]. Qualitative interviews are usually not conducted in written format as it impedes on the interactive component of the method [ 20 ]. In comparison to written surveys, qualitative interviews have the advantage of being interactive and allowing for unexpected topics to emerge and to be taken up by the researcher. This can also help overcome a provider or researcher-centred bias often found in written surveys, which by nature, can only measure what is already known or expected to be of relevance to the researcher. Interviews can be audio- or video-taped; but sometimes it is only feasible or acceptable for the interviewer to take written notes [ 14 , 16 , 20 ].

Focus groups

Focus groups are group interviews to explore participants’ expertise and experiences, including explorations of how and why people behave in certain ways [ 1 ]. Focus groups usually consist of 6–8 people and are led by an experienced moderator following a topic guide or “script” [ 21 ]. They can involve an observer who takes note of the non-verbal aspects of the situation, possibly using an observation guide [ 21 ]. Depending on researchers’ and participants’ preferences, the discussions can be audio- or video-taped and transcribed afterwards [ 21 ]. Focus groups are useful for bringing together homogeneous (to a lesser extent heterogeneous) groups of participants with relevant expertise and experience on a given topic on which they can share detailed information [ 21 ]. Focus groups are a relatively easy, fast and inexpensive method to gain access to information on interactions in a given group, i.e. “the sharing and comparing” among participants [ 21 ]. Disadvantages include less control over the process and a lesser extent to which each individual may participate. Moreover, focus group moderators need experience, as do those tasked with the analysis of the resulting data. Focus groups can be less appropriate for discussing sensitive topics that participants might be reluctant to disclose in a group setting [ 13 ]. Moreover, attention must be paid to the emergence of “groupthink” as well as possible power dynamics within the group, e.g. when patients are awed or intimidated by health professionals.

Choosing the “right” method

As explained above, the school of thought underlying qualitative research assumes no objective hierarchy of evidence and methods. This means that each choice of single or combined methods has to be based on the research question that needs to be answered and a critical assessment with regard to whether or to what extent the chosen method can accomplish this – i.e. the “fit” between question and method [ 14 ]. It is necessary for these decisions to be documented when they are being made, and to be critically discussed when reporting methods and results.

Let us assume that our research aim is to examine the (clinical) processes around acute endovascular treatment (EVT), from the patient’s arrival at the emergency room to recanalization, with the aim to identify possible causes for delay and/or other causes for sub-optimal treatment outcome. As a first step, we could conduct a document study of the relevant standard operating procedures (SOPs) for this phase of care – are they up-to-date and in line with current guidelines? Do they contain any mistakes, irregularities or uncertainties that could cause delays or other problems? Regardless of the answers to these questions, the results have to be interpreted based on what they are: a written outline of what care processes in this hospital should look like. If we want to know what they actually look like in practice, we can conduct observations of the processes described in the SOPs. These results can (and should) be analysed in themselves, but also in comparison to the results of the document analysis, especially as regards relevant discrepancies. Do the SOPs outline specific tests for which no equipment can be observed or tasks to be performed by specialized nurses who are not present during the observation? It might also be possible that the written SOP is outdated, but the actual care provided is in line with current best practice. In order to find out why these discrepancies exist, it can be useful to conduct interviews. Are the physicians simply not aware of the SOPs (because their existence is limited to the hospital’s intranet) or do they actively disagree with them or does the infrastructure make it impossible to provide the care as described? Another rationale for adding interviews is that some situations (or all of their possible variations for different patient groups or the day, night or weekend shift) cannot practically or ethically be observed. In this case, it is possible to ask those involved to report on their actions – being aware that this is not the same as the actual observation. A senior physician’s or hospital manager’s description of certain situations might differ from a nurse’s or junior physician’s one, maybe because they intentionally misrepresent facts or maybe because different aspects of the process are visible or important to them. In some cases, it can also be relevant to consider to whom the interviewee is disclosing this information – someone they trust, someone they are otherwise not connected to, or someone they suspect or are aware of being in a potentially “dangerous” power relationship to them. Lastly, a focus group could be conducted with representatives of the relevant professional groups to explore how and why exactly they provide care around EVT. The discussion might reveal discrepancies (between SOPs and actual care or between different physicians) and motivations to the researchers as well as to the focus group members that they might not have been aware of themselves. For the focus group to deliver relevant information, attention has to be paid to its composition and conduct, for example, to make sure that all participants feel safe to disclose sensitive or potentially problematic information or that the discussion is not dominated by (senior) physicians only. The resulting combination of data collection methods is shown in Fig.  2 .

Fig. 2: Possible combination of data collection methods

Attributions for icons: “Book” by Serhii Smirnov, “Interview” by Adrien Coquet, FR, “Magnifying Glass” by anggun, ID, “Business communication” by Vectors Market; all from the Noun Project

The combination of multiple data source as described for this example can be referred to as “triangulation”, in which multiple measurements are carried out from different angles to achieve a more comprehensive understanding of the phenomenon under study [ 22 , 23 ].

Data analysis

To analyse the data collected through observations, interviews and focus groups, these need to be transcribed into protocols and transcripts (see Fig. 3). Interviews and focus groups can be transcribed verbatim, with or without annotations for behaviour (e.g. laughing, crying, pausing) and with or without phonetic transcription of dialects and filler words, depending on what is expected or known to be relevant for the analysis. In the next step, the protocols and transcripts are coded, that is, marked (or tagged, labelled) with one or more short descriptors of the content of a sentence or paragraph [ 2 , 15 , 23 ]. Jansen describes coding as “connecting the raw data with “theoretical” terms” [ 20 ]. In a more practical sense, coding makes raw data sortable. This makes it possible to extract and examine all segments describing, say, a tele-neurology consultation from multiple data sources (e.g. SOPs, emergency room observations, staff and patient interview). In a process of synthesis and abstraction, the codes are then grouped, summarised and/or categorised [ 15 , 20 ]. The end product of the coding or analysis process is a descriptive theory of the behavioural pattern under investigation [ 20 ]. The coding process is performed using qualitative data management software, the most common ones being NVivo, MaxQDA and Atlas.ti. It should be noted that these are data management tools which support the analysis performed by the researcher(s) [ 14 ].
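The software packages named above handle this in practice, but the core idea that coding makes raw data sortable can be illustrated with a minimal data structure; the sources, quotes and codes below are invented.

```python
from dataclasses import dataclass

@dataclass
class Segment:
    """One coded segment of raw qualitative data."""
    source: str        # e.g. "SOP", "ER observation 1", "staff interview 3"
    text: str          # the raw sentence or paragraph
    codes: list        # short descriptors assigned by the researcher(s)

segments = [
    Segment("staff interview 3", "We usually request the tele-neurology consult before imaging.",
            ["tele-neurology consultation", "process order"]),
    Segment("ER observation 1", "Consult started 12 minutes after the patient arrived.",
            ["tele-neurology consultation", "delay"]),
    Segment("SOP", "Door-to-needle time should not exceed 60 minutes.",
            ["target times"]),
]

def segments_with_code(code: str) -> list:
    """Pull together all segments tagged with a given code, across sources."""
    return [s for s in segments if code in s.codes]

for s in segments_with_code("tele-neurology consultation"):
    print(f"[{s.source}] {s.text}")
```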

Fig. 3: From data collection to data analysis

Attributions for icons: see Fig. 2, also “Speech to text” by Trevor Dsouza, “Field Notes” by Mike O’Brien, US, “Voice Record” by ProSymbols, US, “Inspection” by Made, AU, and “Cloud” by Graphic Tigers; all from the Noun Project

How to report qualitative research?

Protocols of qualitative research can be published separately and in advance of the study results. However, the aim is not the same as in RCT protocols, i.e. to pre-define and set in stone the research questions and primary or secondary endpoints. Rather, it is a way to describe the research methods in detail, which might not be possible in the results paper given journals’ word limits. Qualitative research papers are usually longer than their quantitative counterparts to allow for deep understanding and so-called “thick description”. In the methods section, the focus is on transparency of the methods used, including why, how and by whom they were implemented in the specific study setting, so as to enable a discussion of whether and how this may have influenced data collection, analysis and interpretation. The results section usually starts with a paragraph outlining the main findings, followed by more detailed descriptions of, for example, the commonalities, discrepancies or exceptions per category [ 20 ]. Here it is important to support main findings by relevant quotations, which may add information, context, emphasis or real-life examples [ 20 , 23 ]. It is subject to debate in the field whether it is relevant to state the exact number or percentage of respondents supporting a certain statement (e.g. “Five interviewees expressed negative feelings towards XYZ”) [ 21 ].

How to combine qualitative with quantitative research?

Qualitative methods can be combined with other methods in multi- or mixed methods designs, which “[employ] two or more different methods [ …] within the same study or research program rather than confining the research to one single method” [ 24 ]. Reasons for combining methods can be diverse, including triangulation for corroboration of findings, complementarity for illustration and clarification of results, expansion to extend the breadth and range of the study, explanation of (unexpected) results generated with one method with the help of another, or offsetting the weakness of one method with the strength of another [ 1 , 17 , 24 – 26 ]. The resulting designs can be classified according to when, why and how the different quantitative and/or qualitative data strands are combined. The three most common types of mixed method designs are the convergent parallel design , the explanatory sequential design and the exploratory sequential design. The designs with examples are shown in Fig.  4 .

Fig. 4: Three common mixed methods designs

In the convergent parallel design, a qualitative study is conducted in parallel to and independently of a quantitative study, and the results of both studies are compared and combined at the stage of interpretation of results. Using the above example of EVT provision, this could entail setting up a quantitative EVT registry to measure process times and patient outcomes in parallel to conducting the qualitative research outlined above, and then comparing results. Amongst other things, this would make it possible to assess whether interview respondents’ subjective impressions of patients receiving good care match modified Rankin Scores at follow-up, or whether observed delays in care provision are exceptions or the rule when compared to door-to-needle times as documented in the registry. In the explanatory sequential design, a quantitative study is carried out first, followed by a qualitative study to help explain the results from the quantitative study. This would be an appropriate design if the registry alone had revealed relevant delays in door-to-needle times and the qualitative study would be used to understand where and why these occurred, and how they could be improved. In the exploratory design, the qualitative study is carried out first and its results help informing and building the quantitative study in the next step [ 26 ]. If the qualitative study around EVT provision had shown a high level of dissatisfaction among the staff members involved, a quantitative questionnaire investigating staff satisfaction could be set up in the next step, informed by the qualitative study on which topics dissatisfaction had been expressed. Amongst other things, the questionnaire design would make it possible to widen the reach of the research to more respondents from different (types of) hospitals, regions, countries or settings, and to conduct sub-group analyses for different professional groups.

How to assess qualitative research?

A variety of assessment criteria and lists have been developed for qualitative research, ranging in their focus and comprehensiveness [ 14 , 17 , 27 ]. However, none of these has been elevated to the “gold standard” in the field. In the following, we therefore focus on a set of commonly used assessment criteria that, from a practical standpoint, a researcher can look for when assessing a qualitative research report or paper.

Assessors should check the authors’ use of and adherence to the relevant reporting checklists (e.g. Standards for Reporting Qualitative Research (SRQR)) to make sure all items that are relevant for this type of research are addressed [ 23 , 28 ]. Discussions of quantitative measures in addition to or instead of these qualitative measures can be a sign of lower quality of the research (paper). Providing and adhering to a checklist for qualitative research contributes to an important quality criterion for qualitative research, namely transparency [ 15 , 17 , 23 ].

Reflexivity

While methodological transparency and complete reporting is relevant for all types of research, some additional criteria must be taken into account for qualitative research. This includes what is called reflexivity, i.e. sensitivity to the relationship between the researcher and the researched, including how contact was established and maintained, or the background and experience of the researcher(s) involved in data collection and analysis. Depending on the research question and population to be researched this can be limited to professional experience, but it may also include gender, age or ethnicity [ 17 , 27 ]. These details are relevant because in qualitative research, as opposed to quantitative research, the researcher as a person cannot be isolated from the research process [ 23 ]. It may influence the conversation when an interviewed patient speaks to an interviewer who is a physician, or when an interviewee is asked to discuss a gynaecological procedure with a male interviewer, and therefore the reader must be made aware of these details [ 19 ].

Sampling and saturation

The aim of qualitative sampling is for all variants of the objects of observation that are deemed relevant for the study to be present in the sample, “to see the issue and its meanings from as many angles as possible” [ 1 , 16 , 19 , 20 , 27 ], and to ensure “information-richness” [ 15 ]. An iterative sampling approach is advised, in which data collection (e.g. five interviews) is followed by data analysis, followed by more data collection to find variants that are lacking in the current sample. This process continues until no new (relevant) information can be found and further sampling becomes redundant – which is called saturation [ 1 , 15 ]. In other words: qualitative data collection finds its end point not a priori, but when the research team determines that saturation has been reached [ 29 , 30 ].

This is also the reason why most qualitative studies use deliberate instead of random sampling strategies. This is generally referred to as “ purposive sampling” , in which researchers pre-define which types of participants or cases they need to include so as to cover all variations that are expected to be of relevance, based on the literature, previous experience or theory (i.e. theoretical sampling) [ 14 , 20 ]. Other types of purposive sampling include (but are not limited to) maximum variation sampling, critical case sampling or extreme or deviant case sampling [ 2 ]. In the above EVT example, a purposive sample could include all relevant professional groups and/or all relevant stakeholders (patients, relatives) and/or all relevant times of observation (day, night and weekend shift).

Assessors of qualitative research should check whether the considerations underlying the sampling strategy were sound and whether or how researchers tried to adapt and improve their strategies in stepwise or cyclical approaches between data collection and analysis to achieve saturation [ 14 ].

Piloting

Good qualitative research is iterative in nature, i.e. it goes back and forth between data collection and analysis, revising and improving the approach where necessary. One example of this is pilot interviews, where different aspects of the interview (especially the interview guide, but also, for example, the site of the interview or whether the interview can be audio-recorded) are tested with a small number of respondents, evaluated and revised [ 19 ]. In doing so, the interviewer learns which wording or types of questions work best, or which is the best length of an interview with patients who have trouble concentrating for an extended time. Of course, the same reasoning applies to observations or focus groups which can also be piloted.

Co-coding

Ideally, coding should be performed by at least two researchers, especially at the beginning of the coding process when a common approach must be defined, including the establishment of a useful coding list (or tree), and when a common meaning of individual codes must be established [ 23 ]. An initial sub-set or all transcripts can be coded independently by the coders and then compared and consolidated after regular discussions in the research team. This is to make sure that codes are applied consistently to the research data.

Member checking

Member checking, also called respondent validation , refers to the practice of checking back with study respondents to see if the research is in line with their views [ 14 , 27 ]. This can happen after data collection or analysis or when first results are available [ 23 ]. For example, interviewees can be provided with (summaries of) their transcripts and asked whether they believe this to be a complete representation of their views or whether they would like to clarify or elaborate on their responses [ 17 ]. Respondents’ feedback on these issues then becomes part of the data collection and analysis [ 27 ].

Stakeholder involvement

In those niches where qualitative approaches have been able to evolve and grow, a new trend has seen the inclusion of patients and their representatives not only as study participants (i.e. “members”, see above) but as consultants to and active participants in the broader research process [ 31 – 33 ]. The underlying assumption is that patients and other stakeholders hold unique perspectives and experiences that add value beyond their own single story, making the research more relevant and beneficial to researchers, study participants and (future) patients alike [ 34 , 35 ]. Using the example of patients on or nearing dialysis, a recent scoping review found that 80% of clinical research did not address the top 10 research priorities identified by patients and caregivers [ 32 , 36 ]. In this sense, the involvement of the relevant stakeholders, especially patients and relatives, is increasingly being seen as a quality indicator in and of itself.

How not to assess qualitative research

The above overview does not include certain items that are routine in assessments of quantitative research. What follows is a non-exhaustive, non-representative, experience-based list of the quantitative criteria often applied to the assessment of qualitative research, as well as an explanation of the limited usefulness of these endeavours.

Protocol adherence

Given the openness and flexibility of qualitative research, it should not be assessed by how well it adheres to pre-determined and fixed strategies – in other words: its rigidity. Instead, the assessor should look for signs of adaptation and refinement based on lessons learned from earlier steps in the research process.

Sample size

For the reasons explained above, qualitative research does not require specific sample sizes, nor does it require that the sample size be determined a priori [ 1 , 14 , 27 , 37 – 39 ]. Sample size can only be a useful quality indicator when related to the research purpose, the chosen methodology and the composition of the sample, i.e. who was included and why.

Randomisation

While some authors argue that randomisation can be used in qualitative research, this is not commonly the case, as neither its feasibility nor its necessity or usefulness has been convincingly established for qualitative research [ 13 , 27 ]. Relevant disadvantages include the negative impact of a too large sample size as well as the possibility (or probability) of selecting “ quiet, uncooperative or inarticulate individuals ” [ 17 ]. Qualitative studies do not use control groups, either.

Interrater reliability, variability and other “objectivity checks”

The concept of “interrater reliability” is sometimes used in qualitative research to assess to which extent the coding approach overlaps between the two co-coders. However, it is not clear what this measure tells us about the quality of the analysis [ 23 ]. This means that these scores can be included in qualitative research reports, preferably with some additional information on what the score means for the analysis, but it is not a requirement. Relatedly, it is not relevant for the quality or “objectivity” of qualitative research to separate those who recruited the study participants and collected and analysed the data. Experiences even show that it might be better to have the same person or team perform all of these tasks [ 20 ]. First, when researchers introduce themselves during recruitment this can enhance trust when the interview takes place days or weeks later with the same researcher. Second, when the audio-recording is transcribed for analysis, the researcher conducting the interviews will usually remember the interviewee and the specific interview situation during data analysis. This might be helpful in providing additional context information for interpretation of data, e.g. on whether something might have been meant as a joke [ 18 ].
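For readers who do encounter such scores, the following minimal example shows how percent agreement and Cohen’s kappa between two co-coders are typically computed (assuming one code per segment); the code labels are invented, and, as noted above, the score by itself says little about the quality of the analysis.

```python
from collections import Counter

# Two coders' codes for the same five segments (invented example).
coder_a = ["delay", "delay", "communication", "equipment", "delay"]
coder_b = ["delay", "communication", "communication", "equipment", "delay"]

n = len(coder_a)
observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n

# Agreement expected by chance, from each coder's marginal code frequencies.
freq_a, freq_b = Counter(coder_a), Counter(coder_b)
expected = sum((freq_a[c] / n) * (freq_b[c] / n) for c in set(coder_a) | set(coder_b))

kappa = (observed - expected) / (1 - expected)
print(f"percent agreement = {observed:.2f}, Cohen's kappa = {kappa:.2f}")
```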

Not being quantitative research

Being qualitative research instead of quantitative research should not be used as an assessment criterion if it is used irrespectively of the research problem at hand. Similarly, qualitative research should not be required to be combined with quantitative research per se – unless mixed methods research is judged as inherently better than single-method research. In this case, the same criterion should be applied for quantitative studies without a qualitative component.

The main take-away points of this paper are summarised in Table 1. We aimed to show that, if conducted well, qualitative research can answer specific research questions that cannot be adequately answered using (only) quantitative designs. Seeing qualitative and quantitative methods as equal will help us become more aware and critical of the “fit” between the research problem and our chosen methods: I can conduct an RCT to determine the reasons for transportation delays of acute stroke patients – but should I? It also provides us with a greater range of tools to tackle a greater range of research problems more appropriately and successfully, filling in the blind spots on one half of the methodological spectrum to better address the whole complexity of neurological research and practice.

Take-away-points

Acknowledgements

Abbreviations

Authors’ contributions

LB drafted the manuscript; WW and CG revised the manuscript; all authors approved the final versions.

Funding

No external funding.

Availability of data and materials

Ethics approval and consent to participate

Consent for publication

Competing interests

The authors declare no competing interests.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Support for Existing Expertise: Community-focused training initiatives to improve the safety and health of Tribal buffalo herd workers


American bison, also known as buffalo, are the largest land mammal in North America and are perfectly adapted to the harsh landscape of the high plains, capable of surviving extreme winters, vast changes in temperature, drought conditions, high humidity, and many diseases that impact other hoofed mammals. In recent decades, indigenous communities across North America and organizations such as the Intertribal Buffalo Council (ITBC) have led efforts to bring the buffalo home to Tribal lands. This work is done with many goals in mind including ecological restoration, cultural and spiritual revitalization, economic growth, and food sovereignty.

However, buffalo are not domesticated and have not been bred for docility. They still exhibit their innate defensive strategies, including aggression and heightened vigilance in comparison with domesticated livestock like cattle. In flight or pursuit, buffalo can reach speeds up to 35 mph and are surprisingly agile given their large size. For these reasons, people working around buffalo are at risk for injury and exposure to zoonotic diseases, which are infections that can spread between animals and people. These risks and the growth of buffalo herding led to initial research which documented both hazards and health and safety best practices in buffalo herding . Now, a new research project seeks to build on past work and partnerships to create relevant and culturally appropriate safety and health training for buffalo herd workers.

The Central States Center for Agricultural Safety and Health (CS-CASH) at the College of Public Health at the University of Nebraska Medical Center is one of 11 regional Centers for Agricultural Safety and Health funded by the National Institute for Occupational Safety and Health. CS-CASH’s new project, “Establishing a Community-Based Training Network to Enhance Bison Herd Workers Safety on Tribal Lands,” aims to support the people who do the hands-on work of managing Tribal buffalo herds by employing what was learned in previous work to:

 1. Continue to monitor and understand workplace injuries, working conditions, and worker safety hazards for buffalo herd workers on Tribal lands;

2. Work with indigenous buffalo herd managers to ensure educational materials and training strategies are culturally relevant and appropriate; and

3. Help develop and support an indigenous-led training and mentorship program focused on worker and herd health.

The photo shows a buffalo worker looking at a herd of buffalo from the passenger side of a pickup truck.

The existing community of Tribal buffalo herd managers and workers contains the world’s foremost experts in buffalo herd management, harvesting, and processing. Tribes who are establishing their own herds may need trusted guidance and support as they work to establish their own programs. What has been lacking, however, is support for expert mentorship and training for these up-and-coming programs. This project intends to help provide this support and foster collaboration between experienced and less experienced tribal groups.

Improving Health and Safety Through Collaboration and Community

Health and safety hazards for buffalo herd workers include working with aging and repurposed equipment (often designed for cattle); working in remote locations during winter months; slip, trip and fall hazards; and high stress handling techniques. For the past five years, CS-CASH and the ITBC have worked together to hold an annual roundtable event which brings together experts and learners to discuss creative solutions, facilitate resource sharing, and document concerns regarding new and existing hazards to worker safety within the community.

Tribal communities have a strong interest in the safety and logistics surrounding cultural harvests and processing. Community events serve as an opportunity for the exchange of cultural knowledge, spiritual practice, as well as supporting community food sovereignty initiatives. This is also an opportunity to refine safety practices surrounding food preparation, to sample organs for disease and parasite monitoring, and to establish practices aimed at supporting the health and safety of the herd, the herd workers, and the broader community.

As the movement to bring bison back to Tribal lands continues to grow, we continue to work with Tribal herd workers and managers, ITBC, and other collaborators to enhance training materials, training opportunities, and support for community-led mentorship. Existing materials, including annual reports summarizing discussions from our annual roundtable events, are available on the Central States Center for Agricultural Safety and Health (CS-CASH) website.

Mystera Samuelson, PhD, Assistant Professor, University of Nebraska Medical Center, College of Public Health, Department of Environmental, Agricultural, and Occupational Health

Arlo Iron Cloud Sr., Porcupine, SD Community Member

Lisa Iron Cloud, Porcupine, SD Community Member

KC Elliott, MA, MPH, Epidemiologist, NIOSH Office of Agriculture Safety and Health

Jessica Post, University of Nebraska Medical Center, Central States Center for Agricultural Safety and Health

Risto Rautiainen, PhD, MS, Professor, University of Nebraska Medical Center, College of Public Health, Director of Central States Center for Agricultural Safety and Health

Ellen Duysen, MPH, COHC, Assistant Research Professor, University of Nebraska Medical Center, College of Public Health, Central States Center for Agricultural Safety and Health

John Gibbins, DVM, MPH, Senior Veterinary Advisor, NIOSH Office of Agriculture Safety and Health

This research is conducted under research cooperative agreement award U54OH010162, supported by the Centers for Disease Control and Prevention National Institute for Occupational Safety and Health (CDC/NIOSH) under CDC funding opportunity RFA-OH-22-002. The contents are those of the author(s) and do not necessarily represent the official views of, nor an endorsement by, CDC/HHS or the U.S. Government.

One comment on “Support for Existing Expertise: Community-focused training initiatives to improve the safety and health of Tribal buffalo herd workers”

Comments listed below are posted by individuals not associated with CDC, unless otherwise stated. These comments do not represent the official views of CDC, and CDC does not guarantee that any information posted by individuals on this site is correct, and disclaims any liability for any loss or damage resulting from reliance on any such information. Read more about our comment policy ».

I am truly amazed that this work with buffalo and the Indian nations exists. I had no idea studies and help were there to help the Indian reservations repopulate the bison herds and take care of them. I am sure the help with managing the care and keeping the workers safe is a big step forward. I applaud Dr. John Gibbins for his work on this subject.





