  • © 2006 American Dental Education Association

The Impact of Quality Assurance Programming: A Comparison of Two Canadian Dental Hygienist Programs

Joanna Asadoorian, A.A.S. (DH), B.Sc.D. (DH), M.Sc., and David Locker, B.D.S., Ph.D.

School of Dental Hygiene, Faculty of Dentistry, University of Manitoba

Direct correspondence and requests for reprints to Prof. Joanna Asadoorian, School of Dental Hygiene, Faculty of Dentistry, University of Manitoba, D35-780 Bannatyne Avenue, Winnipeg, Manitoba R3E 0W2 Canada; 204-789-3574 phone; 204-789-3948 fax; Joanna_Asadoorian{at}UManitoba.ca.
  • Received July 25, 2005.
  • Accepted May 25, 2006.

Abstract

Quality assurance (QA) and continuing competence (CC) programs aim to ensure acceptable levels of health care provider competence, but it is unknown which program methods most successfully achieve this goal. The objectives of the study reported in this article were to compare two distinct QA/CC programs for Canadian dental hygienists and to assess the impact of these programs on practice behavior change, a proxy measure for quality. British Columbia (BC) and Ontario (ON) were compared because the former mandates continuing education (CE) time requirements while the latter requires self-directed, portfolio-based programming. A two-group comparison survey design using a self-administered questionnaire was implemented in randomly selected samples from the two jurisdictions. No statistical differences were found in total activity, change opportunities, or change implementation, but ON study subjects participated in significantly more activities that yielded change opportunities and more activities that generated appropriate change implementation, meaning positive and correct approaches to providing care, than did BC dental hygienists. Both groups reported implementing change to a similarly high degree. The findings suggest that ON dental hygienists participated in more learning activities with relevance to their practice and learning needs than did BC subjects and indicate that the QA program in ON may allow for greater efficiency in professional learning.

Continuing competence (CC), quality assurance (QA), and quality improvement programs are strategies designed to ensure that the public receives not only appropriate and technically sound care, but also that health care delivery improves over time. The structure of these programs has become a contentious issue because requirements for registrants vary considerably between professions and jurisdictions and are resource-intensive. Program developers are increasingly obligated to validate CC requirements.

Canadian dental hygiene provincial QA/CC programs range in format from traditional mandatory continuing education (CE)-based programming, such as that occurring in British Columbia (BC) and other provinces, to distinctive portfolio-based schemes, like that implemented in Ontario (ON). A full review of Canadian dental hygiene QA program requirements has been previously published.1

This article reports the findings of a survey-based study that compared the impact of the BC and ON dental hygiene QA programs on positive practice behavior change, a necessary prerequisite for practice improvements. These two Canadian self-regulated jurisdictions were selected because of their dissimilar QA/CC program requirements and because both have a sufficiently large population of dental hygienists to allow for statistical analysis. A previous article reported on the fulfillment of program requirements by study participants.2

Background

While keeping up-to-date with current research is foundational to providing quality care, it is the application of recent findings to practice that has the potential to improve practice.3–5 There is a considerable lag between the publication of new research findings and their general application in the majority of practice settings,4,6–9 which has helped generate debate surrounding the best way to encourage the use of current research in practice. Because the delivery of competent care requires the application of new knowledge and skills in practice, measurement of change in practice behaviors has been used as a proxy for quality and improvement.3,10,11

The idea that mandatory CE as an isolated mechanism, especially when passively disseminated, ensures CC of providers or quality in practice has not been empirically demonstrated.4,10,12–15 New initiatives in QA programming have been partly fueled by research demonstrating that learning based on individual needs and the practice environment does lead to practice change.10,13,15,16 The College of Dental Hygienists of Ontario (CDHO), a provincial regulatory body, implemented such an innovation in QA programming in 1999.1

The ON program requires registrants to self-direct their learning, guided by ongoing self-assessment comparing individual practice behaviors to practice standards.17 Through the self-assessment, learning goals and a plan are established, the plan is implemented, and, finally, the plan is evaluated for the attainment of goals.17 Regardless of the learning resources selected or their quantity, the critical element is that activities are based on the predetermined learning goals. In contrast, the BC program requires seventy-five credit hours of formally recognized CE within a three-year cycle.1 While self-assessment is recommended, it is not an enforced component of the CE program in BC.

Materials and Methods

This study utilized an observational, cross-sectional, two-group comparison survey design. Data were collected with self-administered, mailed questionnaires composed primarily of closed-response items. The research proposal and survey instrument were validated based on face, content, consensual, criterion, and construct validity.18,19 A four-step approach was used to develop the survey instrument. First, knowledge acquired through previous research was applied.1 Second, an extensive literature review of QA, CE, quality improvement, CC, behavior change theory, adult learning theory, and the principles of health survey design was conducted. Next, several consultations with experts in QA, dental hygiene, survey methods and design, behavioral sciences, and psychometrics were carried out. Finally, pre-testing of the survey instrument was conducted with a convenience sample of practicing dental hygienists, and appropriate revisions were made.

The sample size was calculated based on a formula for two-group comparisons.18 While no comparable research within this specific population was available on which to base predicted values, Aday states that assumptions may be made based on knowledge of the group and previous research.18 We estimated a 50 percent difference between the two groups because self-assessment is a requirement within the ON QA program whereas, in BC, self-assessment is only a recommendation. Based on this estimate, we calculated that each group required 860 cases to detect a difference between the group proportions for the main outcome variable: implementation of appropriate behavior change in professional practice. Appropriateness was defined for study subjects within the survey as “what you believe to be a positive and correct approach to your work and to be consistent with your knowledge, skill, and professional standards.”
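To illustrate this type of calculation, the sketch below applies the standard normal-approximation formula for comparing two independent proportions; the input proportions shown are hypothetical placeholders, since the article reports only the resulting requirement of 860 cases per group and the response rate adjustment described below.

```python
import math

from scipy.stats import norm


def n_per_group(p1: float, p2: float, alpha: float = 0.05, power: float = 0.80) -> int:
    """Approximate sample size per group for detecting a difference between
    two proportions (normal-approximation formula for two-group comparisons)."""
    z_alpha = norm.ppf(1 - alpha / 2)  # two-sided critical value
    z_beta = norm.ppf(power)           # critical value for the desired power
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / (p1 - p2) ** 2)


# Hypothetical proportions for the outcome in each group -- the study's
# actual assumed values are not reported in this article.
n_required = n_per_group(p1=0.30, p2=0.45)
n_to_mail = math.ceil(n_required / 0.75)  # inflate for an expected 75% response rate
```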

While positive health care outcomes have been described as the most robust method for testing the impact of continuing medical education3 and are considered the “gold standard” for the study of QA in health care, outcomes can be complicated to measure and may be influenced by factors outside the delivery of care, such as lack of patient compliance and the nature of the disease process.10,11 Greco and Eisenberg propose a less stringent test: whether health care providers (physicians) change their practice.3 Measuring change in practitioner behavior is based on the premise that improvements in health care delivery, specifically structure and process elements, will increase the likelihood of positive health care outcomes.11 The measurement of practice behavior change has been used as a proxy measure in other studies.3 In this study, the primary outcome measure was behavior change reported as occurring as a direct result of selected learning activities, such as continuing education, within a two-year period. We assert that, as professional health care providers with legal and ethical responsibilities, dental hygienists are expected to be able to distinguish between appropriate and inappropriate care within their own scope of practice, knowledge, and level of expertise. After reading a journal article or attending a course on an innovation in periodontal maintenance therapy, for example, the dental hygienist would be expected to implement this innovation in practice.

We used a 0.05 level of significance. In less complicated study designs, such as the systematic random sampling used in this investigation, Aday states that it is unnecessary to estimate and adjust for an anticipated design effect.18 Adjustments for an expected response rate of 75 percent were made.

Survey procedures were based, where feasible, on criteria from Dillman’s “Mail and Other Self-Administered Questionnaires” and Aday’s Designing and Conducting Health Surveys.18,20 Standardized survey procedures were followed exactingly to ensure reliability of the results. The study received ethics approval from the University of Toronto and scientific approval from the Faculty of Dentistry. Strict privacy and confidentiality were ensured throughout the study, and no risks to participants were expected. Sample subjects were drawn disproportionately from the registries of active, registered dental hygienists in the two jurisdictions, BC and ON. Exclusion criteria included registrants with fewer than two years of registration in BC or ON, those registered in more than one province, and those not actively registered in either BC or ON at the time of the survey.

The survey was divided into two components: Part I requested demographic data and QA activities and has been previously reported on,2 and Part II examined appropriate professional behavior change. This article presents the results of Part II. The survey instrument is available by contacting the first author.

In order to measure change in practice, study subjects were asked to report on their participation in seventeen specified dental hygiene and related learning activities (Table 1) and, in addition, any other learning activities not included in the seventeen specified activities within the previous two-year period (see example in Figure 1). Study subjects were asked to report on all learning activities undertaken during the previous two years, including informal (e.g., journal reading, literature searches) and formal (e.g., continuing education lectures, study clubs) methods. Examples of formal and informal activities were provided within the survey. While respondents were asked in Part I of the questionnaire to report on the number of hours/days of informal and formal learning activities participated in,2 they were not asked to provide details on specific activities in relation to changes in behavior in Part II of the study. Opportunities were defined as “appropriate behavior change suggested and applicable for your work setting,” keeping in mind that the term “appropriate” had been defined earlier in the survey instrument. Three total scores were calculated as presented in Table 2. Three subsequent ratio scores were formulated for each respondent to present more accurate reflections of appropriate change implementation (Table 3). Because the primary objective of the study was to measure behavior change as an outcome of learning activity, change had to be measured as a proportion of opportunities rather than as total change alone, since total activity scores may have varied between respondents.

Figure 1.

Example of specified dental hygiene learning activity

We interpreted the last score, implementation from activity, as a measure of learning efficiency.
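To make the scoring concrete, the following sketch computes the three total scores (Table 2) and the three ratio scores (Table 3) for a single respondent; the instrument’s exact coding rules are not reproduced here, so this is one plausible reading of those tables.

```python
from dataclasses import dataclass


@dataclass
class Respondent:
    # One tuple per reported learning activity:
    # (participated, opportunity for appropriate change, change implemented)
    activities: list[tuple[bool, bool, bool]]

    def total_scores(self) -> tuple[int, int, int]:
        activity = sum(p for p, _, _ in self.activities)
        opportunity = sum(p and o for p, o, _ in self.activities)
        implementation = sum(p and o and i for p, o, i in self.activities)
        return activity, opportunity, implementation

    def ratio_scores(self) -> tuple[float, float, float]:
        a, o, i = self.total_scores()
        opportunity_from_activity = o / a if a else 0.0
        implementation_from_opportunity = i / o if o else 0.0
        implementation_from_activity = i / a if a else 0.0  # learning "efficiency"
        return (opportunity_from_activity,
                implementation_from_opportunity,
                implementation_from_activity)
```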

Finally, a self-directed learning score was calculated as a reflection of the degree to which the respondents implemented the components of self-directed learning within practice, which theoretically guided their learning and change. Self-directed learning, as described originally by Knowles in 1975,21 involves several steps to guide learning and make it more relevant for the individual. The steps included in self-directed learning for the purposes of this study are:

  • self-assessment to determine professional weaknesses,

  • establishment of personal learning goals,

  • development of a plan for action,

  • implementation of a plan for action, and

  • re-evaluation to determine whether one’s goals were attained.

Respondents were asked whether they always, sometimes/occasionally, or never performed these self-directed learning steps and were allocated 2, 1, or 0 points, respectively, for a possible score out of 10. Self-directed learning scores were then analyzed to determine whether a significant relationship existed between the self-directed learning score and the outcome measure, appropriate behavior change implementation.
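A minimal sketch of this scoring scheme follows; the step labels are paraphrased from the list above rather than taken verbatim from the survey instrument.

```python
SDL_POINTS = {"always": 2, "sometimes/occasionally": 1, "never": 0}

SDL_STEPS = ("self-assessment", "goal setting", "plan development",
             "plan implementation", "re-evaluation")


def sdl_score(responses: dict[str, str]) -> int:
    """Sum the points over the five self-directed learning steps (maximum 10)."""
    return sum(SDL_POINTS[responses[step]] for step in SDL_STEPS)


# Example: a respondent who always self-assesses and sets goals, sometimes
# plans and implements, and never re-evaluates scores 2+2+1+1+0 = 6.
example = {"self-assessment": "always", "goal setting": "always",
           "plan development": "sometimes/occasionally",
           "plan implementation": "sometimes/occasionally",
           "re-evaluation": "never"}
assert sdl_score(example) == 6
```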

Table 1.

Specified dental hygiene and related learning activities

Table 2.

Description of total scores

Table 3.

Description of ratio scores

Results

A total of 1,750 study subjects were randomly drawn from the two provincial registries (nBC=875, nON=875). A response rate of 49.5 percent of eligible subjects was achieved through two mailings. Of the eligible respondents, 46.6 percent (nON=404) and 53.4 percent (nBC=463) were from ON and BC, respectively.

Representativeness of the sample to the population was determined through comparisons of demographic and educational background data from this survey with that of a large Canadian study conducted in 2001 with a near 80 percent response rate.22 No statistically significant differences were demonstrated between the data sets, with the exception of a significantly greater proportion of ON respondents in the present study (13.4 percent) who had obtained their dental hygiene education within a university setting, as compared to those in the 2001 study (7.1 percent) (p=0.005).

Calculation of the activity score resulted in a range of zero to nineteen activities reported for both groups. Means for each of the three total scores were calculated (Table 4), and t-tests showed no significant difference between the two groups (activity score p=0.08; opportunity score p=0.27; implementation score p=0.42).

Although the differences were not significant, the means show that BC study subjects reported participating in slightly more total activities over the two-year period, whereas ON dental hygienists demonstrated more total opportunities for practice behavior change.

A t-test of mean ratio scores demonstrated that ON dental hygienists participated in significantly more professional learning activities that yielded relevant opportunities for change (opportunity from activity ratio score) than BC respondents (p<0.05) (Table 5). The results indicated no significant difference between the BC and ON respondents’ implementation from opportunity scores: when appropriate suggestions for change were presented, both groups implemented them to a similarly high degree. While the means revealed that both groups implemented much less change in relation to total activity (implementation from activity score) than in relation to total opportunities, respondents from ON showed a significantly greater proportion (p<0.05), suggesting more efficient learning strategies.
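As a minimal illustration of the comparison performed on these scores, the sketch below runs an independent two-sample t-test on simulated, hypothetical ratio scores standing in for the respondents’ actual data, which are not published with the article.

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(seed=1)

# Hypothetical opportunity-from-activity ratio scores for each group,
# with sample sizes matching the eligible respondents (404 ON, 463 BC).
ratio_on = rng.uniform(0.0, 1.0, size=404)
ratio_bc = rng.uniform(0.0, 1.0, size=463)

t_stat, p_val = ttest_ind(ratio_on, ratio_bc)  # independent two-sample t-test
print(f"t = {t_stat:.2f}, p = {p_val:.3f}")
```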

Previously reported findings demonstrated that ON participants had significantly higher self-directed learning (SDL) scores than those in BC, which we believe is at least partially a result of the ON QA program requirements.2 The mean SDL score out of 10 was 6.98 (s=2.40) in ON and 6.14 (s=2.89) in BC (p<0.05).2

We tested for correlations between SDL scores and the three ratio change scores. We believe that, of these scores, the best indicator of appropriate change is the implementation from opportunity score, because no implementation of change would be expected from activities that did not yield appropriate suggestions for change. Using the Spearman’s rho correlation test for non-parametric data, a small, positive, significant correlation (rs=0.14, p<0.01) was demonstrated between the SDL score and the implementation from opportunity score (see Cohen, as cited in Hopkins, for the interpretation of correlation coefficient magnitudes23).

Correlations were also calculated between the SDL score and both the implementation from activity score and the opportunity from activity score. Again, small, positive, significant correlation coefficients were revealed (implementation from activity score rs=0.16, p<0.01; opportunity from activity score rs=0.11, p<0.01).

More moderate correlations were demonstrated between SDL scores and the total scores: activity score rs=0.30 (p<0.01), opportunity score rs=0.32 (p<0.01), and implementation score rs=0.34 (p<0.01). These coefficients indicate that as the SDL score increases, so do the total amount of activity, total opportunities for practice change, and total implementation of practice change.

Correlation coefficients were also calculated to test for associations between demographic variables and the implementation from opportunity score. Only age and experience showed very small positive correlations (rs=0.12 for both; p<0.01).
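The association tests reported above can be reproduced in outline with Spearman’s rho, as in this sketch on simulated stand-in data; the per-respondent scores themselves are not published with the article.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(seed=2)

# Simulated SDL scores (0-10) and implementation-from-opportunity ratios
# for 867 hypothetical respondents.
sdl = rng.integers(0, 11, size=867).astype(float)
impl_from_opp = rng.uniform(0.0, 1.0, size=867)

rho, p = spearmanr(sdl, impl_from_opp)  # rank-based, non-parametric correlation
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")
```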

Table 4.

Comparison of mean total scores

Table 5.

Comparison of mean ratio scores

Discussion

No statistical differences were found between the two groups in any of the total scores (activity, opportunities, or implementation), but the ON respondents participated in significantly more activities that yielded change opportunities and more activities that generated appropriate change implementation than did dental hygienists in BC. Both groups implemented change to a similarly high degree when presented with an appropriate opportunity. These findings suggest greater efficiency in professional learning strategies for the ON subjects.

Several reasons may explain this phenomenon. First, a significantly higher level of adherence to the self-directed learning process may give ON respondents greater acuity in selecting appropriate learning activities. However, the small correlations between self-directed learning scores and the ratio change scores did not support this rationale. Second, the CE time requirements imposed in BC, coupled with more limited access to formal learning resources,2 may act as barriers to selecting learning interventions that have a greater likelihood of presenting relevant suggestions for change suitable for implementation. Third, the ON QA program removes the preceding barrier and permits autonomous selection of learning activities. We believe a combination of these factors is responsible.

Only small but significant positive correlations were found between self-directed learning scores and the three ratio scores, while more moderate, significant positive correlations were demonstrated between self-directed learning scores and the three total scores. This suggests that the self-directed learning process, in its entirety, is positively related to all aspects of learning: participation in activity, opportunities for change encountered, and implementation of change. Had the self-directed learning scores been more highly correlated with change opportunities and change implementation, but not with activity participation, it might have been reasoned that the self-directed learning process is associated with appropriate choice of activities and subsequent implementation. However, this distinction was not evident.

It is our assertion that dental hygienists in BC will need to participate in more learning activities than dental hygienists in ON in order to encounter similar quantities of appropriate opportunities for change. This point raises very important questions regarding the validity and legitimacy of imposing CE time requirements on registrants within professional colleges. We recommend further scrutiny of QA and CC program requirements, especially where CE time requirements are imposed, as they may be inappropriate and may inhibit efficiency in learning.

We have identified four main weaknesses of the study. First, only the final stage of the change process, implementation, was measured, so the study was not sensitive to the stages of change. Slotnick has stated that research needs to incorporate the movement from one stage to the next when evaluating the success of an intervention on change.24 Second, the study was based on somewhat discretionary self-reported rather than observed data and is therefore subject to reporting inaccuracies. Social desirability bias occurs when an individual does not adhere to a social norm but reports doing so when questioned.25 Despite a suspected overreporting of activity and change, these influences are believed to be reasonably equal between the groups, meaning that comparisons between groups can be made, correlations calculated, and conclusions drawn. However, caution must be emphasized when examining the total scores because these results are based on subjective, individual interpretations. Third, the generalizations made from the literature review presented in this article, especially references 10–16, may be somewhat unjustified because most of that research was conducted on physicians. Bero remarks that the generalizability of findings from physician studies to other settings is uncertain because of educational differences, the structure of health care systems, and different barriers to change.14 Finally, the study design did not incorporate a control group. Including a province with no formal QA program might have yielded useful comparative data. While a small follow-up study in another province is conceivable, it may be argued that more traditional programming based on mandated CE time requirements provides the control, as very few jurisdictions, particularly those that are self-regulated, have no QA/CC programming in place.

Conclusions

No statistical differences were found in any of the total scores (activity, opportunities, or implementation) between BC and ON study subjects, whose QA/CC programs are based on mandatory CE and self-directed learning, respectively. However, the ON respondents reported participating in significantly more activities yielding change opportunities and generating appropriate change implementation than did dental hygienists in BC. When presented with an appropriate opportunity, respondents from both groups implemented change to a similarly high degree. Small but significant positive correlations were found between self-directed learning scores and the three ratio scores, while more moderate, significant positive correlations were demonstrated between self-directed learning scores and the three total scores.

This study has generated important hypotheses worthy of investigation, including how the SDL process can be enhanced to provide more accurate and meaningful reflections of individual learning needs, how and why the change process may break down in dental hygiene and other health professions, and how QA policy and programming can best facilitate appropriate learning strategies and the positive practice changes required for CC.

Acknowledgments

The authors thank the Community Dental Health Services Research Unit, Faculty of Dentistry, University of Toronto for the generous financial support provided for this project. In addition, we thank the Dentistry Canada Fund for awarding the DCF/Warner Lambert Fellowship for Dental Hygienists Community/Special Interest Project, which also helped finance this study. Finally, we thank the College of Dental Hygienists of British Columbia and the College of Dental Hygienists of Ontario for providing their respective provincial registries for study sampling procedures.

Footnotes

  • Prof. Asadoorian is Assistant Professor and First-Year Clinical Coordinator, School of Dental Hygiene, Faculty of Dentistry, University of Manitoba; Dr. Locker is Professor and Director, Community Dental Health Services Research Unit, Faculty of Dentistry, University of Toronto.

REFERENCES

  1. Asadoorian J. Quality assurance programs for self-regulated dental hygienists in Canada: a comparative analysis. Probe Scientific 2001;35(6):225–32.

  2. Asadoorian J, Locker D. Quality assurance programming in Canada: an investigation into the fulfillment of dental hygiene requirements in British Columbia and Ontario. Can J Dent Hyg 2005;39(4):168–87.

  3. Greco PJ, Eisenberg JM. Changing physicians’ practices. N Engl J Med 1993;329(17):1271–4.

  4. Bauchner H, Simpson L, Chessare J. Changing physician behaviour. Arch Dis Child 2001;84(6):459–62.

  5. Berwick DM. A primer on leading the improvement of systems. BMJ 1996;312(7031):619–22.

  6. Anderson FA, Wheeler HB, Goldberg RG, Hosmer DW, Forcier A, Patwardhan NA. Changing clinical practice. Arch Intern Med 1994;154(6):669–77.

  7. Kanouse DE, Jacoby I. When does information change practitioner behaviour? Int J Technol Assess Health Care 1988;4:27–33.

  8. Greer AL. The state of the art versus the state of the science. Int J Technol Assess Health Care 1988;4:5–26.

  9. Jolley S. Raising research awareness: a strategy for nurses. Nurs Stand 2002;16(33):33–9.

  10. Davis DA, Thomson MA, Oxman AD, Haynes RB. Evidence for the effectiveness of CME. JAMA 1992;268(9):1111–7.

  11. Chassin MR, Galvin RW. The urgent need to improve health care quality: Institute of Medicine national roundtable on health care quality. JAMA 1998;280(11):1000–5.

  12. Davis DA, Thomson MA, Oxman AD, Haynes RB. Changing physician performance: a systematic review of the effect of continuing medical education strategies. JAMA 1995;274(9):700–5.

  13. Oxman AD, Thomson MA, Davis DA, Haynes RB. No magic bullets: a systematic review of 102 trials of interventions to improve professional practice. CMAJ 1995;153(10):1423–31.

  14. Bero LA, Grilli R, Grimshaw JM, Harvey E, Oxman AD, Thomson MA. Getting research findings into practice: closing the gap between research and practice. BMJ 1998;317(7156):465–8.

  15. Cantillon P, Jones R. Does continuing medical education in general practice make a difference? BMJ 1999;318(7193):1276–9.

  16. Grant J. Learning needs assessment: assessing the need. BMJ 2002;324(7330):156–9.

  17. The College of Dental Hygienists of Ontario. Quality assurance program. At: www.cdho.org/. Accessed: May 10, 2005.

  18. Aday LA. Designing and conducting health surveys, 2nd ed. San Francisco: Jossey-Bass, 1996.

  19. Fletcher RH, Fletcher SW, Wagner EH. Clinical epidemiology: the essentials, 3rd ed. Baltimore: Williams and Wilkins, 1996.

  20. Dillman D. Mail and other self-administered questionnaires. In: Rossi PH, Wright JD, Anderson AB, eds. Handbook of survey research. New York: Academic Press, 1983.

  21. Knowles M. Self-directed learning: a guide for learners and teachers. New York: Association Press, 1975.

  22. Johnson PM. Dental hygiene practice in Canada 2001. Report no. 3: findings. Ottawa: Canadian Dental Hygienists Association, 2002.

  24. Slotnick HB. How doctors learn: physicians’ self-directed learning episodes. Acad Med 1999;74(10):1106–17.

  25. Adams AS, Soumerai SB, Lomas J, Ross-Degnan D. Evidence of self-report bias in assessing adherence to guidelines. Int J Qual Health Care 1999;11(3):187–92.
