QAP Assessment Tool: Response to Feedback

The Quality Assurance Program (QAP) is a fundamental way the College fulfills its mandate to regulate the profession of dental hygiene to protect the public and ensure safe practice. The QAP Assessment Tool is intended to strike a balance between ensuring public protection and remaining reasonable for registrants to complete. The revised QAP Assessment Tool was developed by the College, with questions and answers written by BC dental hygienists from a wide range of practice settings, including clinical practice. It is a three-hour, open-book, interactive, online assessment with 70 questions. The revised Assessment Tool is less expensive ($85 vs. $125) and more interactive than the previous tool.

As 2021 was the first year that we used this Assessment Tool, it is inevitable that some areas will be identified for adjustment. Evaluation of the Assessment Tool is one of the guiding principles of the Quality Assurance Program and is undertaken annually. In past years, this has included a performance analysis and a registrant survey. This year, a psychometric analysis has also been added now that the Assessment Tool has been developed in-house.

For answers to common questions we have received about the QAP Assessment Tool, please read the Q&A below.
 
For more information specifically about the QAP Assessment Tool’s security features, please read more about the Proctorio Chrome extension in this issue of Access.

Q. How do you evaluate the QAP Assessment Tool?
A. Evaluation of the Tool is one of the guiding principles of the Quality Assurance Program and is done in a number of ways. Data on overall performance is gathered and monitored annually (e.g., the proportion of the annual cohort who are successful or unsuccessful, the average time spent taking the Tool, etc.), as is data on performance within content categories. There is also a survey that registrants in the annual cohort are asked to complete after they have finished taking the Tool; it is hosted and analyzed by an independent research firm. In addition to these ongoing evaluation measures, the questions on the Assessment Tool are now also evaluated by a psychometrician, since the Assessment Tool was brought in-house. The psychometric review and analysis looks at things like whether the questions perform as expected, the difficulty level of the questions individually and of the Tool overall, and whether the different question formats are performing in a fair and defensible manner.

Q. How have you responded to feedback from registrants who completed the tool in 2021?
A. We have heard feedback from registrants who were part of the 2021 cohort. We received responses from 49 per cent (219) of the 2021 cohort, a notable increase from response rates in previous years.

Our Spring 2021 review confirmed that the revised Tool is valid, reliable, and fair. We heard registrant feedback about the question format, but we are also mindful that we received feedback on the previous tool from registrants who felt the multiple-choice format did not reflect real life. We won't be changing the question format for the Tool.

We also did a thorough review of the questions as part of the normal course of reviewing the Assessment Tool. We identified some questions that we will update to ensure currency or, in a few cases, retire and replace for the 2022 assessment.

Q. Why am I being assigned more sub-categories for my Required Learning Plan (RLP) with the revised QAP tool?
A. Through our analysis, it appears registrants received an average of one to two more sub-categories than in their last QAP cycle. With the revised Tool, there was a slight change to the tool blueprint as well as an increase in the variety of questions in the item bank. This allowed for a better balancing of questions across the sub-categories than the previous tool. This is why some registrants may have received a small increase in the number of sub-categories assigned in their Online Learning Plan (OLP).

Q. The questions in the revised QAP Assessment Tool don't seem to reflect current dental hygiene practice, and there is not enough focus on clinical skills. The majority of us work in private clinical settings, yet there were questions about public health and community settings. Why is that?
A. The purpose of the Quality Assurance Program is to ensure knowledge across all areas of foundational dental hygiene practice. The goal is not to create a tool that only covers a given registrant's main area of practice. As such, there may be some aspects of the assessment that do not relate to a hygienist's current practice environment. Since the College does not regulate practice environments, all registrants are expected to have some foundational knowledge of all areas of practice. The number of questions about different practice settings is proportionate to the settings that registrants practice in. For example, only two or three of the 70 questions were about practicing in a community public health setting.

Q. I have heard that some questions in the Assessment Tool require multiple answers, for example dragging and dropping multiple fields. Why is this? And why is there not partial scoring for questions like this?
A. The benefit of the Tool is not the regurgitation of knowledge but the application of knowledge to scenarios similar to those you encounter when treating a client in real life. Because registrants often need to make multiple decisions or assessments for a single situation in their practice, the Tool does not offer partial marks; this reflects actual practice.

Q. Why am I being assessed on my knowledge of local anaesthesia in the QAP Assessment Tool and the new LA Module?
A. As local anaesthesia is part of the blueprint for the QAP Assessment Tool, there are questions about the administration of local anaesthesia in the Tool. All registrants need to have foundational knowledge of pain management, regardless of whether or how often they administer LA. This won't change with the launch of the LA Module.

Q. Why couldn’t I review incorrect answers as well as have a prompt for the correct answer?
A. Unfortunately, we cannot show incorrect answers in the revised Tool, as the new platform lacks the functionality to show only the questions answered incorrectly. We encourage registrants to reflect on how their assigned sub-categories apply to their area of practice. The goal is not to "correct" the question that was answered incorrectly; it is for the assigned sub-category to help direct learning in that area as it applies to a registrant's practice. As such, there is little value in reviewing incorrect answers on the Tool. In addition, publishing any answers would put the security and validity of the Tool at risk.

  • You can read answers to more questions about the Tool on the CDHBC website.
  • Information about the Tool is also available in the QAP Info Guide.