QAP Assessment Tool: Your Questions Answered


Q. Why is there a successful/unsuccessful threshold for the QAP Assessment Tool?
Q. Do I lose my licence if I don’t meet the successful threshold?
Q. Does the new QAP Assessment Tool record me while I take it? What is this Proctorio plug-in?
Q. The new QAP Assessment Tool seems more difficult than the old assessment tool. Is that true?
Q. Will this new QAP Assessment Tool be phased out with amalgamation?
Q. Why does the QAP Assessment Tool have a fee while a module like the JEM does not?
Q. I’ve heard it takes registrants longer to complete the new QAP Assessment Tool. Is this true?
Q. Why am I not able to take the QAP Assessment Tool on my smartphone?
Q. How do you evaluate the QAP Assessment Tool?
Q. How have you responded to feedback from registrants who completed the Tool in 2021?
Q. Why am I being assigned more sub-categories for my Required Learning Plan (RLP) with the revised Tool?
Q. The questions in the revised Tool don't seem to be reflective of current dental hygiene practice and do not have enough of a focus on clinical skills. The majority of us work in private clinical settings and there were questions about public health and community settings. Why is that?
Q. I have heard that some questions in the Assessment Tool require multiple answers, for example dragging and dropping multiple fields. Why is this? And why is there not partial scoring for questions like this?
Q. Why am I being assessed on my knowledge of local anaesthesia in the QAP Assessment Tool and the new LA Module?
Q. Why couldn't I review incorrect answers as well as have a prompt for the correct answer?

Q. Why is there a successful/unsuccessful threshold for the QAP Assessment Tool?

A. The Quality Assurance Program Assessment Tool serves two purposes. First, the College requires assurance that registrants have remained current in their foundational dental hygiene knowledge to practice safely, so a successful/unsuccessful threshold is a critical aspect of the program. Second, this success threshold identifies a small minority of registrants who require further assessment and support to improve their practice. This is done through the second assessment tier: an on-site Professional Performance Assessment which entails a documentation audit based on the College’s Practice Standards. The registrant works collaboratively with a QAP Assessor and the College to determine a directed learning plan to remediate in the necessary areas.

Back to top

Q. Do I lose my licence if I don’t meet the successful threshold?

A. No. Registrants who have two unsuccessful attempts on the QAP Assessment Tool enter the second tier of the Quality Assurance Program for a Professional Performance Assessment (PPA). The Quality Assurance Committee, which oversees registrants who are involved in the second tier of the QAP, does not have the ability under the Health Professions Act or the College Bylaws to remove a registrant’s licence.

Back to top

Q. Does the new QAP Assessment Tool record me while I take it? What is this Proctorio plug-in?

A. The Proctorio plugin provides security features for the QAP Assessment Tool, including temporarily disabling the ability to copy, paste, or print while taking the tool. These features were built into the previous tool, but the platform for the new tool does not have that functionality, which is why Proctorio was used beginning in 2021.

It is in registrants’ best interests to maintain the integrity of the QAP Assessment Tool. Substantial investment was required to create the tool and if it were compromised in any way, registrant funds would be needed to address this.

While Proctorio has the capability to deliver other enhanced security features such as video proctoring, those are not enabled for the College’s tool, nor are they expected to be in the future. Furthermore, the law does not allow the College to engage these features without your explicit consent.

One benefit of Proctorio is that if you have an IT issue while completing the assessment, the College is able to verify the issue and issue a free re-take.

Back to top

Q. The new QAP Assessment Tool seems more difficult than the old assessment tool. Is that true?

A. The new version of the QAP Assessment Tool was developed by BC dental hygienists from a wide range of practice settings, including clinical practice. The proportion of registrants who were not successful after a first attempt in 2021 is nearly identical to that of the previous tool: 4 per cent of the 2021 cohort compared to an average of 3 per cent for the cohorts over the previous eight years. That said, the QAP Assessment Tool would not be an appropriate assessment if the questions were too easy. In addition, feedback from the QAP Assessment Tool is intended to guide professional development, which also contributes to enhancing knowledge, currency, and competency.

Back to top

Q. Will this new QAP Assessment Tool be phased out with amalgamation?

A. The Health Professions Act (HPA) requires each BC health regulatory college to develop a quality assurance program that includes an assessment component as a measure of public assurance that health professionals are remaining current and competent in their practice.

The amalgamation will merge four oral health colleges in B.C., but the mandate of the new regulator will be the same: to regulate all the oral health professions in the public interest and to ensure that it is fulfilling its obligations under the Health Professions Act.

Back to top

Q. Why does the QAP Assessment Tool have a fee while a module like the JEM does not?

A. The resources required to create and maintain modules like the Jurisprudence Education Module (JEM) can be covered through registrant renewal fees as the content requires minimal updates each year. More resources are needed to maintain the QAP Assessment Tool as the process to develop content for an assessment that is current and defensible is more robust and complex. The $85 fee reflects the resources required for ongoing development of the tool which is necessary both to maintain its security and to have a current and accurate question item bank. To ensure transparency and fairness, the CDHBC Board of Directors decided to charge a fee at the time a registrant completes the QAP Assessment Tool, rather than increasing registrants’ annual fees to cover the cost.

It is worth noting that the new Assessment Tool costs less than the previous assessment tool ($85 instead of $125 plus tax).

Back to top

Q. I’ve heard it takes registrants longer to complete the new QAP Assessment Tool. Is this true?

A. As the new tool has an interactive question format, it may take some registrants more time to complete than the previous multiple-choice tool. In recognition of this, we have extended the allotted time from 2.5 hours to 3 hours and reduced the number of questions from 75 to 70. Of the 2021 cohort, 69 per cent completed it between 2.5 and 3 hours, 22 per cent between 2 and 2.5 hours, and 7 per cent in less than two hours. While the maximum amount of time has increased, registrants can finish the tool more quickly if they choose.

Back to top

Q. Why am I not able to take the QAP Assessment Tool on my smartphone?

A. We’re not developing a mobile-friendly version at this time, as the platform does not support touchscreen technology for smartphones and tablets, including iPads. A computer in a quiet environment is recommended.

Back to top

Q. How do you evaluate the QAP Assessment Tool?
A. Evaluation of the Tool is one of the guiding principles of the Quality Assurance Program and is done in a number of ways. Data on overall performance is gathered and monitored annually (e.g., the proportion of the annual cohort who are successful/unsuccessful, average time spent taking the Tool, etc.), as is data on performance within content categories. There is also a survey that registrants in the annual cohort are asked to complete after they've finished taking the Tool; it is hosted and analyzed by an independent research firm. In addition to these ongoing evaluation measures, evaluation of the questions on the Assessment Tool has been undertaken by a psychometrician since the Assessment Tool was brought in-house. The psychometric review and analysis looks at things like whether the questions perform as expected, the difficulty level of the questions individually and the Tool overall, and whether the different question formats are performing in a fair and defensible manner.

Back to top

Q. How have you responded to feedback from registrants who completed the Tool in 2021?
A. We have heard feedback from registrants who were part of the 2021 cohort. We received responses from 49 per cent (219) of the 2021 cohort, a notable increase from response rates in previous years.
Our Spring 2021 review confirmed that the revised Tool is valid, reliable, and fair.
We heard registrant feedback about the question format, but we are also mindful that we received feedback on the previous tool from registrants who did not like the multiple-choice format as it is not reflective of real life. We won't be changing the question format for the Tool.
We also did a thorough review of the questions as part of the normal course of reviewing the Assessment Tool. We identified some questions that we will update to ensure currency or, in a few cases, retire and replace for the 2022 assessment.

Back to top

Q. Why am I being assigned more sub-categories for my Required Learning Plan (RLP) with the revised Tool?
A. Through our analysis, it appears registrants received an average of one to two more sub-categories than in their last QAP cycle. With the revised Tool, there was a slight change to the tool blueprint as well as an increase in the variety of questions in the item bank. This allowed for a better balancing of questions across the sub-categories than the previous tool, which is why some registrants may have received a small increase in the number of sub-categories assigned in their Online Learning Plan (OLP).

Back to top

Q. The questions in the revised Tool don't seem to be reflective of current dental hygiene practice and do not have enough of a focus on clinical skills. The majority of us work in private clinical settings and there were questions about public health and community settings. Why is that?
A. The purpose of the Quality Assurance Program is to ensure knowledge across all areas of foundational dental hygiene practice. The goal is not to create a tool that only covers a respective registrant's main area of practice. As such, there may be some aspects of the assessment that do not relate to a hygienist's current practice environment. Since the College does not regulate practice environments, all registrants are expected to have some foundational knowledge of all areas of practice. The number of questions about different practice settings is proportionate to the settings that registrants practice in. For example, there were only 2 or 3 questions about practicing in a community public health setting out of 70 questions.

Back to top

Q. I have heard that some questions in the Assessment Tool require multiple answers, for example dragging and dropping multiple fields. Why is this? And why is there not partial scoring for questions like this?
A. The benefit of the Tool is not the recall of knowledge but the application of knowledge in scenarios similar to those you see when treating clients in real life. Because registrants often need to make multiple decisions or assessments in a single situation in their practice, the Tool does not offer partial marks; this reflects actual practice.

Back to top

Q. Why am I being assessed on my knowledge of local anaesthesia in the QAP Assessment Tool and the new LA Module?
A. As local anaesthesia is part of the blueprint for the QAP Assessment Tool, there are questions about the administration of local anaesthesia in the Tool. All registrants need to have foundational knowledge of pain management, regardless of whether, or how often, they administer LA. This won't change with the launch of the LA Module.

Back to top

Q. Why couldn't I review incorrect answers as well as have a prompt for the correct answer?
A. Unfortunately, we cannot show incorrect answers in the revised Tool, as the new platform lacks the functionality to show only the questions answered incorrectly. We encourage registrants to reflect on how their assigned sub-categories apply to their area of practice. The goal is not to "correct" the question that was answered incorrectly; the goal is for the assigned sub-category to assist in directing learning in that area as it applies to a registrant's practice. As such, there is little value in reviewing incorrect answers on the Tool. In addition, publishing any answers would put the security and validity of the tool at risk.

Back to top