TY - JOUR
T1 - Assessment of radiology physicians by a regulatory authority
AU - Lockyer, Jocelyn M.
AU - Violato, Claudio
AU - Fidler, Herta M.
PY - 2008/6
Y1 - 2008/6
N2 - Purpose: To determine whether it is possible to develop a feasible, valid, and reliable multisource feedback program for radiologists. Materials and Methods: Surveys with 38, 29, and 20 items were developed to assess individual radiologists by eight radiologic colleagues (peers), eight referring physicians, and eight co-workers (eg, technicians), respectively, by using five-point scales along with an "unable to assess" category. Radiologists completed a self-assessment on the basis of the peer questionnaire. Items addressed key competencies related to clinical competence, collegiality, professionalism, workplace behavior, and self-management. The study was approved by the University of Calgary Conjoint Health Ethics Research Board. Results: Data from 190 radiologists were available. The mean numbers of respondents per physician were 7.5 of eight (1259 of 1520, 83%), 7.15 of eight (1337 of 1520, 88%), and 7.5 of eight (1420 of 1520, 93%) for peers, referring physicians, and co-workers, respectively. The internal consistency reliability indicated all instruments had a Cronbach α of more than 0.95. The generalizability coefficient analysis indicated that the peer, referring-physician, and co-worker instruments achieved a generalizability coefficient of 0.88, 0.79, and 0.87, respectively. The factor analysis indicated that four factors on the colleague questionnaire accounted for 70% of the total variance: clinical competence, collegiality, professional development, and workplace behavior. For the referring physician survey, three factors accounted for 64.1% of the variance: professional development, professional consultation, and professional responsibility. Two factors on the co-worker questionnaire accounted for 63.2% of the total variance: professional responsibility and patient interaction. Conclusion: The psychometric examination of the data suggests that the instruments developed to assess radiologists are a feasible way to assess radiology practice and provide evidence for validity and reliability.
AB - Purpose: To determine whether it is possible to develop a feasible, valid, and reliable multisource feedback program for radiologists. Materials and Methods: Surveys with 38, 29, and 20 items were developed to assess individual radiologists by eight radiologic colleagues (peers), eight referring physicians, and eight co-workers (eg, technicians), respectively, by using five-point scales along with an "unable to assess" category. Radiologists completed a self-assessment on the basis of the peer questionnaire. Items addressed key competencies related to clinical competence, collegiality, professionalism, workplace behavior, and self-management. The study was approved by the University of Calgary Conjoint Health Ethics Research Board. Results: Data from 190 radiologists were available. The mean numbers of respondents per physician were 7.5 of eight (1259 of 1520, 83%), 7.15 of eight (1337 of 1520, 88%), and 7.5 of eight (1420 of 1520, 93%) for peers, referring physicians, and co-workers, respectively. The internal consistency reliability indicated all instruments had a Cronbach α of more than 0.95. The generalizability coefficient analysis indicated that the peer, referring-physician, and co-worker instruments achieved a generalizability coefficient of 0.88, 0.79, and 0.87, respectively. The factor analysis indicated that four factors on the colleague questionnaire accounted for 70% of the total variance: clinical competence, collegiality, professional development, and workplace behavior. For the referring physician survey, three factors accounted for 64.1% of the variance: professional development, professional consultation, and professional responsibility. Two factors on the co-worker questionnaire accounted for 63.2% of the total variance: professional responsibility and patient interaction. Conclusion: The psychometric examination of the data suggests that the instruments developed to assess radiologists are a feasible way to assess radiology practice and provide evidence for validity and reliability.
UR - http://www.scopus.com/inward/record.url?scp=45149132782&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=45149132782&partnerID=8YFLogxK
U2 - 10.1148/radiol.2473071431
DO - 10.1148/radiol.2473071431
M3 - Article
C2 - 18375839
AN - SCOPUS:45149132782
SN - 0033-8419
VL - 247
SP - 771
EP - 778
JO - Radiology
JF - Radiology
IS - 3
ER -