A Reevaluation of Assessment Center Construct-Related Validity

Milton V. Cahoon, Mark C. Bowler, Jennifer L. Bowler

Abstract


Recent Monte Carlo research (Lance, Woehr, & Meade, 2007) has questioned the primary analytical tool used to
assess the construct-related validity of assessment center post-exercise dimension ratings (PEDRs) – a
confirmatory factor analysis of a multitrait-multimethod (MTMM) matrix. By utilizing a hybrid of Monte Carlo
data generation and univariate generalizability theory, we examined three primary sources of variance (i.e.,
persons, dimensions, and exercises) and their interactions in 23 previously published assessment center MTMM
matrices. Overall, the person, dimension, and person by dimension effects accounted for a combined 34.06% of
variance in assessment center PEDRs (16.83%, 4.02%, and 13.21%, respectively). However, the largest single
effect came from the person by exercise interaction (21.83%). Implications and suggestions for future
assessment center research and design are discussed.
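The variance decomposition described above can be reproduced in outline with a standard univariate generalizability-theory (persons x dimensions x exercises) random-effects analysis. The sketch below is not the authors' code: the fully crossed design with one rating per cell, the simulated sample sizes, and the component magnitudes are illustrative assumptions used only to show how person, dimension, exercise, and interaction components would be estimated and expressed as percentages of PEDR variance.

```python
"""Minimal sketch of a univariate G-theory variance decomposition for a
fully crossed persons x dimensions x exercises design (one rating per cell).
Illustrative only; data and component values are assumptions, not the
published results."""
import numpy as np


def g_theory_components(y):
    """Estimate random-effects variance components for a p x d x e array
    via the ANOVA expected-mean-squares method."""
    n_p, n_d, n_e = y.shape
    m = y.mean()

    # Marginal means for each facet and each two-way combination.
    mp = y.mean(axis=(1, 2))           # persons
    md = y.mean(axis=(0, 2))           # dimensions
    me = y.mean(axis=(0, 1))           # exercises
    mpd = y.mean(axis=2)               # person x dimension
    mpe = y.mean(axis=1)               # person x exercise
    mde = y.mean(axis=0)               # dimension x exercise

    # Sums of squares for main effects and two-way interactions.
    ss = {
        "p": n_d * n_e * np.sum((mp - m) ** 2),
        "d": n_p * n_e * np.sum((md - m) ** 2),
        "e": n_p * n_d * np.sum((me - m) ** 2),
        "pd": n_e * np.sum((mpd - mp[:, None] - md[None, :] + m) ** 2),
        "pe": n_d * np.sum((mpe - mp[:, None] - me[None, :] + m) ** 2),
        "de": n_p * np.sum((mde - md[:, None] - me[None, :] + m) ** 2),
    }
    # Residual: three-way interaction confounded with error.
    fitted = (mpd[:, :, None] + mpe[:, None, :] + mde[None, :, :]
              - mp[:, None, None] - md[None, :, None] - me[None, None, :] + m)
    ss["res"] = np.sum((y - fitted) ** 2)

    df = {
        "p": n_p - 1, "d": n_d - 1, "e": n_e - 1,
        "pd": (n_p - 1) * (n_d - 1), "pe": (n_p - 1) * (n_e - 1),
        "de": (n_d - 1) * (n_e - 1),
        "res": (n_p - 1) * (n_d - 1) * (n_e - 1),
    }
    ms = {k: ss[k] / df[k] for k in ss}

    # Expected-mean-squares solutions; negative estimates truncated at zero.
    comp = {
        "res": ms["res"],
        "pd": (ms["pd"] - ms["res"]) / n_e,
        "pe": (ms["pe"] - ms["res"]) / n_d,
        "de": (ms["de"] - ms["res"]) / n_p,
        "p": (ms["p"] - ms["pd"] - ms["pe"] + ms["res"]) / (n_d * n_e),
        "d": (ms["d"] - ms["pd"] - ms["de"] + ms["res"]) / (n_p * n_e),
        "e": (ms["e"] - ms["pe"] - ms["de"] + ms["res"]) / (n_p * n_d),
    }
    return {k: max(v, 0.0) for k, v in comp.items()}


if __name__ == "__main__":
    # Illustrative Monte Carlo draw: 200 persons, 4 dimensions, 5 exercises.
    rng = np.random.default_rng(0)
    n_p, n_d, n_e = 200, 4, 5
    y = (rng.normal(0, 1.0, (n_p, 1, 1))          # person main effect
         + rng.normal(0, 0.5, (1, n_d, 1))        # dimension main effect
         + rng.normal(0, 0.9, (n_p, 1, n_e))      # person x exercise
         + rng.normal(0, 0.8, (n_p, n_d, 1))      # person x dimension
         + rng.normal(0, 1.0, (n_p, n_d, n_e)))   # residual
    comp = g_theory_components(y)
    total = sum(comp.values())
    for k, v in comp.items():
        print(f"{k:>3}: {100 * v / total:5.1f}% of PEDR variance")
```

Expressing each estimated component as a percentage of the total, as in the final lines above, mirrors how the person, dimension, exercise, and interaction effects are reported in the abstract.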


DOI: 10.5539/ijbm.v7n9p3

This work is licensed under a Creative Commons Attribution 3.0 License.

International Journal of Business and Management, ISSN 1833-3850 (Print), ISSN 1833-8119 (Online)

Copyright © Canadian Center of Science and Education
