
Innovation, Multicultural Research, and Future Developments in Testing

Paul Barrett, Head of Research & Development for Psytech International.

Interestingly, the two keynote speakers at last year's Society of Industrial and Organizational Psychology of South Africa (SIOPSA) 6th Annual Conference, 2003, addressed topics which concern Psytech directly: innovations in testing, and the multicultural issues involved when questionnaires and tests constructed in one country are used in another, such as South Africa. Both addresses may be downloaded from the conference website at: http://www.siopsa.org.za/Conference/03conferencepapers.htm

Fons van de Vijver, from Tilburg University in the Netherlands, addressed the conference on the key issues involved in cross-cultural and multicultural assessment. Specifically, he introduced the concept of bias, which occurs when differences in scores on the indicators of a particular construct do not correspond with differences in magnitude on the individual's underlying trait or ability. Three types of bias were identified: construct, method, and item bias. Construct bias occurs when the construct being measured is not identical in meaning across cultures, and especially where the behaviours identified as constitutive of the construct are not identical in each culture. Method bias concerns how the methodology of assessment may change, or in some way fail to be identical, across cultures. Finally, item bias occurs at the item level of a questionnaire: each item's response pattern is examined across cultures, investigating "differential item functioning" by comparing item response rates across cultures and across scale score levels within cultures. From this basic taxonomy, Professor van de Vijver then explored the wider practical issues in cross-cultural research, especially within multicultural settings such as South Africa. Specifically, he outlined the implications for personnel selection and recruitment of using tests which were not "normed" for a particular country or culture, or of using a test which exhibited one or more of the bias effects he had outlined earlier.
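As a rough illustration of the item-level check described above, the sketch below compares the endorsement rate of a single item across two cultural groups within matched bands of the total scale score. The column names, two-group design, and banding scheme are illustrative assumptions, not details of the analyses van de Vijver described.

```python
# Minimal sketch of a differential item functioning (DIF) check: compare
# endorsement rates of one item across two groups within total-score bands,
# so that groups are compared at similar levels of the underlying trait.
import numpy as np
import pandas as pd

def dif_table(df: pd.DataFrame, item: str, group: str, total: str, n_bands: int = 5) -> pd.DataFrame:
    """Return endorsement rates per group within total-score bands for one item."""
    banded = df.copy()
    # Band respondents on the total scale score (quantile bands).
    banded["band"] = pd.qcut(banded[total], q=n_bands, duplicates="drop")
    rates = (banded.groupby(["band", group], observed=True)[item]
                   .mean()
                   .unstack(group))
    # A large within-band gap between groups flags the item for closer study.
    rates["difference"] = rates.max(axis=1) - rates.min(axis=1)
    return rates

# Toy data: 0/1 item responses for two hypothetical cultural groups.
rng = np.random.default_rng(0)
demo = pd.DataFrame({
    "item_07": rng.integers(0, 2, 400),
    "total_score": rng.integers(5, 40, 400),
    "culture": rng.choice(["group_a", "group_b"], 400),
})
print(dif_table(demo, item="item_07", group="culture", total="total_score"))
```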

As a practical exemplar of the kinds of analysis involved in examining questionnaire data for bias, he described the work he had undertaken with Deon Mering, Sebastian Rothmann, and Murray Barrick on testing the construct, item, and method bias of a cognitive ability test (reading, comprehension, and spelling, developed by the South African Police Service) and the Psytech 15FQ+ personality questionnaire within a sample of South African Police Service recruits. The sample consisted of a total of 13,681 participants divided into 12 cultural groups. The results indicated that the cognitive tests were relatively unbiased and possessed good internal consistency reliability. The 15FQ+, whilst showing only minor construct bias amongst certain cultural groups, did demonstrate some significant item bias. Of more concern, within this particular applicant sample the alpha internal consistency indices for some scales were very low, far lower than in any other norm group used for the 15FQ+ in various countries. Nanette Tredoux of Psytech SA later confirmed that low alphas were seen in almost every American- or European-normed test imported into South Africa and used with applicants who had not received tertiary education. This is a challenge that confronts all publishers of imported occupational tests being used in South Africa, one that Psytech SA are meeting with their research on the relation between literacy levels and the utility of some Psytech tests in specific application domains.
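For readers unfamiliar with the statistic, the coefficient (Cronbach's) alpha referred to above can be computed directly from a respondents-by-items score matrix. The sketch below uses simulated data and shows only the standard formula; it reflects nothing specific to the 15FQ+ analyses.

```python
# Minimal sketch of coefficient alpha:
# alpha = (k / (k - 1)) * (1 - sum(item variances) / variance of scale total)
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """scores: 2-D array, rows = respondents, columns = items on one scale."""
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)        # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)    # variance of the scale total
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Toy example: 8 items, 200 respondents sharing a common factor.
rng = np.random.default_rng(1)
common = rng.normal(size=(200, 1))
items = common + rng.normal(scale=1.5, size=(200, 8))
print(round(cronbach_alpha(items), 3))
```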

The second keynote speaker, Psytech's Director of R&D Paul Barrett (now also adjunct professor of psychometrics and performance measurement at Auckland University), took a different theme for his address: innovation in the area of psychological assessment. He opened with a quotation from a paper by Robert Sternberg and Wendy Williams (1998)1: "No technology of which we are aware - computers, telecommunications, televisions, and so on - has shown the kind of ideational stagnation that has characterized the testing industry. Why? Because in other industries, those who do not innovate do not survive. In the testing industry, the opposite appears to be the case. Like Rocky I, Rocky II, Rocky III, and so on, the testing industry provides minor cosmetic successive variants of the same product where only the numbers after the names substantially change. These variants survive because psychologists buy the tests and then loyally defend them (see preceding nine commentaries, this issue). The existing tests and use of tests have value, but they are not the best they can be…". This is a stunning criticism of the psychological test publishing industry. Professor Barrett followed this with the equally infamous statement from Joel Michell (1997)2: "No critic has explained why psychology, alone amongst the sciences, is entitled to its own definition of measurement…. Readers of this journal have been given no adequate reason, yet, to avoid the conclusion that methodological thought disorder is systemic in modern psychology." From these two foundational statements, he elaborated on and explored the status of current psychometrics and psychological tests, concluding that the innovation to be expected in psychometric testing, as it might be expected in any industry over a period of 60 or so years, was indeed lacking. It was for these reasons that he had embarked on the generation of new forms of psychological assessment, as have some leading I/O psychology teams in the US (for example, Susan Embretson and colleagues at Kansas, Fritz Drasgow and colleagues at Illinois, Neal Schmitt and colleagues at Michigan, and Julie Olson-Buchanan in California).

One new technology created by Professor Barrett, already patented in Asia, Europe, and the US, is the graphical profiler. This is a means of acquiring assessments of personality, values, interests, and preferences from individuals without the need for a questionnaire. The technology demonstrated live at the conference is already in use in New Zealand and the US, and research using a Big Five personality test showed that very considerable advances in assessment might be made with this technique. Certainly, Psytech are aware of this research and are considering it in relation to their ongoing product development strategy.

A second product was then showcased, designed around the methodology of computer-based dynamic testing. This technology is being applied to the assessment of abilities, aptitudes, and learning potential. The word "dynamic" is used to describe assessments in which the actual behaviour of an individual becomes the focus of assessment. Instead of posing questions or problems that are answered in a self-report, multiple-choice questionnaire format, these kinds of tasks create working simulations or scenarios in which the respondent has to learn certain rules in order to solve problems. Essentially, what is being assessed is an individual's capacity to acquire new information and apply it. The measures used in these tests are associated with the "learning" and acquisition of knowledge, allied to its application. Psytech's first application in this area is for assessing untrained individuals for computer-programming potential. Again, a live demonstration of the initial prototype for the test was presented at the conference.

After this "psychological assessment" phase of the presentation, Professor Barrett described the second domain of innovation to be expected in psychometric testing within organizational settings: the more sensitive and better-targeted analysis of psychometric data in order to produce better predictions of job success and performance. At the heart of his work here is the attempt to dramatically increase the financial return on investment from using psychometrics in the workplace. Two methodologies were showcased: person-target profiling, and the application of data mining and non-linear classification techniques to profile construction. Psytech are already beginning to use some of these methods in their GeneSys Profiler Module, whilst Nanette Tredoux of Psytech SA has applied non-linear CART (Classification and Regression Tree) analysis to the optimised construction of a completely different kind of person-profile.
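As a hedged illustration of how a CART-style method can yield an interpretable person-profile, the sketch below fits a shallow classification tree to simulated scale scores and prints its split rules. The scale names, outcome variable, and cut-points are hypothetical and are not drawn from the Psytech SA work.

```python
# Minimal sketch of CART-style profile construction: predict a job-performance
# category from scale scores, then read the tree's split rules as a candidate
# profile (cut-points on particular scales separating higher from lower performers).
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(2)
scale_names = ["warmth", "dominance", "conscientiousness", "anxiety"]
X = rng.normal(size=(500, len(scale_names)))          # standardised scale scores
# Simulated outcome loosely driven by two of the scales plus noise.
y = (0.8 * X[:, 2] - 0.5 * X[:, 3] + rng.normal(scale=0.7, size=500)) > 0.3

tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=25, random_state=0)
tree.fit(X, y)

# The printed rules act as a non-linear, interpretable profile.
print(export_text(tree, feature_names=scale_names))
```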

So, overall, the contrasting themes of the two keynotes certainly provided much food for thought for conference participants. What is important to note is that Psytech are now involved in truly leading-edge research and development, as well as continuing to enhance and maintain their current product base. Interesting times are ahead for us all.


1 Sternberg, R. J., & Williams, W. M. (1998). You proved our point better than we did: A reply to our critics. American Psychologist, 53, 576-577.

2 Michell, J. (1997). Quantitative science and the definition of measurement in psychology. British Journal of Psychology, 88(3), 355-383.

 
