Validation vs. “Vladation”: The Flawed Logic Behind Many Assessments

Interviewer: We are speaking today with world-famous HR test expert, Dr. Vladimir Blotov. Welcome, Dr. Blotov.

Vladimir Blotov: Hello. Let’s not stand on formality. Please, call me Vlad.

Interviewer: Okay, Vlad. I couldn’t help but notice you continually misspell the word “validation” as “vladation.” What’s up?

Vlad: Ho! Ho! Ho! Vladation is not misspelled, my dyslexic friend! “Vladation” is much superior to the traditional forms of assessment validation. Vladation is the same as validation… only it’s different. Validation is too complicated. Vladation is much preferred by organizations!

Interviewer: Interesting. I must admit I am new to the field, Vlad. What is the difference between validation and vladation?

Vlad: Well, try to follow along and I will explain. Vladation has several components that set it apart from so-called validation. First, vladation is not taught in accredited universities. Second, I support vladation data with personal testimony. Third, I only use high and low producer averages, not egghead statistics. Finally, my clients like to keep things simple.

Interviewer: You raise some interesting questions. Why do you suppose vladation is not taught in universities?

Vlad: Simple! People prefer to base decisions on their own experiences. They don’t want to get involved in high-brow, fancy-dancy number-crunching and research papers. Besides, they might have to change their minds after reading all that research stuff. Mind-changing is hard work. My system makes intuitive sense. Were you ever enrolled in a special-ed class?

Interviewer: Isn’t the purpose of a test, Vlad, to evaluate an applicant’s ability to perform a job? You know, high scores predict high performance; low scores predict low performance… that kind of thing?

Vlad: Of course! That’s the beauty of vladation. If the applicant’s scores match an average of high performers, they’re hired!

Interviewer: But group averages tend to hide individual differences. On average, a person with one foot in a fire and the other in a bucket of ice water is comfortable. Averaging eliminates critical data about individuals in the group.

Vlad: What’s your point?

Interviewer: If individuals in the high group don’t match their own group average, how can you assume the group average is a legitimate benchmark?

Vlad: Yes, yes, a minor technicality. I’ll soon release an impressive computer program that compares how many people in the high-average group meet (and do not meet) the group average. It will be written in Widows.

Interviewer: Don’t you mean “Windows”?

Vlad: No, no. Windows is quite inferior. I developed the Widows operating system to run my proprietary software programs. It’s just like Windows, only different.

Interviewer: Okay… Well, I guess that clears things up a bit. As I mentioned, I am not an expert. But why don’t you compare individuals’ test scores with individual production? That way you could get a one-to-one comparison.

Vlad: Too complicated! I have worked with hundreds of test companies who use vladation to sell tests. Vladation is everywhere! People like it. It must be good!

Interviewer: But you are using group data to predict individual performance! That’s stereotyping. How do you justify stereotyping?

Vlad: Feedback!

Interviewer: Feedback?

Vlad: Is there an echo in here? Yes, customer feedback. People tell me vladation works all the time.

Interviewer: Do these people conduct scientific studies to confirm whether their personal opinions are legitimate?

Vlad: Did I mention we also give them job standards for comparison?

Interviewer: I don’t understand. My question was about legitimacy. Now you mention comparative job standards. Are you changing the subject?

Vlad: Pay attention. We give clients a set of external standards to compare applicants with. If they don’t like their internal averages, they can use external averages. Brilliant, yes?

Interviewer: Doesn’t that imply that the organization’s internal requirements are exactly the same as the external ones?

Vlad: You catch on fast. Everyone knows that all sales jobs are identical, all managerial jobs are identical, all engineering jobs are identical, and so forth. Furthermore, when you strip away cultural differences, competitive differences, performance differences, product and service differences, managerial expectations, and organizational mission, every organization is exactly the same! This allows us to compare any and every job with a pseudo-standard (well, we don’t actually call it that).

Interviewer: Pseudo-standard?

Vlad: There’s that echo again! Sure, pseudo: a fake standard, a deceitful practice, a pretense. Don’t you own a dictionary?

Interviewer: Yes, I’m afraid I do. If I understand vladation, Vlad, you first compare an applicant’s individual test scores against high- and low-performing groups within an organization, then you compare them against an external job standard which is also based on group averages.

Vlad: Yes! That’s it! You understand vladation!

Interviewer: Are you aware that this kind of logic is flawed? It’s like saying that since most engineers are men and most social workers are women, a woman cannot be an engineer and a man cannot be a social worker. Or that since most women have small feet, people with small feet must be women. Educated people have known for hundreds of years that you cannot use group averages to make meaningful one-on-one, real-world predictions. They even gave it a name: Aristotelian logic.

Vlad: Whoa! Stop with all those university words! I’ll agree that vladation might have a few flaws, but that does not keep it from selling tests. As I said before, I have taught the technique to hundreds of test vendors. They don’t complain about high sales. Our buyers never follow up anyway. What harm can it do?

Interviewer: Vlad, I was wondering, what kind of studies did your Ph.D. include?

Vlad: Lunar agriculture.

Interviewer: If you don’t mind my asking, how does a degree in lunar agriculture qualify you to be a test and selection expert?

Vlad: No, I don’t mind.

Interviewer: Well?

Vlad: Oh, sorry. I thought you were making a statement. What was your question again?
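
For readers who want to see the distinction in concrete terms, the sketch below (using entirely hypothetical numbers) contrasts the “vladation” rule of matching an applicant to a high-performer group average with a basic criterion-related approach: correlating individual test scores with individual performance and reporting the validity coefficient.

```python
# A minimal sketch with hypothetical data; the scores and performance figures
# are illustrative only, not drawn from any real study.
import numpy as np
from scipy import stats

# One test score and one performance figure per incumbent.
test_scores = np.array([72, 65, 88, 54, 91, 60, 77, 83, 49, 70])
performance = np.array([510, 430, 640, 390, 700, 405, 560, 600, 350, 480])

# "Vladation": compare an applicant's score to the high-performer group average.
# Group averages hide individual differences, so this says nothing about validity.
high_group_mean = test_scores[performance >= np.median(performance)].mean()
applicant_score = 80
print("Vladation verdict:", "hire" if applicant_score >= high_group_mean else "reject")

# Criterion-related validation: correlate individual scores with individual
# performance across the whole sample and report the validity coefficient.
r, p = stats.pearsonr(test_scores, performance)
print(f"Validity coefficient r = {r:.2f} (p = {p:.3f}); variance explained = {r**2:.0%}")
```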

11 Comments on “Validation vs. ‘Vladation’: The Flawed Logic Behind Many Assessments”

  1. You should be on ‘Last Comic Standing’! Very funny but sadly, pathetically true. So, how do we ‘Validation People’ put the ‘Vladation People’ on the Moon so they can practice lunar agriculture? How do we deep-fry these turkeys?

  2. I read lots of good articles on here, but this was just childish, unproductive, and lacking altogether in insight.

    Hmm… Here’s a guy slamming ‘experts’ with no practical experience, and yet his bio reflects just that – zero practical experience.

    Dr. Wendell Williams (rww@ScientificSelection.com) is an expert developer of hiring and promotion tests. His time-proven tools reduce turnover by 50%, double productivity, cut training time, and conform to EEOC guidelines. He has been quoted in WSJ, Harvard Business Review, HR Magazine, Résumés for Dummies, and many nationally syndicated newspapers. Wendell has a B.S., MBA, MS, and a Ph.D. in Industrial Psychology. He is a Member of the American Psychological Association, The Society for Industrial and Organizational Psychology and The Association of Test Publishers. His website is http://www.ScientificSelection.com. He can be contacted by phone at 770-792-6857 or by email.

  3. Ouch…Boy, do I feel stupid!

    I guess my four academic degrees in engineering, business, social psychology, and Industrial Psychology are worthless.

    Furthermore, my 30 years’ experience as a senior executive, owner of two consulting companies, head of training, and Senior Consultant for an international consulting company seems to have taught me nothing.

    Also, I guess I should cancel my membership in the American Psychological Association, the Society for Industrial and Organizational Psychology, and the Association of Test Publishers.

    Finally, the expertise I have gained developing tests and assessment centers for dozens of companies among the Fortune 500, my job analyses on over 30,000 people, and 15 years as a test development and validation expert probably have no redeeming value.

    Sorry, Andrew. I am more than qualified to comment on this subject. People who use ‘vladation’ techniques are not test ‘experts’… unless you define expertise as basing financial decisions on junk science.

    By the way, what are your technical qualifications as a test and validation expert? You are not a vladation proponent, are you?

  4. Hello Mitch,

    I never thought of myself as ‘Ivory Tower’… just someone who calls it like it is (and, as you can see, someone willing to take ‘hits’ from people who disagree).

    Legitimate testing technology has been around for about 5,000 years (the earliest records were found in China). Today, validation is standard practice for most of the Fortune 500. It’s the ONLY way an organization knows whether a test predicts job performance.

    Large organizations like Target, Manpower, Geico, US Government, Home Depot, and the ‘Baby Bells’, among others, all do test validation. Validation is an essential part of good business practice.

    Validation procedures were documented by the DOL in the 1978 Uniform Guidelines on Employee Selection Procedures; the EEOC uses them to determine the legal nature of hiring and selection systems; the technology is published in the 1999 Standards for Educational and Psychological Testing; and over 40 universities offer terminal degrees in the I/O field.

    Think about it: if recruiting and HR practitioners were expected to know as much about their own technical field as line managers are expected to know about theirs, the majority would be out of work.

    In fact, I am just a messenger. When I point out bad practices, my critics would find it much more productive to master validation techniques than to write public nasty-grams. After all, their jobs depend on validation of the interview, test, or any other selection tool.

    People who forget this fact will never earn the executive respect they crave.

  5. Fair is fair – I’ll admit that Wendell recycles from time to time (as do Lou, John, etc. – it takes time, patience, and repetition to change habits) to make a point, but ‘zero practical experience’? Sorry, not the Wendell I know.

    Perhaps another interpretation is helpful – hard to believe, but there are three types of lies: lies, damned lies, and statistics. Far too often, test developers ‘undervalidate’ (or, clutch the pearls, invalidate) their instrument by using the wrong statistical test to analyze the results (sure, Freud made a career out of N=1 studies, but he was a pioneer who was less concerned with predictive validity than he was with unlocking elements of the individual).

    While vladation MAY account for a small percentage of the variance of whatever is measured, the reluctance of many recruiters to demand more from tests smacks of self-preservation. Can you imagine the pain and suffering if competent data analysis demonstrated that assessment tools or intuition were not especially good at prediction?

    There is a science behind test development – can you tell me why so many are afraid of it?

  6. Actually, I found the article refreshing. Sure, techniques and methodologies are great. But unless someone is using intestinal motility to produce gold bricks, there is no ironclad method of recruiting. I have no problem with someone marketing their system, but it is possible to have rigor and process without becoming a slave to a system and, surprise, surprise, be very successful. Yes, I am educated and I value education in others too, but there is a difference between the Ivory Tower and the real world.

  7. I do not have the expertise of Dr. Wendell. But I have been reading his posts for about 6 months and find them to be full of useful, accurate (to my knowledge), and sometimes even entertaining information. He is quite direct and very sure of himself. Consequently, he does sometimes come across as a bit condescending.

    I have yet to see a single reply authored by someone who claims to approach his level of expertise and experience, however.

    One element I would like to add to the assessment discussion:

    As a test ‘peddler,’ we have found it disappointing that, having gone to great lengths and expense to produce a valid and reliable tool with built-in ‘lie’-detecting scales, we find few HR professionals in small to medium-sized businesses with the time, expertise, or inclination to really understand how these measures make an assessment tool superior. Many seem only somewhat familiar with the terminology of assessment validity and lack confidence in their ability to evaluate their assessment options.

    In fact, Dr. Wendell has very nearly implied in two of his replies in this forum that unless you have a Ph.D., you are likely to make a foolish, uninformed assessment choice.

    So the question is: what can all of those HR professionals who do not have Ph.D.s, and who work for medium-sized or smaller companies that can’t afford to hire OD consultants with Ph.D.s, do to understand and benefit from pre-employment assessments?

  8. I take exception to anyone’s claim that screening tools are either ‘valid’ or ‘reliable’.

    Having screened thousands of candidates using these tests, I have yet to see a program that is either. I don’t care whether it’s a sales test, personality test, psychological test, or whatever; they just don’t work. I’ve never seen any evidence to show that candidate performance after testing is any better than for those hired before testing was implemented.

    I could go on for hours about how I have seen these tests abused in implementation. It doesn’t take long before they lose effectiveness. Candidates pass on tips (or the actual test) to prospects, recruiters let slip how to pass because they need their numbers, and frequently, hiring managers will give the answers to candidates they want hired (or just waive the test results altogether).

    Businesses face constant pressures to automate everything, and weak managers love these tools because they relieve them of responsibility for their hiring mistakes. If I were a corporate CEO, I’d make quarterly interview training mandatory for all managers. There is no substitute for strong interview skills.

    If you have a competent recruiting staff, coupled with an interview team that understands what they want, what they need, and how to interview, no test is going to be able to match the quality of the people you will hire using this method.

    I know I’m fighting a losing battle. All I can suggest is before you go implementing one of these tests, have a legitimate reason. Don’t just do it for the sake of having a screening process.

  9. Sorry to occasionally come across as condescending. It is never my intent. (I should probably hire someone to pre-read my material before it is posted.)

    Anyway, moving on to your question… I get impassioned about poor testing practices because: 1) it hurts the organization; 2) it hurts the applicant; 3) it is occupationally unprofessional; and 4) it is lawsuit-bait in the US.

    People don’t need a Ph.D. to use tests, but it helps to understand test principles before trusting test scores. Anyone who wants to use tests can read a few books on test design and validation, take a few courses in psychometrics or hire qualified people to do it for them.

    Information about using tests can be found in the 1999 Standards for Educational and Psychological Testing (www.apa.org/science/standards.html), the 1978 Uniform Guidelines on Employee Selection Procedures (www.dol.gov/dol/allcfr/ESA/Title_41/Part_60-3/toc.htm) or Getting Started with Tests (www.scientificselection.com/Downloads/Hiring%20Test%20Overview.pdf).

    Validation data? Don’t leave home without it.

  10. Hello, Mark… I cannot deny your experience with validation and testing. When done wrong, it is destined to fail; but, I can assure you, when done right, the results are dramatic.

    It is not magic… just using better tools that screen out a higher percentage of unqualified people than interviews do.

    A well-designed system (i.e., one based on job analysis, legitimate tests, and trained users) cuts turnover, increases personal productivity, and decreases training expense.

    Want proof? Providing the management is decent (after all, no one can expect good people to work for bad managers), give any qualified practitioner a company with a training problem, performance issue, or high turnover, and he or she will jump at the chance to do the work for free in exchange for a share of the savings/revenue…

    How many occupations can make that claim?

  11. Mark,

    I have to agree with Dr. Williams. I’ve seen many companies misuse tests that are not properly validated to predict job performance. However, for those companies that have implemented testing correctly, the results are dramatic in terms of higher productivity, reduced turnover, and improved morale. I don’t think anyone is saying to base a hiring decision solely on a test; however, a good test can direct the interviewer to the proper areas to focus on, and it also helps reduce interviewer bias among less trained and skilled interviewers.
