Will the Internet Ever Solve Recruiting Problems?

There seems to be a good deal of confusion about how the Internet can “solve” recruiting problems. Sorry, but in my observation, the Internet has done more to operationalize bad practices than it has to promote good ones. Translating a brochure, job listing, or application form into electronic form may keep web designers off the street, but thinking it can somehow turn bad recruiting practices into good ones is just a bunch of “hooey” (a highly technical term for the stuff you scrape off your shoes after walking through cow pastures and barnyards).

Restriction of Applicants

Ever thought about how many people don’t live and die for the Internet? Not everyone searches the Net for jobs. Using the Internet as a primary recruiting tool is like standing on one street corner and screening only the people who cross there. Some published reports, for example, show minorities are substantially underrepresented on the Net. Sure, playing games and using email are attractive, but why rely on a single communications medium to reach all potential job applicants?

Fishing

Few people want to fill out application forms, especially when they know the answers will be used to disqualify them. Who wants to read a bulletin board of job titles? Job titles tell us almost nothing about the job or what is expected, and job postings are often almost as fictitious as resumes. How does an applicant know which title to apply for? Which one maximizes his or her chances of being hired? Is the web design oriented to job postings, marketing brochures, service requests, or all three? Job applicants need information about what kind of place it is to work and the nature of the jobs. Playing the lottery is about the only time people will invest 100% effort with an almost zero chance of winning. Automated front-end application forms may be easy to build, but they seldom work as expected.
Tower of Babel

Imagine a world where everyone uses a different metric system and a different set of accounting practices, speaks a different language, or uses a different numbering system. This gives us some idea of what it is like to describe human performance. Until organizations come to grips with the fact that “performance” means more than proficiency in a certain technical area, applicants and employers are doomed to wander aimlessly in the dark asking, “What do ‘taking responsibility for…,’ ‘leading a group to…,’ or ‘successfully accomplished…’ really mean, and how can I measure them?”

Sifting Through the Trash

Sifting through trash to find a lost article is among my least favorite activities. I seldom find what I’m looking for, but I always get old coffee grounds under my fingernails. Automated resume screening is similar. For one thing, it takes a pretty smart keyword search engine to discriminate between “thought about getting an IT degree from Harvard in 1995” and actually graduating from Harvard in 1995. And even if you get the search engine perfect, the data is often trash. For example, I recently read a resume where the applicant listed his skills as “responsible for making RJ45 connections and installing peripheral computer equipment in 7-foot racks.” Translation: “Crimped plastic connectors onto wires and used machine screws to attach equipment to metal railings.” Wow! Just the guy I was looking for to man my technical hotline.

Matching Skills to Skills?

Sorry, no cigar here, either. There is more to jobs than having specific knowledge: little things like being able to think and solve problems, being able to plan and organize work, getting things done through people, and having the right kind of attitudes and motivations. Hiring a person who retook a skills course six times is a tribute to human tenacity, not problem-solving ability. Systems that use a straightforward match-to-match process are hopelessly simplistic.
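To make the contrast concrete, here is a minimal sketch of a naive match-to-match score next to one that honors knockout factors and skill substitution. All skill names, weights, and candidates are hypothetical, invented purely for illustration; they are not from any real screening product.

```python
# Hypothetical job requirements for illustration only.
REQUIRED = {"java", "sql", "networking"}

def naive_match(candidate_skills):
    """Match-to-match: count exact keyword overlaps and nothing else."""
    return len(REQUIRED & candidate_skills)

# Skills a human screener might accept as substitutes (a trade-off strategy).
SUBSTITUTES = {"sql": {"oracle", "postgres"}}

# A single disqualifier that outweighs any number of skill matches.
KNOCKOUTS = {"falsified_resume"}

def human_style_score(candidate_skills, flags):
    """Score the way a human screener might: knockouts first, then trade-offs."""
    if flags & KNOCKOUTS:
        return 0  # knockout factor: immediate rejection, skills irrelevant
    score = 0
    for skill in REQUIRED:
        if skill in candidate_skills:
            score += 2  # direct match
        elif SUBSTITUTES.get(skill, set()) & candidate_skills:
            score += 1  # accepted substitute counts for something
    return score

# The naive counter ranks a disqualified candidate at the top...
print(naive_match({"java", "sql", "networking"}))                            # 3
print(human_style_score({"java", "sql", "networking"}, {"falsified_resume"}))  # 0
# ...and gives no credit to an Oracle veteran who never typed the word "sql".
print(human_style_score({"java", "oracle", "networking"}, set()))            # 5
```

The point of the sketch is not the particular weights but the structure: a knockout check runs before any counting, and substitution turns an all-or-nothing keyword into a graded judgment.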
Human decision making is far more complex. Humans use trade-off strategies, are willing to substitute one skill for another, and employ knockout factors. Using a match-to-match algorithm is like using the alphabet to multiply numbers.

Cow Hurling

In “Monty Python and the Holy Grail,” the Crusaders encounter two groups who personify much of today’s recruiting environment: the Knights Who Say “Ni!” and the French soldiers who catapult cows at them. It goes something like this:

Step 1. The hiring manager gives the recruiter an incomplete idea of the job requirements (i.e., “Bring me a shrubbery!”).
Step 2. The recruiter searches around for a suitable cow.
Step 3. The recruiter hurls one cow after another over the wall until one stays put.
Step 4. The hiring manager takes credit if the cow is a prizewinner, but hides the evidence if the cow is a loser.

There you have it: miscommunication, poor measurement, shifted responsibility, and poor feedback. Cow hurling is as silly in recruiting as it is in the movies. Is it any wonder this profession is considered replaceable?

What’s Happenin’?

A while ago, researchers sought to learn about human behavior by studying animals. In one fascinating experiment, pigeons were taught to roll a ball down a plank with their beaks. If they knocked down pins, they received a few pellets of grain (much like the compensation program in one of my past jobs). One group of pigeons could see the pins, while a control group had their view blocked by a curtain. The results? Pigeons that could see the pins learned to knock down more of them. Accurate and prompt feedback leads to improvement. Recruiting needs the same kind of specific feedback.

Getting Smarter…or Not

I’ve been doing some work on an attitude, interest, and motivation test lately, comparing test scores with job performance.
In technical terms, this is called a “concurrent performance criterion validation study” (in lay terms, this kind of study measures whether test scores are better predictors of test-publisher revenue or of future job performance). We could have just set some logical targets for our test and “winged it,” but we thought it was a better idea to actually see whether test scores (e.g., attitudes) were statistically associated with job performance ratings: things like quality, sales volume, and call time. We found that scores varied depending on the job and the rating area. In some cases, people who liked problem solving and generating creative ideas did better in sales; in others, productivity was associated with a desire to follow rules; while in a third job, only the dull and lazy stayed on the job five years or more. This is valuable information. A little bit of mental gymnastics tells us what’s important and what’s not, and we can use that sort of information in hiring.

Putting Tests on the Web

Internet applications, like it or not, are tests. They either predict something or nothing. ASPs designed by recruiters often look a lot like interviews. ASPs designed by techies look a lot like technical qualifications. Only applications designed by measurement experts can deliver trustworthy results, and that kind of accuracy takes considerable homework to achieve. ASPs that are not carefully tailored to the job and the organization may be pretty to look at, but they will inevitably fall short of expectations. The government takes a dim view of testing, not because testing is “bad,” but because testing is often misapplied. The EEOC is not against testing; it is against test misuse. Its purpose is not to force organizations to hire unqualified people, but to make sure tests are job-related, based on business necessity, and don’t unnecessarily screen out people based on irrelevant factors like race, gender, and age.
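One concrete screen the EEOC’s Uniform Guidelines describe is the “four-fifths rule”: if any group’s selection rate falls below 80% of the highest group’s rate, the selection procedure may show adverse impact and deserves scrutiny. A minimal sketch of that arithmetic follows; the applicant counts and group labels are hypothetical, and a real analysis would go well beyond this single ratio.

```python
def selection_rate(hired, applied):
    """Fraction of applicants from a group who were selected."""
    return hired / applied

def four_fifths_check(rates):
    """Apply the four-fifths rule: flag groups whose selection rate
    is below 80% of the highest group's rate (False = possible adverse impact)."""
    top = max(rates.values())
    return {group: rate / top >= 0.8 for group, rate in rates.items()}

# Hypothetical applicant pools for one online screening test.
rates = {
    "group_a": selection_rate(48, 100),  # 0.48
    "group_b": selection_rate(30, 100),  # 0.30
}
result = four_fifths_check(rates)
# group_b's ratio is 0.30 / 0.48 = 0.625, below 0.8, so the screen
# shows possible adverse impact and the test owner had better have
# validity evidence on hand.
```

The four-fifths rule is only a rough evidentiary screen, not a verdict; failing it is exactly the situation where the job-relatedness and business-necessity evidence mentioned above becomes essential.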
Any application that uses tests or interviews had better be able to back up its results with facts and data.

Wrong-Way Web-Thinking

A few months ago, a very self-impressed person described his “perfect” ASP to me. He said he would collect data about each applicant he placed, then study it to identify which sources provided the most placements. Nice idea if you are in the short-term recruiting business, but way off the mark if you are a client who cares about human performance. Keeping records of placements without keeping records of performance is like a salesperson selling a product and hoping it is not returned. I have never heard a line manager complain about sourcing problems. Line managers care about on-the-job performance. I may be wrong, but it just seems logical to give customers what they want: a high-performing employee. But maybe that’s just me.

Complaints

Please send all nasty-grams to the attention of the EEOC. Be sure to include your name, contact information, and a detailed explanation of why their hiring and placement guidelines take too much work or should be ignored. (If you think there aren’t that many lawsuits, check out the EEOC website.) Be sure to copy your company’s COO. It will show him or her you are on technology’s cutting edge!
