How to make applying for jobs less painful | The Way We Work, a TED series

Applying for jobs online is one of the worst digital experiences of our time. And applying for jobs in person really isn't much better. [The Way We Work]

Hiring as we know it is broken on many fronts. It's a terrible experience for people. About 75 percent of people who applied to jobs using various methods in the past year said they never heard anything back from the employer. And at the company level it's not much better: 46 percent of people get fired or quit within the first year of starting their jobs. It's pretty mind-blowing. It's also bad for the economy. For the first time in history, we have more open jobs than we have unemployed people, and to me that screams that we have a problem.

I believe that at the crux of all of this is a single piece of paper: the résumé. A résumé definitely has some useful pieces in it: what roles people have had, computer skills, what languages they speak. But what it misses is what they have the potential to do that they might not have had the opportunity to do in the past. And with such a quickly changing economy, where jobs are coming online that might require skills that nobody has, if we only look at what someone has done in the past, we're not going to be able to match people to the jobs of the future.

So this is where I think technology can be really helpful. You've probably seen that algorithms have gotten pretty good at matching people to things, but what if we could use that same technology to actually help us find jobs that we're really well-suited for? But I know what you're thinking. Algorithms picking your next job sounds a little bit scary, but there is one thing that has been shown to be really predictive of someone's future success in a job, and that's what's called a multimeasure test.

Multimeasure tests really aren't anything new, but they used to be really expensive and required a PhD sitting across from you, asking lots of questions and writing reports. Multimeasure tests are a way to understand someone's inherent traits: your memory, your attentiveness. What if we could take multimeasure tests and make them scalable and accessible, and provide data to employers about the traits that can make someone a good fit for a job?

This all sounds abstract. Let's try one of the games together. You're about to see a flashing circle, and your job is going to be to clap when the circle is red and do nothing when it's green. [Ready?] [Begin!] [Green circle] [Green circle] [Red circle] [Green circle] [Red circle]

Maybe you're the type of person who claps the millisecond after a red circle appears. Or maybe you're the type of person who takes just a little bit longer to be 100 percent sure. Or maybe you clap on green even though you're not supposed to. The cool thing here is that this isn't like a standardized test where some people are employable and some people aren't. Instead, it's about understanding the fit between your characteristics and what would make you good at a certain job. We found that if you clap late on red and you never clap on green, you might be high in attentiveness and high in restraint. People in that quadrant tend to be great students, great test-takers, great at project management or accounting. But if you clap immediately on red and sometimes clap on green, that might mean that you're more impulsive and creative, and we've found that top-performing salespeople often embody these traits.

The way we actually use this in hiring is we have top performers in a role go through neuroscience exercises like this one. Then we develop an algorithm that understands what makes those top performers unique. And then when people apply to the job, we're able to surface the candidates who might be best suited for that job.

So you might be thinking there's a danger in this. The work world today is not the most diverse, and if we're building algorithms based on current top performers, how do we make sure that we're not just perpetuating the biases that already exist? For example, if we were building an algorithm based on top-performing CEOs and used the S&P 500 as a training set, you would actually find that you're more likely to hire a white man named John than any woman. And that's the reality of who's in those roles right now.

But technology actually poses a really interesting opportunity. We can create algorithms that are more equitable and more fair than human beings have ever been. Every algorithm that we put into production has been pretested to ensure that it doesn't favor any gender or ethnicity. And if there's any population that's being overfavored, we can actually alter the algorithm until that's no longer true.

When we focus on the inherent characteristics that can make somebody a good fit for a job, we can transcend racism, classism, sexism, ageism, even good-schoolism. Our best technology and algorithms shouldn't just be used for helping us find our next movie binge or new favorite Justin Bieber song. Imagine if we could harness the power of technology to get real guidance on what we should be doing, based on who we are at a deeper level.

37 thoughts on “How to make applying for jobs less painful | The Way We Work, a TED series”

  1. So, if the best fit for a job is a middle-aged female and the job market for that position is saturated with middle-aged females, you'll actually adjust your algorithm so that middle-aged males start getting more of those positions? What if middle-aged males aren't applying for those positions? What if the middle-aged males aren't as qualified for that position?

  2. "If there's any population that's being over-favoured, we can change the algorithms." Nice. How is that fair? Whoever is most qualified should be hired regardless of race and gender. If a gender is not good at a certain job, why should the algorithms be forced to include them? Moronic.

  3. Doesn't favour any gender or ethnicity? I hope you achieved that by simply not capturing them in the hiring process, as that seems a simple solution. If being fair is the way forward, then get rid of ridiculous CVs, people asking why there's a three-month gap in your working life, and requests for personal details that aren't going to be used to contact you, and allow the three best candidates a short trial period performing the job. If the employer doesn't have the time, then they should look for candidates as uninterested as they are.

  4. TED, you are becoming disappointing. What is this video about? Problem (HR discriminates against employees) -> Solution (let's use AI). Nice 🙁
    Stereotypes emerge because we tend to simplify things, to save our brains from overload. Of course, following stereotypes (regardless of whether they are positive or negative) can offend somebody, and of course that is sad. But why haven't we broken the old-fashioned system, fired all the HR staff and dived into this computer-aided fascism?
    The reason is that human resource management works under DYNAMIC conditions. Employees are not gear wheels, which are made once and degrade by wearing out. They solve problems and gain experience, adapt to situations, learn. Take the same man or woman at the beginning, in the middle and at the end of a career, and you'll see different people.
    How do modern recommender systems work? They collect data, find hidden correlations and produce a set of deterministic rules that allow them to classify a new chunk of data. Our computers also simplify things, but their perception is very short-sighted compared to ours: they "see" only prepared learning data, while the learning result depends on various factors like the learning algorithm's implementation and the order of the training chunks. Assuming the system is complete and working, what should we get? On the one hand, it will really help some people find a proper job. But what if your race or gender does not fit these rules? Should they be tuned for diversity? And wouldn't that tuning ruin the whole idea?
    The author of this video wants to simply replace human stereotypes with a set of rules produced by machine learning, transferring responsibility to the AI. But this would not solve the stated problem.

  5. Oh, if only these algorithms worked in Russia… In my country you will work for $200 and spend your whole life paying off loans….

  6. Though interesting, this did not address easing the pain of applying for a new job.  It was too theoretical and not practical.

  7. I hope this happens, and then shy and introverted people get good jobs in the western world where social butterflies get the best jobs because it's all based on connections.

  8. If an algorithm picks employees based on merit, i.e. qualifications, and the outcome is mostly male (just as an example), and you then alter it to be "fair and not favor any gender" so that it picks more females (just as an example), then it does in fact favor a gender… females.

  9. It's wild how the job market is these days. Even when you get interviewed in person (and go through multiple rounds of interviews), there's still a chance that the employer may 'ghost' the interviewee.

  10. Not what I expected from the title but…

    It was very interesting and I’m sure this is the future, they say in 30 yrs most jobs that people will do don’t even exist today!

  11. But will this algorithm favor neurotypical people? As someone with ADHD, the idea of taking a test that measures attentiveness in order to get a job is terrifying. Would the test reveal to employers that I have ADHD? That's not normally something I would disclose in an interview.

  12. She had me until she said that they would adjust the algorithm if they weren't getting enough candidates from a particular population. What if qualified people from that population had not applied for whatever reason? Would changing the algorithm be the best solution? Why not perform better marketing for the hiring by outwardly engaging a broader set of the population?

    All in all it sounds good until the people in HR begin "gaming" the system. Computer algorithms are great but let us not forget the possible biases of the people that create them.

  13. If I'm looking for someone to manage a multi-million-dollar company that's just starting out, I wouldn't care if I'm talking to John or Johnessa; what matters is whether they're competent. There are three things you can measure their competency by.

    1. Job history (experience)
    2. Education
    3. Aptitude

    Not some wishy-washy parameters to make it more "equitable".

  14. This person is only talking about the commoditised jobs market; what about the rest of the market? Algorithms are an extension of human discrimination; however, it's the customers and not the employer who ultimately pick the candidate. In the UK we have 7.1 million unemployed, 2.2 million advertised jobs and 2 million unadvertised jobs; 13% of jobs are created each year while 12% are lost… The job market is not an economically perfect market and is inefficient, costing our country 1–2% of GDP a year…. So why has the government not paid to retrain nurses, and trained suitable unemployed disabled people to be doctors? In essence, the free-market jobs market has failed.

  15. All the wrong examples are used to prove the necessity of AI. A person is not employable for their hand-clapping skills, but for much more than that. Wrong. Totally wrong.

  16. This is why you are not hired.

    This person's great ideas.

    If I can do your job and mine, but you can't do either, why were you the first one hired? That's what's frustrating for the rest of us.

    It's not frustrating that we don't understand it, we do. It's an annoyance that you think we'll put up with it, when we won't.
