How to make applying for jobs less painful | The Way We Work, a TED series



Finding a job used to start with submitting your résumé to a million listings and never hearing back from most of them. But more and more companies are using tech-forward methods to identify candidates. If AI is the future of hiring, what does that mean for you? Technologist Priyanka Jain gives a look at this new hiring landscape.

The Way We Work is a TED original video series where leaders and thinkers offer practical wisdom and insight into how we can adapt and thrive amid changing workplace conventions. (Made possible with the support of Dropbox)

Visit https://go.ted.com/thewaywework for more!


38 thoughts on “How to make applying for jobs less painful | The Way We Work, a TED series”

  1. Double-edged sword. I’m creative but cautious, so the red dot test wouldn’t apply to me. Good intentions, interesting technology, but it would still discriminate on numbers.

  2. This is why you are not hired.

    This person's great ideas.

    If I can do your job and mine but you can't do either, why were you the first one hired? That's what's frustrating for the rest of us.

    It's not that we find it frustrating because we don't understand it; we do understand it. The annoyance is that you think we'll put up with it when we won't.

  3. All the wrong examples used to prove the necessity of AI. A person is not employable for their hand-clapping skills but for much more than that. Wrong. Totally wrong.

  4. This person is only talking about the commoditised jobs market; what about the rest of the market? Algorithms are an extension of human discrimination; however, it's the customers and not the employer who ultimately pick the candidate. In the UK we have 7.1 million unemployed, 2.2 million advertised jobs and 2 million unadvertised jobs, and 13% of jobs are created each year whilst 12% are lost. The job market is not an economically perfect market, and its inefficiency costs our country 1-2‰ of GDP a year. So why has the government not paid to retrain nurses, and trained suitable unemployed disabled people to be doctors? In essence, a free-market jobs market has failed.

  5. If I'm looking for someone to manage a multi-million-dollar company that's just starting out, I wouldn't care if I'm talking to John or Johnessa; what matters is whether they're competent. There are three things you can measure their competency by.

    1. Job history (experience)
    2. Education
    3. Aptitude

    Not some wishy-washy parameters to make it more "equitable".

  6. She had me until she said that they would adjust the algorithm if they weren't getting enough candidates from a particular population. What if qualified people from that population simply hadn't applied, for whatever reason? Would changing the algorithm be the best solution? Why not market the openings better by actively engaging a broader cross-section of the population?

    All in all it sounds good until the people in HR begin "gaming" the system. Computer algorithms are great but let us not forget the possible biases of the people that create them.

  7. But will this algorithm favor neurotypical people? As someone with ADHD, the idea of taking a test that measures attentiveness in order to get a job is terrifying. Would the test reveal to employers that I have ADHD? That's not normally something I would disclose in an interview.

  8. It's wild how the job market is these days. Even when you get interviewed in person (and go through multiple rounds of interviews), there's still a chance that the employer may 'ghost' the interviewee.

  9. If an algorithm picks employees based on merit, i.e. qualifications, and the outcome is mostly male (just as an example), and you then alter it to be "fair and not favor any gender" so that it picks more females (just as an example), then it does in fact favor a gender… females.

  10. TED, you're becoming disappointing. What is this video about? Problem (HR discriminates against employees) -> Solution (let's use AI). Nice 🙁
    Stereotypes emerge because we tend to simplify things, to save our brains from overload. Of course, following stereotypes (regardless of whether they are positive or negative) can offend somebody, and of course that is sad. But why haven't we broken the old-fashioned system, fired all the HR people and dived into this Computer-Aided Fascism?
    The reason is that human resource management works under DYNAMIC conditions. Employees are not gear wheels, made once and degrading through wear. They solve problems and gain experience, adapt to situations, learn. Take the same man or woman at the beginning, in the middle and at the end of a career, and you'll see different people.
    How do modern recommender systems work? They collect data, find hidden correlations and produce a set of deterministic rules that allow them to classify a new chunk of data (a rough sketch of this pattern appears below the comments). Our computers also simplify things, but their perception is very short-sighted compared to ours: they "see" only the prepared training data, while the learning result depends on various factors such as the learning algorithm's implementation and the order of the training chunks. Assuming the system is complete and working, what should we get? On the one hand, it will really help some people find a proper job. But what if your race or gender does not fit these rules? Should the rules be tuned for diversity? And wouldn't that tuning ruin the whole idea?
    The author of this video wants simply to replace human stereotypes with a set of rules produced by machine learning, transferring responsibility to AI. But this would not solve the stated problem.

  11. Doesn't favour any gender or ethnicity? I hope you achieved that by just not bothering to capture them in the hiring process, as that seems the simple solution. If being fair is the way forward, then get rid of ridiculous CVs, stop asking why there's a 3-month gap in your working life, stop asking for personal details that aren't going to be used to contact you, and allow the three best candidates a short trial period performing the job. If the employer doesn't have the time, then they should look for candidates as disinterested as they are.

  12. "if there's any population that's being over favoured we can change the algorithms" nice. How is that fair? Whoever is most qualified should be hired regardless of race and gender. If a gender is not good at a certain job, why should the algorithms be forced to include them? Moronic.

  13. So, if the best fit for a job is a middle-aged female and the job market for that position is saturated with middle-aged females, you'll actually adjust your algorithm so that middle-aged males start getting more of those positions? What if middle-aged males aren't applying for those positions? What if the middle-aged males aren't as qualified for that position?
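
Below is a minimal, hypothetical sketch of the "learned rules" pattern described in comment 10, together with the kind of per-group adjustment that comments 6, 12 and 13 question. Everything here is an assumption made for illustration: the synthetic data, the features, the logistic-regression model and the per-group threshold trick are invented, and this is not the system shown in the video.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000

# Synthetic candidates: two skill scores and a binary group label (0 or 1).
skills = rng.normal(size=(n, 2))
group = rng.integers(0, 2, size=n)

# Historical "hired" labels that partly favour group 0, independently of skill.
hired = (skills.sum(axis=1) + 0.8 * (group == 0)
         + rng.normal(scale=0.5, size=n) > 1.0).astype(int)

# The group label (think of it as a proxy variable) sits among the features,
# so the learned rule absorbs the historical bias.
X = np.column_stack([skills, group])
model = LogisticRegression().fit(X, hired)
scores = model.predict_proba(X)[:, 1]

# One global threshold: the model reproduces the old skew.
selected = scores > 0.5
for g in (0, 1):
    print(f"group {g}, single threshold: {selected[group == g].mean():.2f}")

# Per-group thresholds chosen so each group's selection rate matches the
# overall rate -- the after-the-fact tuning the comments are questioning.
target = selected.mean()
for g in (0, 1):
    thresh = np.quantile(scores[group == g], 1 - target)
    print(f"group {g}, per-group threshold: {(scores[group == g] > thresh).mean():.2f}")
```

With one global threshold the learned rule simply reproduces the skew baked into the historical labels; the per-group thresholds equalise selection rates instead. Whether that adjustment is a fix or a new form of favouritism is exactly the trade-off the comments above are debating.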
