As AI is increasingly used by job-matching platforms and AI interviews become more widespread, a new sector is emerging to help applicants make the algorithms work in their favour. In the same way that companies use SEO to push their websites up search rankings, applicants are hiring coaches and trying new techniques to game the algorithms and get past the AI gatekeepers to the human recruitment teams behind them.
So What
On the face of it this seems innocuous: why not do what you can to improve your chances? However, the direction of travel is concerning. If we move towards societies mediated by AI, humans risk routinely distorting their behaviour just to meet the perceived requirements of automated systems (which may themselves contain hidden bias, depending on the data used to train them).
For example, job hunters are already describing having to write 'like cavemen', i.e. very simply and with an unnatural density of desired keywords. It is becoming accepted practice to keep two resumes: one for the AI and one for humans. What might the systemic implications of this trend be over the long term? And what might it mean for other areas where AI is used to judge and sort human activity?