News

22.04.2025

The Legal Pitfalls of Using AI to Land Your Dream Job

Job applicants are using AI to land new positions, with some even using AI tools to help them formulate “model” answers – in real time – during remote interviews. Despite an absence of law and regulation on the topic, there are meaningful steps employers can take to restrict the practice.

It is no secret that job applicants are using AI to boost their chances of landing a new position.  Recent studies suggest that about 50% of job seekers use ChatGPT to generate resumes and/or cover letters.  Less well known, however, is that candidates are also using AI to “ace” their remote interviews.  While few would object to using an AI tool to generate potential practice interview questions based on the job description or information about the interviewer (in fact, a Google Super Bowl ad promoted this practice), the issue is far murkier when an applicant uses an AI tool to answer questions during a remote interview.

While federal and state legislatures and agencies have not formally addressed the use of AI by job applicants, existing federal employment guidance and laws on employers’ use of AI tools provide a window into how employers might restrict applicants’ AI use.  For example, it is well established that an employer cannot use AI to discriminate against applicants.  However, an employer should be free to adopt a general policy prohibiting AI based on a legitimate, nondiscriminatory business reason – such as preventing plagiarism and minimizing misrepresentations about skills and experience.  Indeed, an applicant who uses an AI tool to provide “model” answers to interview questions – à la Cyrano de Bergerac – is arguably misrepresenting herself and her abilities to a prospective employer.  This is particularly true for a software engineer who uses AI to help her write sample code during an interview.  Surely employers have a right to know whether an applicant truly possesses the skills and expertise she claims or demonstrates during an interactive interview.

Some states – like Illinois – require employers to disclose their use of AI in the hiring process.  This suggests that employers could require candidates to do the same.  It would also seem prudent for an organization to inform applicants that it uses technology capable of detecting AI-generated content in resumes and during the interview process, and that offenders will be disqualified.

Employers should consider taking the following steps to address job applicants’ use of AI:

  1. Ensure there are legitimate, nondiscriminatory business reasons for prohibiting or restricting the use of AI during the application process.
  2. Post a notice about your AI policies on your application portal and include a link to those policies.
  3. Consider requiring candidates to attest that they have not used AI in any part of the application process or, alternatively, requiring them to disclose their use of AI.
  4. Train recruiters and interviewing staff to detect the use of AI and consider employing software tools to facilitate that detection process.

As always, consult with a qualified attorney before implementing any AI policy in your job application process.

Article provided by INPLP member: Jason Kravitz (Nixon Peabody, USA)