When Hilke Schellmann heard a Lyft driver talking about being interviewed by a “robot” back in 2017, her instincts as an investigative journalist kicked in. That’s a lot of power to hand over to lines of code. Who built the AI used in hiring? How does it work? Does it work? Can a hiring algorithm really be more efficient, fair, and objective than a human?
In “The Algorithm: How AI Decides Who Gets Hired, Monitored, Promoted, and Fired and Why We Need to Fight Back Now,” Schellmann, a 2022 Knight Science Journalism Project Fellow, argues that algorithms are just as flawed as the humans who build them. Absent a human’s capacity for empathy and flexibility, AI can magnify the biases and subjectivity of hiring decisions, creating exclusionary and unfair processes.
Published this January, Schellmann’s book arrives amid both intense hype and concern about the use of AI. Headlines about plagiarism, ethics, deepfakes, and job security sit alongside those touting AI’s promise of efficiency, problem solving, and data crunching. Schellmann takes on the algorithmic beast and dissects it for all to see, laying bare how AI is built and deployed and pulling back the curtain to reveal the flawed humans behind it. In a recent email exchange, she spoke with KSJ about the origins of the book, her writing process, and her outlook on the future of AI. (The following interview has been lightly edited for clarity and length.)
Knight Science Journalism: What led you to write a book about AI in hiring?
Hilke Schellmann: My first touch point with AI in hiring was a Lyft ride in late 2017. The driver told me that he had been interviewed by a “robot” that day for a baggage handler job at a local airport. I had never heard of “robot” or pre-recorded interviews and wanted to learn more. That conversation led me down a rabbit hole that showed me how pervasive AI in hiring already is, and it eventually led me to write this book, which takes readers along with me on an investigative detective story.
I was writing the book while AI was becoming “the thing” that everyone talked about. As a society, we are entering a new era in which we hear many promises about AI’s transformative power.
This book is a case study of one of its major applications: how companies and institutions use AI to hire, monitor, promote, and fire employees at scale. In my reporting, I explore the profound shift from human hiring practices and human oversight at work to machines quantitatively assessing and surveilling us. I don’t believe algorithms will replace the majority of humans in the workplace, but these automated systems will quantify and surveil our work in ways we’ve never seen before.
By understanding how AI is used in our workplaces, I hope everyone will be better prepared to question its usage when we encounter it everywhere else.
“The idea of algorithms deciding our futures counters the idea that we, as humans, can flourish on our own and surprise ourselves and others.”
Hilke Schellmann
KSJ: The world’s relationship with AI has changed so rapidly, even in the last few months. How did this rapid development impact your writing process?
HS: Yes, AI hype has gone into overdrive, and suddenly everyone is talking about AI. But it turns out the underlying AI models primarily used in hiring (predictive AI tools) have not actually changed much in the past five years or so; they just come in different packages.
Writing this book has shown me that we need to do more to fight bad algorithms ruling our lives and making high-stakes decisions about us—many times without us ever knowing.
I was able to truly understand this by testing the technologies myself and incorporating that experience into my writing process. Getting my hands on some of these tools as I reported on them showed me how flawed they are: many do not work as advertised and further marginalize people, including women, people of color, and people with disabilities.
KSJ: As you conducted interviews and did your research, was there anything you found that really surprised you?
HS: While many AI tools used in hiring perpetuate bias against vulnerable populations and give companies more access to our information, not all changes brought about by AI are necessarily negative. AI is changing how companies hire and promote internal candidates by relying less on credentials and more on skills.
KSJ: How did your own perspective and life experiences influence how you wrote “The Algorithm”?
HS: As an investigative journalist, my job is to hold the powerful accountable. AI tools now make high-stakes decisions about us, so it was a natural transition for me to try to hold these tools accountable.
I am using traditional investigative methods and developing new ones to test these tools. Most of the technologies I examine are new, and I didn’t have to submit myself to the hiring processes that now rely on predictive AI. That’s why I took it upon myself to test them and understand the challenges that job applicants face today. I also spoke with over two hundred people, including AI enthusiasts, vendors, lawyers, professors, HR managers, job applicants, whistleblowers, organizational psychologists, and really anyone who wanted to talk to me about this profound change in the world of work. My deep engagement with this topic, along with my own tests and interviews, allowed me to see the discrimination these tools may inflict and their failure to deliver on even their most basic promises.
KSJ: You’ve written a number of pieces on AI reporting and the biases that AI can create. What is your outlook on the future of AI?
HS: The effectiveness of AI tools depends on the quality of human intelligence and understanding that goes into them, as well as the proper parameters for protecting the rights and privacy of those affected. The idea of algorithms deciding our futures counters the idea that we, as humans, can flourish on our own and surprise ourselves and others.
Building a human future is still possible. We are in the beginning stages of these algorithms dominating our lives and threatening that future, and now is the best time to fight back. With transparency about what is happening in AI, and by pushing back against its unethical uses, we can build a future that preserves our will to act and keeps our humanity at the center of work and society.
KSJ: What do you hope your readers take away from the book?
HS: We live in a system in which algorithms define who we are, where we excel, and where we struggle. What’s at stake is the way we live. I hope lawmakers will get wise to what’s happening in the industry and start mandating transparency around training data and technical reports for these tools. It would be great if government agencies started testing these tools and even developed a control process for high-stakes AI decision tools used in policing, hiring, workplace surveillance, credit rating, and criminal sentencing. I also hope this book gives readers practical tips if they’re thinking of changing jobs and want to understand what job hunting is like in the age of AI, or if they want to learn about the technologies their companies are using to measure their successes and failures.
KSJ: If people want to keep up with your journey as an author and journalist, where should they follow you?
HS: Anyone interested in learning about the rise of artificial intelligence can find me on LinkedIn and my website (www.hilkeschellmann.com).