The latest event in the Future of Work Speaker Series centered on how emerging surveillance technologies make it easier for employers to extract, collect, and process data about their employees and prospective hires, and to use that data in decisions about hiring, firing, and evaluation. These technologies, wired into both built and digital environments, are often pitched as neutral, objective ways of tracking and incentivizing worker performance, yet they tend to recreate the same systems of discrimination that already harm workers marginalized by race, ethnicity, gender, age, socio-economic background, or disability. Three panelists brought their expertise on the future of work to the conversation, offering insight into how these tools are reshaping the workplace and what that means for workers.

Javed Ali, Associate Professor of Practice at the Gerald R. Ford School of Public Policy, drew on his previous experience in national security, a field where surveillance is paramount to keeping classified information secret. “Once you’ve stepped into this,” says Ali, “you give up your personal privacy,” though in the national security context that trade-off is easier to justify. In the governmental setting, surveillance “was not designed to measure your productivity,” but rather to protect sensitive information. Of workplace surveillance generally, Ali believes it is “here to stay…. It’s just going to look different now,” becoming more pervasive across fields and industries.

Nazanin Andalibi, Assistant Professor at the University of Michigan School of Information, approaches surveillance largely through “emerging technologies,” such as emotional recognition software. Applied to the workplace, these new technologies raise complex questions about the relationship between workers’ privacy and wellbeing. “These technologies are already in use in the workplace,” says Andalibi, but “people might not even know that their employer is using emotional recognition in the workplace.” These technologies also tend to replicate the biases of their creators, raising issues of scientific validity, accuracy, and more. “Despite all of these critiques,” Andalibi notes, “technologies are patented and developed…so it’s a challenge.” Andalibi also emphasizes that “opportunities for choice and refusal and resistance” with regard to these surveillance technologies “aren’t equitably available to all workers,” and that collective action may be the only way to secure worker privacy rights in the future.

Elizabeth Anderson, Max Shaye Professor of Public Philosophy, John Dewey Distinguished University Professor, and Arthur F. Thurnau Professor at the University of Michigan, turned to labor history to explain the American workplace’s current philosophy of worker surveillance. Micromanagement of labor, Anderson says, is deeply connected to “the legacies of slavery” and to extracting maximum productivity from workers, “often in very brutal ways.” Modern surveillance, though “gentler than that,” remains “founded on a presumption that workers can’t be trusted.” Autonomy, trust, and respect matter deeply to workers, yet they rarely receive them from their superiors. Trust and respect are reciprocal, Anderson says. “Instead of falling for tech bro hype, maybe talk to your workers. Give them a voice in how this is happening.”

Keep an eye on our website for upcoming events in the Future of Work Speaker Series. Watch the full recording of this event now on our YouTube channel: