Monitoring: what is it for?
The online youth, news and entertainment site Pedestrian recently published an anonymous article about a boss asking all remote-working staff to leave their cameras on – all day, every day.
And when staff asked why they were being told not to turn their cameras off, those who spoke up were simply instructed to turn their cameras back on.
It sounds like an extreme example, but the rise of surveillance technologies – dramatically accelerated by COVID-19 forcing workers and students to work from home – raises new questions about what is and is not appropriate, or even legal, in the push for ever greater productivity or to safeguard the safety and well-being of staff.
In the workplace, companies like Sneek and ActivTrak have promised to help employers and their staff by providing tools not only to facilitate remote meetings and interactions, but also to monitor and analyze the hourly and daily productivity of the remote workforce.
Another product, PointGrab, enables real-time monitoring of office density, with the goal of ensuring workers adhere to COVID-19 social distancing protocols.
Education has also seen an increase in surveillance technologies.
According to a US survey by the Center for Democracy and Technology, 80% of responding teachers and 77% of responding students said their school provided them with devices, such as laptops, with monitoring software installed.
Again, this is often justified by appeal to the institution's values – reducing cheating or promoting well-being. But is this really the right way to go?
The same report showed that some teachers believe this technology helps them keep their students safe and mentally well at home. However, the students were largely unaware that the surveillance was taking place.
Preventing students from cheating or plagiarizing has always been a real challenge for educators. Traditionally, exam supervision has been carried out in the hallowed halls of examination rooms.
But technology is now being used to monitor students taking exams at home. Indeed, COVID-19 has caused an explosion in online exam-monitoring platforms. During extended lockdowns, these products were quickly adopted by many institutions and students – and strongly rejected by others.
In our recently published research on exam supervision, we offer a guide to the governance issues and procedures institutions should consider when assessing whether to use these technologies.
Remote monitoring systems can record information online from student laptops and, using AI-based facial detection, can “flag” apparently suspicious exam behavior.
Institutions using these programs will need to consider, for example, whether students without good internet connections or devices will be unfairly disadvantaged, whether the privacy intrusions of exam monitoring outweigh the benefits of remote supervision, and whether this surveillance sets a worrying precedent for future intrusions.
Do the institutions using these technologies know how the AI and machine learning components actually work? Is transparent information provided to the students involved?
Whenever an organization is considering the use of surveillance technology, it must also consider these ethical issues. And depending on the circumstances, these ethical considerations may lead to a decision not to use a given technology or spark ideas about other ways to achieve monitoring or surveillance goals.
Monitoring fairly and in good faith
It’s easy to get caught up in the bad stories about the technologies that track and watch us.
Surveillance technologies can be easily, and often accurately, criticized as invasive, frightening, and riddled with power imbalances that compromise the rights and privacy of those under surveillance.
On the other hand, the widespread QR code check-in apps used to help control the COVID-19 pandemic show that, when presented with a public interest case, we are sometimes ready to be watched – if only temporarily.
In certain circumstances, therefore, some surveillance – carried out fairly and in good faith – might be justified by the good it produces.
For example, allowing students to take their exams remotely using secure exam-monitoring technologies could be life-changing for people with disabilities or accessibility issues that make it difficult to attend campus, or for those who cannot afford to live near campus.
The same goes for other technologies. At the Centre for Artificial Intelligence and Digital Ethics (CAIDE), our next project examines surveillance technologies in the workplace.
Here again we see the relevance of context, and of the specific problem being addressed, in assessing the role of technology – as well as non-technological and less privacy-invasive alternatives.
For example, using artificial intelligence (AI) to monitor density limits in buildings as part of the ongoing management of COVID-19 could be effective and efficient. But organizations also need to consider whether this can be done without identifying or collecting data on individual workers.
Likewise, apps that monitor the use of hard hats on construction sites could be a non-invasive way to keep employees safe and ensure occupational health and safety (OHS) compliance.
But in implementing a system like this, organizations must go beyond the end goal and consider the wider impact on the rights and privacy of their staff.
A broader context for technical problems
Consideration of these broader factors makes it clear that an organization needs to ensure all employees are aware the technology is being deployed, and that steps are taken to ensure compliance monitoring is fair and does not target individuals.
These considerations can also persuade an organization to consider another way to encourage compliance.
Looking at surveillance issues, it is clear we need to consider the whole picture. Focusing only on the technology itself, without assessing its social context or wider ramifications, is too narrow.
Powerful technology can now be easily created and quickly adopted, but Silicon Valley has shown us that when developers and users operate in silos, isolated from these larger concerns, they can overlook the impact of surveillance technology on our mental health, our privacy and even our capacity to be human.
Monitoring and surveillance technology can be a force for good – but only when it is done well, carefully thought out, and examined from various angles, including the rights of those being watched.
Banner: Getty Images