“New opinions often appear first as jokes and fancies, then as blasphemies and treason, then as questions open to discussion, and finally, as established truths.”
- GEORGE BERNARD SHAW
AI-Driven Performance Monitoring in the Workplace - What are the Effects on the Workforce?
18/11/2021

Have you ever been micromanaged? Or had a boss watch over you and your team like a teacher looking for cheats in the exam room? Let's just say that it's not my favourite management style, and certainly not one that I will ever adopt myself.

Performance monitoring and the measurement of employees' work is a method focused on 'protecting' and building a company's 'bottom line' by ensuring profitable and efficient work from employees. But should that come at the expense of the mental health of the workers whose actions are being tracked? Whereas setting a suitable level of performance targets could arguably be a good motivator for some, having your productivity routinely measured by an algorithm as if you were just another computer program - and not, indeed, a human being - could understandably put a strain on one's stress levels.

After reports of employers watching their teams through desktop monitoring software that records keyboard strokes and mouse movements, the trade union Prospect is now calling for the UK government to address this largely unregulated privacy area, and to make it explicitly illegal for employers to use webcams to monitor employees working from home except when they are taking part in meetings and calls.[1]

The All-Party Parliamentary Group (APPG) on the Future of Work, in their report The New Frontier: Artificial Intelligence at Work, have said: “Pervasive monitoring and target-setting technologies, in particular, are associated with pronounced negative impacts on mental and physical wellbeing as workers experience the extreme pressure of constant, real-time micro-management and automated assessment.”[2]

These impacts then carry over into an employee's work habits, thereby contradicting the purpose of the monitoring in the first place. A 2015 study found that close monitoring in the workplace via "cameras, data entry, chat and phone recording" had "significant negative effects" on an employee's self-efficacy and perceived control, as well as negatively impacting their job satisfaction and commitment.[3]

One suggestion from the APPG report to help tackle the negative impacts of 'algorithmic systems' is for corporations and public sector employers to take preventative measures through an "Accountability for Algorithms Act" and to conduct algorithmic impact assessments, in line with the "Human-Centred AI" principle the report aims for. The World Health Organization lists "awareness of the workplace environment and how it can be adapted to promote better mental health for different employees"[4] as a key step for organizations to take in order to create a healthy workplace.

As industries continue to digitalize and harness the benefits and opportunities that technology can bring, it is important that we do not overlook the responsibility to assess and monitor how the use of technology in the workplace could be impacting the workforce.

Read the full report here, and see below for the report's key recommendations.[5]

1. An Accountability for Algorithms Act: The Act would establish a simple, new corporate and public sector duty to undertake, disclose and act on pre-emptive Algorithmic Impact Assessments (AIA).
2. Updating digital protection: The AAA would raise the floor of essential protection for workers in response to specific gaps in protection from adverse impacts of powerful but invisible algorithmic systems.
3. Enabling a partnership approach: To boost a partnership approach and recognise the collective dimension of data processing, some additional collective rights are needed for unions and specialist third sector organisations to exercise new duties on members' or other groups' behalf.
4. Enforcement in practice: The joint Digital Regulation Cooperation Forum (DRCF) should be expanded with new powers to create certification schemes, suspend use or impose terms and issue cross-cutting statutory guidance, to supplement the work of individual regulators and sector-specific standards.
5. Supporting human-centred AI: The principles of Good Work should be recognised as fundamental values, incorporating fundamental rights and freedoms under national and international law, to guide the development and application of a human-centred AI Strategy.

References:
[1] Parkinson, J. 2021. 'The way my boss monitored me at home was creepy'. BBC News. https://www.bbc.co.uk/news/uk-politics-59152864
[2] Milmo, D. 2021. Algorithmic tracking is 'damaging mental health' of UK workers. The Guardian. https://www.theguardian.com/technology/2021/nov/11/algorithmic-monitoring-mental-health-uk-employees
[3] Jeske, D. and Santuzzi, A. 2015. Monitoring what and how: psychological implications of electronic performance monitoring. New Technology, Work and Employment, 30(1), pp. 62-78.
[4] World Health Organization. 2021. Mental health in the workplace. https://www.who.int/teams/mental-health-and-substance-use/promotion-prevention/mental-health-in-the-workplace
[5] APPG on the Future of Work. 2021. The New Frontier: Artificial Intelligence at Work. https://www.futureworkappg.org.uk/our-work/report-into-the-new-frontier-artificial-intelligence-at-work