Business Daily Media

Algorithms workers can't see are increasingly pulling the management strings

  • Written by Tom Barratt, Lecturer, School of Business and Law, Edith Cowan University

“I’m sorry, Dave. I’m afraid I can’t do that.” HAL’s cold, if polite, refusal to open the pod bay doors in 2001: A Space Odyssey[1] has become a defining warning about putting too much trust in artificial intelligence, particularly if you work in space.

In the movies, when a machine decides to be the boss – or humans let it – things go wrong. Yet despite myriad dystopian warnings, control by machines is fast becoming our reality.

Algorithms – sets of instructions to solve a problem or complete a task – now drive everything from browser search results to better medical care[2].

They are helping design buildings[3]. They are speeding up trading[4] on financial markets, making and losing fortunes in microseconds. They are calculating the most efficient routes for delivery drivers[5].

In the workplace, self-learning algorithmic computer systems are being introduced by companies to assist in areas such as hiring, setting tasks, measuring productivity, evaluating performance and even terminating employment: “I’m sorry, Dave. I’m afraid you are being made redundant.”

Giving self‐learning algorithms the responsibility to make and execute decisions affecting workers is called “algorithmic management[6]”. It carries a host of risks in depersonalising management systems and entrenching pre-existing biases.

At an even deeper level, perhaps, algorithmic management entrenches a power imbalance between management and worker. Algorithms are closely guarded secrets. Their decision-making processes are hidden. It’s a black-box: perhaps you have some understanding of the data that went in, and you see the result that comes out, but you have no idea of what goes on in between.

Algorithms at work

Here are a few examples of algorithms already at work.

At Amazon’s fulfilment centre in south-east Melbourne, they set the pace for “pickers”, who have timers on their scanners showing how long they have to find the next item[7]. As soon as they scan that item, the timer resets for the next. All at a “not quite walking, not quite running” speed.

Amazon’s fulfilment centre in Dandenong, south-east Melbourne, Victoria. Revere Agency/AAP

Or how about AI determining your success in a job interview? More than 700 companies have trialled such technology[8]. US developer HireVue says its software speeds up the hiring process by 90% by having applicants answer identical questions and then scoring them according to language, tone and facial expressions.

Read more: Facial analysis AI is being used in job interviews – it will probably reinforce inequality[9]

Granted, human assessments during job interviews are notoriously flawed. Algorithms, however, can also be biased[10]. The classic example is the COMPAS software used by US judges, probation and parole officers to rate a person’s risk of reoffending. In 2016 a ProPublica investigation[11] showed the algorithm was heavily discriminatory, incorrectly classifying black subjects as higher risk 45% of the time, compared with 23% for white subjects.

How gig workers cope

Algorithms do what their code tells them to do. The problem is this code is rarely available. This makes them difficult to scrutinise, or even understand.

Nowhere is this more evident than in the gig economy. Uber, Lyft, Deliveroo and other platforms could not exist without algorithms allocating, monitoring, evaluating and rewarding[12] work.

Algorithms dictate who gets work on food-delivery platforms like Deliveroo and Uber Eats. Gonzalo Fuentes/Reuters

Over the past year Uber Eats’ bicycle couriers[13] and drivers[14], for instance, have blamed unexplained changes to the algorithm for slashing their jobs and incomes.

Riders can’t be 100% sure it was all down to the algorithm. But that’s part of the problem. The fact those who depend on the algorithm don’t know one way or the other has a powerful influence on them.

Read more: Uber drivers' experience highlights the dead-end job prospects facing more Australian workers[15]

This is a key result from our interviews with 58 food-delivery couriers[16]. Most knew their jobs were allocated by an algorithm (via an app). They knew the app collected data. What they didn’t know was how data was used to award them work.

In response, they developed a range of strategies (or guesses) to “win” more jobs, such as accepting gigs as quickly as possible and waiting in “magic” locations. Ironically, these attempts to please the algorithm often meant losing the very flexibility that was one of the attractions of gig work.

The information asymmetry created by algorithmic management has two profound effects. First, it threatens to entrench systemic biases, the type of discrimination hidden within the COMPAS algorithm for years. Second, it compounds the power imbalance[17] between management and worker.

Our data also confirmed others’ findings that it is almost impossible to complain about the decisions of the algorithm. Workers often do not know the exact basis of those decisions, and there’s no one to complain to anyway. When Uber Eats bicycle couriers asked why their incomes had plummeted, for example, responses from the company advised them “we have no manual control[18] over how many deliveries you receive”.

Broader lessons

When algorithmic management operates as a “black box”, one of the consequences is that it can become an indirect control mechanism[19]. Thus far under-appreciated by Australian regulators, this control mechanism has enabled platforms to mobilise a reliable and scalable workforce while avoiding employer responsibilities[20].

“The absence of concrete evidence about how the algorithms operate”, the Victorian government’s inquiry into the “on-demand” workforce[21] notes in its report, “makes it hard for a driver or rider to complain if they feel disadvantaged by one.”

The report, published in June, also found it “hard to confirm if concern over algorithm transparency is real”.

But it is precisely the fact that it is hard to confirm that’s the problem. How can we even start to identify, let alone resolve, issues like algorithmic management?

Fair conduct standards to ensure transparency and accountability are a start. One example is the Fair Work initiative[22], led by the Oxford Internet Institute[23]. The initiative is bringing together researchers with platforms, workers, unions and regulators to develop global principles for work in the platform economy. This includes “fair management”, which focuses on how transparent the results and outcomes of algorithms are for workers.

Understanding of the impact of algorithms on all forms of work is still in its infancy. It demands greater scrutiny and research. Without human oversight based on agreed principles, we risk inviting HAL into our workplaces.

References

  1. ^ 2001: A Space Odyssey (theconversation.com)
  2. ^ better medical care (theconversation.com)
  3. ^ design buildings (theconversation.com)
  4. ^ speeding up trading (theconversation.com)
  5. ^ delivery drivers (www.ups.com)
  6. ^ algorithmic management (onlinelibrary.wiley.com)
  7. ^ to find the next item (www.abc.net.au)
  8. ^ have trialled such technology (theconversation.com)
  9. ^ Facial analysis AI is being used in job interviews – it will probably reinforce inequality (theconversation.com)
  10. ^ biased (journals.sagepub.com)
  11. ^ ProPublica investigation (www.propublica.org)
  12. ^ allocating, monitoring, evaluating and rewarding (journals.aom.org)
  13. ^ bicycle couriers (www.news.com.au)
  14. ^ drivers (www.twu.com.au)
  15. ^ Uber drivers' experience highlights the dead-end job prospects facing more Australian workers (theconversation.com)
  16. ^ interviews with 58 food-delivery couriers (journals.sagepub.com)
  17. ^ power imbalance (journals.sagepub.com)
  18. ^ no manual control (www.news.com.au)
  19. ^ indirect control mechanism (journals.sagepub.com)
  20. ^ employer responsibilities (www.fwc.gov.au)
  21. ^ inquiry into the “on-demand” workforce (s3.ap-southeast-2.amazonaws.com)
  22. ^ Fair Work initiative (fair.work)
  23. ^ Oxford Internet Institute (www.oii.ox.ac.uk)

Authors: Tom Barratt, Lecturer, School of Business and Law, Edith Cowan University

Read more https://theconversation.com/algorithms-workers-cant-see-are-increasingly-pulling-the-management-strings-144724
