Algorithmic management in the focus of sociology of technology
- Authors: Yudina M.A.
- Affiliations: HSE University
- Issue: Vol 24, No 3 (2024)
- Pages: 734-746
- Section: Sociology of management
- URL: https://journals.rudn.ru/sociology/article/view/41179
- DOI: https://doi.org/10.22363/2313-2272-2024-24-3-734-746
- EDN: https://elibrary.ru/FKZYCT
Abstract
This article considers the managerial aspect of digital transformation: various programs and infrastructure that have recently received the general name “algorithmic management”. The boom in the use of such tools occurred during the covid-19 pandemic as a unique set of circumstances for the digitalization of human life. The authorities of several countries monitored their citizens’ behavior, including with the QR-code systems that limited their rights, in the fight against the spread of covid-19, which caused discussions and even protests. Businesses accelerated the digital transformation of their HR management due to government restrictions and lockdown measures and to production needs in the new conditions. Quarantines are over, but the active development of algorithmic management continues; it extends beyond the platform economy and plays an integral role in Industry 4.0, which makes the study of algorithmic management relevant and timely. A significant contribution to understanding algorithmic management was made by the report of the experts from the European Commission and the International Labor Organization. Based on the relevant publications up to 2022, they suggested giving up the narrow understanding of algorithmic management as a platform-economy issue; however, most studies are still based on this interpretation. The article presents a broader definition to identify additional social contradictions and challenges of digital transformation. The author considers algorithmic management from the perspective of the sociology of management and the sociology of technology, in particular the works of A. Feenberg and P. Edwards. The science and technology studies (STS) approach allowed the author not only to analyze the events of the recent pandemic but also to consider the future of such technologies under the transition towards Industry 4.0. The article identifies three elements of algorithmic management together with hidden social-managerial biases and contradictions related to their implementation and shows how the new approach integrates direct and indirect control in management.
Full Text
In the early 2000s, theories of post-industrial society were the scientific mainstream; today the ideas of the data and sharing economy, Industry 4.0 (in management) and the reindustrialization era seem to destroy the previous consensus, and that is exactly what contemporary convergent technologies do: change everything, promising some future profit. Digital platforms are promoted as innovative business models supporting the sharing economy with “future is now” slogans, but is their creative destruction as positive as the owners of platforms try to convince us? “Through digital platforms, the sharing economy creates a technological infrastructure for new interactions between producers and consumers” but “can increase social inequality by creating privileges for those who own property and make money by renting out resources” [26. P. 14].
It is only logical that management increasingly relies on technologies in the digital era. In 1986, Beniger “developed a theory of industrial capitalism centered around the problem of control, a functional issue linking technological, social, institutional, and information dimensions” [9. P. 205]. The same functional issue determined the development of algorithmic management which is defined by the International Labor Organization (ILO) and European Commission (EC) experts “in general, as a social-technical process” [2. P. 5]. The Joint Research Centre of the EC published the paper “The Algorithmic Management of Work and Its Implications in Different Contexts” to provide a conceptual framework “for this emerging phenomenon”, including for “Building Partnerships on the Future of Work”.
ILO and EC define algorithmic management (AM) as “the use of computer-programmed procedures for the coordination of labor input in an organization” [2. P. 1], and this paper adds to this definition different theories and methods of science and technology studies (STS) to ensure a deeper understanding of AM in the social-managerial scope and to go beyond the consideration of platform algorithms alone. Moreover, the current stage of digital transformation affects management in a variety of ways; therefore, leaders of countries, CEOs of transnational corporations, technological and scientific researchers, and journalists suggest different names for the transformation of management due to technological changes. STS methods and a broader interpretation of AM help to see how digital transformation changes the ways to control labor, what perspectives AM has in Industry 4.0, and how it would affect social interactions.
STS for digital transformation
Descriptive approaches focus on different characteristics of the interconnection between social and technical phenomena without interpreting it. For instance, the Social Construction of Technology (SCOT) theory argues that technologies are shaped by human activity and not the other way around [4]. Giddens’ structuration theory [14], with its duality of structure (a set of rules and the result of individual actions), also describes technology and society [6; 29]. Actor-network theory, focusing on a constantly flexible network of subjects and objects (actants), was initially a part of “the pragmatic turn” in sociology, i.e., from the very beginning it considered technology [20; 21]. Edwards used the results of the pragmatic turn to define modernity as the co-construction of technology and society: “infrastructures form the state of modernity and are formed by it, in other words, they are in the process of co-construction” [9]. Here, infrastructures are sustainable technological systems, organizational structures, and the basic individual knowledge of their use. Co-construction theory is also based on Beniger’s view of industrial capitalism. This research also refers to Beniger [3], considering AM as different technological forms of solving the control issue in contemporary societies.
Feenberg developed his instrumentalization theory, referring to Latour’s configuration of interconnected sets of technologies (“technogram”) and individual actors (“sociogram”) [19]: a particular technical configuration reflects a particular network of actors [11; 12]. Thus, the task of a good social-technological theory is dual: to describe the ways in which technology is chosen and to identify the goal behind this choice. Instrumentalization theory implies a critical emphasis on the actors/stakeholders that adopt and use technologies, which is why technology is always biased [13] due to the combination of time, place, and the way of its creation and introduction. Feenberg’s instrumentalization theory can be used to emphasize that AM is always biased socially and technologically. According to Edwards, any infrastructural form of AM would depend on actants (in actor-network theory terminology); therefore, bias can be embodied in seemingly neutral technological solutions to social problems, since the deep social condition that determines the problem cannot be eliminated by a technical decision addressing only its symptom. The technology used in a particular social context is driven by what Feenberg calls “technical codes” as a combination of “the rule under which technical choices are made” and “a certain meaning or purpose that explains” [12. P. 47] the necessity of these choices. Whatever bias AM has as a high-tech way to solve the coordination and control tasks of management, it needs sociological analysis due to the disruptive nature of such innovations.
Livingstone [22. P. 174] believes that due to digital technologies, there are three types of shifts: (1) government intervention in personal life with surveillance technologies and personal data collection; (2) privatization of the personal and personalization of the private through social media, since personal images and stories become available and accessible to anyone; (3) corporate commercialization of personal life under “surveillance capitalism” [32], which uses the private/intimate life of individuals to make a profit. However, not only governments but also businesses use surveillance technologies for control, and AM may represent a significant shift in the people-technology balance.
Development and spread of algorithmic management
Before the covid-19 pandemic, the IT sector of the economy grew, attracting more workers not just by higher salaries but also by the special treatment of “computer personnel”, who, among other benefits, can work from home. The pandemic led to multiple lockdowns all over the world, and the digital transformation accelerated, sometimes provoking public protests against the massive violation of privacy by leading technological corporations. Digital technologies were used both to decrease loneliness and isolation [18] and to ensure an extensive, distant, “soft” control over citizens. The interconnection of people through social media increased a sense of personal responsibility, thus stimulating preventive behaviors [24]. Many copied other people’s behavior based on social media content (like photos of people wearing face masks). Covid-19 prevention was stimulated by instruments built into social media interfaces, such as hashtags and geolocations. Non-digital mechanisms were primary, but digital technologies prolonged their effects, motivating people to publicly demonstrate conformist social behavior. The evaluability code helps to sustain this effect, as social media provide instruments for quantifying the audience’s perception of what is posted publicly.
During the covid-19 pandemic, health (normally a private/intimate matter) became a public affair: masks and sanitizing were signs that private health was no longer an individual matter. Many countries used special apps to track and control the state of health and movements of people. In some countries, such smartphone apps were to be used only by the sick or those in contact with the sick. In other countries, like South Korea, many people used the “self-quarantine safety protection app and self-diagnosis app”. 15 countries developed and actively used 17 mobile apps to control the covid-19 pandemic [17], and only 3 out of these 17 apps were applied according to the national data protection laws. The world experienced the largest tech violation of privacy under the cover of the public good, i.e., governments easily gave up their citizens’ privacy rights for societal health concerns.
Zuboff [32] had emphasized the use of technologies for surveillance before the pandemic. In Russia, Dudina in 2018 mentioned the possibility of the pan-spectron, in which “the placement of human bodies around a central observer (panopticon) is replaced by a multitude of sensors; cables are placed everywhere, recording all incoming information and accumulating it in computers” [8. P. 24]. In 2020, in some countries, a system of QR-codes replaced traditional vaccine certificates: special systems allowed vaccinated citizens to access public spaces and forbade it to those not vaccinated, thus producing a social division. The other technology applied was AI face-recognition algorithms: by getting access to publicly recorded photo and video materials, these programs could identify a person in a public space, track their movements, or even send a signal to the law enforcement agencies.
Such technologies indicate a shift in public control over personal life, including geolocation and health status. In the fight against the pandemic, these private areas became a matter of public interest, and the traditional generalized policy transformed into a selective and occasional individual-level policy, which some researchers characterize as biopolitics [23]. The main challenge in terms of social interaction was to keep social distance. Most digital solutions included traditional distant communication instruments (social networks, messengers and video conferences) that were either used more actively or transformed to better fit online socializing: many shopping businesses went online, developing purchasing platforms and apps, and many entertainment establishments, such as theaters or cinemas, launched online broadcasting. Thus, people witnessed a deep shift in the public/private dichotomy, and digital tools played a key role in it. AM is only a part of such tools but a very special one due to the changes in the human-artificial balance in the work environment.
How algorithmic management transforms the world of work
The covid-19 social turbulence changed the territorial and time frames of work as millions of people had to change their way of organizing workspaces. For businesses, profitability became an incentive for organizing technology to make workers more productive and effective in the unstable environment. We may classify the variety of technologies used today to control labor into several types: the first is direct digital control by the employer; the second is workers’ self-management on a digital basis; the third is a combination of the first two. The first type is introduced by the employer to control and organize the worker (sometimes in a home-office space): work-tracking and performance-tracking computer programs, voice-over-internet protocols (VoIPs), video conference programs, work collaboration tools, cloud technologies, communication technologies (messengers), and virtual working spaces.
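To make the first type more tangible, here is a minimal sketch of what an employer-side work-tracking tool might do, written in Python purely for illustration: the class name, fields, and the crude “productivity share” metric are assumptions, not a description of any real product.

```python
# Hypothetical sketch of a "first type" AM tool: direct digital control by the
# employer via activity tracking. Names, fields, and metrics are illustrative only.
from dataclasses import dataclass, field
from datetime import datetime


@dataclass
class ActivityLog:
    """Collects timestamped activity events for one remote worker."""
    worker_id: str
    events: list = field(default_factory=list)  # (timestamp, application, is_work_related)

    def record(self, application: str, is_work_related: bool) -> None:
        """Log a single observed activity event."""
        self.events.append((datetime.now(), application, is_work_related))

    def productivity_share(self) -> float:
        """Share of logged events classified as work-related (a crude proxy metric)."""
        if not self.events:
            return 0.0
        work_events = sum(1 for _, _, related in self.events if related)
        return work_events / len(self.events)


# An employer-side dashboard could aggregate such shares across workers and
# flag those below an arbitrary threshold for a manager's review.
log = ActivityLog(worker_id="w-042")
log.record("spreadsheet", is_work_related=True)
log.record("video call", is_work_related=True)
log.record("social media", is_work_related=False)
print(f"Productivity share: {log.productivity_share():.0%}")
```

Even in such a trivial form, the sketch shows how the manager’s gaze is delegated to a program: what counts as “work-related” is a classification decision built into the tool, not a negotiation with the worker.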
Control is not the only managerial function that technologies take care of: HR management goes through digital transformation too, and these tools belong to the first type since employers use them. As for the legibility of such algorithms from the workers’ perspective, research results are contradictory. In their research on employment relationships in AM, Tomprou and Lee considered how employees perceive algorithmic agents taking on a managerial role as compared to human agents and found evidence in favor of diametrically opposite points of view [27]. In some studies, workers were more compliant with human decision-making, while in others, workers considered algorithmic and human agents similarly or preferred the algorithmic ones. Thus, the organizational agent type, algorithmic versus human, influences one’s psychological contract depending on the organizational inducement type: transactional versus relational [27]. Transactional inducements are tangible and calculative (salary and bonus); promissory cues about them can be conveyed equally well by humans and algorithms. Relational inducements focus on the subjective, personal aspects of work; therefore, “during recruitment, using algorithmic agents could lower perceived employer commitments compared to human agents interacting through video chatting, but this was not observed during onboarding” [27. P. 9]. The research shows that AM can be as effective as human managers, even when algorithms convey relational inducements.
Workers’ self-management technologies are the second type. Some of them had been introduced before the recent pandemic, but it created special social-economic conditions for an unprecedented demand for such tools (many were tried for the first time). The transformation of the working environment into a home office broke some daily habits; thus, many workers were forced to reorganize the control over their performance to avoid procrastination and preserve a certain work-life balance. These technologies ensure that profitability becomes a personal value through the use of such programs as calendars and planners, goal-setters, budget-planners, etc. In other words, workers are supposed to consider work efficiency as a personal goal rather than something useful for employers. Some authors argue that the AM of platforms is a new stage of control in management after direct control (“exercised by superiors and based on the direct surveillance”) and indirect control (“a form of domination over workers’ autonomy”) [1. P. 88]. However, this paper supports a broader perspective, considering the first and second types of technologies for organizing work as means of direct and indirect control and as types of AM.
The third type of technologies called AM is, in most cases, presented as the algorithms of digital platforms. Although some authors use the terms “platformization” and “AM” as synonyms [2], AM is used by platforms for organizing working processes, but there are other types of algorithms and tools to control work. Some platform owners make huge amounts of money from their innovative business practices, while others criticize such practices as illegal precarization based on the insufficient legal regulation of digital technologies. Platformization is a large-scale and systematic example of the disruption created by digital transformation: platforms often create questionable value, because the traditional, “maintaining the status quo” kind of employment contract implies taxes for both employers and employees, while platform workers do not pay taxes (and do not have social guarantees) in most countries.
From the control perspective, platform algorithms are special due to their innovative mix of indirect and direct control: “algorithmic management devices… give rise to even more pervasive forms of precariousness and intervene directly in modeling identities through a mechanism similar to the interiorization of market imperatives” [1. P. 89], i.e., these devices are “suppliers and users of control”, while platforms act as independent regulators [25]. AM becomes an infrastructure which changes management: sub-reporting becomes extinct on platforms, because it is neither vertical nor lateral; there is no accountability in algorithmic accounting [25]. This lack of accountability is an especially worrying feature of platforms due to their questionable legal status. During the covid-19 pandemic, workers without legal status were in double trouble due to (1) the risk of getting the virus while being pushed by AM to still complete tasks so as not to lose ratings, and (2) the fact that state support prioritized taxpayers (as was the case in Russia).
For instance, at the peak of the pandemic crisis in Barcelona, delivery workers as “subcontractors” of the platforms worked illegally, thus being deprived of social protection, but at the same time complied with the strict requirements of the platforms’ AM [28]. Without a guaranteed minimum wage, the income of delivery employees depended on a combination of factors: the ability to work during hours that the platform considers necessary (especially in the evenings and at weekends), the speed of delivery, and customers’ feedback. “Subcontractors” got points by achieving goals set by the platform, and points determined the ability to take orders (and thus to earn). Traffic jams, poor food quality, and any other failure, including those beyond the courier’s control, led to losing points; but at the outbreak of the pandemic, due to high demand, platforms for the first time provided the opportunity to choose working hours.
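The point system described above can be read as a simple scoring algorithm. The following Python sketch is a purely hypothetical reconstruction of such logic, assuming made-up rewards, penalties, and a threshold; it only illustrates how a score can gate access to orders, not how any actual platform computes ratings.

```python
# Hypothetical points logic of a delivery platform: rewards for meeting
# platform-set goals, penalties for failures (including those beyond the
# courier's control), and a threshold that gates access to new orders.
# All weights and thresholds are invented for illustration.

REWARDS = {"evening_shift": 6, "weekend_shift": 6, "fast_delivery": 3}
PENALTIES = {"late_delivery": 5, "bad_feedback": 8, "rejected_order": 10}


def update_points(points: int, events: list) -> int:
    """Apply platform-defined rewards and penalties to a courier's score."""
    for event in events:
        points += REWARDS.get(event, 0)
        points -= PENALTIES.get(event, 0)
    return max(points, 0)


def can_take_orders(points: int, threshold: int = 50) -> bool:
    """Below the threshold the courier cannot take new orders, i.e. cannot earn."""
    return points >= threshold


# A courier starts the day with 60 points, delivers one order quickly,
# then gets bad feedback and is stuck in a traffic jam (late delivery).
points = update_points(60, ["fast_delivery", "bad_feedback", "late_delivery"])
print(points, can_take_orders(points))  # 50 True: one more penalty cuts off income
```

The design choice that matters sociologically is not the arithmetic but the coupling: the same score that evaluates past work also controls future access to income, which is exactly the mix of direct and indirect control discussed below.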
Covid-19 lockdowns revealed the precarious status of platform contractors in many countries, as their protests followed cases of fatalities, unsafe work routines, and the questionable use of AM for distant work. EC and ILO experts believe that AM can lead to mental distress at work [2. P. 21]: when used for decision-making and the coordination of input, AM may make workers feel that they have little control over their work and that they are constantly monitored and evaluated, which leads to anxiety, stress and burnout. In the case of platform AM, the unpredictability of work demand and schedule can make it difficult for workers to achieve a healthy work-life balance, which can also contribute to stress. Thus, the AM of more mundane business models might be less stressful due to being more stable: the usual working routine with some transactional inducements by algorithms is closer to pre-digital times.
There are at least six challenges for the AM of platforms [2]: job and income insecurity due to the unpredictability of work demand; difficulty in making autonomous decisions due to the need to comply with given instructions; accidents and mental distress at work; high work intensity to meet requirements or make a living; deterioration of work-life balance; and a deep shift from traditional HR practices to new forms of control, monitoring and discipline. This might create a worrying picture, but it misses the positive sides of this kind of infrastructure creation. That is why we now turn to Industry 4.0, which has a much more positive reputation and for which AM plays a key role.
Algorithmic management in Industry 4.0
AM should not be reduced to platforms, since it provides disruptive instruments for all other sectors of the economy. Interconnectivity makes the demands of Industry 4.0 unimaginable without the automation of human performance tracking, but it is an open question whether the last word stays with a human manager or whether we should expect a platform-like power shift.
The Industry 4.0 Maturity Index [16] implies several stages on the way to the fourth industrial revolution. Computerization is only the first step of the organization’s digital transformation. The next step is connectivity, when different business apps are connected in synchronized work, which is the basis for visibility, when all workers have open access to information. This is an important ideological shift, because in many (if not most) groups power is based on control over information.
However, some elements of Industry 4.0, including artificial intelligence, flourish in China with an entirely different concept of management, although the founders of Industry 4.0 argued that an open information policy was the best way to reduce a broad variety of mistakes and bugs and to raise the level of trust. Workers can fight for open access to information, but providing access to control and management decisions is one of the key elements of managerial trust [30. P. 118]. Some authors [15; 31] argue that granting open access to job monitoring to both the worker and the employer is a sign of biopolitics. This implies a high level of workers’ self-management and argues for adding technologies of this type to the mosaic of AM, at least for a better understanding of the shift in control functions.
What the Industry 4.0 creators suggest as the only way to achieve technological advances seems to contradict the paradigm of unqualified control: “It is this control which orients technical development toward disempowering workers and the massification of the public” [12. P. 53]. Should we consider workers’ access to data on their own performance as empowering? In biopolitics studies, self-management technologies are considered just another stage of controlling others, thus creating only an illusion of empowerment. The Industry 4.0 Maturity Index implies that at the connectivity stage, open access to information may lead to the “widespread willingness to embrace change within the company, which is supported by continuous development and innovation” [16. P. 18]. Nevertheless, at the next visibility stage, when a special Industry 4.0 “company’s digital shadow” becomes a reality and the organizational structure is to change, the acceptance of data collection at all stages becomes critically important. It is also a cultural change, when workers are open to the idea of everyone knowing their level of performance, which requires a digital model of the situation in the company. If such a model is available only to decision makers, it will lose its efficiency, since the pressure of hierarchy makes employees hide information about their mistakes, which diminishes the chances of preventing damage to the company as early as possible.
The fourth stage is transparency, which implies the use of big data for understanding the digital shadow of the company. Again, it is expected that employees of different levels would be ready to participate, and management would become more agile: “predictive capacity is a fundamental requirement for automated actions and automated decision making” [16. P. 20]. We may see parallels with the use of AM on platforms, but here it is much harder due to the need to turn a hierarchy into more flexible networks, while platforms, starting from scratch, have no problem with changing what previously worked well.
It may seem that thanks to AM, “the operational autonomy of management and administration positions them in a technical relation to the world, safe from the consequences of their own actions” [12. P. 53]. If Industry 4.0 is as connected as its ideologists suppose, such administrative defense would not be as effective as before: all managers would become actants in a huge digital shadow of organizational networks. However, this does not apply to business owners in Industry 4.0 and platform owners: what looks like a huge reduction of inequality and hierarchy might be a huge increase in control. Before AM and Industry 4.0, capital and/or firm owners controlled their assets through their managers. Today the whole organization (humans and objects), thanks to ubiquitous computerization, can be ruled by decision-making based on systems of algorithms. Platform owners create a technological environment with unique opportunities for market manipulation [5].
The main goal of Industry 4.0 creation is its predictive capacity: technologies help to understand not only what is happening with the company now but also to predict its future based on all the obtained data. Such capabilities are key, game-changing business advantages. However, if the organizational structure should be ready for changes, the demands on management skills would increase. Implementing changes based only on predictions might be harder than implementing them based on current company results open to all levels of employees. The Industry 4.0 Maturity Index does not comment on the risks to which such openness may lead, when workers leave the firm knowing what future its shadow version promised. In the platform economy, short transactions do not provide workers with much information, while the informed Industry 4.0 worker has many more ways to affect the situation. The employee already has better control through visibility; at the transparency stage, management may pay even more attention to the employees’ opinions; the predictive capacity of the Industry 4.0 system gives workers more than previous generations had (like knowledge of the company’s current weak spots), but there is also a risk of bad relationships with those aware of the most reliable forecasts of the company’s future. If workers are to have access to all the data, we need a huge change in corporate culture.
The final step and goal of digital transformation within Industry 4.0 is adaptability: making an organization, an entity comprising people and machines, more flexible both now and in the future. Adaptability is achieved when all real-time adjustments are performed automatically; therefore, it demands “flexible communities” and “agile project management” with life-long learning for all employees [16. P. 21].
Thus, the development of Industry 4.0 demands a combination of AM with deep social, cultural, and structural changes at each step of creating this infrastructure. It is the social shift in management that raises most questions about digital transformation: technological infrastructure should never be seen as a mere tool but as a complex of special ways and cultural patterns to use it. If Industry 4.0 truly demands open access to information for all employees, why is it no less effective in countries with an authoritarian managerial culture? According to the index [16], most businesses have not achieved the final stages yet, so today Industry 4.0 and its managerial algorithms are more about accepting connectivity and constant job monitoring, in some societies willingly, in others not. Moreover, if we talk about open data for managers only or for workers too, which model is more effective? Even if total transparency seems to be the most effective way, can it support a more equal relationship between people in the organizational hierarchy? Or is such equality impossible in the knowledge economy with power held by technocrats? These questions still await answers.
The environment changes management as a profession, although there are still too many aspects that are cheaper to have done by people than to automate. Standards of effective management have become higher: there is a need to be good with both people and algorithms. AM changes the role of human managers in a highly digital environment due to high demands on emotional intelligence and the skills to motivate one’s team. With the further development of the predictive capabilities of digitally controlled firms, even lower levels of management will have to check the validity of such prognoses: are they based on the right and properly calculated data? Can HR managers decide the future of the employee based on the AM results, or should labor law protect people from such simplification? Today the new management infrastructure is being implemented, but it can still be changed to a more humane one if people stand against the automation of working relations. That is why the European Trade Union Institute issued the policy recommendations “Regulating algorithmic management. An assessment of the EC’s draft Directive on improving working conditions in platform work” [10]. We should expect more such initiatives in the future.
Digital transformation might have looked different if societies had not experienced covid-19 the way they did. The lockdowns across the world boosted our virtual life and its infrastructure; the private/public dichotomy was challenged by governmental technologies and management digitalization more widely and forcefully. This research focused on one aspect of this global change: AM in the broader definition of the term suggested by the EC and ILO experts [2], not reduced to platform algorithms. For the sociology of management, we suggest defining AM as a combination of (1) the automation of managerial functions by the employer; (2) technologies for workers’ self-management; (3) the digital-economy way of organizing labor, in which the platform is a third party. AM represents one of many innovations changing the human-artificial relationship and balance in the work environment, being one of many signs of the deterioration of the public/private dichotomy due to digital transformation. It may become the last nail in the coffin of privacy in the reality of social media, smart houses, smart cities, and e-government. Being a combination of direct and indirect control, AM has already shown many social biases in business. However, it is still understudied, partly since there are not many companies with a high Industry 4.0 Maturity Index, but mostly because the mainstream still reduces it to a matter of the platform economy. Even the network-like, anti-hierarchy Industry 4.0 increases the operational autonomy of business owners in their technical relations to the world, safe from the consequences of their own actions. In some cases, it may end up in the form of pan-spectron surveillance without legislative protection. Since AM as infrastructure is still at the early stages of Industry 4.0, many underestimate possible problems: the opportunities created by virtual shadows of organizations may look too tempting to care enough about the risks. AM is much more than just the digital support of traditional management functions; it implies deep structural and cultural changes.
About the authors
M. A. Yudina
HSE University
Author for correspondence.
Email: m.yudina@hse.ru
Pokrovsky Blvd., 11, Moscow, 109028, Russia
References
- Armano E., Leonardi D., Murgia A. Algorithmic management in food delivery platforms: Between digital neo-Taylorism and enhanced subjectivity. Digital Platforms and Algorithmic Subjectivities. London; 2022.
- Baiocco S., Fernández-Macías E., Rani U., Pesole A. The Algorithmic Management of Work and Its Implications in Different Contexts. Seville; 2022.
- Beniger J.R. The Control Revolution: Technological and Economic Origins of the Information Society. Cambridge-London; 1986.
- Bijker W.E., Hughes T.P., Pinch T.J. (Eds.). The Social Construction of Technological Systems: New Directions in the Sociology and History of Technology. Cambridge; 1987.
- Calo R., Rosenblat A. The taking economy: Uber, information, and power. Columbia Law Review. 2017; 117 (6).
- Desanctis G., Poole M.S. Capturing the complexity in advanced technology use: Adaptive structuration theory. Organization Science. 1994; 5 (2).
- Digital Platforms and Algorithmic Subjectivities. Armano E., Briziarelli M., Risi E. (Eds.). University of Westminster Press; 2022.
- Dudina V.I. From panopticon to panspectron: Digital data and transformation of surveillance regimes. Sociological Studies. 2018; 11. (In Russ.).
- Edwards P.N. Infrastructure and modernity: Force, time, and social organization in the history of sociotechnical systems. Modernity and Technology. Cambridge; 2003.
- ETUI: Regulating Algorithmic Management: An Assessment of the EC’s Draft Directive on Improving Working Conditions in Platform Work. 2022. URL: https://www.etui.org/publications/regulating-algorithmic-management.
- Feenberg A. Transforming Technology. Oxford; 2002.
- Feenberg A. Critical theory of technology: Overview. Tailoring Biotechnologies. 2005; 1 (1).
- Feenberg A. Technosystem: The Social Life of Reason. Harvard; 2017.
- Giddens A. Central Problems in Social Theory: Action, Structure, and Contradiction in Social Analysis. Los Angeles; 1979.
- Hardt M., Negri A. Commonwealth. Cambridge; 2009.
- Industry 4.0 Maturity Index. Managing the Digital Transformation of Companies. Update 2020. Schuh G., Anderl R., Dumitrescu R., Krüger A., Hompel M. (Eds.). URL: https://en.acatech.de/publication/industrie-4-0-maturity-index-update-2020.
- Jalabneh R. et al. Use of mobile phone apps for contact tracing to control the covid-19 pandemic: A literature review. Applications of Artificial Intelligence in covid-19. Medical Virology: From Pathogenesis to Disease Control. Nandan Mohanty S., Saxena S.K., Satpathy S., Chatterjee J.M. (Eds.). Singapore; 2021.
- Kovacs B., Caplan N., Grob S., King M. Social networks and loneliness during the covid-19 pandemic. Socius: Sociological Research for a Dynamic World. 2021; 7.
- Latour B. Science in Action. How to Follow Scientists and Engineers Through Society. Harvard University Press; 1987.
- Latour B. Reassembling the Social: An Introduction to Actor-Network-Theory. Oxford; 2005.
- Law J. Notes on the theory of the actor network: Ordering, strategy, and heterogeneity. 1992. URL: https://www.lancaster.ac.uk/fass/resources/sociology-online-papers/papers/law-noteson-ant.pdf.
- Livingstone S. In defense of privacy: Mediating the public/private boundary at home. Audiences and Publics: When Cultural Engagement Matters in the Public Sphere. Bristol; 2005.
- Lorenzini D. Biopolitics in the time of coronavirus. Critical Inquiry. 2021; 47 (2).
- Piper Liping L. Digital disinformation about covid-19 and the third-person effect: Examining the channel differences and negative emotional outcomes. Cyberpsychology, Behavior, and Social Networking. 2021; 23 (11).
- Stark D., Pais I. Algorithmic management in the platform economy. Sociologica. 2020; 22 (3).
- Styrin E.V., Dmitrieva N.E. State Digital Platforms: Formation and Development. Moscow; 2021. (In Russ.).
- Tomprou M., Lee M.K. Employment relationships in algorithmic management: A psychological contract perspective. Computers in Human Behavior. 2022; 126.
- Viera T. The lose-lose dilemmas of Barcelona’s platform delivery workers in the age of covid-19. Social Sciences & Humanities Open. 2020; 2 (1).
- Workman M., Ford R., Allen W. A structuration agency approach to security policy enforcement in mobile ad hoc networks. Information Security Journal. 2008; 17.
- Yakhontova E.S. Assessing trust in the context of personnel management. Sociological Studies. 2004; 9. (In Russ.).
- Zhelnin A.I. From “the first controlled pandemic in history” to rational biopolitical administration. Discourse-Pi. 2021; 1. (In Russ.).
- Zuboff S. The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. New York; 2018.