About the author: Shreeja Sen is a Research Associate at IT for Change, where she contributes to the organisation’s work on digital labour, Big Tech regulation, and data governance. Her work centres on engaging with policy consultations at the national, regional, and global levels, and on research projects with a special focus on legal developments. She writes regularly on data regulation and corporate accountability, particularly of Big Tech. She has a bachelor’s degree in law and a master’s degree in public policy.
This blog symposium is co-organised by OECD Watch and NOVA School of Law.
Introduction
Increasing digitalisation has spawned a system-wide disruption, reorganising production from the global to the local scale. This is evident both in traditional value chains and in newer forms of work organisation, such as platform work, cloud work, microwork, and other invisible but significant sectors. The world of Big Tech, and its value chains, has recently reached a tipping point: governments and regulators are increasingly aware of the immense power these corporations hold, and attention has turned to holding them accountable, corporations that account for a majority of the top ten companies by market capitalisation. This has been pursued through legislative instruments such as the Digital Markets Act, the proposed EU Corporate Sustainability Due Diligence Directive (Directive), and the Platform Work Directive, which requires platform companies to provide basic minimum guarantees to their workers, as well as through action by regulatory agencies like the US Federal Trade Commission. These measures are a step in the right direction towards ensuring that corporations, especially the enormously powerful Big Tech, are responsible for their actions and their impact on people, society, and the planet. Against this backdrop, the 2023 targeted updates to the OECD’s Guidelines for Multinational Enterprises (MNEs) on Responsible Business Conduct (Guidelines) are also significant. While the Guidelines are voluntary for corporations, they offer an opportunity for improved conduct by MNEs, and scope for redress through the National Contact Point (NCP) process.
The latest update to the Guidelines, which comes over a decade after the adoption of the 2011 version, is certainly timely. The fast-paced nature of development makes it imperative that regulation and oversight mechanisms be equally swift. This is especially significant in the context of the technology sector, which has changed by leaps and bounds over the last fifteen years. The 2011 Guidelines, in particular, had become grossly outdated in today’s context, especially with regard to their Science and Technology chapter. In the 2011 version, the tech chapter focused on encouraging MNEs housed in developed or Global North countries to bring technology to developing or Global South countries, a form of tech evangelism. That version did not call on companies to undertake due diligence for their technology-related harms, and took a simplistic view of technology. In comparison, the 2023 Guidelines, which rename the chapter “Science, Technology and Innovation”, recognise the importance of the data value cycle and the need for risk-based due diligence across the entire technology value chain. This means that NCP complaints can now be filed against companies for failure to undertake due diligence over the human rights and environmental harms occurring in their technology value chains, giving affected communities a chance at remedy. The updated Guidelines also recognise the significance of privacy and data protection norms that must be maintained. Yet, the Guidelines fall short on several counts.
Missed Targets
The updates to the MNE Guidelines were intended to be “targeted”, which has been understood as the reason extensive changes were not made to the text. The 2023 updates expressly seek to cover downstream impacts, including in the tech sector. Even so, it is worth asking whether such targeted updates should fail to incorporate nuances of the current context, especially when technology continues to evolve in myriad ways. In an increasingly digital economy, where technology occupies a primary position not just in tech companies but also in more traditional corporations, this seems a blinkered position to hold. It is also useful to assess whom these gaps end up helping: if Big Tech corporations, with origins in either the US or China, are not held accountable for their data power by the Guidelines, then these enterprises are free to function as they wish (as they have in the past).
It is important to acknowledge these gaps, which are likely to become more apparent in the coming years. They can be grouped into the following buckets.
Data value, frontier tech and extractivism
In the tech chapter, the Guidelines do not recognise the financial value of data and, consequently, the wealth and power wielded by corporations that hold data. Nor do they consider anything beyond the collection and sharing of vast amounts of data, apart from requiring transparent data sharing and access mechanisms. Questions of digital intelligence and aggregate data mapping find no place in the Guidelines. The chapter also fails to mention any emerging or frontier technology, such as generative AI models, cryptocurrency, or the metaverse, or their regulation. The Guidelines’ treatment of open data likewise does not account for free-riding and data capture by Big Tech. These are blind spots in the chapter, since they fail to account for the business models of both first-mover tech corporations and digitalising corporations. The result is that the extractivist nature of Big Tech corporations goes unacknowledged in the text.
Missing cross-linkages across chapters
The Guidelines fail to outline the specific impacts of the technology sector on topics addressed in other chapters, including labour, competition, and taxation. These areas have seen material impacts from digitalisation and platformisation: digital labour platforms like Uber, Deliveroo, and Amazon Mechanical Turk; debates around the monopoly power of Big Tech, driven by its data accumulation; and digital tax havens, which enable Big Tech to pay almost no tax relative to the profits it makes.
Downstream value chain impact
Risk-based due diligence can often be highly stratified across regions, types of third party (including their scale of operations), and the types of business relationship corporations have with those third parties. This means that smaller players in the value chain, which are often located in Global South countries and provide critical support, can be left behind. For instance, the fast-fashion website Shein uses small-scale suppliers in China to fulfil its orders, which can lead to a race-to-the-bottom scenario with regard to rates and working conditions. The updated Guidelines are clearly focused on the importance of companies preventing and acting against harm. However, since the emphasis is often on making due diligence practicable for corporations rather than on preventing harm, those downstream in the value chain often become invisible.
The Significance of Data Power
As discussed earlier, the tech chapter of the Guidelines does not consider Big Tech’s data power (intensive and pervasive control over the entire value chain), data accumulation, and digital intelligence as relevant factors. This is of particular relevance for Global South countries, which are usually data suppliers for digital MNEs based in the Global North. This introduces another issue, that of cross-border data transfer, which the Guidelines again address only in terms of international commerce and knowledge exchange. The unidirectional flow of data and digital intelligence to the Global North, without any benefit sharing for the supplier countries, is not critically addressed in the text.
Separately, the fact that the Guidelines view data through the lens of privacy and personal data protection, but without setting down explicit standards, could be misleading. A UNCTAD study showed that the understanding of “sensitive data” varies from country to country and is often based on prevailing mores. A general appeal to data governance norms, data protection, and privacy thus falls short from a transnational perspective, a gap the Guidelines could have closed, but did not.
Intersections with Trade Agreements
While the Guidelines are important to have as a voluntary standard, they are often undercut by regional or bilateral trade agreements that set lower or conflicting expectations on overlapping issues. In the digital context, in particular, the trade agreement route has often been used as the primary mechanism of harmful (de)regulation that conflicts with the Guidelines. For example, the moratorium on tariffs on electronic transmissions, which prevents countries from imposing customs duties on such transmissions, directly conflicts with the Taxation chapter of the Guidelines, which requires enterprises ‘to contribute to public finances of host countries’. The moratorium deprives host countries, usually in the Global South, of the benefits of such duties, revealing how the tech chapter fails to protect the interests of Global South countries.
Other regional agreements, like the US-driven Indo-Pacific Economic Framework for Prosperity, seek to address issues of supply chains, digital innovation, labour, the environment, and corporate accountability standards. Trade agreements like these undermine not only the voluntary OECD Guidelines but also national legislation and norms. Such agreements can often be onerous for the Global South counterpart when negotiated by a Global North nation. Similar issues can be observed in bilateral settings, like the EU-India Free Trade Agreement (FTA) or the UK-India FTA, where India’s Global South position can hurt its prospects. Enterprises are expected to implement the higher of the expectations set by the Guidelines, on the one hand, and the requirements of domestic or international law, on the other. In practice, however, conflicts between the two standards, combined with the voluntary nature of the Guidelines, often result in companies following only the lower treaty standards.
Way Forward
In many ways, clear directions for making the Guidelines more robust emerge from this essay. In the first instance, it is important to acknowledge that the Guidelines, in their current form, are insufficient to ensure meaningful responsibility from Big Tech companies, and that they do not adequately safeguard the rights of Global South countries and their citizens. Additionally, since these are voluntary guidelines, much depends on their implementation by companies and on the NCP process, which has its own challenges, including getting companies to come to the table for discussion. There is a need to evaluate the efficacy of such voluntary frameworks and to compare them with more binding obligations. This is necessary to ensure that impacted countries, which are usually located in the Global South, have ample scope for recourse against digital MNEs. Finally, review processes for the OECD Guidelines, if carried out only every ten years or so, may not be sufficient, especially for the tech chapter, given the pace of development. Targeted updates, if a more achievable goal, could be undertaken at shorter intervals. In particular, it would be useful for the OECD to develop practical, sector-specific guidance for the tech sector to address some of these gaps.
*The analysis in this article draws from the work of IT for Change on data governance.
Suggested citation: S. Sen, ‘Missed Opportunities in the OECD Guidelines Tech-related Updates’, Nova Centre on Business, Human Rights and the Environment Blog, 2nd November 2023