The Competition and Markets Authority ("CMA") has published a new research paper: 'Algorithms: How they can reduce competition and harm consumers', identifying (in its view) the most harmful areas of algorithmic operation in digital markets.  In addition, the CMA has launched a call for information, giving market participants the opportunity to help shape the CMA's understanding of how algorithms might support and/or harm consumers and, most importantly, its future enforcement stance.

In summary, the paper is highly instructive for firms seeking to understand the CMA's likely approach to the use of algorithms when its new Digital Markets Unit launches in April, setting out:

  • Types of direct harm to consumers – e.g. the use of personalisation measures to discriminate between consumers;
  • Exclusionary behaviour attributable to algorithmic use, e.g. certain self-preferencing activities might result in anti-competitive foreclosure of competitors (i.e. restricting their market access);
  • Measures that can be adopted by both firms and regulators to audit algorithmic systems more effectively; and
  • How regulatory powers can be used to enforce competition law in digital markets, including enhanced information gathering powers and potential new ex-ante enforcement powers.

Digital theories of harm

Following the CMA's 'Digital Markets Strategy' published in 2019, this paper presents its most recent views on how the use of algorithms might result in consumer harm and anti-competitive conduct. Although the paper is not guidance, it is highly instructive for firms on the CMA's likely enforcement stance and provides pointers on future audit requirements.  It is therefore helpful to consider the paper in the context of internal compliance policies in order to avoid future issues. 

1.  Direct harms to consumers

The CMA considers several practices centred on the personalisation of offerings to specific consumers that algorithmic data collection and analysis make possible, e.g. personalised pricing; personalised ranking; and geographic targeting.  As well as identifying potential anti-competitive harm resulting from price-based discrimination between consumers, the CMA explores a number of other 'softer' factors and conduct, e.g. whether platforms can favour certain merchants based on underlying commercial agreements, or limit the extent to which consumers are able to 'shop around' on certain sites and have full price transparency.

However, the CMA goes beyond traditional competition law application in drawing from other areas of law and policy to underpin the potential 'harm' to consumers, e.g. personalisation or ad-targeting based on 'protected characteristics' could be an infringement of equalities legislation and/or unlawful processing of personal data under data privacy legislation.  Further, the manipulation of website choice architecture, such as the use of 'dark patterns' to design unfair and non-transparent ranking results or to pre-empt customer switching (thereby encouraging 'stickiness' of consumers), might be considered an unfair trading practice through misrepresentation of the offering.

The paper also tackles well-trodden ground regarding the potential for algorithmic collusive practices, discussing both explicit and tacit coordination between firms, as well as hub-and-spoke collusion.  Indeed, the paper builds on the CMA's previous 2018 paper on algorithmic (personalised) pricing collusion between firms, as well as drawing from its recent Online Platforms and Digital Advertising market study.  Unsurprisingly, perhaps, the paper does not offer any new observations but repeats the current mantra: at present, the empirical evidence of such practices is uncertain, though growing.1
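To illustrate the tacit coordination concern in the abstract, the toy simulation below (entirely hypothetical, and not drawn from the CMA's paper) shows how two sellers each running a simple "match the rival's last price" rule can sustain a supra-competitive price without any communication or agreement between them:

```python
# Hypothetical illustration: two sellers each run a naive rule of matching
# the lower of the two most recent prices. No explicit coordination occurs,
# yet neither seller ever undercuts -- a toy example of how symmetric
# pricing algorithms can sustain tacitly coordinated outcomes.

def simulate(p1: float, p2: float, periods: int = 50) -> tuple:
    """Each period, every seller sets its price to min(own, rival's last price)."""
    for _ in range(periods):
        p1, p2 = min(p1, p2), min(p1, p2)
    return (p1, p2)

# Both sellers open at a supra-competitive 100: no undercutting ever occurs.
print(simulate(100.0, 100.0))  # -> (100.0, 100.0)

# If one seller opens lower, both converge to that level and then stay there.
print(simulate(100.0, 80.0))   # -> (80.0, 80.0)
```

The point of the sketch is that neither "algorithm" contains anything resembling an agreement; the coordinated outcome emerges from the symmetry of the rules, which is precisely why the evidential questions the CMA flags are difficult.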

2.  Exclusionary practices by (dominant) firms

The CMA expresses concerns that competition 'on the merits' might be impeded by dominant firms, e.g. a platform operator might self-preference its own goods and services over the offerings of competitors on its platform.  Similar to the direct harms discussed above, the CMA also discusses how algorithmic design in search ranking practices might achieve self-preferencing outcomes. The EU Platform to Business Regulation (the "P2B Regulation") – applicable to the UK – partially regulates such conduct by imposing transparency requirements on providers of online intermediation services and search engines giving 'differentiated treatment' (such as self-preferencing) to their own goods or services over the goods and services of other business users on the platform.  However, the P2B Regulation's aim is to ensure transparency and fairness for business users, not to address the potential impact of such practices on consumers. In contrast, the European Commission has been actively taking action against such conduct as infringing the competition law prohibition on abuse of dominance, and the CMA appears to be looking to take a similar stance in the post-Brexit world.  The CMA also notes how predatory pricing may result in anti-competitive foreclosure, particularly where algorithms allow customer switching patterns to competitors to be uncovered.

However, as with its analysis of direct harms, the CMA goes further again and explores theories of harm beyond choice architecture design leading to anti-competitive foreclosure, e.g. information asymmetry and conflicts of interest might arise where a firm acts as both merchant and agent/intermediary for other sellers on a site.  The practical management of those relationships or the implementation of customer/client instructions might be processed through automated decision-making programmes, which fail to appropriately manage the conflicts.

3.  Audit processes and documentation

The paper sets out a shopping list of ways in which both firms and regulators can effectively audit algorithmic operation. Interestingly, the CMA appears confident that such audits can be undertaken without access to firms' internal data, ranging from simple processes such as the use of 'mystery shoppers' through to more complex reverse engineering or emulation of web applications.
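As a purely illustrative sketch of the 'mystery shopper' approach the CMA describes, the snippet below compares the price quoted to different simulated consumer profiles and flags any divergence. `fetch_price` is a hypothetical stub standing in for a real HTTP client; the profile fields and product identifier are likewise invented for the example:

```python
# Hypothetical "mystery shopper" audit sketch: request the same product
# under different simulated consumer profiles and flag price divergence.

def fetch_price(product_id: str, profile: dict) -> float:
    """Stub: a real audit would issue an HTTP request carrying the profile's
    cookies/headers and parse the displayed price from the response. Here the
    response is simulated so the comparison logic is self-contained."""
    base = 50.0
    # Simulated personalised pricing: returning customers see a higher price.
    return base * 1.2 if profile.get("returning_customer") else base

def audit_personalised_pricing(product_id: str, profiles: list) -> bool:
    """Return True if the quoted price differs across consumer profiles."""
    prices = {fetch_price(product_id, p) for p in profiles}
    return len(prices) > 1

profiles = [
    {"returning_customer": False, "region": "UK"},
    {"returning_customer": True, "region": "UK"},
]
print(audit_personalised_pricing("sku-123", profiles))  # -> True
```

Even this trivial external probe requires no access to the firm's code or data, which is the CMA's point: absence of internal disclosure does not put algorithmic conduct beyond regulatory observation.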

Of course, access to underlying code is preferable, and it would appear that the CMA expects firms to have appropriate documentation, pseudo-code and general explanations as to how their algorithmic processes function.  Firms should ensure that they have accessible "communications and internal documents about the business context, objectives, design, architectural diagrams, training (including relevant function(s) that has been maximised during an algorithm’s training stage), key performance indicators (KPIs), and monitoring of algorithmic systems."  Indeed, where a firm does conduct audits, fact sheets, model cards, transparency reports, or other internal reviews are likely to be required.  The Financial Conduct Authority,2 the Information Commissioner's Office (the "ICO")3 and other independent bodies have published recommendations as to relevant documentation.
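In practice, the documentation list quoted above might take the shape of a structured 'model card' or fact sheet per algorithmic system. The sketch below is a hypothetical example of such a record; every field name is an assumption for illustration, not a schema prescribed by the CMA, FCA or ICO:

```python
import json

# Hypothetical model card for an illustrative ranking system. Field names
# are assumptions chosen to mirror the documentation categories the CMA
# paper lists (business context, objectives, training, KPIs, monitoring).
model_card = {
    "system_name": "search-ranking-v3",          # illustrative system name
    "business_context": "Orders merchant listings shown to consumers.",
    "design_objective": "Maximise relevance of results to the query",
    "training_objective_maximised": "log-likelihood of observed clicks",
    "kpis_monitored": ["click-through rate", "conversion rate"],
    "architecture_docs": "internal wiki reference (illustrative)",
    "last_internal_audit": "2021-01-15",
    "accountable_owner": "search-platform-team",
}

# Serialise for inclusion in a transparency report or regulator response.
print(json.dumps(model_card, indent=2))
```

A record of this kind is cheap to maintain alongside the system it describes, and gives compliance and legal personnel a single artefact from which to answer the "how does it work and who is responsible" questions the paper anticipates.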

At a minimum, firms should be ready to provide regulators with transparent explanations on how complex algorithms function: "it is incumbent upon companies to keep records explaining their algorithmic systems, including that more complex algorithms are explainable". However, beyond record-keeping, firms should have a proper understanding of how their algorithms operate.  The paper points towards a clear expectation by the CMA that businesses will need to demonstrate adequate systems and controls with regard to algorithms, including training of compliance and other decision-makers in how they operate.  The CMA states upfront that "firms are responsible for effective oversight of such systems, which should include robust governance, holistic impact assessments, monitoring and evaluation".  It stands to reason that the more integral the algorithm to the operation of the firm's business, the greater this expectation becomes. It is therefore crucial that firms attribute responsibilities to compliance, legal, commercial and IT personnel to ensure an appropriate understanding of how the algorithms function in practice and the emerging boundaries within which they can operate, as well as adequate underlying documentation.

The CMA notes the desire to avoid unduly burdening firms with audit requirements, which will be particularly felt by smaller firms; however, it might be inferred that firms which routinely undertake robust and transparent audit practices will be able to present a stronger defence to regulatory action. 

4.  Regulatory action

The CMA paper goes beyond the "usual" competition law remedies (e.g. greater disclosure, internal/external monitoring and risk assessment) to explore the possibilities and limitations of conducting further monitoring activities through the use of other means, e.g. regulatory sandboxes; the hiring of dedicated compliance monitoring staff; and/or use of new accreditation bodies to demonstrate compliance.

In addition, the CMA's Data, Technology and Analytics (DaTA) Unit conducts monitoring and testing activities (similar to its Mergers Intelligence Unit), and the CMA has recommended to the UK Government that its statutory information gathering powers be extended to enable proactive monitoring of conduct without having to open a formal investigation or market study.

Finally, the new 'Digital Markets Unit' within the CMA is expected to commence work in April 2021, and the CMA has already recommended that the UK Government introduce further enforcement powers to enable the CMA to enforce competition law better against firms with so-called "strategic market status".  These requested measures include the introduction of a legally binding and tailored code of conduct for digital firms, and greater ex-ante scrutiny of mergers, including, for the first time, a proposed mandatory notification requirement.

Conclusions: no more heads in the sand(box) when it comes to algorithms

The CMA is now actively inviting engagement with market participants on the application of the competition law regime to algorithms.  The paper is detailed, and demonstrates both that the CMA is actively interested in algorithmic operations and the anti-competitive harms that might arise from them, and that it is equipping itself with the resources and knowledge to take action against, and remedy, anti-competitive effects in the marketplace.

The paper points towards a very clear audit and compliance expectation on all firms, irrespective of their size. This translates not only into heightened systems and monitoring controls, with adequate documentation, robust audits and training programmes, but also into a clear expectation that certain personnel – including senior management – understand how these algorithms function.

Clearly, larger firms will be better resourced to rise to these compliance challenges. However, even smaller firms that perhaps license "off the shelf" algorithms will not be exempt, even if they do not have access to the underlying data or analytics tools required to train the algorithm.  Smaller firms will therefore need to ensure that they carry out appropriate due diligence and that their commercial agreements and licences with larger providers address the provision of these audit and compliance materials; they will also need to consider how they can satisfy the obligation to understand their algorithms' operation and to demonstrate that understanding.  Accordingly, to the extent that smaller firms will be burdened with the same expectations as larger and even potentially dominant firms, this perhaps calls into question the proportionality of the future enforcement regime and re-opens the debate surrounding the potential unintended chilling effects on competition of enhanced regulatory scrutiny.

Overall, however, businesses should be in no doubt that anti-competitive harms that might be attributed to algorithmic processes are clearly within the CMA's sights under the remit of the new Digital Markets Unit, and will remain so for the foreseeable future.