– Soumya Hariharan, Sakshi Agarwal and Akrathi Shetty†
Overview
Enterprises are increasingly relying on algorithms to track, predict, and create business models to improve their efficiency by way of reduced costs, enhanced quality, and better allocation of resources. Enterprises prefer algorithms over human labour because they deliver several benefits, including speedier decision-making and greater analytical sophistication. Netflix and Spotify, for instance, use algorithms to provide recommendations to users based on their viewing and listening history, which not only curates a better user experience but also enables product development by personalizing experiences based on individual-specific data. Utilising data collection and data analytics, algorithms gauge market behaviour and assist companies in offering better products/services that engage consumers by understanding their demands.
Pricing algorithms predict the optimal and profit maximizing price given various data inputs, which would include market demand and supply conditions, and could include the prices charged by competitors for similar goods. This allows for the implementation of dynamic pricing, including surge pricing (for cab aggregators) or personalized pricing (discounts on shopping websites). Despite the pro-competitive benefits of algorithms, there are several concerns surrounding the use of algorithms in the digital economy. For instance, pricing algorithms designed to identify the maximum price consumers may be willing to pay for a product might escape antitrust scrutiny, but ultimately result in consumer harm by charging higher prices.
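As a simplified illustration of how such a pricing algorithm might work, the sketch below sets a price from an estimated demand level and observed competitor prices. It is a hypothetical example; the function name, parameters and figures are assumptions for illustration only and are not drawn from any system discussed in this article.

```python
# Illustrative dynamic pricing rule (hypothetical): the price responds to an
# estimated demand index and to the lowest observed competitor price.
# All parameter names and values are assumptions for this sketch only.

def dynamic_price(base_price: float,
                  demand_index: float,
                  competitor_prices: list[float],
                  surge_cap: float = 2.0) -> float:
    """Return a price given current demand and rivals' prices.

    demand_index: 1.0 means normal demand; values above 1.0 indicate excess
    demand (surge conditions).
    """
    # Scale the base price with demand, but never beyond the surge cap.
    surge_multiplier = min(max(demand_index, 1.0), surge_cap)
    price = base_price * surge_multiplier

    # Avoid pricing above the cheapest rival when demand is not surging.
    if competitor_prices and demand_index <= 1.0:
        price = min(price, min(competitor_prices))

    return round(price, 2)


# Example: moderate surge with two rivals in the market.
print(dynamic_price(base_price=100.0, demand_index=1.4,
                    competitor_prices=[120.0, 110.0]))  # -> 140.0
```

Even a rule this simple shows why regulators focus on the inputs: once competitor prices feed directly into the pricing decision, the line between ordinary price monitoring and facilitated coordination becomes a matter of design and use.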
Algorithms and the risk of collusion
The Competition Act, 2002 (“Act”) prohibits any agreement which causes, or is likely to cause, an appreciable adverse effect on competition (“AAEC”) in markets in India. Section 3(1) of the Act prohibits any agreement having an AAEC in India, and such agreements are void.[1] Price-fixing cartels fall squarely within this prohibition. However, the use of automated pricing algorithms may enable a novel form of ‘algorithmic collusion’: algorithms may be designed to tacitly collude without the help of traditional means of communication. Such collusion poses a greater challenge for antitrust regulators than human collusion, as it is harder to detect and uncover. One such challenge is identifying an ‘agreement’ in the absence of communication and explicit co-ordination.
Primarily, there are three ways in which pricing algorithms may lead to tacit collusion:
hub and spoke cartels – competitors using the same pricing algorithms as a central ‘hub’ to coordinate prices;
predictable agent – each firm unilaterally creating an algorithm that reacts to market events in a predictable way, allowing competitors to capture signals which could lead to a coordinated outcome (a simplified sketch of this scenario follows the list);
artificial intelligence or digital eye – pricing algorithms becoming sophisticated enough to self-learn and anticipate events in the market even before they occur, potentially leading to a coordinated outcome.
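To make the ‘predictable agent’ scenario above more concrete, the following sketch simulates two independently coded pricing rules that each react predictably to the rival’s last observed price. The rules, cost and price figures are purely illustrative assumptions; the point is only that prices can drift upwards in parallel without any communication between the firms.

```python
# Illustrative 'predictable agent' simulation (hypothetical rules and numbers).
# Each firm's algorithm is written independently, but because both react to the
# rival's last observed price in a predictable way, prices drift upwards in
# parallel without any communication between the firms.

COST = 50.0             # common marginal cost (assumption)
MONOPOLY_PRICE = 100.0  # price neither algorithm will exceed (assumption)

def firm_a_rule(own_last: float, rival_last: float) -> float:
    # Firm A: never undercut the rival; probe a small price increase each period.
    return min(max(own_last, rival_last) + 2.0, MONOPOLY_PRICE)

def firm_b_rule(own_last: float, rival_last: float) -> float:
    # Firm B (coded separately): simply match the rival's last observed price,
    # as long as it covers cost.
    return max(rival_last, COST)

price_a, price_b = 60.0, 60.0  # competitive starting point
for period in range(1, 31):
    price_a, price_b = (firm_a_rule(price_a, price_b),
                        firm_b_rule(price_b, price_a))
    if period % 10 == 0:
        print(f"period {period}: firm A = {price_a:.2f}, firm B = {price_b:.2f}")

# Prices converge towards the (assumed) monopoly level even though the firms
# never exchange any information directly.
```

In this toy run, firm A’s small probes are matched one period later by firm B, and both prices settle at the ceiling, which is precisely the kind of parallel outcome that is difficult to characterise as an ‘agreement’ under traditional antitrust concepts.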
While these are the key ways in which algorithmic pricing collusion may take place, the effects of such algorithms continue to be assessed and will evolve with time. As our knowledge about algorithms increases, other methods of algorithmic collusion are likely to emerge. Antitrust regulators are still catching up with this technological challenge, which may require additional skilled resources, including data scientists, to appreciate and holistically analyse the threats such algorithms pose to markets and consumers.
Additionally, algorithms may pose specific challenges for antitrust agencies that go beyond proving an anti-competitive agreement. The possibility that algorithms could learn to coordinate prices without their developers being aware of this, or intending an anti-competitive outcome, brings with it a host of unanswered questions and risks. One fundamental concern arises when the anti-competitive exchange of information between algorithms is not the result of any co-ordinated arrangement, but of a failure by the programmers to implement the safeguards necessary to prevent such an exchange. To this effect, the European Commissioner for Competition, Margrethe Vestager, has observed that “companies can’t escape responsibility by hiding behind a computer program.”
Algorithms have made it complex for antitrust regulators to detect and prove the existence of an infringement in the absence of communication between competitors. The detection of algorithmic collusion may be difficult because algorithms limit the need for direct communication or physical meetings, and parallel conduct by itself is insufficient to prove an illegal cartel. In order to remain compliant, companies would need to undertake self-assessment measures and adopt compliance programmes that routinely monitor any adverse implications of their algorithms on markets. Incorporating adequate safeguards while implementing their algorithms would mitigate any inadvertent anti-competitive practices.
Global Developments
The earliest case in the European Union relating to algorithmic collusion was Eturas, which concerned a common online booking platform through which Lithuanian travel agencies sold their products; the agencies entered data into the system, and the platform’s administrator implemented a technical restriction capping the discounts the agencies could offer. In this case, it was held that “knowledge” was necessary to find that the parties tacitly agreed to an anti-competitive action, and that without such knowledge there can be no infringement. This knowledge could be inferred from ‘objective and consistent’ indicia. Therefore, if an enterprise signed up for the platform knowing that it was also used by its competitors and that the platform restricted prices at a certain level, such an enterprise may be held liable for anti-competitive conduct.
Recently, the European Commission (EC) imposed fines on four consumer electronics manufacturers for restricting the ability of online retailers to set their own retail prices. This was the first instance in which the EC referred to the impact of pricing algorithms in a competition infringement decision. Many online retailers used pricing algorithms that automatically adapted their retail prices to those of their competitors. The EC held that the manufacturers’ use of sophisticated monitoring tools to track resale prices and intervene whenever prices decreased limited effective price competition between retailers and led to higher prices, with an immediate effect on consumers.
In the United States of America (U.S.), anti-competitive conduct resulting from algorithms has been the target of enforcement actions by antitrust authorities. United States v. Airline Tariff Publishing Co. (ATPCO) marks the first case to observe that computer-determined pricing may be prone to the same co-ordination as human-determined pricing. ATPCO was an agency that collected pricing data from airlines in advance and published it, on the basis of which airlines would announce tariff increases that were subsequently matched by their rivals. According to the Department of Justice (DOJ), the computerised data exchange mechanism permitted rivals to co-ordinate prices without explicitly communicating, although the DOJ also observed that collusion is difficult to establish without being able to demonstrate some form of co-ordination. ATPCO was originally intended to disseminate fare information to travel agents and the public, but was used by the airlines to reach price-fixing agreements by co-ordinating their prices. The case was settled on the condition, inter alia, that the airlines agree to stop announcing price increases in advance of the date on which they took effect, thus restricting behaviour that facilitated the communication of information supporting collusion.
Subsequently, in United States v. Topkins, the DOJ prosecuted David Topkins, an Amazon seller, for coordinating the prices of posters sold online with other competitors by intentionally developing pricing algorithms with the object of collusion. More recently, an antitrust class action was brought against Uber for allegedly facilitating an illegal price-fixing conspiracy among its drivers through its computer-based algorithm, but the dispute was ultimately referred to arbitration. The action was filed by an Uber customer who alleged that the pricing algorithm yielded supra-competitive prices and that, in its absence, drivers would be free to charge lower prices to compete for customers.
By enabling monitoring, algorithms could in principle facilitate the implementation or maintenance of a collusive agreement between competitors. For instance, the Competition and Markets Authority (“CMA”) penalised two competitors who sold posters and frames online for agreeing not to undercut each other’s prices for specific products sold online. The sellers used automated re-pricing software to monitor and adjust their prices and ensure that neither undercut the other. Further, the CMA recently penalised two musical instrument manufacturers for engaging in online resale price maintenance (“RPM”). The manufacturers implemented RPM by restricting retailers from setting online prices below a set minimum, and used price monitoring software to track online prices on a real-time basis and ensure compliance. In order to tackle RPM, the CMA has developed its own bespoke price monitoring tool, created to detect suspicious online pricing activity. The CMA stated that, “The ability of manufacturers and retailers to track and monitor online prices in this way gave us the idea to monitor online pricing ourselves.” The CMA’s enforcement experience demonstrates that reliance on monitoring software to detect algorithmic collusion and spot suspicious activity may deter parties from engaging in RPM, and could be extended to other sectors in the future.
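The monitoring side of this enforcement strategy can also be illustrated in code. The sketch below is a hypothetical, heavily simplified screen, not the CMA’s actual tool: it merely flags product listings where two retailers’ prices have moved in near-lockstep over time, a pattern that might warrant closer review but is not, by itself, proof of collusion or RPM. The function name and thresholds are assumptions.

```python
# Illustrative price-monitoring screen (hypothetical): flag listings where two
# retailers' daily prices track each other almost perfectly. Such a flag only
# suggests a closer look; it is not evidence of collusion on its own.
from statistics import correlation  # available in Python 3.10+

def lockstep_flag(prices_retailer_1: list[float],
                  prices_retailer_2: list[float],
                  corr_threshold: float = 0.99,
                  max_gap_pct: float = 1.0) -> bool:
    """Return True if the two price series move in near-lockstep."""
    corr = correlation(prices_retailer_1, prices_retailer_2)
    avg_gap_pct = sum(
        abs(p1 - p2) / p1 * 100
        for p1, p2 in zip(prices_retailer_1, prices_retailer_2)
    ) / len(prices_retailer_1)
    return corr >= corr_threshold and avg_gap_pct <= max_gap_pct

# Example: two retailers whose prices change on the same days by the same amounts.
r1 = [199.0, 199.0, 189.0, 189.0, 209.0, 209.0]
r2 = [199.5, 199.5, 189.5, 189.5, 209.5, 209.5]
print(lockstep_flag(r1, r2))  # -> True
```

In practice, any such screen would need to control for innocent explanations, such as common cost shocks or lawful price matching, before an authority draws inferences from the data.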
The CMA’s working paper on the ‘use of pricing algorithms to facilitate collusion and personalized pricing’ acknowledges the widespread use of algorithms by online platforms and identifies the possibility of their being used to instantaneously detect competitor prices and deviations, thereby enabling market coordination or tacit collusion between competitors. The CMA considers that the hub and spoke scenario raises the most immediate risk, as it only requires the adoption of the same algorithm by several firms. Given the indispensable nature of algorithms, the need of the hour is to regulate the way businesses use them, rather than to prevent or restrict their use.
Developments in India
The jurisprudence relating to algorithms is still evolving in India, and the Competition Commission of India (CCI) has substantively assessed only a few cases pertaining to algorithmic collusion. The first case involving algorithms was Matrimony.com v. Google LLC & Ors.[2], pertaining to Google’s use of search algorithms. The algorithms enabled Google to determine the positioning of advertisements on the search results page, and Google’s ability to own, control and design the algorithm enabled it to manipulate and intervene in the automated search process to impact the relevance and ranking of results. The placement of Google’s search verticals was not determined strictly by relevance, which misled consumers into believing that the prominently displayed search results had been algorithmically determined to be the most relevant. The CCI found Google’s algorithms to be discriminatory and to favour its own services by manipulating search results, and consequently held its conduct to be unfair and abusive.
Subsequently, in Samir Agarwal v. ANI Technologies Pvt. Ltd & Ors., the CCI dismissed the allegation of a hub and spoke cartel, operated by means of pricing algorithms, between the ride hailing apps Uber and Ola (collectively, “Cab Aggregators”) and their drivers. The CCI held that to establish the existence of a hub and spoke cartel, there must be an exchange of sensitive information by the spokes (drivers) through the hub (Cab Aggregators), which acts as a facilitator of the cartel. The CCI found that the algorithmically determined fare was different for each rider and each trip due to the interplay of big data, and was based on several factors such as distance, time and traffic; the pricing therefore could not be said to be collusive.
Most recently, the National Company Law Appellate Tribunal (“NCLAT”)[3] upheld the decision of the CCI in the Cab Aggregators case and agreed that an anti-competitive hub and spoke conspiracy would require an agreement among all drivers to set prices through the Cab Aggregators, or to allow the Cab Aggregators to co-ordinate prices between them. In the absence of any proof of communication between the drivers inter se, the NCLAT dismissed the possibility of collusion between the drivers through a hub, since the Cab Aggregators did not function as associations that could facilitate a cartel between their respective drivers.
In the CCI’s merger control decision in the radio taxi sector, Hyundai Motor Company/Kia Motors Corporation, the parties voluntarily offered commitments pertaining to algorithms in order to allay any competition law concerns. The commitment would require the algorithm of the radio taxi marketplace not to prefer a driver solely on the basis of the brand of passenger vehicle manufactured by the acquirers, and not to discriminate against any driver solely on the basis of the brand of passenger vehicle manufactured by any other automobile manufacturer (i.e., other than the acquirers). The decision is significant as it demonstrates the CCI’s willingness to accept modifications pertaining to algorithms in the digital economy, which determine how enterprises operate and impact their respective markets.
Challenges & Looking Forward
The use of algorithms no doubt poses several challenges, including distorting markets by making collusive outcomes more likely without any contact between competitors, and implementing facilitating practices that make detection by competition authorities difficult. A major concern with machine learning and deep learning algorithms, which detect meaningful patterns in datasets, is their ‘opaqueness’: as the algorithm develops, it becomes less transparent, leading to a lack of control and supervision and heightened risks of harmful conduct. Antitrust authorities are taking steps towards scrutinising algorithms; to this end, the CMA recently established a data unit to detect anti-competitive behaviour, and the French and German antitrust regulators have launched a joint project on algorithms and their implications for competition. The enforcement mechanism likely to be adopted by antitrust agencies would need to efficiently balance the pro- and anti-competitive effects of algorithms and their impact on consumers.
From an Indian enforcement standpoint, the Ministry of Corporate Affairs set up the Competition Law Review Committee (“CLRC”) to review and recalibrate the Act, with its key recommendations being factored into the Competition (Amendment) Bill, 2020 (“Bill”). The CLRC observed that the existing framework under Section 3 of the Act is sufficient to cover scenarios of algorithmic collusion. Notably, the Bill proposes to bring hub and spoke arrangements within the scope of anti-competitive agreements.[4] The proposed insertion would tighten the CCI’s grip on entities using a third-party provider’s algorithm as a ‘hub’ to determine prices or react to changes in the market. Such widening of the CCI’s powers would allow algorithms acting as ‘hubs’ to be caught within the ambit of the Act, thus addressing the enforcement gap. The CCI is cognizant of the increasing use of algorithms by businesses in India and has kept abreast of their unique nuances and challenges. In its advocacy engagements, the CCI has been deliberating whether the increased use of artificial intelligence and algorithms will lead to new ways of colluding. This is further elucidated by the following observation: “As more online players use AI and pricing algorithms, will it create new ways to collude? How will antitrust law work when decisions are no longer made by humans but instead by machines?”[5]
Competition authorities across the globe are signalling a clear message: companies must ensure they do not use algorithms in a manner that adversely affects competition. Moreover, the complex challenges created by algorithms, coupled with the increasingly borderless nature of online markets, place significant importance on effective co-operation between competition authorities internationally to share intelligence and discuss best practices. It remains to be seen how traditional antitrust concepts such as ‘meeting of minds’ and ‘agreement’ will be applied to algorithms operating with little or no human interference. How would such decisions be imputed to human behaviour? Attributing liability may be particularly complex where, prior to adopting the algorithm, firms provide robust evidence of due diligence indicating little or no possibility of tacit collusion. In such scenarios, a one-size-fits-all approach may not be suitable for antitrust enforcement, and regulators may need to analyse anti-competitive effects on a case-by-case basis.
Soumya Hariharan is a Partner in Trilegal’s competition law practice; Sakshi Agarwal is a Senior Associate (Competition Law) at Trilegal; and Akrathi Shetty is an Associate (Competition Law) at Trilegal. They are based out of Mumbai and can be reached at soumya.hariharan@trilegal.com, sakshi.agarwal@trilegal.com and akrathi.shetty@trilegal.com for any queries.
[1] Section 3(1) of the Act states that, “No enterprise or association of enterprises or person or association of persons shall enter into any agreement in respect of production, supply, distribution, storage, acquisition or control of goods or provision of services, which causes or is likely to cause an appreciable adverse effect on competition within India.”
[2] The matter is currently pending on appeal before the NCLAT.
[3] The Finance Act, 2017 has transferred the appellate functions under the Act to the NCLAT from the erstwhile Competition Appellate Tribunal, which has ceased to exist effective May 2017.
[4] Section 3(3) of the Act (proposed proviso).
[5] Keynote Address by Mr. Augustine Peter, Former Member, Competition Commission of India at ASSOCHAM 5th International Conference on Competition Law & Tech Sector.