Last week in a hearing before the Senate Subcommittee on Antitrust, Competition Policy, and Consumer Rights, Assistant Attorney General (AAG) Makan Delrahim announced that the Department of Justice (DOJ) is pursuing criminal charges against competitors who allegedly engaged in a price-fixing scheme facilitated by the use of search algorithms. While he did not reveal further details about the case, AAG Delrahim announced that he expected the investigation to conclude soon.

An algorithm is a series of instructions, frequently iterative, that when followed may solve a problem; algorithms are commonly written into software to process vast amounts of data and quickly arrive at a solution. Algorithms are frequently used in pricing, where a company may create a process that incorporates data on costs, competitors’ prices, and other significant factors to determine its own prices. Algorithms are largely used in a procompetitive manner, for example, by enabling a company to efficiently process huge quantities of data and react to a rival’s prices in real time.
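By way of illustration only, a unilateral, rule-based pricing algorithm of the kind described above might be sketched as follows. This is a hypothetical example; the function name, inputs, and margin thresholds are invented for illustration and do not reflect any algorithm at issue in the cases discussed.

```python
# Hypothetical sketch of a simple rule-based pricing algorithm.
# All names and thresholds here are illustrative assumptions.

def compute_price(unit_cost, competitor_prices,
                  target_margin=0.20, floor_margin=0.05):
    """Set a price from the firm's own cost data and observed competitor prices."""
    cost_based = unit_cost * (1 + target_margin)   # cost-plus candidate price
    market_based = min(competitor_prices)          # match the lowest observed rival price
    price = min(cost_based, market_based)
    # Never price below a minimum margin over cost.
    return max(price, unit_cost * (1 + floor_margin))

# A firm with a $10 unit cost reacting to rivals priced at $13.50, $12.75, $14.00:
print(compute_price(10.0, [13.50, 12.75, 14.00]))
```

Used unilaterally, such a process is the kind of efficient, procompetitive reaction to market data described above; the antitrust concern arises only when competitors agree on the algorithm or the prices it produces.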

Nevertheless, antitrust enforcement against companies employing algorithms is a new but growing phenomenon. In 2015, in U.S. v. David Topkins, a seller of posters, prints, and framed art pleaded guilty to criminal antitrust charges brought by the DOJ for entering into an agreement with other third-party sellers to fix prices on certain posters. To implement the conspiracy, Topkins designed an algorithm that the parties adopted to set prices at an agreed level and to monitor the prices of the parties to the agreement. Because of the ease with which the parties kept the agreement, the DOJ stated that the conspiracy was “self-executing” after implementation of the algorithms.

Indeed, algorithms can amplify the effectiveness of an illicit agreement. In 2017, the DOJ and Federal Trade Commission (FTC) warned that algorithms can be highly effective in facilitating illegal agreements because the “speed and ease of algorithmic pricing…likely reduce[s] the benefit that a firm would otherwise enjoy from…defecting from collusive pricing [i.e., from cheating on an illegal agreement].” The antitrust enforcers are clearly monitoring the use of algorithms in illegal agreements, and their willingness to pursue criminal charges shows that the DOJ considers this conduct to be particularly pernicious.

The DOJ and FTC have also warned that algorithms may be used to facilitate an illegal hub-and-spoke conspiracy; for example, competing firms could agree with a single firm “to use a particular pricing algorithm and…do so with the common understanding that all of the other competitors would use the identical algorithm.”

Foreign jurisdictions have also taken an active role in monitoring the use of algorithms in potentially anticompetitive conduct. For example, the UK’s Competition and Markets Authority (CMA) coordinated with the DOJ in the online poster price-fixing scheme described above and further found two of the companies involved liable for violating UK competition law. German competition authorities have launched investigations into dynamic pricing in various e-commerce markets, implicating the use of algorithms. Last December, Andreas Mundt, the president of Germany’s Bundeskartellamt (Federal Cartel Office), chided companies for trying to “hide behind algorithms” in response to an investigation into pricing in the passenger airline industry. Going one step further, Margrethe Vestager, the European Commissioner for Competition, has said that businesses have the responsibility to “ensure antitrust compliance by design” of their algorithms, even if a company “may not always know exactly how an automated system will use its algorithms to [m]ake decisions.”

Key takeaways: Companies can face significant antitrust liability for employing algorithms in price-fixing or other illegal agreements. The fact that an algorithm is a set of rules applied by a machine for pricing, marketing, advertising, or other purposes does not insulate a company from antitrust scrutiny over the algorithm’s design or implementation. After all, using an algorithm requires a decisional human component: from choosing or writing the algorithm (and thus determining the data inputs, the process, and the acceptable range of solutions sought) to implementing it (which raises the question of whether any agreement was reached with a competitor regarding the decision to adopt the algorithm). And this human decision-making remains subject to the strictures of the antitrust laws.