Imagine a marketing executive pitching a revolutionary new tool that automatically calculates the optimal price for every item you sell, tailored to appeal to specific customers and adjusted in seconds to account for industry conditions and competitor pricing. Powered by AI, your entire marketing department fits in one box. What's not to like?
Plenty, if you don't ask the right questions. Algorithmic pricing engines are among the hottest tools in marketing, but they're also a hot item among antitrust lawyers and lawmakers. The Sherman Antitrust Act and state laws prohibit competitors from acting in concert to set prices, and many suspect that is exactly what is happening inside some AI-driven pricing services.
If your algorithmic pricing engine crunches only your own numbers and widely available industry data, such as public prices, you're probably fine. The problem begins when the engine uses not only your internal data but your competitors' internal data as well. Katherine B. Forrest, a partner at Paul Weiss who focuses on legal issues surrounding AI, says lawyers call this "give-to-get," and it could be illegal.
"If you provide specific information, you'll get a better response," Forrest explained. "What's bothering people is confidential information."
Parallel pricing is not collusion
The U.S. Department of Justice settled a lawsuit against rent-setting service RealPage that alleged the company had algorithmically revived an old antitrust concept: the hub-and-spoke conspiracy. Competitors (the spokes) share sensitive information with a vendor or industry organization at the center (the hub), with the understanding that everyone else is sharing the same information and using it to make coordinated pricing decisions.
What makes the whole arrangement illegal is that collective understanding, known as the "rim." Greystar, the nation's largest apartment landlord, paid $50 million as part of a $141 million industry settlement with the government that included an agreement to stop using "competitively sensitive data of competitors."
Is your pricing engine breaking the law? That's a difficult question to answer without digging into the details. The Ninth Circuit Court of Appeals upheld the dismissal of a lawsuit over Las Vegas hotel pricing engine Cendyn. The judges found no rim: no clear understanding among the competing hotels to price their rooms based on confidential information. Simply matching a competitor's price, even if it creates the appearance of collusion, is known as "parallel pricing" and is allowed under long-standing antitrust precedent.
“Antitrust laws limit agreements among competitors about how to compete, but they do not require a company to turn a blind eye to the same information just because its competitors also know it,” the Ninth Circuit said.
Know your algorithm
The challenge for managers is to get enough detail from pricing vendors to determine whether their AI engines are following the rules. Does your black box crunch numbers other than the ones you provide? Is that outside information supplied by a competitor? Does your contract require you to follow the recommended prices? Do the same requirements apply to your competitors? And perhaps most importantly, do you understand how this mysterious engine actually works?
"If you don't have someone who can explain it to you, you could end up buying yourself a lawsuit that you can't easily escape," Forrest says.
The safest bet, she says, is to keep a written record of the answers to the questions above and to use separate vendors for data analysis and pricing recommendations. Avoid package deals that bundle both services in one mysterious box.
Managers must ask: "If something goes wrong, can someone clearly explain why the algorithm isn't operating as a price-fixing vehicle?" she says. "Who can say that's not what's happening?"
