Now the trend is spreading. E-commerce platforms such as Booking.com routinely test personalised discounts based on your profile. Ride-share apps, grocery promos, digital subscription plans – the reach can be broad.
How AI-driven personalised pricing works
At their core, such systems mine data – a lot of it. Every click, the amount of time spent on a web page, prior purchases, abandoned carts, location, device type, browsing path – these all feed into a profile. Machine learning models predict your “willingness to pay”. Using these predictions, the system picks a price that maximises revenue while hoping not to lose the sale.
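The core trade-off can be sketched in a few lines. This is an illustrative model only – the logistic purchase-probability curve, the sensitivity parameter, and the candidate price grid are all assumptions for the example, not any platform's actual system:

```python
import math

def purchase_probability(price, willingness_to_pay, sensitivity=0.1):
    """Illustrative logistic model: the chance of a purchase falls as the
    price rises above the predicted willingness to pay."""
    return 1.0 / (1.0 + math.exp(sensitivity * (price - willingness_to_pay)))

def optimal_price(willingness_to_pay, candidate_prices):
    """Pick the candidate price that maximises expected revenue,
    i.e. price multiplied by the predicted chance of a sale."""
    return max(candidate_prices,
               key=lambda p: p * purchase_probability(p, willingness_to_pay))

prices = list(range(50, 151, 10))
# Two shoppers, two predicted willingness-to-pay levels, two prices:
high_wtp_price = optimal_price(100, prices)
low_wtp_price = optimal_price(70, prices)
```

The same item gets a higher optimal price for the shopper the model predicts is willing to pay more – which is exactly why two people can see different numbers for the same product.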
Some platforms go further. At Booking.com, teams used modelling to select which users should receive a special offer, while meeting budget constraints. This drove a 162% increase in sales, while limiting the cost of promotions for the platform.
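The Booking.com result is described only at a high level, but "which users get the offer, within a budget" is a classic constrained-selection problem. A common generic approach is a greedy, knapsack-style rule: rank users by predicted sales uplift per dollar of promotion cost, then fund offers until the budget runs out. The field names and the greedy rule below are assumptions for illustration, not the platform's published method:

```python
def select_offer_recipients(users, budget):
    """Greedy selection under a budget: rank users by predicted uplift
    per unit of promotion cost, then add recipients while funds remain."""
    ranked = sorted(users, key=lambda u: u["uplift"] / u["cost"], reverse=True)
    chosen, spent = [], 0.0
    for u in ranked:
        if spent + u["cost"] <= budget:
            chosen.append(u["id"])
            spent += u["cost"]
    return chosen, spent

users = [
    {"id": "a", "uplift": 9.0, "cost": 3.0},  # uplift per dollar: 3.0
    {"id": "b", "uplift": 4.0, "cost": 1.0},  # uplift per dollar: 4.0
    {"id": "c", "uplift": 5.0, "cost": 4.0},  # uplift per dollar: 1.25
]
chosen, spent = select_offer_recipients(users, budget=4.0)
```

With a budget of 4, the greedy rule funds users "b" and "a" and skips "c" – the discounts go where the model expects the most extra revenue per promotional dollar.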
So you might not be seeing a standard price; you might be seeing a price engineered for you.
The risk is consumer backlash
There are, of course, risks to the strategy of personalised pricing.
First, fairness. If two households in the same suburb pay different rent or mortgage rates, that seems arbitrary. Pricing that uses income proxies (such as device type or postcode) might entrench inequality. Algorithms may discriminate (even unintentionally) against certain demographics.
Second, alienation. Consumers often feel cheated when they find a lower price later. Once trust is lost, customers might turn away or seek to game the system (clear cookies, browse in incognito mode, switch devices).
Third, accountability. Currently, transparency is low; firms rarely disclose the use of personalised pricing. If AI sets a price that breaches consumer law by being misleading or discriminatory, who’s liable — the firm or the algorithm designer?
What the regulators say
In Australia, the Australian Competition and Consumer Commission (ACCC) is taking notice. A five-year inquiry published in June 2025 flagged algorithmic transparency, unfair trading practices, and consumer harms as central issues.
Is this efficient, or creepy?
We’re entering a world where your price might differ from mine — even in real time. That can unlock efficiency, new forms of loyalty pricing, or targeted discounts. But it can also feel Orwellian, unfair or exploitative.
The challenge for business is to deploy AI pricing ethically and transparently, in ways customers can trust. The challenge for regulators is to catch up. The ACCC’s actions suggest Australia is moving in that direction, but many legal, technical, and philosophical questions remain.