Canadians have grown accustomed to a lot when buying food. Shrinkflation has reduced package sizes. Skimpflation has diluted quality. Loyalty programs increasingly resemble surveillance rather than savings. Prices often feel disconnected from what is happening at the farmgate. Yet 2026 may mark a more consequential shift: consumers realizing that artificial intelligence itself may be pushing grocery bills higher, not because food costs more to produce, but because the industry knows more about them, individually.
At the center of this shift is dynamic pricing. The practice is not new. Airlines, hotels, and ride-sharing platforms have used it for years, and consumers, however begrudgingly, accept the logic. Groceries are different. Food is not a discretionary purchase. It is a necessity, and the social contract around food pricing has long been grounded in predictability and fairness. That contract is now under pressure.
Crucially, this is no longer just an online issue. With the rapid adoption of digital shelf labels, dynamic pricing can now be deployed inside physical grocery stores. Prices can change in real time and potentially vary by location, timing, or consumer profile. The line between online and in-store pricing is disappearing, bringing algorithmic price-setting directly into the aisles.
Charging different consumers different prices for the same food, in the same store, at the same time, simply because an algorithm decides so, crosses an ethical line. Evidence from the United States suggests this is already happening. A recent investigation by Consumer Reports, the Groundwork Collaborative, and More Perfect Union asked 437 shoppers in four cities to purchase identical grocery baskets online at the same time. Nearly three-quarters of items appeared at multiple prices, with some products showing as many as five different price points. On average, price differences reached 13 percent per item and about 7 percent across entire baskets.

At one Seattle grocery store, identical baskets ranged from roughly $114 to $124, a spread of more than nine dollars on a single order. Extrapolated over a year, researchers estimate that such pricing variability could cost a family up to $1,200 annually. For households already struggling with food affordability, that is not insignificant.
Once pricing becomes individualized—or appears arbitrary—trust erodes quickly. Consumers are no longer comparing stores; they are unknowingly being compared against each other. Whether online or in-store, the checkout ceases to be a level playing field.
Platforms argue these are limited tests to optimize pricing. But consumers never consented to be part of experiments involving essential goods. Algorithms are trained on data—purchase history, location, loyalty activity—and when pricing decisions are opaque, “randomness” begins to resemble profit maximization by design. Digital shelf labels simply make this easier to execute, faster and at scale.
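To make concrete what such opaque, profile-driven pricing could look like, here is a deliberately simplified, hypothetical sketch. No retailer is known to use this exact logic; the profile fields, thresholds, and weights below are invented purely for illustration:

```python
from dataclasses import dataclass

@dataclass
class ShopperProfile:
    # Hypothetical signals a pricing algorithm might ingest;
    # these field names and weights are invented for illustration only.
    loyalty_member: bool
    avg_weekly_spend: float       # inferred from purchase history
    price_checks_per_trip: float  # a proxy for price sensitivity

def personalized_price(base_price: float, profile: ShopperProfile) -> float:
    """Return a per-shopper price: a markup for shoppers the model
    deems price-insensitive, a small discount for deal-hunters."""
    adjustment = 0.0
    if profile.avg_weekly_spend > 150:
        adjustment += 0.05        # +5% for high spenders
    if profile.price_checks_per_trip < 1:
        adjustment += 0.04        # +4% if the shopper rarely compares prices
    if profile.loyalty_member:
        adjustment -= 0.02        # small "loyalty" offset
    return round(base_price * (1 + adjustment), 2)

# Two shoppers, same item, same store, same moment:
casual = ShopperProfile(loyalty_member=True, avg_weekly_spend=200,
                        price_checks_per_trip=0.5)
hunter = ShopperProfile(loyalty_member=True, avg_weekly_spend=90,
                        price_checks_per_trip=4.0)
print(personalized_price(4.99, casual))  # 5.34
print(personalized_price(4.99, hunter))  # 4.89
```

The point of the sketch is that none of this is visible at the shelf: two loyalty members buying the identical item simultaneously are quoted different prices, and neither has any way to know why, or that it happened at all.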
Canada is already grappling with food affordability and declining trust in pricing. Governments are debating transparency and codes of conduct precisely because consumers feel squeezed. Introducing opaque, AI-driven price variability into this environment—especially in physical stores—would only worsen that distrust.

This is not a rejection of technology. AI can reduce waste, improve forecasting, and strengthen supply chains. But using it to quietly test how much more consumers will pay, without disclosure or consent, breaches a fundamental expectation of fairness.
If two people buy the same food, from the same store, at the same time, they should pay the same price. Full stop. If pricing experiments exist, consumers should be told. And if fairness cannot be guaranteed for essential goods, regulators should intervene—decisively.

How does this work? I can see how prices could vary by location or timing, but not by consumer profile — unless the store scans you before you hit the checkout, and shows different shoppers different prices when they’re standing in front of items.
Here’s how it works, Ferdy. They use info and permissions from your loyalty profile (i.e., PC Express & Scene), combined with your purchase history, online browsing, and social media scraping for more info about you (through their partnerships with Meta), and then they curate a profile for each of us by tokenizing our data and making it “anonymous.” Some use Bluetooth Low Energy in their price displays to track your footsteps or ping your phone via Bluetooth (again using permissions from your loyalty profile), and measure how you react to their displays and ads in store and online. Then they know, when you come in store, that you are profile X and usually buy A, B & C. They use AI to predict what you will buy next, and what you will pay for it, based on your curated profile.
“Dr.” Charlebois is merely one of 40,000,000 Canadians who will soon be prey to this outrageous corporate affront to civility, sanity, and decency being perpetrated on those who buy groceries. Given his inability to write in either plain English or French, maybe one of the other grocery customers in Canada would be better placed to help parse this latest abomination: translating this entire piece would be a chore far more tedious than simply finding a better voice to publish. We can begin by naming the crime and proposing that the people should not wait for their easily distracted politicians to fix it.
This is ROBOTIC PRICE GOUGING.
The headline should read: Canadians Across the Country Are Blocking the Doors to Grocery Stores Until ROBOTIC PRICE GOUGING Stops!
Subhead: Surveillance Software IS NOT “Loyalty” Card Sweet Talk. MASS ROBO-PICKPOCKETING Is Grand Larceny.
I have filed a complaint with the Canadian government regarding this very issue. We need AI regulation – stat! Especially for dynamic pricing in the grocery sector. Access to healthy and nutritious food is a recognized human right! I sent a 10-page paper on grocery store tech to Dr. Charlebois on December 9, 2025. Interesting that he never responded to me but posted this article a week later…