Artificial intelligence is quietly transforming the mechanics of online shopping. Every product search, price comparison, and recommendation increasingly involves automated bots scanning websites, analyzing prices, and shaping the digital marketplace behind the scenes.
For companies, those bots can drive traffic and sales or quietly distort data and erode revenue. The challenge for businesses is no longer simply blocking automated traffic. It is understanding which bots help the business grow and which ones undermine it.
Tom Howe, Director of Insights Engineering at Hydrolix, studies these patterns daily through vast streams of web traffic data. His work sits at the intersection of engineering, analytics, and human behavior. With a background in social sciences and more than a decade working in integrated data sciences, Howe focuses on translating complex behavioral data into insights companies can act on.
“Knowledge over fear,” Howe says early in the conversation. The phrase reflects a philosophy that contrasts sharply with the instinct many organizations have when they detect automated traffic. Rather than treating bots as a single category of threat, Howe views them as part of a broader ecosystem that businesses must learn to interpret.
The Blurred Line Between Good and Bad Bots
For many companies, bots fall into a simple binary. They are either good or bad. Howe says that assumption quickly breaks down when examined through data.
“No, the line can sometimes be very unclear,” he explains. “What’s worse is it’s all relative. A good bot can accentuate your goals in the marketplace, and a bad bot can degenerate your goals and achievements. Bad bots can also behave in ways that aren’t understood or are misleading.”
He emphasizes a third category that is often misunderstood.
“There is a third type of bot. Malicious bots are outwardly trying to get you. Their primary goal, what they are programmed to do, is to sabotage your website or system, and sometimes people confuse malicious bots with bad bots. The major distinction is that there can be opportunities for bad bots to become good bots, by AI or data engineers influencing the bot behavior. But malicious bots cannot become good.”
Understanding that distinction determines whether a company blocks the traffic or learns how to manage it.
Detecting Bots in the Data
Bots rarely interact with websites the way humans do. Their patterns reveal themselves through the data they leave behind.
“The most noticeable bot detection is that they hit sites in a different way than a human would,” Howe says. “The patterns that humans typically use as they click, browse, or navigate page views are currently not replicated by any bot.”
Human browsing behavior tends to follow predictable rhythms. Visitors pause on product pages, scroll through details, or move between categories while comparing options.
Bots move differently.
“The data derived from actual user interaction on a website is what sets the precedent for what may trigger a bot event,” Howe explains. “For example, how much time someone spends on a product page or a domain-specific page separates bots from humans.”
Other indicators include unusual traffic clusters or patterns that do not match the context of the site.
“Companies can be tipped off by behaviors that are uncharacteristic, like all activity coming from the same location or IP address or actions that don’t make sense for the context of the website.”
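Those signals can be expressed as simple heuristics. Here is a minimal Python sketch of that kind of check; the thresholds and field names are illustrative, not anything Howe or Hydrolix prescribes:

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class PageView:
    ip: str
    dwell_seconds: float  # time spent on the page before the next request

def looks_automated(views, min_dwell=2.0, max_ip_share=0.5):
    """Score a traffic sample against two of the signals Howe cites:
    implausibly short dwell times, and activity clustered on one IP.
    All thresholds here are illustrative, not production values."""
    if not views:
        return False
    # Humans pause on product pages; bots often request a page and move on.
    avg_dwell = sum(v.dwell_seconds for v in views) / len(views)
    # Activity concentrated on a single address is uncharacteristic of organic traffic.
    top_ip_hits = Counter(v.ip for v in views).most_common(1)[0][1]
    return avg_dwell < min_dwell or top_ip_hits / len(views) > max_ip_share

# A burst of sub-second hits from one address trips both signals.
burst = [PageView(ip="203.0.113.7", dwell_seconds=0.3) for _ in range(50)]
print(looks_automated(burst))  # True
```

A real pipeline would score many such features together, but the principle is the one Howe describes: the baseline comes from how actual users behave on that specific site.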
Understanding Bots Requires Asking Better Questions
Bot detection is often framed as a purely technical challenge. Howe argues that interpretation matters far more than technology.
When asked how much of understanding bot behavior comes down to technology versus simply asking better questions of the data, Howe answers succinctly.
“20/80.”

Twenty percent technology, eighty percent asking better questions of the data.
Within organizations, the same dataset can produce very different interpretations depending on who is analyzing it.
“Within a company, there are different motivators for employees, even different motivators within departments,” Howe says. “Individuals can also classify data differently depending on how they’re categorizing it.”
He offers a practical example.
“From the perspective of a marketing manager, bots crawling a website in order to find deals means my marketing must be working because a consumer is shopping. However, from a business leader’s point of view, this may lead to inaccurate website data or missed revenue.”
Understanding automated activity therefore requires evaluating outcomes rather than isolated behaviors.
“In order to really grasp what bots are doing, you have to recognize what you may value in behavior versus someone else and also look into outcomes, not just immediate actions.”
The Five-Million-Dollar Bot-Blocking Mistake
Companies often respond to suspicious traffic with the simplest possible solution. They block everything that looks automated.
Howe says that instinct frequently backfires.
When one enterprise SaaS company blocked what it believed were harmful bots, the decision cost the organization roughly five million dollars in lost revenue.
“Honestly, it’s way too common and at the same time more common than we know,” Howe says. “Companies underreport because of optics and bad PR.”
The financial damage extended beyond the immediate loss of sales.
“In the retro of the SaaS company, leadership acknowledged that it wasn’t one day of lost sales. It was a long-term de-indexing crisis that lasted days.”
By blocking automated traffic broadly, the company had inadvertently blocked search engine crawlers responsible for indexing its pages.
“You could be blocking search engine crawlers or other beneficial bots,” Howe explains. “Plus, it can take months for a company’s SEO ranking to bounce back after this type of algorithmic erasure.”
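This is why major search providers document a reverse-then-forward DNS check: it lets a site confirm that traffic claiming to be Googlebot or Bingbot really is, before any blocking rule applies. A rough Python sketch of that check:

```python
import socket

def is_verified_search_crawler(ip: str) -> bool:
    """Reverse-then-forward DNS verification, the method Google and
    Microsoft document for their crawlers, so legitimate indexers are
    exempted from blocking rules rather than swept up in them."""
    try:
        host = socket.gethostbyaddr(ip)[0]             # reverse lookup
        if not host.endswith((".googlebot.com", ".google.com",
                              ".search.msn.com")):
            return False
        return ip in socket.gethostbyname_ex(host)[2]  # forward confirm
    except OSError:  # no PTR record, lookup failure, etc.
        return False
```

Because user-agent strings are trivial to forge, the DNS round trip is what separates a genuine crawler from an impostor borrowing its name.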
When Bots Help Drive Revenue
Not all bots represent a threat. Some play a central role in how consumers discover products online.
Search engine crawlers from companies such as Google and Microsoft allow e-commerce sites to appear in search results.
“When a company blocks bots universally, they can lose all traffic that could be derived from this method,” Howe says.
The goal, he argues, is precision rather than prohibition.
“Bots need to be handled differently and more precisely. The goal isn’t a complete blackout; it’s reducing potential damage. Use a scalpel, not a machete, so you don’t cut off more than you expect.”
In some cases, companies can even use automated traffic to stimulate sales.
“One example could look like multiple profiles coming from the same network with the same request,” Howe explains. “Instead of blocking it, an automated ten-percent-off coupon is triggered that actually exploits the bot in order to help the user and the company.”
That approach converts automated browsing into an opportunity for conversion and loyalty.
“And this brings me back to my sociology background,” Howe adds. “In a way, bots help to manage users by manipulating their behaviors.”
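As a sketch, the trigger Howe describes might look something like the following; the threshold, coupon code, and function names are hypothetical:

```python
from collections import defaultdict

# Hypothetical in-memory tally; a production system would run against
# the streaming traffic data Howe describes.
profiles_seen = defaultdict(set)

def handle_request(network: str, profile_id: str, path: str) -> dict:
    """When several profiles on one network issue the same request,
    answer with an incentive instead of a block."""
    profiles_seen[(network, path)].add(profile_id)
    if len(profiles_seen[(network, path)]) >= 3:   # illustrative threshold
        return {"page": path, "offer": "SAVE10"}   # the ten-percent coupon
    return {"page": path}

for profile in ("a", "b", "c"):
    response = handle_request("198.51.100.0/24", profile, "/deals")
print(response)  # the third matching profile receives the coupon
```

The design choice is the point: the same signal that would justify a block becomes a conversion lever when the bot is shopping on a real customer’s behalf.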
Why Blanket Bot Blocking Hurts Businesses
Many organizations still rely on rigid detection rules that treat all automated activity as suspicious.
That approach can distort performance metrics and alienate legitimate customers.
“Bots that play by the rules need the rules to be in place,” Howe says, pointing to mechanisms such as robots.txt files that guide search engine crawlers.
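Those rules live in a site’s robots.txt file. A small illustration using Python’s standard urllib.robotparser shows how a rule-abiding crawler decides what it may fetch; the file contents here are invented for the example:

```python
from urllib.robotparser import RobotFileParser

# Illustrative robots.txt: invite crawlers in, keep them out of checkout.
ROBOTS_TXT = """\
User-agent: *
Disallow: /checkout/
Disallow: /account/
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

print(rp.can_fetch("Googlebot", "/products/widget"))  # True
print(rp.can_fetch("Googlebot", "/checkout/cart"))    # False
```

Well-behaved bots honor these directives; the file only works as a guide, though, if the company publishes and maintains it.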
Completely eliminating bots also eliminates the benefits they provide.
“Stopping traffic where it is suspicious can also eliminate real traffic, which can cost sales or users,” Howe notes.
Certain users may trigger bot detection simply because their browsing environment differs from the norm.
“Linux users can suffer from bot detection simply by being a different user agent than the norm,” Howe explains.
Privacy tools such as VPNs or specialized browsers can produce similar false signals.
“By relying on one-solution-fits-all detection rules, companies can punish real customers who may have tools like VPNs or specialized browsers in place.”
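The failure mode is easy to reproduce. Here is a deliberately naive rule of the kind Howe criticizes, sketched in Python:

```python
import re

# A one-size-fits-all rule: anything outside a "mainstream" Windows or
# macOS browser profile is treated as a bot.
MAINSTREAM_UA = re.compile(r"Windows NT|Macintosh")

def naive_is_bot(user_agent: str) -> bool:
    return not MAINSTREAM_UA.search(user_agent)

linux_firefox = ("Mozilla/5.0 (X11; Linux x86_64; rv:125.0) "
                 "Gecko/20100101 Firefox/125.0")
print(naive_is_bot(linux_firefox))  # True: a real Linux customer is flagged
```

A legitimate shopper on Firefox for Linux never matches the rule, so the site turns away a paying customer for the crime of running an uncommon operating system.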
Reading Signals from Massive Data Streams
Hydrolix processes massive volumes of traffic data generated by content delivery networks and web application firewalls. Within those streams, patterns reveal whether automated activity supports or threatens business goals.
“Knowing what the behavior of the bot is, and what its desired outcome is, is immediately necessary,” Howe says.
A bot that leads to a sale may benefit the company. A bot that scrapes sensitive information may represent a risk.
Even some scraping activity can evolve into opportunity.
“In some cases companies have found ways to license their material to companies that are intentionally scraping,” Howe says.
The critical step is determining whether the bot’s objective aligns with the company’s interests.
“Data scientists need to see if those align in order to distinguish between a good or bad bot.”
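In code, that alignment test reduces to mapping observed outcomes onto the company’s own goals; the outcome labels below are hypothetical, and in practice they would be derived from behavioral features in the CDN and WAF traffic streams:

```python
# Hypothetical outcome labels for illustration only.
ALIGNED = {"purchase", "search_indexing", "licensed_scraping"}
MISALIGNED = {"content_theft", "competitor_price_scraping"}

def classify(outcome: str) -> str:
    """Toy version of the alignment test Howe describes: a bot is
    'good' or 'bad' relative to the company's own goals."""
    if outcome in ALIGNED:
        return "good: manage, do not block"
    if outcome in MISALIGNED:
        return "bad: mitigate, or monetize via licensing"
    return "unknown: investigate before acting"

print(classify("search_indexing"))  # good: manage, do not block
```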
A Future Where Bots Negotiate on Behalf of Consumers
The next phase of e-commerce may bring an even greater shift toward automated decision-making.
Howe believes companies will increasingly market to artificial agents rather than directly to people.
“Increasingly businesses are recognizing that they are no longer marketing directly to humans, but rather to AI agents via the bot,” he says.
In that environment, software may negotiate with other software to find the best deals for consumers.
“I do see a future where it’s software meeting software,” Howe explains. “A consumer shopping is really just one machine telling another machine what the consumer’s options are and negotiating on behalf of that consumer.”
For most shoppers, that activity will remain invisible.
“The odds of an individual cracking the bot code are pretty close to zero,” Howe says. “At the present time, the only way a consumer would know is by monitoring their own digital footprint.”
A Marketplace Powered by Data
Artificial intelligence continues to reshape the digital economy. Businesses, engineers, and consumers are learning how to navigate an environment where automated systems influence nearly every interaction.
For Howe, the path forward requires curiosity rather than panic.
“It feels like we’re still at the forefront of this AI technology,” he says. “We’re learning as we go and the technology develops.”
Companies that succeed will resist the instinct to react quickly and instead invest time in understanding the data.
“We don’t fear what we don’t know,” Howe concludes. “We outsmart this new frontier with knowledge.”
