Nearly half of the Web's traffic comes from bots and crawlers, and that's costing companies a boatload of money.
That’s one finding from a report released Thursday by DeviceAtlas, which makes software to help companies detect the devices being used by visitors to their websites.
Non-human sources accounted for 48 percent of traffic to the sites analyzed for DeviceAtlas’s Q1 Mobile Web Intelligence Report, including legitimate search-engine crawlers as well as automated scrapers and bots generated by hackers, click fraudsters and spammers, the company said.
DeviceAtlas is owned by Afilias, which calls itself the world’s second-largest Internet domain name registry.
Bot technologies have long been known to account for a significant amount of traffic, but today they’re becoming more malevolent—and more expensive, said Ronan Cremin, CTO of DotMobi, a mobile content delivery company also owned by Afilias.
Bots are commonly used to generate “clicks” and false ad revenue, but in some cases, they make purchases online with the goal of influencing prices, Cremin said.
“It’s a tricky problem,” he said. “Now that it’s so cheap and easy to deploy bots, the game has changed.”
Digital marketers have long known that much of the traffic to their websites is not legitimate human traffic, and nearly all Web analytics tools attempt to filter out that non-human traffic, said analyst Frank Scavo, president of Computer Economics.
Across the sites analyzed, non-human traffic was roughly on par with human traffic at nearly 50 percent, with the report breaking it down further by source type. But filtering it out is not an easy task.
“Fraudsters go to great lengths to make their traffic appear to be human-generated,” Scavo said. “Moreover, ad sellers and marketing agencies may not be particularly interested in seeing their Web traffic numbers reduced.”
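At a basic level, that filtering comes down to checking request metadata, most commonly the user-agent string, against known crawler signatures. The sketch below is a minimal Python illustration of that approach; the signature list and record format are assumptions rather than any particular vendor's, and, as the comments note, a bot that spoofs a browser user agent passes straight through, which is exactly Scavo's point.

```python
# Minimal sketch of user-agent-based bot filtering. Illustrative only:
# the signature list and hit format are assumptions, not a real product's.
KNOWN_BOT_SIGNATURES = ("googlebot", "bingbot", "crawler", "spider",
                        "scrapy", "curl", "python-requests")

def looks_like_bot(user_agent: str) -> bool:
    """Flag a hit as non-human if its user agent matches a known signature."""
    ua = user_agent.lower()
    return any(sig in ua for sig in KNOWN_BOT_SIGNATURES)

def filter_human_hits(hits):
    """Keep only hits that do not self-identify as bots.

    A fraudster's bot that spoofs a browser user agent will pass this
    check unchanged, which is why naive filtering falls short.
    """
    return [hit for hit in hits if not looks_like_bot(hit.get("user_agent", ""))]

if __name__ == "__main__":
    sample = [
        {"path": "/pricing", "user_agent": "Mozilla/5.0 (Windows NT 10.0) Chrome/120.0"},
        {"path": "/pricing", "user_agent": "Googlebot/2.1 (+http://www.google.com/bot.html)"},
        {"path": "/tickets", "user_agent": "python-requests/2.31"},
    ]
    print(f"{len(filter_human_hits(sample))} of {len(sample)} hits look human")
```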
So, what’s a company to do?
“If you’re advertising on a per-impression or per-click basis, you need to closely scrutinize your analytics,” Scavo said. “Trust me, you’re never underpaying.”
If possible, it’s best to link Web marketing expenses to concrete business results like conversions rather than impressions or clicks, he said.
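A rough, hypothetical illustration of why conversion-based metrics hold up better than click-based ones (all figures below are invented for the example, not drawn from the report):

```python
# Hypothetical numbers showing why cost per conversion is harder for bots
# to distort than cost per click. Not taken from the DeviceAtlas report.
ad_spend = 10_000.00          # dollars spent on a campaign
reported_clicks = 50_000      # clicks reported by the ad platform
estimated_bot_share = 0.48    # assumed share of non-human clicks
conversions = 500             # completed purchases attributed to the campaign

human_clicks = reported_clicks * (1 - estimated_bot_share)

print(f"Cost per reported click: ${ad_spend / reported_clicks:.2f}")
print(f"Cost per human click:    ${ad_spend / human_clicks:.2f}")
print(f"Cost per conversion:     ${ad_spend / conversions:.2f}")
# Bot clicks make the campaign look cheaper than it really is; conversions
# are tied to actual business outcomes, so the last figure holds up.
```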
Equipped with analytics tools that can identify non-human sources, companies can also send those bot visitors to slower servers, Cremin said.
“Your main website could be significantly slowed for human visitors by bots, and that’s not a good place to be,” he said. “You can achieve significant cost savings by restricting that traffic.”
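One way to do that is to classify requests at the edge and tag them so a load balancer or proxy can steer bot traffic to a cheaper, slower pool. The Python/WSGI sketch below is illustrative only; the header name, signature list and demo app are assumptions, not a description of DotMobi's setup.

```python
# Minimal WSGI middleware sketch: tag requests that look automated so a
# downstream proxy could route them to a slower server pool. The header
# name and signature list are illustrative assumptions.
from wsgiref.simple_server import make_server

BOT_SIGNATURES = ("bot", "crawler", "spider", "scrapy", "python-requests")

class BotRoutingMiddleware:
    def __init__(self, app):
        self.app = app

    def __call__(self, environ, start_response):
        ua = environ.get("HTTP_USER_AGENT", "").lower()
        is_bot = any(sig in ua for sig in BOT_SIGNATURES)
        # Mark the request; a proxy in front of the app could use this
        # tag to pick the "slow" upstream pool instead of the main one.
        environ["traffic.class"] = "bot" if is_bot else "human"

        def tagged_start_response(status, headers, exc_info=None):
            headers = list(headers) + [("X-Traffic-Class", environ["traffic.class"])]
            return start_response(status, headers, exc_info)

        return self.app(environ, tagged_start_response)

def demo_app(environ, start_response):
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [f"served as {environ['traffic.class']} traffic\n".encode()]

if __name__ == "__main__":
    with make_server("", 8000, BotRoutingMiddleware(demo_app)) as httpd:
        httpd.serve_forever()
```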
Another option is to restrict the content served to bot visitors.
“We don’t offer some site features when we know the visitor is not human,” Cremin said.
That will vary with the nature of the business, but cutting off bots' ability to buy tickets, for example, could be a good move.
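A sketch of what that gating might look like, using Flask for brevity (the routes and the detection heuristic are illustrative assumptions, not Cremin's implementation): the content page stays open to everyone, including search crawlers, while the purchase endpoint is withheld from visitors that identify as bots.

```python
# Sketch of withholding one feature (ticket purchases) from probable bots.
# Routes and heuristic are assumptions made for the example.
from flask import Flask, abort, request

app = Flask(__name__)

BOT_SIGNATURES = ("bot", "crawler", "spider", "scrapy", "python-requests")

def is_probable_bot() -> bool:
    ua = request.headers.get("User-Agent", "").lower()
    return any(sig in ua for sig in BOT_SIGNATURES)

@app.route("/events/<event_id>")
def event_page(event_id):
    # Content pages stay available to everyone, including search crawlers.
    return f"Details for event {event_id}"

@app.route("/events/<event_id>/buy", methods=["POST"])
def buy_ticket(event_id):
    # The purchase path is the feature withheld from non-human visitors.
    if is_probable_bot():
        abort(403, description="Automated ticket purchases are not allowed")
    return f"Ticket purchased for event {event_id}"

if __name__ == "__main__":
    app.run(debug=True)
```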
Companies should also remember that some bots are created just to obtain information that might be easier to get with a company-provided API, said Michael Facemire, a principal analyst with Forrester.
“If I find some information that is useful to me right now but also would be useful over time, as a developer, the first thing I do is see if there’s an API to get that information,” he said. “If the answer to that is ‘no’, the next easiest way to get it is to write a bot or crawler to regularly scrape the site for that information.”
Since crawlers negatively affect a company’s website, it’s important to use analytics: first to see what pages are being pulled, and then to decide whether a public API could expose some of that data, he said.
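A minimal sketch of that first step, assuming a standard combined-format access log (the file name and bot signature list are placeholders): tally which paths self-identified bots are pulling, and treat the most heavily hit ones as candidates for a public API.

```python
# Minimal sketch: scan a combined-format access log, count which paths are
# being pulled by self-identified bots, and report the top ten. The log file
# name and signature list are placeholders; real analytics pipelines do more.
import re
from collections import Counter

LOG_LINE = re.compile(
    r'"(?:GET|POST|HEAD) (?P<path>\S+) [^"]*" \d{3} \S+ "[^"]*" "(?P<ua>[^"]*)"'
)
BOT_SIGNATURES = ("bot", "crawler", "spider", "scrapy", "python-requests")

def top_bot_paths(log_path: str, n: int = 10):
    counts = Counter()
    with open(log_path, encoding="utf-8", errors="replace") as log:
        for line in log:
            match = LOG_LINE.search(line)
            if not match:
                continue
            if any(sig in match["ua"].lower() for sig in BOT_SIGNATURES):
                counts[match["path"]] += 1
    return counts.most_common(n)

if __name__ == "__main__":
    # Pages that bots hammer repeatedly are candidates for a public API.
    for path, hits in top_bot_paths("access.log"):
        print(f"{hits:8d}  {path}")
```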
Ultimately, it’s a game of cat and mouse, said analyst Roger Kay, president of Endpoint Technologies Associates.
“The bad guys always devise a workaround, and the good guys do the best they can under the latest assault to filter out extraneous traffic,” Kay said.