Fraudulent Web Traffic Continues to Plague Advertisers, Other Businesses
An Adobe study found that about 28% of website traffic likely came from bots and other sources of “non-human signals”
Web traffic is rife with bots and other non-human visitors, making it
difficult for ad and media businesses to understand who is
visiting their sites and why, according to new findings from Adobe.
In a recent study, Adobe found that about 28% of website traffic
showed strong “non-human signals,” leading the company to
believe that the traffic came from bots or click farms. The
company studied traffic across websites belonging to thousands
of its clients.
Adobe is currently working with a handful of clients in the travel,
retail and publishing industries to identify how much of their
web traffic has non-human characteristics. By weeding out that
misleading data, brands can better understand what prompted
consumers to follow their ads and ultimately visit their
websites and buy their products.
“It’s really about understanding your traffic at a deeper level. And
not just understanding, ‘I got this many hits.’ What do those
hits represent? Were they people, malicious bots, good bots?”
said Dave Weinstein, director of engineering for Adobe.
While hardly the first study
of online fraud, Adobe’s findings are one more indication
of how the problem has roiled the fast-changing ad, media and
digital commerce industries, while prompting marketers to
rethink their web efforts.
Non-human traffic can create an “inflated number that sets false
expectations for marketing efforts,” said Mr. Weinstein.
Marketers often use web traffic as a measure of how many of their
consumers saw their ads, and some even pay their ad vendors when
people see their ads and subsequently visit their website.
Knowing more about how much of their web traffic was non-human
could change the way they pay their ad vendors.
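The adjustment described above can be sketched in a few lines. This is a minimal illustration, not Adobe's method: the visit records, field names, and the idea that each visit carries a precomputed non-human flag are all assumptions made for the example.

```python
# Hypothetical sketch: recompute billable ad-driven visits after
# excluding traffic flagged as non-human. The record schema
# ("referred_by_ad", "non_human_signal") is invented for illustration.

def billable_visits(visits):
    """Return (raw ad-referred count, count excluding non-human visits)."""
    raw = sum(1 for v in visits if v["referred_by_ad"])
    human_only = sum(
        1 for v in visits
        if v["referred_by_ad"] and not v["non_human_signal"]
    )
    return raw, human_only

visits = [
    {"visitor_id": "v1", "referred_by_ad": True,  "non_human_signal": False},
    {"visitor_id": "v2", "referred_by_ad": True,  "non_human_signal": True},
    {"visitor_id": "v3", "referred_by_ad": False, "non_human_signal": False},
    {"visitor_id": "v4", "referred_by_ad": True,  "non_human_signal": False},
]

print(billable_visits(visits))  # (3, 2)
```

Under this toy data, a vendor billing on raw ad-referred visits would charge for three visits, while only two came from traffic without non-human signals.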
Clients have told Adobe that the ability to break down human and
non-human traffic helps them understand which audiences matter
“when they’re doing ad buying and trying to do re-marketing
efforts, or things like lookalike modeling,” he said.
Advertisers use lookalike modeling to reach online users or
consumers who share similar characteristics to their specific
audiences or customers.
Ad buyers can also exclude visitors with non-human characteristics
from future targeting segments by removing the cookies or unique
web IDs that represented those visitors from their audience data.
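That exclusion step amounts to a set difference over visitor identifiers. A minimal sketch, assuming the cookie IDs and the upstream bot-flagging step (everything here is invented for illustration, not an ad platform's actual API):

```python
# Hypothetical sketch: drop suspected non-human visitor IDs from an
# audience segment before re-targeting. IDs are made-up examples.

def clean_segment(segment_ids, flagged_bot_ids):
    """Return the segment with suspected non-human visitor IDs removed,
    preserving the original order of the remaining IDs."""
    flagged = set(flagged_bot_ids)  # set lookup keeps the filter O(n)
    return [vid for vid in segment_ids if vid not in flagged]

segment = ["cookie-a", "cookie-b", "cookie-c", "cookie-d"]
suspected_bots = ["cookie-b", "cookie-d"]

print(clean_segment(segment, suspected_bots))  # ['cookie-a', 'cookie-c']
```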
In addition to malicious bots, many web visits also come from
website “scrapers,” such as search engines, voice assistants or
travel aggregators looking for business descriptions or pricing
information. Some are also from rivals “scraping” for
information so they can undercut the competition on pricing.
While bots from big search engines and aggregators tend to overtly
present themselves as bots, and so can easily be discounted from
human web traffic, a small percentage of scrapers generate such
visits even if they’re not intentionally posing as human visitors,
said Mr. Weinstein.
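Discounting the overt bots is straightforward because major crawlers declare themselves in the `User-Agent` request header. A minimal sketch of that check follows; the token list is a small illustrative sample, not an exhaustive registry of crawler signatures:

```python
# Minimal sketch: self-identifying crawlers can be discounted from
# human traffic with a substring check on the User-Agent header.
# KNOWN_BOT_TOKENS is a tiny illustrative sample, not a complete list.

KNOWN_BOT_TOKENS = ("googlebot", "bingbot", "slurp", "duckduckbot")

def is_self_identified_bot(user_agent: str) -> bool:
    """True if the User-Agent openly declares a known crawler."""
    ua = user_agent.lower()
    return any(token in ua for token in KNOWN_BOT_TOKENS)

print(is_self_identified_bot(
    "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
))  # True
print(is_self_identified_bot(
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) Firefox/115.0"
))  # False
```

The harder problem, per the article, is the remaining scraper traffic that carries no such declaration and so looks human to a check like this one.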
“We realized that with the growth of things like Alexa and Google
Home and other assistants, increasingly more and more traffic is
going to be automated in nature,” he said. “In the long term,
real humans at real browsers will be a diminishing portion of traffic.”
While there aren’t any plans to monetize a tool that can analyze
non-human web traffic for clients, Adobe eventually could use it
to sell something like a “bot score,” said Mr. Weinstein. For
now, the company will likely just build the function into its
existing analytics products.