The Forrester Wave for Search Marketing Agencies is out and…


Acronym was ranked as one of the top 12 search marketing agencies by research firm Forrester.

In fact, The Forrester Wave™: Search Marketing Agencies, Q4 2017 report noted that the five Strong Performers, including Acronym, offer a “competitive option.”

The report evaluated the strengths and weaknesses of 12 vendors against 25 criteria spanning current offering, strategy and market presence.

Acronym received the highest possible score in the following criteria:

· Collaboration
· Account Management
· Global Execution
· Performance

“Acronym did well in this year’s study due to its ability to collaborate with other agencies and internal stakeholders,” the report said. “It also received high praise from client references for its account management…”

Forrester said in conducting this research it found search marketing agencies have reached what it called “full maturity” because they have mastered SEO and paid search.

“Every agency in this Forrester Wave demonstrated that they have a good framework or methodology in place to create a client’s SEO strategy and each agency uses proprietary or third-party tools that help automate SEO tasks like keyword research or site auditing,” the report said.

What’s more, Forrester said non-traditional search engine expertise and agency strategy are key differentiators among search marketing agencies.

These are two key Acronym attributes. In fact, the news from Forrester comes on the heels of awards from two industry bodies lauding Acronym’s work in voice-activated queries, proof that Acronym’s understanding of modern end-user interactions across mobile devices and digital assistants such as Alexa, Siri, Cortana and Google Assistant is quite advanced.

“Acronym is leading the next generation of search agencies because we’re not only becoming more AI- (or, more specifically at this point, machine-learning-) led, we’re broadening our service offerings to clients,” said CMO Mike Grehan. “We were pioneers in the first generation of search agencies specializing in SEO and paid search. Now, we are focused much more on understanding searcher intent and consumer behavior throughout their journeys. That means we’re well positioned to deliver in social, display, content, discovery and more. Simply put: The path to purchase is fragmented today – and brands need a Strong Performer to navigate.”

AcroBabble – Going (Creatively) Digital – July 16, 2015


Going (Creatively) Digital


Williams Sonoma, Visa Checkout Partner On TrueView Shoppable Videos

One of the first brands to get into the kitchen of YouTube’s nascent commerce-enabled video service TrueView is Williams Sonoma, in partnership with Visa Checkout. The National Retail Federation reports that consumers can peruse a series of videos created by global food lifestyle network Tastemade and purchase items like glasses, cocktail plates and platters directly from the video. Visa Checkout compresses 44 fields of information into just a few clicks—delivering the latest digital iteration of instant gratification.


Facebook Hopes Consumers Will Like Brands In Their News Feed

The latest from Mark Zuckerberg and Co. is a program that allows users to pin brands to the top of their Facebook news feeds. The company hopes it will assuage the concerns of marketers whose messages got buried in the feeds and make them top-billing stars again. According to BloombergBusiness, FB users can select a group of preferred friends whose updates will always appear at the top of the home page. Then users can decide which retail and consumer brands, news organizations or interest groups they’ve Liked should appear at the top as well.


Buyers Look The Other Way With Regard To Attention-Based Ad Metrics

Amid the continuing evolution of—and search for—better digital advertising engagement metrics, it would appear that attention-based measurement isn’t catching fire beyond a handful of online publishers. Digiday quotes buyers from shops like Media Kitchen and Mediacom North America saying they’re still focused on purchasing clicks, impressions and audience. One publisher at the forefront of attention-based performance is The Financial Times, which gets kudos for trying to advance the industry’s quest for 100% ad viewability. Chief among the concerns about attention-based performance is a divergence between branding and click-through rates as KPIs.


Twitter Testing Direct Response Ad Products

Twitter’s top management flux isn’t stopping the company from rolling out new products for direct response advertising. One such offering enables marketers to showcase apps within a video ad, giving potential downloaders a chance to understand how the app or product works. Previously, brands could showcase apps solely through pictures. Reuters reports that tests involving a select group of advertisers will lead to the new ad products being widely available later this year.


C-Suite Moves

Nancy Richardson to Clearly as Chief Marketing Officer, from lululemon athletica, where she had been VP, Digital and Brand Strategy.

Jen Grant to Looker as Chief Marketing Officer, from Elasticsearch, where she was Chief Marketing Officer.

Freddy Mangum to AirTight Networks, from Lastline, where he was Chief Marketing Officer.

Steve Guberman to Chief Merchandising Officer at US Foods, where he had been SVP, Merchandising and Marketing Operations.


Barbara Cooperman to Kroll as Chief Marketing Officer, from The College Board, where she was Chief Marketing Officer.

What’s Next For Yahoo After Impending Maps Exit?


By Kelly Marcus

In the most recent sign of Yahoo’s ongoing resource realignment, Yahoo Maps officially reaches the end of its road at the end of June—just shy of a decade’s journey.

This comes as a shock to few SEOs, given the minimal reach Yahoo was able to muster versus competitors like Google and Apple. For example, according to ComScore, the average multi-platform reach of Yahoo Maps is around five million people. To put that in perspective, the multi-platform reach of Google Maps is around 144 million.

While Yahoo’s market share seems significant as an independent number, the shutdown will likely have minimal impact on local search given the popularity of Google and Apple Maps. The shutdown applies only to the standalone Yahoo Maps site; mapping technology will still be available on Yahoo Search and its various platforms, such as Flickr.

So what’s next for Yahoo in the mapping sector? It seems most likely the company will find a third-party mapping technology provider to take over mapping services, removing them from internal Yahoo management, similar to what Yahoo accomplished with the Yahoo-Bing search deal back in 2009. Moreover, Yahoo needs to be a player in mobile search, so it can’t do without a maps feature.

The most likely result of the Yahoo Maps suspension will be an increase in referral traffic from Google Maps and Apple Maps, as frequent Yahoo users are forced to find alternative platforms.

Local businesses should place an emphasis on having up-to-date and accurate information in their Google Maps and Bing Maps listings. Local listing management systems are helpful when managing citations, NAPs, and other business information.
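As a sketch of what a listing-management check might do, here is a minimal NAP consistency check in Python. The platform names and listing data are invented for illustration, not pulled from any real listing API:

```python
from collections import Counter

# Each platform maps to its NAP (Name, Address, Phone) record;
# any mismatch is a citation that needs cleaning up. Data is invented.
listings = {
    "Google Maps": ("Acme Cafe", "1 Main St", "555-0100"),
    "Bing Maps":   ("Acme Cafe", "1 Main St", "555-0100"),
    "Yelp":        ("Acme Cafe", "1 Main Street", "555-0100"),
}

def inconsistent_platforms(listings):
    # Treat the most common NAP record as canonical, then flag
    # every platform whose record differs from it.
    canonical, _ = Counter(listings.values()).most_common(1)[0]
    return sorted(p for p, nap in listings.items() if nap != canonical)
```

A real listing management system does much more (fuzzy matching of “St” vs. “Street,” phone normalization, and so on), but the core job is exactly this kind of cross-platform comparison.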

Monitoring analytics will also be essential in determining whether this suspension strongly impacts traffic. If it does, you want to make sure your local assets and landing pages are optimized. Most experts believe it will be difficult to determine any impact on traffic until early to mid-August.

Because the suspension is limited to the standalone maps service, local businesses should continue to monitor traffic from Yahoo, as the company will keep its mapping features in its search engine. If Yahoo users remain dedicated to the platform, they may continue to use Yahoo as a local listing search platform even without maps. In any case, businesses should still maintain their Yahoo listings and monitor referral traffic.


Kelly is an SEO Analyst on the Travel Team at Acronym Media. She supports the team in executing a variety of efforts, including on-page, off-page, and technical optimizations, with a special focus on local search optimization. She has enhanced the SEO strategy for high-profile clients such as Four Seasons, Viceroy Hotels & Resorts, and Denihan Hotel Group.

Kelly graduated from Penn State University in 2014 with degrees in Public Relations and Psychology. Prior to Acronym, she gained experience in public relations, marketing, paid search, and social media.

In The World Of Spoken Search, Viv Lurks In The Shadows Of Cortana, Google Now And Siri


The race for the best artificial intelligence-fueled, voice-enabled digital assistant gets more competitive with each passing week.

Thus the ides of June brought news that Toshiba USA will add a new key to Windows PCs that summons Microsoft’s Cortana—and in so doing possibly replaces the venerable ESC key. Wherever the key ends up, it’s obvious that Microsoft is seeking to maximize awareness and use of its digital assistant by locking in key real estate, reports The Register.

At roughly the same time, BGR broke the news that the beta version of Hound from SoundHound had eaten the lunch of the best-known personal assistant dogs in a recent test. Seems that Hound leads the pack in learning its master’s intentions based on signals from questions that could naturally lead to related questions.

Meanwhile, the folks who invented Siri and sold it to Apple for $200 million are at it again. This time the technology is called Viv and its creators are seeking nothing less than world domination of voice search via artificial intelligence and the Internet of Things.

In February, TechCrunch reported that Viv Labs had raised $12.5 million in Series B funding led by Iconiq Capital—AKA the “Silicon Valley billionaires club.” The report said that the round “was oversubscribed and values the company at north of nine figures.”

What makes Viv different from Siri? While the latter can only respond to what it’s been expensively and painstakingly programmed to understand, Viv will constantly learn to parse natural language and complicated questions by plucking information from third parties throughout the Internet. The more it does, the smarter it gets. If all goes well, eventually Viv will respond to voice commands when you tell it to close your garage door from the office or turn off your car’s headlights in the airport parking lot once you’re on the plane.

In the May 15 issue of Esquire (once you get past the cover image of Charlize Theron in fetching black), writer John H. Richardson does a deep dive on Viv and its three founders, Chris Brigham, Adam Cheyer and Dag Kittlaus. They believe that what they are doing will completely change the way advertising works online—including paid search.

Take travel. The Esquire story relates how Cheyer asks Viv, “What’s the status of JetBlue 133?” The technology knows enough to access live flight data and quickly retorts, “Late again, what’s new?” The exercise progresses: “What’s the best available seat on Virgin 351 next Wednesday?” Viv taps into Travelport, which is the back end for Expedia and Orbitz, finds 28 seats, and then pulls up specifics on those 28 seats. Why is that? Because Viv already knows—via a private, linked database of Cheyer’s personal information titled “My Stuff”—that he prefers aisle seats and extra legroom.

The whole thing ends with Viv using Cheyer’s credit card information to book the seat.

How could this scenario influence (as in, reduce) the amount of money that marketers like Priceline pay to buy keywords? Too soon to tell. But such an early valuation of Viv “north of nine figures” suggests that some pretty smart people believe it could.

There’s not much on the Viv website just yet, but you can subscribe to receive news going forward. And one other thing: “If you’re an insanely talented developer with an interest in the future of A.I., we’re always looking for top-tier people to join our team. Please inquire at [email protected] for more information.”


By Acronym staffers




MarTech Strategies Should Include Intent-Based Solutions



Scott Brinker, the MarTech advisory board, and thousands of community members created another meaningful forum hosted in San Francisco. One could describe the MarTech Conference in early April as a mashup of marketing and technology professionals. More than 1,100 attendees came together to reimagine what’s possible, get real with one another, and share how to put skin in the customer experience (CX) game to win hearts, minds, and sustainable transactions.

While at the event, I kept seeing the connection between MarTech strategies and the intent-based solutions point of view of search and digital marketing agency Acronym. Why? Because customers are in control of virtually the entire buyer’s journey, searching for information and insights across a variety of devices, 24/7. Digital-savvy customers now expect meaningful experiences across all touch points.

Customer Experience: Not a Collection of Silo Activities

I think many marketers and technologists focus too much on their respective activities and not enough on transformative, customer experience strategies that are unique to their brand, business goals, and organizational structure. We need to reach across the aisle and ask one another, “what is customer experience success that truly matters?”

To further illustrate customer experience opportunities and urgency, Econsultancy and IBM released on April 1 the results of a new study, “The Consumer Conversation,” drawing on more than 1,200 responses from consumer and brand surveys. Jay Henderson, Director of Strategy at IBM, and I discussed some of the findings on camera.

Study insights include a personalization disconnect between how well brands think they know customers and customers’ actual experiences with brands; poor experiences drive more customer churn; and customers are more willing to provide information to brands they trust:

  • Over 90% of marketers agree that personalizing the customer experience is critical to their success
  • Despite this agreement, consumers are not getting the personalized experience they seek—only 21% said the communications from the average company are “usually relevant” while only 35% said those from their preferred retailers are “usually relevant”
  • 49% of consumers changed service providers in the last 12 months, with experience-related factors playing a prominent role
  • Of those who changed providers, 30% switched due to provider failure, with 51% citing customer experience as the number one factor
  • 72% of consumers said they would share their geographic data with a brand they trust, an 89% increase over their willingness with the average brand
  • 61% of consumers would be willing to share their personally identifiable information with a brand they trust, a 65% increase over the average company

Organize for Exceptional Customer Experiences

“For organizations serious about making themselves customer-centric the most difficult challenge is structure, not technology. Most are making shallow changes in usability or merchandising, but they’re not prepared to reorganize their teams around the customer,” said Stefan Tornquist, VP Research at Econsultancy and principal author of the study.

Stefan added, “A customer-centric strategy is inherently long-term. It insists on investments that increase customer value in exchange for gains in retention and lifetime value. That’s in sharp contrast to the quarter-to-quarter rabbit wheel that so many companies are caught in.”

Download ‘The Consumer Conversation’ report from Econsultancy.

Commit to Customer Experience Success

Customer experience achievement requires a team approach including interdisciplinary skills and perspectives. Game-changing collaboration and innovation will help us win and keep winning in the fast-moving MarTech environment – and keep meeting and exceeding customer experience expectations. Read my CMSWire article to gain more insights from 19 presenters at the #MarTech Conference.

Follow creatorbase on Twitter to connect with more digital and sustainable business creators, ideas, and content. Share this blog with others and let me know what you think; thanks for your time!


Many years of value creation as a consultant, team leader, individual contributor, and entrepreneur at companies including creatorbase, Selectica, Oracle, Ektron, Sitecore, Lyris, Return Path, Nokia, Creator Connection, Mark Monitor, Cisco Systems, GlobalFluency, Sun Microsystems, Philips N.V., Pandiscio Design, Elm Products and CBS television. A patent holder with agency, Fortune 500, media, and startup successes; degree in Business Administration from Notre Dame de Namur University. Enjoy collaborating with other creators, open conversations, family/friends time, and tennis. Follow @creatorbase.

Impala App: The Future of Smart Photography Is Here


Many smartphone owners are familiar with photo filter apps like Instagram, AfterLight, Aviary and Fused, all of which help produce the most aesthetically pleasing image. These are the offspring of a worldwide innovation competition combined with the ever-increasing power of smartphone chips. In addition, digital giants like Baidu, Facebook and Google provide server-based image identification features.

Into this arena steps Impala App from Euvision Technologies (recently acquired by Qualcomm). It’s believed to be the first “smart photography” technology solution that both modifies and categorizes photos completely within a smartphone—no servers required. Available free from the Apple iTunes and Google Play stores, Impala automatically creates a series of labeled folders (such as animals, automobiles or mountains) and sorts images into those folders to help users locate them.

Moreover, Impala has taken content blocking to a higher level, one of the app’s more interesting and exciting capabilities. Just as software like Photoshop prevents users from uploading a file containing a scanned federal banknote, Impala applies its recognition engines to help media platforms moderate content.

As an example of these impressive capabilities, Impala is trained to recognize hands, chosen to represent the color, texture, and shape of certain unwanted scenes. Once a hand becomes visible in the camera’s field of view, it is pixelated and the recording button is made inactive, preventing capture of the image.
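The moderation flow described above can be sketched in a few lines. This is a toy illustration, not Euvision’s actual engine: frames are plain lists of pixel values, and the classifier verdict is passed in as a flag:

```python
# Toy sketch: when a classifier flags a frame region, pixelate it and
# disable the record button. The classifier itself is a stand-in.
def pixelate(frame, block=4):
    # Replace each block-by-block square with its top-left pixel value,
    # producing the blocky "pixelated" look.
    h, w = len(frame), len(frame[0])
    return [[frame[(r // block) * block][(c // block) * block]
             for c in range(w)] for r in range(h)]

def moderate(frame, flagged):
    # Returns (possibly pixelated frame, whether recording is allowed).
    if flagged:
        return pixelate(frame), False
    return frame, True
```

Swap the stand-in flag for an on-device classifier’s verdict (hands today, an adult-content classifier tomorrow) and you have the shape of the feature the article describes.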

Now replace that hand recognition with an adult-content classifier and this technology goes to another level. It also does not store questionable images in the cloud. Could this be the end of celebrity nude hacking scandals? Hollywood should be cheering. TMZ, not so much.


Jaime Nash
Art Director

5 Tips for Successful Analytics Deployment


By David Sprinkle

These five tips will help minimize the amount of stress, confusion, and aggravation during analytics deployment. Are you using these strategies?

Over the past five years or so I’ve worked on the deployment of many dozens of analytics implementations. Many have been low-stress and on time; others not so much. Recently I’ve been thinking a lot about what made the good ones go smoothly, and I’ve boiled it down to five tips that I think are universal. If every organization just kept these in mind, I think there would be a lot less confusion and aggravation both during and after analytics deployment.

1: Identify SPOCs

In addition to the helpfully rational perspective Vulcan team members can provide, choosing SPOCs (Single Points of Contact) goes a really long way toward making sure your deployment will get finished on time. For an analytics deployment, you’ll need at least two: one for the marketing stakeholders, and one for the development stakeholders.

These people don’t necessarily need to be subject matter experts on analytics (that’s what the pros like me are for!) — they just need to be the ones responsible for communicating, following up, and committing to timelines.

In my experience, it’s usually better to identify a stakeholder who’s already part of your marketing and development teams, rather than bringing on an external project manager, because an embedded team member is going to have much better visibility into the other projects that are going on, dependencies, and organizational/technical limitations that may affect your deployment timeline.

2: Line Up Development Resources

I’m not sure how this happens, but sometimes I’ll be brought in to a deployment to which the marketing team is very committed — they have budget, executive buy-in, and a clear vision — but nobody seems to have told the developers that they’re going to be needed. Even if the tool is simple, even if you’re using a tag manager, 100 times out of 100 you are still going to need to invest some development resources to deploy an analytics tool successfully.

A related (and fortunately easily solved) issue is getting the development resources at the right time, for the right amount of time. It usually takes a few weeks after the project launches before developers will need to be brought in, and sometimes they’ll be there for the kickoff and then drift away since there are no action items for them immediately. Clear communication at the start of the project is the obvious solution here.

Similarly, if you’re using a modern, complex tool, it’s important to set the expectation that the code may need to be tweaked after its initial release. Sometimes developers I’ve worked with have seemed surprised that our quality assurance process uncovered bugs they need to resolve…which kind of makes me wonder what they thought we were doing QA for in the first place!

3: Use a Phased Approach

Modern websites may have well more than 100 custom dimensions and metrics they want to track. Tackling the deployment all at once can be a Herculean effort that overwhelms developers and causes challenges for quality assurance stakeholders as well. All too often, what winds up happening is the official deployment has big chunks missing, leading to a loss of confidence in the data — which can be fatal for stakeholder buy-in and makes analysis a lot harder down the road.

The best deployments I’ve done have broken the requirements up into smaller chunks — for instance, phase one might just be the global tagging and “every page” customizations, while phase two includes forms, internal search, and videos, or something like that. Another key advantage to this is that it lets you gracefully handle scope creep: if marketers keep coming up with new things they want to track, you can schedule them into later phases and stick to your current deadlines.
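The phase-plus-scope-creep idea can be sketched as a tiny bit of project plumbing. The phase contents below are invented examples (echoing the global tags / forms split above), not a real measurement plan:

```python
# Requirements grouped into deployment phases. Scope-creep items get
# scheduled into the latest planned phase so current deadlines hold.
phases = {
    1: ["global tags", "every-page customizations"],
    2: ["forms", "internal search", "videos"],
}

def add_requirement(phases, item):
    # New asks land in the last phase, never the one already underway.
    phases[max(phases)].append(item)
    return phases
```

The point is less the code than the discipline: a new tracking request changes a later phase’s backlog, not the current phase’s deadline.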

4: Don’t Let the Best Be the Enemy of the Good

This is kind of a reiteration of my last point, but it bears repeating. Some deployments I’ve worked on keep getting pushed back or even taken back to the drawing board because they’re not absolutely perfect. I’ve seen organizations decide to wait to deploy analytics until their site relaunch, which then gets pushed back month after month; I’ve seen marketers refuse to sign off on a deployment because 0.00006 percent of the data looks wrong to them (yes, that actually happened).

Web analytics, as we like to say, is about trends, not exact numbers. The limitations of our technology and the unpredictability of human behavior mean we need to learn to be comfortable with some degree of ambiguity in our data. And if you want to establish trends, the sooner you get started the better.

5: Socialize It!

Hey, you just invested a lot of money and energy into getting all of this data about your customers! Let the rest of your organization know about it! Give people access, hold training sessions, start an internal wiki. Does that mean people are going to come up with a bunch of weird questions? Of course. Will you be overwhelmed with help desk requests? Maybe. (I might write another post soon about ways to deal with that — stay tuned.) But on balance, I think that developing a data-driven culture of empowered individuals leads to better business decisions than keeping your analysts squirreled away like some high priests of statistics.

So those are some of the things I’ve learned working in this field, and I hope reading this can help your team get through deployment and into the fun data analysis stuff (yeah, I do consider that fun, thanks). If you’ve got any other tips, feel free to share them in the comments.


Originally published at ClickZ


An expert on analytics architecture and integration, David specializes in the innovative design and implementation of analytics solutions that deliver both global “big picture” insights and detailed performance metrics. David leads Acronym’s Analytics Practice as well as its Adobe Preferred Partnership, wherein Adobe subcontracts work to David’s team.

David also has extensive experience working with major analytics, bid management and reporting platforms, and is noted for his expertise in integrating such solutions into companies’ larger marketing and business infrastructures. David is a Certified Omniture Professional and a veteran industry speaker. His Client portfolio includes such leading brands as Four Seasons Hotels and Resorts, SAP, The Tribune Company, HP, Scholastic and Humana, among others.

Google Unwraps Android 5.0 Lollipop


By Jaime Nash

Google is aiming to be the Sultan of Sweet (design) with the universal visual language of its new Android 5.0 Lollipop user interface, visions of which began to emerge in early November under the company moniker Material Design.

Far more than simply added pixels, Material Design, a cross-platform design language, synthesizes the rules and principles of good graphic design with the innovation of technology and science. Lollipop itself replaces KitKat.

Google’s goals with Lollipop:

• Create a visual language that represents a metaphor based on shared user experience
• Develop a single system that allows for a unified experience across Google’s platforms and devices
• Write The Bible on interactive design

Lollipop brings a raft of new features to Android devices, from a single typeface and specialized color palette to greater sharing across devices. The design emphasizes flat elements and bold colors. Lollipop enables new uses of color (including full-bleed images) along with a modern, crisp, clean, cutting-edge graphic design. Think of the space within your Android devices not as flatlands but as small topographical environments and you get the picture.

And while Lollipop might have been created in service of consumers and customers, it is most certainly intended to help designers end poor designs that lead to confusing products that are difficult to navigate.

Early press on Lollipop highlighted such non-visual improvements as greater battery life and a Screencasting feature that transmits your device’s screen to your TV via a Chromecast dongle.

Want more info? See for yourself.


Jaime Nash
Art Director

Decrypting Google’s HTTPS Security Carrot


By Winston Burton

Originally published at MediaPost

Maybe the folks at the National Security Agency know the long-term motives behind Google declaring that websites’ use of HTTPS will now be an organic search ranking factor, although an extremely small one. After all, if any entity could scan Google’s internal emails — the way the company itself does with Gmail — it would be the NSA.

In the meantime, SEOs are left to ponder the consequences of Google’s adding HTTPS encryption to its approximately 200 search ranking signals. It’s eerily reminiscent of the company’s move a while back to add the speed at which web pages load to its algorithmic bag of tricks. After all, how do security and speed relate to content authority and relevance?

The bottom line for now is that SSL/HTTPS is a lightweight signal that will have little impact on search engine rankings. How lightweight? Google says that using SSL/HTTPS has affected “less than 1% of global queries.” Since Google’s mission is to make the entire web secure, all of a site’s web pages (the entire domain) must be secure, not just the shopping cart and checkout pages.

Sites that have high-quality content, good domain authority, fresh content, quality links, social endorsements and a good user experience will continue to rank highly in search engine results. For sites that already secure the entire domain and protect personal information (banks, financial institutions, etc.), this modification may provide a small boost in rankings. In practice, this signal is most likely to be used in tie-break situations rather than as a general indication of quality or popularity. It’s not likely that a “tier 1” indexed result will have a “tier 2” result leapfrog over it due to a slight difference in protocol.
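A tie-break signal is easy to illustrate. In the sketch below (scores and URLs are invented, and real ranking is vastly more complex), HTTPS only reorders results whose primary relevance scores are exactly equal:

```python
# Results carry a primary relevance score; the HTTPS flag acts only
# as a tie-breaker between equally scored pages.
results = [
    ("http://a.example", 0.92),
    ("https://b.example", 0.92),
    ("http://c.example", 0.95),
]

def rank(results):
    # Sort by relevance first; HTTPS breaks ties among equal scores.
    return sorted(
        results,
        key=lambda r: (r[1], r[0].startswith("https")),
        reverse=True,
    )
```

Note that the 0.95 page stays on top despite being plain HTTP; the protocol only decides the order of the two 0.92 pages, which is the “tier 1 vs. tier 2” point above.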

Having said this, there is no reason to rush to heed Google’s latest policy edict, which is actually more of a request at this point. When it comes to providing users with exactly what they are looking for, consumers don’t perceive Google as underrepresenting major brands and the most popular sites on the web. But Google does not own the World Wide Web, nor does it have access to all of its content. So it can request, suggest or even tease with a “ranking carrot,” but the democratic element of the web is free to use (or lose) the protocol of its choice.

SEOs would be advised to conduct keyword research on the 100 most popular queries at Google and the top 100 websites for those queries, and see how many quickly make the change to SSL. Sites that already have SSL/HTTPS on a portion of their web pages will probably not experience a lift in search engine rankings; once the entire domain is secure, visibility may increase to a small degree.

One fear is that websites may experience a decrease in rankings in the switch from HTTP to HTTPS once redirects have been set up. Such sites may see a small decline in traffic depending on when Google crawls them and indexes the new URLs. For a big site with thousands of pages, it could take time for Google to crawl and index the newly secured pages. In this case, it’s a good idea to create a new XML sitemap to inform Google about the new URLs so the engine can find and index them.
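That last step is mechanical enough to sketch. The snippet below builds a minimal sitemap in the standard sitemaps.org format; the URLs are placeholders for a site’s new HTTPS pages:

```python
# Build a minimal XML sitemap listing the new HTTPS URLs so Google
# can discover and index them sooner. URLs are illustrative only.
from xml.etree import ElementTree as ET

def build_sitemap(urls):
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for u in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = u
    return ET.tostring(urlset, encoding="unicode")

pages = ["https://www.example.com/", "https://www.example.com/contact"]
sitemap_xml = build_sitemap(pages)
```

Write the result to `sitemap.xml` at the site root and submit it via Search Console (or reference it in `robots.txt`) so the crawler picks up the secured URLs without waiting for a full recrawl.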

For sites already in the process of a total redesign/redirect exercise, it may be worthwhile to secure the entire site and move to HTTPS. Incorporating HTTPS/SSL into the redesign or redirect project could yield an increase in search engine visibility, with the risk of losing existing rankings, traffic, and links already factored into the equation.

Otherwise, just sit tight and adhere to the adage “If it ain’t broke, don’t fix it.”


Winston joined Acronym in 2014 with over ten years in search marketing. Prior to joining Acronym, Winston was the VP of SEO at Havas Media, one of the world’s top ten global ad agencies. He started the SEO practice for Havas and built the practice to include Clients such as Choice Hotels, Fidelity, Exxon, Volvo and Marc Jacobs to name a few. Winston spearheaded SEO strategy including content marketing, mobile, link building, and all technical areas of SEO. Winston’s career also included the SEO Manager role at Rosetta and time at Zeta Interactive.

What Is a Data Layer?

By David Sprinkle

Originally published at ClickZ

Data layers provide a standardized format for developers, making it much easier to track or make changes. Are you using data layers in your analytics strategy?

My team and I spend a lot of time helping companies implement analytics tools and other tracking pixels on their websites. And quite often, we run into questions like these:

• “We’ve been using Webtrends, but could you help us switch to Adobe?”
• “Here’s a new pixel for our new remarketing program. Could you tell us what we need to do to install it?”
• “We’ve redesigned our booking engine. What do we need to do to make sure tracking is still working?”

These are the bread-and-butter challenges for analytics implementation. Back in the old days (meaning, like, two years ago), we’d usually need to write up new instructions for each tool, provide detailed data validation for each new pixel, and spend many hours of our clients’ developer resources (and consulting dollars!) handling these changes. All that, just to replicate tracking that was already in place in a slightly different format!

It seems like there should be a better way…and there is – it’s called a data layer.

I like to think of data layers as being similar to a patch bay in a music studio. You’ve got a bunch of outputs (from microphones, amps, etc.) and then a bunch of inputs (like your mixing board, monitors, and recording devices). Instead of snaking dozens of cables from device to device, which inevitably get tangled – and good luck if you need to make any changes later – a patch bay lets you wire each input and output to a dedicated slot on the back of your patch bay. Then you can take small cords and plug output A into input D, like an old-time telephone switchboard operator. It’s far easier to make changes, and you can keep all of the wiring out of sight and out of mind.

A data layer, in essence, is the same thing (only better!). It consists of a standardized format for your developers to output all of the info that you need for tracking in a user-friendly JavaScript object. Then, usually with the help of a tag management system (TMS), we can interpret that data and send it out in all directions to the various analytics tools in your toolkit.
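In practice, that object often looks something like the snippet below. This is a minimal sketch using the common `dataLayer` array convention (in the browser it would live on `window.dataLayer`); the field names are illustrative, not a required schema.

```javascript
// Declare the data layer once per page, then push one
// standardized object describing that page. Field names
// here are illustrative, not a required schema.
var dataLayer = [];

dataLayer.push({
  pageType: 'product',
  productId: 'blue-widget-5',
  productPrice: 20.0,
  currency: 'USD'
});
```

The tag management system then reads this one object and maps each field into whatever format each individual tracking pixel expects.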

Let’s take a sale confirmation page as an example. You might have 10 different tracking tools, each with its own special complexity, on this page. All of them want to know that it’s the sale confirmation page, the order ID, and how much revenue you made; some of the tools also want to know which products were sold, plus other info such as shipping details and discount amounts.

Unfortunately for your developers, each tool wants the data in a different format. So they need 10 different sets of instructions covering the unique quirks of each tool. And, if a new developer comes in and needs to modify anything or copy this tracking over to a new system, THEY need to read 10 manuals, too.

With a data layer, on the other hand, the developers just send one signal that basically says, “This is a sale confirmation page. Order ID = 12345. Revenue = $200.00. Product = blue widget #5.” Then in the TMS, my team can route the relevant data to each of the various pixels.
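A rough sketch of that routing step is below. The two “pixel” formats are invented for illustration; they are not any vendor’s real API, and the mapping functions are hypothetical names.

```javascript
// The page emits one standardized signal in the data layer.
var sale = {
  pageType: 'saleConfirmation',
  orderId: '12345',
  revenue: 200.0,
  product: 'blue widget #5'
};

// Inside the TMS, each tag gets a small mapping from the shared
// data layer fields to the format that pixel expects.
// Both target formats here are hypothetical.
function toAnalyticsHit(d) {
  return { event: 'purchase', transaction_id: d.orderId, value: d.revenue };
}

function toRemarketingHit(d) {
  return { order: d.orderId, rev: d.revenue.toFixed(2), item: d.product };
}

var analyticsHit = toAnalyticsHit(sale);
var remarketingHit = toRemarketingHit(sale);
```

The developers only ever maintain the `sale` object; the per-pixel quirks live in the TMS mappings, where the analytics team can adjust them without touching page code.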

This makes the implementation much easier to maintain, because any new developer can look at the code and understand what’s supposed to go into the “order ID” field. And if they get that right, then all 10 of the tracking pixels will continue to receive the right data.

In the past year or so we’ve been involved in many implementations that have switched to a data layer format, and I’ve been impressed with the results. It’s easier for the developers, leads to quicker deployment, and makes it a lot easier to maintain even complex implementations over time. I wouldn’t be surprised if within a few years the best CMS tools output one automatically…but there’s no reason to wait for them before making the switch.


An expert on analytics architecture and integration, David specializes in the innovative design and implementation of analytics solutions that deliver both global “big picture” insights and detailed performance metrics. David leads Acronym’s Analytics Practice as well as its Adobe Preferred Partnership, wherein Adobe subcontracts work to David’s team.

David also has extensive experience working with major analytics, bid management and reporting platforms, and is noted for his expertise in integrating such solutions into companies’ larger marketing and business infrastructures. David is a Certified Omniture Professional and a veteran industry speaker. His client portfolio includes such leading brands as Four Seasons Hotels and Resorts, SAP, The Tribune Company, HP, Scholastic and Humana, among others.