Category: Analytics

Snapchat Introduces Trends Tool


Snapchat continues to boast 293 million daily active users who share visual moments from their lives. For marketers who want to improve their engagement on Snapchat, the platform has introduced a new tool called Snapchat Trends that highlights the most popular keywords so you can better engage with your audience.

How This Impacts You:

Market Research can now be conducted within Snapchat through trend data showing changes in conversation volume for targeted keywords, behaviors, and categories. This can help marketers shift their messaging focus to better connect with Snapchatters.

Messaging and Copywriting can be adjusted to reflect keyword usage within Snapchat in a way that ensures marketers create contextually relevant content for target audiences.

Better User Profiles and Personas can be created based on the behavioral insights from Snapchat Trends. By capturing more intelligence about target audiences’ daily lives, including when and how they shop, marketers can better align media strategies with customers’ core interests to drive purchases.

Key Moments Identification becomes easier. We know that Snapchatters use the platform to celebrate major milestones. Now, with Snapchat Trends, marketers can identify the “hashtag holidays” that matter to their customers. From National Ice Cream Day to International Women’s Day, brands looking to “own” a relevant moment in time can utilize this new data for content planning.

Competitive Research on Snapchat is made easier with this new Trends tool. Marketers can not only understand customer sentiment around brands or products, but they can also gain competitive insights on how those products fit in the market. By analyzing multiple keywords in one query, you can evaluate customer conversations to determine brand health as compared to the competition.

In other words, with this new Trends tool, Snap can provide insights into top organic trends, helping brands monitor community chatter and understand top Snaps for trending topics. This helps brands learn more about potential consumers and the Snapchat community as a whole, so they can better research organic behaviors to determine the market fit for their vertical or product/service.

If you’d like to learn more about how to leverage Snapchat to drive brand engagement, contact us today.

Understanding The Google Page Experience Metrics


Google’s mission is to provide its users with the information that best satisfies their needs. Thus, one of the most important signals for Google is whether or not the end user has a positive experience in their search journey. If your site’s content satisfies the information need and provides a good page experience for the user, your website will be rewarded with better rankings.

In recent months, Google announced that Page Experience’s full role in search rankings will roll out by the end of August 2021 with an update to its search ranking algorithm. This update impacts how Google evaluates the ‘page experience’ of websites and includes visual indicators in search results to highlight sites that offer a great page experience. This means search marketers have a window of about one month to perform key optimizations and ensure their websites provide the best user experience to searchers.

What is Page Experience?

Google uses a set of signals to detect whether users are likely to have a positive browsing experience on a site and its content. These signals include how quickly a page loads (page speed) to give users what they want in the moment; whether the page renders properly on mobile devices; whether the site uses secure encryption and does not pose a security threat; and whether the site shows disruptive pop-ups or interstitials. All of these fall under what Google calls Page Experience signals. Within these signals, Google is introducing a new set of page speed metrics called Core Web Vitals. These metrics measure load times for the main content elements on a page (Largest Contentful Paint), the delay before a page is ready for user interaction such as clicks and scrolling (First Input Delay), and the extent to which content elements shift position as the page loads and renders (Cumulative Layout Shift).
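For teams that want to measure these metrics on their own pages, Google publishes an open-source web-vitals JavaScript library. Below is a minimal sketch of how it can be wired up; the unpkg CDN URL and console logging are illustrative choices, not requirements.

```html
<!-- Minimal field-measurement sketch using Google's open-source web-vitals library -->
<script type="module">
  // Import the three Core Web Vitals helpers (v1/v2 API of the library)
  import {getCLS, getFID, getLCP} from 'https://unpkg.com/web-vitals?module';

  // Each callback fires when a metric value is ready; in production you would
  // send these values to your analytics endpoint instead of logging them.
  getCLS(console.log); // Cumulative Layout Shift: visual stability
  getFID(console.log); // First Input Delay: readiness for user interaction
  getLCP(console.log); // Largest Contentful Paint: main-content load time
</script>
```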

What Does This Mean?

Although page speed has always been a key factor for SEO marketers, the new Core Web Vitals provide additional, clearer metrics for how we should optimize page load times.

Additionally, the introduction of a visual indicator in search results will notify users that certain pages have been determined by Google to offer a positive page experience.

Google is no stranger to providing users with such indicators; previous examples include AMP icons, PageRank, mobile-friendly labels, and more. Google has not yet detailed what this visual indicator will look like, but testing is underway and we expect the label to roll out with the update. Users are likely to interpret the new label as a “seal of approval” from Google, so its presence or absence could have a substantive impact on click-through rates in search results.

Brands can prepare for the upcoming updates by prioritizing efforts to improve page speed across their web presence. This includes identifying pages with longer load times, optimizing the file weights of images and animations, and removing unnecessary code from pages. The Core Web Vitals report in Google Search Console is also an excellent place to start understanding how your site is performing in these areas.
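As one small, concrete illustration of the image point above, two low-effort HTML attributes help on both fronts: native lazy-loading defers offscreen images to reduce initial page weight, and explicit dimensions let the browser reserve space so the layout doesn’t shift when the image arrives. The file names below are placeholders.

```html
<!-- Explicit width/height reserve layout space, reducing Cumulative Layout Shift -->
<img src="hero-compressed.jpg" width="1200" height="600" alt="Hero banner">

<!-- loading="lazy" defers this offscreen image until the user scrolls near it -->
<img src="footer-promo.jpg" width="600" height="300" alt="Promo" loading="lazy">
```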

If you’d like an audit of your website in preparation for these new metrics, please contact us. We’re happy to help.

POV from Winston Burton, SVP, SEO, Acronym.


Google Delays Chrome’s Cookie-Blocking for 2 Years


Google announced the company is delaying its plans to block third-party cookies until late 2023 as it works to reconcile protecting user privacy with enabling advertisers to deliver personalized ads.

Chrome’s Engineering Director Vinay Goel said in a blog post:

“We need to move at a responsible pace, allowing sufficient time for public discussion on the right solutions and for publishers and the advertising industry to migrate their services. This is important to avoid jeopardizing the business models of many web publishers which support freely available content.”

One part of Google’s rationale for pushing back its plan is centered around concerns that blocking cookies now might encourage tracking companies to use more controversial tactics like fingerprinting to gather browser configuration details.

Meanwhile, the company has faced backlash around both its use of cookies across the web and its plans to block them. In fact, earlier this week, the European Union said it is investigating Google’s plan to remove cookies as part of a wide-ranging inquiry into allegations that Google has abused its prominent role in advertising technology.

And, The Wall Street Journal reported that Google has separately pledged to give the U.K.’s competition watchdog at least 60 days’ notice before removing cookies to review and potentially impose changes to its plan, as part of an offer to settle a similar investigation. That probe stemmed from complaints that Chrome’s removal of cookies would give an advantage to ads on Google’s own products, like YouTube or Search, where Google will still be able to do individual-level targeting.

In the U.S., Google’s cookie-replacement plan was raised in a December antitrust lawsuit against the company brought by Texas and nine other U.S. states.

Google has been testing several new tools to replace various functions of third-party cookies as part of what it calls the Privacy Sandbox. The first such replacement technology, dubbed Federated Learning of Cohorts, or FLoC, is intended to allow advertisers to target cohorts of users with similar interests, rather than individuals, in order to protect their privacy.

Acronym’s SVP of Performance Media, Gregg Manias reacted to the news:

“I’m not really shocked by this. We have seen over time that privacy search engines like DuckDuckGo blocked it; then we saw large publishers like The New York Times block it; then we saw competitor browsers like Firefox block it. I think the death of this plan by Google was last week, when Amazon blocked FLoC right before Prime Day.”

Google, of course, plays a central role in the online advertising ecosystem, as the company owns the dominant tools used to broker the sale of ads across the web. Cookies, small bits of data stored by web browsers to track users across the web, are widely used in the industry, including in Google’s Chrome browser, which has 65% of the market globally.

Acronym’s EVP of Analytics, Stephanie Hart added:

“Google needs a way to provide advertisers with the ability to target users, and it doesn’t seem that the current version of FLoC is it. Google is having a difficult time balancing the demand from regulators and users for privacy against the need for revenue. The market will continue to evolve as Google develops solutions to this dilemma.”

Meanwhile, as the Search giant seeks to find a resolution, Acronym’s SVP of SEO, Winston Burton recently shared some of the other ways marketers can capture customer information through permission-based tactics, including content which, when done right, captures users’ interest at every stage of the funnel.

Google said it expects to complete testing of all of its new cookie-replacement technologies, and integrate them into Chrome before late 2022. Then the advertising industry will have a nine-month period to migrate their services, during which time Google will monitor adoption and feedback. The final phaseout of cookies will happen over three months in late 2023, the company said, adding that it will publish a more detailed timeline.

In the meantime, if you need assistance planning for these changes, please contact us. Our experts can help you navigate these ever-changing waters so you deliver the personalized experiences your customers expect in a way that still respects their right to privacy.

The Potential Impact of Google Allowing Users to Opt-Out of Tracking


Ahead of an upcoming developer conference, Google announced it will let Android users opt out of being tracked by the apps they download from the Google Play Store. 

This move mirrors Apple’s roll-out of iOS 14, which gave consumers the option to opt out of tracking via the IDFA, the device identifier that tracks consumer behavior across apps.

According to the announcement, this will be launched via a Google Play services update in “late 2021.” 

Why is this important?  

Since Apple’s iOS 14 announcement in 2020, advertisers have been waiting to hear how Google would respond to its competitor. With the two major smartphone platforms (Apple’s iOS and Google’s Android, which powers Samsung and most other handsets) distributing apps through the App Store or the Google Play Store, nearly all social media app users will have the option to disable tracking.

Considering that more than half of all worldwide web traffic so far this year was generated via mobile devices, the option to disable tracking will significantly impact first-party data.  

What is the impact on brands? 

This update will further decrease audience sizes, bringing higher CPMs and less qualified targeting via website data, much as Apple’s iOS 14 update does. However, the severity of the decrease will depend on whether the opt-out is pushed to users automatically, or remains a setting users must seek out and disable manually.

What action should brands take? 

First-party data will be the name of the game in this privacy-first era. We recommend brands continue to find ways to leverage and foster first-party audiences, whether by creating a newsletter that requires an email address, building a Facebook Store where shoppers interact, or leveraging video ads that can track people who watch most of the video or engage with your ad.

If you’d like help identifying the best approach for your brand and your specific audience, please contact us. Our experts are available to help.


POV by Acronym’s Paid Media Team


Understanding the Coronavirus Pandemic Through Data Visualization


By Jonah Feld

We are not epidemiologists. We are experts in data visualization, and we put that expertise to use to better understand the COVID-19 crisis.

The New York Times collected this dataset, which it “made available to the public in response to requests from researchers, scientists and government officials who would like access to the data to better understand the outbreak.”

The following eight interactive visualizations combine this COVID-19 data with US Census data, allowing users to filter and explore critical information at a state and county level.

We took the raw numbers for cumulative cases and cumulative deaths by county and date and turned them into something the user can explore to better understand what the raw data means.

We also created measurements, like ratios, rates of change, percentages, etc., that add context. And we added filters to let users narrow the scope to create their own visuals that answer the most important question in data interpretation: “Compared to what?”

Through these graphs, we hope to visually communicate how serious this disease is, as well as its overall impact to date and the effectiveness of efforts so far to flatten the curve.

We hope these graphs help us all better understand this difficult situation.

Stay safe.


Descriptions of the types of graphs:

Treemap: A treemap visually represents parts of a whole. In contrast to a geographic map, the states and counties are sized in proportion to the selected value, not land area.

Animated Scatter: An animated scatterplot is a fantastic visualization when a line chart won’t do. Click on a single dot to trace progression over time.

Racing Chart: A racing chart is like a flipbook of bar charts over time. For a single measurement, it communicates changes in comparative rank, differences, and scale in an easily digestible clip.

Historic Table: This sortable table shows most metrics as of the point in time selected in the date picker. Right-click on a state to drill down to its counties, or skip to all counties nationally.

Daily Chart: These charts show daily activity (rather than cumulative activity as of each date). The vertical orientation encourages comparisons of daily measures over a shared x axis for dates.

Log Chart: A logarithmic scale is best for representing exponential functions. A steady slope represents a fixed rate of exponential growth, and each horizontal gridline indicates a relative change in magnitude, typically 10x (a short derivation follows these descriptions).

Since Inception: This chart replaces the dates on the x axis with the number of days since reaching a common starting point: either 100 cases or 10 deaths. By aligning start points, the differences in rate of growth are more easily observed.

Small Multiples: Small multiples, popularized by visualization guru Edward Tufte, are a matrix of similar graphs using the same scale and axes, allowing them to be easily compared.
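To make the Log Chart description concrete: if cumulative cases N(t) grow exponentially from a starting count N_0 at rate k, taking logarithms turns the curve into a straight line:

```latex
N(t) = N_0 e^{kt}
\quad\Longrightarrow\quad
\log_{10} N(t) = \log_{10} N_0 + \frac{k}{\ln 10}\, t
```

So on a log chart the slope of the line is proportional to the growth rate k, and each horizontal gridline marks a 10x change in magnitude.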


Jonah Feld is a Director of Product Development at Acronym. He specializes in data visualization and data integration for Keyword Objects, Acronym’s proprietary Enterprise Keyword Management platform for professional search engine marketers. He has 17 years of experience in SEO/SEM, with a heavy focus on analytics and reporting, having worked at agencies and in consultative roles developing business intelligence solutions.

Acronym’s Adobe Analytics Rockstars Take the Stage


As American Idol prepares to move to ABC next year, Adobe is auditioning rockstars for its own Analytics-themed version, which will take place during Adobe Summit 2018.

Analytics Idol should be familiar to Summit attendees, but this so-called pre-conference Analytics Rockstar Tour is new – and kicked off this week in New York with Acronym’s chief analytics officer, David Sprinkle, and director of analytics, Janelle Olmer, on stage.

The tour continued in Chicago on October 19 and it will be in San Francisco on November 1 before the finalists are determined. They will present their tips for a final face-off in Las Vegas in March.

Adobe calls Analytics Idol a “fun, fast-paced and informative session where Adobe Analytics users help their peers become rockstars by sharing their top tips and tricks.”

Olmer said Adobe asked Acronym to contribute tips based on work it has done with clients to leverage data to solve business questions.

Sprinkle and Olmer were invited to present after Adobe screened their tips based on “how innovative, practical, and valuable they [were] as well as how broadly they could be used by analysts at other companies in different industries,” Adobe said.

“We’re looking for tips that would help your analytics peers uncover new and deeper insights or perform their daily tasks more efficiently or effectively,” Adobe added.

Acronym’s tips included how to enable accurate reporting in any local currency using only a few new events, as well as how to determine the drop-off rate for online form errors.
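The article doesn’t walk through the implementation, but for readers curious what local-currency tracking typically involves in Adobe Analytics, here is a rough, hypothetical AppMeasurement sketch; the event number and product SKU are invented for illustration, and this is not necessarily the tip presented on stage.

```javascript
// Hypothetical sketch: local-currency revenue tracking in Adobe Analytics.
// s.currencyCode declares the currency of the hit so Adobe can convert
// revenue into the report suite's base currency.
s.currencyCode = "EUR";            // currency the customer actually paid in
s.events = "purchase,event10";     // event10: invented counter for local-currency orders
s.products = ";SKU-123;1;49.99";   // quantity and revenue in the local currency
s.t();                             // send the hit to Adobe Analytics
```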

“We were selected to participate when Acronym was recognized for outstanding use of Adobe Analytics and applying the tool in creative ways to solve business challenges,” Olmer said.

And, when voting was final in New York, Acronym’s rockstars came in second.

“A lot of what we presented was taking features that have been available in Adobe for a long time and using them alongside stuff that is new,” Olmer said. “Clients who are maximizing the value of Adobe Analytics are the ones who can leverage not just the new stuff, but also the old stuff. And, for clients, it’s important to have a partner who knows it all and is up to date. Adobe is making changes every month that can impact where you can be more effective with tracking and reporting.”

And, luckily for Acronym clients, Olmer, Sprinkle and the rest of the analytics team are always on top of these changes and ready to innovate.

3 Things You Should Know About Google’s New Global Site Tag


Marketers have access to more consumer data than ever, which is in part why Acronym is expanding its Analytics practice. That, of course, includes hiring digital analytics guru Olaf Calderon, who joined Acronym in August. Here, Calderon weighs in on Google Tag Manager (GTM) and explains what’s hot – and what’s not – about Google’s new Global Site Tag (gtag.js):

Google’s recent release of the Global Site Tag (gtag.js) certainly created some waves in the analytics and marketing sectors. Then again, to be fair, any time Google releases anything, it’s sort of a big deal, and there are lots of eyes on it.

This release is a new code library that will eventually replace Google Analytics’ analytics.js library. The main idea is to make the coding for Google Analytics, AdWords and Firebase more uniform and improve inter-tool communication. Eventually, the goal is to have just one tag that sends data to all Google tools, instead of having separate tags for Analytics, AdWords, etc.

So without getting into the technical features and frameworks, here are three basic but important things you should know about gtag.js:

1. It is in beta.

Gtag.js is currently in beta. This means it has limited functionality so far, but it’s laying the groundwork for future improvements. The purpose, after all, is better communication between the different tools, and Google has been moving toward a unified structure to make that communication seamless. It is very important to have the AdWords, DoubleClick, Firebase, Google My Business and Analytics libraries follow a global structure for data, and this release is a big step closer.

So should you update your code now? Well, beta is a testing platform, so you’ll be taking a risk and assuming all responsibility. Issues with beta releases include not having all features enabled (much as analytics.js initially lacked display features) and not having SLA coverage. In addition, things may change, move, or stop working. Our suggestion is to set it up in a test environment and start getting familiar with it; eventually, it will be ready for you. You’ll find more about compatibility below.
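For that kind of test setup, the basic installation snippet looks like the following, per Google’s developer documentation; ‘UA-XXXXXXX-1’ is a placeholder for your own tracking ID.

```html
<!-- Global Site Tag (gtag.js); 'UA-XXXXXXX-1' is a placeholder tracking ID -->
<script async src="https://www.googletagmanager.com/gtag/js?id=UA-XXXXXXX-1"></script>
<script>
  // The dataLayer queues commands until the library finishes loading
  window.dataLayer = window.dataLayer || [];
  function gtag(){dataLayer.push(arguments);}
  gtag('js', new Date());

  // One 'config' call per Google product; additional products would get
  // their own IDs here, which is the point of the unified tag.
  gtag('config', 'UA-XXXXXXX-1');
</script>
```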

2. It is not for Tag Management Systems (TMS).

While gtag.js lives on the www.googletagmanager.com domain, it is important to note that it is not Google Tag Manager (GTM). Google representatives have said there are two main reasons that the GTM domain was used:

Some Google clients must whitelist domains in order to comply with internal security policies. For these organizations, GTM is generally already whitelisted, so loading gtag.js from googletagmanager.com means no additional work is required.

Some pieces of GTM are included in the gtag.js file, though only select code snippets are shared between the two, such as the logic for loading various request templates and libraries automatically and efficiently, and for translating data passed to the platforms into the correct format. The gtag.js library is roughly 20KB in size, but it deduplicates a lot of the logic required to implement various tags.

There is no web interface for gtag.js, which means everything is governed via the code. Nobody “owns” a container, like they would in GTM, nor can any tags be added without developers’ knowledge. You can deploy gtag.js through another tag manager if you want, without worrying about whether this creates a “backdoor” where others can add tags within gtag.js without your knowledge.
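To illustrate what “governed via the code” means in practice: configuration that GTM would expose in a web interface is instead passed as a parameter object on the config command. Both flags below are documented gtag.js settings; the tracking ID is again a placeholder.

```javascript
// All gtag.js configuration lives in code; there is no web UI to change it.
gtag('config', 'UA-XXXXXXX-1', {
  'anonymize_ip': true,      // truncate visitor IP addresses before storage
  'send_page_view': false    // suppress the automatic pageview hit
});
```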

3. It is backwards-compatible.

The new library allows for some notable changes and simplifications to custom on-page Google Analytics code, such as a simplified syntax for tracking events. But Google has made sure that gtag.js fully supports analytics.js code syntax. This means that you can make the change in your snippet and not worry about other inline code breaking things. Think of gtag.js as an add-on for now. Keep using the code syntax you have in place, but start experimenting with the new features and definitely start planning a strategy for migration over time.
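To illustrate the simplified event syntax mentioned above, here is the same video-play event in both libraries, using the sample values from Google’s documentation:

```javascript
// analytics.js: category, action, and label are positional arguments
ga('send', 'event', 'Videos', 'play', 'Fall Campaign');

// gtag.js: the action comes first, with category and label as named parameters
gtag('event', 'play', {
  'event_category': 'Videos',
  'event_label': 'Fall Campaign'
});
```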

It’s also likely analytics.js will be supported for a very long time. Just think: The old urchin.js code still works and it’s been deprecated for over ten years.

So, overall, gtag.js is a signpost on the road toward a unified Google code base. It’s worth getting familiar with how it works and with some of the new syntax, but we wouldn’t recommend planning a migration in the near term.

If you want to get into the technical details, you might as well go straight to the source:
https://developers.google.com/analytics/devguides/collection/gtagjs/

How to Staff Your Analytics Team


By David Sprinkle

Since the demand for analytics professionals outweighs the supply, this column outlines which roles are best kept in-house and which ones you should outsource.

If you’ve ever had to hire people for your web analytics team, you’ve probably noticed it is far easier said than done. The fact is, in today’s market there’s far more demand than supply for analytics experts. That’s great for those of us in the industry who like job security, but it can be a huge headache when you’re trying to build a team.

One option, of course, is to hire consultants (like me!). But is it really a good idea to outsource something as fundamental to your business as your web analytics? In this post, I’m going to try and answer that question based on my experience and the experience of many of my clients.


Key Analytics Roles

The first point I’d like to make is that there are actually a few different roles in an analytics team and you probably shouldn’t worry about trying to find them all in one person. The key roles I’d outline are a front-end developer, a web analytics developer, a solution architect, a project/team manager, an analyst, and a test strategist. For our purposes, I’m including testing in the web analytics team, since the whole point of analyzing your website data is to act on it and make the site better.


What to Keep In-House

The only role that I would say you absolutely need to keep in-house is the project/team manager. A consultant can help manage an individual deployment, sure, but you need somebody who’s going to oversee your overall analytics strategy. The fact is, an outside consultant isn’t going to know when you are planning a site redesign and isn’t going to be in every meeting that might be relevant for analytics. If you’re serious about analytics, you need somebody in-house to take charge.

Typically, I’d also recommend keeping the front-end developer, who is going to be modifying your site’s code, either in-house or as part of your existing third-party development agency. You want someone who knows your site well; they don’t necessarily need to understand the arcane ins and outs of your analytics tool.

If you have a large organization with a lot of developers, I find it helps a lot to have one or two developers who are considered points of contact for analytics projects. These people should get at least basic training in code validation and in using the tool interface; they can prove invaluable as a bridge between marketers and IT.


What to Outsource

Conversely, in my experience, there are some analytics roles that it probably makes sense to outsource to a consultant. These include the solution architect, the web analytics developer and, sometimes, the test strategist.

Unless you have a huge organization, it’s unlikely you’ll have enough work to hire a full-time solution architect or developer who knows all the ins and outs of your tools’ code base. Furthermore, in my experience, there are advantages to having people in these roles who have broad experience across many different sites and who are familiar with the different ways organizations solve the same challenges.


The Toss Up

So that leaves analysts somewhere in the middle. For most organizations, I do think it is a good idea to have somebody in house as an analyst, though smaller companies may not be able to afford or justify hiring one full-time. For companies that are trying to build up their analytics teams, I usually recommend hiring one or more analysts full-time but leaning on consultants in the interim. Ideally, your consultants can help train your internal stakeholders and “teach them to fish” over time.

The good news is that being a good analyst is mostly about being smart, enjoying mysteries, and having a firm grasp of deductive and inductive reasoning. You don’t even really need to be all that good at math! So, if you can hire people who are smart and willing to learn, within a few months or a year you can turn them into analytics wizards.



An expert on analytics architecture and integration, David specializes in the innovative design and implementation of analytics solutions that deliver both global “big picture” insights and detailed performance metrics. David leads Acronym’s Analytics Practice as well as its Adobe Preferred Partnership, wherein Adobe subcontracts work to David’s team.

David also has extensive experience working with major analytics, bid management and reporting platforms, and is noted for his expertise in integrating such solutions into companies’ larger marketing and business infrastructures. David is a Certified Omniture Professional and a veteran industry speaker. His Client portfolio includes such leading brands as Four Seasons Hotels and Resorts, SAP, The Tribune Company, HP, Scholastic and Humana, among others.