
Understanding The Google Page Experience Metrics


Google’s mission is to provide its users with information that best satisfies their information needs. Thus, one of the most important signals for Google is whether or not the end user has a positive experience in their search journey. If your site’s content satisfies the information need and provides a good page experience for the user, your website will be rewarded with better rankings.

In recent months, Google announced that Page Experience will take on its full role in search rankings by the end of August 2021 through an update to its search ranking algorithm. This update changes how Google evaluates the ‘page experience’ of websites and adds visual indicators in search results to highlight sites that offer a great page experience. This means search marketers have a window of about one month to perform key optimizations on their websites and ensure they’re providing the best user experience to searchers.

What is Page Experience?

Google uses a set of signals to detect whether users are likely to have a positive experience browsing a site and its content. These signals cover how quickly a page loads (page speed) to give users what they want in the moment; whether the page renders properly on mobile devices; whether the site uses secure encryption and does not pose a security threat; and whether the site shows any disruptive pop-ups or interstitials. All of these fall under what Google calls Page Experience signals. Within these signals, Google is introducing a new set of page speed metrics called Core Web Vitals. These metrics look at load times for the main content elements on a page, the delay before a page is ready for user interaction (i.e., clicks, scrolling), and the extent to which content elements shift position on a page as it loads and renders.
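Those three measurements correspond to the Core Web Vitals metrics Google documents as Largest Contentful Paint (LCP), First Input Delay (FID), and Cumulative Layout Shift (CLS). As a rough illustration only (not part of Google’s announcement), the sketch below observes them in the browser with the native PerformanceObserver API; in practice most teams use Google’s web-vitals library or the reports discussed later in this post.

```ts
// Minimal sketch: watching the three Core Web Vitals with the browser's
// native PerformanceObserver API. Illustrative only.

// Largest Contentful Paint (LCP): load time of the main content element.
new PerformanceObserver((list) => {
  const entries = list.getEntries();
  const latest = entries[entries.length - 1]; // most recent LCP candidate
  console.log('LCP (ms):', latest.startTime);
}).observe({ type: 'largest-contentful-paint', buffered: true });

// First Input Delay (FID): delay before the page responds to the first interaction.
new PerformanceObserver((list) => {
  for (const entry of list.getEntries() as PerformanceEventTiming[]) {
    console.log('FID (ms):', entry.processingStart - entry.startTime);
  }
}).observe({ type: 'first-input', buffered: true });

// Cumulative Layout Shift (CLS): how much visible content shifts while loading.
let clsScore = 0;
new PerformanceObserver((list) => {
  for (const entry of list.getEntries() as any[]) {
    if (!entry.hadRecentInput) clsScore += entry.value; // ignore user-triggered shifts
  }
  console.log('CLS (score):', clsScore);
}).observe({ type: 'layout-shift', buffered: true });
```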

What Does This Mean?

Although page speed has always been a key factor for SEO marketers, the new Core Web Vitals provide additional, clearer metrics for how we should optimize page load times.

Additionally, the introduction of a visual indicator in search results will notify users that certain pages have been determined by Google to offer a positive page experience.

Google is no stranger to providing users with such icons, with previous examples including AMP icons, PageRank, mobile-friendly labels, and more. Nothing has been detailed yet on what this visual indicator will look like, but testing is underway and we expect the label to roll out with the updates. The new label is likely to be interpreted by users as a “seal of approval” from Google, so its presence or absence can have a substantive impact on clickthrough rates in search results.

Brands can prepare for the upcoming updates by prioritizing efforts to improve page speed across their web presence. This includes identifying pages with longer load times, optimizing the file weights of images and animations, and removing unnecessary code from pages. The Core Web Vitals report in Google Search Console is also an excellent place to start understanding how your site is performing in these areas.
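The public PageSpeed Insights API exposes the same Chrome UX Report field data that feeds the Search Console report, so it can be a convenient way to spot-check individual URLs. A minimal sketch follows; the exact response fields shown are assumptions based on the v5 API and may differ.

```ts
// Minimal sketch: querying the PageSpeed Insights v5 API for a page's
// Core Web Vitals field data. Response field names are assumptions.
const PSI_ENDPOINT = 'https://www.googleapis.com/pagespeedonline/v5/runPagespeed';

async function checkPageExperience(pageUrl: string): Promise<void> {
  const url = `${PSI_ENDPOINT}?url=${encodeURIComponent(pageUrl)}&strategy=mobile`;
  const response = await fetch(url);
  const data = await response.json();

  // Field data collected from real Chrome users, when available.
  const metrics = data.loadingExperience?.metrics ?? {};
  console.log('LCP (ms):', metrics.LARGEST_CONTENTFUL_PAINT_MS?.percentile);
  console.log('FID (ms):', metrics.FIRST_INPUT_DELAY_MS?.percentile);
  console.log('CLS (x100):', metrics.CUMULATIVE_LAYOUT_SHIFT_SCORE?.percentile);
}

checkPageExperience('https://www.example.com/');
```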

If you’d like an audit of your website in preparation for these new metrics, please contact us. We’re happy to help.

POV from Winston Burton, SVP, SEO, Acronym.


Google Delays Chrome’s Cookie-Blocking for 2 Years


Google announced the company is delaying its plans to block third-party cookies until late 2023 as it reconciles the challenge of protecting user privacy while still enabling advertisers to deliver personalized ads.

Chrome’s Engineering Director Vinay Goel said in a blog post:

“We need to move at a responsible pace, allowing sufficient time for public discussion on the right solutions and for publishers and the advertising industry to migrate their services. This is important to avoid jeopardizing the business models of many web publishers which support freely available content.”

One part of Google’s rationale for pushing back its plan is centered around concerns that blocking cookies now might encourage tracking companies to use more controversial tactics like fingerprinting to gather browser configuration details.

Meanwhile, the company has faced backlash around both its use of cookies across the web and its plans to block them. In fact, earlier this week, the European Union said it is investigating Google’s plan to remove cookies as part of a wide-ranging inquiry into allegations that Google has abused its prominent role in advertising technology.

And, The Wall Street Journal reported that Google has separately pledged to give the U.K.’s competition watchdog at least 60 days’ notice before removing cookies, so the regulator can review and potentially impose changes to the plan, as part of an offer to settle a similar investigation. That probe stemmed from complaints that Chrome’s removal of cookies would give an advantage to ads on Google’s own products, like YouTube or Search, where Google will still be able to do individual-level targeting.

In the U.S., Google’s cookie-replacement plan was raised in a December antitrust lawsuit against the company brought by Texas and nine other U.S. states.

Google has been testing several new tools to replace various functions of third-party cookies as part of what it calls a privacy sandbox. The first such replacement technology, dubbed Federated Learning of Cohorts, or FLoC, is intended to allow advertisers to target cohorts of users with similar interests, rather than individuals, in order to protect their privacy.
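During the FLoC origin trial, publishers that did not want their visitors included in cohort calculation were widely reported to opt out with the interest-cohort Permissions-Policy directive. Below is a minimal sketch of that opt-out, assuming an Express server; it is illustrative only, and Google’s final mechanism may differ.

```ts
// Minimal sketch: opting a site out of FLoC cohort calculation by sending
// the Permissions-Policy header on every response. Assumes an Express app.
import express from 'express';

const app = express();

app.use((_req, res, next) => {
  // Declines participation in interest-cohort (FLoC) calculation for this origin.
  res.setHeader('Permissions-Policy', 'interest-cohort=()');
  next();
});

app.get('/', (_req, res) => res.send('Hello'));

app.listen(3000);
```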

Acronym’s SVP of Performance Media, Gregg Manias, reacted to the news:

“I’m not really shocked by this. We have seen over time that privacy search engines like DuckDuckGo blocked it, then we saw large publishers like The New York Times block it, then we saw competitor browsers like Firefox block it. I think the death of this plan by Google was last week when Amazon blocked FLoC right before Prime Day.”

Google, of course, plays a central role in the online advertising ecosystem as the company owns the dominant tools used to broker the sale of ads across the web. Cookies, small bits of code stored in web browsers to track users across the web, are widely used in the industry, including in Google’s Chrome browser, which has 65% of the market globally.
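For context on the mechanism at stake, a third-party cookie is typically set by a resource loaded from a tracker’s domain and flagged so the browser will send it back on cross-site requests. The sketch below is a simplified, hypothetical tracker endpoint (names and values are illustrative), again assuming an Express server.

```ts
// Minimal, hypothetical sketch of how a third-party tracking cookie is set:
// a "pixel" served from the tracker's domain responds with a cookie marked
// SameSite=None; Secure so it travels on cross-site requests. Illustrative only.
import express from 'express';

const tracker = express();

tracker.get('/pixel.gif', (req, res) => {
  // Reuse an existing ID if the browser already has one, otherwise mint one.
  const existing = req.headers.cookie?.match(/uid=([^;]+)/)?.[1];
  const uid = existing ?? Math.random().toString(36).slice(2);

  // SameSite=None; Secure is what allows cross-site sending, the behavior
  // Chrome's third-party cookie phaseout targets.
  res.setHeader('Set-Cookie', `uid=${uid}; SameSite=None; Secure; Max-Age=31536000`);
  res.sendStatus(204); // empty response; the pixel exists only to carry the cookie
});

tracker.listen(8080);
```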

Acronym’s EVP of Analytics, Stephanie Hart, added:

“Google needs a way to provide advertisers with the ability to target users and it doesn’t seem that the current version of FLoC is it. Google is having a difficult time balancing the demand from regulators and users for privacy against the need for revenue. The market will continue to evolve as Google develops solutions to this dilemma.”

Meanwhile, as the Search giant seeks a resolution, Acronym’s SVP of SEO, Winston Burton, recently shared some of the other ways marketers can capture customer information through permission-based tactics, including content which, when done right, captures users’ interest at every stage of the funnel.

Google said it expects to complete testing of all of its new cookie-replacement technologies, and integrate them into Chrome before late 2022. Then the advertising industry will have a nine-month period to migrate their services, during which time Google will monitor adoption and feedback. The final phaseout of cookies will happen over three months in late 2023, the company said, adding that it will publish a more detailed timeline.

In the meantime, if you need assistance planning for these changes, please contact us. Our experts can help you navigate these ever-changing waters so you deliver the personalized experiences your customers expect in a way that still respects their right to privacy.

Should Google be scared of Amazon?


In March 2017, Amazon Advertising Platform pulled a surprise upset, beating out Google’s DoubleClick Bid Manager as the most-used DSP with 40% of all brand and agency respondents saying they used it, per a study from analytics firm Advertiser Perceptions.

And while reports acknowledge Amazon’s $1 to $2 billion in ad revenue in 2017 is modest compared to Google’s and Facebook’s, they also say the company is well-positioned to increase its market share.

That’s in part because Amazon offers something of a triple whammy, Bloomberg says: it sells hundreds of millions of items, runs a video streaming service and has access to tons of consumer data.

Amazon has also increasingly prioritized sponsored products in search, nudging brands to pay for better placement, not unlike Google before it.

And, Bloomberg says, CPG companies are also increasingly viewing Amazon as the digital version of store shelves where they can gain prominent display space.

What’s more, most customers that come to Amazon are looking to make a purchase – and, as mobile shopping increases, Bloomberg says more consumers are circumventing search engines altogether and searching directly on Amazon.

In fact, a 2016 study from personalization platform BloomReach found 55% of consumers turn to Amazon first when searching for products online and approximately nine in ten consumers check Amazon if they find a product they want to purchase on another retailer’s site.

“Amazon continues to be the first destination when consumers want to find a product, driven largely by a perceived superior end-to-end experience. Online shopping is all about relevance and convenience and comparison shopping has never been easier – especially with mobile growth,” said Jason Seeba, BloomReach head of marketing, in a statement at the time.

This has prompted an increasing number of brands to ask their agencies for Amazon strategies.

Indeed, Daniel Olduck, executive vice president of global strategy at search agency Acronym, said every one of Acronym’s retail clients made this request in 2017, leading Acronym to emphasize the platform even more in 2018.

Nathan Grimm, director of marketing at Amazon-focused agency Indigitous, too, said client marketing budgets on Amazon have increased significantly in the last year and he expects this trend to continue over the next few years.

But, he added, “I’m not sure if that budget is being taken away from other channels or if it’s the product of expanded marketing budgets.”

Bloomberg, too, noted ad dollars going to Amazon aren’t necessarily coming at Google’s expense, but said they are more likely coming from TV or offline retail budgets.

At the same time, Acronym CMO Mike Grehan noted it’s not just Amazon threatening Google, but all mega marketplaces with heavily trafficked sites and loyal consumers, like Walmart and Target, that are beginning to realize how much they could add to their revenue by selling ads.

And that means PPC budgets may shift even further away from AdWords.

Pointing to Google’s investments in the Shopping search vertical and its experiments with various fulfillment models, Grimm said he thinks Google is likely concerned by Amazon’s growth.

“I think they understand that Amazon isn’t winning market share because of their search engine but because customers love to shop there,” Grimm said. “To counter, Google needs to create or acquire their own rival shopping platform so they can own customer relationships.”

For his part, Grehan said Google is likely to make up some ground by taking more of an affiliate approach for transactions that take place with the aid of a voice assistant. For example, when a consumer asks Google Assistant to make a restaurant reservation, it would go to that consumer’s OpenTable account by using a deep link and Google would take a percentage of the transaction.

Meanwhile, Bloomberg says Amazon is trying to position itself as a lifestyle media brand with broad influence on consumer purchasing decisions.

“The whole point of [Amazon’s rumored acquisition of Target] is to be exactly what they wanted to be, which is the supplier of all things – the Santa Claus principle,” Grehan said.


The 15 Best Gary Illyes Quotes from SMX East


At the recent SMX East event in New York, editors from Third Door Media sat down with Google Webmaster Trends Analyst Gary Illyes for an “Ask Me Anything”-style presentation.

In the 75-minute interview, they covered a lot of territory – the mobile-first index, schema, voice search, ranking factors, disavowing links and unicorns, to name a few. Here’s a compendium of Illyes’ 15 most insightful responses from the event:

On where SEOs should focus in 2018:

If you still have sites that are not mobile-friendly, do really focus on that. Not that the mobile-first index will [cause your site to] disappear from the Internet/search results…we live in a mobile-first world [and] even if…your business is not getting traffic from mobile right now, it might just mean you’re not getting it because you don’t have a mobile-friendly site. Perhaps fix that.

If you are already mobile-friendly and the content on desktop and mobile is comparable…and [you] already rank with your desktop site, make sure the mobile [site ranks] also. Structured data is still important [as is] metadata…also on the mobile site. Different types of media – make sure they are on the mobile site and perhaps that’s it.

On the mobile-first index:

We’re working hard to move sites that are ready into the mobile-first index. It’s a slow process, [so I] don’t want to give a fixed timeline. It will probably take years until [there’s a] full mobile-first index and even then it’s not 100% complete.

“Mobile-first index” is a new thing as a phrase, [but we’ve been] telling publishers small and big to go mobile for perhaps seven years at least. If you did that, then you’re largely good to go — especially if you have responsive design. If you have a mobile site, the resources that would have to be put in to ensure you do well in the mobile-first index are not that much. Look at content. If you have a small- or medium-sized business, I don’t think you have to invest too much.

On what impact the mobile-first index will have:

The mobile-first index sounds like a bigger splash than I think it will be. I think it will be similar to the Mobilegeddon you guys created where the fear of it will be much, much greater than it should be.

I doubt that many sites will even realize they are in the mobile-first index at all.

On what schema does:

Right now, schema is used for learning connections between entities…When you’re reading a book, you don’t need extra context or data to understand you’re reading about quantum mechanics. If you’re reading War and Peace, you don’t have to learn the whole of Russian history to understand what’s happening. Similarly, the algorithms won’t need extra data eventually and should understand simple text and videos publishers put up and make connections.

On why schema is important:

For now, I will say schema is important. We do look a lot at what’s in the structured data and I do think that if we recommend it, you probably want to make use of it.

Schema in general is helpful for us to understand the content on the page and by using that in our search features, we’re helping users find what they’re looking for.

[There was] a survey on the whole “how search should work”-thing and I think that ultimately we should have at one point an algorithm that can figure out the same thing that schemas can provide us – [Google co-founder] Larry [Page] doesn’t believe in manual actions because we should be able to see something is spammy and just [not] include it in our index. Similarly, schemas are helpful for [that], but as algorithms become more advanced, it might not need it.
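For readers acting on this advice, structured data is most commonly added as schema.org markup in JSON-LD, the format Google’s documentation recommends. Below is a minimal sketch that injects an Article snippet at render time; the property values are illustrative placeholders, not taken from the interview.

```ts
// Minimal sketch: building a schema.org Article object and injecting it as
// JSON-LD. All property values below are hypothetical placeholders.
const articleSchema = {
  '@context': 'https://schema.org',
  '@type': 'Article',
  headline: 'Example Article Title',               // placeholder
  author: { '@type': 'Person', name: 'Jane Doe' }, // placeholder
  datePublished: '2017-10-26',                     // placeholder
};

const script = document.createElement('script');
script.type = 'application/ld+json';
script.textContent = JSON.stringify(articleSchema);
document.head.appendChild(script);
```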

On how mentions impact rankings:

Mentions do not necessarily help rank you better [directly], but [they can help you] rank a little bit better indirectly. They give a better idea of what your site is about or what keywords a site should show up for.

Imagine the algorithm is like a human. If a human sees a lot of brand mentions on the Internet, it will…store [this] in its memory and associate that brand with something. Say you’re selling unicorns and your brand is mentioned with unicorns, so we might learn that your brand is a good place to buy unicorns.

On the relevance of search ads:

I click a lot on search ads… [and I] often find the ads we show in search results are more relevant to me as a user than the ten blue links. That is bad for web search, of course, and we should fix that, but to me as a user, [it’s not].

Bids correlate to relevance, the quality of the site and so on. The same or almost the same thing applies to ads as well – [if the] ads [are] on top, it invariably means they are more relevant for the user in some way than the ten blue links.

On voice search:

I don’t have numbers, but it’s growing…it’s growing really fast and becoming a very important part of search, as well as products like Home. We want to ensure people can search however they want since voice is becoming [more prevalent] and [we want to] ensure recognition quality is very precise.

On whether direct traffic helps rankings:

Search traffic in general is not something we would directly use in ranking…so we’re using other kinds of traffic and [when it comes to] direct traffic…we would see that through Analytics…and I can swear in front of a court we are not using that data for search rankings.

On Panda and on pruning content:

Ultimately, you just want to have a really great site people love. I know it sounds like a cliché, but almost [all of] what we are looking for is surely what users are looking for. A site with content that users love – let’s say they interact with content in some way – that will help you in ranking in general, not with Panda. Pruning is not a good idea because with Panda, I don’t think it will ever help mainly because you are very likely to get Panda penalized – Pandalized – because of low-quality content…content that’s actually ranking shouldn’t perhaps rank that well. Let’s say you figure out if you put 10,000 times the word “pony” on your page, you rank better for all queries. What Panda does is disregard the advantage you figure out, so you fall back where you started.

I don’t think you are removing content from the site with potential to rank – you have the potential to go further down if you remove that content. I would spend resources on improving content, or, if you don’t have the means to save that content, just leave it there. Ultimately people want good sites. They don’t want empty pages and crappy content. Ultimately that’s your goal – it’s created for your users.

On the featured snippet algorithm:

RankBrain is a general ranking algorithm, not focused on features – it is trying to predict what results would work better based on historical search data. Featured snippets have their own algorithms to determine what is a good result and makes a good featured snippet for a certain query.

The theme is we’re working around the clock to improve relevance to ensure we’re not showing something stupid as a featured snippet and we’re changing the underlying code extremely often. It’s a volatile code base that is constantly changing. The featured snippets we show can also change based on external signals like number of links…quality of links [can have] a dramatic effect on what we show on the results page.

On disavowing links:

I have a site with [about] 100,000 visits every two weeks and I haven’t looked at the links for two years, but I know I have some porn links because someone pointed it out and I’m fine with that – I don’t use disavow. If it makes you feel better, then use it, just make sure you’re not overusing it. It is a big gun and can destroy your rankings in a matter of hours if you are misusing it.

Don’t be afraid of sites that you don’t know. There are hundreds of millions – billions probably — of sites on the Internet. There’s no way you’ll know each of them. If they have content and are not spammy, why would you disavow? It’s extremely likely it won’t hurt you.

On black hat techniques:

Imagine you go to the spam report form [and you] file against your competitors. I know a few cases where [they] reported competitors and it resulted in a very deep review for both sites and we found [the site they reported] clean and the reporter was found doing stuff they shouldn’t. You have to be careful about what you report and make sure you’re clean.

On what he wishes websites would do more often:

As a user, I would like fewer ads…as a trends analyst for Google, I understand why, [but I] wish [they] would figure out some way to at least put ads on the site that are not blocking the user interface and are actually fast. Some sites load in the background four tracking scripts and ads and it slows down the site a lot and it’s an awful user experience. You wouldn’t do that for Google, why would you do that for users? Unless it’s a critical part of their lives and they can’t abandon you, they will if your site sucks.