Google has dropped support for job training structured data and rich results in Google Search. Based on its initial tests, the company "found that it wasn't useful for the ecosystem at scale."
Not useful. Google posted about this, saying: "We initially tested this markup with a group of site owners, and ultimately found that it wasn't useful for the ecosystem at scale." Google did not say how large those tests were, only that it did test this in Google Search.
Other job rich results not impacted. Google added that this does not affect any other features that may use job training markup. Plus, Google said you are welcome to "leave the markup on your site so that search engines can better understand your web page." Though I am not sure which other search engines use this markup.
Why we care. If you were using job training structured data, Google will no longer show those pages as rich results in Google Search. You may notice click-through rate changes on those pages, as the rich results will no longer be displayed in Google Search.
Digital marketing agency NP Digital announced today that it has acquired AnswerThePublic, a popular freemium keyword research tool. Terms of the deal were not revealed.
NP Digital, which was co-founded by marketing influencer and entrepreneur Neil Patel, is also the owner of another popular freemium keyword research tool, UberSuggest.
What this means for AnswerThePublic users. All the tool’s existing features will remain. Users can continue to access the tool via its website or app.
There are a couple of noticeable changes so far, including the addition of “by UberSuggest” under AnswerThePublic’s logo in the main navigation and Patel’s face on a dancing robot’s body on the homepage.
AnswerThePublic has approximately 1 million monthly users and has been a favorite among search marketers for years. NP Digital said it is working on new free features for marketers.
However, the tool hasn’t gotten any significant updates or improvements lately. That’s because the owners weren’t focusing on it (they had another software company).
What it means for UberSuggest users. UberSuggest has approximately 1 million monthly users. Those users will now be able to access the features and data through either platform. Users can soon expect to see new features related to keyword visualizations.
NP Digital said Ubersuggest now spans over 30 billion keywords and 50 trillion backlinks across 249 countries.
Why NP Digital acquired AnswerThePublic. Both platforms leverage the same data sets. But each had its own unique features, functions and visualizations.
NP Digital CEO Mike Gullaksen told me: “This platform allows marketers to leverage the data from the largest focus group in the world – Google search users. Being able to mine massive data sets to identify consumer interest around needs and wants is one of the most powerful yet underused strategies for content and search marketers. Layering on top of these data insights the ability to both lookback and see real-time trends with powerful visualizations allows marketers to move their content initiatives forward very quickly while staying on the pulse of the consumer mindset.”
Why we care. NP Digital says this acquisition will mean more new free capabilities for both platforms – and that you can access these insights from either platform. So both tools will continue to help marketers find inspiration and insights that can help shape SEO and content marketing strategies.
A year of acquisitions. This has been a big year of change in the SEO tools space and we’re not even halfway through 2022. Below are links to coverage of the other big acquisitions we’ve seen so far:
Google Search Console’s News performance report had a logging error that may have displayed a drop in impressions and clicks from Google News.
Google said this was just a reporting glitch and it had no real impact on how your site was ranking or serving in Google News.
The notice. Google posted this notice, stating: “Because of a logging error, site owners might see a drop in their Google News data during this period. This is just a logging error and not a real drop in Google News performance.”
Timeframe impacted. Google’s logging error ran from May 12, 2022, through May 26, 2022.
Why we care. If you provide clients with reporting, keep in mind that the News performance report had logging issues for about half of May. Make sure to annotate this logging error and communicate it to your clients and stakeholders.
On May 25, 2022, Google began rolling out the May 2022 core update. It arrived over six months after the November 2021 core update, compared with the roughly four-and-a-half-month span between the July 2021 and November 2021 core updates. This was the first core update of 2022; in 2021, we had a total of three.
Historically, we have waited longer to report on the impact of these core updates. But honestly, after writing several of these core update impact stories, we have found that the vast majority of the impact is realized within the first few days of an update (although there have been outliers). With this update, the impact was felt very quickly, within 24 hours of the announcement, so we feel it is now safe to report on the impact of the May 2022 core update.
Data providers on the May 2022 core update:
Generally, the data providers for these reports, which have consistently been Semrush and RankRanger, have agreed on how volatile these updates have been. With this update, they seem to disagree, at least until you dig into the data.
Semrush. Semrush data showed that the May 2022 core update hit pretty quickly after the announcement, as its volatility tracker shows below (or you can view it live at the Semrush Sensor tool).
In terms of the speed for these core updates to roll out, “This is already the 3rd core update in a row where the initial roll-out saw a very short burst of initial rank volatility,” Mordy Oberstein, Semrush Communication Advisor, told us. He added that “this seems to be a new pattern” with the rollout of these core updates.
When you compare the May 2022 core update to the November 2021 core update, at first glance, it seems the May update was less volatile than the November update. This is with the exception of the real estate niche, “which seemed to undergo a significant shakeup,” the company shared. Here is this chart comparing the May 2022 core update to the November 2021 core update by vertical:
The issue is that the average level of volatility prior to the May 2022 core update was higher than the volatility seen before the November 2021 core update, Semrush explained. In fact, Semrush said the overall increase in rank volatility, compared to the pre-update baseline, was 19% less during the initial release of the May 2022 update than during the November 2021 core update on desktop, and 24% less on mobile.
So if you plot the peak volatility, you see things differently:
So even with Semrush data, might the May 2022 update have been more volatile than the November 2021 core update? Again, it is all about how you process and interpret the data.
This chart below shows you that 17% of the new top 20 ranked results in Google post May 2022 core update came from position 20 or beyond, which is not far off from the previous November 2021 core update:
RankRanger. The RankRanger team also analyzed the Google search results after the May 2022 core update rollout, and here you can see how quickly its tool picked it up (you can also see this live at the Rank Risk Index tool). RankRanger did say the May 2022 core update was a “significant update.”
The folks at RankRanger also compared the May 2022 core update to the November 2021 core update for us. In its data, RankRanger found that the May 2022 update’s average position changes were higher than those of the November 2021 update.
When you dive in and compare by position, the volatility does seem more similar across positions:
Retail seemed to be most impacted according to RankRanger data, as you can see by these charts below:
SISTRIX. SISTRIX, another data provider that tracks changes in the Google search results, sent us its top 20 winners and losers for the May 2022 core update. These are US-based sites from the SISTRIX data set.
Sistrix added “In our example, we saw that the visibility of the domain was 25.84 points on Thursday, then increased to 27.95 on Friday and as of now (Monday 30th May 08:55) the Visibility Index is 31.98.”
More on the May 2022 core update
The SEO community. To the SEO community, the May 2022 core update seems much more significant than the November 2021 core update. Unlike the November 2021 core update, whose timing was not the best (it landed right during the busiest online shopping season), this update was scheduled a lot better for retailers. I was able to cover the community reaction in one blog post on the Search Engine Roundtable early on. It includes some of the early chatter, ranking charts and social shares from some SEOs.
On Twitter you can find plenty of examples of SEOs sharing charts from their clients – mostly showing winners but also showing losers – with this update.
What to do if you are hit. In the past, Google has given advice on what to consider if you are negatively impacted by a core update. There aren’t specific actions to take to recover, and in fact, a negative rankings impact may not signal anything is wrong with your pages. However, Google has offered a list of questions to consider if your site is hit by a core update. Google did say you may see a bit of a recovery between core updates, but the biggest change would come after another core update.
Why we care. It is often hard to isolate what you need to do to reverse any algorithmic hit your site may have seen. When it comes to Google core updates, it is even harder. What this data, along with previous experience and advice, has shown us is that these core updates are broad and cover a lot of overall quality issues. The data above reinforces that. So, if your site was hit by a core update, it is often recommended to step back, take a wider view of your overall website and see what you can do to improve the site as a whole.
We hope you, your company and your clients did well with this update.
Throughout the history of SEO, people have debated the pros and cons of relying on technical SEO tools. Relying on the hints from auditing tools isn’t the same thing as a true SEO strategy, but we’d be nowhere without them. It’s just not feasible to manually check dozens of issues page by page.
To the benefit of the SEO industry, many new auditing tools have been created in the past decade, and a few of them stand strong as industry leaders. These few technical auditing tools have done us a great service by continuing to improve their capabilities, which has helped us better serve our clients, bosses and other stakeholders.
However, even the best auditing tools cannot find four important technical SEO issues that could potentially damage your SEO efforts:
Canonical to redirect loop
Hacked pages
JavaScript links
Content hidden by JS
Some of these issues could be detected by tools, but they’re just not common enough to make it into default reports. Other issues would be impossible for tools to detect.
As with many cases in SEO, some issues may affect sites differently, and it all depends on the context. That’s why most tools won’t highlight these in summary reports.
Required tools to uncover these issues
Before we dive into the specific issues, there are two specific requirements to help us find these issues.
Your web crawling tool of choice
Even though most tools won’t uncover these issues by default, in most cases, we can make some modifications to help us detect them at scale.
Some tools that you could use include:
Screaming Frog
Sitebulb
OnCrawl
DeepCrawl
The most important thing we need from these tools is the ability to:
Crawl the entire website, sitemaps and URL list
Use custom search and extraction features
Google Search Console
This should be a given, but if you don’t have access, make sure you acquire Google Search Console access for your technical SEO audits. You will need to tap into a few historical reports to help uncover potential issues.
Issue 1: Canonical to redirect loop
A canonical to redirect loop is when a webpage has a canonical tag pointing to a different URL that then redirects to the first URL.
This can be a rare issue, but it’s one that I’ve seen cause serious damage to a large brand’s traffic.
Why this matters
Canonicals provide the preferred URL for Google to index and rank. When Google discovers a canonical URL different from the current page, it may start to crawl the current page less frequently.
Instead, Google will start to crawl the canonical URL more frequently. But that URL 301 redirects back to the original page, sending a type of loop signal to Googlebot.
I’ve seen this happen to some large brands. One recently came to me asking to investigate why one of their key pages wasn’t driving the traffic they were hoping for. They had invested a lot of money into SEO and had a well-optimized page. But this one issue was the sore thumb that stuck out.
How to detect canonical redirect loops
Even though this issue will not appear in any default summary reports in standard auditing tools, it’s quite easy to find.
Run a standard crawl with your preferred technical SEO auditing tool. Make sure to crawl sitemaps as well as a standard spider crawl.
Go to your canonical report and export all of the canonicalized URLs. Not the URLs the tool crawled, but what the URL in the canonical tag is.
Run a new crawl with that URL list and check the response codes report. Every canonicalized URL should return a 200 status code; any that redirects is a candidate for a canonical-to-redirect loop.
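The three steps above can be sketched in code once you have the crawl exports. Here is a minimal Python sketch, assuming you have already exported a page-to-canonical mapping and the redirect targets from your crawler; the example.com URLs are hypothetical:

```python
def find_canonical_loops(canonicals, redirects):
    """Flag pages whose canonical URL redirects back to the page itself.

    canonicals: dict mapping page URL -> URL found in its canonical tag
    redirects:  dict mapping URL -> redirect target (absent means status 200)
    """
    loops = []
    for page, canonical in canonicals.items():
        if canonical == page:
            continue  # a self-referencing canonical is fine
        if redirects.get(canonical) == page:
            loops.append((page, canonical))
    return loops

# Hypothetical example: /a canonicalizes to /b, but /b 301 redirects back to /a
canonicals = {"https://example.com/a": "https://example.com/b"}
redirects = {"https://example.com/b": "https://example.com/a"}
print(find_canonical_loops(canonicals, redirects))
# → [('https://example.com/a', 'https://example.com/b')]
```

In practice, you would feed in the canonical report and response-code exports from your crawler rather than hand-built dicts.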
Issue 2: Hacked pages
Hacked websites for profit is not a new topic. Most seasoned SEOs have come across websites that have been hacked somehow, and the hackers have conducted malicious activities to either cause harm or generate profit for another website.
Some common website hacking that happens in SEO includes:
Site search manipulation: This occurs when a website’s search pages are indexable. A malicious person then sends a ton of backlinks to their search results page with irrelevant searches. This is common with gambling and pharma search terms.
301 redirect manipulation: This happens when someone gains access to the site, creates pages relevant to their business and gets those indexed. Then they 301 redirect them to their own websites.
Site takedowns: This is the most straightforward attack when a hacker manipulates your code to make your website unusable or at least non-indexable.
There are dozens of types of site hacking that can affect SEO, but what’s important is that you maintain proper site security and conduct daily backups of your website.
Why this matters
The most important reason that hacking is bad for your website is that if Google detects that your website might have malware or is conducting social engineering, you could receive a manual action.
How to detect hacked pages
Luckily, there are many tools out there that not only mitigate hacking threats and attempts but also detect whether your website has been hacked.
However, most of those tools only look for malware. Many hackers are good at covering their tracks, but there are ways to see if a website has been hacked in the past for financial gain.
Use Google Search Console
Check manual actions report. This will tell you if there are any current penalties against the site.
Check the performance report. Look for any big spikes in performance. This can indicate when a change may have happened. Most importantly, check the URL list in the performance report. Hacked URLs can stick out! Many of them have irrelevant topics or may even be written in a different language.
Check the coverage report. Look for any big changes in each sub-report here.
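The performance-report URL check above can be partially automated once you export the URL list from Search Console. Here is a minimal Python sketch, assuming an English-language site; the spam term list and example URLs are illustrative, not exhaustive:

```python
import re

# Illustrative list of terms common in pharma/gambling spam injections
SPAM_TERMS = ("casino", "poker", "viagra", "replica")

def flag_suspicious_urls(urls, site_language="en"):
    """Flag URLs from a Search Console performance export that look hacked:
    spammy keywords, or non-ASCII slugs on an English-language site."""
    flagged = []
    for url in urls:
        lowered = url.lower()
        if any(term in lowered for term in SPAM_TERMS):
            flagged.append(url)
        elif site_language == "en" and re.search(r"[^\x00-\x7f]", url):
            flagged.append(url)  # unexpected non-ASCII characters in the URL
    return flagged

urls = [
    "https://example.com/blog/seo-tips",
    "https://example.com/cheap-casino-bonus",
]
print(flag_suspicious_urls(urls))
# → ['https://example.com/cheap-casino-bonus']
```

Anything flagged still needs a manual look; legitimate pages can match these patterns, which is why tools rarely surface this in summary reports.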
Check website login accounts
Take a look at all users to find any unusual accounts.
If your website has an activity log, check for recent activity.
Make sure all accounts have 2FA enabled.
Use online scanning tools
Several tools will scan your website for malware, but that may not tell you if your website has been hacked in the past. A more thorough option would be to look at https://haveibeenpwned.com/ and scan all website admin email addresses.
This website will tell you if those emails have been exposed to data breaches. Too many people use the same passwords for everything. It’s common for large organizations to use weak passwords, and your website can be vulnerable.
Issue 3: Identifying JavaScript links
By now, you’d think our SEO auditing tools would be better at detecting internal links generated by JavaScript. Historically, we’ve had to rely on manually discovering JS links by clicking through websites or looking at link depths in reports.
Why this matters
Googlebot does not crawl JavaScript links on web pages.
How to find JavaScript links at scale
While most SEO auditing tools can’t detect JavaScript links by default, we can make some slight configurations to help us out. Most common technical SEO auditing tools can provide us with custom search tools.
Unfortunately, browsers don’t display the original source code in the DOM, so we can’t just search for “onclick” or anything that simple. But there are a few common types of code that we can search for. Just make sure to manually verify that these actually are JS links.
<button>: Most developers use the button tag to trigger JS events. Don’t assume all buttons are JS links, but identifying these could help narrow down the issue.
data-source: This attribute pulls in a file whose code is used to execute an action. It’s commonly used within JS links and can help narrow down the issue.
.js: Much like the data-source attribute, some HTML tags will pull in an external JavaScript file containing the instructions to execute an action.
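Those three patterns can be wired into your crawler’s custom search, or run over exported HTML with a short script. Here is a minimal Python sketch; the sample HTML is hypothetical, and every match still needs manual verification:

```python
import re

# The three patterns from above: <button> tags, data-source attributes,
# and references to external .js files
JS_LINK_PATTERNS = {
    "button_tag": re.compile(r"<button\b", re.IGNORECASE),
    "data_source_attr": re.compile(r"data-source\s*=", re.IGNORECASE),
    "external_js": re.compile(r"src\s*=\s*[\"'][^\"']+\.js", re.IGNORECASE),
}

def find_js_link_candidates(html):
    """Return which JS-link patterns matched a page's HTML.

    These are candidates only; manually verify each page flagged."""
    return [name for name, pattern in JS_LINK_PATTERNS.items()
            if pattern.search(html)]

html = '<button data-source="/nav.json">More</button><script src="/app.js"></script>'
print(find_js_link_candidates(html))
# → ['button_tag', 'data_source_attr', 'external_js']
```

Running this over a crawl export gives you a shortlist of pages to check by hand, rather than clicking through the whole site.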
Issue 4: Content hidden by JavaScript
This is one of the most unfortunate issues websites fall victim to. They have so much fantastic content to share, but they want to consolidate it to display only when a user interacts with it.
In general, it’s best practice to marry good content with good UX, but not if SEO suffers. There’s usually a workaround for issues like this.
Why this matters
Google doesn’t actually click on anything on webpages. So if the content is hidden behind a user action and not present in the DOM, then Google won’t discover it.
How to find content hidden by JavaScript
This one is a bit trickier and requires a lot more manual review. As with any technical audit generated by a tool, you need to manually verify all of the issues found, and the tips below are no exception.
To verify, all you need to do is check the DOM on the webpage and see if you can find any of the hidden content.
To find hidden content at scale:
Run a new crawl with custom search: Use the techniques I discussed in finding JS links.
Check word counts at scale: Look through all pages with low word counts. See if it checks out or if the webpage looks like it should have a larger word count.
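The word-count check can be scripted against a crawl export. Here is a minimal Python sketch that counts words in raw HTML and flags pages far below the site-wide median; in practice you would also compare raw-HTML counts against rendered (headless browser) counts, since a big gap suggests JS-injected or JS-hidden content. The threshold factor and example counts are illustrative:

```python
import re
from statistics import median

def visible_word_count(html):
    """Rough word count of raw HTML with scripts, styles and tags stripped."""
    text = re.sub(r"(?s)<(script|style).*?</\1>", " ", html)
    text = re.sub(r"<[^>]+>", " ", text)
    return len(text.split())

def flag_thin_pages(counts, factor=0.25):
    """Flag URLs whose word count falls far below the site-wide median.

    counts: dict mapping URL -> word count from your crawl export.
    factor: pages below (median * factor) are flagged for manual review.
    """
    cutoff = median(counts.values()) * factor
    return [url for url, n in counts.items() if n < cutoff]

# Hypothetical crawl export: /faq looks suspiciously thin for its topic
counts = {"/guide": 1800, "/faq": 40, "/blog": 1200}
print(flag_thin_pages(counts))
# → ['/faq']
```

A flagged page isn’t necessarily a problem; the point is to narrow the manual DOM review down to the pages most likely to be hiding content.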
Growing beyond the tools
With experience, we learn to use tools as they are: tools.
Tools are not meant to drive our strategy but instead to help us find issues at scale.
As you discover more uncommon issues like these, add them to your audit list and look for them in your future audits.
SEO is a big thing. Yes, it is made up of a lot of small things. Some connected, some not.
We’d all love to stay on track, on plan and have everything go smoothly.
But the reality is that, at some point, something won’t perform as expected or a resource won’t come through.
That’s why, to some degree, SEO is based on problem-solving as a whole.
We have to be ready for those situations and know what to do because SEO roadblocks and challenges are inevitable.
Some SEOs are great strategists, others great implementers. Few excel at both. Everyone has different strengths and levels of idea generation, strategy development and tactical implementation disciplines.
With so many stakeholders and variables involved in SEO, what does it take to be a successful SEO problem-solver? Here are nine ways to become one.
1. Know your stakeholders and expectations
SEO success (fair or not) is often judged by non-SEOs and, at the same time, can be held back or negatively impacted by others as well.
Problem-solving gets easier when you already know the expectations, identify possible roadblocks in advance and have a full context. Whether it is company politics, differing levels of understanding of SEO subject matter, or wildly different expectations for performance and timing, you need to know all the players and assess what challenges might be ahead.
The more you can manage the stakeholder mix and expectations, the easier it will be to troubleshoot issues or to go down the right path when they happen. And yes, that’s a “when,” not an “if.” I’m not being snarky, but nothing ever goes according to plan.
2. Set up roles and communication plans
Beyond the full set of stakeholders, there are distinct people who you work with. That may include people on your team, within your agency, within your department and/or other functions whether agency-side or client-side.
You will need others to be successful, unless you personally cover the roles beyond SEO: writer, designer, developer and approver.
Establish clear roles and responsibilities. Know who your go-to people are for the different functions you need. Learn their processes and sync them up with yours.
Understand lead times and turnaround times. Make sure they know that unplanned requests and things will happen.
Make it crystal clear what you know you need and what you might need, and how timing and responsiveness will impact SEO performance. Build allies, include them in your problem-solving and troubleshooting process, and work to gain as much agility with resources as you can.
3. Maintain baselines and goals
You want to have as much objectivity and cause and effect as you can in any SEO effort.
There are so many misunderstood and gray areas that, without baselines and goals of where the effort is going, you can get way off track with resources, why something isn’t going according to plan, and more.
There are often many ways to accomplish your goals. We can get lost down a rabbit hole on a technical issue if we can’t tie it back to a baseline or impact on a goal.
We also can take a step back and reprioritize our efforts when we receive resistance or a roadblock if we find out that a dev update to resolve a technical issue might take six months.
4. Leverage your strategy and plan
First, I hope you have a defined strategy and plan. If you don’t have it or your baselines and goals (noted above), take a step back and work on this. Otherwise, it is hard to be proactive and lead in the SEO effort as you’ll always chase down issues.
With your strategy and plan, you can further build on the objective aspects of the campaign or cause that your baselines and goals help with.
As I noted in the intro, SEOs can be great at big picture strategy, some at detailed implementation, and many have a range of experience and favorite parts (technical vs. content, etc.).
Unification around a strategy and plan will allow you to know how hard to push for a specific fix versus moving on to bigger-impact items. It also allows you to adjust expectations. If the content writer or approval process is booked out for months, you can raise the red flag about how that will change the plan and expected timing, and what that might do to push results further out.
Using your plan and any changes that come to manage expectations will help you get resources or engage others who can help you.
5. Go off-script and be agile
Even with the best plan and all the resources you could want at your disposal, things often play out differently than we project or anticipate. Sure, we work through all of the title and meta description tags and they are “perfectly” optimized. Yet we might find that issues remain with duplicate tags or how they are being indexed.
Should we check off the box and move on?
Should we do another round of optimization?
Should we start doing other things in the plan in parallel?
Do we need to get a developer or copywriter involved?
Again, things don’t always go according to plan. Sometimes we have to double down in certain areas.
Finding the right balance of adjusting the plan and being agile while you go versus sticking to the plan is probably the most important troubleshooting or problem-solving ability that an SEO can have.
6. Develop technical skills and/or resources
Knowing the “what” and “why” of an SEO issue is powerful. This is a step beyond being able to rely on tools or performance issues as indicators that something is not performing according to expectations.
If you can dig into the XML sitemap, robots.txt, HTML code or other related factors yourself to get to the root of the problem, you can get deeper into problem-solving directly.
At a minimum, you need to be able to quarterback a situation by bringing your resources together. But if you can solve issues yourself, or speak the same language and be highly prescriptive and direct with your resources, you’ll have a better chance of getting a resolution to your satisfaction, and hopefully faster.
7. Have content backup plans
One of the top reasons plans and performance get off track: not getting the quality and volume of content needed.
I don’t know many SEOs who still write content or make content edits themselves. In most cases, SEOs rely on a client, another resource, or a partner responsible for writing and producing content. In some industries, this is also shaped by legal and compliance requirements.
Content resources can get booked up even if you have a content calendar and needs established.
What happens when your content resource is unavailable or gets off track from the initial plan?
Do you have backup resources?
Do you go deeper into technical and off-page optimization to compensate?
It is one thing to be a problem-solver when content isn’t performing. It is another when you can’t get the content you need.
8. Be patient, but don’t wait
Be a team player and respectful of your partners and resources you collaborate with.
Pushing too much and/or not being tactful won’t help your cause. Give some grace and have patience, but also don’t wait.
If you’re stuck on content (per the section above), or a dev edit, or a technical update or on any specific resources beyond your control, find ways to move things around in the plan.
You can always prioritize link building, tag updates, or some other type of audit or update to keep things moving forward.
It might take some creativity, but don’t sit idle while waiting on others. Keep moving something forward.
9. See roadblocks as opportunities
My tone has probably been pretty strong because there will be challenges, roadblocks and things to troubleshoot. That’s the nature of SEO and the web in general.
A problem-solver mentality is important.
Accepting this reality and being positive in the face of adversity, being a realist and getting others on board with this reality are critical.
SEO is hard for everyone. We’re trying to be the best possible with our website and strategy.
If it were easy, everyone would be good at it, and we’d have a different set of problems.
Google Search seems to be displaying more FAQ rich results in its search results over the past few days. Both RankRanger’s tracking tool and some SEOs are noticing this increase in the number of times a site is showing FAQ rich results.
What are FAQ rich results. Web pages with a list of frequently asked questions and answers pertaining to a particular topic can mark up those questions and answers with FAQ structured data. Google may then show those FAQs in the search results snippets, as illustrated below:
More showing. Google is now showing these FAQ rich results five percentage points more often, according to RankRanger, which was recently acquired by Similarweb. Here is the data chart showing the uplift:
SEOs like Brodie Clark and Glenn Gabe noticed the increase too on clients they have access to:
The most effective way to review this change is via Semrush. Filtering for instances where the subfolder ranks within FAQ rich results, we can see the uplift. This uplift correlates with other tools such as RankRanger and Moz, which show an overview of this data publicly.
Two links. As a reminder, Google recently limited the number of links you can see within an FAQ rich result to two links. A couple years ago, Google also tightened the guidelines around using FAQ schema on your site.
Why we care. With more FAQ rich results showing up in Google Search, it may benefit your site if you gained those rich results. But if your competitor now shows up for these rich results, it might have the opposite effect. Rich results generally lead to a higher click-through rate from the Google search results snippet to the publisher’s site, but not always. In this case, if searchers get their answer from the FAQ rich result, they may not end up clicking over to your site.
So test, test and test to see if you want these FAQ rich results for your site.
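If you do test FAQ markup, the structured data follows the schema.org FAQPage shape that Google documents. Here is a minimal Python sketch that generates it from question-and-answer pairs; the example strings are hypothetical, and real markup should be validated with Google’s Rich Results Test:

```python
import json

def faq_jsonld(pairs):
    """Build FAQPage structured data (JSON-LD) from (question, answer) pairs."""
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "FAQPage",
        "mainEntity": [
            {
                "@type": "Question",
                "name": question,
                "acceptedAnswer": {"@type": "Answer", "text": answer},
            }
            for question, answer in pairs
        ],
    }, indent=2)

# Hypothetical Q&A pair for illustration
print(faq_jsonld([("What is FAQ markup?", "Structured data for Q&A content.")]))
```

The output goes in a `<script type="application/ld+json">` tag, and per Google’s guidelines the marked-up questions and answers must also be visible on the page itself.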
Social media benchmarking involves comparing your metrics and processes against the industry standards. Learn how you can get a clear idea of how you stack up against the competition.
Hear from Rival IQ and NetBase Quid experts about the metrics and benchmarks you can use to measure your social media performance.
Google Ads has a bug of some sort impacting a “subset of non-US campaigns” where cost-per-click amounts are incorrectly inflated, the company posted.
The notice. Google posted this notice about 30 minutes ago:
We’re aware of a problem with Google Ads affecting a significant subset of users. We will provide an update by May 25, 2022, 1:00 AM UTC detailing when we expect to resolve the problem. Please note that this resolution time is an estimate and may change. We’re aware that a subset of non-US campaigns are affected by a technical issue causing cost-per-click (CPC) to be incorrectly inflated. We are working to resolve this issue.
Seeing inflated costs. If you are seeing inflated CPCs and costs on your non-US campaigns, do not worry, Google is aware and working on a fix.
It is not clear if this is a reporting issue or an issue impacting your budgets. Either way, you may want to ask for refunds once we learn more about the underlying issue.
Fix coming. Google has not posted an estimated time for when this will be resolved, but Google will provide an update within the next 12 hours or so.
Why we care. If you are running campaigns outside of the US and you notice CPC inflation, you are not alone. Google is aware and will fix the issue – so no need to panic. Stay tuned as we provide more updates as they come in.
Google Ads is reminding advertisers about some changes to its audience targeting and reporting features. These changes, which were shared via email with advertisers, are fairly minor and some have already started rolling out to accounts.
Reuse audiences. Advertisers will be able to reuse audiences across campaigns. When you build an audience to use in a campaign, Google Ads will save it so you can use it again in a future campaign.
This feature is now available for use as an audience signal on Performance Max and is coming soon to Discovery, Video Action and App campaigns. The ability to reuse audiences will be expanding to more campaign types in the coming months, according to a tweet from Ginny Marvin, Google’s ad products liaison.
New terms. Google Ads is renaming some key terms in your audience report and throughout Google Ads. You may have seen this already in some accounts.
For example, Audience types (e.g., similar, custom, in-market, affinity) are now audience segments and Remarketing is now Your data. Here’s the full list of name changes:
New audience reporting. Google is consolidating audience reporting into a new Audiences tab. Located in the left-side navigation menu, you’ll find reporting about demographics, audience segments and exclusions. Google said this is a “simplified view” of all the same reporting features. This is another change you may have seen already in some accounts.
Why we care. Instead of rebuilding audiences manually in each campaign, new reusable audiences will allow advertisers to save time while keeping targeting consistent across campaigns. The experience should be similar to how custom segments act currently, where once an audience is created, it can be applied to any campaign instead of manually checking off types in each campaign. Changes to the audience segments will then be distributed to all campaigns targeting the audience segment.
Here’s the email from Google, shared on Twitter by @PPCGreg:
Google Lens results within Google Chrome on desktop will now be displayed on the right side of the same browser tab you are viewing. This is instead of the results opening up in a new tab or new window within Chrome.
How it works. Here are the steps to take on Chrome to see this yourself:
Open a page in Chrome.
Right-click on an image.
In the menu, choose “Search image with Google Lens.”
If you right-click anywhere outside an image, from the menu, you can choose “Search images with Google Lens.” After you click this option, you can drag to select an image.
Tip: Search results display on the right side of your screen. To display them in a new tab, click Open.
Here is a GIF of it in action:
Who can see it. Google said this feature is now rolling out to all Chrome users. Google said this is part of the search company’s “broader effort to help people search and access information in more natural and intuitive ways.”
Why we care. This may encourage searchers and Chrome users to search more visually using Google Lens. If your content is displayed in these results, there is a chance you might see more traffic to your site through this search feature. Either way, you should be aware of this new Chrome feature as a potential source of traffic to your site, and how useful it can be for learning more about images and objects.
Ready to upgrade your SEO knowledge at the largest, multilingual SEO event in 2022? This free online conference is happening in English, Spanish, and Portuguese.
Google is testing a new trial version of the Google News portal at news.google.com. It is a limited trial; I was able to bring it up only once in Safari private mode before losing it. The new home page is more visual, moves the navigation menu from the left side to the top and overall cleans up the look of the home page.
What it looks like. Here is a screenshot of the top of the page that I was able to screen capture when I saw the test – you can click on it to enlarge it:
Here is the bottom portion of the page where you can see the “Fact check” section. Again, you can click on it to enlarge it:
When will you see it. Again, this is just a test, a trial Google is running to see whether those in the test group respond positively to the new Google News design. Google is constantly testing new user interfaces across all its platforms, so this should come as no surprise.
Why we care. Whenever Google releases a new design or user interface in Google Search or Google News, it can impact your site’s visibility and clicks. So keep these user interface tests in mind when weighing the risks and rewards of future Google News interface changes.
Again, this is just a test – it is hard to know if and when this new design will go live.
Everyone’s talking about privacy. When Google announced the deprecation of third-party cookies in early 2020, privacy became a hot topic.
The loss of third-party cookies impacts all advertisers and is especially challenging for B2B marketers, who struggle to reach the right audience even with third-party cookies in play.
Let’s review how today’s privacy changes came about – and then look ahead to what it all means for marketers.
Get the daily newsletter search marketers rely on.
In the early days of the internet, it was the Wild West. No one cared about privacy.
As with anything new, consumers were enamored with going to a website, ordering whatever they wanted and having it show up at their door.
Sure, mail-order had been around for a long time. But it wasn’t exciting to fill out a form, write a check and send it in – only to wait 6-8 weeks for the order to arrive.
The internet changed buying habits forever.
It was possible to find and buy nearly anything online easily. Still, the internet also offered a treasure trove of user data that marketers could tap into for insights into buyer behavior.
Somewhere around the mid-2000s, retargeting was introduced.
I remember being at a search conference around 2005, watching a demo of a new technology that would dynamically serve ads based on users’ search activity and the websites they visited.
My mind was blown. Do you mean we can show different ads to different users based on things we know about them? Sign me up!
No one thought about privacy then either. We were so enamored with this new technology that we never gave privacy a thought.
Privacy becomes a thing
Fast forward to today.
Retargeting is everywhere. Everyone knows when they are being retargeted. And advertisers are often doing it poorly.
Every digital marketer can come up with a handful of bad retargeting they’ve experienced personally.
For me, a memorable one was just after I’d made an online reservation at a hotel for a business trip to Seattle. I was immediately bombarded with ads – from the same hotel I’d just booked, saying, “Book your trip to Seattle now!”
Come on.
I believe that lazy marketers are partly responsible for the privacy changes coming later this year. People are sick of poorly targeted ads that follow them incessantly.
How privacy affects B2B search marketing
B2B search marketing is challenging under any circumstances. Searchers don’t self-identify as B2B users when they perform a search.
And often, the keywords they use are the same keywords a consumer might use, even though each is looking for two different things.
Terms like “insurance,” “security” and even “design software” are vague. The searcher could be looking for services for themselves or their business.
That’s where third-party cookies came in.
Advertisers got excited when Google introduced audience targeting options like affinity and in-market audiences. Finally, a way to layer on audience signals based on search and browsing behavior!
However, audience targeting options are hopelessly consumer-focused. Here are Google’s current affinity segments:
See anything that looks remotely like B2B? Me neither.
In-market segments aren’t much better. Here’s one for Business Services:
The “Business Technology” category isn’t bad, but the others, such as “Business Printing & Document Services,” seem tailored to small businesses, not enterprises.
The death of third-party cookies
So what does all this have to do with privacy?
Targeting options like affinity audiences and in-market audiences are built from third-party cookies. Search engines use signals (e.g., which websites users visited) to compile the audiences.
Google has announced the deprecation of third-party cookies from Chrome within the next year.
In other words, most of these targeting options are going away soon.
First-party audiences to the rescue
First-party audiences are great for B2B. They remove many obstacles B2B advertisers face: consumer-focused targeting, or targeting that’s too broad for the business need.
But first-party audiences also pose challenges for B2B.
The biggest hurdle is creating the audiences in the first place.
To efficiently use first-party audiences, advertisers need some way to compile audience data, group users into cohorts and securely pass the data to advertising platforms like Google Ads and Bing Ads. Usually, this is done through a data management platform (DMP).
Advertisers who use a DMP have a relatively easy time using first-party audiences in their PPC campaigns. The DMP can be used to upload audiences directly to search engine platforms.
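Even without a full DMP, the upload-preparation step can be simple. Platforms like Google Ads Customer Match expect identifiers such as emails to be normalized (trimmed, lowercased) and SHA-256 hashed before upload. Here is a minimal Python sketch of that preparation step; the file paths and CSV layout are hypothetical assumptions, not any platform’s required format:

```python
import csv
import hashlib


def normalize_and_hash(email: str) -> str:
    """Customer Match-style preparation: trim whitespace, lowercase,
    then SHA-256 hash the identifier before it leaves your systems."""
    normalized = email.strip().lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()


def prepare_audience(in_path: str, out_path: str) -> int:
    """Read raw emails from a one-column CSV and write hashed values
    ready for upload to an ad platform. Returns the row count."""
    count = 0
    with open(in_path, newline="") as src, open(out_path, "w", newline="") as dst:
        writer = csv.writer(dst)
        writer.writerow(["hashed_email"])
        for row in csv.reader(src):
            if row and "@" in row[0]:  # skip blanks and obvious non-emails
                writer.writerow([normalize_and_hash(row[0])])
                count += 1
    return count
```

Hashing before upload means the raw email list never has to be shared with the ad platform in plain text, which matters in a privacy-first world.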
Unfortunately, even among our enterprise clients, surprisingly few have a good DMP setup. This means most advertisers are not able to use first-party audiences effectively.
And even for advertisers who do have a suitable DMP, we often find that the first-party audiences are too small to target.
Unlike e-commerce, B2B is a smaller universe. There aren’t as many people researching enterprise business software as there are people buying shoes on a given day.
There are even fewer people from companies with more than 5,000 employees researching ERP software for the enterprise.
See where I’m going with this?
Audiences that are too small to target aren’t much help.
Or are they?
Search engines use audiences as a signal for targeting ads. Think of an audience as a way to tell Google and Bing who you’re trying to reach.
One way to amplify the signal of a small first-party audience is by using similar audiences (also called lookalike audiences).
Similar audiences are often 2-10 times bigger than first-party audiences. Here’s an example:
The first-party audience only has about 5,000 members – it’s large enough to target but won’t drive much traffic.
But the similar audience has anywhere from 10,000 to 50,000 members for search and up to 1 million for display – a much larger reach.
Similar audiences are especially helpful for B2B, which tends to have a low audience match rate. We’ve seen strong performance from similar audiences for our B2B clients.
Let paid social help
Another way to create B2B audiences is to use paid social to inform paid search.
Paid social is usually used for upper-funnel activity – awareness and consideration. But we’ve used paid social to create audiences for paid search retargeting.
The great thing about paid social is that we know a lot about our target audience. We can target based on employer, job title, company size, education, skills and other factors that indicate the user is a good target for B2B.
Create a dedicated landing page for paid social traffic for your B2B audience targets and tag it for retargeting. Then target people who visited that page with Google Ads.
We’ve done this with YouTube videos too. People who watch a 30-60 minute keynote from a B2B conference make a great audience for follow-up with RLSA or display retargeting.
And don’t forget about LinkedIn targeting in Microsoft Ads. Being able to use LinkedIn profile attributes to target is a big differentiator for Microsoft Ads, and it’s especially useful for B2B advertisers.
Use micro-conversions as signals
Another way to create retargeting audiences is to use micro-conversions as signals for intent.
B2B has a long sales cycle – usually 12-18 months or longer. No one buys a six-figure business software system in a single visit with a credit card.
The process usually involves a lot of research, with multiple touchpoints along the way.
Users might follow these steps on the way to purchase:
Read an article
Download a whitepaper
Read an ebook
Request a demo
Sign up for a free trial
Contact sales
Purchase
Each of these actions represents a micro-conversion.
You could create audiences for people who downloaded a whitepaper. You could even segment this further by creating audiences based on the type or product of the whitepaper they downloaded if you’re selling multiple products or targeting multiple audiences.
Retarget users who downloaded a whitepaper with an offer for a free demo or trial. Then retarget users who signed up for a demo or trial, asking them to contact sales.
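As a sketch, the cohort logic behind this kind of retargeting could assign each user to the deepest funnel stage they have reached, so each cohort can receive the next-step offer. The event names below are hypothetical placeholders for whatever your own analytics tracks:

```python
# Funnel stages, shallowest to deepest, mirroring the steps above.
FUNNEL = [
    "read_article",
    "download_whitepaper",
    "read_ebook",
    "request_demo",
    "start_trial",
    "contact_sales",
    "purchase",
]
STAGE = {action: i for i, action in enumerate(FUNNEL)}


def build_cohorts(events):
    """Given (user_id, action) micro-conversion events, group users
    by the deepest funnel stage they reached. Each cohort becomes a
    retargeting audience for the next offer in the sequence."""
    deepest = {}
    for user_id, action in events:
        stage = STAGE.get(action)
        if stage is not None:
            deepest[user_id] = max(deepest.get(user_id, -1), stage)
    cohorts = {action: set() for action in FUNNEL}
    for user_id, stage in deepest.items():
        cohorts[FUNNEL[stage]].add(user_id)
    return cohorts
```

For example, a user who read an article and then downloaded a whitepaper lands only in the whitepaper cohort, so they see the demo offer rather than another top-of-funnel ad. Adding a second key (whitepaper topic, company size) would give the further segmentation described above.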
If you sell to multiple business sizes, you can also start to segment by small business vs. large enterprise based on the content they consumed.
Using first-party data is an investment – of time and money
The days of simply picking an audience based on in-market traits or affinity groups are numbered. Lazy marketing is soon to be a thing of the past.
Now is the time to start building your first-party audiences and think about your buyer journey.
Get serious about creating micro-conversions and paid social audiences to reach your target.