Ad Collision: Braking Before the Crash

The Search Agents Feed - Thu, 04/19/2018 - 11:21

Have you ever visited a site and witnessed the same ad displayed multiple times on the same page? If you’ve found yourself in this scenario, you’re experiencing ad collision. Ad collision is defined as a situation in which multiple ads from the same advertiser are served in separate ad slots on the same page without the intention of buying a roadblock.

Stroke of genius or a waste of media dollars? While ad collision may be a good strategy for short-term and high-impact campaign objectives, limiting it is the most efficient option for your campaigns. But how exactly do you curb ad collision?


Reduce Campaign Frequency


Frequency is the largest contributor to high levels of ad collision. When you allow a high frequency, the system will serve multiple impressions, even on the same page, as long as they stay within the established frequency caps. Setting a frequency cap is essential, whether it’s set on an hourly, daily, weekly or monthly time frame, although it’s recommended to narrow in on the smallest window. Account-level frequency caps may also be available, so check your UI or work with your platform representatives.
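The cap logic described above can be sketched in a few lines. This is a conceptual illustration only, not any platform's actual serving logic; the class and field names are invented for the example.

```python
from collections import defaultdict

# Illustrative frequency-cap check: the platform counts impressions per user
# within a rolling window and only serves another ad while under the cap.
# Impressions on the same page each count individually, which is why a loose
# cap (e.g. 10 per day) leaves plenty of room for ad collision.

class FrequencyCap:
    def __init__(self, max_impressions, window_hours):
        self.max_impressions = max_impressions
        self.window_hours = window_hours
        self.log = defaultdict(list)  # user_id -> impression timestamps (in hours)

    def can_serve(self, user_id, now_hours):
        window_start = now_hours - self.window_hours
        recent = [t for t in self.log[user_id] if t >= window_start]
        self.log[user_id] = recent  # drop impressions that aged out
        return len(recent) < self.max_impressions

    def record(self, user_id, now_hours):
        self.log[user_id].append(now_hours)

# A tight hourly cap blocks a third same-page impression; a daily cap would not.
cap = FrequencyCap(max_impressions=2, window_hours=1)
cap.record("user_a", 0.0)
cap.record("user_a", 0.1)
print(cap.can_serve("user_a", 0.2))  # False: cap already reached this hour
print(cap.can_serve("user_a", 1.5))  # True: earlier impressions aged out
```

Narrowing the window (hourly rather than daily) is what shrinks the opportunity for collisions on a single page view.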


Re-Evaluate Campaign Structure


Ad collision is often a sign of high audience overlap. Creating a hierarchy of audiences to prioritize, and then applying audience suppression, is key. For instance, your most valuable audience (e.g. a 1-percent similar audience) should be excluded from your second-most valuable audience (e.g. a 2-percent similar audience) to eliminate overlap and, in turn, limit ad collision.
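The suppression hierarchy amounts to a set difference at each tier. A minimal sketch, with hypothetical tier names and user IDs:

```python
# Illustrative audience suppression: each tier excludes every higher-value
# tier, so a user is only targeted by the single most valuable audience
# they belong to. Tier names and user IDs are invented for the example.

tiers = [
    ("similar_1pct", {"u1", "u2", "u3"}),            # most valuable
    ("similar_2pct", {"u2", "u3", "u4", "u5"}),
    ("similar_5pct", {"u3", "u5", "u6"}),
]

suppressed = set()
targeting = {}
for name, audience in tiers:
    targeting[name] = audience - suppressed  # exclude all higher tiers
    suppressed |= audience

print(sorted(targeting["similar_2pct"]))  # ['u4', 'u5']: u2/u3 stay in the 1% tier
print(sorted(targeting["similar_5pct"]))  # ['u6']
```

In practice you would apply the same logic through your platform's audience-exclusion settings rather than in code, but the set arithmetic is the same.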


Exclude High Ad Collision Offenders


Run a placement report to identify which sites are the largest ad collision offenders by observing frequency per site. If your platform does not have a site report available in its UI, ask your rep, since many platforms can pull a report on the back end. Often, you can force an ad collision experience by visiting high-frequency pages after visiting your client’s site (and thus entering the retargeting pool).


Severely Limit Creative Sizes


By “limit,” I really mean only run one creative size per campaign. This is not recommended, but it can serve as a last resort for a client that wants to significantly reduce the possibility of ad collision. Why does this work? Publishers typically have only one slot of a given size available per page, so by limiting the campaign to a single creative size, you almost entirely eliminate the possibility of ad collision. However, withholding creative sizes will also negatively impact performance due to the lack of variety in creative options. In addition, inventory will be impacted due to the reduction in bid-to-inventory matches.

Implementing these changes can reduce the occurrence of ad collision, which ultimately saves media budget and supports more efficient performance. Not only does the client benefit, but the end user also benefits from a less irritating experience.

The post Ad Collision: Braking Before the Crash appeared first on The Search Agency.

Discovering How Your Content Is (and Isn’t) Being Indexed

The Search Agents Feed - Tue, 04/17/2018 - 08:14

In the earlier days of SEO, things were pretty black and white: Google indexed your HTML content, and if you searched for that content, you’d see it in bold type within the search results; your JavaScript content, on the other hand, was completely invisible to search engines.

Things have changed a lot since then, to the point where Google is so confident about their ability to render JavaScript that they are no longer going to support the workaround they developed for crawling AJAX content. However, JavaScript crawling is still far from perfect. Even when Google is able to render the content, it can take significantly longer for the fully rendered content to be indexed.

Furthermore, even when your content is completely loaded within the on-page HTML, Google takes CSS and JavaScript into account, and some indexed content may be treated differently depending on how it is displayed on the page.

In the past, Google’s cache was a reliable way to verify how your content was being picked up, but these days, there are sometimes discrepancies between the cached and the actual indexed version of a page. For example, a text-only cache will only display content loaded within the HTML source.

In this article, we’ll go over different ways to determine if and how your content is being seen by Google.




  • Google’s “info:” operator
  • Google’s “site:” operator
  • Chrome’s DevTools (“Elements” panel)
    • Alternate: Firefox’s “View Selection Source” feature
  • The Fetch as Google tool in Google Search Console


Is Your Page in the Index?


The first thing to determine is whether your page is even indexed. Fortunately, this is easy. Google’s “info:” operator will give you details about a particular page. If the page is not indexed, no information will be displayed.

Simply type “info:” followed by the page’s URL. There should be no space between the colon and the start of the URL. The HTTP/HTTPS protocol is optional.
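Using a placeholder domain (substitute your own canonical URL), either of these forms works:

```
info:https://www.example.com/products/
info:www.example.com/products/
```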



Be sure to use the canonical version of the URL. If a page exists at multiple URLs, it’s most likely to be indexed at the URL referenced in the canonical tag.

If a URL isn’t indexed, nothing will be displayed.


There are several reasons a page may not be indexed, including:

  • Google hasn’t found/crawled it yet.
  • It is blocked in robots.txt.
  • It has a noindex tag.
    • When looking for a noindex tag on the page, be sure to use Chrome’s DevTools (“Elements” panel) to search within the code for the fully rendered DOM, rather than just the HTML source.
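The latter two causes can be confirmed directly. As a rough illustration (the paths and rules here are hypothetical), a blocking robots.txt rule and a noindex directive look like this:

```
# robots.txt: a Disallow rule covering the URL blocks crawling
User-agent: *
Disallow: /private/

<!-- In the rendered DOM (DevTools "Elements" panel), a noindex
     directive looks like this: -->
<meta name="robots" content="noindex">
```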



Is Your On-Page Content Indexed?


Once we’re certain a page is indexed, we’ll want to determine if Google is actually indexing all of the content on the page. The easiest way to do this is by using the “site:” operator and searching for a snippet of text from the page, in quotes. If only some of the content on your page is loaded via JavaScript, be sure to test a snippet of that content to verify that it’s indexed. Simply type “site:” followed by the page’s URL. Again, there should be no space between the colon and the start of the URL, and the HTTP/HTTPS protocol is optional. After the URL, include a space, followed by a short snippet of the content you want to verify is indexed.
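With a placeholder URL and snippet, the query takes this form:

```
site:www.example.com/services/ "a short snippet of on-page text"
```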

For example, we can verify that the main text on The Search Agency’s Search Engine Optimization page is properly indexed:



If the page being investigated doesn’t appear in these results, this indicates the content wasn’t indexed.


“Hidden” Content


You’ll notice the content we searched for in the above example is bolded in Google’s search result. This indicates that Google knows this content is immediately visible to users on the page. That is, they don’t have to click a “Read More” button or scroll through a carousel to view the content.

If Google doesn’t think your content is immediately visible to users, they’ll still index this content, and the page will still appear in the results for this type of search. However, the content may not appear in the snippet for the result.



This is important because research has shown that content that isn’t immediately visible to users isn’t weighted as heavily as content that is. See Moz’s article on CSS and JavaScript “hidden text” for more information.

That said, Google has stated this won’t be the case when mobile-first indexing rolls out. Per Google’s John Mueller, “So with the mobile-first indexing [we] will index the mobile version of the page. And on the mobile version of the page it can be that you have these kind of tabs and folders and things like that, which we will still treat as normal content on the page even. Even if it is hidden on the initial view.”


Is Google Technically Able to Render Your Content?


If the steps above show that some or all of the content on your pages isn’t being indexed, the next step is to find out if Google is technically able to render the content.


Fetch as Google


The easiest way to determine if Google is capable of rendering your content is to use the “Fetch as Google” tool available in Google Search Console. Just put in the URL of the page you want to check and click the “Fetch and Render” button.

It may take a minute to process, but once the render is complete, you can click on the URL to see an image of how Google was able to see the page. Here, you can verify that the content you want indexed appears in the screenshot.



Ideally, the screenshot under “This is how Googlebot saw the page” will match the screenshot under “This is how a visitor to your website would have seen the page.”

If the desired content isn’t included in Googlebot’s screenshot on the left, there may be an easy fix. Below the content shown in these screenshots, the tool tells you if any resources are being blocked by robots.txt. If any relevant JavaScript files are being blocked, this may prevent Google from being able to render the content. Any blocked scripts or CSS files with a medium or high severity should be unblocked to allow Google to fully render the page. Unblock and try again.

If there are no blocked resources and the “Fetch as Google” tool still doesn’t display your content, there may be other technical issues on the page, or Google may not understand the framework you’re using. Further investigation is necessary.

If the “Fetch as Google” tool shows that Google is able to see your content, and yet your content still isn’t being indexed, read on.


How Quickly Is Your Content Being Indexed?


If your page is indexed and your content is rendered within the HTML source, chances are that content will be indexed at the time the page is indexed. However, when it comes to JavaScript-loaded content, it’s inconsistent.

If a page is indexed and “Fetch as Google” shows that Google can see the content on the page, but the content isn’t being indexed, it may simply take more time (several days or more) for Google to fully render and index the page.

If you publish content frequently, you can get an idea of how many pieces of your recently published content are indexed (and which ones) by using the “site:” operator to do a search of your site, then selecting “Tools” and a span of time from the first drop-down that appears. For example, you can see all content indexed in the past week:



Any delays in getting your content indexed can mean lost traffic. If your content is particularly newsworthy or time sensitive, you may completely miss your chance to appear in the results for relevant searches.

If you’ve made your content fully accessible to Google (i.e. the page isn’t noindexed, and neither it nor any of its resources are blocked via robots.txt) and you continue to see inconsistent or delayed indexation of on-page content, there are other options, such as:

  • Rebuilding the site to render all crucial content server-side
  • Using a different JavaScript framework that may be more SEO-friendly
  • Using a pre-render service to create HTML snapshots of your JavaScript-dependent pages

The alternative is simply waiting for Google to get even better at crawling all JavaScript-rendered content, but no one knows how long that will take. Can you afford to wait?


4 Ways to Get the Most Out of Your Paid Search Campaigns

The Search Agents Feed - Wed, 04/11/2018 - 10:39

Keywords still reign supreme in paid search, but audience targeting can take your campaigns to the next level. While display and paid social offer the largest number of targeting options, search audiences have come a very long way and offer many practical features which can help you reach your marketing goals. Here are some options for your audience-targeting strategy.


1. Demographic Targeting


You can target by age, gender, parental status, household income (HHI) or combinations thereof. Some examples:

  • A travel and leisure company catering to an older, more affluent audience could increase bids for people over age 55 and in a higher income bracket.
  • A clothing store exclusively for young men could tailor ads specifically to them and have entirely different ads intended for the parents shopping for their sons.
  • Check out a great article by The Search Agency’s very own Ami Grant that dives further into demographic targeting.


2. Location/Device Targeting and Ad Scheduling


Location targeting helps you show your ads to customers in a selected geographic location. Device targeting allows you to adjust bids separately for computers, mobile devices and tablets. Ad scheduling allows you to control the day and/or hour your ads appear online based on when you’re there to handle customer inquiries. Combine all three elements into your targeting strategy as it applies to your business:

  • As a classic example, a pizza restaurant probably wants to show one ad to someone searching for “pizza” at 1 p.m. on their PC at work, and a different ad to someone searching for “pizza” at 8 p.m. on a smartphone a half-mile from the restaurant.
  • A financial services company sees the strongest conversion rates on Friday, for people searching on desktops in the LA and NY metros. Therefore, they should bid boost all three targets to get in front of these high-value customers.
  • Note: Multiple bid adjustments for device and location do not necessarily get multiplied together.
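To make the note above concrete: when adjustments do stack multiplicatively (as device, location and ad schedule adjustments commonly do, subject to platform-specific caps and exceptions), the effective bid is the product of the individual factors. A sketch with hypothetical numbers:

```python
# Illustrative bid-adjustment stacking, assuming multiplicative combination
# (device x location x ad schedule). Platforms may cap or override this,
# so treat the math as a rule of thumb, not a guarantee.

def effective_bid(base_bid, adjustments_pct):
    """adjustments_pct: percentage adjustments, e.g. +20 for +20%, -90 for -90%."""
    bid = base_bid
    for pct in adjustments_pct:
        bid *= 1 + pct / 100.0
    return round(bid, 2)

# Friday desktop searcher in the LA metro: +25% day, +20% device, +10% location
print(effective_bid(2.00, [25, 20, 10]))  # 3.3
# A -90% adjustment takes a $1.00 bid down to $0.10
print(effective_bid(1.00, [-90]))  # 0.1
```

Stacked boosts compound quickly, which is why it pays to model the combined effect before layering several adjustments on one campaign.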


3. Remarketing Lists for Search Ads


Tailor bids and ads for people who have previously visited your site when they’re searching online.

  • Segment your RLSA audience by membership duration (e.g. zero to seven days, eight to 30 days, 31 to 180 days) and taper down bid adjustments or create tailored ads depending on how long ago they visited. Test different incentives and promotions to drive conversions.
  • Segment your RLSA audience by website interaction (researchers vs. high-intent) and re-engage these visitors with a message that will get them to take the next step towards conversion.
  • If an advertiser’s goal is to attain new customers, they could refine their audience strategy by excluding converters from brand campaigns and increase bid adjustments for non-converters.


4. Similar and In-Market Audiences


Apply auto-generated similar audiences to find new and qualified consumers who have shared interests with your remarketing audiences. Opt into in-market audiences to reach potential customers while they’re actively browsing, researching or comparing the types of products you sell.

Within each of these different methods of targeting, you can implement bid adjustments ranging from a 90-percent decrease (or -100 percent to opt out of a device entirely) up to a 900-percent increase, based on where, when and how people search for your product or service. At the very least, you should be adding as many audiences as possible in an observational state to gather data and make informed decisions down the road. Based on this data, your next step would be to create a unique, customized strategy using the various targets. Throughout the entire process, there are some mistakes and pitfalls you should try to avoid when implementing your new audiences, lists and/or bid adjustments. Some of them are as follows:

  • Not Having a Strategy: Strategies should be specific to your overall business. Start with the basics by adding audiences to gather data and build out according to performance.
  • Be Aware of Settings: Incorrect targeting or observation settings can adversely impact performance. Use observation to gather data and guide further actions in your campaigns. Use targeting to narrow your reach to a specific audience you’ve deemed valuable.
  • List Size Matters: Extremely niche lists can be ineffective as the list size will be too small to generate or gather enough data to provide meaningful analysis. Also, if a list is too small and has low volume, it won’t be eligible to have a similar audience list created.
  • Set Relevant Membership Duration: Select a duration equal to the length of time you expect your ads to be relevant for your visitors. In general, the membership duration should be similar to the length of your sales cycle.


The Selfie Video: Understanding the Draw of User-Generated Content

The Search Agents Feed - Mon, 04/09/2018 - 10:01

Most big companies and brands can afford to produce high-end, engaging video content to feature their products and services over all their social media platforms, website and on television. But with the expansion of video service apps and social media platforms offering video recording and hosting services, consumers are gravitating more towards brands that adopt these services to communicate with them, similar to how consumers communicate with their friends and loved ones. They want more personal connections with the companies they purchase products or services from, desiring the intimacy and familiarity of user-generated content — even if it means the content isn’t always user-generated.

Trending in social media today is the self-recorded video, made popular by video messaging app Snapchat. Users take on an evolved form of the picture selfie and record short videos that they send to their friends on the app. These short videos can disappear as quickly as they are viewed by the recipient, can play on repeat until the recipient closes them or can play in loop along with other videos taken within a twenty-four-hour period.



According to a study done by Hootsuite, over 100 million people in the U.S. and 200 million people worldwide use Snapchat and watch over 10 billion videos daily, spending an average of about 30 minutes a day on the app. Its largest demographic is 18 to 24-year-olds at 37 percent, but 25 to 34-year-olds make up about 26 percent of its userbase.

Once its popularity skyrocketed, Facebook, Instagram and even LinkedIn began offering similar video services. With features like “Days” on Facebook and “Stories” on Instagram, users can document their everyday lives and temporarily share their experiences with friends and the world simply by self-recording on their phones, tablets or computers. For direct communication, more and more millennials and even Generation X prefer to communicate through FaceTime, a feature offered by Apple’s iOS, instead of traditional phone calls or texting.


Traditionally, FaceTime was most popular with older users, such as grandparents wanting to see grandchildren and parents wanting to speak with their kids while away. In recent years, the 18 to 24-year-old demographic has taken to FaceTiming on a more regular basis.


Even in primetime television, we see more and more shows incorporating the use of self-recording devices in order to have a more personal and down-to-earth appeal and to emphasize that they are aware of social media trends.


America’s Next Top Model creator Tyra Banks records a “Tyra Mail” (a segment in which she gives clues about upcoming challenges) on a mobile device and plays it afterwards for the show’s contestants.


There used to be an age where research meant looking for articles in printed publications or essays in books — then it quickly evolved into “let me Google that” or “I’ll just look it up on Wikipedia.” Today, more and more people have become visual learners, so the new way to research something is, “I don’t know what that is but let me YouTube it and I’ll get back to you.” Consumers now look to YouTube personalities and social media videos (sometimes) with celebrities for makeup tips, DIY ideas, tech help, cooking recipes, reviews and even entertainment. The majority of this content is self-recorded using minimal resources, like a phone or laptop computer.


Lull, a boxed mattress company, used many videos created by their customers in order to push a sales campaign. Boxed mattress companies boomed in the past couple years and had great success with millennials with the ease of their online shop and ship features, but it was the user-generated content of real people unboxing their mattresses that gave a few companies an edge over the competition.


Some companies have caught on to the trend, and while some don’t actually use a phone, tablet or computer to record their videos, the attempt is to mimic the look and feel of self-recorded content in order to establish a personal connection. This type of media gives consumers the impression that they are being spoken to — not at — and gives the brand an air of sincerity that today’s consumer culture looks for.


Considering the personal dating app revolves around personal stories and personality, it only makes sense that they would use self-recorded content in their advertising. So far, though, they are the only dating app to do so. Other apps and services similar to them use the “man on the street” or “talking head” interview styles which consumers see quite a bit of already.


Transparency and personality are the new trends that brands are finding success with. By utilizing the self-record form of media and user-generated content, they are building much stronger connections with their consumer base.


Demographic Targeting for Search: Reaching the Right Audience with the Right Ad

The Search Agents Feed - Wed, 04/04/2018 - 09:18

Ever wanted the ability to customize ad copy or landing pages based on a visitor’s age or gender? Google’s demographic targeting for search ads allows you to do just that. This has long been a feature available to all advertisers running campaigns on the Google Display Network, but is now available to all advertisers on Google’s search network. As long as the necessary AdWords conversion pixel is placed on your site, enabling this feature in an AdWords account gains marketers access to click-through and conversion rates at the age and gender level.

This new feature provides the option to adjust bids up or down and tailor ad copy and landing pages based on a visitor’s demographic. This is powerful, especially for advertisers who are looking to better target their core audience or grow a new audience.

We decided to try out this new feature for an established advertiser to better understand ad copy and site performance based on different age groups. We knew their demographic skewed older based on third-party analytics data, but now with the new AdWords feature, we have the ability to see how those visitors actually performed from a click-through and conversion rate perspective. We also knew from the same third-party data that their competitors skewed more towards the millennial age group (18-34 years old), and this is the exact audience the advertiser wanted to capture more of.

After a few weeks of gathering data, we found that while the older age group drove the highest CTRs, they actually drove the lowest conversion rates when compared to the other age groups. A bit of a shocker. On the flip side, the millennials had the lowest CTRs and the highest conversion rates of all the age groups. Not a complete surprise.

Since the millennials were an audience we wanted to capture and grow, we tested the theory that it’s possible to increase click-through rates for this age group by customizing ad copy to their needs and interests. Knowing that millennials are a social media and technology-savvy demographic, we tested language in ad copy that incorporated this theme.

The results of the ad copy test were quite impressive. Click-through rates increased 28 percent for the millennial age group for the new ad copy compared to the original ad copy. These results highlight the fact that a cookie cutter creative strategy will not always capture an intended audience. While ad copy testing and optimization is important for improving click-through rates, customizing copy based on demographics is even more crucial to capturing the audience you want.

As marketers, the more tools we can harness to gain insights into demographic performance, the better equipped we’ll be at targeting and growing the audience we want.


Step-by-Step Guide


1. Mirror out the campaign(s) you’d like to create custom copy for based on the demo group you’d like to target. This option exists for both age and gender.
2. In the mirrored campaign(s), add only the demo groups you want to target, excluding all other demo groups. In this example, target the 18-34 age group only.
3. Be sure to exclude the target demo group from the original campaign(s) — exclude the 18-34 age group so the original and mirrored campaigns are not competing.
4. Create custom messaging by calling out site features, benefits, unique selling propositions, etc. that appeal to the demo you’d like to capture.
5. Continue to run the winning ads from the original campaign until there’s enough data from the new custom ads to determine ad performance.
6. Review ad performance after two weeks to see which ad resonates with your target demo and continue to iterate based on performance.


How Facebook’s New User Privacy Changes Will Impact Your Current Advertising

The Search Agents Feed - Mon, 04/02/2018 - 14:57

Over the last week, Facebook has been aggressively addressing user privacy concerns that surfaced during an investigation into Cambridge Analytica, a UK-based data firm, and its use of data to identify and advertise to potential swing voters. In a sprint to recover from this discovery, Facebook has launched a platform investigation and a bug bounty program, rewarding users who report abuses of data by app developers.

While the investigation and problems within the Cambridge Analytica case had more to do with the mishandling of data by app developers, Facebook’s response has stretched beyond app integrations to begin addressing larger industry concerns about user privacy.


Short-Term Impact on Active Campaigns


Custom Audience Size and Insights Removal


On Monday, Mar. 26, Facebook removed audience insights reporting for custom audiences and started removing custom audience size estimates from audience management views. While some audience sizes are still visible within the UI, custom audience insights were removed starting on Mar. 23.

Facebook Custom Audiences are generated from first-party audience data, either by website pixel data or from imported offline records matched to desired Facebook user IDs. Once these audiences were created, Facebook offered audience size and insight estimates to better understand trends within that targeted segment. Some profile trends included in-market behaviors, purchase behaviors, age, gender, household income and other demographic profile points.

Facebook has not said whether this will be permanently removed, but is actively working on solutions like the recently announced “Custom Audiences Certification Tool” to prevent disreputable advertisers from accumulating personal information of new audiences from data brokers to upload as their own.


Partner Audience Removal


Another announcement came later in the week that really piqued advertisers’ attention: Facebook plans to remove advertisers’ ability to target against all Partner Data segments (usually referred to as third-party data). Rolled out in 2013, this feature will be sunsetted in 2018, with the rollback scheduled over the next six months.

Targeting against Partner Data segments allowed advertisers to target based on third-party data from well-known data brands like Acxiom, Epsilon and Datalogix. These third-party audience segments within the Facebook ecosystem expanded a marketer’s targeting options and allowed greater integration across channels like display, video and native, where this third-party targeting has been a standard practice for years.

The Partner Audience rollback will start with campaigns targeting audiences built from the UK, Germany and France, reflecting the EU’s General Data Protection Regulation (GDPR) legislation. Those campaigns will continue to run until May 24. Within non-EU geographies, campaigns can be built or edited up until Jun. 30 and run until Sep. 30. Starting Oct. 1, Partner Categories will no longer be a targeting option in the platform.


Necessary Action


Unless you are advertising in the EU, the planned changes will not be fully rolled out until Oct. 1, roughly six months away, so work with your campaign manager to understand which third-party audiences are being leveraged and how much of your Facebook budget taps into each segment. Understanding this will help mitigate performance fluctuations once the removals begin to impact Facebook campaigns.


Long-Term Impact


As these changes take hold, marketers and brands are waiting to see whether this disruption within Facebook will set a precedent for all other media publishers and impact privacy laws. Does this large shift within one of the largest social platforms signal the end to the abundant collection and sharing capabilities of third-party data companies that have made digital marketing so effective for brands and marketing professionals? Will these third-party data companies and advertising platforms be able to work within the new guidelines to continue offering brands a highly-relevant, seamless engagement experience to connect with their most valuable audiences?


Going Forward with Facebook


Marketers who have relied heavily on partner data will need to get creative within the Facebook targeting ecosystem in order to reach the users they care about. As long as users continue to share interests, life events, behaviors, etc. within Facebook, there will be many valuable targeting segments available to brands and marketers. We anticipate Facebook will continue to evolve their core targeting product, as they have over the past several years.

The Search Agency will continue to drive Facebook performance for our clients by developing audience-centric social strategies to leverage the evolving segments and ad formats that offer the most relevant and engaging experiences within the Facebook auction. In addition, we will continue to analyze trends in media cost and user engagement to understand how large industry targeting changes — like this massive Facebook feature change — will impact overall ad performance for all brands within the digital marketing ecosystem.


The Buddy System: A Multi-Brand Digital Marketing Approach

The Search Agents Feed - Thu, 03/29/2018 - 08:51

In a digital world where brand presence is key, controlling Google search results is important. Brands regularly fight to maintain above-the-fold positions on highly competitive non-brand search results. Showing up in position one on a core non-brand term for your business or vertical can be the difference between growing or maintaining a successful brand in a competitive space and falling victim to competitors who can do it better and more efficiently. Many advertisers spend a lot of time and money trying to figure out what works best for their brand and how to take advantage of as much real estate as possible when it counts.

There are some brands that are lucky enough to share space in their vertical with a sister brand, whether through an acquisition or re-alignment internally that produced two brands. Those that are fortunate enough to have multiple brands in the same space can take advantage of building SERP dominance through a multi-brand bidding approach: the Buddy System.

The Buddy System can help brands dominate spaces that are essential for business growth. Several clients that I’ve had the opportunity to work with have gone down this path. This can be a terrific way to increase awareness of a smaller brand, push pesky competitors down the SERP when they try to usurp an industry leader’s top position or assist in conquesting an area that is important to the brands’ business. This system has generally been used by larger brands that have brand capital and are either leveraging it to assist a new brand or pushing for overall dominance in the space. When using the Buddy System, it’s important to be able to look at multiple brands in aggregate. Items sold or leads generated need to be observed in aggregate between the two brands because they’re helping each other capture the majority of space in the search results. Brand B may be spending more to push down a competitor while receiving fewer leads. Its portion of user acquisition shouldn’t be looked at alone, as its primary responsibility is to push down competitors and ensure users focus only on the company’s brands (Brand A and/or B).

There are two general paths to go down in these scenarios with the Buddy System, the first being positional conquesting. This is when brands come together to take over a space and refuse to let their competitors in. It can be an effective way of building up brand recognition or ensuring the company maintains its stronghold of the space. The second path, target bidding, may not grab headlines during a strategic meeting, but it has its place, particularly with budget-minded companies.

Positional conquesting works well with a positional bidding strategy. Each brand has its own AdWords account and can therefore bid on the same terms. Brand A, usually the larger brand, will take its position at the top and Brand B, usually the newer or lesser known brand, will take its position just below. There is an art form to getting this correct without increasing your overall CPCs too much and creating too much competition. This strategy will inherently increase CPCs, impression shares, overall volume of impressions, clicks, spend and conversions, as well as CPLs across all your brands to some degree. You aren’t bidding for efficiency in this strategy, but strictly to take up as much of the SERP as possible.

Target bidding, by contrast, pairs with a target-based bid strategy. Each brand sets the cost it’s willing to pay for an effective lead and bids independently toward that goal. This system has a lesser effect on CPCs, as competition between the brands is minimized by reduced bids that keep the effective target within range. This, too, becomes an art form, as volume can drop in both accounts if the strategies aren’t leveraged correctly. Impression share can decrease along with volume. However, in aggregate, there should be more overall leads funneled in at an efficient rate. If a brand is too close to the edge of visibility (a lesser-known brand usually showing up in position three or four), this strategy can knock it off the SERP if not utilized correctly.

The preferred method always depends on the brand and goals of the account. It’s extremely important before implementing the Buddy System that all parties involved understand that this is a strategy that needs to be looked at in aggregate, particularly when dealing with positional conquesting. If all parties are aligned, the Buddy System can be an extremely effective way of utilizing a multi-brand approach to your digital marketing.

The post The Buddy System: A Multi-Brand Digital Marketing Approach appeared first on The Search Agency.

Picking a Tag Management Solution: Google Tag Manager vs. Tealium

The Search Agents Feed - Tue, 03/27/2018 - 10:19

A tag management solution (TMS) allows both marketers and developers to implement and update tracking tags through a user interface rather than by hard-coding scripts individually. Within the TMS interface, tags can be set up with custom variables and custom triggers, making future tag updates quicker and easier, as well as providing high-level visibility into the current overall tag setup.

In the early days of tracking script implementation, every tag from every vendor would need to be hard-coded individually within the site code. Many tags, such as Google Analytics and the Facebook universal pixel, needed to be placed on all pages, in addition to having separate events pushed for various KPIs. These hard-coded scripts would need to be placed in the site code by the software development team and would then get added to the next code push before being made live. This created a bottleneck and resulted in loss of tracking while waiting for new tags or updates to be implemented and pushed live. A tag management solution solves these issues and allows updates to be implemented by marketers and published independently from a code push.

This article will look at two tag managers that I encounter most frequently: Google Tag Manager and Tealium. I will highlight a few pros and cons of each solution to help you decide which tool is the best fit for you.


Google Tag Manager



Google Tag Manager is a free tag management solution that only requires a Google account to get started. It’s the most commonly used TMS due to its no-cost aspect and offers tag templates for Google’s own tracking tags, in addition to a few other common tags. The three main sections within the user interface are “Variables,” “Triggers” and “Tags.” The “Variables” section is where data is pulled in from the website, such as page URL or click text for a button, which are built-in variables. Custom variables, such as a Purchase Transaction ID, can also be pulled in from a variety of sources, most commonly from the data layer, URL fragment or first-party cookie.

“Triggers” define when the tag will fire and can be set up based on specific page URLs, custom events or form submissions. Once a trigger is configured, it can be applied to all tags that need to fire at that same point, which saves setup time. The “Tags” section is where the actual script is placed and configured. For templated tags, there are configuration options that allow you to easily customize how that script fires, such as adding Custom Dimensions to a Google Analytics script. Google Tag Manager also has the ability to add custom HTML scripts for any tags that do not have templates. The custom HTML option also allows for more complex implementations, like running JavaScript A/B landing page tests. However, these types of setups typically require advanced programming knowledge.





Tealium


Tealium is an enterprise-level TMS with advanced implementation features and dedicated technical support. One of the main advantages of Tealium is its robust offering of over 1,000 tag templates, which is always expanding as additional tags are released. Another perk to Tealium is its “CMS Provider Bundle” feature, which will add key information such as page name and page description automatically to the data layer after the user provides information on the CMS used to build the site.

Tealium also has a built-in jQuery Selector to track jQuery events on-site, without the need for custom scripting. These are just a few highlights of the advanced offerings of Tealium. The main sections within the user interface are “Data Layer,” “Load Rules,” “Tags” and “Extensions.” These sections correspond closely to the Google Tag Manager sections described earlier, with the key difference being the presence of the “Extensions” section. The Extensions section is where advanced features like variable manipulation, currency conversion, modal overlay insertion and cryptographic hashing can be configured without having to write out complex code.


Google Tag Manager vs. Tealium


Google Tag Manager and Tealium offer similar functionality in terms of getting popular tags set up and recording data. The two systems also have subtle nuances; for example, in GTM, multiple triggers can be added to one tag and the tag will fire if any of the triggers are met, whereas with Tealium, if two separate load rules are added for one tag, both load rules must be met in order for the tag to fire. This results in an occasional need to create separate load rules for specific tags in Tealium. These types of differences are highly subjective and the preference for one TMS over the other typically lands with whichever system a user has the most familiarity with. I’m most familiar with GTM and, in my experience, it’s easier to navigate, with all areas in the UI offering a search bar at the top. I also find it much easier to implement custom scripts that don’t have a pre-defined template within Google Tag Manager.
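
The difference in firing logic described above can be sketched conceptually: a GTM tag fires if any of its triggers match, while a Tealium tag fires only if all of its load rules match. The functions below are an illustration of that boolean difference, not part of either platform's API.

```python
# Conceptual sketch: GTM evaluates triggers with OR semantics,
# Tealium evaluates load rules with AND semantics.

def gtm_tag_fires(triggers, page):
    # GTM: the tag fires if ANY attached trigger matches
    return any(trigger(page) for trigger in triggers)

def tealium_tag_fires(load_rules, page):
    # Tealium: the tag fires only if ALL attached load rules match
    return all(rule(page) for rule in load_rules)

is_checkout = lambda page: page["url"].startswith("/checkout")
is_mobile = lambda page: page["device"] == "mobile"

page = {"url": "/checkout/step-1", "device": "desktop"}
# GTM fires (the checkout trigger matches); Tealium does not
# (the mobile rule fails), hence the occasional need for
# separate load rules per tag in Tealium.
```

This is why attaching two conditions to one tag behaves differently across the two systems even though the configuration looks superficially similar.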

Conversely, Tealium categorizes by labels and doesn’t offer a filtered results list in the “Tags” section of the UI. Tealium also requires the custom template to be modified if a tag that isn’t included in the list of tag templates needs to be added. The Tealium custom template is 121 lines of code with three main sections that need to be edited to add the custom script, resulting in a much longer implementation time when compared to GTM. Where Tealium shines, though, is its catalog of over 1,000 tag templates, so it’s only rarely necessary to add a custom script. Google Tag Manager only has about 80 tag templates, so you will need to utilize its custom HTML container more often. Fortunately, the custom HTML container in GTM is very user-friendly.

Another aspect that GTM makes easier than Tealium is its solution to the Safari “Intelligent Tracking Prevention” update, which requires no change to the existing AdWords and DoubleClick templated tags and just requires one additional “Conversion Linker” tag to be added to all pages. In Tealium, all previously-existing DoubleClick and AdWords tags need to be recreated with the “Global Tag” template. From there, the tag needs to be configured to fire on all pages and conversion events need to be mapped in the “Data Mappings” section using UDO variables with the value acting as a load rule utilizing the “Data Value” extension. This gets very confusing, especially when revisiting the tag after initial implementation, since the UDO value acting as the load rule is viewed within the “Extensions” screen, causing a lot of back and forth navigation to get a handle on where the event is firing.

One other feature that I really like about Google Tag Manager is that it auto-saves all updates in the “Preview” workspace. Tealium requires updates to be saved manually and has an auto sign-out feature after a period of inactivity. There have been times when I’ve been working on a Tealium implementation and had to switch to an urgent project for a few hours and, upon returning to Tealium, realized I had been signed out and my changes weren’t saved. A big advantage Tealium has over GTM is access to support representatives because it’s a paid service. Tealium definitely has a place for users that want access to support representatives or would utilize the advanced features of the “Extensions” section; however, I rarely find the need for the advanced features in my day-to-day use.

The advancement of tag management technology makes the lives of marketers and developers far easier. Whether you choose Google Tag Manager, Tealium or another TMS, having a central platform to view and manage all tags across a site is far more efficient and beneficial than individually hard-coding them. I would highly suggest implementing a TMS if you are still hard-coding marketing tags on your website.

The post Picking a Tag Management Solution: Google Tag Manager vs. Tealium appeared first on The Search Agency.

Product Listing Ads 101: Getting Started

The Search Agents Feed - Wed, 03/21/2018 - 09:08

Product listing ads are a critical component of any successful e-commerce strategy. PLAs help catapult your products to the top of the SERP – even above text ads – and give you an opportunity to put your best foot forward when users search for your products, as well as a chance to immediately outshine your competition with great prices and special offers. PLAs can take the lead as your main non-brand campaign, providing the most relevant results for each user query while displaying a title, price and image — something expanded text ads simply can’t do. Furthermore, more users than ever before are engaging with Google Shopping – a trend that has been increasing steadily since its launch.



Launching product listing ads is fairly simple; doing them well takes some finesse. I’m going to walk through how to launch PLAs, as well as some industry best practices to help ensure you can generate the best results possible with them.


Launching Product Listing Ads


There are four steps required to launch product listing ads:

1. Create a Merchant Center account and verify site ownership.
2. Link your Google AdWords account to your Google Merchant Center account.
3. Import your data feed to Merchant Center.
4. Create your Shopping campaigns.

Let’s look at all of these steps in greater detail.


Creating and Verifying Your Merchant Center


The very first thing you’ll need to do is create a Merchant Center account and verify you own the site that you’re advertising for. Navigate to the Google Merchant Center site and log in or sign up with a Gmail account. After doing so, you’ll be prompted to submit your business information, like the country in which your business is located, the name of your store and your web domain.



After doing so, read and agree to the terms and conditions… or just agree to them. Lastly, you’ll have to verify you own the store for which you’re creating a merchant account. This is very simple and can be done a few different ways. One is to upload a small HTML file Google provides to your site and click “verify” under the Merchant Account settings. If you use this method, it’s important not to remove the file from your site: Should it go missing, Google will de-certify your Merchant Center and you’ll have to go through this step again. There are a few other options, like placing an HTML tag on your site’s homepage, configuring permissions through your Google Analytics account or using Google Tag Manager. All of these methods are simple and should only take a few minutes.


Linking AdWords to Your Merchant Center


After you’ve created your Merchant Center account, you’ll have to link it to your AdWords account. This is done from within the Merchant Center. Click the three vertical dots in the top right corner of the Merchant Center UI and select “Account Linking.”



Once there, select “Link AdWords Account” and input the ten-digit customer ID for the respective AdWords account, which can be found in the top left corner of your AdWords account. The request will go through to the AdWords account, where you can accept it from the “Settings > Linked Accounts” section in AdWords. Click “View Details” under “Google Merchant Center” and accept the link request. Voilà! You’ve now linked your Merchant Center account to your AdWords account and are ready to upload your products and build your shopping campaigns.


Import Your Data Feed into Google Merchant Center


The Merchant Center is the “brain” of your Product Listing Ads campaigns. Not only does it support PLAs, but your feed can also be used to support dynamic remarketing, another powerful tool offered by both Google and Bing. Uploading your feed to the Merchant Center itself is very easy, but a quick word of caution: Ensuring that your data meets Google’s specifications is paramount. If your feed columns are labeled incorrectly, if you violate character limits or if you don’t include key information such as “Google Product Category,” your feed will be disapproved. There are a host of services that can support this process, from manually editing the feed file in Google Sheets (not recommended for sellers with thousands of products) to using a third-party data feed service. For detailed information, we recommend you review Google’s Product Feed Specifications page.
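
Before uploading, it can help to run a quick sanity check over your feed. The sketch below assumes each feed row has been parsed into a dict; the required fields and the 150-character title limit reflect Google's product data specification as commonly documented, but verify current requirements against the spec itself before relying on them.

```python
# Minimal pre-upload feed check (illustrative; confirm field names and
# limits against Google's Product Feed Specifications page).
REQUIRED_FIELDS = {"id", "title", "description", "link", "image_link",
                   "price", "google_product_category"}
TITLE_LIMIT = 150  # characters, per the product data spec

def validate_feed_row(row):
    """Return a list of problems found in one feed row (a dict)."""
    problems = []
    missing = REQUIRED_FIELDS - set(row)
    if missing:
        problems.append(f"missing fields: {sorted(missing)}")
    if len(row.get("title", "")) > TITLE_LIMIT:
        problems.append(f"title exceeds {TITLE_LIMIT} characters")
    return problems
```

Running a check like this over every row before a scheduled fetch or upload catches the labeling and character-limit issues that most commonly trigger disapprovals.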

Pushing your feed to the Merchant Center is simple: On the left-hand navigation, select “Products,” then “Feeds.” From there, click the big “plus” sign to create a new feed. Select your target country and language, and make sure you have “Shopping” checked under “Destinations.” Next, name your feed and select how your feed will be connected to the Merchant Center.



This is what I was talking about in the opening paragraph of this section: If you store your data on a Google Sheet, select that option and continue. We suggest using either a scheduled fetch, which allows Google to grab the file from a URL, or “Upload,” which allows you to push your file to the Merchant Center. Follow the remaining instructions and you’ll be ready to upload your feed. We also recommend doing a forced upload after you set everything up so that your data is present in the Merchant Center right away; otherwise, the data will be pulled in based on the schedule you set. Allow a few business days for Google to review the feed, at which time your products will be viewable in AdWords.


Create Your Shopping Campaigns


Now that the technical steps have been taken, all that’s left to do is to create your shopping campaign, segment your products and set your bids. Anyone familiar with creating campaigns in AdWords will easily be able to set up a Google Shopping campaign. In AdWords, click the big “plus” sign and select “Shopping.” Select “Sales” for some pre-canned settings that are generally optimized for ROI, like automated bidding. These features can be changed at will within your campaign settings while you continue to configure your campaign, so don’t worry about being locked in by any of these default options.



If you’ve successfully linked your AdWords and Merchant Center accounts together, you should see your Merchant Center account appear below. Select your country-of-sale and configure the settings that appear on the next page to your preferences. Your campaign is now ready to go!


Tips and Best Practices


If you’ve followed along to this point, your shopping campaign is ready to deliver ads to users. However, we don’t want to just stop here; there are some tips and best practices that we want to share with you to help you get the best results possible.


1. Segment your product groups


The default setting is going to be “All products,” but you don’t want to stick with this setting. Just like you wouldn’t want to use the same bid on all your keywords, you’re not going to want to use the same bid on all your products. We suggest segmenting in a way that mirrors the organization of your website, down to specific product categories. You may be tempted to segment down to the product ID; however, we’ve found that segmenting beyond product categories tends to limit impressions. You can use your discretion here based on your goals.


2. Break out brand and non-brand campaigns


Just like with standard search campaigns, you want to make sure you’re maximizing your traffic when users do branded searches for your products. Creating a separate campaign for brand will ensure you’re budgeting for all your brand traffic and not losing valuable impressions. This can be done easily enough by creating two shopping campaigns and adding your brand terms as negative keywords to the non-brand campaign. Then, set the campaign priority (a unique shopping campaign setting) to “low” for your brand campaign; this should help force non-branded queries to the appropriate non-brand campaign. This helps you budget and bid appropriately for brand and non-brand searches, which likely deserve their own bids.
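
The routing described above can be modeled as a toy function: brand terms are negatives in the non-brand campaign, and the brand campaign's "low" priority keeps it from winning generic queries. The brand name below is hypothetical, and this is a simplified conceptual model, not how Google's auction is actually implemented.

```python
# Toy model of brand/non-brand Shopping campaign routing.
BRAND_NEGATIVES = {"acme"}  # hypothetical brand terms, added as
                            # negative keywords to the non-brand campaign

def serving_campaign(query):
    words = set(query.lower().split())
    if words & BRAND_NEGATIVES:
        # negatives block the non-brand campaign, so only brand is eligible
        return "brand"
    # both campaigns are eligible, but the brand campaign's "low"
    # priority defers generic queries to the non-brand campaign
    return "non-brand"
```

The point of the two-campaign setup is exactly this partition: every query lands in one campaign or the other, so each can carry its own budget and bids.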


3. Optimize your data feed


This is worthy of its own post, but we can give you some immediate quick tips. Include the brand in your product title so that your PLAs take advantage of your brand power. Use the most specific Google Product Category possible – Google looks closely at this setting to determine what type of product your goods are. Avoid the temptation to use the more general categories; being specific should help with your match rates and CTRs. Use custom labels: custom fields that let you further categorize your products based on marketing goals, like ROI or price range.
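
As an illustration of a price-range custom label, a feed pipeline might bucket each product before upload. The bucket names and thresholds here are hypothetical, not part of any specification.

```python
# Illustrative only: derive a price-range custom label per product.
def price_range_label(price):
    if price < 25:
        return "budget"
    if price < 100:
        return "mid-range"
    return "premium"

# e.g. attach the result as custom_label_0 in the feed
labels = {sku: price_range_label(price) for sku, price in
          {"sku-1": 9.99, "sku-2": 49.99, "sku-3": 250.0}.items()}
```

Once the label is in the feed, you can subdivide product groups and bid by it in AdWords, which is the whole point of custom labels.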


4. Review the Merchant Center regularly!


The MC will report on any warnings, errors or violations that your feed may be triggering. The majority of these are easy to fix – they simply require awareness to catch them. The diagnostics tab will lay out any and all issues you might have so that you can get to work correcting them. Doing this will ensure all of your products remain active and your data quality score with Google will remain high, both of which will help generate better results.

I hope this tutorial helps you get started on Google Shopping. PLAs are among the most high-converting and profitable tools available to advertisers. More users than ever continue to use these features on Google to find the best deals on goods. What’s more, once you’re set up on Google, you can easily add a Bing Ads product feed as well and continue to expand your reach to new users who are searching for your products.

The post Product Listing Ads 101: Getting Started appeared first on The Search Agency.

5 Life Lessons I Learned from AdWords

The Search Agents Feed - Mon, 03/19/2018 - 08:04

Google AdWords isn’t just the game-changing online advertising platform we see it as. If you spend years working with it, you’ll realize that there’s more to it than just clicks, conversions and cost. Here are a few life lessons you can take away from the not-so-ordinary advertising monarch.


1. Fear is an illusion


AdWords, as a platform, pushes you to experiment. The launch of campaign drafts and experiments is a case in point. Unlike traditional advertising, where you fear a cost-intensive experiment failing, AdWords allows you to test and learn without wasting too much money. In the case of drafts and experiments, you can measure and compare results of your original and experiment campaigns and apply changes accordingly. Therefore, there’s less to fear in AdWords when compared to traditional advertising.


2. Don’t judge a book by its cover


Reporting and data analysis run deep in AdWords. A smart AdWords strategist is someone who digs into this data instead of taking the numbers at face value. For instance, at the top level, a campaign may not be meeting its CPA target. However, through data analysis, you may find specific keywords, days of the week or even audiences that are underperforming. AdWords provides so many layers of data in its reports section that simply ignoring them is a pitfall.


3. Appearances matter


Clients love pretty reports — and they love visuals that make data easy to understand. Agencies used to invest significant resources in creating visually appealing reports, but AdWords’s Report Editor makes this much easier. With the Report Editor, you can effortlessly pull data into charts, graphs and tables and drag-and-drop dashboards to fit your aesthetic needs. You can even sort data and apply filters as needed.


4. You aren’t always right


AdWords teaches you that one mantra does not work for every industry. An FMCG brand that assumed its audience comprised only young, urban males was in for a surprise when its demographic data was analyzed. A good portion of its audience included females, as well as people 50 years old and older. Excluding such audiences while targeting ads would have led to lost opportunities.

An ad that worked for a previous real estate client of yours may not work for another. Keyword insertions may work for some, but not for others. Therefore, what we assume to be true for all cases may not always be right.


5. Change is the only constant


Constant changes in AdWords sometimes draw the ire of many professionals in our industry. When AdWords launched in 2000, ads ran only in the right sidebar of SERPs, but soon enough, the mainline area was consumed with ads, too. It’s 2018 and we’ve now moved away from sidebar ads completely, reducing the number of ads that can appear on SERPs from as many as 11 to a maximum of seven. The changes in device bidding are also known to all. Staying on top of these constant changes is perhaps the primary criterion of an efficient search marketer.

That’s the reason why working with AdWords can never get boring. Alphabet, the parent company of Google, may have dropped its motto of “Don’t be evil” in favor of “Do the right thing,” but for those with a keen eye and a sense of humor, there’s a life lesson even in that.

The post 5 Life Lessons I Learned from AdWords appeared first on The Search Agency.

A Rough Outline to Conversion Rate Optimization

The Search Agents Feed - Wed, 03/14/2018 - 10:17

Many online marketers instinctively feel that by getting increases in traffic, their marketing efforts are automatically a success. While this may be the case for many companies, an often more important KPI is how that traffic behaves while on-site. Just a small increase in conversion rate can have a big impact on a company’s bottom line; bumping your conversion rate by five percent could be a smashing success. In this article, we’ll outline some of the key areas that should always be considered when trying to optimize your website for increased conversions.
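
To see why a small lift matters, a quick back-of-the-envelope calculation helps. All figures below are hypothetical: a 5 percent relative lift on a 2 percent baseline takes the conversion rate to 2.1 percent.

```python
# Hypothetical figures showing the revenue impact of a small CR lift.
visits = 100_000
conversion_rate = 0.020          # 2.0% baseline
avg_order_value = 80.0

baseline_revenue = visits * conversion_rate * avg_order_value
# a 5% relative lift: 2.0% -> 2.1%
lifted_revenue = visits * (conversion_rate * 1.05) * avg_order_value
extra_revenue = lifted_revenue - baseline_revenue  # 8,000 in extra revenue
```

The same traffic, with no extra media spend, produces meaningfully more revenue, which is the core argument for investing in CRO.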


Establish Conversion Goals and Metrics


You’ve most likely identified what conversion metrics matter most to your business, but if you haven’t, now is the time to do so. Completing this essential step allows you to better focus your optimization efforts on elements that matter most.

Depending on the type of website you operate, the conversion metrics that are most vital to you will vary. E-commerce websites are probably most concerned with conversions relating to product sales, adding to cart and other actions related to making a purchase. For B2B websites, you may be most concerned with newsletter sign-ups, leads generated, whitepaper downloads, etc.

Whatever the case may be, establishing your goals and conversion metrics and setting them up for tracking is a key component in the foundation of conversion rate optimization.


Use Your Analytics


Understanding your website traffic is crucial in many areas of online marketing and CRO is no different. By looking at segmented data like traffic sources and digging into each one, you will gain a better understanding of where your most valuable traffic is coming from. You may find the highest rate of converting traffic comes from referrals, while organic and direct traffic show lower rates of conversion. If this is the case, you may want to focus more efforts on increasing your backlink profile. Or you may want to take a closer look at your top organic landing pages. If you find your organic traffic is converting at higher rates, it may be time to invest more time and effort into SEO.

Whether you decide to focus on top performing traffic sources or underperforming traffic sources, understanding your traffic will help you decide the most important areas to concentrate your efforts.


Promote Pages That Convert Well


If you know what pages convert the best, great! If not, you will need to do a little more digging into your analytics. Hopefully, you have conversions set up in Google Analytics and can easily see which pages or set of pages convert the best. When you have identified your best-converting pages, it’s important to utilize them on-site and off-site, if you’re not already doing so.

Make sure the pages are easy to find and linked to from several areas on-site, including your header navigation and footer navigation, as well as CTAs and anchor text links within relevant content. Your top converting pages are most likely highly visible, but doing a site audit or running an internal linking report to see how many pages are linking to your top converting pages can be helpful. Make sure that any related content, including articles, blogs, etc., is pointing to these pages.

Similar to ensuring high visibility to converting pages on-site, you want to ensure high visibility of converting pages off-site, as well. This means posting related social content to your top converting pages with CTAs designed to get people to click or otherwise interact.


On-Page CRO Tips


Understanding how viewers absorb the content of webpages is another key element. Optimizing your web pages for conversion is often referred to as landing page optimization (LPO) and is essentially a part of CRO. An entire post can be dedicated to effective LPO, so we’ll focus on just a few of the basics.


Effective Headlines


Headlines on these pages need to be concise and to-the-point. Ensure your headlines encapsulate the entire pitch of the page.




Images


Different images, and their sizes, can yield different results. A/B testing may be necessary to understand which images work best.


Copy and Product Descriptions


Similar to headlines, copy should be informative and to-the-point. Nobody wants to read unnecessary text. If your text is too long, try to shorten it to be more readable and user-friendly.


Strategic CTAs


Much of CRO revolves around a strategy regarding calls-to-action. A lot of research exists around how and where to use CTAs. According to Google research, the most viewable place on a page is right above the fold, but not at the top of a page. Anchor text CTAs have been shown to be effective, as well. Used throughout the content or as a standalone heading between paragraphs, anchor text CTAs should always be considered as they can lead to more clicks than buttons do.




The use of CTA buttons has long been a staple in increasing conversions. A/B testing is likely to yield the best answer here as the color, size, placement and text can all drastically impact the number of clicks a button receives. In general, buttons customized for each landing page are highly recommended.


A/B Testing


A/B testing goes hand-in-hand with successful CRO. When the necessary time, effort and resources are allocated to landing page A/B testing, the result can be insights that lead to significantly higher conversion rates. There are many different elements that can be tweaked, each of which can have a drastic impact on the number of conversions a page receives. Essentially, this entails creating an alternative version of an existing page and split testing the two to see which performs better. Consider using A/B testing for converting pages and tweak the elements from the sections above one at a time to see how the perfectly-optimized page should be structured.
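
One standard way to judge whether a split test's winner is real or noise is a two-proportion z-test. This is general statistics rather than a feature of any particular CRO tool, and the visitor and conversion counts below are hypothetical.

```python
import math

# Two-proportion z-test for an A/B split.
def z_score(conv_a, n_a, conv_b, n_b):
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)        # pooled conversion rate
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical results: 2.0% vs. 2.5% conversion over 10,000 visitors each.
# |z| > 1.96 corresponds to roughly 95% confidence on a two-sided test.
z = z_score(conv_a=200, n_a=10_000, conv_b=250, n_b=10_000)
```

If the score clears the 1.96 threshold, the variant's lift is unlikely to be random noise; if not, keep the test running before declaring a winner.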

CRO is no simple matter, as it takes patience and deep understanding to effectively execute. However, using these five points as an outline for digging deeper into your CRO is likely to yield positive results. Remember to think about the quality of your traffic — not just the quantity.

The post A Rough Outline to Conversion Rate Optimization appeared first on The Search Agency.

The Shift to Audience-Focused Branding

The Search Agents Feed - Mon, 03/12/2018 - 10:18

The rise of social-based platforms — YouTube, LinkedIn, Facebook, Uber, etc. — and the resulting overflow of behavioral data is forcing brands to center marketing efforts around nurturing audience connections over product/service promotion to stay engaged with valuable customers. In our 24/7, informed, dispersed digital environment, the audience is now essentially in control of branding through reviews, sharing and platform interactions, so businesses are adapting in a variety of new ways to stay relevant and foster customer growth.


Traditional Branding


Historically, brands have defined their specific product or service with limited depth of story or personal connection. Nike was athletic shoes and clothing, Coke was a cola, Ford was an American-made car and IBM was computers. Brands would build walls around their physical product or service to protect their distinct offering and push out their unique value propositions to the most interested audience. Simple.

Brands then leveraged stories to weave in emotion and personal connections to show how their products/services would make you feel or what being a customer said about you as a person. Toms donating a pair of shoes for each pair it sells, Corona selling you the beach and Apple offering a higher-end digital experience show how stories and emotion can push products beyond their tangible value.


Branding for Engaged Social Audiences


In this audience-centric environment, the most aware brands are changing this traditional, linear brand direction, often knocking down their own core pillars or giving up creative control to offer their highest potential value to customers. Stories and emotion are still vital, but brands are now cross-selling, up-selling, expanding and adopting new ways/partners to stay constantly engaged with their highly informed, spoiled audience. Although the physical product or service is paramount to initially define a brand, established businesses are evolving past basic product defense and promotion to focus on new partnerships, offerings and opportunities that drive a more engaged connection within the scattered landscape.

Below are just a few examples of brands taking this approach and using it to its fullest potential:

  • Brands giving product design to influencers and the customer
  • Brands expanding into new platforms
  • Brands partnering to offer better customer experiences
  • Brands expanding into new audiences
  • Brands adopting new technologies to expand into new markets
  • Brands influencing institutional change to connect with customers

Are You Sure Your Audience is Millennials?

The Search Agents Feed - Wed, 03/07/2018 - 10:14

Quick tip: Millennials HATE being called teens. That’s probably because they aren’t teens. Although the media and many marketers seem to believe that the millennial generation includes anyone younger than Gen X, they are, in fact, a specific generation — and they are adults. All of them. If you’re targeting millennials and assuming that covers everyone, you’re probably missing some opportunities.


Xennials, Millennials, Gen-Z, Gen Alpha, Oh My!


Most marketers focus on just two generations, but there are eight living generations in the U.S., including THREE generations that follow Gen X, as well as a sub-group of one generation. The dividing years are a bit squishy, but the members of each clearly understand which group they belong to.


Millennials: ~1980 to ~1998


There are now more millennials than boomers, so this is every marketer’s favorite group. The end year is extremely variable. Some say it extends into the early 2000s, but most people born after the late 1990s consider themselves Gen Z.


Xennials: 1977 to 1983


A subset of people born in the dividing line between Gen X and millennials. They don’t feel that either generation fits them and their experiences.


Generation Z: ~1998 to ~2012


Media and marketers often lump older Gen-Z members in with millennials, but they identify themselves as being distinct from the previous generation.


Generation Alpha: ~2012 to Today


These are babies, toddlers and kindergartners. They don’t know they are part of a generation. Their primary defining characteristics are a deep aversion to broken cookies and a love of YouTube toy videos.


Who Should You Be Targeting?


Many marketers focus on millennials and assume that the message that works for them will work for anyone younger, but that approach is likely to irritate not only the younger generations, but also the older ones that might also be interested in your product or service.

As I said above, millennials are adults. The majority of them have completed their educations, have established careers, and may own a home or have children. Despite inventing the term “adulting,” they are in fact living adult lives and have adult concerns like money management, parenting and career planning. Although technology is an aspect of every part of their lives, and their embrace of it has disrupted multiple industries, they do remember a pre-smartphone world.

Millennials rely heavily on social media and personal recommendations when making purchase decisions. They read user reviews, but know which ones are fake. Focus first on making a product that will generate great reviews, and then make it easy for customers to leave them. Millennials like loyalty or rewards programs, especially if they are in app form. Radio and podcast ads are also effective.

Generation Z has an entirely different set of concerns. For the most part, they are still in school. The oldest of them are just about to graduate college or are entering the workforce. They are aware of their buying power and have a wide range of interests and concerns. The older members are concerned about student debt and being able to find a job after college. This generation has always lived in a digital world and has woven technology into their lives.

Generation Z splits their time across multiple social media platforms, so you need to be where they are. Focus on mobile marketing, because that’s what they’re using. Most prefer video to text, but they like to see individualized messages rather than mass marketing. They want to know what companies stand for, but you only have a few seconds to convey that before they bounce to the next post. They are particularly drawn to real people sharing real stories, and they value diversity.

Generation Alpha is not yet aware of their buying power, but they are aware of their power to beg their parents for toys. Most of their parents are millennials or part of Generation X, who know exactly how marketers are targeting their kids. They want their kids to be happy, though, so they will buy the baffling Shopkins and LOL Surprise dolls that this group loves so much. (You may have detected that this writer is a parent of an Alpha.)

The Alphas have always lived in a digital world, but one that responds to their touch and even their voice. To them, technology has a personality. Targeting this group is challenging, because they are using their parents’ accounts to access content. They have also learned how to push the “skip ad” button on YouTube already, but will watch long ads if they’re particularly entertaining.


Don’t Forget the Older Generations


Despite marketers’ collective amnesia about Generation X, this group does have money and is willing to spend it. This group is in their 40s and 50s, which means most are at the peak of their careers and earning power and may have some extra money to throw around on nicer cars, fancy vacations and pricey toys for themselves (assuming they’re not spending it all on college expenses for their Gen Z children). Show them some love! Trust me, they’re not all bitter slackers who hate everyone and everything. They are a bit jaded, however, after years of being ignored. If you’re a millennial, your boss may very well be a member of this group.

Given that Gen X is a sandwich generation that is juggling care for aging parents and young children, they love a discount. Email marketing is still effective, if they have time to check their Promotions folders. But don’t email every day! Once a week is plenty. They also like to feel that they’re supporting others with their choices, so companies that support social causes are aces in their book.

Boomers cannot be ignored (and won’t let you ignore them even if you want to), but their buying power is declining. Most are retired or nearing retirement age, and are focused on preparing for that, as well as the challenges of keeping their aging bodies healthy. The most successful of this group still buy cars and vacations, but they also like to buy things for their grandkids and relive the glory days of their youth with music festivals starring other boomers. Traditional advertising methods such as television, print and radio still work on them. They tend to be brand-loyal and interested in the value they will receive from products.

When planning your marketing campaigns, don’t immediately go for the millennials. Instead, figure out who your target market actually is and what they actually need, then target your message appropriately. Don’t start with the message that will best suit the most popular group and then hope it will work for everyone else.

The post Are You Sure Your Audience is Millennials? appeared first on The Search Agency.

Complete Reporting Automation: Are We There Yet?

The Search Agents Feed - Mon, 03/05/2018 - 09:01

Reporting is a big part of any organization’s day-to-day activities. It is the most common requirement across various departments, be it the IT team or the executive team. For media agencies, reporting provides performance measures, BI insights and decisive power. Of course, per organization and per department, your reporting frequency, data sources and everything in between will change, but it is still a basic need. We always say, “change is constant.” Reporting is a way to measure this constant through available data. It only makes sense to provide some automation for this common and constant factor, which will take away pain from everyday processes while providing mission-critical inputs. This begs the question: Can we completely automate our reporting?




Reporting automation is still a bit of a daunting task for companies, but there are some that have been trying to provide solutions to this problem for a long time. We’re close, but still a bit far from complete automation. I will focus on some of the challenges for a media agency like ours in implementing complete automation, but these might be common regardless of industry.

Let’s look at a few of the challenges from the data input side:

  1. Lots of data sources.
  2. Very high data refresh rates per data source.
  3. Data segmentation is not unified across data sources.
  4. Lack of data cleanliness and integrity across data sources.

Now let’s find some challenges on the actual report-delivery side:

  1. No unified KPIs.
  2. Too much fragmented reporting.
  3. Inability to connect KPIs to actionable goals.
  4. Inability to provide near real-time reporting for business goals.


Strategies and Solutions


There are excellent solutions and tools available now to solve some report delivery challenges. A business intelligence application with built-in automated reporting can bring you the advantages of both approaches. These applications serve as a single source of truth for all end users and stakeholders. These tools communicate with multiple data sources via various data ingestion methods such as API, email, FTP and database. Simple connection of multiple data sources and easy creation of reports and dashboards allow users to get what they need with little or no help from IT. BI tools such as Tableau, Datorama, Domo, etc. are helping to build reporting automation from the report-delivery side. However, these are still tools and are only as good as their input data.

Most of the obstacles for reporting automation stem from input data that is not clean or is identified incorrectly. When data sets from different sources do not talk to each other, they create siloed reporting. This ultimately creates obstacles to building unified KPIs that matter to final business goals. Here are some simple steps you can implement that will bring huge benefits to an organization’s goals and to the automation of reporting.


Determine naming conventions


It’s very important to have standardized naming conventions for your data across sources. This not only helps organize your data, but also helps tremendously in cleaning the data set from any source. Any anomaly in metrics can easily be found and traced back to its source if proper naming conventions are used. Not only will this set your reporting automation on the right track, but it will also help you identify and clean data sets along the way.
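A convention like this can even be enforced programmatically. The sketch below assumes a hypothetical `brand_channel_audience_yyyymm` pattern; the pattern and campaign names are illustrative, not from any particular platform:

```python
import re

# Hypothetical convention: brand_channel_audience_yyyymm,
# e.g. "acme_search_retargeting_201803"
NAME_PATTERN = re.compile(r"^[a-z0-9]+_[a-z0-9]+_[a-z0-9]+_\d{6}$")

def invalid_names(names):
    """Return the campaign names that violate the convention."""
    return [n for n in names if not NAME_PATTERN.match(n)]

names = [
    "acme_search_retargeting_201803",
    "Acme Display 2018",  # stray, unconventional name
    "acme_social_prospecting_201802",
]
print(invalid_names(names))  # → ['Acme Display 2018']
```

Running a check like this against every incoming feed surfaces misnamed entities before they pollute downstream reports.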


Identify common keys


To combine data from various sources, we need common keys among them. Hopefully before you’ve reached this stage, you’ve taken care of your naming conventions. Now, using those conventions, we need to identify a key for each data source which can be utilized to combine its data with other relevant data sources. This common key can be anything – your campaign name, the program name or the business goal name applied to reporting entities. Identifying such keys across sources is important for the next step. In cases where there are many-to-many relationships across reporting entities, we can always use look-up tables to create reporting segments.
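As a minimal sketch of the idea, assuming two hypothetical sources that both carry a campaign name as the key, rows can be combined with a simple inner join:

```python
# Invented rows from two data sources sharing "campaign" as the common key
search_rows = [
    {"campaign": "acme_search_retargeting_201803", "clicks": 120, "cost": 300.0},
]
analytics_rows = [
    {"campaign": "acme_search_retargeting_201803", "conversions": 15},
]

def join_on_key(left, right, key):
    """Inner-join rows from two sources on a shared key."""
    lookup = {row[key]: row for row in right}
    return [{**l, **lookup[l[key]]} for l in left if l[key] in lookup]

combined = join_on_key(search_rows, analytics_rows, "campaign")
# combined[0] now carries clicks, cost AND conversions for the campaign
```

In practice a BI tool or database performs this join, but the principle is the same: without a shared, consistently named key, the sources simply cannot be combined.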


Identify and unify KPIs


Once the above two steps are implemented, the underlying data starts to talk to each other. This not only sets us up to automate our reporting, but it also helps us identify the data or data sets that matter most to the reporting stakeholders and the overall business. This data determines a lot of different KPIs for each stakeholder, ranging from those tied to day-to-day operations to those tied to overall business goals. Once you start to see data and KPIs by source, you can create some source-level KPIs for operational optimization, as well as some business-level KPIs which span across sources.
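To illustrate a business-level KPI spanning sources, here is a small sketch (the source names and figures are invented) that rolls per-source cost and conversions up into one blended cost-per-acquisition:

```python
# Hypothetical per-source spend and conversions rolled up into one
# business-level KPI that spans sources
sources = {
    "search":  {"cost": 1200.0, "conversions": 40},
    "display": {"cost": 800.0,  "conversions": 10},
}

def blended_cpa(sources):
    """Business-level cost per acquisition across all sources."""
    total_cost = sum(s["cost"] for s in sources.values())
    total_conversions = sum(s["conversions"] for s in sources.values())
    return total_cost / total_conversions

print(blended_cpa(sources))  # → 40.0
```

Each source also keeps its own operational CPA (here, 30.0 for search and 80.0 for display), which is exactly the source-level vs. business-level distinction described above.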


Identify the right tools


By talking to stakeholders and end users of these KPIs, you can determine data frequency and reporting expectations. Such discussions really help to identify the correct tools for reporting automation, which ultimately helps us build useful and mission-critical dashboards. The choice of tool can be different per industry or reporting need, as well. There is no one-stop solution for all. Having online reporting is a huge benefit for organizations such as media agencies. These factors will really drive the appropriate choice of reporting tool for your organization.

Ultimately, all of this really depends on getting everybody to agree on set strategies and common rules for reporting. This agreement determines the success of your reporting automation. For such automation to work, rules must be kept in check and input data must be sanitized by following these rules. This requires a big buy-in from stakeholders, as they are the ones who will both police these rules on the data and ultimately consume it.

There are a lot of tools in the market which promise reporting automation, but ultimately, these are still just tools, meaning they’re only as good as the data fed to them. It’s up to end users and stakeholders to start tagging and identifying those data sets cleanly, then determine what matters most from the data. There needs to be agreement on adhering to naming conventions, following reporting rules and being vigilant about setting strategies for future growth. These simple steps, in tandem with BI tools, will definitely help achieve the elusive goal of reporting automation — getting us very close to complete automation, if not all the way to 100 percent.

The post Complete Reporting Automation: Are We There Yet? appeared first on The Search Agency.

Personal vs. Professional: Finding Your Voice on the Social Media Playground

The Search Agents Feed - Wed, 02/28/2018 - 00:17

In the many years that I’ve been managing social media campaigns and accounts for clients, there’s always one question that I’m asked within the first two minutes of my initial meeting with them: “Do you think it’s more important to have a professional voice or personal voice online when it comes to social media?” My response is always the same and is a reference to an Old El Paso taco shell commercial: “Por que no los dos?” — “Why not both?”

One way of looking at this question is to consider that while your digital voice needs to be personable and connect with your audience, it also needs to support a professional image. After all, you are still a business and in order to thrive, you need to generate revenue. You want your potential consumers to get to know you, but first, you need to identify who those consumers are and what the best way to communicate with them is.

Many companies and brands feel that they need to be either strictly personal or strictly professional and not try to find a happy medium. For some, that’s definitely true. From time to time, a company or brand will find great success on their social media platforms by committing to both voices. Let’s look at a few examples of companies using both or either voices:


The Instigator



One company that has found monumental success and gained over two million followers on Twitter for their personal, witty and often crass commentary on social media is Wendy’s, the fast food chain. They use high-resolution photography and clips from on-air commercials, but what they’re best known for is the youthful and sassy remarks associated with every social media post they create. In the above example, Wendy’s is calling attention to a tweet that seemed copied and pasted from a generic social media content schedule and posted by both McDonald’s brand and corporate Twitter accounts. In its tweet, Wendy’s makes fun of an ongoing complaint about many McDonald’s locations claiming their ice cream machines are broken. This is a perfect (although extreme) example of the use of a professional vs. a personal voice.

McDonald’s was able to respond with a smart follow-up post, but not before hundreds of thousands of people took notice and engaged with Wendy’s initial tweet. Even in the follow-up, McDonald’s copy was still professional and a little salesy, but it received a decent amount of engagement. Why did this work for both companies? Both Wendy’s and McDonald’s know exactly who they are targeting and how to do it. McDonald’s has been around for a very long time and knows that their voice on social media doesn’t need an update or overhaul. They know that having a more professional voice is what works in order to engage their audience and they know what kind of content their audience expects.


The Executive



Thanks to the social media antics by Wendy’s and the media attention they receive from their fiery posts, they have become real competition for McDonald’s, but still struggle to get out of that third-place slot in the fast food race (which includes Burger King). Keeping their personal voice on social media is what will set them apart from the pack and keep their followers loyal.

Unfortunately, not everyone has been around as long or has as big of a physical and online presence as Wendy’s or McDonald’s, meaning they don’t really have the luxury to choose between fully committing to either a strictly personal or strictly professional online voice. For most companies and brands in today’s market, the competition is so dense that standing out or doing something new can seem intimidating or overwhelming. But the goal isn’t to reinvent the wheel: The goal is to engage an audience and create revenue by employing basic marketing and sales tools and data. Understanding who your audience is will help mold your online voice and help create engaging content. The data you gather from engaging that audience will prove helpful in creating future content that will convert that engagement into revenue.

Using data to create targeted content is a trend that many companies are jumping on board for, but its success relies on its execution. This process is called data mining and it has become a very hot topic in the media because of how some companies are choosing to use it. Below is an example of what not to do:


The Tattletale



Okay, this is very funny. I am actually one of the over 400,000 people who Liked it, but what did the tweet accomplish? It is a very clever idea which they came up with by using hard data, but their voice was strictly personal, causing a backlash from consumers criticizing the company for overstepping privacy lines and even being downright rude. The post did its job: It received a ton of media attention and the product mentioned in the post performed better because of it, but in the end, the company’s image was seen in a negative light by many of its consumers.

How could they have done better? Let’s look at an example of a brand that went a step further. The company below used data mining to know what kind of content would perform well, then incorporated consumer-generated content to engage potential new customers using a combination of personal and professional voice:


The Savvy Friend



This post is a home run. Bobbles and Lace mined data to identify a product in their inventory that was performing well. They then used a captivating image taken by a consumer, engaged that consumer with a tag, spoke personally with them while encouraging other consumers to produce similar content, drew attention to the product being showcased and provided the necessary info for other consumers to purchase it. On top of that, they even went one step further by remaining active within the conversation thread. This post had a very clear purpose and it was executed very well. It shows that this company knows its audience and knows how to speak to them in a personal way whilst remaining professional enough to propel their business forward.

Finding the perfect blend of personal and professional voice will take time and troubleshooting, but eventually you will learn what works best for your audience and the resulting improvement will become apparent. One thing to consider is that all the social platforms vary on their limitations and the types of content they support. The voice you use on Instagram might be slightly different from the voice you use on Facebook or any other platforms, meaning that your audiences might slightly vary as well. However, as long as your intent and goals are clear, your voice will be, too.


Finding the Perfect Balance


  • Use data from your business to create relevant and engaging content.
  • Directly engage with your consumers using a personal voice and, if possible, use user-generated content.
  • Encourage other consumers to share content via incentives and initiatives.
  • Don’t forget to reference your products and/or business (in a professional voice) as well as provide instructions on how to gain access: website, storefront address, etc.
  • Follow through. Stay engaged with the conversation: Reply to comments, answer questions and give feedback.

The post Personal vs. Professional: Finding Your Voice on the Social Media Playground appeared first on The Search Agency.

Ad Muting: 7 Ways to Optimize Your Ads in the New ‘Opt-Out’ Google Network

The Search Agents Feed - Mon, 02/26/2018 - 02:59

For advertisers, remarketing is an essential strategy for reengaging with past visitors, and nowadays, we are getting better at being able to track people both on and offline. Despite these capabilities, it’s the negative ad experiences that people continue to deal with that are driving the increased demand for ad blockers. Saw a pair of shoes you liked, but weren’t ready to purchase? Maybe a week goes by and you’re still seeing those ads. Unwanted experiences like these cause the number of ad-block users to increase every year and even push some users away from search engines like Google, Bing and Yahoo altogether.

While some of the web’s top publishers were still busy developing anti-ad-blocker technology, Google decided to stop fighting ad blockers and allowed users to opt out of unwanted ads within its search and display networks. On January 25th, Google announced the following additions to Ads Settings (previously Ads Preferences Manager) and “mute this ad”: 1) In Ads Settings, consumers will be able to see which advertisers are targeting them and personalize the ads based on their preferences. 2) With the new additions to “mute this ad,” consumers will not see the ads they have muted on any devices when logged in to the same account.

With the use of ad blockers growing every year, it’s apparent that people don’t want the pair of shoes they viewed a week ago following them around as an ad as they read the news. In 2017, DuckDuckGo saw its highest query count to date, with 5.9 billion searches. As ad blockers become more popular and engines like DuckDuckGo see query volume grow every month, it’s understandable why Google would choose to go with the current rather than fight it.


Possible Reasons Behind the Move


  1. Ad blocking software has been growing rapidly in the past few years. According to a survey of over 1000 US internet users, nearly 40 percent of them had used an ad blocker in the last month.
  2. Google had been trying to eliminate ad blockers, but was unable to do so successfully. According to PageFair, ad blocking cost Google AdWords $887 million in just one quarter (Q2 2013).
  3. To meet the needs of consumers: Consumers will always buy the products that they need. If ad blockers can satisfy their need for getting out of the emotion of anger from seeing annoying ads, then users will continue to use ad blockers. By only showing preferred ads, Google is trying to give consumers that same satisfaction of using ad blockers.
  4. To fit Google’s long-term vision of providing a better user experience. This doesn’t just transform the current online advertising environment into a cleaner and more preference-based ecosystem; it also helps increase Google’s profits in the process. Facebook ran a test in 2016 in which it blocked ad blockers and gave users the option to hide ads. As a result, it saw a boost in revenue.


Benefits to Advertisers


The benefits to Google are very clear, but what do advertisers have to gain from this new change?

Here are some of the possible benefits to advertisers, once more users begin to switch from ad-block software to Google’s new ad muting option:


  1. We could see a decrease in the number of “unknown” users. Although it’s not very clear how Google defines “unknown,” there’s a possibility that when users remove their ad blockers, Google will be more likely to identify the demographics (age and gender) of these users.
  2. Increased headcount of males in demographic targeting. Because we’re seeing that a greater percentage of men are using ad blockers (58 percent male vs. 42 percent female), Google’s new muting feature might attract more male users.
  3. Increased ad spend on desktop. Because there are more ad-blocker users on PCs (51% PC/Desktop users vs. 22% mobile users), if they remove ad blockers, there will be some lift in desktop spend. Desktop is also the main platform where major purchases ($200+ price range) are more likely to happen, so this will help retailers who sell higher-price point products online.
  4. Increase of CTR and cost efficiency on remarketing. Advertisers won’t have to try so hard to reach the audiences they want because Google already negates the “unwanted” audience for them.


Steps Advertisers Can Take Within Google’s New Personalized Ad Experience


1. Rethink Your Marketing Strategy

As Google shifts to serving preference-based ads, it will become more difficult for advertisers to remarket at a granular level. As the segmentation of remarketing lists becomes more ambiguous, the size of these remarketing lists will get smaller. Unless Google provides the list of audiences who mute your ads, it probably won’t make a difference if you set up your remarketing lists with too many segmentations, because it will all come down to two lists of audiences: those who are interested in your ads and those who aren’t. And, at the end of the day, you’re only serving your ads to consumers in the first category. So, the question is – what is going to be the new strategy for remarketing?

The answer, regardless of your retargeting plan, is that you’ll need to have a solid strategy to target your overall audience:

  1. Make sure your ads are not “annoying.”
  2. Offer consumers a seamless and great experience – from the first touch point, all the way to post-purchase customer service.
  3. Google sets muted ads to expire after 90 days – maybe 90 days should be the new benchmark to consider when creating a remarketing list.


2. Think Like a Consumer

90 days is actually a long time to be followed when you think like a consumer – especially one who has already purchased, or ultimately decided they weren’t interested.

You’ve been there. Maybe you’ve decided to check out what your favorite site is offering, only to find out you’d rather see an item in person. Or you’ve comparison shopped and found out it’s cheaper on Amazon and already purchased it. Then that product is following you at work, on your home desktop computer… even on your phone when you’re scrolling through Facebook in bed!

As an advertiser and a user, ad muting is a welcome option for people who aren’t opposed to ads, but opposed to obnoxious or highly irrelevant ads. What if you’re potentially reaching an audience who isn’t interested in your service or product? Muting ads allows for a more relevant and tailored ad feed by eliminating wasteful or irrelevant advertising.

This is an opportunity for advertisers to save their impressions for users who are interested in their product instead of flooding every visitor to their site regardless of intent. As a user, if I’m going to purchase an item or am really interested in it, I most likely won’t mind seeing ads – especially if you’re offering me something in return. If I’m not, I’ll just find a way to tune it out via ad blockers or by muting them. This is even more reason to be sophisticated with your audience building and increase your relevancy through ad copy testing and landing page optimizations.


3. Increase Relevance by Adding Exclusions


If you have a lead generation client looking to acquire new customers, try negating users who have visited your log-in page. If you want to be more exclusive, try negating users who have visited the site and immediately bounced or spent fewer than 10 seconds on the site through a GTM timer event (you may want to check your site speed to ensure you are giving users the most optimal experience). This should improve your engagement KPIs and reduce the number of users in your remarketing list.
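Conceptually, this suppression logic reduces to filtering the pool on engagement signals. Here is a sketch in Python, with invented session records standing in for whatever your analytics or GTM timer events actually capture:

```python
# Hypothetical session records; exclude bounces and sub-10-second visits
sessions = [
    {"user": "a", "seconds_on_site": 4,  "bounced": False},
    {"user": "b", "seconds_on_site": 45, "bounced": False},
    {"user": "c", "seconds_on_site": 30, "bounced": True},
]

def remarketing_pool(sessions, min_seconds=10):
    """Keep only engaged visitors: no bounce and at least min_seconds on site."""
    return {
        s["user"]
        for s in sessions
        if not s["bounced"] and s["seconds_on_site"] >= min_seconds
    }

print(remarketing_pool(sessions))  # → {'b'}
```

Only the engaged visitor survives the filter; the bounce and the four-second visit are negated, which is exactly the smaller, higher-intent list the step above aims for.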

4. Create Unique Experiences

As advertisers, we should be striving to put forth the best content and most relevant information for our users, not bombard them with information they already know. It is also important to craft your message to the stage of their engagement. If they’ve interacted with your website within the past 24 hours, chances are you don’t want to give them the same message later on. Study the latency of your conversions, identify the time of conversion (conversion lag) and use that time to reach your customers with a more direct response offer or something that reinforces your brand. This should also be reinforced with your landing page strategy, as well.

5. Eliminate Overlap

Make sure to be exclusive in your audiences by excluding users who may also be included in another audience. For example, make sure you are excluding users who have visited the site in the last seven days from a “Last 30 Days” list, if you have both audiences applied to a campaign. This helps eliminate overlap and lowers the amount of users experiencing high frequencies.
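In set terms, the exclusion described above is just a difference between the two lists (the user IDs here are placeholders):

```python
# Placeholder user IDs in two overlapping remarketing windows
last_7_days = {"u1", "u2", "u3"}
last_30_days = {"u1", "u2", "u3", "u4", "u5"}

# Users in the 30-day list who have NOT visited in the last 7 days;
# applying both audiences without this exclusion would double-serve u1-u3
mid_window_audience = last_30_days - last_7_days
print(sorted(mid_window_audience))  # → ['u4', 'u5']
```

Each user then falls into exactly one audience, so no one sees the elevated frequency that comes from overlapping lists.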

6. Lower Frequency Caps and Improve Quality

Want to reduce the probability of your display ads being muted? Lower the frequency caps on your display campaigns and work with your vendors to avoid fraud and poor-quality placements. Be sure to sort through your placements regularly to ensure you are appearing on highly relevant sites.

7. Save Dollars

If you create content worthy of being seen or shared, you won’t have much to worry about when it comes to ad blocking. Continue to prevent overlap and make sure you aren’t reaching the same users repeatedly — unless you have something unique to offer each time. Save your dollars for potential customers who are interested in your brand.

Looking Forward

Looking to the future, it’s possible that Google believes providing options to remove ads may reduce the likelihood that a user will install an ad blocker. There is also potential for Google to share this data with advertisers for use as an exclusion list for their campaigns. This would be extremely beneficial for user suppression in RLSA and remarketing campaigns.

With the added function of being able to mute ads, what does this mean for a company that raked in $27 billion just in the last quarter? As mentioned above, Google is seeing that people don’t want to be followed by annoying ads. For Google, this initiative seems to be about creating good customer experiences with the hope of staying ahead of the curve. Ads, for the most part, are muted or blocked because they are intrusive and/or irrelevant to a user. As marketing continues to evolve, we need to make it our mission to be as relevant and targeted as possible.

The capabilities for remarketing continue to evolve, which allows marketers to create more refined audiences and less obnoxious ads. In the future, personalization and content creation will be even more important, as well as making information relevant and useful to customers. For marketers, this might mean creating a higher value for these potential customers, which could possibly translate to higher media costs. In the coming months, as Google continues to add features like the ability to mute search ads, it will be up to us as marketers to rethink our remarketing strategies and think of audiences and their values differently.



Co-authored by Stefanie Sizemore and Mariah Madera.

The post Ad Muting: 7 Ways to Optimize Your Ads in the New ‘Opt-Out’ Google Network appeared first on The Search Agency.

The New 320-Character Meta Description Limit: How Should We Be Using It?

The Search Agents Feed - Wed, 02/21/2018 - 09:45

I remember the day that they arrived.

First came the rumours. Half-glimpses, grainy smartphone footage, tweets from folk insisting upon what they’d seen. Of course, rational folk dismissed these. There’s no way, they said, debunking social media hearsay and deriding claimants.

But then came reports from trusted news sources — not conspiracy or clickbait sites — of spottings in the wild across the globe.

We kept our eyes open, hoping to spot one for ourselves. Was this real? Because if it was, it would turn everything upside down.

The world waited with bated breath. All we needed was confirmation from an official source. And then it came.

On December 1, Google confirmed to Search Engine Land that it had indeed increased the maximum search result snippet length from 160 to 320 characters.

The search world erupted with panic. Would this be rolled out to all search results? How would this affect the 160-character-max meta descriptions we’d been crafting for clients for years? What about SERP length – would even fewer organic results be displayed per page, owing to the increased real estate each demanded? Does this change the way we should write meta descriptions going forward?

Once we’d all stopped screaming and had a cup of tea, most digital marketers figured we could work out the answers, and how best to proceed, on our own.


Will all meta descriptions have 320 characters going forward?


Did all meta descriptions have 160 characters before? While the received wisdom was that meta descriptions should use all the space available to them, countless examples existed of pages bucking this trend and still recording strong organic CTRs.

And that’s even if Google chooses to use the meta description provided in the site’s HTML – often the search engine will pull what it deems a better answer for a user query from a page’s body content. A detailed and relevant answer to a question is obviously more likely to be closer to 320 characters than 160, so informational searches are more likely to return much longer results.

Searches for a service, on the other hand, are likely to return a more varied number of snippet or meta description lengths. These sites demonstrate their quality through hard metrics such as conversion rate – so if a page with a 140-character meta description had 30 percent of users landing on it making a purchase before this change was rolled out, it’s less likely to be affected.


Will fewer organic search results be shown in SERPs?


This is the question striking fear into the pit of most SEO stomachs. With rich snippets and image, video and paid results meaning “top ten” and “page one” are no longer one and the same, are we going to have to start securing even higher positions to enjoy the benefits of a page-one ranking?

Well, Google doesn’t seem to be returning significantly fewer results per page – but that doesn’t mean pages at the bottom of each SERP aren’t going to suffer.

Why? Just because search snippet and meta description length has doubled, users aren’t going to spend twice as long browsing search results. The change means that:

  • Higher-ranked descriptions are more likely to satisfy the original query, simply by virtue of having more space to do so.
  • User patience for scrolling through search results is going to run out having seen fewer.

Monitor your pages that generally rank between positions seven and 10. If organic traffic to these pages has dropped since December, then obtaining a higher ranking needs to become an even greater priority.
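As a sketch, that monitoring could look like the filter below. The page records and the 15-percent drop threshold are hypothetical; in practice the numbers would come from your analytics export.

```python
# Hypothetical before/after comparison: flag pages ranking in positions
# 7-10 that have lost a meaningful share of organic clicks since December.
pages = [
    {"url": "/guides/a", "avg_position": 8.2, "clicks_before": 400, "clicks_after": 290},
    {"url": "/guides/b", "avg_position": 3.1, "clicks_before": 900, "clicks_after": 880},
    {"url": "/guides/c", "avg_position": 9.5, "clicks_before": 150, "clicks_after": 150},
]

def at_risk(pages, drop_threshold=0.15):
    flagged = []
    for p in pages:
        if not 7 <= p["avg_position"] <= 10:
            continue  # higher page-one spots are less exposed to this change
        drop = 1 - p["clicks_after"] / p["clicks_before"]
        if drop >= drop_threshold:
            flagged.append(p["url"])
    return flagged

print(at_risk(pages))  # these pages need the ranking push first
```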


How does this affect the meta descriptions I’ve written for my clients? I don’t have to go back and rewrite them all, do I? Should I be writing 320-character meta descriptions going forward?


Is it worth your or your client’s time to rewrite or expand every meta description ever written for the site? Probably not. Is it worth investigating whether you can use the new character limit to improve certain pages? Absolutely.

Check your analytics for organic CTRs which are low relative to the rest of your site. Is Google generally returning your old 160-character meta description, and if so, could more detail be added that could help drive the user to click on your site over your competitors? If so, then go right ahead.

It’s important to note here that most search results aren’t hitting the full 320-character limit. Most land in the mid-200s, and most results of this length either pair the meta description with a snippet of body content or use a snippet of body content alone.
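A quick way to audit what you currently serve is to parse a page’s HTML and measure the description yourself. This sketch uses only Python’s standard library; the sample markup is invented.

```python
# Pull the meta description out of a page's HTML and measure it against
# the new 320-character ceiling. The sample page below is invented.
from html.parser import HTMLParser

class MetaDescriptionParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "description":
            self.description = a.get("content", "")

html_doc = ('<html><head><meta name="description" content="Shop handmade oak '
            'desks with free UK delivery. Built to order in our Yorkshire workshop.">'
            '</head><body></body></html>')

parser = MetaDescriptionParser()
parser.feed(html_doc)

length = len(parser.description)
print(length, "characters --",
      "room to expand" if length < 320 else "over the 320-character limit")
```

Run across a crawl of your low-CTR pages, a script like this shows where the new limit leaves headroom worth using.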

What’s more important than hitting the limit is ensuring what you’ve provided is relevant and enticing. Remember, your meta description isn’t a direct ranking factor — it merely offers free organic advertising space to help drive users to your site over its competitors. So yes, use the extra space to promote your business’s USPs and add more tangible product detail. Add a call to action if you didn’t have room before. However you can improve your description as a piece of advertising, do it.

But don’t see the extra space as an opportunity to stuff five more keyword variations in. Just as you now have double the room to sell your site and product, you’ve also got twice as much chance of being flagged as spam if you go about it the wrong way.

And don’t try to fill the space just because you can – if you need the full 320 characters just to spell out the purpose of a page or your business, then it’s probably slightly overcomplicated. If you’re filling up your meta description à la a student trying to reach the word count on an essay an hour before the deadline, then you’re going to end up with flabby, overwritten descriptions that deter more users than they entice.


The post The New 320-Character Meta Description Limit: How Should We Be Using It? appeared first on The Search Agency.

10 Free Keyword Tools to Help You Get Started

The Search Agents Feed - Wed, 02/14/2018 - 00:57

Every year, many dedicated and ambitious web developers work to make their keyword analysis tools a little more cutting edge and informative than the prior year. The best tools require a monthly subscription, but there are still some great free options to get you started with the keyword research process.

Here’s a list of the top 10 keyword analysis tools you’ll find online that lend a lot of free and insightful information to help you make the most out of your search marketing initiatives. The metrics used to rank them are ease of use, useful features provided for free (versus paid users) and the breadth of data provided.

Let’s start with…


10. Google Trends



Google Trends easily finds a spot on this list for being such a simple tool for querying keyword search behavior across Google’s various indexes, with regional targeting and comparison features to identify other significant terms. Downloading and sharing are possible through its interface, and it’s also accessible on other TLDs. The tool operates as a sort of starting point to verify whether user interest in a term is surging or declining.

What’s not to like about it? The answer: “Hmm, your search doesn’t have enough data to show here.” Sorry, Google: That isn’t a satisfying answer, because you’re Google and sit on a massive repository of search data from your various deployed servers. Also, the values provided don’t give a clear quantitative score for breakout terms or keyword interest during the selected time frame, so the tool needs supplementing with others to establish a term’s absolute popularity. The search fields within Trends would also benefit from Google Suggest, which would help surface additional terms. The home landing page shows a list of trending stories, yet that engaging experience disappears once a query is entered. Links to the relevant Google search results could also easily be provided, saving the user the time of opening another browser tab and running the search to see what is really there.

Ease of Use: 3.5/5
Free Features: 5/5
Breadth of Data: 1.5/5
Total: 10/15


9. Google Correlate



Google Correlate, like its sibling Google Trends, helps users review terms that have a similar search history as another term queried. Simply enter a term in the field provided or upload your own, and a list populates showing what a user might also search for, historically. It’s a great tool to explore similar themes in content marketing and can show what products might lend themselves nicely to a “You may also be interested in” sort of module. The tool renders two line graphs to show search behavior trends and their similar fluctuations over time. Like the Google Trends tool, this data can also be narrowed to different countries or refined to states in the United States. It can also be exported to a CSV file for other statistical analysis.

What’s not to like about it? It has really just one job and one job only: regionalized term trending. No total searches, related links or other quantitative data, making it a narrower, back-pocket tool to use alongside your other keyword analysis and topic research processes.

Ease of Use: 5/5
Free Features: 2/5
Breadth of Data: 3.5/5
Total: 10.5/15


8. Wordtracker



Wordtracker is a keyword discovery tool that incorporates some very different keyword metrics besides monthly searches and cost-per-click value. Wordtracker evaluates a website’s keyword relevance, as well as the competitiveness of those keywords on the web and by region. It does this by seeing how often a keyword appears as linked text on a page and in the title of the same webpage. It has upload capabilities and keyword filters in its interface to isolate searches phrased as questions, as well as to include or exclude any desired or undesired phrases. It deploys word clouds and a bar graph to demonstrate month-to-month trends across the year. It even gives a sample of ranking pages from the SERP and populates auto-suggestion results from Amazon and YouTube.

What’s not to like about it? Free users can only see 20 keyword results per query and get just a handful of searches before a “sign up” request blocks usage. Exporting results is not available until a subscription plan is acquired. Subscription plans start at $27 per month, which isn’t bad. An alternative plan is available at a little over double the price, but it doesn’t seem to get you a whole lot more. For a few dollars beyond that $69 plan, you would likely get more value from another digital marketing tool further down the list. Also, ask yourself whether you often work on the go with smaller devices. If so, Wordtracker leaves a whole lot to be desired on small screens. It’s accessible, but it’s not going to win the hearts of smartphone users.

Ease of Use: 4/5
Free Features: 2/5
Breadth of Data: 5/5
Total: 11/15


7. SERPS


SERPS is a great tool to bookmark for those times when you quickly need a keyword’s monthly search volume and its projected cost-per-click. There is no fuss with this tool, no free-user counter and no perpetual “buy” popup getting thrown at you all the time — just a quick-and-easy list of matching results for a keyword search. Save them, export your list and you’re off to the races. It looks satisfactory on mobile, too.

What’s not to like about it? A search can give you over 10,000 results, but the interface only shows 20 results at a time before needing to resort to pagination. Grabbing all the terms available in one fell swoop is not going to happen here quickly. On the bright side, it does give a great list of related long-tail terms, but if you searched just a one-word term, you may not actually see that word populate within your list. A search for “robots” doesn’t give you the search volume for “robots” — it populates the list with “cool robots” or something longer tail, thus leaving you a bit in the dark. No other fancy analysis is included in the free version.

Ease of Use: 3.5/5
Free Features: 5/5
Breadth of Data: 3/5
Total: 11.5/15


6. Google Search Console


Google Search Console has been and always will be a vital weapon in every SEO’s quiver. While it should never be left on a back burner, its role as a keyword research resource has both positives and negatives. Two areas offer the most direction. The first is “Links to Your Site,” where a user can see which pages are linked to most and the keywords associated with them. The best resource inside GSC, however, is the “Search Analytics” area, where sorting and analyzing keyword positions can help even the greenest SEO figure out which strengths and weaknesses to go after.

What’s not to like about it? For one, only 120 days of data. But the real weakness is that it is only as good as your site performance thus far, so if you just launched and are still getting off the ground, any traffic data Search Console can offer you is going to be extremely limited. It’s far more powerful with significant time put into your website already, so you’ll have to wait to experience its full potential. For more information about this tool and its latest release, take a look at our article diving into Google Search Console’s freshest features.

Ease of Use: 5/5
Free Features: 5/5
Breadth of Data: 2/5
Total: 12/15


5. Mangool’s KWFinder



Mangool’s KWFinder finds itself high on the list of top keyword tools for being very thoughtful in scoring how pages are ranking for keywords of interest, as well as allowing users to build keyword lists to track, export and re-reference when seeking further insights. It also has a very user-friendly interface on both mobile and desktop with cleanly-designed graphs and scores for competitors’ strengths and rankings for terms of interest.

What’s not to like about it? Backlink analysis for exploring anchor text is not yet available in the tool. Even with a paid account, it limits the keyword lookup function to 100, 500 and 1200 terms daily. It also caps its suggestion count at 700, which is unfortunate since other tools can supply more. For free users, export functions are available only to grab terms; grabbing other info requires a subscription. A month-to-month subscription is going to run about $50 a month for the basic version (less if you pay the whole year), but for a few bucks more you can probably get a lot more bang from another tool.

Ease of Use: 5/5
Free Features: 2.5/5
Breadth of Data: 5/5
Total: 12.5/15


4. Moz Keyword Explorer



Everyone knows Moz. Rand Fishkin’s company excels at putting out great direction and insights for new and seasoned search marketers. There’s also a lot going on in its Keyword Explorer tool to capture keyword suggestions, click-through rates, keyword ranking difficulty and page-one organic results showing who ranks and why. A drop-down menu favors international usage as well. The tool is incredibly powerful and simple to use, letting visitors download 1,000 keywords with a single click, in and out, without needing to sign up for anything. Everyone should have this tool bookmarked for that alone.

What’s not to like about it? Free queries are capped at 20 per month. The software is also buggy at times and timed out for about half my searches during peak, mid-day hours. The monthly price is roughly $150, which unlocks more results, list saving and endless searching. There are other features like site auditing and rank tracking, too, but those are slightly beyond the scope of what you need from a solid keyword tool. Also, if you want to play around with it on a smartphone, prepare for a more tedious experience than you might expect: Its design, while great for larger monitors, doesn’t lend itself as well to handheld devices.

Ease of Use: 3.5/5
Free Features: 4.5/5
Breadth of Data: 5/5
Total: 13/15


3. SEMrush



SEMrush is well-branded as an all-in-one digital marketing suite. With it, many insightful tools are at your disposal without needing to do much more than register for free. Keyword gap analysis shows, in hard numbers, which terms both you and your competitors are targeting. Paid and organic traffic data is charted against Google updates to show what may have impacted rankings. Anchor text of backlinks can be grabbed to identify strengths and weaknesses in your content authority. For paid users, a social media tracker is even available to watch how your and your competitors’ shares and audiences grow alongside published content. There is reporting out the wazoo here. No one could even begin to tell you what all of SEMrush’s bells and whistles do without needing a 15-minute break at some point.

What’s not to like about it? The subscription price point of $99.95 is rather steep for single contractors or small agencies when other reliable SEO management and discovery tools can ring up for much less. The free version of SEMrush generally only provides about 10 keyword results at a time and will promptly lock out the user with a sign-up interstitial once 10 queries are made. Also, from playing around with just about every keyword and search marketing software platform there is, this one feels the most overwhelming and busy.

Ease of Use: 4.5/5
Free Features: 4/5
Breadth of Data: 5/5
Total: 13.5/15


2. SpyFu



SpyFu is a tool focused on digging up user and competitor profiles to expose a site’s performance: where it has slipped off page one and where it is rising to glory. Its tabbed interface is quick and easy to navigate and an all-around enjoyable design to play with, as it is both visually appealing and useful on both small and large screens. Prompts and suggestions are already supplied in a lot of areas, giving a user the feeling that the software is more intuitive than they gave it credit for. The SpyFu team was kind enough to leave out any interstitials and pop-ups prompting a user to sign up to explore the tool’s latest features, so a user can quickly knock out a few queries without distraction. Its overall analysis spans nicely between organic and paid placement for both your own site and the competition, so nothing is getting overlooked. You can even send PDFs without ever signing in. If you choose to subscribe, the basic plan is very reasonable at about $40 a month, which provides 5,000 tracked keywords and pretty much an unlimited setting for everything else. If you are considering other tools for a few bucks less, you should still head here first and see if you like it. It’s an impressive tool throughout.

What’s not to like about it? Not a whole lot, to be perfectly honest. There’s a large amount of scrolling needed to get through some of the tabs. Some figures in the tool feel rounded or approximated, but that’s fairly normal from tool to tool. The biggest complaint would probably be that the free offering only allows five keyword result samples and competitor comparisons at a time when not logged in or signed up, which cuts both ways: There’s enough meat there to get me to sign up, but it can also leave the frugal person wanting to venture to another tool with more results available at their fingertips.

Ease of Use: 5/5
Free Features: 4/5
Breadth of Data: 5/5
Total: 14/15


1. Serpstat



Serpstat, the all-in-one SEO platform heavyweight, comes out of Ukraine. It’s a goliath of a research tool, probably designed by a scary group of data-mining individuals always ready to email, chat with (and probably dig up dirt on) anyone using their software. The tool performs best at pulling keyword and competitor data from English- and Russian-language websites and search engines. A free account gets you 30 daily queries, which populate 10 keyword results at a time. Trending graphs are available for both site traffic and keyword popularity. Lists of 20 competitor websites from organic results, ranked by keyword commonality and visibility, are also available.

Serpstat grants access to keyword lists and free exporting functions for values like CPC, total search volume, total search results and more. Other unique metrics for keyword results include a difficulty score for organic rank and a projected competition score for PPC, as well as PPC cost estimations (still not impressed? Again — nothing bought yet). Free users can also grab a sampling of backlink anchor text, new or lost. There is also a free user profile setup and a “Tree View” feature, which lets an individual website be entered and tracked. For this feature, you see not only the keywords the site ranks in the top 100 for, but also page locations, keyword search volume, search result volume and CPC estimates for first-position bids.

This level of insight generally comes at subscription prices in other tools. But here it is, right there — nothing spent. And the cost for analyzing more than 10 terms at a time? $19. A basic plan of 200 tracked keywords, 300 daily queries and 100 words per report is less than a fifth of the price of other digital marketing suites. Want more? There’s also a $70 Plan B subscription, which includes even more data. So, if you need a reliable SEO tool that gets you some quick data for free, and some worthwhile data for a fraction of the price of other tools, Serpstat tops them all.

What’s not to like about it? If you just want keyword suggestions and search volume, 10 results at a time on the free version is a little light compared to other tools (but I understand that no one wants to give the whole cow away for free). Also, the view on mobile doesn’t make me super happy.

Ease of Use: 4.5/5
Free Features: 5/5
Breadth of Data: 5/5
Total: 14.5/15

The post 10 Free Keyword Tools to Help You Get Started appeared first on The Search Agency.

Find the Right People and You’ve Found the Right Job

The Search Agents Feed - Sun, 02/11/2018 - 23:31

Enjoying what you do for a living is just as important as taking pleasure in your hobbies. However, when people ask me what I do, I tend to offer a vague answer about digital marketing and move on to more interesting topics. Thankfully, 99 percent of us are much more interesting and engaging than our job titles suggest.

I work in sales for a relatively large and successful digital marketing agency where day-to-day life is a series of fairly predictable tasks: speaking to existing clients, contacting potential new clients, dealing with internal tasks, etc., etc.

But if my working life is similar to most other sales people, then why do I enjoy my job while so many others do not? A positive mentality with reasonable clients definitely helps, but most important for me is the environment I work in and the people I work with.

Anything can be unpleasant in the wrong circumstances – a simple eight-yard pass into an open goal can be crushingly stressful when it is to clinch the title in front of 60,000 diehard fans. And it’s all the more daunting if you play for Tottenham and your team has NEVER won the title — NEVER. NOT EVEN ONCE. How is that even possible? Who knows…

I like my job and I like being at work. I live in London, so, naturally, I hate travelling there — the tube, the rain, the wind even more than the rain and the daily misery that is negotiating Oxford Circus. However, once in the office, I’m genuinely pleased to be there. Even more pleased when Tom Fairclough eventually wanders in telling stories of his wild life in London’s social scene…

It’s not just me — everyone enjoys our office environment. And our clients can tell. When we meet with potential new partners or current clients, it’s obvious how much we enjoy our work. People who like what they do tend to do it well.

Even when things do not go right, I’ve never heard even so much as a raised voice in all the time I’ve been at The Search Agency, although a few weeks ago after some DEEPLY unsporting conduct displayed on the Mario Kart course, I referred to one of my fellow competitors using a term that was neither flattering nor in the spirit of Christmas.

I benefit from having very capable colleagues, but crucially my colleagues put as much effort into being decent, friendly, affable people as they do in to excelling at their work. That’s why we all like it here and why I’m just as cheery on Monday morning as I am on Friday night…

Essentially, find the right people and you’ve probably found the right job.

The post Find the Right People and You’ve Found the Right Job appeared first on The Search Agency.

Battle of the Smart Speakers: Google Home vs. Amazon Echo

The Search Agents Feed - Wed, 02/07/2018 - 09:08

The verdict’s in: Smart home speakers are a hit, with 25 million devices sold as of 2017. The market isn’t slowing down, either: Apple’s HomePod launch this month will help grow that number to around 36 million in 2018. Smart home speakers like the Amazon Echo and Google Home and their live-in digital assistants have transformed homes all over the globe, allowing people to turn off their lights, lock their doors, get quick answers to questions and find local businesses with just their voice.

Smart speakers and digital assistants aren’t going anywhere, so is it possible for businesses to intentionally acquire a brand presence when users ask their Google Assistant or Alexa where the best place to get pizza is, or how to avoid razor burn or how to make the best chicken parmesan? Yes! All through the magic and power of search engine optimization.

The question remains, though: Which of the two will actually deliver your content in spoken word when users need it, and which should you expect to provide the most return for your efforts? Amazon Echo and Google Home both pull information from a variety of sources, but where they pull it from differs greatly. To answer this question, I asked my Google Home and Amazon Echo Dot a series of queries to learn where their wealth of knowledge stems from, which is smarter and what someone can reasonably expect from optimizing their content for digital assistants.

What are each one’s strengths and weaknesses? And how likely is each to pull your site’s content to answer one of the millions of voice searches performed each day? Let’s break this head-to-head down by type of query.



“Where can I find good Chinese food?”



The first question I asked my two smart speakers was where I can find good Chinese food around me. My Google Home gave me a brief list of a few “top-rated” local places, calling out their names and addresses. Unfortunately, what it didn’t do was provide me a resource to revisit these listings, so unless I had a pen and paper handy and was able to write with lightning speed, I wouldn’t have been able to remember even half of the places it told me. I asked Google to send those locations to my phone, but it didn’t understand my request.

Of course, Google allows users to look back on their queries through the “My Activity” page, accessible on a browser or through the Google Home mobile app. However, this doesn’t seem to be the most user-friendly solution – not when compared to the way the Echo Dot handles things. Not only did the Echo Dot verbally list out locations for me, but it also laid the information out in an attractive format in my Alexa app.



The only questionable piece of this would be how Alexa doesn’t source where these star ratings are coming from. For all I know, these could be from Yelp or from the Echo development team’s own internal rating system. Why they were food touring Chinese restaurants in the Sherman Oaks area, I dunno. But it’s a possibility. A more elegant implementation would be to not only list the source of these ratings, but also link to their Yelp or Google+ business page so users can easily access it through an app link or their browser.

When searching for local businesses, the winner here for brands would have to be the Amazon Echo. While both Google Home and Amazon Echo let users verbally select from a list and call a local business, the Echo’s way of handling things seems to put users on a more solid purchase path, allowing them the option to visually review shops and restaurants before deciding on one.



“I need a smog check.”



The next request I had for my Dot and Google Home was the conversational, “I need a smog check.” Google was immediately able to identify that I was looking for local auto shops that perform smog tests, but Alexa struggled, replying with an oddly proud, “Sorry, I don’t know that one!” Alexa struggled with a lot of these same types of queries while Google excelled, thanks to its ability to decode and understand natural language.

As smart assistants become, well, smarter, and owners of smart home speakers become accustomed to talking to them like they would another person, Google Home will have a major advantage if it’s able to understand a whole bevy of different ways to ask a question. Local businesses will gain greater exposure through Google Home, which can interpret search intent to an almost-T and dole out information as it sees appropriate. Meanwhile, Alexa will remain crippled if it can only understand basic queries like, “Find me smog test stations nearby.”



“Who’s the wife of the 44th president of the United States?”



I asked my Google Home and Echo Dot for some facts, one of which was the above query. They both responded with the correct answer, Michelle Obama, able to identify the relationship between the entities “wife,” “44th president” and “United States.” To test the capabilities of these two further, I started a new line of interrogation with both.

I asked both, “Who is the 44th president of the United States?” Easy: Barack Obama. Unsurprisingly, both gave me the correct answer. Next, I asked, “Who’s his wife?” Again, they were able to answer correctly. Next, I asked, “What are the names of her daughters?” Alexa tapped out, unable to understand what I was asking. Google, however, responded quickly: “Her daughters are Natasha Obama and Malia Ann Obama.” Sweet. Next, I asked, “How old are they?” Google only interpreted this as a question of Sasha Obama’s age, but gave me the right answer regardless. Lastly, I asked Google, “Where were they born?” That’s when Google gave up, too.

Something else notable was that while Alexa simply provided an answer to my questions, Google also cited its sources. When I asked it who the wife of the 44th president was, Google began with, “Here’s a summary from the website…”



While some consumers may not care for this added level of detail, and may actually prefer Alexa's straight-to-the-point approach, it gives brands the added benefit of a spoken mention. Alexa seems to pull a lot of its information straight from Wikipedia, while Google uses text found in its featured snippet blocks.



Because of this, brands may find more value in products equipped with the Google Assistant and, correspondingly, see the need to optimize for featured snippets. With Amazon Echo, unless you’re a Wikipedia editor, there’s little opportunity to have your content served up when a user asks a question. Meanwhile, since Google Home grabs info from featured snippets for a query, there’s a world of opportunity to ensure your content is what Google’s using as its trove of wisdom.

Imagine you owned a pest-control website and your five-star, A-plus content had nabbed you featured snippets for every single roach-related query out there. For that Google Home user whose house is infested with cockroaches, you can bet Google Home would be solidifying your website's name in their head with every answer to the dozens of roach-related questions they ask. Then, later, while they're at work, stressing over the thought of their pending extermination, they'll type your URL into their browser to help ease their anxiety, securing you both brand loyalty and traffic.



Even later, while at a fancy dinner party with a group of their closest friends, they'll see a little brown bug crawl across the table, up and over their wine glass, and straight into their soup, where it'll drown. And they'll understandably freak out and, for a moment, consider advising their friend to just burn their home down. But then they'll stop, and think, and remember all the amazing advice their Google Home gave them just before referring their friend to your site.

It may seem like there was an uncomfortable level of detail in that story, but I assure you, it was only hypothetical.



“Tell me a joke.”



The last, but clearly most important, request I had for my smart speakers was to tell me a joke. Google Home responded with, "What do you call a snake wearing a hard hat? A boa constructor!" Alexa gave me, "Knock knock. Who's there? Lettuce. Lettuce who? Lettuce in – it's cold outside!"

Both were just awful.



In conclusion…



While it has some shortcomings, Google Home is the clear winner in this battle – as far as value to brands and SEO potential goes. Amazon’s Echo line of smart speakers certainly has a lot of tricks and cool features available to it, including integration with Amazon Prime. But with the most advanced search engine in the world powering Google Home, it’s far more capable of promoting discoverability of your brand or website.

As of 2017, Amazon is dominating the smart speaker market, with 70 percent of users owning Echo devices. Although Google trails far behind with only a 24-percent market share, that margin is expected to shrink as years go by. Google Assistant, the brains behind Google Home products, isn't limited to speakers, either: it lives on smartphones, smartwatches, laptops and more. That means if your website is optimized for smart speakers, it's optimized for all of those other platforms as well.

Aim to gain featured snippets. Optimize your content based on the long-tail, conversational searches users are making. Tidy up your Yelp and Google+ presence. Claim your Google My Business page. Whatever methods you can use to structure your content and/or business information in a way that’s easily digestible by search engines will help immensely when it’s time for Google Homes and Amazon Echoes around the globe to answer questions.
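That "easily digestible" structure usually means schema.org markup. As a minimal sketch (all business details below are hypothetical placeholders, not a real listing), here is how you might generate a JSON-LD LocalBusiness block using only Python's standard json module:

```python
import json

# Hypothetical local business details -- swap in your own real data.
business = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Auto Shop",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Main St",
        "addressLocality": "Los Angeles",
        "addressRegion": "CA",
        "postalCode": "90001",
    },
    "telephone": "+1-555-555-0100",
    "openingHours": "Mo-Sa 08:00-18:00",
}

# Wrap the JSON-LD in the <script> tag that belongs in the page's <head>.
snippet = (
    '<script type="application/ld+json">\n'
    + json.dumps(business, indent=2)
    + "\n</script>"
)
print(snippet)
```

Paste the printed block into your page's head section; a structured-data testing tool can then confirm that search engines parse it the way you intend.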

With all the new ways users are searching, proactivity is vital. Otherwise, when a smart speaker owner wants some Italian food or a car tune-up or information about dealing with a roach problem, you'll be left out in the cold. Just imagine a world where that hypothetical roach-expert site had no voice search presence. The global cockroach epidemic would be catastrophic.

The post Battle of the Smart Speakers: Google Home vs. Amazon Echo appeared first on The Search Agency.