The New 320-Character Meta Description Limit: How Should We Be Using It?

The Search Agents Feed - Wed, 02/21/2018 - 09:45

I remember the day that they arrived.

First came the rumours. Half-glimpses, grainy smartphone footage, tweets from folk insisting on what they’d seen. Of course, rational people dismissed these. There’s no way, they said, debunking the social media hearsay and deriding its claimants.

But then came reports from trusted news sources — not conspiracy or clickbait sites — of sightings in the wild across the globe.

We kept our eyes open, hoping to spot one for ourselves. Was this real? Because if it was, it would turn everything upside down.

The world waited with bated breath. All we needed was confirmation from an official source. And then it came.

On December 1, Google confirmed to Search Engine Land that it had indeed increased the maximum search result snippet length from 160 to 320 characters.

The search world erupted with panic. Would this be rolled out to all search results? How would this affect the 160-character-max meta descriptions we’d been crafting for clients for years? What about SERP length – would even fewer organic results be displayed per page, owing to the increased real estate each demanded? Does this change the way we should write meta descriptions going forward?

Once we’d all stopped screaming and had a cup of tea, most digital marketers figured we could work out the answers, and how best to proceed, on our own.

 

Will all meta descriptions have 320 characters going forward?

 

Did all meta descriptions have 160 characters before? While the received wisdom was that meta descriptions should use all the space available to them, countless examples existed of pages bucking this trend and still recording strong organic CTRs.

And that’s even if Google chooses to use the meta description provided in the site’s HTML – often the search engine will pull what it deems a better answer for a user query from a page’s body content. A detailed and relevant answer to a question is obviously more likely to be closer to 320 characters than 160, so informational searches are more likely to return much longer results.

Searches for a service, on the other hand, are likely to return a more varied range of snippet or meta description lengths. These sites demonstrate their quality through hard metrics such as conversion rate – so if a page with a 140-character meta description had 30 percent of users landing on it making a purchase before this change was rolled out, it’s less likely to be affected.

 

Will fewer organic search results be shown in SERPs?

 

This is the question striking fear into the pit of most SEO stomachs. With rich snippets and image, video and paid results meaning “top ten” and “page one” are no longer one and the same, are we going to have to start securing even higher positions to enjoy the benefits of a page-one ranking?

Well, Google doesn’t seem to be returning significantly fewer results per page – but that doesn’t mean pages at the bottom of each SERP aren’t going to suffer.

Why? Just because search snippet and meta description length has doubled, users aren’t going to spend twice as long browsing search results. The change means that:

  • Higher-ranked descriptions are more likely to satisfy the original query, simply by virtue of having more space to do so.
  • Users’ patience for scrolling through search results will run out after they’ve seen fewer of them.

Monitor your pages that generally rank between positions seven and 10. If organic traffic to these pages has dropped since December, then obtaining a higher ranking needs to become an even greater priority.
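One lightweight way to run that check, assuming you have a daily organic-sessions export for the page (the dates and figures below are invented purely for illustration), is to compare the average before and after the rollout:

```python
from statistics import mean

def traffic_change(daily_sessions, cutoff):
    """Relative change in mean daily organic sessions before vs. after
    a cutoff date. daily_sessions maps ISO date strings to session
    counts, e.g. pulled from an analytics export."""
    before = [v for d, v in daily_sessions.items() if d < cutoff]
    after = [v for d, v in daily_sessions.items() if d >= cutoff]
    if not before or not after:
        return None  # not enough data on one side of the cutoff
    return (mean(after) - mean(before)) / mean(before)

# Invented export for a page hovering around position eight:
sessions = {
    "2017-11-28": 40, "2017-11-29": 44, "2017-11-30": 42,
    "2017-12-05": 31, "2017-12-06": 29, "2017-12-07": 30,
}
change = traffic_change(sessions, "2017-12-01")
print(f"Organic traffic changed by {change:+.0%} since the rollout")
```

A sustained negative figure on pages ranked seven through 10 is the signal to prioritize pushing those pages higher.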

 

How does this affect the meta descriptions I’ve written for my clients? I don’t have to go back and rewrite them all, do I? Should I be writing 320-character meta descriptions going forward?

 

Is it worth your or your client’s time to rewrite or expand every meta description ever written for the site? Probably not. Is it worth investigating whether you can use the new character limit to improve certain pages? Absolutely.

Check your analytics for organic CTRs which are low relative to the rest of your site. Is Google generally returning your old 160-character meta description, and if so, could more detail be added that could help drive the user to click on your site over your competitors? If so, then go right ahead.
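If you’d rather run that check at scale than page by page, a stdlib-only sketch like the one below can pull the meta description out of a page’s HTML and report how much of the new limit it uses (the sample page is invented; in practice you’d feed it the HTML of your own URLs):

```python
from html.parser import HTMLParser

MAX_LEN = 320  # Google's expanded snippet ceiling

class MetaDescriptionParser(HTMLParser):
    """Grabs the content of the first <meta name="description"> tag."""
    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        if tag == "meta" and self.description is None:
            d = dict(attrs)
            if (d.get("name") or "").lower() == "description":
                self.description = d.get("content", "")

def audit(html):
    """Return (description, characters used, characters left)."""
    parser = MetaDescriptionParser()
    parser.feed(html)
    desc = parser.description or ""
    return desc, len(desc), MAX_LEN - len(desc)

# Invented page for illustration:
page = ('<html><head><title>Widgets</title>'
        '<meta name="description" content="We sell blue widgets.">'
        '</head><body></body></html>')
desc, used, left = audit(page)
print(f"{used} characters used, {left} still available under the {MAX_LEN} limit")
```

Pages showing lots of unused room and a weak CTR are the natural first candidates for a rewrite.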

It’s important to note here that most search results aren’t hitting the full 320-character limit. Most are coming in at the mid-200s, and results of this length are usually either the meta description combined with a snippet of body content, or a snippet of body content alone.

What’s more important than hitting the limit is ensuring what you’ve provided is relevant and enticing. Remember, your meta description isn’t a direct ranking factor — it merely offers free organic advertising space to help drive users to your site over its competitors. So yes, use the extra space to promote your business’s USPs and add more tangible product detail. Add a call to action if you didn’t have room before. However you can improve your description as a piece of advertising, do it.

But don’t see the extra space as an opportunity to stuff five more keyword variations in. Just as you now have double the room to sell your site and product, you’ve also got twice as much chance of being flagged as spam if you go about it the wrong way.

And don’t try to fill the space just because you can – if you need the full 320 characters just to spell out the purpose of a page or your business, then it’s probably slightly overcomplicated. If you’re filling up your meta description à la a student trying to reach the word count on an essay an hour before the deadline, then you’re going to end up with flabby, overwritten descriptions that deter more users than they entice.

 

The post The New 320-Character Meta Description Limit: How Should We Be Using It? appeared first on The Search Agency.

10 Free Keyword Tools to Help You Get Started

The Search Agents Feed - Wed, 02/14/2018 - 00:57

Every year, many dedicated and ambitious web developers work to make their keyword analysis tools a little more cutting edge and informative than the prior year. The best tools require a monthly subscription, but there are still some great free options to get you started with the keyword research process.

Here’s a list of the top 10 keyword analysis tools you’ll find online that lend a lot of free and insightful information to help you make the most out of your search marketing initiatives. The metrics used to rank them are ease of use, useful features provided for free (versus paid users) and the breadth of data provided.

Let’s start with…

 

10. Google Trends

 

 

Google Trends easily finds a spot on this list for being such a simple tool for querying keyword search behavior across all of Google’s various indexes, with regional targeting and comparison features to identify other significant terms. Downloading and sharing are possible through its interface, and it’s also accessible on other TLDs. This tool operates as a sort of starting point to verify whether user interest in a term is surging or declining.

What’s not to like about it? The answer: “Hmm, your search doesn’t have enough data to show here.” Sorry, Google: That isn’t a satisfying answer, because you’re Google and have a massive repository of search data from your various deployed servers. The values provided also don’t demonstrate a clear quantitative score for breakout terms and keyword interest during the selected time frame, so the tool needs supplementing with others to demonstrate a term’s absolute popularity. The search fields within Trends would also benefit from Google Suggest, which could point users toward terms they wouldn’t otherwise find. The home landing page shows a list of trending stories, yet that engaging experience disappears once a query is entered. Finally, links to relevant Google search results could easily be provided, saving the user the trouble of running the same search in another tab to see what’s really there.

Ease of Use: 3.5/5
Free Features: 5/5
Breadth of Data: 1.5/5
Total: 10/15

 

9. Google Correlate

 

 

Google Correlate, like its sibling Google Trends, helps users review terms that have a similar search history as another term queried. Simply enter a term in the field provided or upload your own, and a list populates showing what a user might also search for, historically. It’s a great tool to explore similar themes in content marketing and can show what products might lend themselves nicely to a “You may also be interested in” sort of module. The tool renders two line graphs to show search behavior trends and their similar fluctuations over time. Like the Google Trends tool, this data can also be narrowed to different countries or refined to states in the United States. It can also be exported to a CSV file for other statistical analysis.

What’s not to like about it? It has really just one job and one job only: regionalized term trending. There are no total searches, related links or other quantitative data, making it a narrower, back-pocket tool to use alongside your other keyword analysis and topic research processes.

Ease of Use: 5/5
Free Features: 2/5
Breadth of Data: 3.5/5
Total: 10.5/15

 

8. Wordtracker

 

 

Wordtracker is a keyword discovery tool that incorporates some very different keyword metrics besides monthly searches and cost-per-click value. Wordtracker evaluates a website’s keyword relevance, as well as the competitiveness of those keywords on the web and by region. It does this by seeing how often a keyword appears as linked text on a page and in the title of that same webpage. It has upload capabilities and keyword filters in its interface to manage searches that are questions, as well as to narrow results to desired phrases and filter out undesired ones. It deploys word clouds and a bar graph to demonstrate month-to-month trending across the year. It even gives a sample of ranking pages from the SERP and populates auto-suggestion results from Amazon and YouTube.

What’s not to like about it? Free users are only able to see 20 keyword results per query and get just a handful of searches before a “sign up” request blocks usage. Exporting results is not available until a subscription plan is acquired. Subscription plans start at $27 per month, which isn’t bad. An alternative subscription plan is available at a little over double the price, but it doesn’t seem to get you a whole lot more. If you add a few more dollars to the $69 subscription price, you would likely get more value from another digital marketing tool further down this list. Also, ask yourself if you often work on the go with smaller devices. If so, Wordtracker leaves a whole lot to be desired on small screens. It’s accessible, but it’s not going to win the hearts of smartphone users.

Ease of Use: 4/5
Free Features: 2/5
Breadth of Data: 5/5
Total: 11/15

 

7. SERPS

 

 

SERPS is a great tool to bookmark for those times when you quickly need a keyword’s monthly search volume and its projected cost-per-click. There is no fuss with this tool, no free-user counter and no perpetual “buy” popup getting thrown at you all the time — just a quick-and-easy list of matching results for a keyword search. Save them, export your list and you’re off to the races. It looks satisfactory on mobile, too.

What’s not to like about it? A search can give you over 10,000 results, but the interface only shows 20 results at a time before needing to resort to pagination. Grabbing all the terms available in one fell swoop is not going to happen here quickly. On the bright side, it does give a great list of related long-tail terms, but if you searched just a one-word term, you may not actually see that word populate within your list. A search for “robots” doesn’t give you the search volume for “robots” — it populates the list with “cool robots” or something longer tail, thus leaving you a bit in the dark. No other fancy analysis is included in the free version.

Ease of Use: 3.5/5
Free Features: 5/5
Breadth of Data: 3/5
Total: 11.5/15

 

6. Google Search Console

 

Google Search Console has been and will always be a vital weapon in every SEO’s quiver. While it can never be left alone on a back burner somewhere, its role as a resource for keyword research has both positives and negatives. The two areas that offer the most direction are “Links to Your Site,” where a user can see which pages are linked to the most and the keywords associated with them, and “Search Analytics,” the best resource inside GSC, where sorting and analyzing keyword positions can help even the greenest SEO figure out which strengths and weaknesses to go after.

What’s not to like about it? For one, only 120 days of data. But the real weakness is that it is only as good as your site performance thus far, so if you just launched and are still getting off the ground, any traffic data Search Console can offer you is going to be extremely limited. It’s far more powerful with significant time put into your website already, so you’ll have to wait to experience its full potential. For more information about this tool and its latest release, take a look at our article diving into Google Search Console’s freshest features.

Ease of Use: 5/5
Free Features: 5/5
Breadth of Data: 2/5
Total: 12/15

 

5. Mangool’s KWFinder

 

 

Mangool’s KWFinder finds itself high on the list of top keyword tools for being very thoughtful in scoring how pages are ranking for keywords of interest, as well as allowing users to build keyword lists to track, export and re-reference when seeking further insights. It also has a very user-friendly interface on both mobile and desktop, with cleanly designed graphs and scores for competitors’ strengths and rankings for terms of interest.

What’s not to like about it? Backlink analysis for exploring anchor text is not yet available in the tool. Even with a paid account, keyword lookups are limited to 100, 500 or 1,200 terms daily, depending on the plan. It also caps its suggestion count at 700, which is unfortunate since other tools can supply more. For free users, export functions are available only for grabbing terms; grabbing other info requires a subscription. A month-to-month subscription runs about $50 a month for the basic version (less if you pay for the whole year), but for a few bucks more you can probably get a lot more bang from another tool.

Ease of Use: 5/5
Free Features: 2.5/5
Breadth of Data: 5/5
Total: 12.5/15

 

4. Moz Keyword Explorer

 

 

Everyone knows Moz. Rand Fishkin’s Moz.com excels at putting out great direction and insights for new and seasoned search marketers alike. There is also a lot going on in its Keyword Explorer tool to capture keyword suggestions, click-through rates, keyword ranking difficulty and page-one organic results showing who ranks and why. A drop-down menu favors international usage as well. The tool is incredibly powerful and simple to use, letting visitors download 1,000 keywords with a single click, in and out, without needing to sign up for anything. Everyone should have this tool bookmarked for that alone.

What’s not to like about it? Free queries are capped at 20 per month. The software is also buggy at times and timed out for about half my searches during peak midday hours. The price is roughly $150 per month, which unlocks more results, list saving and endless searching. There are other features, like site auditing functions and rank tracking, that come into play too, but those are slightly beyond the scope of a solid keyword tool. Also, if you want to play around with it on a smartphone, prepare for a more tedious experience than you might expect: its design, while great for larger monitors, doesn’t lend itself as well to handheld devices.

Ease of Use: 3.5/5
Free Features: 4.5/5
Breadth of Data: 5/5
Total: 13/15

 

3. SEMrush

 

 

SEMrush is well-branded as an all-in-one digital marketing suite. With it, many insightful tools are at your disposal without needing to do much more than register for free. Keyword gap analysis is available to show, quantitatively, which terms you and your competitors are both aiming toward. Paid and organic traffic data is charted against Google updates to demonstrate what may have impacted rankings. Anchor text of backlinks can be grabbed to identify strengths and weaknesses in your content authority. For paid users, a social media tracker is even available to watch how your and your competitors’ shares and audiences grow in conjunction with published content. There is reporting out the wazoo here. No one could even begin to tell you what all of SEMrush’s bells and whistles do without needing a 15-minute break at some point.

What’s not to like about it? The subscription price point of $99.95 is rather steep for single contractors or small agencies when other reliable SEO management and discovery tools can ring up for much less. The free version of SEMrush generally only provides about 10 keyword results at a time and will promptly lock out the user with a sign-up interstitial once 10 queries are made. Also, from playing around with just about every keyword and search marketing software platform there is, this one feels the most overwhelming and busy.

Ease of Use: 4.5/5
Free Features: 4/5
Breadth of Data: 5/5
Total: 13.5/15

 

2. SpyFu

 

 

SpyFu is a tool focused on digging up user and competitor profiles to expose a site’s performance: where it has slipped off page one and where it is rising to glory. Its tabbed interface is quick and easy to navigate and an all-around enjoyable design to play with, as it is both visually appealing and useful on small and large screens alike. Prompts and suggestions are already supplied in a lot of areas, giving users the feeling that the software is more intuitive than they gave it credit for. The SpyFu team was kind enough to leave out the interstitials and pop-ups that would otherwise prompt a user to sign up to explore the tool’s latest features, which makes it easy to quickly knock out a few queries without distraction. Its overall analysis spans nicely between organic and paid placement for both your own site and the competition, so nothing gets overlooked. You can even send PDFs without ever signing in. If you choose to subscribe, the basic plan for SpyFu.com is very reasonable at about $40 a month, which provides 5,000 tracked keywords and a pretty much unlimited setting for everything else. If you are considering other tools for a few bucks less, you should still head here first and see if you like it. It’s an impressive tool throughout.

What’s not to like about it? Not a whole lot, to be perfectly honest. There’s a large amount of scrolling needed to get through some of the tabs. Some figures in the tool feel rounded or approximated, but that’s fairly normal from tool to tool. The biggest complaint would probably be that the free offering only allows five keyword result samples and competitor comparisons at a time when not logged in or signed up, which cuts both ways: there’s enough meat there to get me to sign up, but it can also leave the frugal wanting to venture to another tool with more results available at their fingertips.

Ease of Use: 5/5
Free Features: 4/5
Breadth of Data: 5/5
Total: 14/15

 

1. Serpstat

 

 

Serpstat, the all-in-one SEO platform heavyweight, comes out of Ukraine. It’s a goliath of a research tool, probably designed by a scary group of data-mining individuals always ready to email, chat with (and probably dig up dirt on) anyone using their software. The tool performs best at pulling keyword and competitor data from English- and Russian-language websites and search engines. A free account gets you 30 daily queries, which will populate 10 keyword results at a time. Trending graphs are available both for site traffic and for keyword popularity. Lists of 20 competitor websites from organic results, by keyword commonality and visibility, are also available.

Serpstat grants access to keyword lists and free exporting functions for values like CPC, total search volume, total search results and more. Other unique metrics for keyword results include a difficulty score for organic rank, a projected competition score for PPC and PPC cost estimations (still not impressed? Again: nothing bought yet). Free users can also grab a sampling of backlink anchor text, new or lost. There is also a free user profile setup and a “Tree View” feature, which allows an individual website to be entered and tracked. This feature surfaces not only the keywords the site ranks in the top 100 for, but also page locations, keyword search volume, search result volume and CPC estimates for first-position bids.

This level of insight generally comes at subscription prices in other available tools. But here it is, right there, nothing spent. And the cost for analyzing more than 10 terms at a time? $19. A basic plan of 200 tracked keywords, 300 daily queries and 100 words per report is less than a fifth of the price of other digital marketing suites. Want more? There’s also a $70 Plan B subscription, which includes even more data. So, if you need a reliable SEO tool that gets you some quick data for free, and some worthwhile data for a fraction of the price of other tools, Serpstat tops them all.

What’s not to like about it? If you just want keyword suggestions and search volume, 10 results at a time on the free version is a little light compared to other tools (but I understand that no one wants to give the whole cow away for free). Also, the view on mobile doesn’t make me super happy.

Ease of Use: 4.5/5
Free Features: 5/5
Breadth of Data: 5/5
Total: 14.5/15

The post 10 Free Keyword Tools to Help You Get Started appeared first on The Search Agency.

Find the Right People and You’ve Found the Right Job

The Search Agents Feed - Sun, 02/11/2018 - 23:31

Doing something you are able to enjoy for your living is just as important as taking pleasure from your hobbies. However, when people ask me what I do, I tend to offer a vague answer about digital marketing and move on to more interesting topics. Thankfully, 99 percent of us are much more interesting and engaging than our job titles suggest.

I work in sales for a relatively large and successful digital marketing agency where day-to-day life is a series of fairly predictable tasks: speaking to existing clients, contacting potential new clients, dealing with internal tasks, etc., etc.

But if my working life is similar to that of most other salespeople, then why do I enjoy my job while so many others do not? A positive mentality and reasonable clients definitely help, but most important for me is the environment I work in and the people I work with.

Anything can be unpleasant in the wrong circumstances – a simple eight-yard pass into an open goal can be crushingly stressful when it is to clinch the title in front of 60,000 diehard fans. And it’s all the more daunting if you play for Tottenham and your team has NEVER won the title — NEVER. NOT EVEN ONCE. How is that even possible? Who knows…

I like my job and I like being at work. I live in London, so, naturally, I hate travelling there — the tube, the rain, the wind even more than the rain and the daily misery that is negotiating Oxford Circus. However, once in the office, I’m genuinely pleased to be there. Even more pleased when Tom Fairclough eventually wanders in telling stories of his wild life in London’s social scene…

It’s not just me — everyone enjoys our office environment. And our clients can tell. When we meet with potential new partners or current clients, it’s obvious how much we enjoy our work. People who like what they do tend to do it well.

Even when things do not go right, I’ve never heard even so much as a raised voice in all the time I’ve been at The Search Agency, although a few weeks ago after some DEEPLY unsporting conduct displayed on the Mario Kart course, I referred to one of my fellow competitors using a term that was neither flattering nor in the spirit of Christmas.

I benefit from having very capable colleagues, but crucially my colleagues put as much effort into being decent, friendly, affable people as they do into excelling at their work. That’s why we all like it here and why I’m just as cheery on Monday morning as I am on Friday night…

Essentially, find the right people and you’ve probably found the right job.

The post Find the Right People and You’ve Found the Right Job appeared first on The Search Agency.

Battle of the Smart Speakers: Google Home vs. Amazon Echo

The Search Agents Feed - Wed, 02/07/2018 - 09:08

The verdict’s in: Smart home speakers are a hit, with 25 million devices sold as of 2017. The market isn’t slowing down, either: Apple’s HomePod launch this month will help grow that number to around 36 million in 2018. Smart home speakers like the Amazon Echo and Google Home and their live-in digital assistants have transformed homes all over the globe, allowing people to turn off their lights, lock their doors, get quick answers to questions and find local businesses with just their voice.

Smart speakers and digital assistants aren’t going anywhere, so is it possible for businesses to intentionally acquire a brand presence when users ask their Google Assistant or Alexa where the best place to get pizza is, or how to avoid razor burn or how to make the best chicken parmesan? Yes! All through the magic and power of search engine optimization.

The question remains, though: Which of the two will actually deliver your content in spoken word when users need it, and which should you expect to provide the most return for your efforts? Amazon Echo and Google Home both pull information from a variety of sources, but where they pull it from differs greatly. To answer this question, I asked my Google Home and Amazon Echo Dot a series of queries to learn where their wealth of knowledge stems from, which is smarter and what someone can reasonably expect from optimizing their content for digital assistants.

What are each one’s strengths and weaknesses? And how likely is each to pull your site’s content to answer one of the millions of voice searches performed each day? Let’s break this head-to-head down by type of query.

 

 

“Where can I find good Chinese food?”

 

 

The first question I asked my two smart speakers was where I can find good Chinese food around me. My Google Home gave me a brief list of a few “top-rated” local places, calling out their names and addresses. Unfortunately, what it didn’t do was provide me a resource to revisit these listings, so unless I had a pen and paper handy and was able to write with lightning speed, I wouldn’t have been able to remember even half of the places it told me. I asked Google to send those locations to my phone, but it didn’t understand my request.

Of course, Google allows users to look back on their queries through the “My Activity” page, accessible on a browser or through the Google Home mobile app. However, this doesn’t seem to be the most user-friendly solution – not when compared to the way the Echo Dot handles things. Not only did the Echo Dot verbally list out locations for me, but it also laid the information out in an attractive format in my Alexa app.

 

 

The only questionable piece of this would be how Alexa doesn’t source where these star ratings are coming from. For all I know, these could be from Yelp or from the Echo development team’s own internal rating system. Why they were food touring Chinese restaurants in the Sherman Oaks area, I dunno. But it’s a possibility. A more elegant implementation would be to not only list the source of these ratings, but also link to their Yelp or Google+ business page so users can easily access it through an app link or their browser.

When searching for local businesses, the winner here for brands would have to be the Amazon Echo. While both Google Home and Amazon Echo let users verbally select from a list and call a local business, the Echo’s way of handling things seems to put users on a more solid purchase path, allowing them the option to visually review shops and restaurants before deciding on one.

 

 

“I need a smog check.”

 

 

The next request I had for my Dot and Google Home was the conversational, “I need a smog check.” Google was immediately able to identify that I was looking for local auto shops that perform smog tests, but Alexa struggled, replying with an oddly proud, “Sorry, I don’t know that one!” Alexa struggled with a lot of these same types of queries while Google excelled, thanks to its ability to decode and understand natural language.

As smart assistants become, well, smarter, and owners of smart home speakers become accustomed to talking to them like they would another person, Google Home will have a major advantage if it’s able to understand a whole bevy of different ways to ask a question. Local businesses will gain greater exposure through Google Home, which can interpret search intent to an almost-T and dole out information as it sees appropriate. Meanwhile, Alexa will remain crippled if it can only understand basic queries like, “Find me smog test stations nearby.”

 

 

“Who’s the wife of the 44th president of the United States?”

 

 

I asked my Google Home and Echo Dot for some facts, one of which was the above query. They both responded with the correct answer, Michelle Obama, able to identify the relationship between the entities “wife,” “44th president” and “United States.” To test the capabilities of these two further, I started a new line of interrogation with both.

I asked both, “Who is the 44th president of the United States?” Easy: Barack Obama. Unsurprisingly, both gave me the correct answer. Next, I asked, “Who’s his wife?” Again, they were able to answer correctly. Next, I asked, “What are the names of her daughters?” Alexa tapped out, unable to understand what I was asking. Google, however, responded quickly: “Her daughters are Natasha Obama and Malia Ann Obama.” Sweet. Next, I asked, “How old are they?” Google only interpreted this as a question of Sasha Obama’s age, but gave me the right answer regardless. Lastly, I asked Google, “Where were they born?” That’s when Google gave up, too.

Something else notable was that while Alexa simply provided an answer to my questions, Google also cited its sources. When I asked it who the wife of the 44th president was, Google began with, “Here’s a summary from the website britannica.com…”

 

 

While some consumers may not care for this added level of detail, and may actually prefer Alexa’s straight-and-to-the-point approach, it gives brands the added benefit of a spoken mention. Alexa seems to pull a lot of its information straight from Wikipedia while Google uses text found in its featured snippet blocks.

 

 

Because of this, brands may find more value in products equipped with the Google Assistant and, correspondingly, see the need to optimize for featured snippets. With Amazon Echo, unless you’re a Wikipedia editor, there’s little opportunity to have your content served up when a user asks a question. Meanwhile, since Google Home grabs info from featured snippets for a query, there’s a world of opportunity to ensure your content is what Google’s using as its trove of wisdom.

Imagine you owned AllAboutRoaches.com and your five-star, A-plus content had nabbed you featured snippets for every single roach-related query out there. For that Google Home user whose house is infested with cockroaches, you can bet Google Home would be solidifying your website name in their head with every answer to the dozens of roach-related questions they ask. Then, later, while they’re at work, stressing over the thought of their pending extermination, they’ll type in AllAboutRoaches.com in their browser to help ease their anxiety, securing both brand loyalty and traffic.

 

 

Even later, while at a fancy dinner party with a group of their closest friends, they’ll see a little brown bug crawl across the table, up and over their wine glass, and straight into their soup, where it’ll drown. And they’ll understandably freak out and, for a moment, consider advising their friend to just burn their home down. But then they’ll stop, and think, and remember all the amazing advice their Google Home gave them just before referring their friend to AllAboutRoaches.com.

It may seem like there was an uncomfortable level of detail in that story, but I assure you, it was only hypothetical.

 

 

“Tell me a joke.”

 

 

The last, but clearly most important, request I had for my smart speakers was to tell me a joke. Google Home responded with, “What do you call a snake wearing a hard hat? A boa constructor!” Alexa gave me, “Knock knock. Who’s there? Lettuce. Lettuce who? Lettuce in – it’s cold outside!”

Both were just awful.

 

 

In conclusion…

 

 

While it has some shortcomings, Google Home is the clear winner in this battle – as far as value to brands and SEO potential goes. Amazon’s Echo line of smart speakers certainly has a lot of tricks and cool features available to it, including integration with Amazon Prime. But with the most advanced search engine in the world powering Google Home, it’s far more capable of promoting discoverability of your brand or website.

As of 2017, Amazon dominates the smart speaker market, with 70 percent of smart speaker owners using Echo devices. Google trails far behind with only a 24-percent market share, but that gap is expected to shrink in the years ahead. Google Assistant, the brains behind Google Home products, isn’t limited to speakers, either: it lives on smartphones, smartwatches, laptops and more. That means a website optimized for smart speakers is optimized for all of those other platforms, as well.

Aim to gain featured snippets. Optimize your content based on the long-tail, conversational searches users are making. Tidy up your Yelp and Google+ presence. Claim your Google My Business page. Whatever methods you can use to structure your content and/or business information in a way that’s easily digestible by search engines will help immensely when it’s time for Google Homes and Amazon Echoes around the globe to answer questions.

With all the new ways users are searching, proactivity is vital. Otherwise, when a smart speaker owner wants some Italian food or a car tune-up or information about dealing with a roach problem, you’ll be left out in the cold. Just imagine a world without a strong voice search presence for AllAboutRoaches.com: The global cockroach epidemic would be catastrophic.

The post Battle of the Smart Speakers: Google Home vs. Amazon Echo appeared first on The Search Agency.

Viewability: The Importance of Adequate Measurement in Display Advertising

The Search Agents Feed - Mon, 02/05/2018 - 09:50

Viewability is one of the most important issues facing the digital advertising space today. In a recent survey by the World Advertising Research Center (WARC), it ranked as the No. 1 issue facing marketers heading into 2018, with 49 percent of brands citing it as a major concern — and for good reason: Recent studies have shown that less than half of all desktop display impressions have an opportunity to be seen, and the numbers are even lower for mobile and video ads.

 

 

The Interactive Advertising Bureau (IAB) and the Media Rating Council (MRC) qualify a display ad impression as viewable if 50 percent or more of the ad’s pixels appear in the viewable space of an in-focus browser tab for at least one second post-ad render. This standard could tighten: the MRC is considering requiring 100 percent of the pixels in-view instead of 50 percent, and is now applying additional duration standards, as well. For more information, see MRC’s Digital Audience-Based Measurement Standards: A Guide for Marketing and Media Professionals, released in December 2017. Viewability can have major implications for the efficacy of digital campaigns.
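The standard above reduces to a simple classifier. This is an illustrative sketch, not an accredited measurement implementation; the `strict` flag models the proposed 100-percent-of-pixels revision:

```python
def is_viewable_display(pixels_in_view_pct, seconds_in_view, tab_in_focus,
                        strict=False):
    """Classify a display impression per the IAB/MRC standard:
    >= 50% of the ad's pixels in the viewable space of an in-focus
    browser tab for at least one second post-ad render. `strict`
    models the proposed 100%-of-pixels revision."""
    pixel_threshold = 1.0 if strict else 0.5
    return (tab_in_focus
            and pixels_in_view_pct >= pixel_threshold
            and seconds_in_view >= 1.0)

print(is_viewable_display(0.6, 1.2, True))               # meets the current standard
print(is_viewable_display(0.6, 1.2, True, strict=True))  # fails the proposed one
```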

 

How viewability can affect the bottom line

 

Higher viewability is correlated with higher CTRs, and a recent DataXu study found that a 10-percent increase in viewability corresponded to a 100-percent increase in CTR. As such, more viewable impressions will almost guarantee you more clicks to your website. Highly viewable inventory does tend to clear at higher CPMs, so it is important to look at your viewable-CPM (vCPM) and the volume of viewable impressions delivered rather than viewability rates on their own. Let’s take a look at how viewability optimization can translate to real results for a hypothetical advertiser.

 

                       Scenario One    Scenario Two
Budget                 $10,000         $10,000
CPM                    $5.00           $4.00
Impressions            2,000,000       2,500,000
Viewability %          70%             50%
Viewable Impressions   1,400,000       1,250,000
vCPM                   $7.14           $8.00
Estimated CTR          0.25%           0.15%
Clicks                 5,000           3,750
CPC                    $2.00           $2.67
Conversion Rate        1%              1%
Conversions            50              38
CPA                    $200            $263

 

The advertiser who achieved 70-percent viewability paid a higher CPM and received fewer total impressions, but garnered 12 percent more viewable impressions, resulting in a 67-percent higher CTR, 33 percent more clicks and a 25-percent lower CPC compared to the advertiser with 50-percent viewability. With a consistent back-end conversion rate, higher viewability led to a 25-percent lower cost-per-acquisition. Whether the goal is to drive conversions or heighten awareness and ad recall, advertisers who embrace viewability measurement are sure to achieve better results.
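The table’s arithmetic is easy to reproduce. A quick sketch using the figures above (note the table rounds conversions to 38, so its $263 CPA corresponds to the unrounded $266.67 here):

```python
def campaign_metrics(budget, cpm, viewability, ctr, conv_rate):
    """Derive downstream metrics from budget, CPM, viewability rate and CTR."""
    impressions = budget / cpm * 1000
    viewable = impressions * viewability
    vcpm = budget / viewable * 1000          # cost per 1,000 VIEWABLE impressions
    clicks = impressions * ctr
    conversions = clicks * conv_rate
    return {"impressions": impressions, "viewable": viewable, "vcpm": vcpm,
            "clicks": clicks, "cpc": budget / clicks,
            "conversions": conversions, "cpa": budget / conversions}

one = campaign_metrics(10_000, 5.00, 0.70, 0.0025, 0.01)  # Scenario One
two = campaign_metrics(10_000, 4.00, 0.50, 0.0015, 0.01)  # Scenario Two

print(round(one["vcpm"], 2), round(two["vcpm"], 2))  # 7.14 8.0
print(round(one["cpa"], 2), round(two["cpa"], 2))    # 200.0 266.67
```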

 

Emerging trends in 2018


 

While viewability standards are progressing, the space is evolving to address new ad types and environments. In 2017, the IAB Tech Lab acquired the Open Measurement SDK, a software development kit that will help standardize viewability measurement in the mobile space. This open source platform will help publishers avoid “SDK bloat” — the burden of incorporating multiple SDKs from different vendors, which can cause instability or slow loading times in mobile apps — and help ensure consistency in mobile viewability measurement. In the programmatic space, pre-bid technology built directly into DSPs is allowing advertisers to proactively avoid unviewable inventory rather than just monitoring retroactively.

Since Google’s Active View received MRC accreditation in 2017, agencies using DoubleClick as their primary ad server have been able to track viewability for standard ads with no additional cost or the need for a separate third-party vendor. Viewability averages vary greatly among different ad sizes — 300 x 600 ads boast the highest viewability, while 300 x 250s and 728 x 90s tend to have the lowest, so it is important to produce a variety of ad units to maximize viewability. On Facebook, which has limited compatibility with third-party ad servers like DoubleClick, advertisers have the option to verify viewability with third parties including Moat, IAS, comScore and Nielsen.

 

Navigating the available measurement options


 

Fortunately, advanced measurement tools are now more accessible than ever, so be sure to select an MRC-accredited vendor to monitor your campaigns. The following companies are MRC-accredited for viewability measurement:

 

 

Make sure that your selected measurement partner’s products and services are accredited for your particular media types. For instance, RealVu is accredited for mobile web, but not mobile in-app. Take a look at the full list of MRC-accredited vendors by ad type. With all these options available, advertisers should, at the very least, be monitoring the viewability of their display and social campaigns, if not paying only for viewable impressions. Viewable impressions will likely become the default transactional metric sooner rather than later, making digital display more akin to other media types. After all, if an ad can’t be seen, can it really make an impression?

The post Viewability: The Importance of Adequate Measurement in Display Advertising appeared first on The Search Agency.

Why User Experience is Important in Paid Search

The Search Agents Feed - Wed, 01/31/2018 - 08:50

Recent hunger cravings and a curiosity to try something new pushed me to go online and browse for some food services. A Google ad piqued my interest, so I clicked on it and browsed a restaurant menu. An interesting photo of “Harissa Udon Noodles” tempted me to place an order, and the food arrived shortly — within 20 minutes of checkout. So, in terms of prompt service, it scored a 10. But that’s where it all ended for me.

In terms of presentation, the food looked nothing like the picture on the site, and the taste left me dissatisfied. Unsurprisingly, I didn’t finish my meal and decided not to order from the same service again.

This experience is a lot like what we face regularly when visiting websites. Pages that load quickly are akin to food being delivered promptly. Seeing the site echo the same message as the ad is similar to being served the same dish as in the photograph. And enjoying the product post-purchase reflects a good user experience.

User experience plays a very big role in paid search, too, by impacting account performance and optimization in the following ways:

 

 

1. Bounce Rate and Time Spent on Site

 

 

Imagine clicking on an ad that says “20% Off” only to find that the offer has already expired. What would you do? You would probably leave the site and complain about the bad user experience. You wouldn’t bother browsing the products because you no longer trust the site, which cuts into your time spent on site.

But what does this all mean for your paid search campaigns? It leads to unnecessarily high spend and low conversion rates. By not validating that the ad messaging matches the landing page, you end up wasting a click that could have led to a conversion. It is therefore important to link your Google Analytics account with AdWords to pull this data in. By filtering for campaigns, ad groups and keywords with high bounce rates and low time-spent-on-site, you can quickly figure out whether you need to correct your ad copy or use a better landing page.
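Once that data is pulled in or exported, flagging offenders is a simple filter. A sketch with hypothetical rows and thresholds (the field names and cutoffs are illustrative, not an AdWords or Analytics schema):

```python
# Hypothetical keyword-level report rows; values are made up for illustration.
rows = [
    {"keyword": "running shoes", "bounce_rate": 0.85, "avg_time_on_site": 12},
    {"keyword": "trail shoes sale", "bounce_rate": 0.35, "avg_time_on_site": 95},
    {"keyword": "buy sneakers", "bounce_rate": 0.78, "avg_time_on_site": 20},
]

def needs_review(row, max_bounce=0.70, min_time=30):
    """Flag keywords whose landing-page experience likely mismatches the ad:
    high bounce rate combined with low time on site (in seconds)."""
    return row["bounce_rate"] > max_bounce and row["avg_time_on_site"] < min_time

flagged = [r["keyword"] for r in rows if needs_review(r)]
print(flagged)  # ['running shoes', 'buy sneakers']
```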

 

 

2. New Versus Returning Visitors

 

 

Paid search campaigns are an easy way of bringing in new visitors to a site because of the “discovery” aspect. Non-brand campaigns that target keywords which potential customers would be keying in can lead them to discover your website. So, even if potential customers don’t know your brand, you can target them in their moment of research/inquiry. However, that’s not the sole focus of paid search campaigns. Very few digital marketers use paid search efficiently to target returning visitors. Some visitors, who are already aware of your brand, may choose to visit it organically. However, for many others, it is highly important to keep reminding them of your brand. Remarketing plays a great role in that aspect. You can customize your messaging to them and possibly even reward them for loyalty.

Also, the cost of selling to a returning visitor is always far less than the cost involved in acquiring a new customer. Therefore, it is very important that a new visitor gets added to the pool of returning visitors because of a positive user experience the first time around.

 

 

3. Quality Score

 

 

This is the most obvious benefit of a positive user experience. Quality score is impacted by ad relevance, keyword relevance and landing page relevance. Quality score, in turn, impacts your account’s CPC and CPA. A high quality score can earn you better visibility at a lower cost-per-click, so your overall cost of acquiring a customer can be significantly reduced.
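A commonly cited simplification of the AdWords auction illustrates the relationship (this is a teaching formula, not Google’s exact, unpublished mechanics):

```python
def actual_cpc(ad_rank_below, quality_score):
    """Simplified second-price auction formula often used to illustrate how
    quality score lowers CPC: you pay just enough to beat the ad rank of the
    advertiser below you. Real AdWords auctions are more complex."""
    return ad_rank_below / quality_score + 0.01

# Same competitor ad rank, different quality scores:
print(round(actual_cpc(16, 4), 2))  # 4.01 -> lower QS, higher CPC
print(round(actual_cpc(16, 8), 2))  # 2.01 -> doubling QS roughly halves CPC
```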

 

 

4. Extensions

 

 

While most of the extensions we add to our paid search campaigns are ones we choose ourselves, some are user-driven. One of these is review extensions, which allow you to add quotes or rankings from published sources. A good review from a reputable published source is only possible on the back of a positive user experience.

There are also automated extensions like seller ratings, which showcase advertisers with high ratings. Google aggregates these from sources like Google Customer Reviews and StellaService (a company that evaluates quality of customer service), along with some independent review websites. They generally show only once you have collected at least 150 unique reviews with a composite rating of 3.5 stars or higher, so you cannot fake them. You have to offer a good user experience to earn them.
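The eligibility thresholds described above reduce to a simple check — illustrative only, since Google’s actual seller-ratings logic involves more signals than these two:

```python
def seller_rating_eligible(review_count, composite_rating):
    """Mirror the thresholds above: at least 150 unique reviews and a
    composite rating of 3.5 stars or higher."""
    return review_count >= 150 and composite_rating >= 3.5

print(seller_rating_eligible(200, 4.2))  # True
print(seller_rating_eligible(90, 4.8))   # False: not enough reviews yet
```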

Clearly, there are a lot of reasons to up your game when it comes to offering a good user experience on your site. This matters even more now, when factors like page load time and device compatibility make a huge difference. A small investment could lead to huge marketing-dollar savings across both organic and paid channels.

The post Why User Experience is Important in Paid Search appeared first on The Search Agency.

Mobile Visual Sitelinks: A Case Study

The Search Agents Feed - Mon, 01/29/2018 - 08:35

Do you have a client with a colorful array of items for sale, or services and content that may be deemed visually pleasing? Were you upset when Google sunset image ads? If you answered “yes” to any of those, then you may be interested in learning more about Google’s mobile visual sitelinks.

Whether or not you have a specific mobile strategy for your client, these may be appropriate for you to roll out. Because, let’s face it: the fact that you run on mobile at all is strategy enough nowadays. Using visual sitelinks is a very easy way to enhance your current sitelink extensions which show on mobile search engine results pages, and it helps you take up a good deal of results page real estate.

 

 

It is fairly easy to get started. First, you need to make sure you are opted in to this feature. You can check by looking under the “Labs” tab in the AdWords UI. If you see a “Visual Sitelinks” option, you are in business! If you don’t, you need to start nagging your Google rep ASAP. When you are opted in, all you need from the client are the images. The specifications which Google provides for images are straightforward:

Google Specifications:

  • JPG or PNG
  • 16:9 aspect ratio
  • 1280x720px
  • Max file size: 5MB
  • Color Space RGB
  • No logos, overlays or collages

After you get these images from the client, it is ALL on you. Mobile visual sitelinks work very similarly to regular sitelinks, and several of the same copy guidelines and rules apply. You still can’t use the same landing page more than once (reminder: this includes whatever is being used as the keyword’s final URL). The biggest difference is that instead of two description lines with 35-character limits, you have one description with a limit of 50 characters. The title character limit is still 25. More importantly, make sure that the titles and descriptions you write are relevant to the image itself. You can’t say “Check Out Our Solutions for Office Efficiency” and run it with an image of a hammock between two palm trees on a remote tropical island. It’s wishful thinking AND your extension will be disapproved.
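Under the rules above, a pre-flight check is easy to script. This is an illustrative sketch (the field names are our own, not an AdWords API schema):

```python
def validate_visual_sitelinks(sitelinks):
    """Check the copy rules above: 25-character titles, one 50-character
    description and no reused landing pages. Returns human-readable errors."""
    errors, seen_urls = [], set()
    for s in sitelinks:
        if len(s["title"]) > 25:
            errors.append(f"Title too long: {s['title']!r}")
        if len(s["description"]) > 50:
            errors.append(f"Description too long: {s['description']!r}")
        if s["final_url"] in seen_urls:
            errors.append(f"Duplicate landing page: {s['final_url']}")
        seen_urls.add(s["final_url"])
    return errors

links = [
    {"title": "Shop Men's Jackets", "description": "Winter styles up to 40% off",
     "final_url": "https://example.com/jackets"},
    {"title": "Shop Men's Jackets", "description": "Winter styles up to 40% off",
     "final_url": "https://example.com/jackets"},
]
print(validate_visual_sitelinks(links))  # flags the reused landing page
```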

Other than that, I’d say it is a safe bet to use the same best practices you’d use when launching anything: Relevance is the key. Strong associations between the image you use, your sitelink title and description, the search keyword which triggered it and your landing page content will only serve to help your visual sitelinks serve more often.

We completed a seven-month study testing the effectiveness of mobile visual sitelinks for an e-commerce client — a new and innovative tech company that has recently bolstered its branding both offline and online. It should be noted that, for this study, we only looked at orders and not revenue.

Here are a couple of general observations from this study about mobile visual sitelinks and how Google AdWords adjusted the way it serves and reports on them over that time span.

First, it is important to note that if you started using visual sitelinks prior to September, reporting changed significantly towards the end of August in the UI. In the beginning, all visual sitelinks that were triggered were collectively given equal credit for the impression, click, cost and conversion. Now, the individual visual sitelink that was seen or scrolled to receives the impression. Additionally, the visual sitelink that was clicked on now receives the click and the conversion. It is easier now for the user to determine which mobile visual sitelink is generating the best performance.

The second observation revolves around Google optimizing based on performance. We mentioned above that advertisers had little insight into which particular visual sitelink was performing well prior to September. In this case study, the e-commerce client had visual sitelink A (VS-A) receiving most of the impressions, clicks, etc. during September, the first month there was a delta in reporting between the visual sitelinks. Performance was subpar, with a cost per order (CPO) of $188. Visual sitelink B (VS-B) had a CPO of $32 during this time. The breakdown of impressions is as follows for September 2017:

VS-A – 81 percent
VS-B – 16 percent
VS-C – 3 percent

The Google algorithm saw this discrepancy in performance and adjusted the following month (October 2017) to focus more on VS-B. The breakdown of impressions is as follows for October 2017:

VS-A – 8 percent
VS-B – 88 percent
VS-C – 4 percent

The system made the appropriate change by moving VS-B to the first image shown, which had a significant impact on mobile visual sitelink performance. Prior to September, all mobile visual sitelinks averaged a $470 CPO for June – August 2017; October 2017 – January 2018 came in at a $108 CPO — a 77-percent reduction. Over time, the CPO improved further to $80 across December 2017 – January 2018.
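The CPO improvement is easy to verify (the second figure extends the case study’s own math to the later $80 CPO; the article itself only cites the 77-percent number):

```python
def pct_reduction(before, after):
    """Percentage reduction in cost per order (CPO)."""
    return (before - after) / before * 100

print(round(pct_reduction(470, 108)))  # 77, matching the case study
print(round(pct_reduction(470, 80)))   # 83 by December 2017 - January 2018
```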

While the reduction in CPO is great, the reason for taking the time to develop assets and utilize a feature like this is to better optimize the performance of your campaigns. We need to look at mobile performance overall during this time frame versus mobile visual sitelinks to understand if the impact is helping reduce the CPO or increase the engagement (CTR) of the ads.

Here’s a side-by-side comparison by the numbers:

 

 

As an engagement ad, mobile visual sitelinks perform better than the rest of mobile. Note that mobile visual sitelinks only show when your ad is in position one on the SERP, and even then only about 10 percent of the time. A few factors are in play here. The higher your position on the SERP, the higher your CTR, and mobile visual sitelinks are only present in position one, the highest-CTR location. Brand campaigns also have a higher CTR and are more likely than non-brand keywords to show in position one, giving visual sitelinks an even higher probability of outpacing mobile overall. Here are the statistics against mobile branded campaigns in position one:

 

 

While there are still similar overall discrepancies, the outcome remains the same. Mobile visual sitelinks are an excellent way to take up SERP space and draw the user in to engage with your ad. Unfortunately, they aren’t a reliable source of immediate returns. Mobile visual sitelinks are best used as a branding engagement on a lead gen client. If these extensions continue to simply be eye candy and not an effective way of marketing towards a return goal, they’ll probably go the way of the dodo — or Google image ads.

 

 

Co-authored by Sean Cohen.

The post Mobile Visual Sitelinks: A Case Study appeared first on The Search Agency.

Google Search Console Beta: What’s New and What We Hope to See

The Search Agents Feed - Fri, 01/26/2018 - 13:58

In the pool of underappreciated — or, at least, poorly understood — tools, Google Search Console tops the list. It began its current life as Google Webmaster Tools and gave technical webmasters and SEOs a window into their search presence. While Google Analytics offers rich data about users after they’ve decided to visit your site, GWT brought insight to the process that brought them there in the first place.

In the world of (Not Provided), this became an even more powerful tool, offering our only, or at least most accurate, glimpse into search query data direct from the source. Beyond keywords, however, the rebranded Google Search Console gave us valuable information about everything from mobile usability and site speed to robots.txt status and structured data. If you’re not using Search Console, you’re likely flying blind.

Let’s take a look at the new Google Search Console beta release.

 

Historical Data

 

One of the biggest frustrations with GSC for anyone looking for long-term insights and trends is the seemingly arbitrary and otherwise inexplicable 90-day data limit. Many marketers and webmasters, particularly the SMB crowd, lack the time and resources to even warehouse years’ worth of data, much less to compile and analyze it. For those who do, processing it with R or dumping it into a big data visualization tool is certainly effective, but cumbersome, to say the least.

Finally, Google is giving us lengthy data reporting baked right into the Google Search Console interface — the default selections include three, six, 12 months, and… “full duration,” which has been reported by some sources as 16 months. I can anecdotally confirm this based on my own accounts, though the existence of the data makes me wonder if Google could go back further if they wanted. Either way, year-on-year comparisons for impressions are now a thing, and that’s exciting!
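For anyone who still wants the raw numbers outside the interface, the same window can be requested programmatically. A sketch that only builds the request body — the field names follow the Search Console API’s searchanalytics.query method, but actually sending it (via google-api-python-client, with OAuth) is left out here:

```python
from datetime import date, timedelta

def search_analytics_request(today, months_back=16, dimensions=("date",)):
    """Build a Search Analytics query body covering roughly the full
    16-month window the new Search Console exposes. Uses rough 30-day
    month arithmetic; adjust to taste."""
    start = today - timedelta(days=months_back * 30)
    return {
        "startDate": start.isoformat(),
        "endDate": today.isoformat(),
        "dimensions": list(dimensions),
        "rowLimit": 25000,
    }

body = search_analytics_request(date(2018, 2, 1))
print(body["startDate"], body["endDate"])  # 2016-10-09 2018-02-01
```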

 

Duplication Duplication (Pun Intended)

 

When someone in the SEO world talks about “duplication” or (gasp) “duplication penalties,” there is an immediate polarization — either they definitely know what they’re talking about, or they definitely don’t.

Back in reality, while there is no explicit duplication penalty, Google has suggested that if a web page is considered highly duplicative, then it’s inherently less valuable to a user than a unique page. The exact logistics of this are a secret, of course, but it parallels how we treat everyday life, often seeking customized and varied products, from food to cars to clothing, but with full acceptance that there will inevitably be some level of overlap.

The Search Console beta now gives us a plethora of tools to determine exactly what Google considers duplicative, to what extent, and most importantly, whether it is affecting organic traffic. In a section that Google calls “Index Coverage,” you can filter the report to show pages which have been crawled but not indexed, along with broad reasoning for that choice. Some are pretty straightforward like “Excluded by ‘noindex’ tag” and “Blocked by robots.txt.” Others are much more intriguing, like “Duplicate page without a canonical tag,” and “Google chose different canonical than user.”

In both cases, based on the New Search Console support page, Google is outright telling us that these pages are (too) similar to another page, and therefore haven’t been indexed. Some are obvious — think parameters and user sessions — while others are strangely different from what you might expect. In my own digging, I’ve found that while I think page A should be the canonical for B, C and D, Google actually thinks it should be page C. Why does Google think the “Blue Widget” page should be the canonical? That’s a question for another time, but for now I can use this information to better leverage canonicals, internal linking and content armed with the knowledge that page C is quietly winning this arms race.

 

Our Search Console Wishlist

 

The ability to see over a year’s worth of impression and click data from Google SERPs is invaluable, but occasionally, the visuals are less than spectacular. I’d love to see some smoothing features added to long-term graphs, similar to what we have in Google Analytics, with day, week and month groupings.

Within the “Index Coverage” report, you can overlay an “Impressions” line above the “Valid”/”Excluded”/”Error” bars, which is quite helpful. Unfortunately, this functionality goes away when you drill down any deeper, like into a specific exclusion type, and it doesn’t include clicks or position data. This may be an intentional limitation as to not give away the secret sauce, but I’d love to see it, nonetheless.

For any given site, as has always been the case, the secure and non-secure versions, as well as the www and non-www variants, are added as explicitly distinct properties. This makes sense, and is actually better in most cases, but the ability to view an aggregate could go a long way in reducing headaches — for me, at least. Case in point: A client moves to all secure URLs (great!), but now our historical GSC data is effectively segregated, preventing the year-over-year comparisons we raved about earlier. My guess is that this is another limitation based more on how and where the data is stored, rather than Google being mean… but still, it would be nice.

 

Using and Understanding the New Search Console

 

Overall, the latest version of this venerable tool has a lot to offer for anyone willing to dig deep. It will be interesting to see if someone will apply some machine learning to the results and perhaps glean a stronger understanding of how and why Google thinks the way it does. We may not be able to predict what RankBrain will do, but this may help turn the giant black box just slightly more white.

Have you checked out the Google Search Console beta? What insights did you find and what features are you hoping for? Share your thoughts with TSA!

The post Google Search Console Beta: What’s New and What We Hope to See appeared first on The Search Agency.

Common Communication Issues in a Modern Business Environment

The Search Agents Feed - Wed, 01/24/2018 - 08:11

Communication is at the heart of everything we do. It’s how we understand our clients’ needs and goals. It’s how we work together as a team to put those needs and goals into a cohesive and actionable strategy. Finally, it’s how we generate revenue for both our clients and ourselves. Communication is the oil which lubricates the engine of our business as well as every other successful business in the modern marketplace.

At the same time, communication can be a double-edged sword. It’s capable of demotivating and alienating employees, as well as building an atmosphere of distrust and enmity. Here are some simple mistakes we should all avoid.

Lack of awareness/ignorance: Different people have different needs and expectations or may work in an environment that gives a different context to their message. This can be a different country with different laws, currency, customs, holidays, time zone or just a different department in their own office. It pays to be aware of who your audience is and possibly try to understand them before you start.

Tone: Tone is important at any time, but especially in times of challenge or change. No matter what the circumstances, learn to pay attention to the tone of your message.

Email: The internet has transformed the way business is conducted, and email has become the standard communication channel. A 2005 study found that 50 percent of all online communication is misunderstood, even though senders believe their messages are being received clearly. This becomes an even greater challenge when combined with a lack of interpersonal nuances like body language and tone (again) to give context.

Avoiding the difficult conversation: We all face conflict. Unfortunately, avoiding it doesn’t make it go away. We need to learn how to plan for and carry out these situations by providing clear and actionable feedback, even when it’s not the easiest path.

Speaking more and listening less: To stay on top of any situation, occasionally stop speaking and listen. When you listen, you open yourself up to learning more about a given situation and gaining empathy for what is happening. You may even accomplish more.

Reacting instead of responding: When it’s your impulse to react with anger and frustration, just wait. Take a deep breath, engage your brain and consider all the facts (even those you may not know yet). When you pause to reflect, you respond instead of reacting.

Using communication as a weapon: This is especially true in the case of email, where threatening or passive-aggressive messages may go unanswered at the time of receipt but are sure to have a longer-term negative effect.

Underestimating your audience: It can be tempting to gloss over issues because “people won’t understand.” Why explain goals when you can simply say, “This is what we need to do”? Your colleagues may not be masters of strategy and business planning, but they deserve to know the rationale behind changes that affect their lives and their careers. Many managers like to gloss over problems when motivating their teams. If things aren’t going well, those teams are probably aware of the problems. In fact, they probably knew about them first. Rather than avoiding the situation, enlist the skills at hand to help find solutions.

These are a few of the pitfalls surrounding communication in today’s workplace. If each of us were to be a little more mindful and take a bit more care in how we communicate, we could make sure to not only avoid them, but also take advantage of one of the most powerful tools at our disposal: each other.

The post Common Communication Issues in a Modern Business Environment appeared first on The Search Agency.

Keeping Your Tags in Check

The Search Agents Feed - Tue, 01/16/2018 - 11:28

We all use tag tools to help us validate what we are implementing with our online marketing presence. After you’ve spent time figuring out the tags you need, worked hard determining the website actions that you want to track and have implemented tags to help track them, you might wonder: How do you know your tags are firing correctly?

Every day, tags stop working as they should due to updates and changes to your page — don’t lose your tracking capabilities because you can’t quickly check for problems. You’ll want to use tag debugging tools to determine which tags are firing correctly and which ones need some help. Take a look at these top browser add-ons that we at The Search Agency use to validate tags. Note that while these add-ons are effective and easy to use, they are only available for Google Chrome.

 

Google Tag Assistant

 

Tag Assistant is a great tool that is easy to use for validating Google tags like AdWords, Google Analytics, DoubleClick and Google Tag Manager tags. Once you have the add-on installed, click on the “Google Tag Assistant” icon in your Chrome toolbar, then enable the add-on and navigate to the page you wish to check.

 

 

Facebook Pixel Helper

 

As indicated by the add-on name, Facebook Pixel Helper will only show Facebook pixels/tags. This add-on will indicate whether a tag has loaded successfully, as well as all the event pixels/tags that are on a page. Once installed, all you have to do is navigate to the page you want to test and Facebook Pixel Helper will indicate which pixel/tags it detects.

 

 

Bing UET Tag Helper

 

Bing’s UET Tag Helper is similar to Facebook Pixel Helper, but exclusively for checking Bing Ads tags. This add-on not only tells you if a UET tag has loaded successfully — it will also give you suggestions on how to fix any that need fixing. Like other add-ons on this list, Bing UET Tag Helper will need to be turned on before you navigate to the page that you want to test.

 

 

ObservePoint TagDebugger

 

ObservePoint TagDebugger works with most tags, eliminating the need to use multiple add-ons to validate multiple types of tags. Of all the add-ons I’ve mentioned, ObservePoint comes packed with the most features and provides the most comprehensive information. ObservePoint allows you to get additional information about tags that have fired on a page. It also comes with the option to enable recording, which will keep track of all the tags that fire as you navigate from page to page. Additionally, ObservePoint allows you to download a full list of tags it has detected. After installing ObservePoint TagDebugger, you’ll need to open Developer Tools by pressing CTRL + Shift + i and then click on the ObservePoint tab.

 

The post Keeping Your Tags in Check appeared first on The Search Agency.

An Introduction to Cryptocurrency

The Search Agents Feed - Tue, 01/02/2018 - 09:19

About two years ago, one of my friends hailed his belief in Bitcoin and explained how he moved his entire life savings onto a hardware wallet and memorized his private alphanumeric key. He sounded like a character from a Neal Stephenson novel — gritty, daring and irreverent of the world’s established banking systems. While Bitcoin has soared since then and made him a millionaire many times over, the question now is if it will continue to rise in value or if it’s already reached its height.

Cryptocurrency is digital currency that has no tangible paper or physical coin representation. Instead, the currency is generated by open source software using encryption techniques, with transactions secured by mathematical proof and recorded on a blockchain. Cryptocurrencies such as Bitcoin are decentralized: There’s no one place like a bank where the currency is held, and a private security key tied to an open source ledger proves who holds the value. As an electronic payment system, cryptocurrencies are near-instantaneous and have low transaction fees compared to traditional banking systems, which are comparatively slow and carry higher fees.

If you are completely new to Bitcoin, check out this video explanation by one of my favorite publications, The New York Times, to kick-start your understanding.

 

 

According to Bitcoin.org, “Bitcoin is a consensus network that enables a new payment system and a completely digital money… Bitcoin is pretty much like cash for the Internet… the most prominent triple entry bookkeeping system in existence.” It was launched in 2009 by a pseudonymous creator known as Satoshi Nakamoto, who released a bitcoin whitepaper explaining the technology. It was the first decentralized cryptocurrency, has the highest awareness globally and has the highest value. A bitcoin digital token can be broken down to eight decimal places (.00000001), a unit called a satoshi — bitcoin’s version of a penny. The price of a bitcoin fluctuates constantly with market demand, much like a stock.
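The eight-decimal divisibility lends itself to simple arithmetic. As a minimal illustration (mine, not from the article), converting between bitcoin and satoshis looks like this:

```python
# One bitcoin is divisible into 100,000,000 satoshis (eight decimal places).
SATOSHIS_PER_BTC = 100_000_000

def btc_to_satoshis(btc: float) -> int:
    """Convert a bitcoin amount to a whole number of satoshis."""
    return round(btc * SATOSHIS_PER_BTC)

def satoshis_to_btc(sats: int) -> float:
    """Convert a satoshi count back to a bitcoin amount."""
    return sats / SATOSHIS_PER_BTC

print(btc_to_satoshis(0.00000001))   # 1 -- the smallest unit, one satoshi
print(satoshis_to_btc(150_000_000))  # 1.5
```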

Critics, especially established banking systems whose foundations may be threatened by Bitcoin, aim to discredit cryptocurrencies as a volatile fad ready to crash due to overvaluation or a lack of propagation across the public. Also, some speculate that governing bodies will deem it illegal, or attempt regulation for taxation purposes. Most governments have a centralized authority for producing new physical currency and balance this process against inflation. The move from dollars sitting in private FDIC insured bank accounts to cryptocurrencies shifts power away from centralized governing bodies, and they are most certainly not going to allow it to continue rising in value without a fight to discredit or demolish it.

While it’s uncertain if Bitcoin will stand the test of time (hint: it will), we have it to thank for developing the genius infrastructure of the blockchain. In the book Blockchain Revolution, Don and Alex Tapscott state, “The blockchain is an incorruptible digital ledger of economic transactions that can be programmed to record not just financial transactions but virtually everything of value.” Blockchain is a decentralized technology that records permanent transactions across millions of machines, promising extraordinarily strong cybersecurity through its growing list of public records, or blocks. The blockchain ledger is managed via open source technology; your private security key remains hidden, so nobody can dispute the validity of any given transfer and, more importantly, everyone can validate its permanent existence. In summary, blockchain is a colossal ledger of transactions documented in a shared, decentralized database.
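To make the “incorruptible ledger” idea concrete, here is a hedged, minimal sketch of how chaining blocks by hash makes historical records tamper-evident. This illustrates the concept only; it is not any production blockchain implementation:

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    """Hash a block's contents deterministically."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def add_block(chain: list, transactions: list) -> None:
    """Append a block that commits to the previous block's hash."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "transactions": transactions})

def chain_is_valid(chain: list) -> bool:
    """Recompute each link; editing any old block breaks every later link."""
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

chain = []
add_block(chain, [{"from": "alice", "to": "bob", "amount": 5}])
add_block(chain, [{"from": "bob", "to": "carol", "amount": 2}])
print(chain_is_valid(chain))                   # True
chain[0]["transactions"][0]["amount"] = 500    # tamper with history
print(chain_is_valid(chain))                   # False
```

Because each block embeds the hash of the one before it, rewriting history requires rewriting every subsequent block, which the network of validators would reject.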

According to InvestingPR.com, blockchain technology faces a significant hurdle in its ability to scale effectively as user bases increase and ledgers expand. Often described as the fork problem, transactions may have to wait in a queue before they are memorialized into a block. According to TechCrunch, Ethereum is experiencing a scalability issue as seen in CryptoKitties, a game where users buy and sell virtual kittens. Using 15 percent of Ethereum’s network traffic, and being the largest contract on the network, the CryptoKitties application exposes the biggest scalability limitation of the Ethereum blockchain. As the game went increasingly viral, transactions slowed down significantly and users sometimes had to make multiple attempts at an action.

InvestingPR also insightfully points out that ether and blockchain technology can be cloned and a new cryptocurrency spawned, stating: “The only thing conferring value on any currency is the consensus of the community using said currency.”

The same transparency and open source technology that makes cryptocurrencies brilliant may also act as a significant risk in that they can be copied.  With more than 2,000 digital currencies live in the market and seemingly no limitation on the number that can start up, the barrier to entry seems not to be the technological innovation, but instead marketing and user adoption. More problematic for Ethereum and Bitcoin are the chances that someone will not only replicate their technologies, but defeat their scalability issues.

However, there may be uses of the technology outside the commonly associated currency exchange, such as its use in the world of marketing. According to Digiday, one use of blockchain in advertising is via BitTeaser, a Danish ad network collecting ad revenue in Bitcoin. Digiday goes on to cite four areas where blockchain may be useful in the future of media:

  1. Ad-delivery verification: decentralized systems that analyze delivery, prevent fraud and blacklist bad actors in real time.
  2. Corporate social responsibility: authentication of products and contracts that is both public and digitally accountable.
  3. Creative marketing: personalized “one of a kind” products with a backstory and a point of origin verifiable via blockchain.
  4. Consumer data management: heightened privacy through anonymizing and decentralizing large amounts of transaction data.

Considering the hype around cryptocurrencies and the probability that they will continue rising in value, with some predicting a single bitcoin could boom to one million USD, you may be excited to jump into the game. Caution: Keep a balanced and diversified investment portfolio, with a maximum of 5 to 10 percent of your investments in high-risk options such as cryptocurrency. Since there is no central governing body, there’s also no insurance or federally backed guarantee on your investment, and no legal obligation for exchanges to replace stolen funds or to provide support in the event of a hack. Also, you may be tempted to invest in a new ICO, or initial coin offering, similar to an IPO. Again, be cautious, as new coins may have no long-term value. If there’s no wide adoption of the currency, if it has technical flaws or if it is priced solely on short-lived hype, there may be a risk of collapse akin to the hyperinflation Zimbabwe saw in 2008, when one US dollar was equivalent to more than two trillion Zimbabwe dollars.

Many have stated that blockchain is as revolutionary as the internet and establishes a level of trust via authentication and authorization far superior to that of traditional 20th-century banking and credit systems. One of the easiest ways to buy the three most popular cryptocurrencies is through Coinbase, available on a desktop browser or smartphone app (the app I use — and, yes, that is a referral link that will give both me and you $10 USD of in-app value if you sign up and spend $100). Coinbase acts as the buying platform, storage wallet and performance monitoring tool. It’s so easy to use and to transfer USD via bank account or credit card that it has become the most popular iOS app for buying Bitcoin, Litecoin, Bitcoin Cash and Ethereum. Other prominent exchanges include Bittrex, Poloniex and Binance, where you can buy and sell alternative currencies such as Ripple, Neo and Stellar Lumens.

Remember to keep a diversified portfolio and consider allocating only 5 to 10 percent of your holdings to volatile cryptocurrencies. You should only invest an amount of money you are willing to lose, so create a long-term strategy that feels comfortable. For those of you thinking about taking out loans, maxing out credit cards or mortgaging your homes to buy cryptocurrency, I strongly advise you not to proceed. The likelihood that cryptocurrencies will soar is just as real as that of a significant crash in value.

The post An Introduction to Cryptocurrency appeared first on The Search Agency.

Build Better Creative with These 9 Best Practices

The Search Agents Feed - Wed, 12/20/2017 - 08:46

The scope and world of display advertising have evolved extensively, growing from simple static images to include video, rich media ads and much more. When developing creative, there are a few best practices that can be leveraged in order to improve user experience and maximize return. Take a look below for a few high-level recommendations that can help you build better display ads.

 

Size, Standardization and Specification

 

Since most banner ads are produced to be displayed on someone else’s website — specifically, a website your creative team does not have design control over — adhering to Google Display Network or other platform guidelines is required. When running a campaign that extends beyond a single ad type or format, this becomes even more important.

Most professionals in creative and display probably already know this, but the Interactive Advertising Bureau, or IAB, maintains a specification of the standards most banner ads adhere to, and this is a good foundation when planning new ads. The IAB Display Advertising Guidelines detail a mixture of common ad unit sizes, from the small 300 x 250px medium rectangle to the larger 970 x 250px billboard, and also cover older, “delisted” units that are starting to be replaced.

 

Use IAB Guidelines

 

The IAB’s guidelines go beyond just the size of the ad. The Bureau also distributes guidelines on specifications such as the limit on a unit’s z-index, the maximum video frame rate and the file size. These are only guidelines, but adhering to them is a good practice that helps ensure your ad is as technically sound as it is aesthetically pleasing. We’ll look at a few of these components later in the article.

 

Develop Clear, Attention-Grabbing Copy and Images

 

When a customer first sees your display ad, chances are they weren’t coming to the landing page looking specifically for your ad’s product or services. Instead, they arrived on the page to see some other piece of content, and your ad is simply there alongside it. Since users aren’t attentively looking for what’s in your ad, grabbing their attention with a combination of images and clear copy is critical.

Your ads don’t have to be Apple-like and minimalist, with just one image and fewer than five words — there’s no hard science to an ad’s word count. But because you aren’t creating the ad to replace a website or a larger form of media, trim your ad copy to just enough to pique the interest of a user.

 

List Your Product or Service Benefits

 

This can mean detailing savings when they buy a product or announcing a discount for a limited time. This should work alongside visual imagery that makes clear what your ad is about. Offer users a value, a benefit to clicking on your ad — and keep it short and sweet. Make it simple, but make people want to read more.

 

Include a Strong Call-to-Action

 

An essential element in your banner ad is its call-to-action. This is how you persuade users to interact with your ad and connect with your message. The call-to-action is the first step in a conversion funnel — the journey from the first ad impression to registering a sale or otherwise achieving a goal.

Your ad needs to tell potential customers what you want them to do; otherwise, they won’t do anything. Your ad must direct them to “find out more” or “buy now” so that they can easily navigate through the conversion process. Therefore, you need to create a call-to-action that is hierarchically meaningful to the visual design of your ad and very clearly lays out what to do.

 

Design Unity and Congruency

 

Just as a website’s design should have unity and congruency between the individual pages that compose it, an ad should be visually relevant to the site it takes you to. Clicking through to reveal a website with little to no visual relationship with the original ad seriously degrades the user’s experience and will likely throw most users out of the purchase process.

 

File Size

 

Ultimately, you want your users to interact with your ad, so you have to ensure that it appears quickly. No one is going to sit around and wait for a page to load, let alone a slow-loading ad unit. If a user initiates a feature like audio, video or animation in your ad, you’ll want that user to have the best experience possible. Speed is critical for an ad to be effective, and many ad delivery platforms have ad format requirements that must be adhered to. The file size of the initial ad load should be kept to a minimum.

If your file size is too large, you’ll likely lose out on potential impressions (particularly from users with slower internet connections) or create performance issues for the page as a whole. Neither of these outcomes is good for your display, so keep an eye on file size. Ideally, you should restrict your ads to under the maximum load sizes that the IAB sets out for each standard ad unit so they’re always delivered with high performance.
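As a sketch of that practice, a pre-flight check over your creative files might look like the following. The 200 KB cap here is purely illustrative; substitute the actual IAB limit for your specific ad unit:

```python
import os

# Illustrative cap only -- consult the current IAB guidelines
# for the real per-unit initial-load limits.
MAX_INITIAL_LOAD_BYTES = 200 * 1024

def oversized_assets(paths, limit=MAX_INITIAL_LOAD_BYTES):
    """Return (path, size) pairs for creative files exceeding the limit."""
    return [(p, os.path.getsize(p)) for p in paths
            if os.path.getsize(p) > limit]
```

Running a check like this against every exported creative before trafficking catches oversized units before they cost you impressions.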

 

Motion and Animation

 

Utilizing animation and motion within your ad can considerably increase user engagement, but too much can easily end up annoying users instead. Building subtle transitions and motion effects into your ads is a fantastic way of garnering attention and increasing interaction, though you should restrict the length to a non-looped 15-30 seconds (depending on the size of the unit). Although animations that run automatically on page load don’t necessarily diminish user experience, more intense animations that are flashy and obtrusive can prove detrimental.

There are technical concerns with animation, too. Plugin-reliant technologies like Flash, while once a prominent format for animated ads, should be avoided when there are formats that are more universally available such as HTML5. This especially applies if the same ad is being used as part of a mobile campaign. Keep an eye on file size, as it can significantly increase with the addition of animations and multiple slides or frames of content.

 

Responsive Design/Alternate Sizes

 

Responsive design is key to future-proofing your ad designs and maximizing their impact on a mixture of screen sizes and devices. As mobile traffic rises, designing mobile ads that are focused on smaller ad formats becomes increasingly vital to success.

The post Build Better Creative with These 9 Best Practices appeared first on The Search Agency.

Blockchain Tech for SEO: How the Technology Behind Bitcoin Will Change Search

The Search Agents Feed - Wed, 12/06/2017 - 09:29

Unless you’ve been living under a rock for the past few years, you’ve probably at least heard of bitcoin by now. Traditional investors and analysts have been predicting its downfall for years, repeatedly calling it a bubble and comparing it to the tulip mania of the 1600s. In spite of this, for eight years bitcoin has constantly shattered expectations and led a revolution in how many world problems are being solved. The secret isn’t in bitcoin itself, but rather the technology it ushered in: the blockchain. Whether bitcoin itself ultimately succeeds or fails as a currency, blockchain technology is here to stay.

 

What is a blockchain?

 

Put simply, a blockchain is a decentralized digital ledger that keeps track of value exchange. Through the use of peer-to-peer technology, every transaction is verified in a way that prevents manipulation and ensures integrity and availability without requiring trust in a central authority. “Value” can be exchanged in the form of economic transactions (i.e. payments), as with bitcoin, or it can represent something more abstract, such as spare hard drive space or a vote for a political candidate. Value is often passed through the exchange of a cryptocurrency, of which bitcoin is an example.

Just as value can be abstracted to represent something other than money, cryptocurrencies can be used for purposes other than financial transactions. Take Siacoin, a currency that provides access to a blockchain that serves as a decentralized cloud service. Sia claims to provide cloud storage for 10 percent of the price of traditional cloud storage providers.

People who volunteer as hosts can provide hard drive space to the Sia network, which leads to payment in Siacoin. Siacoin has monetary value and can be exchanged for US Dollars on a cryptocurrency exchange, or it can be used to purchase storage on the Sia network. In a similar manner to how US Dollars might be seen as an abstract representation of labor, Siacoin is an abstract representation of storage space.

 

Blockchains and the future of marketing

 

So what does this all have to do with marketing and SEO? Everything. Everything we do centers around the exchange of value. Whether it is the purchase of digital advertising, the flow of link equity, actual conversions that happen on a site or the coordination of a network of machines to process complex crawl data, there are myriad problems that can be solved through the application of blockchain technology.

Take keyword research as an example. Google’s search results pages are now so complex that it is difficult to accurately assess the actual position of any given search term. It will vary by the searcher’s location, device type and all sorts of other factors. Now imagine that someone invented a cryptocurrency called SERPcoin that allows you to track average keyword positions across a wide variety of devices and locations.

Users could add their devices to the network and allow the SERPcoin client software to perform searches in the background, based on requests placed by marketers using the service. Results from tens of thousands of nodes could be returned very quickly, with more participation resulting in better accuracy. Marketers would place requests using units of SERPcoin, which would then be paid out to users, incentivizing participation.

This is a relatively small-scale example. In reality, I expect that we will see blockchains representing full-scale digital marketing analytics suites that can provide better data than any of today’s services provide. We are already seeing the beginning of this with services like Kochava, which has created a blockchain for ad insertion orders, and AirFox, a startup developing a blockchain-based ad network. See also IAB Tech Lab’s Blockchain Working Group, which was created with the mission of “…[investigating] the application of blockchain technology to address challenges in the digital advertising space.”

These examples are just the beginning of a larger revolution within the arena of digital marketing. Whether or not any of the aforementioned services capture the full potential of these new technologies, it’s almost certain that blockchain technology is going to be a game changer for service providers within the digital marketing industry. The companies that learn how to best harness this technology first are likely to become the new giants that pave the way in the years to come.

The post Blockchain Tech for SEO: How the Technology Behind Bitcoin Will Change Search appeared first on The Search Agency.

The ABCs of AdWords’s New UI

The Search Agents Feed - Tue, 11/21/2017 - 09:14

Google announced the AdWords interface facelift last year, which is currently in beta and will roll out to all advertisers by the end of this year. The new AdWords may look and feel different, but campaigns will run the same as today with no upgrades or migrations.

Here’s a rundown on the differences between the new and old interfaces.

 

Visual Aspects

 

a. The new interface has a vertical navigation tab layout compared to the old layout, which is at the top of the page.

b. An Overview tab has replaced the old home screen. The snapshot of sortable tables with customized metrics makes it easier to analyze the data quickly and even gives in-depth details when you hover over them.

 

 

c. All the graphs and charts come with multiple filtering options: data can be sorted and metrics picked, making visualization easy. Some filters are particularly useful. Case in point: the filter for “keywords of good quality but low traffic” in the Keywords tab.
d. Ads and extensions are consolidated together now in the new interface.
e. Change History also comes with data visualization. Charts are expandable and collapsible, too.
f. Locations Targeting UI is useful as it shows targeting and exclusions on a map.

 

 

New Features

 

The new AdWords interface doesn’t quite have feature parity with the legacy AdWords interface as it’s still in beta, but here are a few features available exclusively in AdWords’s new UI.

 

a. Demographic Target now includes Household Income.
b. Calls are now listed as interactions, so advertisers can adjust bids between -90 percent and +900 percent. Metrics provide insights like interaction coverage so you can see how your bid adjustment affects reach.
c. Promotion extensions show up as an additional feature in the new UI.
d. The Landing Page tab is a particularly useful new addition. This tab gives data on landing page performance along with mobile-friendliness of the website. Mobile-friendly click rate and valid AMP click rate show the percentage of clicks received on those types of pages.
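The bid-adjustment range in item (b) translates to effective bids via simple arithmetic. Here is a hedged sketch of that calculation (my arithmetic, not Google’s internals):

```python
def effective_bid(base_bid: float, adjustment_pct: float) -> float:
    """Apply an AdWords-style bid adjustment, allowed from -90% to +900%."""
    if not -90 <= adjustment_pct <= 900:
        raise ValueError("adjustment must be between -90 and +900 percent")
    return round(base_bid * (1 + adjustment_pct / 100), 2)

print(effective_bid(2.00, -90))  # 0.2  -- minimum allowed adjustment
print(effective_bid(2.00, 900))  # 20.0 -- maximum allowed adjustment
```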

 

Reports

 

This is definitely a section worth exploring. It allows you to create reports with tables and bar, line and scatter charts. The emphasis on visualization makes data digestion very intuitive: every metric or level of detail can be added via drag and drop.

Analyzing data using these reports is superior to the old interface. For example, the search term report has a graph with word clouds. Hover over a word and you will see a list of search queries that contain it, along with metrics for each. The darker the border around a word or phrase, the more impressions (or whatever other metric is selected) it received.

 

 

Custom dashboard creation is a great new addition, too. The feature still appears to be a work in progress, but it looks like it will allow you to create custom reports with refreshable graphs and charts that can be downloaded as a PDF.

 

Conclusion

 

The new interface seems to be a work in progress but is moving in the right direction in terms of making data more visual and digestible. It scores higher in terms of ease of use and speed but still has a few compatibility issues (as of mid-November). Some features like Display Planner, AdWords Lab and Google Merchant Center are still unavailable in the new interface.
Like any other major change, the new UI will take some time getting used to. Let’s hope it achieves its objective of making AdWords as relevant for the next 15 years as the first 15.

The post The ABCs of AdWords’s New UI appeared first on The Search Agency.

Treating People Well Matters, Part II

The Search Agents Feed - Mon, 11/20/2017 - 07:31

In this day and age, where tolerance for different opinions seems to be unfortunately waning, it’s important to take a step back and think about what it means to treat people well when you disagree with something they say, do or believe. We all have different opinions and different management styles, but it’s important to welcome these differences. I frequently don’t have the “right” answer and very often I don’t believe there is a “right” answer, so eliciting opinions and feedback from people is an essential part of ongoing improvement. While it’s nice to get opinions from people who might think similarly to me (a little validation for the ego), I find it very valuable to get opinions from people who have a different perspective to open my mind to varied solutions that hopefully drive better results.

Science seems to support this view of the benefits of diversity. In an op-ed in The New York Times, Sheen Levine and David Stark wrote about a study they conducted in which people in diverse groups were 58 percent more accurate on a series of tests than those in less diverse groups. Multiple studies reported in Scientific American and other periodicals likewise support the conclusion that diversity drives better business outcomes.

To create an environment where diversity can flourish, it’s very important to provide people a safe platform to express opinions that might deviate from the norm. So, how can that be done effectively? I have found three key techniques that, when used well, can help create diversity of opinion in an organization.

 

1. Don’t attack

 

Way too often, particularly in the public discourse of the day, we encounter people with differing perspectives attacking each other on a personal level. Rather than debating the merits of different ideas or perspectives, we see an enormous number of personal attacks in social media and elsewhere designed to de-legitimize an opposing view. How often have you read or heard someone say that the other person they disagree with is fat, lying, ugly, etc.? What that has to do with the merit of their ideas is unclear and hopefully will be rejected and not become the American norm.

Inside an organization, attacking people typically results in numerous negative consequences:

 

  • Lowering morale for both those attacked and those who witness personal attacks
  • Developing more silos as people try to avoid/not work with people who attack
  • A “head-down” culture where people are afraid of contributing
  • Worse economic results for the organization over time

 

2. Disagree only with the idea if you have a different opinion

 

Diversity naturally generates different views, and in an organization like The Search Agency, is essential as we constantly seek better ways to deliver results for our partners. One of our core values is that “there is always a better way,” and for us to discover these ways, we must be open to new ideas; in fact, they must be constantly encouraged.

Doing this will surface differing opinions but hopefully this diversity will drive better end results. If you don’t agree with something, I recommend the following:

 

  1. Listen to the idea or opinion and try to make sure you understand what is being said
  2. Ask questions to clarify and probe to classify how it fits with or diverges from your own perspective
  3. Ask for supporting facts or documentation to back up the opinion
  4. If you still don’t agree, solicit feedback from others
  5. Finally, be openly willing to change your mind and go with the “better way”

 

3. Don’t get emotional when dealing with conflicting ideas

 

Finally, I try hard not to let emotions take control when debating with someone on a particular idea. When I do, I regret both how it makes me feel and my personal actions. Below are several techniques I use to help keep emotions in check when encountering an opposing view:

 

  1. Step back and listen more carefully–maybe I’m missing something
  2. Don’t raise my voice–it never helps the situation
  3. Walk away if needed–no disagreement is worth ruining a friendship or damaging the work environment
  4. Change topics–move on to other areas where we have common ground
  5. Agree to disagree–we will never agree on everything all the time, and that’s okay.

 

I’m not always successful in following all these ideas, but I find when I do, I feel better about myself and usually better things happen.
 


Learn the importance of practicing active listening in the first installment of this series, “Treating People Well Matters, Part I.”

The post Treating People Well Matters, Part II appeared first on The Search Agency.

How to Be Ready for Mobile-First Indexation

The Search Agents Feed - Thu, 11/16/2017 - 10:32

For over a year, Google has continued to discuss its imminent transition to mobile-first indexation. In an endeavor to make its results more useful, it hopes to primarily use the mobile version of a website’s content to rank pages. When no mobile version is available, Google will continue to assess the desktop version of a website.

This change will finally reflect an ongoing shift in user behavior: more Google searches now take place on mobile devices than on computers. However, Google’s ranking system still typically looks at the desktop version of a page’s content. By changing to mobile-first indexation, Google aims to ensure that its index better serves this majority of searchers.

 

Mobile-First Indexation Best Practices

 

With its announcement, Google provided a number of recommendations to help webmasters prepare for the transition to a mobile-first index. Below is a summary of those recommendations and how to test them now, prior to the complete rollout of mobile-first indexation expected next year.

 

Content

 

Consistency is key: if you have a responsive site or a dynamic serving site where the primary content and markup are equivalent across mobile and desktop, you shouldn’t have to change anything. In most instances with these types of sites, content and markup are identical for mobile and desktop users.

If a website is configured so that content and markup is different for mobile and desktop users, changes will need to be made. Structured markup needs to be included on both the desktop and mobile version of a website.

Structured Data

 

Google’s structured data testing tool can be used to assess what markup is included on both the desktop and mobile versions of a website. If the mobile version contains less structured data, this will need to be resolved to ensure Google has all the information needed to best populate its results pages. Additionally, webmasters should ensure that only the necessary structured data is added to each page; markup that is not relevant to a specific page should be avoided.
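One lightweight way to surface such a markup gap is to compare the sets of schema.org types found on each version of a page. This sketch uses hypothetical, hand-entered type sets; in practice, the extraction would come from a markup parser or Google’s testing tool:

```python
def markup_gap(desktop_types: set, mobile_types: set) -> set:
    """Return schema.org types present on desktop but missing from mobile."""
    return desktop_types - mobile_types

# Hypothetical audit results for one URL's desktop and mobile versions.
desktop = {"Organization", "BreadcrumbList", "Product", "Review"}
mobile = {"Organization", "Product"}
print(sorted(markup_gap(desktop, mobile)))  # ['BreadcrumbList', 'Review']
```

Anything the function returns is markup that would disappear from Google’s view once the mobile version becomes the primary one indexed.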

 

Accessibility

 

Content also needs to be accessible on the mobile version of the website, otherwise the content will not be assessed for relevancy when indexed. If there is a difference in the quantity of content on the mobile and desktop versions of a website, Google is more likely to only view the mobile content. This is why opting for a responsive or dynamic site is a better option, because the mobile content is identical to that on the desktop version.

Using Google’s fetch and render tool to request a URL with the mobile Google bot will clearly identify what content is displayed on the mobile version of a page. If content is not displayed correctly, or if the page is inaccessible, changes will need to be made. Alternatively, using Screaming Frog’s JavaScript rendering tool is a quicker way to view a mobile version of a website’s pages at a much larger scale.

If separate URLs are used for the mobile version of a website, the way in which the relationship between these pages is indicated does not need to change. Annotations in both the HTML and sitemaps can remain the same.
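Concretely, Google’s long-standing annotations for separate mobile URLs can stay exactly as they are. A sketch, with example.com standing in for a real domain:

```html
<!-- On the desktop page (www.example.com/page): point to the mobile version -->
<link rel="alternate"
      media="only screen and (max-width: 640px)"
      href="https://m.example.com/page">

<!-- On the mobile page (m.example.com/page): point back to the desktop version -->
<link rel="canonical" href="https://www.example.com/page">
```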

If you need support to ensure that your website is ready for mobile-first indexation when it is rolled out entirely, you can view our full list of Search Engine Optimization services and get in touch with us today.

The post How to Be Ready for Mobile-First Indexation appeared first on The Search Agency.

Streamline Your Audience and Device Targeting Strategies with IF Function Ad Customizers

The Search Agents Feed - Thu, 11/09/2017 - 07:06

In the past, advertisers have had several different types of formulas they could use within a single ad to make it dynamically show desired messaging, from simple dynamic keyword insertion to feed-based ad customizers and countdown ads.

Google AdWords’ IF function formulas are relatively new. They’ve been available for almost a year, but advertisers are still testing their full capabilities. IF function formulas allow the creation of a single ad, which will serve differently depending on who is searching or what they’re searching with. IF function formulas allow us to customize ad copy by both audience and device.

IF function ads for audiences specifically allow us to customize copy to any valid user list in our account. This includes things like customer email lists, remarketing lists and the new In-market audience lists. Until recently, creating custom messaging by audience would have probably required that we create duplicate campaigns, using different targeting, and create two separate ads for each of these campaigns. Now all we need is one campaign, one ad group and one ad created using the formula seen below.

Let’s say my pretend client, the Greg Hotel, wants to use one type of messaging for people who are members of their rewards program and different messaging for new visitors to the site, both on the same search queries for their Dallas location. All that’s needed in this scenario is the one “Greg Hotel – Dallas” location campaign and the single ad group, with one ad built around a long, complicated-looking IF formula.
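That formula follows AdWords’ documented audience IF syntax; a sketch, with the ad copy invented and the audience list name taken from this example:

```text
{=IF(audience IN(LoyaltyMember),Members Save 20% When Booking Direct):Book Your Dallas Stay Today}
```

Everything before the colon serves only to users on the LoyaltyMember list; everyone else sees the default text after the colon.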


If a rewards member found in the email list “LoyaltyMember” searched the keyword “dallas hotel,” the formula would serve an ad pushing several member benefits.


If a new visitor to the site, or one not found on that email list, searched the same keyword, the formula would simply serve the more generic ad introducing the property.


Another interesting audience you could send varied messaging to is past purchasers on your site. Take, for example, a second fake client — the Boysan Appliance Dome — which specializes in selling kitchen appliances and typically tries to sell entire high-end kitchen suites. Customers who make a purchase receive a 20-percent-off coupon to use towards a future purchase. So, in your “Dishwasher” campaign/ad group, you could launch a single ad containing an IF formula.
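In sketch form, again using AdWords’ audience IF syntax (ad copy invented, list name from this example):

```text
{=IF(audience IN(FridgePurchaser),Use Your 20% Off Coupon On A New Dishwasher):Shop The Rub-a-Dub Dishwasher}
```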


If a visitor who had never been to the site searched the term “dishwasher,” the formula would trigger the more standard ad for the Rub-a-Dub Dishwasher model.


If the same term is searched by a visitor who falls into the remarketing list of past purchasers of a refrigerator (“FridgePurchaser”) at the Boysan Appliance Dome, they would be served something a little more tailored to their experience.


As I mentioned above, we can also use IF function ads to customize messaging across different devices. Let’s use a third fake client, Carpet Diem, for this example. Carpet Diem makes and sells custom rugs, and they primarily want to push an in-store experience because there are physical samples as well as consultations involved in their business. In an in-store-specific campaign, on the search query “rug stores near me,” we might assume that someone doing a mobile search (within a specific geo radius, and/or during a specific time of day) is close to one of their stores. If we want to make walking into the store the call-to-action, we can do it with a single IF formula ad.
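The device variant of the syntax keys on device=mobile rather than an audience list; a sketch with invented ad copy:

```text
{=IF(device=mobile,Walk In Today And Browse Samples In Store):Book A Consultation For Your Store Visit}
```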


If a user is out near the store and on their phone, they’d see the store-visit version of the ad.


If a user is on their desktop, however, they would see Carpet Diem’s regular local ads, which urge them to make an appointment for their eventual store visit to Seize the Rug.

IF function ads for mobile are also a good way to have more control over how your messaging might be truncated when showing up on Google’s mobile search results page. Ads can be inadvertently shortened and may not make sense, so why not shorten them ourselves to get the exact messaging we want?

The Search Agency has launched IF function device ads in several accounts and it has shown us some substantial wins: Within one month of launching these ads for one (non-pretend) client, mobile CPCs decreased 7.5 percent, conversions improved by 30.8 percent and CPA went down 18.5 percent!

We’re moving closer and closer to customizing user experiences and developing entire campaign strategies around specific audiences. We’re also in an ever-changing industry that requires instant optimizations and streamlined processes. Google’s IF function formula can help us fulfill both of those needs almost perfectly.

The post Streamline Your Audience and Device Targeting Strategies with IF Function Ad Customizers appeared first on The Search Agency.

When Retargeting Gets Unruly: Signs You May Need to Wrangle Your Remarketing Programs

The Search Agents Feed - Thu, 11/02/2017 - 09:54

By this point in the evolution of digital marketing, we’ve all gotten a glimpse of retargeting programs “in the wild.” My first experience was when I had browsed Zappos looking for a gym bag and I stumbled upon a bright pink Nike duffle. Perhaps I spent too much time looking at the bag. Maybe I visited that page more than once, or maybe I just accidentally clicked into the product to bounce right back to the results page. Whatever the case, that pink Nike duffle bag refused to quit, following me around the web for a number of months, and unfortunately for Zappos, my interest in heading to the gym doesn’t seem to last more than a couple of days.

Although multiple years have passed since that pink bag stalking, I still see companies struggle when implementing retargeting campaigns. Frequently, I’m asked about what can be done to improve the end user’s experience or how investment can be optimized to improve return. Here are some of the common problems reported and recommended solutions:

 

“I see my ad everywhere, on practically every page load, and sometimes multiple times! It seems like we are overdoing it.”

 

This is one of the most common concerns about retargeting I’ve heard voiced by marketers and is amplified by the number of auctions that occur on a single page. Unlike paid search, where a single auction occurs, in the world of display, each page load can have several independent auctions happening simultaneously. To minimize or eliminate this problem, several steps are recommended.

 

1. Avoid running retargeting through multiple platforms at the same time.

 

Sure, it seems like a good idea to set up a bake-off between retargeting platforms, but this fundamental configuration sets you up for reduced control (and less-than-optimal efficiency), which causes the above problem.

When two separate retargeting programs are run concurrently, the frequency of how often a user is exposed to your ad is less controllable. One platform may have a frequency cap of three impressions per day and the other may have a cap of three per day, too. Since they aren’t communicating, that user can be exposed to six impressions per day. By running your remarketing through a single platform, you will take one step towards reducing undesirably high frequencies.

 

2. Use a system that allows for multiple levels of frequency caps.

 

As retargeting technology and audience strategy have evolved, campaign structure has become more complex. Oftentimes, this complexity is realized through multiple platform line items set to target sub-segments of your retargeting pool. For example, you may be targeting cart abandoners in one line and then retargeting FAQ page visitors under a separate line. If your system supports line item frequency caps, you again could have a frequency cap of three impressions per user on each line, resulting in six impressions per user – an undesirable result.

However, if you implement a platform that supports frequency caps across the structure hierarchy, you can introduce line item, campaign-level and, in some systems, universal frequency caps, reducing or eliminating this problem completely.
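The arithmetic above can be sketched in a few lines of Python (a toy model for illustration, not any ad platform’s actual API):

```python
# Toy model: line-item frequency caps that don't coordinate simply add up;
# a universal cap at the top of the hierarchy restores control.

def impressions_served(line_item_caps, universal_cap=None):
    """Total daily impressions one user can see across line items,
    each enforcing only its own cap unless a universal cap applies."""
    total = sum(line_item_caps)  # uncoordinated caps stack per line item
    if universal_cap is not None:
        total = min(total, universal_cap)
    return total

# Two line items (or platforms), each capped at 3/day, no coordination:
print(impressions_served([3, 3]))                   # 6 impressions per user
# Same structure under a universal frequency cap of 3:
print(impressions_served([3, 3], universal_cap=3))  # 3 impressions per user
```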

 

3. Determine an audience priority logic and implement audience suppression.

 

As in the above example, a user might be a member of several different retargeting audiences. They could be in the homepage visitor, FAQ page visitor and cart abandoner audiences all at the same time. They could also be part of audiences defined by how recently the action occurred, for example, audiences based on activity in the last one, seven or 30 days.

If you’ve gone to the trouble of segmenting your audiences in this way, you probably already have a campaign structure that treats them differently. Minimize the chance of over-serving an individual by developing a targeting priority, or waterfall, and suppressing audiences accordingly.
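A minimal Python sketch of that waterfall logic, with the audience names taken from the example above and the priority order purely illustrative:

```python
# Hypothetical waterfall: a user may sit in several retargeting audiences
# at once, but is only targeted by the highest-priority one; all
# lower-priority audiences suppress that user.

PRIORITY = ["cart_abandoner", "faq_visitor", "homepage_visitor"]  # highest first

def assign_audience(user_audiences):
    """Return the single audience that should target this user."""
    for audience in PRIORITY:
        if audience in user_audiences:
            return audience
    return None  # not in any retargeting pool

# A user who abandoned a cart, read the FAQ and hit the homepage
# is served only the cart-abandoner campaign:
print(assign_audience({"homepage_visitor", "faq_visitor", "cart_abandoner"}))
# -> cart_abandoner
```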

 

4. Finally, use a system that supports frequency caps over the smallest period of time possible.

 

To minimize seeing multiple impressions on a single page load, make sure you’ve followed the above three tips. Beyond that, run your remarketing through a system that provides frequency cap functionality at the smallest time increments possible. While the ideal would truly be less-than-millisecond frequency caps to account for the load speed and multiple auctions taking place on a single page, look for minutes or seconds, if available.

 

“I already purchased that product; I don’t need that product anymore. Why is it still following me?”

 

This is another classic challenge that advertisers face. At several industry conferences, I’ve heard speakers or panels publicly call out marketing campaigns where this sub-optimal user experience runs rampant. Here are a few ways to prevent and minimize this problem:

 

1. Activate your purchase data and suppress.

 

This is certainly the most obvious action to take. If a user has made a purchase on your site or in your store, develop custom audiences from web activity and onboard offline data sets to suppress buyers from your program. Depending on your business model, however, you may want to alter this straightforward approach.

In the case of retail and e-commerce advertisers with multiple SKUs, you may want to collect SKU-level data of buyers. This would enable you to stop showing the previously-purchased product, but continue to target users with complementary or recommended products.

In the case of lead generation programs, like in enterprise B2B sales, you may want to suppress lead form completions from lead-focused retargeting programs while intentionally setting up separate campaigns to nurture with alternative proof points, case studies and calls-to-action.
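The SKU-level idea can be sketched in toy Python; the catalog, SKUs and complement map below are all invented for illustration:

```python
# Toy SKU-level suppression: stop retargeting products the user already
# bought, while promoting complementary products they don't own yet.

COMPLEMENTS = {  # hypothetical cross-sell map
    "fridge-01": ["dishwasher-01", "oven-02"],
}

def eligible_products(candidate_skus, purchased_skus):
    """Drop purchased SKUs from the retargeting pool, then surface
    complements of what the user already owns ahead of the rest."""
    remaining = [sku for sku in candidate_skus if sku not in purchased_skus]
    recommended = [c for sku in purchased_skus
                   for c in COMPLEMENTS.get(sku, []) if c in remaining]
    return recommended + [sku for sku in remaining if sku not in recommended]

# A fridge buyer stops seeing the fridge and sees complements instead:
print(eligible_products(["fridge-01", "dishwasher-01", "oven-02"], {"fridge-01"}))
# -> ['dishwasher-01', 'oven-02']
```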

 

2. Implement more stringent retargeting durations.

 

If a user viewed a product 30 days ago, think about whether it is appropriate to serve them a message with that specific SKU. If I looked at a gym bag 30 days ago and you’ve already served me 30+ impressions showing that product, isn’t there a point where you might assume I’m no longer interested in buying? Alternatively, I may have purchased elsewhere and no longer have a need for your product or service.

Step back and think about this from the consumer perspective to qualitatively evaluate. Leverage your frequency, time lag and performance data to quantitatively evaluate. Once you’ve assessed both sides, put a plan in place to alter your targeting duration and messaging.

 

“I’ve seen multiple ads with two different offers. It’s confusing and it’s not what I intended.”

 

Regardless of whether you’ve taken the steps above, if you aren’t thinking about your program holistically, you may be leaving yourself vulnerable to this sub-optimal user experience and reduced program efficiency. How does retargeting fit into your larger digital marketing program?

 

1. Consolidate and suppress.

 

Similar to the retargeting platform consolidation discussed above, programs that are managed in isolation from each other have no way of communicating to reduce or eliminate the overlap. A user can fluidly move from being eligible for a prospecting auction to a retargeting auction, but their experience with your brand is not divided by rows on a budget spreadsheet.

By consolidating audience strategy, either by working with a sophisticated audience-focused partner and/or by investing in a data management platform, campaigns can be configured more effectively to reduce overlap for a single user. Suppressing retargeted audiences from prospecting campaigns will allow for greater control over the messages served to your audience and improve how funds are allocated to maximize program return.

Overall, the introduction and evolution of retargeting has equipped marketers with several dynamic and flexible options to reach previously engaged users. It is up to us as marketers to leverage the technology more effectively to maximize program return, and more importantly, eliminate historically unfavorable and intrusive practices to provide prospects with more relevant and meaningful brand experiences.

The post When Retargeting Gets Unruly: Signs You May Need to Wrangle Your Remarketing Programs appeared first on The Search Agency.