Archive for March 2013

As long as there have been search engines, there have been people trying to take advantage of them to get pages to rank higher. It’s not unusual to see within many SEO site audits a section on negative practices that a search engine might frown upon, and Google lists a number of those practices in its Webmaster Guidelines. Linked from the Guidelines is a Google page on Hidden Text and Links, where Google tells us to be wary of doing things such as:

  • Using white text on a white background
  • Locating text behind an image
  • Using CSS to position text off-screen
  • Setting the font size to 0
  • Hiding a link by only linking one small character—for example, a hyphen in the middle of a paragraph

 

Those are some of the same examples described in a patent granted to Google today at the USPTO:

Systems and methods for detecting hidden text and hidden links
Invented by Fritz Schneider and Matt Cutts
Assigned to Google
US Patent 8,392,823
Granted March 5, 2013
Filed: August 25, 2009

Abstract

A system detects hidden elements in a document that includes a group of elements. The system may identify each of the elements in the document and create a structural representation of the document.

The structural representation may provide an interconnection of the group of elements in the document. The system may also determine whether one or more elements of the group of elements are hidden based at least in part on locations or other attributes or properties of the one or more elements in the structural representation.

Unsurprisingly, one of the co-inventors behind the patent is Google distinguished engineer Matt Cutts, who has spent a good part of his long career at Google exploring the many different ways that people might try to spam the search engine, and finding solutions to them.

I really enjoy seeing patents like this one, which may not tell us something new, but provide a reference resource that other people, including clients, can be pointed towards. They sometimes fill in some gaps on how a search engine might do something, and provide some history.

For example, this patent is based upon an earlier one that was first filed in 2003, and it’s not hard to imagine people at the Google of that time trying to figure out how to automate a way to identify text and links that might be hidden by being the same color as the background they appear upon, or being obfuscated by cascading style sheets, or written in lettering so small that it appears to be a line rather than actual text.

The Guidelines above mention a single small character in a paragraph being used as a link, and the patent notes that extremely small (1 pixel × 1 pixel) images have also been used as hidden links on pages.

As the patent also notes, CSS allows webmasters to mark a block of text as hidden, or to position it outside the visible area of a page. JavaScript can also be used to hide text, and to modify documents to replace text.

Part of the process behind identifying hidden text or links on a page may involve analyzing the HTML structure of a page and its elements, such as divisions or sections, headings, paragraphs, images, and lists. It looks at the Document Object Model (DOM) of pages to learn about those different elements: their sizes, positions, layer orders, colors, visibility, and more.
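As a rough illustration (this is my own sketch, not anything from the patent or Google’s actual system), a crawler-side check along these lines could walk a page’s elements and flag inline styles that commonly hide text. The pattern list and class name here are purely hypothetical:

```python
# Hypothetical sketch of a DOM-style scan for hidden-text signals,
# using only Python's standard library. The SUSPICIOUS patterns mirror
# techniques the patent describes (display:none, zero/1px fonts,
# off-screen indents); a real system would also compare computed
# colors, positions, and layer order.
from html.parser import HTMLParser

SUSPICIOUS = ("display:none", "visibility:hidden",
              "font-size:0", "font-size:1px", "text-indent:-")

class HiddenTextFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.flagged = []  # (tag, normalized style) pairs that look hidden

    def handle_starttag(self, tag, attrs):
        # Normalize the inline style so "display: none" matches "display:none"
        style = (dict(attrs).get("style") or "").replace(" ", "").lower()
        if any(pattern in style for pattern in SUSPICIOUS):
            self.flagged.append((tag, style))

finder = HiddenTextFinder()
finder.feed('<div style="display: none">spammy keywords</div>'
            '<p>visible text</p>'
            '<h2 style="font-size: 1px">more keywords</h2>')
print(finder.flagged)  # → [('div', 'display:none'), ('h2', 'font-size:1px')]
```

A real detector would need to resolve stylesheets and computed styles rather than just inline attributes, which is presumably why the patent builds a full structural representation of the document first.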

The patent provides a few different examples of when hidden text might be found on a page, such as in the following:

In this example, server 120 may detect that the webmaster has overridden the value of the <h2> tag. Normally, the “h2” tag is a heading size, in which H1 is very large, H2 is a little smaller, H3 is smaller still, etc. Here, the webmaster has used CSS to override the value of h2 to mean “for all text in the H2 section, make the text color almost completely black, and make the height of the font be about one pixel high.”

A viewer of this document would not see the text because it is so small, but a search engine may determine that the text is relatively important because of the H2 heading label. In this situation, server 120 may determine that the text in the H2 section is very small, which can indicate that the webmaster is attempting to hide the text in this section.

Conclusion

There are times when designers use hidden text because they want to use a font that isn’t one of the standard system fonts that come with Windows, Apple, or Linux computers, and without it the page won’t render the way they want. Google’s John Mueller has noted in the past on Google’s Webmaster Help Forum that this is probably not a problem:

Hi Eric

If you are using image replacement techniques and replacing the text with an image that is equivalent (with the exact same text in approximately the same visibility) then that is generally fine. This provides a nice user experience and still lets those who cannot access the images (eg crawlers or vision-impaired users) use your website normally.

Hope it helps!

John

As I noted above, one of the things that I really appreciate about this patent is that it provides another place to point people to when discussing things like hidden text and links other than just Google’s help pages on the topic. It also puts the problem in the framework of a business that is trying to address a challenge rather than a web institution laying out a guideline that it expects people to follow.

This week, we announced the release of our newest tool, Fresh Web Explorer. We’re so excited to give marketers incredibly recent data in a tool to keep track of their mentions and links in a scalable way.

In today’s Whiteboard Friday, Rand walks us through improving our marketing through fresh links and mentions, and he explains how you can use Fresh Web Explorer to achieve the best results.

Excited about Fresh Web Explorer? Have questions you’d like answered? Leave your thoughts in the comments below!

“Howdy SEOmoz fans, and welcome to another edition of Whiteboard Friday. This week, as you may know, we’ve been very excited to release Fresh Web Explorer. It’s one of our latest tools. We’ve been working on it for a long time. A lot of work and effort goes into that project. Huge congrats and thank you to Dan Lecocq and Tamara Hubble and to the entire team who has been working on that project. Kelsey and Carin and everyone.

So I wanted to take some time and talk through the value that marketers can get from Fresh Web Explorer, and not just from Fresh Web Explorer, because I realize it’s one in a set of tools, but also from things like doing regular Google 24-hour searches to look for brand mentions and links, or using other tools like Radian6, uberVU, or the fresh links and fresh mentions sections in Raven Tools. You can do a lot of these things with any of those tools.

I’m going to focus on Fresh Web Explorer for this part, but you can extrapolate out some ways to use this stuff in other tools too.

So number one, one of the most obvious ones is trying to find opportunities for your brand, for your site to get coverage and press, and that often will lead to links that can help with SEO, lead to co-occurrence citations of your brand name next to industry terms, which can help with SEO, could help with local for those of you who are doing local and have local businesses mentioned. It certainly can help with branding and brand growth, and a lot of times helps with direct traffic too.

So, when I perform a search inside Fresh Web Explorer, I’m getting a list of the URLs and the domains that they’re on, along with a feed authority score, and I can see then that I can get all sorts of information. I can plug in my competitors and see links, who’s pointing to my competitor’s sites. Perhaps those are opportunities for me to get a press mention or a link. I can see links to industry sites. So, for example, it may not be a competitor, but anyone who’s doing coverage in my space is probably interesting for me to potentially reach out to build a relationship with.

Mentions of industry terms. If I find, you know whatever it is, print magazines that are on the web, or blogs, or forums, or news sites, feeds that are coming from places that are indicative of, wow, they’re talking about a lot of things that are relevant to my industry, relevant to my brand and to what our company’s doing, that’s probably an opportunity for a potential press mention.

Mentions of competitors brands. If a press outlet is covering, or a blog or whoever, is covering one of your competitors, chances are good that you have an opportunity to get coverage from that source as well, particularly if they try to be editorially balanced.

Mentions of industry brands. It could be that you’re in an industry that, and you’re not necessarily competitive with someone, but you want to find those people who are relevant to your brand. So for example, for us this could include things like a brand like Gnip or a brand like HubSpot. We’re not competitive with these brands, SEOmoz is not. But they are industry brands and places who cover Gnip and HubSpot may indeed cover Moz as well.

Number two, I can find some content opportunities, opportunities to create content based on what I’m discovering from Fresh Web Explorer. So I plugged in “HTC One,” the new phone from HTC, and I’m looking at maybe I can curate and aggregate some of the best of the content that’s been produced around the HTC One. I can aggregate reviews, get really interesting information about what’s coming out about the phone. I might even be able to discover information to share with my audience.

So, for example, we focus on SEO topics and on local topics. If we expect the HTC One to be big and we want to cover several different phones and how that’s affecting the mobile search space, we can look at their default search providers, what sorts of things they do in terms of voice search versus web search, whether they have special contracts and deals with any providers to be tracking that data and who that might be going to, all those kinds of things, and we can relate it back to what we’re doing in our industry.

You can also use Fresh Web Explorer to find the best time to share this type of information. So, for example, the HTC One comes out and maybe you’re working for a mobile review site and you’re like, “Oh, you know what? This has already been covered to death. Let’s do something else this week, or let’s cover some other stuff. Maybe we’ll hit up the HTC One.” Or, “Boy, you know what? This is just starting to get hot. Now is a great time to share. We can get on Techmeme and get the link from there. We can be mentioned in some of the other press coverage. We still have a chance, a shot to cover this new technology, new trend early on in its life cycle.”

Number three, we can track fresh brand and link growth versus our competitors. So a lot of the time one of the things that marketers are asking themselves, especially in the inbound field is, “How am I doing against my competition?” So I might be Fitbit, which is a Foundry cousin of ours. They’re also funded by Foundry Group. They compete with the Nike FuelBand, and they might be curious about who’s getting more press this week. We released a new version of the Fitbit, or we’re about to, or whatever it is, and let’s see how we’re doing against the Nike FuelBand. Then when we have our press release, our launch, let’s see how that compares to the coverage we’re getting. Where are they getting covered that we are not getting covered? Where are we getting coverage where they are not?

We can then use things like the CSV Export feature, which is in the top right-hand corner of Fresh Web Explorer, and we can look at the CSV export to do things like, “Oh, I want to filter out these types of sites. Or I only want a report on the high feed authority sites versus the low feed authority ones. So I want to see only the places where my coverage is high.”

A note on feed authority though. Be very careful here because remember that a great page on a great site might be discovered through a low quality feed. It could be that a relatively junky feed is linking to some high quality stuff. We’ll discover it and report on the feed authority of the source where we discovered it. So you may want to try using metrics like page authority and domain authority to figure out where are you being mentioned and is that a high quality site, not just feed authority.

All right. Number four. Find fresh sources that link to or mention two or more of your competitors, but don’t mention you. Now, this has been a classic tool. We’ve had a tool in our library at Moz, similar to SEO Book’s HubFinder, called the Link Intersect tool. What you can do here is plug in something like some ice cream brands and see what comes back. So “Full Tilt” and “Molly Moons” ice cream, and I actually want to put quotes around those brand names, because if I got mentions every time someone mentioned the Moon and the name Molly, that would pop in there, and that wouldn’t be ideal. Then minus D’Ambrosio, which is the best Seattle ice cream shop, obviously. It’s a gelateria. It’s fantastic. Side note: it’s possible that it may be owned by my cousin-in-law, but shh, let’s not tell anybody.

Okay, and then if I’m Marco over at D’Ambrosio Gelato, I can see where are Full Tilt and Molly Moons getting mentioned that aren’t mentioning me. If it’s, “Hey, there was an article in The Stranger about ice cream and they didn’t cover us.” And, “Hey the Capitol Hill blog didn’t cover us.” Maybe they don’t know that we also have a Capitol Hill location. We should get in there and talk to those folks. We should mention, maybe leave a comment, maybe just tweet at the author of the post, whatever it is and tell them, “Hey, next time you cover ice cream, you should also write about us.”

Number five. Compare sources’ coverage. So this is actually a bit of a teaser, and I apologize for that. The site colon operator will not be available at launch. So when you’re watching this video, you probably can’t use the site colon operator to see different sources and to run a search like CRO site colon SEOmoz. However, it will be coming soon.

When it is, you’ll be able to compare, hey is SEOmoz or is HubSpot more active in covering the CRO topic? Are there different sources out there that maybe don’t have coverage of a topic and I could go and pitch them for a guest post? I could find those content opportunities. I could know if a topic is saturated or if it hasn’t been covered enough. Maybe I find sites or blogs that might be interested in covering a topic that I would like them to write about. I can see who’s covered and who hasn’t using this site colon operator to figure out the source and the level of coverage that they might have or not.

The last one, number six, is really about reporting. Fresh Web Explorer is going to show you these great sort of trends about how is a particular term or phrase or link doing, links to a site, mentions of a brand name, mentions of a phrase or an industry term, whatever it is. So I can plug in things like my brand, SD, which is our link operator for just seeing links to anything on the sub-domain. I can plug in my sub-domain, and then I can see, here’s how that’s gone over the past 7 days or 30 days. I can screen shot that and put it in a report. I can download using the export functionality. I can download the CSV and then filter or scrub.

A lot of times, for example, PR companies, companies that help you with your press will do this type of work. They’ll assemble this kind of reporting. In fact, at Moz we use a firm called Barokas here in Seattle. Every week they send us a report of here are all the places that you were mentioned, and here are places that mentioned industry terms and that kind of stuff, which is really nice, but you’re oftentimes paying a lot of money to get that reporting. You can actually do that yourself if you don’t have a PR company that you’re already using for this type of work. Of course, if you are a PR company, this might be an option for you to do that type of reporting.

These six are only scratching the surface of what you can do with Fresh Web Explorer, and I have no doubt there are hundreds of uses for the data inside Fresh Web Explorer that I haven’t thought of yet. I really look forward to seeing some cool, creative uses from you guys out there, and I hope that you are enjoying the product. If you would like, please give us feedback. I know the team would love to hear from you on this, and they’re constantly working and iterating and updating and adding in things like the site colon operator. So very cool.

Thank you very much, and we will join you again next week for another edition of Whiteboard Friday. Take care.”

Matt Cutts at SXSW

Had a bad experience purchasing from an online merchant? Google says it wants to protect searchers from that, and it may crack down later this year with changes intended to prevent bad merchants from ranking well.

The news came during the “How to Rank Better in Google & Bing” session that I moderated yesterday at the SXSW conference in Austin. Google’s chief Web spam fighter Matt Cutts responded to concerns one merchant had about bad competitors outranking him.

Cutts said:

“We have a potential launch later this year, maybe a little bit sooner, looking at the quality of merchants and whether we can do a better job on that, because we don’t want low quality experience merchants to be ranking in the search results.”

Google’s Previous Crackdown

This isn’t the first time Google’s done a crackdown. In November 2010, the New York Times ran a big feature about a sunglasses merchant called Decor My Eyes, and how the owner, Vitaly Borker, was convinced that people complaining about him online helped him rank better. The exposure of his bad business practices later led to Borker getting a four-year jail sentence.

Whether those bad reviews really did help Decor My Eyes do better is debatable, but for whatever reason, the site was doing well in Google at the time, despite having such bad reviews.

Google reacted with unprecedented speed, making a change within days that it said would penalize bad merchants. It never explained what factors were used to issue penalties, not even confirming if poor quality reviews had an impact.

Looking At Signals Beyond Bad Reviews?

Of course, if Google already has a system in place to penalize bad merchants, why are they apparently still ranking well, in some cases?

I’ll try to follow up further with Google about this, but one factor might be the continued growth in fake reviews. You can’t rely solely on reviews for assurance a business is really good.

That, of course, would mean that reviews were being used as part of the previous crackdown. Cutts seemed to confirm this when I asked: if Google is already using review data, then what other signals would it turn to as part of a renewed effort?

Cutts replied:

“We are trying to ask ourselves, are there other signals that we can use to spot whether someone is not a great merchant, and if we can find those, and we think that they are not all that spammable, then we’re more than happy to use those.”

Related Articles

Twitter announced search upgrades to the Twitter mobile app. The upgrades include an improved relevancy engine for top tweets, improved auto-completion for search, and an easier web browser for results.

Top Tweets In Search Results

Some searches within the mobile Twitter interface may separate out top Tweets from the rest of the results. To see additional Tweets from this time period, tap on “View more from this time.” Here is an example of how this looks:

mobile-tweets

Search Auto-Complete Update

Now the autocomplete feature for searching or even tweeting is improved. You will now see more hashtag, topic and username suggestions as you type. In addition, the suggestions offered within autocomplete will be updated more frequently and feel real-time. Here is a screen shot:

autocomplete-twitter

Web Browsing Within Twitter

Now when you open a link from a Tweet, that Tweet will be displayed at the bottom of the app’s built-in web browser. So instead of using the default browser on the device, it will use the built-in browser. Twitter says “this provides additional context to the page you’re viewing, and makes it easy for you to retweet, favorite or reply to the Tweet as you’re reading an article or watching a video.”

At SES London 2013, Marcus Tober, the founder and CTO of Searchmetrics, and Will Critchlow, the Founder and Chief Strategist of Distilled, tackled some thorny topics during the session on “Meaningful SEO Metrics.”

Tober kicked off with some new technical data.

He said “SEO visibility” was one of his most important metrics. He acknowledged that rankings on single keywords are worth less because of personalization, localization, and search history.

However, he said that the cumulative number of all relevant keyword rankings for your market or industry will show you important trends – especially after updates, re-launches, and technical re-brushes. These trends are independent of seasonal effects or traffic spikes that are based on temporary events.

homeaway-rebranded-after-penguin

As an example, he showed what happened to the SEO visibility of Holiday-Rentals.co.uk last year after Google’s Penguin update on April 24, 2012. He also showed that rebranding the site as HomeAway.co.uk hadn’t fixed the site’s link profile, so its SEO visibility hadn’t recovered.

Tober also shared the results of several experiments conducted last year in the Searchmetrics Labs on social signals. The results showed:

  • Google Chrome “sees” traffic, but does NOT index postings.
  • However, Google+ triggers instant indexation.
  • Many Facebook shares also trigger instant indexation.
  • Pinterest and Twitter don’t have any impact on indexation.

This led Tober to dispute a statement by Matt Cutts, Google’s Distinguished Engineer. In a Google Hangout, Cutts was asked, “Do Google +1’s affect a website ranking?” He answered, “Not really a direct effect, but … we have an authorship proposal.”

Tober said, “+1 influences search!” Based on an analysis with different unique postings, “URLs with a +1 are being indexed instantly and rank for the title as well as some longtail queries.” They also influence Google News search results if you’re logged in.

Tober then took a look at Author Rank. Do you get higher rankings with an author profile? He quoted Eric Schmidt, Google’s Executive Chairman, who writes in his book that, “Within search results, information tied to verified online profiles will be ranked higher than content without such verification, which will result in most users naturally clicking on the top (verified) results. The true cost of remaining anonymous, then, might be irrelevance.”

Critchlow followed with an analysis of some of the meaningless SEO metrics being demanded by many C-level executives.

As knowledge of SEO practices moves from the cubicles of search engine optimizers to the boardroom, the standard metrics used by a group of very talented techies called webmasters are straining under the weight of the all-powerful bottom line. The days when upper management was impressed by subtle changes in PageRank have been replaced by questions about Lifetime Value (LTV), Return on Investment (ROI), and Cost per Acquisition (CPA).

But Critchlow said, “They are all broken or hard to measure for SEO.”

As for LTV, it’s easy to double-count because “analytics packages don’t track people, they track cookies.”

As for ROI, it assumes costs scale with success. But, he asked, “Would you rather spend $1 and make $1,000 or spend $100 and make $2,000?”
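The arithmetic behind that question makes his point concrete: the $1 campaign has by far the higher ROI, but the $100 campaign contributes far more actual profit. A quick sketch (my own numbers and function, not Critchlow’s):

```python
# ROI rewards the cheap campaign; absolute profit rewards the big one.
def roi(cost, revenue):
    """Return on investment: profit as a multiple of cost."""
    return (revenue - cost) / cost

for name, cost, revenue in [("A", 1, 1_000), ("B", 100, 2_000)]:
    print(f"{name}: ROI = {roi(cost, revenue):.0%}, profit = ${revenue - cost}")
# A: ROI = 99900%, profit = $999
# B: ROI = 1900%, profit = $1900
```

Judged purely on ROI, campaign A wins by a mile; judged on money in the bank, campaign B nearly doubles it, which is why no single ratio tells the whole story.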

As for CPA, it’s hard to get fully-loaded costs. For SEO, should development costs be included? And it also assumes that costs scale with success.

Critchlow concluded, “There is no one metric.”

Instead, he urged SEOs to “understand how the business makes money. Build a simple model. And remember, the best metrics guide behavior.” He added that key performance indicators (KPIs) should also guide executive behavior: “This is doing fine – go think about something else.”

Critchlow recommended that SEOs “measure activity and outcomes.” He added, “Campaigns have two failure modes. We don’t do what we wanted to: blog posts shipped, contacts made, pages updated, development tickets completed. Or what we do doesn’t work well: not enough people read our posts, too many people ignore our emails, and bug fixes don’t move the needle.”

He concluded, “Only worry about costs sometimes. Always consider the margin on an incremental sale.”

focus-on-cohorts

Critchlow then provided an example from the launch of DistilledU, which provides online SEO training. He focused on cohorts, groups of students working together through the same academic curriculum. When Distilled announced the inclusion of all its video content in DistilledU, the percentage of signups who upgraded to paid and engaged users almost doubled.

These are the kind of meaningful SEO metrics that you can take to the bank.


 

As Google’s Enhanced Campaigns announcement has reminded us, it’s critical for SEM practitioners to make sure the tools they use grow with them. When Google makes a change like this, the impact is felt far downstream.

Most major SEM platforms will adapt to these changes. But that doesn’t necessarily take into account how something like this affects other channels or how you need to look at your data.

For marketers looking to bust out of individual channel silos and reap the advantages of integrated marketing, this makes it even more critical to plan ahead for growth and change.

The first step is being able to understand the benefits of simplifying the chaos that is multichannel online marketing.

Single Data Source

This leads to consistency throughout efforts. The ability to see paid search, display and social marketing data, for example, in a single dashboard and with detailed reporting creates huge workflow efficiencies, as marketers only need to log into, and understand in detail how to use, a single platform.

Whether each channel is managed by a different user, a separate agency or by the same team, this continuity in data and workflow ensures smoother and more reliable work. It also means not having to switch between browser tabs or applications to see data for different channels.

Fewer Errors

A single platform is less confusing, more efficient, and also cuts down on the opportunity for errors introduced by forgetting which system someone is working in or how to accomplish the same task between different systems.

Even little things, like whether you have to hit “submit” after making a change, can differ between platforms and can result in changes a marketer thought they had made never going through.

True Attribution

A single platform also means that the data that lives within a given channel is already de-duped and attributed without going through a maze of technology, such as going through an API to a separate system for attribution against another channel’s data, which is coming in from yet a third platform, to be processed and then sent back to the original system.

Having this data together opens the door to innovative ways to understand users, their actions and the effect of media on them – such as optimizing based on engagement levels.

A single system also cuts down on potential points of failure or lag between disparate solutions as well as potential for delays or errors in one system creating problems downstream that then have to be fixed by multiple partners.

Removal of Silos

Properly-aligned technology results in a single – and consistent – source for training, technical support, and best practices. It also cuts down on the wild goose chase that results from different vendors pointing to one another for answers or fixes.

Deeper Insights, Greater Returns

With these insights, budgets can be more intelligently, and even automatically, managed and returns on investment can reach new heights.

For marketers working toward true integrated online marketing, the biggest obstacle to reaching this goal is often their technology and tools. The current arena of marketing management technologies is oversaturated with single-point solutions, with multiple platforms existing for every aspect of digital marketing.

Navigating this complex landscape is a daunting task and presents numerous challenges to managing integrated marketing. What is most frustrating for many marketers is that the technology they sought to make their lives easier is actually the piece of the puzzle that will hold them back from integrating their marketing in an effective way.

Integrating your online marketing tools can be difficult, but the benefits are sizeable. Done correctly, not only does it allow for a more efficient use of time and resources, but also offers the potential for deeper insights and greater returns on marketing investments.


 

I want to tell you a true story about a discount store from the 1970s called D.B. Sales.

Now, before you start yelling…

“Join me in the 21st century, Grandpa! We have the Internet, Snuggie blankets and millions of cat videos to watch.”

…give me a chance to explain. I promise to make it worth your while.

D.B. Sales was run by Morris and Tessie Benatar — friendly, hard-working folks who were trying to help their small business succeed. The problem is, in the mid-70s, their business wasn’t doing too well. Sales were down, money was tight, and tensions between Morris and Tessie were rising.

Morris and Tessie Benatar

Sure, they look nice, but you wouldn’t want to get Tessie angry. She had a mean right hook.​

Like any good businessperson, Morris doggedly tried everything he could think of to increase sales. He changed the window displays, ran promotions, offered free delivery, and placed ads in local newspapers. But, nothing worked.

Then, one day, everything changed.

Morris finally had a promotion that worked. In fact, the promotion worked so well that he ran it year after year for the next 10 years:

Liquidation Sale Sign

You don’t actually have to go out of business to have one of these sales, do you?

Now, why did I tell you this story? Because I think it contains a valuable lesson about how to increase the conversion rate of your website.

Morris spent a lot of his time testing out different ideas until he finally (and luckily) came across something that worked. As online marketers, we do the exact same thing.

We test different button colors, calls to action, headlines, images, and everything else we can think of. Occasionally, on our good days, we come across something that works and we feel good about ourselves.

However, we should learn from Morris. He could’ve saved himself a lot of money, stress, and dirty looks from Tessie if he had talked to his customers. They could’ve helped him answer one of the most important questions:

Why aren’t people buying from me?

This was an easy question for Morris to ask because customers would walk right into his store. But, as people who manage websites, how do we find out why people aren’t buying from us?

Tron Image

In my mind, this is what a website visitor looks like. It makes life more exciting.

That’s why I want to share with you my patent-pending approach* to finding out what your website visitors are thinking.

*Okay, you got me, it’s not patent pending. Does that make it “patent pretending”? <Insert Drumroll>

Five ways to find out why your customers aren’t buying from you

1) Chat transcripts

If you have a chat feature on your website then you can get really helpful feedback RIGHT NOW by simply reading through your chat logs. Whenever we’re going to revise a page at UserTesting.com we always start by searching for all of the chats that happened on that URL.

This is an easy way to learn about your customers’ main questions, concerns and objections.

If you don’t have chat on your site but are considering adding it, check out SnapEngage. That’s who we use, and we’ve been very happy with them.

Chat Window

Chat logs make it easy to find out what questions your visitors ask on specific pages.

2) Surveys

If you have a question for your visitors, or want some feedback, oftentimes the best thing to do is ask. Use tools like Qualaroo, SurveyMonkey or 4QSurvey and ask open-ended survey questions like: “If you didn’t sign up, can you tell us why not?”

Survey Example

Sometimes the easiest thing to do is ask.

3) Talk to your sales and customer support people

Your sales and customer support people spend all day communicating with your site’s visitors. This means that 1) they’re amazing people and 2) they understand the objections of your web visitors better than anyone.

So go talk with your sales and support people and ask them how they overcome the common objections. You can then take what you learn and apply it to your site.

4) Eat your own dog food

Spend time pretending to be your customer and use your website and product. At UserTesting.com we have one of our team members pretend to be a customer each and every month, write up their suggestions for improvement, and then email them directly to our CEO.

This isn’t quite as good as unbiased feedback from someone in your target market, but you’ll be surprised at the number of good ideas your team will come up with.

5) “Think aloud” testing

Look, I’m biased, but this is definitely my favorite way to find out why customers aren’t buying. With “think aloud” testing you can watch people in your target market speak their thoughts out loud as they try to accomplish common tasks on your website or mobile device.

When you run this kind of test you can see with your own eyes where your users get stuck or have problems.

Lady putting whiteout on screen

You just think you know your users.

Remember, the people visiting your website are actual human beings – they’re not “uniques” or “pageviews”. To understand how to make your website better, you need to learn from Morris Benatar: either pretend to always be going out of business, or talk to your customers.

When you perform a search at Google and have a set of search results in front of you, how do you decide what to click upon? How do you judge the page titles, the snippets, and the URLs that you see? How does Google decide what to show you? A little more than a year ago, Google Webmaster Trends Analyst Pierre Far wrote a post on the Google Webmaster Central Blog titled Better page titles in search results. There he told us that Google might sometimes rewrite the titles for web pages when showing them in search results, such as when pages have generic titles like “home”, or no title at all:

We use many signals to decide which title to show to users, primarily the <title> tag if the webmaster specified one. But for some pages, a single title might not be the best one to show for all queries, and so we have algorithms that generate alternative titles to make it easier for our users to recognize relevant pages.

Before we consider how Google might decide when and how to change page titles (in a follow-up post to this one), there’s another question about search results that needs some exploration.

How does Google decide upon snippets for search results when it chooses snippets from the content of pages?

Sometimes Google will use the meta description created for a page as a snippet. Sometimes Google will pull a sentence or some other information from the content of a page instead to display to a searcher. Chances are, if a page has a meta description that is well written, includes the keyword terms or phrases the page is optimized for, and is roughly around 150 characters or so, Google will choose the meta description to display as a snippet. But not always.

Sometimes a page ranks well enough to show in search results for words other than the terms or phrases the page is optimized for, and those words aren’t all contained within the meta description for the page. Sometimes a page’s meta description isn’t well written and doesn’t include the keywords the page is optimized for. A meta description may be extremely short and not very descriptive, which would make it a poor choice as a snippet. Sometimes a meta description is identical to every other meta description on a site. Some pages don’t even have meta descriptions. And Google could choose to use content from a page even when the words from a query do appear in its meta description.
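As a concrete illustration of the kind of meta description Google is likely to keep as a snippet (the page name, brand, and wording below are invented for this example), you might write something like:

```html
<head>
  <title>Handmade Leather Wallets | Example Craft Co.</title>
  <!-- A complete, descriptive sentence of roughly 150 characters
       that includes the phrases this page is optimized for -->
  <meta name="description" content="Shop handmade leather wallets crafted in small batches. Free shipping on orders over $50 and a lifetime repair guarantee on every wallet.">
</head>
```

A description like this is unique to the page, readable on its own, and contains the terms the page targets, which checks all the boxes described above.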

Last March, Google was granted a patent that provides some hints about when Google might choose content to display from a page, and where it might choose that text from.

If the query terms or phrases that someone searches with are words that tend to appear on pages that have abstracts or lengthy introductions, Google might decide to pull content from the start of a page if the query terms are present there.

If query terms or phrases being searched for tend to appear in ranking pages that often have conclusions at the end of a page, Google might choose to pull content to display from near the end of a page. That’s what the patent tells us.

If there is one section of web marketing that seems to be ignored over and over again, it’s analytics. Even people who pay attention to their site’s statistics commonly leave everything on the default settings and just look at what is in front of them, without ever thinking about features that might suit them better, like Custom Reporting.

Five recent changes to Google Analytics can lead to a more effective and accurate use of the toolset and your metrics. They can save you time and headaches and make your data more meaningful.

1. Google Tag Manager

Are you in the position of having to ask your IT department or web developer to make changes to your site? Do you deal with Google Analytics, AdWords conversions, remarketing tags and more? Are you sick of having to wait for weeks for your requested changes to take effect only to find out that your developer didn’t do something correctly?

If your answer to any of these questions is yes, then Google has been working on creating a tool to help you out.

Enter Google Tag Manager! This tool from Google allows you to organize all of your various Google tracking tags into one piece of JavaScript code.

This means you only have to have your developer (or IT department) put one piece of code onto your site, and then you are free to create new tags and manage existing tags in one easy-to-use interface.

Only have Google Analytics now but thinking about doing AdWords or retargeting in the future? This is for you.
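For reference, the container snippet Google generates for you looks roughly like the following. Note that `GTM-XXXX` is a placeholder here; always copy the exact snippet from your own Tag Manager account, and paste it immediately after the opening `<body>` tag:

```html
<!-- Google Tag Manager -->
<noscript><iframe src="//www.googletagmanager.com/ns.html?id=GTM-XXXX"
height="0" width="0" style="display:none;visibility:hidden"></iframe></noscript>
<script>(function(w,d,s,l,i){w[l]=w[l]||[];w[l].push({'gtm.start':
new Date().getTime(),event:'gtm.js'});var f=d.getElementsByTagName(s)[0],
j=d.createElement(s),dl=l!='dataLayer'?'&l='+l:'';j.async=true;j.src=
'//www.googletagmanager.com/gtm.js?id='+i+dl;f.parentNode.insertBefore(j,f);
})(window,document,'script','dataLayer','GTM-XXXX');</script>
<!-- End Google Tag Manager -->
```

Once this single snippet is in place, every tag you add or edit inside the Tag Manager interface goes live without another code change on the site.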

Resources:

2. Google Analytics Custom Dashboards

Google Analytics can be difficult. With so many reports and options, many people freeze up, unsure of what to do with all the data and what to look at.

With a few simple changes in the interface you can make even your executive dashboard glances much more meaningful. Although the Dashboards have long been customizable, there have been several changes recently that make that process even easier and the layouts more robust.

I would almost guarantee that if you spent 15-30 minutes with these new functions, you would be able to come up with reports that mean more to the people in your organization, even those with only minimal Analytics knowledge.

You can now add a Custom Dashboard Layout to configure how reports will display on the screen:

google-analytics-dashboard-layout

And customize this further with easy to use analytics widgets:

google-analytics-dashboard-widgets

If you’re feeling more daring and want to dig deeper, Google has tapped its top Analytics users and now provides downloadable Dashboards, Custom Reports, and Advanced Segments that you can plug and play according to your needs.

Resources:

3. In-Page Analytics – Enhanced Link Attribution

If you’ve spent any time in Google Analytics at all, there’s one complaint that comes up time and time again while trying to understand traffic patterns: you might have two links on your page that lead to the same place, but you’re not sure which gets the most clicks.

Before now, we’ve had to rely on In-Page Analytics to show us a fancy overlay that only reports percentages to guide us. It looks something like this:

analytics-overlay

Enter Enhanced Link Attribution! By enabling Enhanced Link Attribution in your Analytics account and slightly altering the Google Analytics code that resides on your site, you are now privy to a wealth of more detailed statistics that can give you precise numbers for individual links on your pages and how they performed.

Was the link more valuable in your navigation or in your content? This can give you the information you need to make your pages better.

To set this up, just go to Admin –> Property Settings. Then check the box next to Enhanced Link Attribution.

After you make a slight alteration to your Google Analytics Code, you’re good to go!

enhanced-link-attribution
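The code alteration, based on Google’s documentation for the classic ga.js snippet at the time of writing (your tracking code may differ, and `UA-XXXXXXXX-1` is a placeholder for your own property ID), is to load the `inpage_linkid` plugin before the pageview is tracked:

```html
<script type="text/javascript">
  var _gaq = _gaq || [];

  // Load the Enhanced Link Attribution plugin before _trackPageview
  var pluginUrl = '//www.google-analytics.com/plugins/ga/inpage_linkid.js';
  _gaq.push(['_require', 'inpage_linkid', pluginUrl]);

  _gaq.push(['_setAccount', 'UA-XXXXXXXX-1']);
  _gaq.push(['_trackPageview']);
</script>
```

With the plugin loading, In-Page Analytics can distinguish between multiple links on the same page that point to the same URL.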

Resources:

4. Google Analytics Change History

For years the Change History menu in Google AdWords has been the first place that people should go to understand what has been done in their account. Match up actual changes with dates and those responsible for making those changes and you can really start to get a grasp on how your PPC dollars were spent and why.

The same can’t be said for Google Analytics. Changes are made:

  • Based off of data and conversations.
  • To get more accurate results.
  • As new options are rolled out.
  • As goals and websites change.
  • By various people.

Six months later, you look back and your data appears different than it did last year, but you can’t remember who did what or why. Go even further back: two years from now, will you accurately remember why certain changes were made to your Analytics account?

For most people, I doubt it.

So Google is in the process of rolling out Change History for Analytics accounts. Fewer than 30 percent of sites I’ve encountered have this currently, but it is coming to all accounts very soon! When it hits your account, you should be able to view your complete account history by visiting your Settings and looking at the following tab:

google-analytics-change-history-settings

Resources:

5. Tie Google Webmaster Tools Data to Your Google Analytics

While this feature has been around for quite a while, it’s somewhat shocking to log into account after account and not see these tied together. This happens all the time.

Tying these two accounts together gives you detailed statistical information that Google only gives to webmasters, and makes it actionable by combining it with your Analytics reporting. If your Google Webmaster Tools account is already set up and verified, tying it to your Analytics account should take less than 30 seconds.

This gets set up in Traffic Sources –> Search Engine Optimization.

You click the button and Google walks you through the process. Easy as can be.

set-up-webmaster-tools-data-sharing