The Web Design Usability Series is supported by, an easy way to instantly share your screen with anyone. lets you collaborate on-the-fly, put your heads together super-fast and even just show off.

A site’s ease of use, or its usability, is an integral part of its success, especially as websites become more interactive, complex and packed with features. User-centered design is all about building websites that fulfill the goals and desires of their users, and at the heart of this concept is that a user must be able to interact with your website effectively.

Testing usability is an art and a science. There are many times when usability testers rely on qualitative measurements, intuition, opinions and feedback from users and experience. However, there are also factors you can test quantitatively to ensure that a site is usable.

In this post, we’ll discuss six crucial factors that affect usability. For each, you’ll be provided with some tips, tools and ideas on how you can measure these usability factors.

We’ll focus on practical usability testing, so the emphasis is on pragmatic and inexpensive strategies that most site owners can adopt. These apply regardless of what type of website (blog, e-store, corporate site, web app, mobile site, etc.) you’re evaluating.

What other tools have you used to test website usability? Let us know in the comments below.

1. User Task Analysis

The most important and obvious thing to test for is whether users are able to accomplish their tasks and goals when they come to your site. Not only that, you have to ensure they’re able to do so in the best and most efficient way possible.

The first step is to determine what the core user tasks are. For example, on a blog, some critical user tasks are reading blog posts, finding older posts and leaving comments.

Perform a task analysis for each task. Evaluate task performance under these considerations:

  • Learnability: How easy is it for new users to learn to perform the task? For more complicated tasks, are there sufficient help features such as tutorials, in-line tips and hints, tool tips, etc.?
  • Intuitiveness: How obvious and easy is the task to accomplish?
  • Efficiency: Are users performing tasks optimally? Are there ways to streamline and reduce the time it takes to complete the task?
  • Preciseness: How prone to errors is the task? What are the reasons for any errors? How can we improve the interface to lower errors and unneeded repetition?
  • Fault Tolerance: If a user makes a mistake while performing the task, how fast can he recover?
  • Memorability: How easy is the task to repeat?
  • Affordance: Are interactive elements (such as buttons, links and input text boxes) related to the accomplishment of a task obviously interactive and within convenient reach? Is it evident what the results of a user action will be when the user decides to interact with it by clicking, mouse hovering, etc.?

Evaluating user tasks is a little tricky because many things associated with this are subjective, can vary greatly between different users and require you to create your own criteria for what can be considered a success.

That said, one of the best and easiest ways to perform task analysis is remote user testing. You can test participants regardless of their location, and you save the money related to the logistics of conducting your own user testing studies (booking a location, equipment, searching for participants, etc.).

Check out these remote user testing web apps:

IntuitionHQ

IntuitionHQ allows you to see how users interact with your website and records how long users take to complete a task.


This remote usability testing tool gives you loads of features without the requirement of knowing how to code. It has a nice interface for setting up your tests and offers you excellent reporting features for task analysis.


2. Readability and Legibility

This free tool allows you to input your website’s URL and then provides readability scores based on several popular readability evaluation algorithms.
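Readability scores of this kind are simple functions of sentence and word statistics. As an illustration (not necessarily the algorithm any particular tool uses), here is the classic Flesch Reading Ease formula in Python, taking pre-counted totals as input; higher scores mean easier text:

```python
def flesch_reading_ease(total_words, total_sentences, total_syllables):
    """Flesch Reading Ease: roughly 90+ is very easy, below 30 is very difficult."""
    words_per_sentence = total_words / total_sentences
    syllables_per_word = total_syllables / total_words
    return 206.835 - 1.015 * words_per_sentence - 84.6 * syllables_per_word

# Hypothetical counts: 100 words, 8 sentences, 140 syllables
score = flesch_reading_ease(100, 8, 140)  # roughly 75: "fairly easy" prose
```

The hard part in practice is counting syllables reliably, which is why dedicated tools are worth using instead of rolling your own heuristic.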


To evaluate legibility, you can test the contrast of your foreground (text) against its background. With Check My Colours, you simply plug in the URL of the webpage you want to check, and it will test page elements against optimal W3C color contrast algorithms. The higher the contrast, the more likely your text is legible and pleasant to read.
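The W3C contrast check boils down to a published formula: convert each color to a relative luminance, then take the ratio of the lighter to the darker. A minimal Python sketch of the WCAG 2.0 math (for reference, WCAG AA asks for at least 4.5:1 for normal body text):

```python
def _channel(c):
    # sRGB channel (0-255) to its linearized value, per the WCAG 2.0 definition
    c = c / 255.0
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb):
    r, g, b = (_channel(v) for v in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    # Ratio of the lighter luminance to the darker, offset by 0.05
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Black text on a white background: the maximum possible contrast, 21:1
ratio = contrast_ratio((0, 0, 0), (255, 255, 255))
```

Tools like Check My Colours apply this same calculation to every element on the page so you don’t have to.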

3. Site Navigability

For most sites, it’s imperative that the user be able to move through multiple webpages as easily as possible. Navigability consists of numerous user interface components, such as navigation menus, search boxes, links within the copy of a webpage, sidebar widgets that display recent or top content and so on.

Here are the major considerations for when you’re testing your site’s navigability:

  • Information Architecture (IA): How well are webpages categorized and organized? How well are navigational features constructed?
  • Findability: Are there sufficient site features such as search boxes, archive pages, links and navigation features that aid in finding relevant webpages?
  • Efficiency of Navigation: How quickly, and in how many actions (number of clicks, amount of typing, etc.), can a user get to a page of interest?
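The “number of clicks” measure in the last point can be computed directly if you model your site as a graph of links: a breadth-first search from the home page gives the minimum click depth of every reachable page. A minimal sketch (the page names are invented):

```python
from collections import deque

def click_depths(links, start="home"):
    """Minimum number of clicks from the start page to every reachable page.
    `links` maps each page to the list of pages it links to."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:          # first visit = shortest path in an unweighted graph
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

site = {
    "home": ["about", "blog"],
    "blog": ["post-1", "post-2"],
    "post-1": ["post-2"],
}
depths = click_depths(site)
```

Pages that turn out to be many clicks deep (or unreachable from the home page entirely) are good candidates for better navigation links.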

There are numerous tools available to help you evaluate the usability of your site’s navigation and information architecture. Most evaluations of this nature should be undertaken before the site launches. For example, testing the intuitiveness and accuracy of content categories is a good idea before the website grows bigger because it may be more difficult to change when the site generates more content.

There are numerous methods for testing navigability. Card sorting is an activity in which you place content categories on cards and ask participants to arrange them into groups. This gives you insight into how to develop your content hierarchies and content relationships, and it lets you test any existing organizational systems. Tree testing involves generating a tree of topics and subcategories and then testing how easily participants can find a given category within that tree.
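Tree-test results are usually summarized as a success rate per task: the fraction of participants who ended up at the correct category. A minimal sketch (the category names are invented):

```python
def tree_test_success(trials):
    """Fraction of participants who found the correct category.
    `trials` is a list of (chosen_category, correct_category) pairs."""
    hits = sum(1 for chosen, correct in trials if chosen == correct)
    return hits / len(trials)

# Four participants asked to locate older posts; one went astray
trials = [
    ("Blog > Archives", "Blog > Archives"),
    ("Blog > Archives", "Blog > Archives"),
    ("About",           "Blog > Archives"),
    ("Blog > Archives", "Blog > Archives"),
]
rate = tree_test_success(trials)  # 0.75
```

A low success rate for a task suggests the category labels or hierarchy need rethinking before the site grows.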

OptimalSort is a robust web app for conducting card sorting activities.


Creating a site map — a list of webpages that a website has or will have — can greatly aid navigability analysis. WriteMaps is a tool that you can use to generate, manage and share your site maps.


Navflow can give you information on how your users move through webpages. It gives you plenty of information, such as path analysis, which allows you to follow how a user gets to certain webpages on the site.

4. Accessibility

A website should be accessible to everyone, including those of us with disabilities that affect how we experience the web.

When evaluating a website’s accessibility, it’s important to look at it from a universal design point of view. People often mistake web accessibility as being only for those with barriers like blindness or mobility issues. However, we should broaden our view to include anything that might hinder a user from accessing your site in any of a number of browsing situations. This is especially critical with the rapid adoption of mobile devices, tablets, netbooks and web-enabled TVs and gaming consoles. Internet users also have a much wider array of web browsers to choose from than ever before: IE, Chrome, Firefox, Safari, Opera and so forth.

All of these options render our work in different ways and present interaction challenges. For example, selecting a link on a touchscreen tablet is completely different from clicking it on a desktop computer.

The general goal of evaluating a site’s web accessibility is to determine how well it deals with these varying circumstances.

Here are considerations to take into account when performing web accessibility analysis:

  • Cross-Browser/Cross-Platform Compatibility: Does the site work in as many browsing situations as possible? Is the site responsive, flexibly changing the layout depending on how the user views it?
  • Semantic HTML Markup: Especially for those who use assistive technologies like a screen reader, the quality and accuracy of the webpage’s structure is important. Are HTML tags being used correctly?
  • Color Choice: Are the colors used high contrast? Do the colors create a hindrance for people with colorblindness or poor vision?
  • Use of HTML Accessibility Features: There are HTML features and techniques that aid users with visual impairments. Are these features and techniques being used?
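As a small example of the last point, missing `alt` text on images is one of the most common accessibility problems, and it is easy to screen for with a few lines of Python’s standard-library HTML parser (a rough sketch, not a substitute for a full accessibility audit):

```python
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    """Collects <img> tags that lack an alt attribute, a basic accessibility check."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and "alt" not in attrs:
            # Record the src so the offending image can be found and fixed
            self.missing.append(attrs.get("src", "(no src)"))

checker = MissingAltChecker()
checker.feed('<img src="logo.png" alt="Company logo"><img src="chart.png">')
# checker.missing now lists the images with no alt text
```

Screen readers fall back on `alt` text to describe images, so each entry in `missing` is content that some users simply cannot perceive.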

Here are a few tools you can use to quickly identify and resolve web accessibility issues.

Juicy Studio: Local Tools

This is a suite of tools for evaluating website accessibility. Aside from the Readability Test tool mentioned earlier, there’s a CSS checker for identifying accessibility issues related to the visual layer of a website, an image analyzer for checking your image elements and more.


This tool allows you to provide a URL to check for HTML markup issues. It can give you some assurance that your site is coded properly for web accessibility.


Browsershots shows you how your website looks in different browsers. This is helpful in seeing whether your site renders correctly in each of them.

5. Website Speed

One factor of usability that’s not completely evident is the need for a website to be speedy and responsive. In fact, web users deeply care about how fast they’re able to get the information they need. The better performing a website is, the more efficient a user will be when completing his desired tasks.

Here are considerations for evaluating the speed of a website:

  • Webpage Response Time: How long (in units of time, such as milliseconds) does it take to load an entire webpage?
  • Webpage Size: How big is the webpage, in terms of file size?
  • Code Quality: Does the website use web development best practices for website performance?
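Page size and response time are related by simple arithmetic, which makes a useful back-of-envelope check before reaching for a tool. A sketch with made-up resource sizes; it ignores latency, parallel downloads and caching, so treat the result as a lower bound, not a prediction:

```python
def page_weight_kb(resource_sizes_bytes):
    """Total page weight in kilobytes, given each resource's size in bytes."""
    return sum(resource_sizes_bytes) / 1024

def estimated_load_seconds(total_kb, bandwidth_kbps=1024):
    """Naive transfer-time estimate on a link of `bandwidth_kbps` kilobits/second."""
    return total_kb * 8 / bandwidth_kbps  # 8 bits per byte

resources = [45_000, 120_000, 310_000]  # e.g. HTML, CSS and one image, in bytes
total = page_weight_kb(resources)       # about 464 KB
seconds = estimated_load_seconds(total) # about 3.6 s on a 1 Mbps connection
```

Even this crude estimate shows why trimming a few hundred kilobytes of images matters to users on slower connections.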

Here are some free tools you can use to quickly learn about your website’s performance.

Pingdom Tools

This free, web-based tool is dead-simple to use. All you need to do is plug in your website’s URL, and it will give you a report that displays your website’s response time and webpage size.

Page Speed Online

Aimed at website owners and web developers, this tool from Google allows you to type in your website’s URL. It will evaluate your site based on its best practices for web performance.

6. User Experience

User experience (UX), at its core, tries to study and evaluate how pleasant a website is to use. This factor is largely subjective because it deals with user perception, which can be vastly different from one user to the next.

The way UX can be evaluated is through user feedback. By asking questions of users, you can gain a better understanding of how they feel about the site.

Some considerations when evaluating UX:

  • Fulfillment: Do users feel satisfied after interacting with the website?
  • Usefulness: Does the user feel like he’s obtained value from using the website?
  • Enjoyment: Is the experience of being on the website fun and not burdensome?
  • Positive Emotions: Do users feel happy, excited, pleased, etc. when they interact with the site?

When evaluating user experience, a qualitative approach is often the only option. We can’t accurately quantify such subjective things as feelings and emotions.

Through the use of web design feedback tools and surveying tools, we can gain some insights into how users feel.

Feedback Army

Feedback Army gives you the ability to pose open-ended questions to website reviewers. You get 10 responses for $15.




Google Chrome Statcounter

According to a recent report by the StatCounter web analytics firm, Google Chrome is on track to pass Mozilla Firefox in number of users by December of this year. Unless Firefox, the world’s second-most-used browser, can put an end to Chrome’s rapid growth, the Firefox browser era may be coming to a close.

The report indicates that since January of 2011, Chrome has captured an additional 8% of the global internet browser market share. This increase, which represents an astounding 50% growth rate, has increased Chrome’s total market share from 15.6% to 23.6%. The simultaneous growth in Chrome’s market share and decrease of both Internet Explorer’s and Mozilla Firefox’s market share indicates that Chrome’s growth is at the expense of its competition.

During the same time period that Chrome was rapidly growing, both Mozilla Firefox and Internet Explorer were shrinking. Since the beginning of this year, Mozilla Firefox has fallen to 26.8% of the global market share, a drop of four percentage points. Although Internet Explorer fell four percentage points during the same time period, it remains the most-used browser in the world with a market share of 41.7%.

In an effort to compete with the rapidly growing Chrome browser, Firefox has accelerated its development process and now releases a new version every six weeks.  However, it does not appear that the “rapid development” strategy is preventing Firefox users from defecting to Chrome.

If Firefox wants to prevent fulfillment of StatCounter’s prediction that Chrome will become the world’s second most used internet browser by the end of the year, Firefox must take measures to prevent additional users from making the switch.

[Sources Include: Computer World & StatCounter]

Follow SEJ on Twitter @sejournal

Google Analytics Premium

Yesterday afternoon, Enrique Munoz Torres, the Product Manager of the Google Analytics team, announced a new Premium version of Google Analytics. Initially, the product will only be available in the US, Canada, and the UK where Google indicated that it has already signed on Papa John’s, Travelocity, Gucci, and Transunion for the premium service.

The price tag, which is a hefty $150,000 per year, will discourage the majority of Google Analytics users from changing over to the Google Analytics Premium service. Google Analytics Premium, which is best-suited for enterprise level clients with sophisticated analytics needs, offers the following advantages over the free service:

  • 4-hour data freshness
  • Dedicated processing power
  • Service Level Agreement (SLA)
  • Up to 50 custom variables
  • Attribution modeling
  • Higher level of product support that includes training

The new premium version addresses the major concerns of large corporations: data ownership, data retention, and a formal SLA. As a result of addressing these concerns and developing a premium product, Google has been able to acquire some of the world’s top brands as pilot customers.

Even though Google Analytics Premium offers a robust feature set and many new options, the interface is almost identical to the free Google Analytics and implementation is still as easy as placing a code snippet on the site.

Although rumors began circulating that Google Analytics would discontinue its free service, Torres issued the following statement to reassure free users that these rumors were false:

“We’re more committed than ever to providing our customers, large and small, with options to measure and improve their marketing efforts. Google Analytics will continue to offer a powerful, free product as it always has and you’ll see plenty of new features and enhancements in the future.”

For businesses that have an extra $150,000 in their analytics budget, Google Analytics Premium is available through authorized resellers or directly from Google.

[Sources Include: Google Analytics Blog & Google Analytics Premium]


When I started my company, I had a few solid years of SEO experience to draw on, with good references at each place of employment. I entered the industry in an entry-level SEO position with a large e-commerce company that specialized in home furnishings. Soon thereafter, the VP of technology at the company was hired away to work as a senior product manager, and he brought me on to Ask’s Search Quality team as a search relevancy evaluator.

That position was a year-long contract that consisted of receiving large data sets of queries and websites and having to manually rate each search result based on an extremely detailed set of evaluation criteria. A notable highlight of my employment at Ask was when I received an “adult” query and proceeded to go right ahead and evaluate all of the search results … while in class. After my time at Ask, I spent a year working in-house at a large skin care company who would later go on to become a client of my company. All in all, I had plenty of work experience to draw upon, especially for a recent grad.

This was fortunate, because if you are to have a reasonable chance at gaining clients in today’s SEO market, you’re going to need more than a fancy website and some business cards.  Potential clients want to know that they’re hiring a qualified consultant, and it’s going to really help your chances of success if you have experience and a reputation to draw on.  Do you have a proven track record of building organic search traffic and creating sales on your own websites?  Have you worked in-house at a company doing SEO or at an SEO agency?  You’re going to need to polish up that resume and bring your A game to the sales pitch, because it is tough to get work when you don’t have an extensive portfolio of satisfied clients and referrals coming in left and right.

The more experience you have, the easier it is to acquire clients.  But that doesn’t mean that you need 10 years of C-level experience.  Many people with as little as 2 to 3 years of solid SEO experience have seen success at starting their own SEO companies.  The sweet spot is probably something like 5+ years, but if you’re a good salesman and you’re a legitimately skilled SEO, nothing is impossible.  The key is being able to market yourself as an expert by drawing on experience from your resume.  Showing analytics and numbers is great.  Getting a previous employer to vouch for you on the phone is even better.  LinkedIn recommendations are an added bonus.

This post is part of a series that analyzes those barriers to entry for starting an SEO consulting firm.  Check out the other posts in this series:

  • Opportunity
  • Strategy
  • Finances
  • Personality


I have an associate that runs a very large and heretofore successful content site. He’s an experienced and competent entrepreneur and CEO. For several years, everything came together, according to plan, as his website and traffic grew exponentially. He was adding staff and negotiating a move to new offices. The future seemed bright, sunglasses and all, until an unexpected and unwelcome visitor stopped by to visit last April: Panda.

Like many content based websites, his was hit HARD. At the time, he told me that he wasn’t sure if Panda had dealt his company a body blow or a career ending knockout. It was time to put away the shades and get to work. It was time to rescind those job offers. It was time to contact the new landlord and tell him that he wasn’t coming.

Of course, his story isn’t unique. Thousands of companies, both large and small found themselves in the same position. So… like the rest of us, he read up on all things Panda. He ran every piece of content through grammar tools, re-wrote the work that didn’t make the grade and fired writers that didn’t make the cut. He took into consideration all of the quality signals cited right here in SEJ: “Why the Pandobsession Has to Stop” among countless other articles written about Panda. He even followed the sub-domain advice given by Matt Cutts to HubPages.

One would think that after all of that work, and after presumably doing everything in accordance with Google’s guidelines, he would be rewarded for his efforts – right? Well of course, you know better. Instead he sits in the modern-day version of the sandbox – only now it’s the Panda Box, a.k.a. the penalty box.

This hasn’t prevented him from putting out new content every month – around 2,000 pages. The “funny” thing is that scrapers find this content within minutes and consistently outrank my colleague’s website with materials that they STOLE from him – OUCH. If there is a silver lining to this, it’s that he did add a new employee. The new hire spends his entire day filing DMCA (Digital Millennium Copyright Act) takedown notices against the websites that are ripping him off. No kidding.

As with any penalty, the worst part is “not knowing”. I wrote a blog post in 2009 that offered up some of my own suggested guidelines for Google in handling reconsideration requests. I’m still convinced this could work:

  • The penalized webmaster fully and truthfully completes a reconsideration request wizard, which walks him through the Google webmaster guidelines and identifies compliance issues.
  • Google immediately acknowledges receipt of the request and has 10 business days to respond.
  • The response, at minimum, informs the webmaster what areas of his site do not conform to the guidelines and what penalty has been assessed.
  • Google informs the webmaster that his penalty will be removed within X days/weeks/months after coming into compliance.

Is that really too much to ask? I don’t think so.

Black Hats have co-opted Panda. They are filling the search results with scrapers. They build ‘em big and build ‘em fast. They use the list of websites that have been Pandalized & scrape the hell out of them. Unfortunately, the current Google also rewards these scrapers.

Here’s a novel idea: Perhaps the answer to the “Scraper Problem” is to release from Panda’s grip all of the websites that have worked so hard to come into compliance? What do you think?


As a social marketing tool, StumbleUpon Paid Discovery (a.k.a. StumbleUpon ads) is underrated. Paid Discovery helps people who are likely to be interested in your infographic – or your clever article or how-to guide, etc. – find your infographic at a fraction of the cost of traditional promotional methods.

When we promote an infographic we use the following tactics:

  • Outreach to industry and topic-related bloggers through email and Twitter
  • Search engine optimized press release, which is sent through MarketWire
  • Sponsored tweets campaign targeting industry bloggers, journalists, etc.
  • Facebook ad campaign that promotes our infographic to people with a stated interest in a relevant topic or industry
  • Old-fashioned begging of colleagues, friends, and family members to post our infographic on Facebook, Twitter, blogs, etc.
  • StumbleUpon Paid Discovery

Of all the methods we use, blogger outreach is by far the most effective. While promoting our recent American family debt facts infographic, we contacted dozens of financial and economic bloggers to great effect. The downside: Blogger outreach is incredibly time consuming.

StumbleUpon Paid Discovery, which is perhaps the least time-consuming and lowest-cost tactic, isn’t necessarily the second most effective tactic on our list. Oftentimes, begging colleagues, friends, and family can give a new infographic a nice boost, and we’ve gotten lucky a couple of times with press releases, Facebook ads, and sponsored tweets. Yet when measured over the long term (six months or longer), Paid Discovery generated at least as many likes, tweets, and links as any other method listed.

The reason that Paid Discovery is so effective is that it’s a great way to generate long-term traffic. Take a look at the StumbleUpon traffic report for our Colorado Beer Facts infographic we created and initially promoted via Paid Discovery in September 2010:

The graphic shows that StumbleUpon generated more than 20,000 page views for the Colorado beer infographic over the last year or so. These 20k+ views all stemmed from an initial investment in only 1,000 paid views in September 2010. When StumbleUpon users who see our content during our paid advertising give us a thumbs up, StumbleUpon rewards us with unpaid impressions. Over time, as more and more people thumb up our content, we gain more unpaid impressions. As you can see, our unpaid impressions exploded in October 2010 and again in January 2011.
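Using the article’s own figures, the earned-to-paid ratio is easy to sanity-check as a back-of-envelope calculation:

```python
paid_views = 1_000    # initial Paid Discovery purchase (September 2010)
total_views = 20_000  # StumbleUpon traffic observed over the following year

earned_views = total_views - paid_views       # impressions StumbleUpon served for free
earned_per_paid = earned_views / paid_views   # 19 free views for every paid view
```

That multiplier is the whole appeal of the channel: the paid views are just seed traffic for the free ones that follow.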

During the last year, the number of Facebook Likes for our Colorado beer infographic has about doubled. We’ve seen a handful of tweets show up in our alerts, and despite no active promotion, the infographic continues to be embedded by other website owners (albeit at a slow pace). While it’s hard to attribute all of these likes, tweets, and embeds directly to our initial investment in Paid Discovery, StumbleUpon is the largest source of traffic for this particular page. Other content we’ve promoted via Paid Discovery shows similar results, so it seems to be a great ongoing link building and social marketing tool.

Tips For Using Paid Discovery To Promote Your Infographic

First, Paid Discovery has three different pricing levels, but all levels allow you to show your content to users by their stated interest. Investing in the “standard” pricing level allows you to learn things about how each targeted audience interacted with your content, which can help you figure out the interest area that best matches your infographic. Typically, we’ll identify three to five interest areas, and then purchase 300 or so impressions for each interest area at the “standard” price level. A few hours later, we have data that shows interactivity for each interest area, which we can use to buy more paid impressions at the lower price level. Basically, we run a test at the higher price, and then spend more money on a lower-priced bracket when we know which interest areas are likely to generate traffic long-term.
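That test-then-scale step amounts to ranking the tested interest areas by engagement and keeping the strongest ones. A hypothetical sketch; the interest areas, rates and thresholds are invented for illustration, not StumbleUpon’s actual reporting format:

```python
def pick_interest_areas(engagement, top_n=2, min_rate=0.05):
    """Rank tested interest areas by engagement rate and keep the strongest.
    `engagement` maps interest area -> engaged visitors / paid impressions."""
    ranked = sorted(engagement.items(), key=lambda kv: kv[1], reverse=True)
    return [area for area, rate in ranked[:top_n] if rate >= min_rate]

# Hypothetical results from a 300-impression test per interest area
results = {"banking": 0.04, "economics": 0.12, "financial planning": 0.07}
winners = pick_interest_areas(results)  # where to spend the cheaper follow-up budget
```

The point is simply to let a small, higher-priced test decide where the larger, lower-priced spend goes.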

Next, we’ve found that it’s critical to target one interest area per campaign. This allows you to see which interest area generates the best response, which allows you to target additional spend accordingly. For the American family debt infographic mentioned earlier, we targeted StumbleUpon users with an interest in banking, economics, and financial planning. Of the three, users with an interest in economics were the most engaged and the most likely to share our graphic on Facebook or Twitter.

Also, remember that you can use the demographic data gathered from your Paid Discovery campaign to inform your Facebook ad targeting (and vice versa). If males over 40 were more likely to interact with your graphic than males under 40, you can adjust your Facebook ad bids accordingly. Obviously, it’s only wise to do this sort of thing if you have enough data.

There are also some general tips that can make or break your infographic in terms of Paid Discovery:

  1. Get the content as high up on the page as possible. Infographics almost always stretch below the fold, but the more of the graphic that you can get above the fold the better.
  2. Social proof: Showing the number of Facebook Likes at the top of the page can also get Stumble users to stick around and view your graphic.
  3. Make sure you do everything you can to make the content load as quickly as possible. If the graphic doesn’t load fast, Stumble users will just leave.

Finally, the more you use Paid Discovery to promote content the more you can use it to figure out when you have content that isn’t working. When an investment in paid StumbleUpon impressions doesn’t generate any free stumbles, it’s a signal that the campaign is targeting the wrong interest area or the content needs improvement.


Blekko Funded by Yandex

On Wednesday, search engine startup Blekko received $30 million in funding from various investors including Yandex, Russia’s top search engine. Trailing only Google, Bing, and Yahoo, Blekko is the fourth-largest search engine in the US. With its motto “slash the web,” Blekko strives to provide search results free of the spam and content farm pages that are often found in the dominant search engines.

The cash infusion, combined with the resources offered by Yandex, should help Blekko move closer to its goal of obtaining one percent of all US search traffic. Blekko’s CEO Rich Skrenta commented:

“Yandex is a partner and investor that shares our mission of making search the best experience it can be. Having access to one of the world’s top pools of search talent and the fantastic products they have built will help us grow Blekko in the U.S.”

Blekko is hopeful that the funding and access to Yandex’s extensive search index and technology infrastructure will help to improve long tail and obscure searches and expand its market share.

Arkady Volozh, Principal Founder and CEO of Yandex, will sit on Blekko’s board. Volozh recently stated, “We love blekko and think it’s a great product — a quality search engine that organically combines search algorithms with expert opinions.” Russian search engine Yandex is the largest in Europe and has over 20 years’ experience building its core search technology.

Since launching in 2007, Blekko has secured a total of $54 million in funding. Although many search startups have tried to take on Google and failed miserably, Blekko remains a growing force in the US search market. While its search volume pales in comparison to its competitors’, the new round of funding and alliance with Yandex sends a message to the other major players that Blekko may slash away a little more of their search volume.

Sources include: [SFGate, Mashable, and MarketWatch]


Today we’re very excited to bring real time data to Google Analytics with the launch of Google Analytics Real-Time: a set of new reports that show what’s happening on your site as it happens.

Posted by Aaron Wheeler

It’s often pretty difficult to make a short title for a webpage that offers a lot of varied or super-specific information. At SEOmoz, we say that the best practice for title tag length is to keep titles under 70 characters. That’s pretty pithy considering that the title also includes your site or brand name, spaces, and other nondescript characters. So, does it matter if you go over 70 characters? How important is it to strictly adhere to this best practice? Cyrus Shepard does SEO for us here at SEOmoz, and he’ll answer that very question in this week’s Whiteboard Friday. Think title tags could or should be longer? Shorter? Let us know in the comments below!


Video Transcription

Howdy SEOmoz! Welcome to another edition of Whiteboard Friday. My name is Cyrus. I do SEO here at SEOmoz. Today we’re talking about title tag length. How long is your title tag?

Bad title tag joke. For years, we’ve been telling people that the length of your title tag should be 70 characters or less, that this is best practice. But what does this really mean? Is it absolutely true? What happens if your title tags are longer than 70 characters? For example, the title of today’s post is 77 characters. Not this visible title, but the actual HTML title tag: if you look at the source code, you’ll find that the title tag of today’s Whiteboard Friday is 77 characters. We’re actually over the 70-character title tag limit. Is that bad? Are we going to go to SEO hell for that? What does that mean?

Well, recently people have been doing some experiments to see just how many characters Google will index within a title tag. For years, we thought it was 70. It has fluctuated. But recent experiments have shown that Google will index anywhere from 150 characters on up — one person even showed that Google will index over 1,000 characters — and I will link to these experiments in the post. But does this mean that you should use all of those characters to your advantage? Can you use them to your advantage? Well, I got really curious about this. So I decided to perform some experiments here on the SEOmoz blog with super-long title tags. We’re talking extreme title tags, like 200 characters long, 250 characters long. I just blew them out of the water to see what would happen.


For the first experiment, I took 10 posts that did not get a lot of traffic, but whose traffic was pretty consistent from week to week. I kept the old title tags and just extended them with relevant keywords up to about 250 characters long. The results blew me away. In that first experiment, my traffic, over about a 12-week period, rose 136%. I’ll try to include a screenshot of the Google Analytics in the comments below. It exploded. I got really excited. So I tried a second experiment. (Correction: the experiment took place over a 6-week period, not 12 like I stated in the video.)


The second experiment I tried on existing successful pages, pages that were already getting a fairly high volume of traffic and a consistent level of traffic every week. In that experiment, over about the same 12-week period, traffic rose 8%. Cool, but overall site traffic rose 9%. So it was actually 1% below the site average.

For a third experiment, I tried again on a completely different site, a personal site. I changed the title tags on a few pages. Traffic on those pages actually went down 11% over a 12-week period. Overall traffic on that site went down 15%.

So, in one of these experiments, the long title tags seemed to work really well. In the other two, it just seemed to be a wash. Why did it work in the first experiment but not the others? I am going to get to that in a minute.

Title Tags Less Than 70 Characters

Now, what are the arguments for short title tags? The best practice that you always hear about: keep it less than 70 characters. There are reasons why this is a best practice and why we recommend it time and time again.

The first reason is that Google will generally only display the first 70 characters in their SERPs. After that, titles are truncated. Users aren’t going to see them. So, if you are writing title tags longer than 70 characters, you’re basically writing them for the search engines, and time and time again we’ve found that if you’re doing something specifically for search engines and not for users, there is probably not a lot of search engine value in it. There might be some, but probably not much.
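Google’s exact truncation rules aren’t public (and snippet display has changed over the years), but a rough character-based sketch of what a user would see in the SERP, breaking at the last full word before the limit, could look like this; the 70-character cutoff and the example title are illustrative assumptions:

```python
def serp_preview(title, limit=70):
    """Approximate how a long title might be cut off in a search result.
    This is a character-based sketch; the real display rules differ."""
    if len(title) <= limit:
        return title
    cut = title[:limit].rsplit(" ", 1)[0]  # break at the last full word
    return cut + " ..."

# A hypothetical 85-character title: everything past the cutoff vanishes.
long_title = ("10 Crucial SEO Title Tag Experiments and What 250 Character "
              "Titles Did to Our Traffic")
print(serp_preview(long_title))
```

Anything you stuff past the cutoff simply never reaches the searcher’s eyes, which is the practical force behind this rule.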

The second reason is our Correlated Ranking Factors, a study that we perform every couple of years. Our highest on-page correlation value for keyword-specific usage was the keyword appearing as the first word of the title tag: a 0.09 positive correlation. It is not a huge correlation, but it was our largest on-page keyword factor. Year after year when we perform these correlation studies, we see a direct correlation between the position of the keyword in the title tag and how important it is in the query. So, the closer the keyword is to the beginning of the title tag, the more likely it is to be important in the query. You’re going to see this time and time again. It’s very consistent. Hundreds of webmasters know this from personal experience. You want your keywords at the beginning of the title tag to rank for those keywords. The further out you place them, say at 220 characters, those keywords aren’t going to count for very much.

Title Tag Best Practices

Now the third reason is kind of new in today’s world, and that is the rise of social media. Twitter limits tweets to 140 characters. So, if you have a 220-character title tag and you’re trying to share it on Twitter through automatic tweets, or on Facebook or wherever, it looks spammy, it’s not shareable, people don’t want to share it. Shorter, snappy title tags work really well.

For all these reasons, and because most of the time we’ve found that longer title tags don’t help you, we say that less than 70 characters is best practice. Now, people get confused about what we mean when we say best practice. Does it mean an absolute rule? No. It just means that the best practice works most of the time. It’s going to be your best bet. All other things being equal, it’s going to be what you want to implement, what you want to teach people to do, and generally how you want to practice.

So, what happened in that first experiment? Why did traffic rise 136%? Well, if you remember, those were low-volume pages, pages that weren’t getting a lot of traffic anyway. The reason it rose, we suspect, is because those title tags were poorly optimized in the first place. They didn’t match the content. When we added a few keywords to the end, Google interpreted that as, hey, these match the content a little bit better, and that’s why it rose. It was a fluke. If we had written the title tags better in the first place, we could have seen this traffic all along.

So, with this in mind, I have some suggestions for your future title tag use, and the best practice is going to continue to be less than 70 characters.

Best Practices are Guidelines, Not Rules

The first rule is always experiment. Like I said, if we had tried something else, if we had written different title tags in the first place, it could have helped us. What did it cost us to change those title tags? Zero. If your pages aren’t performing well, you can always try something different, and you should. I still see large eCommerce sites all the time that put their brand name in the first 20 characters of the title tag on thousands of pages where they shouldn’t necessarily do that. SEOmoz did that for a number of years, up until a few months ago. So, always experiment, not too much, but always try different things to see what title tags are going to work best for you.

Second is write for users. Here at SEOmoz our title tag is the same as the title of our post on our blog, because we think it is important to meet users’ expectations. When they see a title tag in the SERP and click through to your page, you want them to feel like they’ve arrived where they thought they were going to arrive. So, it doesn’t always have to match the title of your post, but it should be something similar, something to make them comfortable, something that speaks to the users.

Third, remember to keep your important keywords first. Putting your important keywords far out at the end of the title tag isn’t going to help you much, unless your titles are so poorly optimized in the first place that you really should rewrite them anyway. So, put your important keywords, they don’t always have to be in the very first position, but as close to that first position as you can.

Lastly, what happens if your title tag is over 70 characters, such as the title tag of today’s Whiteboard Friday post at 77? Don’t sweat it. In our Pro Web App, if you go over 77 characters, we issue a warning. It is not an error. It’s a warning. We just want you to know that if your title tag is over that limit, it might not be the best-written title tag. You might want to have a look at it. But here at SEOmoz we have thousands of title tags that go over the 70-character limit, and for the most part, we’re going to be fine. Best practice means that it’s best most of the time, but you can go outside of best practice if it’s warranted.

Remember: experiment, try different things out, and find out what works best for you.

That’s it for today. Appreciate your comments below. Thanks everybody.

Video transcription by
