Which Link Metrics Should I Use? Part 2 of 2 – Whiteboard Friday

Posted by Aaron Wheeler

 We all know that it can be difficult, at first, to decide which link metrics are most valuable and when to use them. Last week, Rand outlined and defined a variety of metrics used to assess the respective values of domains, pages, and the links between them. This week, he’s back with the stunning conclusion: how to actually use these link metrics in your research and how to choose the right metrics for a given research situation. If you were ever confused about when to use PageRank and when to use mozRank, fret no longer!

 

Video Transcription

Howdy, SEOmoz fans! Welcome to another edition of Whiteboard Friday. Today the exciting conclusion, Part 2 of 2, on which link metrics to use. So, last week we discussed in depth a ton of the link metrics that are available, what they mean, what they do, how you can interpret them. Today I want to walk through some of the specific tasks and projects that you are going to be involved in when you are doing SEO kinds of things and which metrics can help you to accomplish those specific tasks.

First up, let’s say I am doing a high level SERPs analysis, something like the keyword difficulty tool output report where it is trying to give me a sense of who is in the top 10, who is in the top 20. Why are they ranking there? Is it because they are an exact match domain? Do they have a lot of good anchor text? Do they have a ton of links into them? Is it because their domain is important or their page is important? We can look at a few key metrics. I like looking at page authority, which is that aggregate of all the mozMetrics and domain authority and then maybe the number of linking roots and C-blocks just to give me a rough idea of kind of what I am dealing with. That high level SERPs analysis is great when I am doing like a keyword difficulty report trying to determine which keywords to go after, whether it is roughly possible for me to be ranking in those sectors.
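As a rough illustration of rolling those metrics into one number, here is a minimal Python sketch of a keyword-difficulty style score. The weights and the log scaling are invented for the example; they are not the actual keyword difficulty formula:

    import math

    def difficulty_score(results):
        # results: one dict per top-ranking URL, with page authority (pa),
        # domain authority (da), linking root domains, and linking C-blocks.
        scores = []
        for r in results:
            # Log-scale the raw link counts so one huge site doesn't dominate.
            roots = math.log10(max(r["linking_roots"], 1)) * 10
            cblocks = math.log10(max(r["c_blocks"], 1)) * 10
            scores.append(0.4 * r["pa"] + 0.4 * r["da"] + 0.1 * roots + 0.1 * cblocks)
        return sum(scores) / len(scores)  # higher = harder SERP to crack

    top10 = [{"pa": 62, "da": 71, "linking_roots": 1400, "c_blocks": 900},
             {"pa": 45, "da": 80, "linking_roots": 550, "c_blocks": 410}]
    print(round(difficulty_score(top10), 1))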

If I want to do some link quality analysis, so I am looking particularly at a link and trying to determine is this link helping me to rank? Is it potentially hurting me? If I am looking maybe at a client’s website, say I was doing consulting or I am a new SEO in an in-house position and I am trying to analyze whether some links that were built previously are questionable or not, there are some really good ways to do that. One of my favorites is looking at PageRank versus mozRank and mozTrust.

Normally, what you should see is that PageRank and mozRank are pretty close. If PageRank is a 3 and mozRank is like a 4.5, it might be okay; it’s a little on the border. If it is a 3 and a 3.5, oh, that’s perfectly fine. That’s normal. We should expect that. If, however, I am looking at a PageRank of 3 and a mozRank of like 5.8, something is fishy, right? Google probably knows about more links than SEOmoz does, and for mozRank to be that high and PageRank to be that low, something might be up. Something might be going on where this site is selling links, Google has caught them, they are doing something manipulative. This could be a problem. Then I also like comparing mozTrust, because a lot of times you won’t see PR scores, especially for a lot of new sites and pages. Google hasn’t gotten the data there, or the PR is outdated and the site has built a lot of links in the meantime. By the way, you do want to be careful of that too when you are comparing PR and MR. But mozRank and mozTrust: if I see like a 5.8 and a 7.2, this is probably a phenomenal link. If I see a 5.8 and a 2.2, that’s a bad sign. That usually means this site or this page has gotten a lot of links, but from a lot of very questionable sources. Otherwise, its mozTrust would be quite a bit higher.
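As a sketch, those gap checks could look like this in Python; the 2.5-point thresholds are invented to mirror Rand’s examples, not official cutoffs:

    def link_quality_flags(pagerank, mozrank, moztrust):
        # pagerank may be None for new pages Google hasn't scored yet.
        flags = []
        if pagerank is not None and mozrank - pagerank >= 2.5:
            # e.g. PR 3 vs mozRank 5.8: something is fishy
            flags.append("mozRank far above PageRank: possible penalty or link selling")
        if mozrank - moztrust >= 2.5:
            # e.g. mozRank 5.8 vs mozTrust 2.2: juice without trust
            flags.append("mozTrust well below mozRank: questionable link sources")
        return flags or ["looks normal"]

    print(link_quality_flags(3.0, 3.5, 3.2))  # -> ['looks normal']
    print(link_quality_flags(3.0, 5.8, 2.2))  # -> both warnings fire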

So, those types of analyses go along with looking at not just the number of links but the split of external versus internal links. If it’s a lot of internal links, maybe that is boosting up the ranking, but it will be easier to overcome than a high number of external links. The same goes for followed versus no-followed: if a lot of the links coming to the site are no-followed, that is a different story than if all the links are followed.
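A quick ratio check along those lines might look like the following; the 80% cutoffs are purely illustrative:

    def ratio_notes(total_links, external, followed):
        notes = []
        if (total_links - external) / total_links > 0.8:
            notes.append("mostly internal links: ranking may be easier to overcome")
        if (total_links - followed) / total_links > 0.8:
            notes.append("mostly no-followed links: little ranking value passed")
        return notes

    print(ratio_notes(total_links=1000, external=120, followed=150))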

Now, if I am looking at outreach and popularity, I am trying to say, how popular is this blog? How important is this news website? How big and popular on the Web do I think this forum is or Q&A site or community? Then, I want to be looking at some of those high level metrics, but I might want to dive sort of one step deeper and look at, yes, domain authority. I really care about domain metrics here, right? Not individual pages on those sites. So, I am looking at Domain mozRank and Domain mozTrust, which are the same thing as mozRank and mozTrust but on the domain wide level, and then I might care a lot about the linking roots and C-blocks, because that tells me a raw popularity count. How many people on the Web have referenced this guy compared to other people?

Now, if I am looking and trying to sort by the most helpful links to raise my ranking, say I am analyzing a set of 50 blogs and I want to decide: who am I going to guest blog for first? Who do I really think is going to be providing that value? Or I have the opportunity to sponsor or speak at a conference or contribute in some way, and I know that I can contribute the content or whatever else I need to get those links. Here, I care a little bit less about the metrics and a lot more about these big three questions. So, before you look at the metrics, I would ask you to ask yourselves these three questions, particularly if you are doing that sort of detail-level analysis.

Number one, how well does that page or that site rank? If you search for a few keywords that are in the title tag of this particular page or the homepage of the site and it does not come up in the number one or number two positions, that might not be a good sign. If you search for four or five keywords that compose a phrase in the title and it is still not coming up, something is seriously wrong there. There might be some issue with that site in Google.

How relevant and useful is it? Is this site going to send actual traffic? Was the link editorially given? Is it a true citation that represents an endorsement from one site, one page to another? If that is not the case, you might be in trouble in the future. Even if Google hasn’t caught it yet, Bing hasn’t caught it yet, in the future, that might be causing problems. It is just not worth it. Go spend your time on other links that are editorial, sincere citations.

Do the sites and pages it links to rank well? This is a great way to analyze directories or link lists or those kinds of things and say, oh, this looks highly relevant. It is a pretty good site. If the pages that it is linking to don’t rank well for their keywords, that’s a bad sign. If a few of them don’t, okay maybe, you know, everybody links to a few bad apples. But if a lot of them are not ranking well, something is going on there, right?

Next, I might look at some metrics like mozRank versus PageRank as we did above, mozRank versus mozTrust, and the number of links and linking root domains, just to get a sense of these. But those three questions, more so than any metric, are going to really answer the question of how helpful this particular page or site will be in raising my rankings if I get a link from it.

Next, second to last here, is sorting of links. Say I want to do a rough or raw sort of a bunch of links that I exported from Google, or from a tool that analyzed a bunch of pages and figured out whether they were useful. Maybe I used that great tool in SEOmoz Labs that helps me find all the search queries I could use to find potential links – what is that called? I think it is the Link Acquisition Assistant. So, the Link Acquisition Assistant might export a bunch of raw lists of pages, and if I want to do some raw sorting to get a general sense of importance before I start asking these questions, PA and DA are really good for that, and so is the number of linking roots. Inside the web app you will see a lot of these; we tend to show at least those three metrics on most everything so you can do a rough sort.
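For that kind of raw sort, a few lines of Python over the exported list will do; the field names are hypothetical stand-ins for whatever the export actually calls its columns:

    prospects = [
        {"url": "http://example-blog.com/", "pa": 55, "da": 60, "linking_roots": 300},
        {"url": "http://another-site.com/", "pa": 70, "da": 65, "linking_roots": 900},
        {"url": "http://small-forum.net/", "pa": 41, "da": 38, "linking_roots": 45},
    ]

    # Sort by PA, then DA, then linking root domains, best first.
    prospects.sort(key=lambda p: (p["pa"], p["da"], p["linking_roots"]), reverse=True)
    for p in prospects:
        print(p["url"], p["pa"], p["da"], p["linking_roots"])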

Finally, last but not least, if I am doing a deep SERPs analysis, where I really want to know why does this particular page, why does this particular site rank where it does? Why is this 3 and this 2 and this 4? I want every metric I can get my hands on. The reason is because when you analyze these things all together in Excel, you can see weak points, strong points. You can get a sense of what Google is using or Bing is using in that particular grouping or algorithmic result to try to determine who should rank higher and lower, and that will give you a great sense of what you might need to do to accomplish those rankings.
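One simple way to set that up is to dump every metric for every result into a CSV and open it in Excel. A minimal sketch, with example columns and made-up values:

    import csv

    serp = [
        {"position": 1, "url": "http://a.com/", "pa": 68, "da": 74, "mozrank": 6.1,
         "moztrust": 6.5, "linking_roots": 820, "external_links": 15400},
        {"position": 2, "url": "http://b.com/", "pa": 61, "da": 70, "mozrank": 5.7,
         "moztrust": 6.0, "linking_roots": 510, "external_links": 9100},
    ]

    with open("serp_metrics.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(serp[0].keys()))
        writer.writeheader()
        writer.writerows(serp)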

All right everyone, I hope that this two part Whiteboard Friday extravaganza has been great for you. I look forward to the comments on the blog. Take care.

Video transcription by SpeechPad.com


SERPd.com is Changing Hands

FOR IMMEDIATE RELEASE:

SERPd.com is Changing Hands

Houston, Texas – March 31, 2011 – Just seven short months after the site’s inception in September of 2010, SERPd.com has been bought for an undisclosed dollar amount.  This marks the fastest acquisition of a niche Web 2.0 community site.

Currently, there is a media blackout in place that prevents either party from disclosing the full details of the transaction.  This includes identifying the new owners of the site.  However, the creators of SERPd are able to say that they are very pleased with the terms of this deal and extremely optimistic about the future of the site.

What does this mean for SERPd?

Users will be happy to know that not much will change in terms of the type of content and the overall goals of SERPd. Founders Chris Burns and Gerald Weber will remain on the site’s staff and continue in a community manager role.

As SERPd undergoes this new “changing of hands,” the site will be doing away with the voting model. The general consensus is that the voting model is dying and we are looking forward to seeing the site move in a more modern Web 2.0 direction.  Because this is also what SERPd’s users wanted, this new direction is being embraced with open arms.

The voting system will remain in place for approximately ten days until the new moderation system is put in place.  The site will then transition to its new editorial model.

That being said, Gerald wants to personally thank all of the devoted members of SERPd for making this possible.  The team is looking forward to this new change and a bright future for the site and all of its members.

Additionally, because there will be an increased need for active moderation as a result of the site’s new ownership and model, anyone interested in being involved in scouting quality SEO content should send an email to staff@serpd.com.

Thanks again folks, it’s been an awesome ride!

Please leave a comment if you have any questions or concerns about this exciting new change for the SERPd community.


UPDATE!

Happy April Fools’ Day, folks!

This was our April Fools’ Day prank, guys and gals.

SERPd is not changing the voting model or anything else. We were just joshin ya!

We had also added a silly logo to the SERPd site, but not many people seemed to notice that part of the prank.




Local Search O-Pack and the Art of Title Tags

We all know how important title tags are when it comes to SEO. Not only do they matter for ranking, but the title is also the first piece of information a searcher sees about your site. I like to think of a title tag as a first date: you want to look better than you really are, and you hope you impress just enough to cover up the blemishes someone might get to know later.

So, the question is how do title tags translate on the local playing field? I am sure by now you have seen that when the integrated local results show up (aka the O-Pack), there are usually quite a few organic listings tied to a Places pin, like the following…

[Screenshot: organic listings tied to Places pins – local titles]

When the O-Pack came out, it was a big enough change to the locosphere that I decided that I needed to do some extensive research into what makes the pack tick. So, I put together a study of the following over the past few months…

28 Google Places listings that are ranking 1-7

28 Google Places listings that are ranking 50-56

Listings were examined from the following keywords:

  1. Chicago Personal Injury Lawyer
  2. New York Divorce Lawyer
  3. San Diego Dentist
  4. Dallas Dentist

In all, I examined 56 listings in 4 cities across the country in the law and dental categories. I generally find that these are well-optimized categories in local search that have very little “luck” involved in rankings.  While the research covered 27 separate factors, one of the things I found most interesting was the use of title tags.

Where Does Google Places Get the Title Tag From?

Generally speaking, the first 5 results in the O-Pack pull from the title tag of the website page that Google thinks is most relevant, and the remaining listings pull from the Business Name given in the Places account.

[Screenshot: where Places listings pull their title tags from]

This was the case for all results that were examined in my research. I have seen a few exceptions to this, but only a few. So, bottom line, gone are the days when you could get by without a good website for a Google Places listing. You used to be able to rank a company that didn’t have a website, and you could work wonders with companies that had a website (even if it was built with Website Tonight).  Now, not only does the title from your site populate on your Places listing, but it also affects rankings.
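The pattern I observed boils down to something this simple (a toy model of what those 56 listings showed, not documented Google behavior; the business is made up):

    def places_display_title(rank, page_title, business_name):
        # Top 5 O-Pack spots showed the website's title tag;
        # spots 6 and below showed the Places business name.
        return page_title if rank <= 5 else business_name

    print(places_display_title(3, "Dallas Dentist | Smith Family Dental", "Smith Family Dental"))
    print(places_display_title(6, "Dallas Dentist | Smith Family Dental", "Smith Family Dental"))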

How Many Websites Had The Keyword Search Phrase In The Title Tag?

22 of the 28 High Ranking Places listings (79%) had the keywords in the website title tag, whereas only 12 of the 28 Low Ranking Places listings (43%) did.

16 of the 28 High Ranking Places listings (57%) had the keywords first in the title tag, versus 8 of the 28 Low Ranking Places listings (29%).

How many Listings have the Business Name in the Title Tag?

17 of the 28 High Ranking listings (61%) have the Business Name in the website title tag, versus 15 of the 28 Low Ranking Places listings (54%).
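For anyone double-checking, the percentages come straight from the raw counts over 28 listings per group:

    counts = {
        "High: keywords in title": 22, "Low: keywords in title": 12,
        "High: keywords first": 16, "Low: keywords first": 8,
        "High: name in title": 17, "Low: name in title": 15,
    }
    for label, n in counts.items():
        print(f"{label}: {n}/28 = {n / 28:.0%}")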

All of this data is shown in the graph below….

[Graph: title tag usage in high-ranking vs. low-ranking listings]

A few interesting observations…

When the O-Pack came out in late October, it didn’t take very long to realize that normal SEO ranking factors were now a very large part of local search. My studies have verified that to me. But, I also felt that there were some factors that were being overlooked.

Local Search is all about proving your local prominence through your Business Name, Address, and Phone Number (NAP). I couldn’t help but think that businesses would do well to include all of this information in a title tag.  Only 1 business had their phone number in the title tag; none had their address. But as I looked deeper into the listings (past the top 7 in each category), I didn’t see any listings that had this information. It simply hasn’t been done on a large scale.

So, I ran a few tests, and the results looked a lot like this…

[Screenshot: test results with NAP information in the title tag]

The site in the first position doesn’t have a superior link profile, a higher PageRank, a crazy amount of citations, or anything else that would place this listing ahead of many others on this list. But it has the NAP information prominently displayed in a big way.  I haven’t shared this with many people, as it isn’t something I can prove or disprove with the little data I have, but the results are interesting to say the least.

My Thoughts On A Title Tag For Local Search

For rankings, I think it is very apparent that having the keyword phrase in the title is extremely important, probably right at the front. This is new in local search, but it is simply a transfer of our normal, good ol’ SEO factors.

For Recognition, I think that it is very important to include your business name in your home page title tag. Many people try to stuff a title with only keywords. But from a local search perspective, you should be advertising your business in a lot more places than just online; if searchers see your business name and recognize it from friends, billboards, print, phone books, or anywhere else, the chance of you getting the click or call goes up dramatically. Does a business name affect rankings? I don’t know, but it doesn’t hurt them, and I definitely think it will help your click-through rate if it is included in your title.

For Best Results, if you are in the top 7 then you better get to the top 5. I would much rather have control over what a user sees as the title to my listing than letting Google show only the business name.
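Putting those three recommendations together, a home page title could be assembled along these lines. The 70-character cap is a rough display convention rather than a documented Google limit, and the business details are made up:

    def local_title(keyword, business, phone=None, max_len=70):
        # Keyword phrase first, then business name, then NAP details.
        parts = [keyword, business] + ([phone] if phone else [])
        title = " | ".join(parts)
        # Drop the phone number if the title would run too long.
        return title if len(title) <= max_len else " | ".join(parts[:2])

    print(local_title("Dallas Dentist", "Smith Family Dental", "(214) 555-0142"))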

There is a lot more information that I have found interesting during my research of the O-Pack results from Link Profiles to Reviews, and Citations to Category choices. I will be publishing more in the coming weeks, and will probably release the data sheet for anyone to see what they can find as well.




Is Syndicated Content Duplicate Content?

One of the challenges we all face, whether as Webmasters, Bloggers, SEOs, or in any other related field, is the sheer volume of knowledge we need to properly perform our work and advise our clients.

These are “full-time, plus” jobs!  SO much reading is required to really be informed and there is only so much time in a day.

Yet, many Webmasters and Bloggers we interact with aren’t full-time, having other responsibilities, which is understandable in today’s world and economy.

I believe this is where a lot of the misinformation begins. We can all be guilty from time to time, so I point the finger at no one. But I’d like to clear up a misconception about what is (and isn’t) “Duplicate Content” and what is “Syndicated Content”.

Duplicate Content is Any Content on Another Website, Right?

Wrong. Or at least it’s not that simple.

Duplicate Content is the same or very similar content on the SAME website.
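If you wanted a rough way to spot that kind of duplication on your own site, a crude sketch with Python’s difflib might look like this. The 0.9 similarity threshold is invented for illustration; search engines certainly use more sophisticated measures:

    from difflib import SequenceMatcher

    def near_duplicates(pages, threshold=0.9):
        # pages: dict mapping URL path -> extracted text, all from ONE site.
        paths = list(pages)
        pairs = []
        for i, a in enumerate(paths):
            for b in paths[i + 1:]:
                if SequenceMatcher(None, pages[a], pages[b]).ratio() >= threshold:
                    pairs.append((a, b))
        return pairs

    site = {"/widgets": "Buy our blue widgets today. Free shipping on widgets.",
            "/widgets-2": "Buy our blue widgets today! Free shipping on widgets."}
    print(near_duplicates(site))  # flags the near-identical pair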

Have you watched the TV news and seen someone being interviewed or offering commentary who is labeled as a “Syndicated Columnist”? Or ever seen a story reported by the A.P. (Associated Press) in your hometown newspaper or on the TV news? That same story will appear in many news outlets. These are examples of Syndicated Content at work.

Search Google News and find the same news stories picked up on multiple websites.

Quality content like news stories, press releases, and articles all circulate this way and are considered Syndicated Content. If the content is good, it’s expected to be seen in multiple places.

Syndicated Content is natural. It’s existed for decades. This is the way things have been long before Google or the Internet even existed.

NOTE: Quality Syndicated Content on the Internet has nothing to do with PLR (private label rights) articles that can be re-used by anyone buying them. Syndicated Content as I describe it is originally written by the Author (or ghost written only for that Author) and that Author maintains ownership, sharing the work subject to the Terms of Use of the blog(s) or website(s) using that content, often under a Creative Commons license.

But Using Syndicated Content Will Get My Site Banned from Google!

Really? No, it won’t. Not even close.

Don’t take my word for it; see what Google says right here: http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=66359

Specifically, about Syndicated Content, Google says:

“If you syndicate your content on other sites, Google will always show the version we think is most appropriate for users in each given search, which may or may not be the version you’d prefer.”

By contrast, what does Google say about Duplicate Content on that same page?

1. “However, in some cases, content is deliberately duplicated across domains in an attempt to manipulate search engine rankings or win more traffic. Deceptive practices like this can result in a poor user experience, when a visitor sees substantially the same content repeated within a set of search results.”

2. “Duplicate content on a site is not grounds for action on that site unless it appears that the intent of the duplicate content is to be deceptive and manipulate search engine results.”

3. “In the rare cases in which Google perceives that duplicate content may be shown with intent to manipulate our rankings and deceive our users, we’ll also make appropriate adjustments in the indexing and ranking of the sites involved.”

My advice is, and always has been: don’t try to manipulate Google’s ranking algorithm.

White-hat SEO your sites. Use high-quality Syndicated Content, Original Content (or both) to build website content and to build links.

But don’t manipulate Google by using Black-hat techniques and violating Google’s Terms of Use.

Why Would I Want to Use Syndicated Content Instead of Original Content?

Original Content is always better, right? Well, maybe, but maybe not.

First, what is original? Are we only talking about a brand new idea that’s never been written about before? That would eliminate probably 99% of possible content! Or, are we just talking about content that would pass Copyscape®?

Personally, I wouldn’t want content that had been syndicated to 50 or 100 sites. That’s just my take.

But what about a good article or blog post that’s been Syndicated to maybe 5 or 10 sites?

If the article or post was really good, you wouldn’t want it on your site?

Really?

Understanding Both the Blogger’s and the Author’s Needs and Points of View

Bloggers – If you use User Generated Content and want to grab the big “take away” from this post, here it is: Think like your Authors.

Your Authors and Guest Bloggers have costs for their content, either in time, money, or both. You have something you need from them (quality content) AND they have something they need from you (links, qualified traffic, or both).

If your Authors can spread their cost of content over a slightly larger number of blogs or other websites publishing the article or post, they can invest more in it! More research can go into the article; it can be longer, more precisely worded, and better edited.

Crappy content is crappy content, whether it is an Article, a Blog Guest Post, whether it’s Original or Syndicated.

Authors – Think like the blog and website owners accepting your content, since it’s not all about you, either. And if you’ve agreed to supply Original Content, do so. Do NOT misrepresent your content!

Both sides’ needs have to be met; otherwise, the relationship doesn’t work! Oftentimes, we’re only thinking about our side of the deal and not the other person’s.

Wrapping it Up

Look, if you accept content from guest bloggers and you only allow Original Content, I’m not asking you to change and I’m not saying that you’re wrong. I’m just asking you to consider what your real goals are and the best ways to accomplish them.

Does “original” by itself really accomplish quality for your website or for your loyal readers?

It’s the intent of the person – in this case the Author – that makes the content crap or quality, not whether the content is Syndicated or Original.

And sharing content or guest blogging is all about building relationships, first. If everyone believed that, our work lives would be easier and a whole lot more enjoyable. 🙂

Thanks for your time reading. I’d love to hear your thoughts.




Is SEO Going to be Costlier in Coming Years?

Search engine dramas have left us with lots of ifs and buts. The Caffeine update, followed by the recent algorithm change, is only the start; a lot more is still waiting for webmasters, and eventually website owners, to discover.

It is time to think ahead. What is in store for SEO as a booming service industry?

Quick questions that came to my mind:

I was just wondering whether the SEO industry will retain its position and importance in the coming days, or whether people will take it for granted.  With any number of SEO professionals (the majority from Asian countries) available at a cheaper cost, will the industry lose its value and appeal?  It seems anyone with the slightest bit of SEO knowledge can give you decent results.

Has it become the playground of unscrupulous freelancers and spammers, forcing Google to come up with update after update?

What does the situation indicate?

Will SEO be the cheapest service in the coming days? Will investing in it become child’s play?

Or,

Is SEO going to get costlier as the days pass?  What is tomorrow’s catch line?

What is the future of SEO, as it seems today?  We need to consider it from both the SEO professional’s and the business owner’s points of view.

1. Think from the SEO professional’s point of view:

  • Will corporations ever treat SEO as an established technology field?  Is it going to be taken seriously?
  • Will online giants recognize SEO as a full-fledged operation? Will they allocate a separate budget for it, like other advertising expenses?
  • Will cheap, low-quality SEO service providers reign over the market and neutralize the overall good effect?

Finally, will professionals suffer in their pay packages?  Since the number of providers (do not ask me about their honesty) is rising, there is a chance website owners will say “yes” to cheap proposals.

The bottom line: how safe is a good, ethical SEO consultant’s position in terms of job satisfaction and value?

Lesson:

Do not get pessimistic! Google is watching everything.  No search engine will allow people to fill the web with garbage and serve it to the public. After all, we are all here to do business, and at some point we have to be ethical with our customers.

2. Think from the business point of view:

For businesses, tougher days are yet to come. If you are already tempted by unsolicited emails and cold calls and are willing to go for their low-cost SEO options, here are a few simple questions for you:

  • Are you still expecting someone to arrange a pool of inbound links so your site will rank well?
  • Do you still believe that paid inclusion and traditional link building activities (quick and easy options, admittedly) will survive gracefully, say for another 5 years, so you can rest easy?
  • How viable are the SEO techniques your business is currently following?
  • Is SEO going to be so affordable that anyone can have it done for his website and the results will simply follow?
  • Do you act on gut feelings?

Lesson:

Businesses that think SEO has almost become a layman’s job that can be done in bulk on the cheap will regret it very soon.  Cheap SEO services are not going to last long.  Moreover, they will not add any value to your website’s ranking or traffic generation; forget about conversions or building a long-term reputation.

You may ask why. Are there any definite reasons, or is this judgment just speculation?

Here are the likely reasons:

  • Stereotyped work like traditional link building, directory submissions, article syndication, and other automated link generation processes is losing importance with search engines
  • Content farms are no longer going to boost your PR value or bring you huge traffic
  • Nothing artificial or duplicated will get value from users
  • Customer and viewer ratings will be prioritized in deciding whether your site gets to the top of the SERPs
  • Competition for keywords, PR allocation, and traffic is going to get tougher

Forecasting solution:

  • Real marketing knowledge needs to be applied along with SEO expertise.
  • Offline marketing concepts should be integrated with the latest SEO knowledge. Managing and tracking consumer behaviour will gain importance.
  • Local SEO services will earn more credit and demand, which requires comprehensive knowledge of the local market situation.

Conclusion:

The final effect would be much like this:

  • Many cheap SEO consultants will perish. Competing at this level of quality will simply not be possible for them; they cannot afford to deliver high-end technical and marketing solutions at rock-bottom prices.
  • SEO as an industry is going to earn a distinguished place in the IT services domain, along with much recognition and respect.
  • Skilled and talented SEO professionals will have more power and market demand. Their jobs will be secure, but challenging.
  • The number of service providers will shrink as search engines get stricter, because quality comes at a price.
  • Big-time players with good infrastructure, resources, and advanced technical support will survive; cheap providers will fade away gradually.

The future of SEO has never been predictable and never will be. However, with all this fine-tuning, we can hope SEO will upgrade itself very soon. Are you ready for it?




AllTheWeb.com To Close April 4, 2011

Yahoo! will be closing AlltheWeb on April 4, 2011, as we focus on other features to improve your search experience. Starting on April 4, 2011, www.alltheweb.com will redirect to Yahoo! Search at search.yahoo.com. Thanks for your understanding.


Baidu Deletes Almost 2.8 Million Files Of Online Works Over Copyright

Chinese search engine Baidu has deleted 2.8m works from its online library, Baidu Wenku, in an attempt to settle a copyright dispute with writers. “By Tuesday afternoon we had removed almost 2.8 million files, mainly from the Literary Works section of the site, which was the primary concern of the writers and publishers.”

Microsoft Files Complaint With EU Commission Over Google's Dominance

As troubling as the situation is in the United States, it is worse in Europe. That is why our filing today focuses on a pattern of actions that Google has taken to entrench its dominance in the markets for online search and search advertising to the detriment of European consumers. How does it do this? Google has built its business on indexing and displaying snippets of other organizations’ Web content. It understands as well as anyone that search engines depend upon the openness of the Web in order to function properly, and it’s quick to complain when others undermine this. Unfortunately, Google has engaged in a broadening pattern of walling off access to content and data that competitors need to provide search results to consumers and to attract advertisers.

Twitter Upgrades Embedded Tweets


Twitter has updated its developer tools, making embedded tweets more interactive and functional. The new tweets allow users to reply, retweet or favorite a tweet directly from its embedded version.

Twitter introduced embeddable tweets last year — and while the end result has been quite effective, the set-up process involved in actually embedding tweets is more trouble than it’s worth. Fortunately, plugins like Blackbird Pie for WordPress have made the process less cumbersome.

The new functionality of embedded tweets comes courtesy of a developer tool called Web Intents. Users must first insert a script on a page that will use the intent. Those that already use the Tweet button on their websites will be able to start using Web Intents right away.

The integration process is still surprisingly cumbersome — especially for users who just want to easily and quickly embed a tweet. But the code itself looks a lot cleaner. Already, WordPress.com users can take advantage of Web Intents-powered embedded tweets. We imagine that the WordPress.org version of that plugin will be updated soon.

There are some cool things about Web Intents. Not only can content creators embed a tweet on their website, they can also embed a pre-filled Twitter message window. Web Intents are mobile-friendly and work with both iOS and Android, which is a nice touch.
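Under the hood, Web Intents are plain URLs on twitter.com/intent, so a pre-filled tweet window is just a link built from query parameters (the text and URL here are made up):

    from urllib.parse import urlencode

    params = {
        "text": "Twitter's embedded tweets just got more interactive",
        "url": "http://example.com/my-post",
    }
    print("https://twitter.com/intent/tweet?" + urlencode(params))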

The fact that users can send a tweet directly from a webpage or retweet messages without having to use a third-party program or extension could make for some interesting possibilities — at least for web developers and app makers that want to add more seamless social ability to their sites.

Developers, what do you think of the new Twitter Web Intents? Let us know in the comments.

[via ReadWriteWeb]

