Friday, March 16, 2012

Matt Cutts: Over Optimized Websites Will Get Penalized In A Month Or Two

Danny Sullivan from Search Engine Land, Matt Cutts from Google and Duane Forrester from Bing held a very popular panel at SXSW. According to Matt Cutts, over-optimized websites will get penalized in a month or two. Listen to the audio of the SXSW panel, titled Dear Google & Bing: Help Me Rank Better, below.

Friday, November 4, 2011

Google's New Algorithm Update Impacts 35% Of Searches

Google announced today its latest search algorithm update, which the company says will impact 35% of Web searches. The change builds on its previous “Caffeine” update to provide users with more up-to-date and relevant search results, specifically in areas where freshness matters. This includes things like recent events, hot topics, current reviews and breaking news items.

According to Google, the new algorithm knows that different types of searches have different freshness needs, and weighs them accordingly. For example, a search for a favorite recipe posted a few years ago may still be popular enough to rank highly, but searches for an unfolding news story or the latest review of the iPhone 4S should bring the newer, fresher content first, followed by older results.

For searches about recent events and news, Google may now show search results towards the top of the page that are only minutes old, the company says. For regularly occurring events, like the Presidential election, the Oscars, a football game, company earnings, etc., Google knows that you’re likely most interested in the most recent event, even if you don’t specify keywords indicating that.

That means a search for “Apple earnings” won’t (in theory) require you to also type in “Q4 2011” in order to see the latest information. It will be implied that you meant the latest quarter, without the need for the extra text.

For items that see regular updates, like consumer electronics reviews, reviews of a particular kind of car, etc., Google will also feature the most current and up-to-date information above the rest.

This “freshness update” is an extension of what Google began last year with Caffeine, an under-the-hood improvement that, among other things, helps Google index content more quickly, so results are more real-time. This year, Google also rolled out its Panda update, which was meant to decrease the rankings of so-called “content farms” – SEO-optimized entities that critics said filled Google search results with low-quality pages.

Now, it’s clear that Google understands that the most relevant search result is more often the one that’s relevant now – the one that’s bringing you new information. The update’s impact on Google Search is fairly substantial, with Google claiming that roughly 35% of search results will be affected by the changes.

Google used to have a search vertical specifically for the most recent updates at www.google.com/realtime, where it was indexing Twitter updates. However, when the contract with Twitter expired, Google shuttered the site (it now redirects to the Google homepage). Google said at the time that it planned to re-open the site with Google+ search results alongside other realtime sources of information. But with the new Google search update, a specific vertical for realtime information feels less necessary.

Friday, July 1, 2011

Matt Cutts: Try something new for 30 days

Watch Matt Cutts as he talks about trying something new for 30 days. The video was originally posted at ted.com.


Monday, May 30, 2011

Twitter for Better SEO?

Did you know that links tweeted on Twitter matter for SEO? When determining rankings in their search results, Google and Bing include social signals - namely, links that get tweeted on Twitter. Danny Sullivan confirmed this in his December 1, 2010 post on Search Engine Land, and it was likely a factor for a while before that.

To put this new signal in perspective, Google uses hundreds of signals to determine how it should rank a website. These include inbound links to the site, the title tag of a web page, and site speed.
Getting people to link to your site is really all about having great content that people want to share, whether on their blogs or websites, or on Twitter. As Google and other search engines increasingly take note of social activity and the links shared on sites like Twitter, having a good social media presence will become increasingly important for ranking well in search results.

Many companies have been employing social media as a part of their marketing strategy, and for good reason. Now that social activity has so much impact on search engine optimization (SEO), companies that take SEO seriously know they must use social media as part of their strategy for getting onto the first page of search results.

Note for those who are not familiar with Twitter: "tweets" are the 140-character (or less) messages that people post on Twitter. Twitter offers help for new users on its site.

So, How Can I Use Twitter To Help My SEO?

The ways of search engines are puzzling, and people are always trying to figure out which specific tactics will help more than others. But just as we know that other ranking factors are considered in light of giving searchers the best information for their queries, you can bet that search engines will elevate the best content on the social networks - especially the content that's shared by real people who have influence.
Based on case studies, the more quantity and quality of tweets that link to your website, the more of a lift you can expect to see in your search engine rankings for the linked-to page or pages.

  • Mind the Text - When you tweet a link, it's likely that search engines use the text you enter to determine what your link is about. It's very similar to the way that search engines regard anchor text on web pages - the text on which a link is built tells the engines what the linked page is about. This in turn can help the linked page rank better for the keywords contained in the anchor text.
  • Who Says? - Who links to you on Twitter matters. You probably know already that it's more beneficial if influential tweeple - "people" in Twitter-speak - tweet about you, or retweet your tweets, because they will reach a wider audience. The same is true for the SEO value of Twitter. Google and Bing both say they look at the author's authority or quality when evaluating links that appear in tweets.
The search engines are mum on how they determine author quality, but here are some indicators of authority that SEO experts think search engines consider:

  •  Presence of an avatar or portrait. Spam accounts often don't have one.
  • Has the account been verified? Did the person confirm their email address? (People can't see this, but Twitter has this information, and the search engines may be able to get it.)
  • More followers.
  • Quality followers. (This means people who follow someone for a good reason - NOT purchased followers!)
  • Ratio of following to followers.
  • It may be better if the URL in someone's profile doesn't match the domain they're tweeting about, because then it's more likely the person isn't engaging in self-promotion.
  • Twitter handles that don't have numbers. (Many spam accounts on Twitter have user names like Name8765.)
  • A bio with complete information. 
  • Engagement. (Accounts that don't ever reply to other people certainly seem spam-y to me.)
  • Included in lists created by quality tweeple.
  • The PageRank of a Twitter profile.
Think of it this way: who would you rather have link to your website?

The idea of author quality is much like PageRank for web pages. A hundred links, each from a different page with a PageRank of 0, probably provide less SEO value than a single link from a web page with a high PageRank. Likewise, a link tweeted by a respected and well-followed person on Twitter will be worth more - both for your reputation and your SEO - than 100 tweets from spam-y bot accounts.
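To make the analogy concrete, here is a toy PageRank power iteration in Python. The graph, page names, and damping factor are all illustrative assumptions - a sketch of the idea, not Google's actual algorithm:

```python
def pagerank(links, iterations=50, d=0.85):
    """Toy PageRank: `links` maps each page to the pages it links to.
    Pages with no outlinks simply leak rank here, which is fine for
    illustration but not how production implementations handle it."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # Every page keeps a small baseline, plus shares of its inlinkers' rank.
        new = {p: (1 - d) / n for p in pages}
        for p, outs in links.items():
            for q in outs:
                new[q] += d * rank[p] / len(outs)
        rank = new
    return rank

# Ten pages endorse "authority", which links to "strong_target";
# five no-name pages link straight to "weak_target".
graph = {f"hub{i}": ["authority"] for i in range(10)}
graph.update({f"spam{i}": ["weak_target"] for i in range(5)})
graph["authority"] = ["strong_target"]
graph["strong_target"] = []
graph["weak_target"] = []

ranks = pagerank(graph)
# One link from a high-rank page beats five links from no-name pages.
assert ranks["strong_target"] > ranks["weak_target"]
```

Swap in your own link graph and the same pattern holds: who links to you matters more than how many do.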

Something to keep in mind is that using bots or cheap labor to create a ton of Twitter accounts and tweet links to your site would be nothing but a spam-y waste of time and money. You won't get any SEO value, and you could be identified as a cause of Twitter spam.

If you notice a spam-y Twitter account, click "report [username] for spam".

What Can I Do To Encourage Tweets and Links?

  1. This should be pretty obvious - I hope. Create great content that people will want to share.
  2. Make it easy for people to tweet and share your content. Consider including a Twitter button, a call to action, or some simple way for people to share a link to your website.
  3. Engage with your followers and attract new, quality ones. See our Twitter Marketing 101 article for guidance.
  4. Keep tabs on who has mentioned or linked to you and thank them. You can also ask them to link to the newest thing you've created.
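Point 2 above can be as simple as a pre-filled tweet link built on Twitter's web-intent endpoint. A minimal sketch in Python - the page URL and tweet text here are placeholders, not real pages:

```python
from urllib.parse import urlencode

def tweet_share_link(page_url, text):
    """Build a Twitter web-intent URL that opens a pre-filled
    tweet linking to the given page."""
    query = urlencode({"text": text, "url": page_url})
    return "https://twitter.com/intent/tweet?" + query

# Drop the result into an <a href="..."> "Tweet this" button on your page.
link = tweet_share_link("http://example.com/article", "Worth a read:")
print(link)
```

Generating the link server-side like this keeps your pages free of third-party widget scripts while still giving visitors a one-click way to tweet your content.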

Friday, May 6, 2011

Google Algorithm Update – Is Bounce Rate a Ranking Signal?

Is bounce rate now considered part of Google's ranking criteria? After the release of the data about the Panda winners and losers in the UK, SearchMetrics said:

“It seems that all the loser sites are sites with a high bounce rate and a less time on site ratio. Price comparison sites are nothing more than a search engine for products. If you click on a product you ‘bounce’ to the merchant. So if you come from Google to ciao.co.uk listing page, than you click on an interesting product with a good price and you leave the page. On Voucher sites it is the same. And on content farms like ehow you read the article and mostly bounce back to Google or you click Adsense.”


“And on the winners are more trusted sources where users browse and look for more information,” the firm added. “Where the time on site is high and the page impressions per visit are also high. Google’s ambition is to give the user the best search experience. That’s why they prefer pages with high trust, good content and sites that showed in the past that users liked them.”

WebmasterWorld Founder Brett Tabke wrote in a recent forum post, discussing what he calls the “Panda metric”, that:

“Highly successful, high referral, low bounce, quality, and historical pages have seen a solid boost with panda.”
Google’s Matt Cutts, in a recent video on ranking in 2011, talks about increasing site speed and how keeping users on your site longer (i.e., not bouncing) can increase your ROI. Speed is a ranking signal. We know that. Speed can reduce bounce rate. Even if Google doesn’t use bounce rate directly, there is a strong relationship here.



Matt McGee at SearchEngineLand said:

“It’s important to note how Google defines Bounce Rate,” he adds, pointing to this definition:

“Bounce rate is the percentage of single-page visits or visits in which the person left your site from the entrance (landing) page. Use this metric to measure visit quality – a high bounce rate generally indicates that site entrance pages aren’t relevant to your visitors. The more compelling your landing pages, the more visitors will stay on your site and convert. You can minimize bounce rates by tailoring landing pages to each keyword and ad that you run. Landing pages should provide the information and services that were promised in the ad copy.”
He also points to how it is defined in Google Analytics:

“The percentage of single page visits resulting from this set of pages or page.”
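Taken together, the definition is simple arithmetic. A quick illustrative sketch in Python - the session data below is made up, not a real Analytics export:

```python
def bounce_rate(session_page_counts):
    """Percentage of visits that viewed only a single page,
    per the Google Analytics definition quoted above.

    `session_page_counts` holds one page-view count per visit."""
    if not session_page_counts:
        return 0.0
    bounces = sum(1 for pages in session_page_counts if pages == 1)
    return 100.0 * bounces / len(session_page_counts)

# 3 of 5 visits saw exactly one page, so the bounce rate is 60%.
print(bounce_rate([1, 4, 1, 2, 1]))
```

As the quotes below note, the number Google shows the public is only part of the picture - Google can also see what happens after the bounce.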

Boykin commented:

“Personally, I don’t think that a single page visit is a bad thing. To me, it tells me the visitor found what they were looking for. Isn’t that what Google would want? If I were Google, I’d want a searcher to find the answer to their search on the exact page they clicked on in a search result…not 1 or 2 clicks in. If I were Google, I’d look more at ‘Who Bounces off that page, and returns to the same Google search, and clicks on someone else, and then never returns to your site,’ but I’m not Google, and that’s just my ‘if I were Google’ thoughts”.
“I think most agree that there’s a ‘Page Score’ or a ‘set of pages score,’ and when that has a bad score, it affects those pages, and somehow ripples up the site,” Boykin adds. “It could quite well be that if you have a page that links out to 100 internal pages, and if 80 of those pages are ‘low quality’ than it just might affect that page as well. A lot of this is hard to prove, but there are some smoking guns that can point in this direction.”

“Bounce rate is important, and yes, many sites that got hit did have a high bounce rate, but comparing this to sites/pages that weren’t hit doesn’t exactly show any ‘ah ha’ moments of ‘hey, if your bounce rate is over 75%, then you got Panda pooped on,’ because the bounce rate Google shows the public is missing many key metrics that they know, but don’t share with us.”

I think the best advice you can follow in relation to all of this is simply to find ways to further improve every page of your website and keep people from leaving your site before they complete the task you want them to complete. That means providing content they want.

What do you think?