Friday, August 10, 2007

Google Pagerank Table

Google PageRank is losing its power more and more, but many people still wonder how many external links are needed to improve a site's PageRank from 3 to 4, or even from 4 to 5. This table helps you understand.
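The table itself is not reproduced here, but the intuition behind tables like it comes from the classic PageRank formula from Google's original paper. A minimal sketch, assuming the commonly cited damping factor of 0.85 (the numbers below are illustrative, not real link data):

```python
def pagerank_gain(incoming_pr, outlink_counts, d=0.85):
    """Estimate the raw PageRank a page receives from a set of inbound links.

    Simplified form of the published formula:
        PR(A) = (1 - d) + d * sum(PR(T) / C(T))
    where each T is a page linking to A, C(T) is the number of outlinks
    on T, and d is the damping factor.
    """
    return (1 - d) + d * sum(pr / c for pr, c in zip(incoming_pr, outlink_counts))

# Ten inbound links from hypothetical PR 1.0 pages, each with 10 outlinks:
print(pagerank_gain([1.0] * 10, [10] * 10))  # -> 1.0
```

Note that toolbar PageRank is widely believed to be roughly logarithmic, which is why each step up (3 to 4, 4 to 5) is thought to require far more raw link value than the one before it.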

Saturday, August 04, 2007

Google's Supplemental Results go to Hell

There is good news: Google announced that they have done away with the "supplemental results" label in the search results on Google.com.

I remember what I shouted about on DP 45 days ago:
Supplemental results are useless to me.
When I search for something, I don't care whether a result is supplemental; I just want what is helpful.
Why does Google tell us a result is supplemental? I can't figure it out.
Now Google has done the right thing by dropping the damned Supplemental Results label. We are just normal searchers and never care whether what we find is supplemental. Cheers!

Sunday, January 14, 2007

Google Update on 2007-1

During the last week, we noticed the PR of some sites changed, but I don't think it was a real PR update, just a small adjustment to the algorithm. Some sites' PR dropped to zero, then rolled back after several days. It's said there were just some changes on certain data centers, maybe just a reboot. :-) I can't see any obvious PR update. Most sites' PR stayed the same; only the PR of one index page rose to 3. I hope a big dance will happen in the next two months. So don't believe in a PR update for 2007/01; it's just a joke.

Here's Matt Cutts's explanation:
A few people were seeing PageRank 0 for their site. There was a small auxiliary push that needed to happen to complement the PageRank push, and that push happened a few hours ago (i.e. Jan 11, 2007). If you were getting stressed, you might want to re-check now. If you never even noticed, well, good for you.

Thursday, December 21, 2006

2006 Search Blogs Nominations

Here is a list of the nominated blogs in each category. Blogs were nominated over the past week by Search Engine Journal readers. A great, massive set of search engine resources. Sorry, no links; you can find each blog's URL by searching for its title, for example: http://www.google.com/search?hl=en&q=SEO+by+the+SEA

SEO Blog of the Year

SEO by the SEA
SEOpedia
SearchRank Blog
SERoundtable
Search Engine Guide
Graywolf’s SEO Blog
SEO Disco
SEO Buzz Box
SEO News Blog
Unofficial SEO Blog
SEO Scoop
StuntDubl
Bruce Clay Blog
SEO Book
Matt Cutts
Sugarrae
SEO Egghead
Traffick
SEOmoz

Search News Blog

Search Engine Watch
SERoundtable
ResourceShelf
Search Engine Guide
Search Engine Lowdown
Search Views
Threadwatch
WebProNews
Pandia
Traffick
ValleyWag
John Battelle’s Search Blog
Daily Search Cast
Googling Google
Google Blogoscoped
Google Operating System
V7N SEM News (Peter Da Vanzo)

Search Marketing / Contextual Ad Blog

JenSense
Make Easy Money with Google & AdSense
ShoeMoney
SERoundtable
Small Business SEM
SEM In House
Search Marketing Gurus
Shimon Shandler
Marketing Pilgrim
ProBlogger.net
eWhisper.net

Best Link Building Blog

StuntDubl
Jim Boykin
Link Building Blog
Text Link Brokers Blog
The Link Spiel

Best Search Agency Resource Blog

SEOmoz
MarketingPilgrim

Best Search Engine Blogger of 2006

Take the questionnaire to find out :)

Best Social Media Optimization Blog

TopRank
Pronet Advertising
Seth Godin’s Blog
MicroPersuasion
Social-Media-Optimization.com

Best SEO Black Hat Blog(gers)

Webguerrilla
David Naylor
G-Man
IrishWonder
Oilman
SEO Black Hat
Biggnuts (Dax)

Best Local Search Blog

Greg Sterling’s Screenwerk
Small Business SEM
Understanding Google Maps & Yahoo Local
Mike the Internet Guy
The Local Onliner

Best Affiliate Marketing Blog

5 Star Affiliate Programs
PepperJam
ReveNews
Super Affiliate
SuperAff Blog
ShoeMoney

Best Web 2.0 Blog

CenterNetworks
Mashable
ReadWrite Web
TechCrunch
Somewhat Frank

*BONUS* Best Search Engine Marketing Community / Forum

Webmaster World
v7n Forums
Digital Point Forums
Search Engine Watch Forums
Web Pro World
Cre8asite Forums
iHelpYou Forums
High Rankings
SitePoint
SEO Chat

Google Tells How to Deftly Deal with Duplicate Content

Google has become more concerned about duplicate content in its search results. To give users a better search experience, they offered some advice to webmasters. These suggestions are good for helping webmasters keep their SERP positions on Google.com.

How can Webmasters proactively address duplicate content issues?

* Block appropriately: Rather than letting our algorithms determine the "best" version of a document, you may wish to help guide us to your preferred version. For instance, if you don't want us to index the printer versions of your site's articles, disallow those directories or make use of regular expressions in your robots.txt file. (I don't think so; a print page is very helpful, and I won't delete it.)
* Use 301s: If you have restructured your site, use 301 redirects ("RedirectPermanent") in your .htaccess file to smartly redirect users, the Googlebot, and other spiders. (No need to do this; updating your sitemap should be OK.)
* Be consistent: Endeavor to keep your internal linking consistent; don't link to /page/ and /page and /page/index.htm. (It's hard to do, especially for a huge site.)
* Use TLDs: To help us serve the most appropriate version of a document, use top level domains whenever possible to handle country-specific content. We're more likely to know that .de indicates Germany-focused content, for instance, than /de or de.example.com. (good idea)
* Syndicate carefully: If you syndicate your content on other sites, make sure they include a link back to the original article on each syndicated article. Even with that, note that we'll always show the (unblocked) version we think is most appropriate for users in each given search, which may or may not be the version you'd prefer. (If someone steals your web page and doesn't link back to you, would that be harmful to you?)
* Use the preferred domain feature of webmaster tools: If other sites link to yours using both the www and non-www version of your URLs, you can let us know which way you prefer your site to be indexed. (What about access via IP address?)
* Minimize boilerplate repetition: For instance, instead of including lengthy copyright text on the bottom of every page, include a very brief summary and then link to a page with more details. (How big is too big? What's the suggested limit? Under 1024 bytes?)
* Avoid publishing stubs: Users don't like seeing "empty" pages, so avoid placeholders where possible. This means not publishing (or at least blocking) pages with zero reviews, no real estate listings, etc., so users (and bots) aren't subjected to a zillion instances of "Below you'll find a superb list of all the great rental opportunities in [insert cityname]..." with no actual listings. (good idea)
* Understand your CMS: Make sure you're familiar with how content is displayed on your Web site, particularly if it includes a blog, a forum, or a related system that often shows the same content in multiple formats. (Sometimes it's reasonable to show the same content in different ways, because different people have different habits.)
* Don't worry be happy: Don't fret too much about sites that scrape (misappropriate and republish) your content. Though annoying, it's highly unlikely that such sites can negatively impact your site's presence in Google. If you do spot a case that's particularly frustrating, you are welcome to file a DMCA request to claim ownership of the content and have us deal with the rogue site. (good idea)
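The first two suggestions above can be sketched concretely. The directory and file names below are hypothetical examples, not part of Google's advice; adjust them to your own site:

```
# robots.txt -- keep printer-friendly copies out of the index
# (assumes the print versions live under /print/)
User-agent: *
Disallow: /print/
```

```
# .htaccess -- 301 ("RedirectPermanent") from an old path to its new home
# (assumes Apache with mod_alias enabled; paths are hypothetical)
Redirect permanent /old-article.htm http://www.example.com/articles/new-article/
```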


Sunday, December 17, 2006

The Top Live Search Queries of 2006

Microsoft released the top searches of 2006 on MSN Search and Live.com. According to Microsoft, these lists are the result of studying anonymous logs for the most-searched queries across all the countries and regions its users search from.

AOL, which uses Google for web search, also released its top searches:

1. weather
2. dictionary
3. dogs
4. American Idol
5. maps
6. cars
7. gamers
8. tattoo
9. horoscopes
10. lyrics

Saturday, December 02, 2006

My Top Google Properties


Looking at an extended list of Google properties and how they rank amongst themselves in market share of visits, we can see which ones are the most useful within Google. It also shows that a great search engine is not just a single search engine but a bundle of search engines; in other words, a giant.

Sunday, November 05, 2006

Yahoo Explains Duplicate Content

Search engines have become more and more intelligent, but some puzzles remain unresolved, such as duplicate content. Anybody who uses search engines knows we can find a lot of duplicate content in the results; some copies even get higher SERP positions than the original content. This really harms the authors who created those helpful and interesting posts, and nobody likes it happening to them. So let's take a look at how search engines try to fix this big bug. William Slawski posted Microsoft Explains Duplicate Content Results Filtering; maybe it can also help us understand how Yahoo tries to avoid duplicate content in its search results. So, if you want search engines to know your posts are original, you'd better follow these guidelines:
1. Do not use dynamic URLs to publish your posts.
2. Submit your posts to search engines as soon as you publish them.
3. Declare the copyright notice on your posts.
4. Simplify your post URLs, preferably with keywords related to your posts.
5. Tell your friends you posted an original and great article.
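Guideline 4 can be sketched with a small helper. This is a hypothetical illustration, not code from any particular blog platform (most platforms generate slugs like this for you):

```python
import re

def slugify(title):
    """Turn a post title into a short, keyword-bearing URL slug.

    A URL like /yahoo-duplicate-content-explained carries the post's
    keywords, unlike a dynamic URL such as /index.php?p=1234.
    """
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # collapse spaces/punctuation to hyphens
    return slug.strip("-")                   # drop leading/trailing hyphens

print(slugify("Yahoo Duplicate Content Explained!"))  # -> yahoo-duplicate-content-explained
```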