Saturday, April 24, 2010

Google Marks the 20th Anniversary of the Hubble Space Telescope Launch by NASA

Google has shown once again that it is more than just a search engine. Google has always made a variety of efforts that, at their core, aim to spread the substance of information to the world. One example: on 22 April 2010, Google changed its homepage with an Earth Day-themed logo.

And today, 24 April 2010, marks the 20th anniversary of the launch of the Hubble Space Telescope by NASA. So what is Google's part in it? As usual, Google has changed its homepage logo, this time with the theme: 20th Anniversary of Hubble Space Telescope Launch by NASA.

Google is also inviting users to take a tour of the universe with the Hubble imagery in Google Earth.

So what does this mean, especially for Google users? When visitors land on the Google homepage, quite possibly they do not even know what the Hubble Telescope is. Google reaches out to them by dressing its logo with an image of a telescope floating in space, which is obviously very eye-catching for Google users.

That image encourages them to click, and behind the logo lies a search for the query "Hubble Telescope". When they click, information about everything related to the Hubble Telescope opens up to them. Simply put, "no idea" turns into "know". That, really, is the substance of information that Google is trying to convey.

The second effort is a link that offers to take a tour of the universe with the Hubble imagery in Google Earth. This feature is aimed more at Google users who already know about NASA's Hubble Space Telescope.

And the features mentioned above are not all that Google offers: users are also invited to take a tour through Google Sky from the Hubble anniversary homepage.

Three great efforts from Google.

Saturday, April 17, 2010

Exploring Google Search Can Be a Disadvantage for Website Owners

There are two important variables in Google Search: Information and the Website. On the search results page, these two variables are presented through four features, namely: Title, Snippet, URL, and Cached link.

Of those four features, the most useful one for users is the Snippet: a description of, or an excerpt from, the webpage. The snippet is what persuades users to click through or not, and that makes it a vital feature of the Information variable. The website owner can shape the snippet by setting the site's title and description for search results with meta tags.
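As an illustration, the title and description are declared in the page's HTML head roughly like this (the title and description text here are only made-up examples):

  <head>
    <title>How to Play Football - Example Blog</title>
    <meta name="description" content="A beginner's guide to playing football, from the basic rules to simple practice drills.">
  </head>

Google may then use the content of the description meta tag as the snippet on its search results page.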

To find more complete information, users then click the link on the Title feature (known as a clickthrough). That is the search activity on Google which matches what website owners expect.

However, for searches that trigger Google's special features, such as unit conversion, the dictionary, and others, the opposite happens, and it is most probably not what the website owner expects: users already get the information they are looking for from the snippet shown on the search results page, so they have no need to click through to a website.

On the one hand, this is an achievement for Google, delivering information to users that is well-measured, quick, and relevant. On the other hand, it is a disadvantage for the website owner.

See the illustrations in the images below:

Dictionary Search on Google


Unit Conversion Search on Google

Friday, April 16, 2010

Why Does "Restricted by robots.txt" Always Appear in Blogger's Blogspot?

Please note that Googlebot (Google's crawling robot, also known as a spider) has to crawl billions of pages on the web. With the technology and resources Google has, that job is very likely to get done, fulfilling Google's mission, which in this case is carried out by Googlebot.

That does not mean, however, that billions of pages on the web can all be indexed by Google in a short time. Googlebot needs efficiency and effectiveness when crawling, so it will not crawl the same page under different URL addresses. For example,

http://yourblogname.blogspot.com/search/label/firstlabel might point to the same content as http://yourblogname.blogspot.com/search/label/secondlabel.

This explanation is actually the answer to a question that Blogspot bloggers often ask: why does "Restricted by robots.txt" always appear in Blogger's Blogspot? (in Webmaster Tools - Diagnostics - Crawl errors report).

Blogger is a blogging platform with the advantage of being indexed in Google automatically. But unlike a self-hosted WordPress blog, a Blogger or Blogspot blog is hosted by a second party, not by you. Blogger also provides a way to classify posts into various topics through a feature called Labels.

By using Labels, a Blogspot blogger can categorize or classify each post under a separate topic. However, if a post does not fit a single specific topic, it is very likely to be given more than one topic or label. What happens next is that two or more different directories or URL addresses lead to the same post page.

To handle this issue, Blogger by default (automatically) applies robots.txt restrictions to the label addresses of post pages that have more than one label or topic.

This restriction then shows up in the Crawl errors report in Google Webmaster Tools. In fact, it is not a problem that a Blogspot blogger needs to worry about: the post pages will still be indexed by Google. It is just that Googlebot requires more time to crawl post pages with more than one label, because Googlebot first has to identify which addresses are restricted by the robots.txt file.

Default Restrictions or Robots.txt for Blogger's Blogspot
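At the time of writing, the default robots.txt that Blogspot serves for a blog looks roughly like the following (the exact contents may differ slightly from blog to blog; yourblogname is just a placeholder):

  User-agent: *
  Disallow: /search

It is this Disallow: /search rule that covers every address under /search, including all of the /search/label/... pages, which is why label addresses are the ones reported as restricted.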

For example:

You have a blog with a post titled How to Play Football. It is very likely that this post will have two topics or labels, namely Playing Football (firstlabel) and Football (secondlabel). The result is two URL addresses, just like below:
  1. http://yourblogname.blogspot.com/search/label/firstlabel
  2. http://yourblogname.blogspot.com/search/label/secondlabel
So it is clear that the two URL addresses above lead to the same post page, and Blogger's robots.txt restricts these label addresses so that Googlebot does not crawl the same post through more than one of them.
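If you want to check this for yourself, Python's standard library includes a robots.txt parser. Here is a minimal sketch (yourblogname and the post address are placeholders, and the results assume the default Disallow: /search rule shown above):

  from urllib.robotparser import RobotFileParser

  # Download and parse the blog's robots.txt file
  parser = RobotFileParser("http://yourblogname.blogspot.com/robots.txt")
  parser.read()

  # A label address falls under "Disallow: /search", so this prints False
  print(parser.can_fetch("*", "http://yourblogname.blogspot.com/search/label/firstlabel"))

  # An ordinary post page is not restricted, so this prints True
  print(parser.can_fetch("*", "http://yourblogname.blogspot.com/2010/04/how-to-play-football.html"))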

Benefits and Detriments of Blog Submission to Blog Social Networks

Blogging is an activity done in a dimension that is totally limitless. Bloggers are drawn into a dimension that feels like a jungle, so they often do not know exactly what they will do, are doing, or have already done on the world wide web. Simply put, bloggers end up trapped in the world wide web. On the other hand, blogging is an exciting activity that brings new friends and new experiences.

One thing bloggers often do is register their blogs with websites known as blog directories or blog social networks (e.g. BlogCatalog, MyBlogLog, and more). Although it feels like the right thing to do, it can actually slightly weaken the existence of the blogger's own blog.

There are two sides to this: one is a benefit, and the other is a loss or detriment for bloggers.

Benefit for bloggers: by registering their blogs with a blog social network, their blogs become better known to other bloggers from around the world. Their blogs also become easier for search engine robots to find and crawl (through the link), but with one note: the blog social network has to be a DoFollow site (just check whether it is DoFollow). Usually, bloggers are asked to enter their blog's feed URL (RSS URL), so updates happen automatically. The blog social network then acts as an upstream site for the blogger's blog.
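The difference lies in the link markup: a NoFollow link carries a rel="nofollow" attribute, which asks search engine robots not to follow the link or pass credit through it, while a DoFollow link is simply an ordinary link without that attribute. Roughly (the URL here is just a placeholder):

  <a href="http://yourblogname.blogspot.com/" rel="nofollow">My Blog</a>  <!-- NoFollow -->
  <a href="http://yourblogname.blogspot.com/">My Blog</a>  <!-- DoFollow -->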

A blog social network, though, is usually a big site that is already highly recognized by search engine robots, so of course it gets indexed more quickly than the blogger's own blog (see the attached image; click to enlarge).

The true blog URL does not even appear on the SERP

Detriment for bloggers: because the blog social network is indexed more quickly, it is very likely to appear with a higher rank than the blogger's blog on the SERP (search engine results page). This encourages users to click through to the blog social network instead, and that is what weakens the existence of a blogger's blog on the world wide web.

The most fitting ways for bloggers (especially beginners or newbies) to avoid this include, among other things:
  1. Focus on making unique and original content as much as possible,
  2. Use Google Webmaster Tools regularly,
  3. And always check the blog's progress with Google Analytics.
For newbie bloggers (and especially for Blogspot bloggers): think twice before registering your blog with blog social networks. Just focus on the three things mentioned above for about three months at most, and the blog will earn a steady place on the world wide web. Pay attention to what is long-term, and avoid the short-term.