Monday, July 29, 2013

Infographics and their use in SEO


An infographic is a graphical representation of information, data or knowledge. Infographics offer a quick and easy way to learn about a subject without a ton of heavy reading, and they are widely used today to present complex material such as signs, maps and technical texts with ease and efficiency.

Why use infographics for SEO?

  • Large amounts of data can be easily represented in an attractive way.
  • They leave a strong impression on users, making them more likely to revisit and recommend the site.
  • Infographics are more likely to be shared than plain text.
  • Publishers that use infographics grow their traffic an average of 12% more than those that do not.

Benefits of Using infographics for SEO

Infographics play an important role in SEO: they give readers a richer experience and also contribute indirectly to better rankings for your web pages. Indirectly, because search engines cannot read graphics and videos.

  • Using infographics is a good strategy, since they are not affected by search algorithm updates.
  • The dazzling images attract the attention of social media users and stimulate the intention to share more content.
  • Hosting your infographic in image directories with a link back to your website is another great example of the benefits of using infographics.
But too many graphics that consume large amounts of bandwidth will negatively impact your website's search engine ranking. Do not forget that when it comes to graphics, search engines are effectively blind to the content.

Friday, July 19, 2013

What are Sitemaps, Robots.txt and RSS?


Sitemap: 

A Sitemap is a file that lists the URLs of a site. The Sitemaps protocol allows a webmaster to inform search engines about URLs on a website that are available for crawling. It also lets webmasters include more information about each URL, such as when it was last updated and how important it is relative to other URLs on the site. Sitemap files are limited to 50,000 URLs and 10 megabytes per file. Google first introduced Sitemaps 0.84 in June 2005, and in November 2006 Google, MSN and Yahoo announced joint support for the Sitemaps protocol.
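A minimal Sitemap following this protocol might look like the sketch below. The URL, date and priority values are placeholders for illustration, not real entries:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- The page's address (required) -->
    <loc>http://www.example.com/</loc>
    <!-- Optional hints for crawlers -->
    <lastmod>2013-07-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

The file is typically saved as sitemap.xml in the site root and submitted to search engines through their webmaster tools.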

RSS: 

RSS (Rich Site Summary) is a format that provides information about regularly changing web content such as the latest news headlines, blog entries, audio and video. An RSS document includes summarized text and metadata such as publishing dates and authorship. It allows you to easily stay informed by retrieving the latest content from the sites you follow. RSS feeds are read using software called an RSS reader, feed reader or aggregator, which collects feeds from various sites and displays them for you to read and use.
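A bare-bones RSS 2.0 document has a channel containing one or more items; each item carries the summarized text and metadata described above. The titles and URLs here are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>Example Blog</title>
    <link>http://www.example.com/</link>
    <description>Latest posts from Example Blog</description>
    <item>
      <title>A sample post</title>
      <link>http://www.example.com/sample-post</link>
      <!-- Metadata: publication date and author -->
      <pubDate>Fri, 19 Jul 2013 00:00:00 GMT</pubDate>
      <author>editor@example.com</author>
      <description>A short summary of the post.</description>
    </item>
  </channel>
</rss>
```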

Robots.txt File:

Robots.txt is a convention for preventing cooperating web crawlers and other web robots from accessing all or part of a website that is otherwise publicly viewable. Website owners use the robots.txt file to give instructions about their site to web robots.
It works like this: if a robot wants to visit a website URL, say http://www.site.com, it first checks for http://www.site.com/robots.txt and finds:

User-agent: *
Disallow: /

The "User-agent: *" means this section applies to all robots. The "Disallow: /" tells the robot to not visit the site pages.
There are two important considerations when using robots.txt:
  • Robots can ignore your robots.txt. In particular, malware robots that scan the web for security vulnerabilities, and email address harvesters used by spammers, will pay no attention to it.
  • The robots.txt file is publicly accessible. Anyone can see which sections of your server you do not want robots to crawl.
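You can check how a well-behaved crawler would interpret these rules with Python's standard-library robots.txt parser. This sketch parses the example rules from a string, so nothing is fetched over the network; the URLs are the illustrative ones from the text:

```python
from urllib.robotparser import RobotFileParser

# The example rules from the text: block all robots from the whole site.
rules = """User-agent: *
Disallow: /""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# A cooperating crawler is disallowed from every URL on the site.
print(parser.can_fetch("*", "http://www.site.com/page.html"))  # False
print(parser.can_fetch("Googlebot", "http://www.site.com/"))   # False
```

This only models cooperating robots; as noted above, malicious crawlers simply ignore the file.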