Search Engine Optimization

This guide can help you with your SEO problems to a degree. I came across a lot of new material today and thought it would be worth sharing with the wider world of web developers.

Among the numerous tools I came across today were:

  • SeoSiteCheckup
  • AddThis
  • compressnow

All are damn good for almost any SEO analyst or professional. I worked through over two dozen SEO techniques today, and all of them were crucial. Here is an overview of the methods I dealt with:

1) Keywords – The meta keywords tag lets you provide extra text for search engines to index along with the rest of your page’s content. Meta keywords can emphasize a specific word or phrase from the main body of your text.
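As a quick sketch of what that looks like, the tag sits inside the page’s <head>; the keyword values here are made up for illustration:

  <meta name="keywords" content="handmade leather wallets, leather goods, mens wallets">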

2) Most Common Keywords Test – Check the most frequent keywords and how often they are used in your web page. HOW TO FIX: In order to pass this test, you have to optimize the density of the major keywords displayed above. If the density of a particular keyword is under 2% you should increase it, and if the density is over 4% you should decrease it.
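As a hypothetical worked example: if a keyword appears 8 times in a 500-word page, its density is 8 / 500 × 100 = 1.6%, which is below the 2% threshold above, so you would work that keyword (or close variants) into the copy a few more times.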

3) Keyword Usage – This checks whether your most frequent keywords are used in your title, meta description and meta keywords tags. HOW TO FIX: Adjust the content of these tags so that they incorporate some of the primary keywords shown above.
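A minimal sketch of those three tags, using a hypothetical page about handmade leather wallets (the title, description and keywords are illustrative only):

  <head>
    <title>Handmade Leather Wallets | Example Shop</title>
    <meta name="description" content="Handmade leather wallets crafted in small batches and shipped worldwide.">
    <meta name="keywords" content="handmade leather wallets, leather goods">
  </head>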

4) Headings Status – This indicates whether any H1 headings are used in your page. H1 headings are HTML tags that can help highlight important topics and keywords within a page. HOW TO FIX: In order to pass this test you need to identify the most important topics on the page and place them between <h1> and </h1> tags, as in the sketch below. The same check also looks at H2 headings: it indicates whether any H2 headings are employed on your page, and H2 headings can be helpful for describing the sub-topics of a page.
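A sketch of what that might look like, with placeholder topic text:

  <h1>Important topic goes here</h1>
  ...
  <h2>Another, more specific sub-topic</h2>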

5) Robots.txt Test – Search engines send out small programs called spiders or robots to crawl your site and bring back information, so that your pages can be indexed in the search results and found by internet users. If there are files and directories that you do not want indexed by search engines, you can use the “robots.txt” file to specify where the robots should not go. This is a very simple text file placed in the root folder of your site; a minimal example is sketched below. There are two important considerations when using “robots.txt”: the “robots.txt” file is publicly available, so anybody can see what sections of your server you don’t want robots to visit; and robots can ignore your “robots.txt”, especially malware robots that scan the web for security vulnerabilities.
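A minimal robots.txt sketch, assuming you want to keep all crawlers out of a hypothetical /private/ directory:

  User-agent: *
  Disallow: /private/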

6) Sitemap Test – Sitemaps are an easy way for webmasters to inform search engines about pages on their websites that are available for crawling. In its simplest form, a sitemap is an XML file that lists URLs for a website along with additional metadata about every URL (when it was last updated, how often it usually changes, and how important it is relative to other URLs in the site) so that search engines can more intelligently crawl the website.
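A one-URL sitemap sketch; the domain, date and values are placeholders:

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url>
      <loc>https://www.example.com/</loc>
      <lastmod>2018-01-01</lastmod>
      <changefreq>monthly</changefreq>
      <priority>0.8</priority>
    </url>
  </urlset>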

7) Favicon Test – Favicons are small icons that appear in your browser’s URL navigation bar. They are also saved alongside a URL’s title when bookmarking that page. They help brand your website and make it easy for users to pick out your site in a list of bookmarks. HOW TO FIX: To add a favicon to your website, you need to have your logo saved as a 16×16 PNG, GIF or ICO image and uploaded to your web server. Then it is merely a matter of adding a line like the one sketched below to the <head> of your web pages. In the sketch, “url_to_my_favicon” refers to the actual location of your favicon file.
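A sketch of that line, keeping the placeholder name used above for the icon’s location:

  <link rel="icon" href="url_to_my_favicon">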

8) Code To Text Ratio – Check your page’s source code to assess the size of the text content compared to the structure (HTML code). This percentage is not a direct ranking factor for search engines, but other factors that do matter depend on it, such as site loading speed and user experience. HOW TO FIX: In order to pass this test you must increase your text to HTML code ratio. Here are some techniques:

  • move all inline styling rules into an external CSS file
  • move your JavaScript code into an external JS file
  • use CSS layout rather than HTML tables
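A small before/after sketch of the first technique; the file name styles.css and the class name intro are hypothetical:

  <!-- before: styling is repeated inline on every element -->
  <p style="color: #333; font-size: 16px; margin: 10px 0;">Welcome to the site.</p>

  <!-- after: the rule lives once in styles.css (e.g. .intro { color: #333; font-size: 16px; margin: 10px 0; }) -->
  <link rel="stylesheet" href="styles.css">
  <p class="intro">Welcome to the site.</p>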

9) URL SEO Friendly Test – Check if your website URL and all in-page links are SEO friendly.
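As a rough, hypothetical illustration, an SEO-friendly URL is short, readable and built from keywords rather than opaque query parameters:

  not SEO friendly:  https://www.example.com/index.php?id=482&cat=7
  SEO friendly:      https://www.example.com/wallets/handmade-leather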

10) Broken Links Test – Check your site for broken links.

11) Google Analytics Test – Check if your site is connected to Google Analytics. HOW TO FIX: In order to pass this test you need to create an account on the Google Analytics site and insert into your page a small JavaScript tracking code, as sketched below. Note that you have to replace the ‘UA-XXXX-Y’ with the proper id that you’ll find in your analytics account. To know more, follow this link: mootools.net
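A sketch of one form of that code, based on the alternative async analytics.js snippet; ‘UA-XXXX-Y’ is kept as a placeholder and must be replaced with your own property id:

  <!-- Google Analytics (async variant), placed just before </head> -->
  <script>
    window.ga = window.ga || function () { (ga.q = ga.q || []).push(arguments); };
    ga.l = +new Date();
    ga('create', 'UA-XXXX-Y', 'auto');   // your property id goes here
    ga('send', 'pageview');
  </script>
  <script async src="https://www.google-analytics.com/analytics.js"></script>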

12) Underscores in Links Test – Check your URL and in-page URLs for underscore characters. Search engines treat hyphens in URLs as word separators, but they do not reliably treat underscores the same way, so prefer something like my-page over my_page.

13) Google Page Rank – Google PageRank (PR) is a score from 0 to 10 determined by a proprietary mathematical formula which counts every link to a site as a vote. Essentially, your website is put up against every other website with similar content and keywords in a popularity contest. Therefore, it is important for your site to obtain backlinks from other sites, which essentially give a vote of confidence to your site. With your website already cleaned up and shining with outstanding quality content and internal SEO techniques (keywords, meta tags, etc.), you need a strategy to get your site noticed by others. Important pages receive a higher PageRank and are therefore more likely to appear at the top of the search results.

14) Alexa Page Rank Test – Check the Alexa Rank for your site. Alexa Rank measures the traffic of your domain name and is determined by the web information company Alexa. This provider ranks sites based on the amount of traffic (over a period of 3 months) recorded from users who have the Alexa Toolbar installed. The lower your Alexa score, the better: if you have a ranking under 100,000, your site should be generating some fantastic traffic. The traffic rank is based on the popularity of your site (how many users visit your website and how many pages from your site are viewed by those users). HOW TO FIX: Some best practices to boost your Alexa Rank are listed below:

  • The most important thing is the content: write qualitative and useful content
  • Regularly publish new and unique content
  • Increase the traffic to your own website
  • Generate quality backlinks to your site
  • Connect to social media sites
  • Install the Alexa Toolbar in your browser and the Alexa Rank Widget on your webpage

15) Image Alt Test – Check all images on the page for alt attributes. If an image can’t be displayed (wrong src, slow connection, etc.), the alt attribute provides the alternative information. Using keywords and human-readable captions in the alt attributes is good SEO practice, because search engines cannot really see the images. For images with a purely decorative function (bullets, round corners, etc.) you should use an empty alt attribute or a CSS background image. An image with alternate text specified is inserted using an HTML line like the one sketched below. Remember that the point of alt text is to provide the same functional information that a visual user would see.
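A sketch of such a line; the file name and caption are placeholders:

  <img src="wallet-photo.jpg" alt="Handmade brown leather wallet, front view">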