Tuesday, November 30, 2010

code.google.com: Controlling Crawling and Indexing

Automated website crawlers are powerful tools that help crawl and index content on the web. As a webmaster, you may wish to guide them towards your helpful content and away from irrelevant content. Described in these documents are the de-facto web-wide standards for controlling crawling and indexing of web-based content. They consist of the robots.txt file to control crawling, as well as the robots meta tag and the X-Robots-Tag HTTP header element to control indexing. The robots.txt standard predates Google and is the accepted method of controlling crawling of a website.

This document describes the current usage of robots.txt web-crawler control directives, as well as indexing directives, as they are used at Google. This information is generally supported by all major web-crawlers and search engines.
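
To make these mechanisms concrete, here is a minimal sketch (the paths and directive values are illustrative, not from the original document). Crawling is controlled with a robots.txt file at the site root:

    # robots.txt: allow all crawlers everywhere except one folder
    User-agent: *
    Disallow: /private/

Indexing is controlled per HTML page with the robots meta tag, or for non-HTML files via the equivalent HTTP header:

    <meta name="robots" content="noindex, nofollow" />

    X-Robots-Tag: noindex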

Sunday, November 28, 2010

Google Voice Local Search

Google Labs has finally launched a new service: Google Voice Local Search.

Google Voice Local Search lets you search for local businesses from any phone, for free. If you're in the US, call 1-800-GOOG-411 and say what you want to find. For context: in the US, 411 is the phone number for local directory assistance, but it's pretty expensive (more than $1 per call), while 1-800-FREE-411 is a free service that uses speech recognition to process your request and ads to monetize it. Here are some of Google's related voice features:

  1. Voice Search - search Google by speaking instead of typing.
  2. Voice Input - fill in any text field on Android by speaking instead of typing.
  3. Voice Actions - control your Android phone with voice commands. For example, you can call any business quickly and easily just by saying its name.

Benefits of SEO

Search engine optimization (SEO) is used to promote your business on the internet; before committing to it, one should know the main benefits of SEO promotion.


Reach (Global / Regional)
By selecting keywords or phrases that target your viewers, search engine optimization ensures that you and your company can be found worldwide or regionally by those who need exactly what you offer. SEO benefits any organization that wants to reach all of its potential customers, locally or globally; you can reach the targeted customers of your own choice.



Targeted Traffic
Search engine optimization can increase the number of visitors to your website for the targeted keywords or phrases. Converting those visitors into paying customers is one of the arts of search engine optimization, and few other campaigns can drive such precisely targeted traffic to your website. Essentially, more targeted traffic equals more sales.



Enhanced Visibility
Once a website has been optimized, its visibility in search engines improves. More people will visit your website, giving international recognition to your products and services.



High ROI (Return on Investment)
An effective SEO campaign can bring a higher return on your investment than any other type of marketing for your company, and thereby increase your overall volume of sales and profit.



Long-term positioning
Once a website obtains a position through SEO promotion, it should stay there for the long term, as opposed to PPC (pay per click). SEO is a cheaper, longer-lasting solution than any other search engine marketing strategy.



Cost-effective
One of the great benefits of search engine optimization is that it is cost effective, requiring minimal capital for maximum exposure of your website.



Flexibility
An SEO campaign makes it possible to reach an audience of your own choice. You can attract traffic according to your organizational strategy, meeting the needs and requirements you choose.



Measurable results
A unique quality of SEO campaigns is that you can quantify their results through search engine position reports, visitor-to-customer conversion rates, and other metrics of this nature.


Saturday, November 27, 2010

SEO Tips 2011

SEO Tips for 2011 


Naming Conventions – Use proper naming conventions (subject or keyword first, then plural variation, modifier, then tag line).
  • Here is a more useful post about how to use meta titles, descriptions and naming conventions, but the gist is simple: create a hierarchy based on a relevant platform of topically reinforcing semantics, using keyword clusters and related synonyms to toggle relevance from what search engines deem the co-occurrence matrix. An illustrative title tag appears below.
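As a hedged sketch of that convention (the keyword, variations and tag line are hypothetical):

    <title>Widgets - Widget Sets, Discount Widgets | Acme Widget Store</title>

The subject keyword leads, the plural variation and modifier follow, and the tag line closes.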
Canonical Issues - Check for canonical issues to make sure there is a preferred default page and domain preference (http:// or http://www).
  • There is no reason to have three variations of a home page (.html, .php and default.htm). Depending on your programming platform, server settings (Unix or Windows) and whether you use static pages or a content management system, you will need to consolidate your website to either http:// or http://www to avoid splitting your site into less potent slivers; a sketch of one way to do this follows.
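One hedged way to consolidate on the www version, assuming an Apache server and a hypothetical example.com domain, is a 301 redirect in .htaccess combined with a canonical link element in the page head:

    # .htaccess: permanently redirect bare-domain requests to www
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
    RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

    <!-- in each page's head: declare the preferred URL -->
    <link rel="canonical" href="http://www.example.com/" />

The 301 tells search engines the move is permanent, so ranking factor consolidates on one host name instead of splitting across two.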
Indexation and Crawl Frequency – See how many pages are crawled in a site to determine crawl frequency.
  • If you have a home page that is crawled regularly while your internal pages are ignored, this is often a result of a lack of internal or external links.
  • You can implement sitemaps on a folder-by-folder basis, then link from the footer in a site template to a master sitemap page (where all the mini sitemaps are linked to) to increase indexation. This way the link from the footer consolidates the ranking factor to one page, and THAT page feeds the various site maps equally (through a tiered drip-down site architecture effect).
Orphaned Pages and Dead Ends – Check for orphaned pages to determine whether pages are linked suitably to ensure crawling.
  • If you have a page or sub folder in a website that is only linked to from a few pages, then you cannot possibly expect that page to rank well in search engines. If you yourself will not "endorse" a page by linking to it properly (contextually from keywords in the body copy) or from the primary or secondary navigation, then you cannot expect search engines to pay that page any more credence than you do.
  • Also be wary of PDF files (which can rank on their own and sponge link flow from your website). Make sure PDF files have absolute links (use the complete URL) back to your site, so they do not pool ranking factor and trap it where the rest of your site cannot benefit.
Dynamic URLs – If pages are dynamically created, try to remove or rewrite as many parameters in the URL as possible, or use URL / mod rewrite.
  • Any time you have session data or query string parameters in a URL, you are decreasing the likelihood of indexation, particularly if there are conventions such as ?PID=23D-55.aspx trailing along when an SEO-friendly naming convention could have taken its place with a bit of programming. You can rewrite entire segments, sub folders, categories, etc. without losing functionality or compromising SEO value; a sketch follows.
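As a hedged sketch of such a rewrite on Apache (the file names and script are hypothetical): a keyword-rich, static-looking URL is mapped internally onto the real dynamic script, so crawlers only ever see the clean version:

    # .htaccess: serve keyword-rich URLs from the dynamic product script
    RewriteEngine On
    RewriteRule ^([a-z0-9-]+)\.html$ /product.php?pid=$1 [L]

A request for /electronics-black-sports-watch.html is then answered by /product.php?pid=electronics-black-sports-watch, without the query string ever appearing in the address bar.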
Manage Outbound Links - Try to cap outbound links per page to fewer than 50 links for larger pages (10 for top level pages that need more ranking factor).
  • The more links that leave a page, the less ranking factor the elements on that page have as equity. The only instance when this is not a concern is if the page itself is augmented from other strong internal pages or has strong inbound links from other sites to offset the hemorrhaging effects of excessive links leaving a page.
Footer / Site-wide Links - Use footer links sparingly but tactfully to tie site segments together.
  • Footer links still work (using 5-10 keyword-rich text links at the bottom of a page), but they can also diffuse the intent of pages that do not have enough content to distinguish themselves from other pages. If a page in a site does not have more than 300 unique words, it can lose relevance as the navigation and other code structures collapse and interject their shingles, offsetting or diffusing the page's unique purpose and optimal continuity.
  • For example, if you have a page that is only a paragraph or two and you expect that page to rank for specific keywords, your navigation alone may trump the relevance of that page. Check your cached pages in text view to see how search engines view your code and your content without style sheets or JavaScript.
  • Footer links can help bring balance to pages with less content, but use them on pages with enough content to weather their contribution.
Broken Links – Check for broken links, which could be hemorrhaging link flow and weakening a site from within.
  • Broken links irritate search engine spiders, and when they cannot connect the dots, your site's rankings suffer. If you are using WordPress, the SEO Ultimate plug-in features a 404 monitor that sweeps the site for broken links so you can find and eliminate them.
Alt Attributes in Images – Use alt attributes on images to preserve content integrity while providing internal links for ranking factor.
  • Using the alt attribute in images allows you to reinforce topical relevance with the on-page text-based content to improve a page's relevance score; see the brief sketch below.
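A minimal sketch, with hypothetical file names and keywords, combining an internal link with a descriptive alt attribute:

    <a href="/black-sports-watch.html">
      <img src="/images/black-sports-watch.jpg" alt="black sports watch" />
    </a>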
Anchor Text Optimization – Use pertinent anchor text and do not waste link equity by excessively linking to non-reciprocating pages within a site.
  • Employing anchor text optimization means using relevant keywords to link to relevant pages within a site. Do this enough and, before you know it, you are virtual theming (which means creating a secondary navigation contextually through keyword co-occurrence).
  • This alone can distinguish your site from competitors, as each granular layer consolidates ranking factor for a website. Virtual theming is one reason why Wikipedia dominates search results.
Flattening Site Architecture – Keep site architecture as flat as possible or use breadcrumbs to aid in information architecture and crawling.
  • Avoid using sub folders excessively within a website: domain.com/categories/products/color/page.html vs. flattening the URL and site by using a more descriptive naming convention for the page, e.g. domain.com/electronics-black-sports-watch.html.
  • The closer the more competitive keyword landing pages are to the root folder, the easier it will be for them to gain additional ranking factor, PageRank and page strength to express the content on that page.
Content Volume - Ensure you have enough content to topple a competitive keyword.
  • Trying to rank for a keyword with 5 million competing pages on the strength of a handful of content is an exercise in futility. You will need topical relevance, which means articles, posts or pages all internally linked and consolidated to create the proper on-page signals for that keyword.
  • For every keyword there is a relevance threshold and tipping point; you will need to offset competitors by having more on-page affluence as well as off-page peer review (links from other authorities). In either case, content is a requirement.
Contextual Links – Link contextually within related documents to select preferred landing pages through virtual theming.
  • The premise is simple: if you are on a page about engines and the keyword pistons appears, link the keyword pistons to the pistons page. Do this for every keyword (only once per page, even if it appears more often) and you have just added a virtual theme to your keywords. This means that each page can now work together collectively to support the parent theme (which is the main/root keyword itself); a one-line sketch follows.
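A minimal sketch of such a contextual link (the page and copy are hypothetical):

    <p>Most engine wear shows up first in the <a href="/pistons.html">pistons</a>.</p>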
Meta Tags - For larger sites, exclude meta descriptions; for smaller sites, use the meta data as an extra title or a place for alternative keywords.
  • Always use a succinct and relevant title, but if you have multiple pages on a topic, let search engines decide which keywords are more prominent and relevant by excluding the meta description / snippet from the page.
  • Also make sure that, if you are using a content management system, your pages do not all share a common, generic meta title or description as a default. This is the fastest way to shoot down rankings in a site (lack of character).
Deep Links – Get at least 5-10 inbound links to each page via deep links from other sites in order to create buoyancy.
  • A page without links, either from the site itself or from other sites, is a page that has little value to readers or search engines. Popularity matters, and for the millions of site owners who may or may not be aware of this simple fact: you MUST have deep links to a page if you want that page to exceed standard normalization.
  • A website replete with deep links (links to pages other than the homepage) will start to have those individual pages rank and appear for multiple keywords. Not only does this create a more robust user experience, but your rankings no longer depend on an off-topic or generic page like the homepage.
  • The takeaway here is: get at least 5-10 inbound links to each page (if that page is expected to gain traction); otherwise, link to another page that is the preferred landing page and get deep links to it.
Keyword Stemming - Link to a page with multiple anchors (to create keyword stemming) and with "exact match" keywords to elevate just that term.
  • You can control how each page in your website ranks by being mindful about internal and external linking habits. The post SEO Rankings and How to Create Them provides a masterful breakdown of this process.
RSS Feed Syndication - Set up multiple RSS feeds within a site to syndicate your content to attract natural back links from other sites.
  • A proper RSS campaign alone can build sufficient links for your website. Combined with a content development strategy and time-released topical content, it can drive traffic and increase domain authority to produce rankings and relevance in even the most competitive vertical markets. The post SEO, RSS and the Power of Syndication provides SEO techniques and tactics for RSS feeds and RSS aggregation; a feed autodiscovery sketch follows.
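As a hedged sketch, each feed can then be exposed for autodiscovery in the page head (the feed URL is hypothetical):

    <link rel="alternate" type="application/rss+xml"
          title="Articles feed" href="http://www.example.com/feeds/articles.xml" />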
Sub Domains – Despite abuse in the past, sub domains still work.
  • If your website is sagging under its own weight, then segment a new section of the site with a sub domain to emphasize topical content or to topple a competitive keyword vertical.
  • Search engines pay particular attention to keywords in the URL and while you cannot always make the best of a bad situation, sometimes you can create islands of relevance using a keyword rich sub domain to augment your existing website to create a new beacon of relevance.
  • Here is a post on which is better for SEO, subdomains or subfolders. The choice ultimately is up to you, and even using a combination of both is entirely reasonable. Site architecture must work in tandem with content, links and conversion; all are mere pieces of the puzzle until consolidated.
Trust Rank – Linking from aged pages can pass along trust to new landing pages or sub folders or sub domains. Don’t look past your own site for ranking factor.
  • Passing along trust rank can save you months of waiting for search results to mature from fresh content. Here is a post that shows you how to identify and link from older more relevant pages to new pages to augment rankings and more importantly, trust. This method is designed to augment on page SEO and consolidate ranking factor from all pages to the new preferred landing page.
Sitemaps – Use sitemaps not only to tie the site together, but also as a way to nourish pages, like an irrigation system, by linking to them.
  • Here are a few other useful SEO tips you can use in addition to sitemaps to improve rankings; a minimal XML sitemap entry is sketched below.
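A minimal XML sitemap sketch, using a hypothetical example.com page (the blog's own sitemap later on this page uses the same loc / lastmod / priority fields):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/electronics-black-sports-watch.html</loc>
        <lastmod>2010-11-20</lastmod>
        <priority>0.80</priority>
      </url>
    </urlset>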

Friday, November 26, 2010

Google URL Shortener

With its new URL shortener at http://goo.gl/, Google has added another service to its swiftly growing range. With the goo.gl site, Google joins the likes of bit.ly and tinyurl.com in enabling internet users to shorten long web addresses into more manageable, bite-size links.

Demand for link-shortening services has increased greatly in recent years, as users of social networks such as Twitter and Facebook look for ways to share links as quickly and briefly as possible.

Google's URL shortening tool transforms long links into the much shorter http://goo.gl/ prefix, followed by a unique four-character mix of letters and numbers.



While the popularity of goo.gl remains to be seen, the service could have a constructive effect on search engine optimisation. Social networking and SEO are becoming more and more intertwined, so any service that enables quicker and easier link sharing via social platforms is likely to benefit SEO campaigns.


The goo.gl site currently consists of just a text box and a button marked 'Shorten'. Google software engineer Muthu Muthusrinivasan stated that the beauty of the service lies in its very simplicity. He also added that Google didn't plan to overload goo.gl with features, but wanted it to be the most stable, most secure, and fastest URL shortener on the web.

Click here to try it: http://goo.gl/

Wednesday, November 24, 2010

Firefox add-ons Top 20 for SEO and Webmasters

Mozilla Firefox is a widely used browser alongside other top browsers like Google Chrome, Internet Explorer and Safari, yet none can beat Firefox for ease of customization, speed and security. Today I'm going to demonstrate how to customize Firefox using add-ons, especially for webmasters.
  1. SEO for Firefox – SEO for Firefox is a pretty common add-on used by webmasters which adds many search engine optimization data points into Google's and Yahoo's SERPs, like the age of a site, Alexa rank, Yahoo .edu and .gov links, DMOZ listing and whois data.
  2. Rank Checker – Another pretty good add-on for checking keyword rankings. You can check rankings, schedule a ranking report and do a lot more with this tool. You can check rankings for Google (almost all countries covered), Yahoo! and Live.
  3. Google Toolbar – Google Toolbar is widely used by webmasters. We all know the functionality of the Google toolbar.
  4. Echofon – This plug-in adds a tiny status bar icon that notifies you when your friends post tweets. You can also manage a couple of Twitter accounts simultaneously, and easily retweet, post a tweet or send a direct message at your convenience.
  5. All in One Sidebar – A handy, fully customizable toolbar at the left side of the browser. You can easily open Bookmarks, History, Downloads, Add-ons, Page Info, Page Source and the Error Console in the sidebar.
  6. SpeedyFox – SpeedyFox helps speed up browsing history and boosts Firefox's speed in just a single click!
  7. InformEnter - This plug-in is very helpful when filling in details in web forms. For webmasters it is also helpful for filling in submission details.
  8. CustomizeGoogle - CustomizeGoogle enhances the Google search experience by adding extra information, like a position counter and favicons. With this plug-in you can also eliminate unnecessary information like ads.
  9. Greasemonkey – This plug-in allows you to customize how a webpage displays just by adding a little JavaScript code. You can use your own scripts or JavaScript available on the web for customization.
  10. Firebug – With the help of Firebug you can edit, debug, and monitor CSS, HTML, and JavaScript live in any web page while you browse.
  11. Web Developer - This plug-in adds a menu and a toolbar with various web developer tools.
  12. WebMail Notifier - WebMail Notifier helps you manage multiple email accounts at the same time. It notifies you of the number of unread emails and supports Gmail, Yahoo, Hotmail and many more.
  13. FireFTP - FireFTP is a secure, cross-platform FTP client for Mozilla Firefox which provides easy and intuitive access to FTP servers. You can quickly upload or download data within your browser.
  14. Shareaholic - Shareaholic makes sharing online faster and easier and will enhance your productivity. This plug-in works with 60+ services including Twitter, Facebook, Google Gmail, Delicious, LinkedIn, etc.
  15. KeywordSpy - This plug-in allows you to do competitive research for SEO / PPC campaigns within the browser. It has 3 main functional bars: SEO Status Bar, Attribute Bar & PPC Keyword Research.
  16. ShowIP - ShowIP displays IP address and hostname information, like whois and Netcraft, with a single click.
  17. Alexa Sparky - Alexa Internet's official add-on, showing Alexa data in your status bar.
  18. Dictionary Lookup – This plug-in shows the meaning of a highlighted word, displaying the definition when you right-click and select "Define".
  19. Live HTTP Headers - This plug-in helps you view the HTTP headers of a page while browsing.
  20. Google Global – Google Global lets you view Google search results as they appear in different geographical locations.

DMOZ LISTING 


DMOZ, the Open Directory Project, is the largest, most comprehensive human-edited directory of the Web. It is constructed and maintained by a vast, global community of volunteer editors.

If you are looking to gain popularity in all the major search engines, you must put sufficient effort into getting listed in DMOZ. It is the top directory on the web, and the most respected one as well. All the search engines give great credit to websites listed in DMOZ; even Google, MSN and Yahoo value a DMOZ listing. If you want to improve your website's rankings and bring good traffic to your website, submit your site to DMOZ.

Although DMOZ is a free directory, getting listed in it is not an easy task. Not all sites submitted to DMOZ are approved; only websites that comply with the submission criteria 100% will be listed. We have an experienced team in DMOZ inclusion services and will be able to help you with your DMOZ submissions.

Benefits


  1. DMOZ prioritizes quality content, so if your website earns a DMOZ link, your online reputation will be high.
  2. We will help you identify the right categories in DMOZ for submission, which is one of the essential criteria.
  3. We will provide you with appropriate titles for your website for submission; the titles will comply with DMOZ guidelines.
  4. We will also provide you with an appropriate description for submission.

Monday, November 22, 2010

Google update

1) Google's search engine shows four paid listings above the natural results.

2) Google shows more than eight links for a single result.

3) Google shows a preview of each webpage (its Google preview).

4) Google shows a left-hand-side toolbar that provides better options for refining a search.

5) Google provides a Checkout option in AdWords.

6) Google shows product lists in search results.

7) The Google Caffeine search index (covered in the Caffeine post below).
Friday, November 19, 2010

sitemap

HTML Site Map

XML sitemap

All entries share the last-modified timestamp 2010-11-20T06:14:54+00:00.

  • http://seofreehelp.blogspot.com/ (priority 1.00)
  • http://seofreehelp.blogspot.com/feeds/posts/default (priority 0.80)
  • http://seofreehelp.blogspot.com/2010/09/search-engine-optimization.html (priority 0.80)
  • http://seofreehelp.blogspot.com/2010/10/on-page-seo.html (priority 0.80)
  • http://seofreehelp.blogspot.com/2010/10/off-page-seo.html (priority 0.80)
  • http://seofreehelp.blogspot.com/2010/10/seo-questions.html (priority 0.80)
  • http://seofreehelp.blogspot.com/2010/10/seo-optimization-tips.html (priority 0.80)
  • http://seofreehelp.blogspot.com/2010/11/required-field.html (priority 0.80)
  • http://seofreehelp.blogspot.com/2010/11/caffeine-goole-new-search-index.html (priority 0.80)
  • http://seofreehelp.blogspot.com/2010/11/google-algorithm.html (priority 0.80)
  • http://seofreehelp.blogspot.com/2010/11/different-between-web-10-web-20-and-web.html (priority 0.80)
  • http://seofreehelp.blogspot.com/2010/10/seo-optimization-tips.html?showComment=1288369265275 (priority 0.64)
  • http://seofreehelp.blogspot.com/2010/10/seo-optimization-tips.html?showComment=1289990526375 (priority 0.64)

Caffeine: Google's new search index

Google has announced the completion of a new web indexing system called Caffeine. Caffeine provides 50 percent fresher results for web searches than our last index, and it's the largest collection of web content we have offered. Whether it's a news story, a blog or a forum post, you can now find links to relevant content much sooner after it is published than was ever possible before.

So why did we build a new search indexing system? Content on the web is blossoming. It's growing not just in size and numbers: with the advent of video, images, news and real-time updates, the average webpage is richer and more complex. In addition, people's expectations for search are higher than they used to be. Searchers want to find the latest relevant content, and publishers expect to be found the instant they publish.

A bit of background for those of you who don't build search engines for a living like us: when you search Google, you're not searching the live web. Instead, you're searching Google's index of the web which, like the list in the back of a book, helps you pinpoint exactly the information you need.

To keep up with the evolution of the web and to meet rising user expectations, we've built Caffeine. (The original post includes an image comparing how the old indexing system worked with Caffeine.)

Our old index had several layers, some of which were refreshed at a faster rate than others; the main layer would update every couple of weeks. To refresh a layer of the old index, we would analyze the entire web, which meant there was a significant delay between when we found a page and when we made it available to you.

Caffeine lets us index web pages on an enormous scale. In fact, every second Caffeine processes hundreds of thousands of pages in parallel. If this were a pile of paper, it would grow three miles taller every second. Caffeine takes up nearly 100 million gigabytes of storage in one database and adds new information at a rate of hundreds of thousands of gigabytes per day. You would need 625,000 of the largest iPods to store that much information; if these were stacked end-to-end they would go for more than 40 miles.

With Caffeine, we analyze the web in small portions and update our search index on a continuous basis, worldwide. As we find new pages, or new information on existing pages, we can add these straight to the index. That means you can find fresher information than ever before, no matter when or where it was published.

more info

Google Algorithm

Thinking about the Google algorithm gave me the urge to attempt to write out a rough outline of what the formula might look like. I've taken softplus' suggestion and given it a go using a deliberately basic weight/factor expression (a worked example follows the factor lists below):

GoogScore = (KW Usage Score * 0.3) + (Domain Strength * 0.25) + (Inbound Link Score * 0.25) + (User Data * 0.1) + (Content Quality Score * 0.1) + (Manual Boosts) - (Automated & Manual Penalties)


Keyword Usage Factors:
  • KW in title tag
  • KW in header tags
  • KW in document text
  • KW in internal links pointing to the page
  • KW in domain and/or URL
Domain Strength
  • Registration history
  • Domain age
  • Strength of links pointing to the domain
  • Topical neighborhood of domain based on inlinks & outlinks
  • Historical use & links pattern to domain
Inbound (Back-link) Link Score
  • Age of links
  • Quality of domains sending links
  • Quality of pages sending links
  • Anchor text of links
  • Link quantity/weight metric (Pagerank or a variation)
  • Subject matter of linking pages/sites
User Data
  • Historical CTR to page in SERPs
  • Time users spend on page
  • Search requests for URL/domain
  • Historical visits/use of URL/domain by users GG can monitor (toolbar, wifi, analytics, etc.)
Content Quality Score
  • Potentially given by hand for popular queries/pages
  • Provided by Google raters (remember Henk?)
  • Machine-algos for rating text quality/readability/etc
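
To make the weighting concrete, here is a worked example with purely hypothetical sub-scores (each factor rated 0-10, with no boosts or penalties): taking KW Usage = 8, Domain Strength = 6, Inbound Link Score = 7, User Data = 5 and Content Quality = 9, the expression gives (8 * 0.3) + (6 * 0.25) + (7 * 0.25) + (5 * 0.1) + (9 * 0.1) = 2.4 + 1.5 + 1.75 + 0.5 + 0.9 = 7.05. The point is not the number itself, but that the two 0.25-weighted, link-related terms together (0.5) outweigh any single other factor.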

Thursday, November 4, 2010

Difference Between Web 1.0, Web 2.0 and Web 3.0

Web 1.0 Application

Shopping cart applications, which most ecommerce website owners employ in some shape or form, basically fall under the category of Web 1.0. The overall goal is to present products to potential customers, much as a catalog or a brochure does; only, with a website, you can also provide a method for anyone in the world to purchase products. The web provided a vector for exposure and removed the geographical restrictions associated with a brick-and-mortar business.

Web 2.0 Application

A Web 2.0 application is defined by interaction rather than technology. For example, the perception exists that just because a website is built using a certain technology (like Ruby on Rails), or because it employs Ajax in its interface, it is a Web 2.0 application. From the general, bird's-eye view we are taking, this is not the case; our definition simply requires that users be able to interact with one another or contribute content. Developers, for example, have a much more rigid definition of Web 2.0 than average web users, and this can lead to confusion.

Web 3.0 Application

A Web 3.0 application is built on web services: a web service is a software system designed to support computer-to-computer interaction over the Internet. Web services are not new and usually take the form of an Application Programming Interface (API). The popular photography-sharing website Flickr provides a web service whereby developers can programmatically interface with Flickr to search for images. Currently, thousands of web services are available. However, in the context of Web 3.0, they take center stage. By combining semantic markup and web services, Web 3.0 promises the potential for applications that can speak to each other directly, and for broader searches for information through simpler interfaces; a small semantic-markup sketch follows.
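
As a hedged sketch of what semantic markup looks like (the names and values are hypothetical illustrations, not from the original post), machine-readable attributes are layered onto ordinary HTML, here using RDFa with Google's data-vocabulary.org Person vocabulary, so that services can extract meaning directly:

    <div xmlns:v="http://rdf.data-vocabulary.org/#" typeof="v:Person">
      <span property="v:name">Jane Doe</span>,
      <span property="v:title">photographer</span> at
      <span property="v:affiliation">Acme Studios</span>.
    </div>

A human reader sees an ordinary sentence; a web service parsing the attributes sees a Person record with a name, a title and an affiliation.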

Most importantly, you don't need to upgrade anything or get new software to benefit from any of this. These are abstract ideas used to contemplate the challenges developers face on the web, in addition to theories about how to address them.