Professional SEO & SMO consultant based in Ahmadabad. We provide SEO & SMO expert service packages at affordable prices. Contact us for freelance SEO work.
Tuesday, December 18, 2012
10 most useful htaccess tips to improve your website
Everyone will be familiar with tip number four, which is the classic 301 redirect that SEOs have come to know and love. The other tips in this list are less common, but they are quite useful to know when you need them. After you've read this post, bookmark it; hopefully it will save you some time in the future.
1) Make URLs SEO-friendly and future-proof
Back when I was more of a developer than an SEO, I built an e-commerce site selling vacations, with a product URL structure:
/vacations.php?country=italy
A nicer URL would probably be:
/vacations/italy/
The second version will allow me to move away from PHP later, it is probably better for SEO, and it even lets me add further sub-folders later if I want. However, it isn't realistic to create a new folder for every product or category; besides, it all normally lives in a database anyway.
Apache identifies files and how to handle them by their extensions, which we can override on a file-by-file basis:
<Files magic>
ForceType application/x-httpd-php5
</Files>
This will allow the 'magic' file, which is a PHP file without an extension, to look like a folder and handle the 'inner' folders as parameters. You can test it out here (try changing the folder names inside the magic 'folder').
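If you would rather not rely on extension-less files, mod_rewrite can produce the same clean URLs. Here is a minimal sketch, assuming a vacations.php script that takes a country parameter as in my example; the rewrite is internal, so visitors and search engines only ever see the clean URL:
RewriteEngine On
# Internally map /vacations/italy/ to the script; the browser's URL stays clean
RewriteRule ^vacations/([a-z-]+)/?$ /vacations.php?country=$1 [L,QSA]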
2) Apply rel="canonical" to PDFs and images
The SEO community has adopted rel="canonical" quickly, and it is usually kicked around in discussions about IA and canonicalization issues, where before we only had redirects and blocking to solve a problem. It is a handy little tag that goes in the head section of an HTML page.
However, many people still don't know that you can apply rel="canonical" in an alternative way, using an HTTP header, for cases where there is no HTML to insert a tag into. An often-cited example is applying rel="canonical" to a PDF, pointing to the HTML version or to the download page for the document.
An alternative use would be for applying rel="canonical" to image files. This suggestion came from a client of mine recently, and is something a couple of us had kicked about once before in the Distilled office. My first reaction to the client was that this practice sounded a little bit 'dodgy,' but the more I think about it, the more it seems reasonable.
They had a product range that attracts people to link to their images, which isn't very helpful to them in SEO terms (any traffic coming from image search is unlikely to convert). But rel="canonical" those image links to the product page, and suddenly they are helpful links, and the rel="canonical" seems pretty reasonable.
Here is an example of applying HTTP rel="canonical" to a PDF and a JPG file:
<Files download.pdf>
Header add Link '<http://www.tomanthony.co.uk/httest/pdf-download.html>; rel="canonical"'
</Files>
<Files product.jpg>
Header add Link '<http://www.tomanthony.co.uk/httest/product-page.html>; rel="canonical"'
</Files>
We could also use some variables magic (you didn't know .htaccess could do variables!?) to apply this to all PDFs in a folder, linking back to the HTML page with the same name (be careful with this if you are unsure):
RewriteRule ([^/]+)\.pdf$ - [E=FILENAME:$1]
<FilesMatch "\.pdf$">
Header add Link '<http://www.tomanthony.co.uk/httest/%{FILENAME}e.html>; rel="canonical"'
</FilesMatch>
You can read more about it here.
3) Robots directives
You can only instruct search engines not to index a page if you allow them to access that page. If you block a page with robots.txt, Google might still index it if it has a lot of links pointing to it. You need to put the noindex Meta Robots tag on every page you want to issue that instruction on. If you aren't using a CMS, or are using one with limited flexibility, this could be a lot of work. .htaccess to the rescue!
You can apply directives to all files in a directory by creating an .htaccess file in that directory and adding this command:
Header set X-Robots-Tag "noindex, noarchive, nosnippet"
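If you only need this for certain file types rather than a whole directory, you can scope the header with FilesMatch. A small sketch, assuming you want to keep PDFs out of the index:
<FilesMatch "\.pdf$">
Header set X-Robots-Tag "noindex, noarchive, nosnippet"
</FilesMatch>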
If you want to read a bit more about this, I suggest this excellent post from Yoast.
4) Various types of redirect
The common SEO redirect ensures that a canonical domain is used, normally www vs. non-www. There are also a couple of other redirects you might find useful. I have kept them simple here, but often you will want to combine these to avoid chaining redirects:
# Ensure www on all URLs.
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [L,R=301]
# Ensure we are using HTTPS version of the site.
RewriteCond %{HTTPS} !on
RewriteRule (.*) https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
# Ensure all URLs have a trailing slash.
RewriteBase /
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_URI} !(.*)/$
RewriteRule ^(.*)$ http://www.example.com/$1/ [L,R=301]
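As mentioned, you will often want to combine these so a visitor gets a single 301 rather than a chain. Here is one sketch folding the www and HTTPS rules together, assuming example.com is your domain and that only example.com / www.example.com point at this site:
# Fix a missing www and a missing HTTPS in one hop
RewriteCond %{HTTP_HOST} ^example\.com$ [NC,OR]
RewriteCond %{HTTPS} !on
RewriteRule ^(.*)$ https://www.example.com/$1 [L,R=301]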
5) Custom 404 error page
None of your visitors should be seeing a white error page with black techno-babble when they end up at a broken URL. You should always serve a nice 404 page that also gives the visitor links to get back on track.
You can also end up getting lots of links and traffic if you put time and effort into a cool 404 page, like Distilled's.
This is very easy to set up with .htaccess:
ErrorDocument 404 /cool404.html
# Can also do the same for other errors...
ErrorDocument 500 /cool500.html
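One caveat worth adding: keep the ErrorDocument target as a local path. If you give Apache a full URL instead, it answers with a redirect, and the browser never receives the real 404 status:
# Good: local path, so the 404 status code is preserved
ErrorDocument 404 /cool404.html
# Avoid: a full URL makes Apache send a redirect instead of a 404
# ErrorDocument 404 http://www.example.com/cool404.html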
6) Send the Vary header to help crawl mobile content
If you are serving a mobile site on the same URLs as your main site, but you are altering the HTML rather than using responsive design, then you should be using the 'Vary' header to let Google know that the HTML changes for mobile users. This helps them crawl and index your pages more appropriately.
Again, this is pretty simple to achieve with your .htaccess file, independent of your CMS or however you are implementing the HTML variations:
Header append Vary User-Agent
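Bear in mind that appending Vary: User-Agent to every response can hurt how well proxies cache your static assets. If that worries you, one option is to scope the header to the HTML-producing files only; a sketch, assuming your pages are .php or .html files:
<FilesMatch "\.(php|html)$">
Header append Vary User-Agent
</FilesMatch>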
7) Improve caching for better site speed
There is an increasing focus on site speed, both from SEOs (because Google cares) and also from developers who know that more and more visitors are coming to sites over mobile connections.
You should be careful with this tip: ensure there aren't already caching systems in place, and choose an appropriate caching length. However, if you want a quick and easy way to set the cache lifetime in seconds, you can use the below. Here I set static files to cache for 24 hours (86,400 seconds):
<FilesMatch ".(flv|gif|jpg|jpeg|png|ico|swf|js|css|pdf)$">
Header set Cache-Control "max-age=28800"
</FilesMatch>
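If your server has mod_expires enabled, the same idea can be expressed per content type, which some people find easier to maintain. A sketch, guarded so it does nothing if the module is absent:
<IfModule mod_expires.c>
ExpiresActive On
# Cache images and stylesheets for 24 hours from the time of access
ExpiresByType image/jpeg "access plus 24 hours"
ExpiresByType image/png "access plus 24 hours"
ExpiresByType text/css "access plus 24 hours"
</IfModule>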
8) An Apple-style 'Back Soon' maintenance page

With this bit of .htaccess goodness, you can redirect people based on their IP address, so you can redirect everyone except your own IP address and 127.0.0.1 (the loopback address, which covers requests from the server itself):
RewriteCond %{REMOTE_ADDR} !your_ip_address
RewriteCond %{REMOTE_ADDR} !127.0.0.1
# Exclude the holding page itself, or the rule will redirect in a loop
RewriteRule !back_soon\.html$ http://www.example.com/back_soon.html [L,R=307]
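One refinement you may want: let requests for the maintenance page's own stylesheet and images through, or it will render unstyled. A sketch, with the hypothetical 203.0.113.42 standing in for your own IP address:
RewriteCond %{REMOTE_ADDR} !^203\.0\.113\.42$
RewriteCond %{REMOTE_ADDR} !^127\.0\.0\.1$
# Don't redirect requests for the assets the holding page needs
RewriteCond %{REQUEST_URI} !\.(css|js|jpe?g|png|gif)$
RewriteRule !back_soon\.html$ http://www.example.com/back_soon.html [L,R=307]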
9) Smarten up your URLs even when your CMS says "No!"
One of the biggest complaints I hear amongst SEOs is about how much this or that CMS "sucks." It can be intensely frustrating for an SEO to be hampered by the constraints of a certain CMS, and one of those constraints is often that you are stuck with appalling URLs.
You can overcome this, turning product.php?id=3123 into /ray-guns/ in no time at all:
# Rewrite a specific product...
RewriteRule ^ray-guns/$ product.php?id=3123 [L]
# ... or groups of them
RewriteRule ^product/([0-9]+)/$ product.php?id=$1 [L]
This won't prevent people from visiting the crappy versions of the URLs, but combined with other redirects (see the sketch below) or with judicious use of rel="canonical," you improve the situation tremendously. Don't forget to update your internal links to the new ones. :)
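Here is the sketch mentioned above, pairing the rewrite with a 301 for the same hypothetical ray-guns product. Matching against THE_REQUEST (the raw request line) means only genuine client requests for the old URL get redirected, so the internal rewrite above cannot loop:
# Send direct requests for the old URL to the clean one; the trailing ? drops the query string
RewriteCond %{THE_REQUEST} \?id=3123[\s&]
RewriteRule ^product\.php$ http://www.example.com/ray-guns/? [L,R=301]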
10) Recruit via your HTTP headers
Ever looked closely at SEOmoz's HTTP headers? You might have missed the opportunity to get a job...
If you would like to add a custom header to your site, you can make up whatever headers and values you'd like:
Header set Hiring-Now "Looking for a job? Email us!"
It can be fun to leave messages for people poking around - I'll leave it to your imaginations! :)
Download the rules
You can grab all of these rules in quick-form from a compilation I made.
Viewing headers
If you are unsure about how to look at HTTP response headers, here's a great tool to get you started.
If you would rather do it in your browser, follow these steps:
- Chrome on Windows: Ctrl-Shift-I and click 'Network' (then reload the page)
- Chrome on Mac: Command-Option-I and click 'Network' (then reload the page)
- Firefox: Install Live HTTP Headers
Thanks for reading, and don't forget to test anything you change! :)
Wednesday, December 12, 2012
SEO Rappers Song Video with Lyrics
SEO Rappers Song
SEO Rappers Lyrics
Your site design, the first thing people see.
It should be reflective of you and the industry.
Easy to look at, with a nice navigation
When they can't find what they want it causes frustration
A click costs an action. To increase the temptation
Use appealing graphics that create motivation
You have animation please use in moderation
‘Cos search engines can't index the information
Display the logo of all associations
Highlight your content; therefore that’s an obligation.
Create clean design; you can use some decoration
But try to prevent any client hesitation
Every page that they click should provide an explanation
Should be easy to understand like having a conversation
Create a site style you can use your imagination
But make sure you use correct colour combinations
Do some investigation, look at other organisations
But don’t duplicate or you might face a litigation
You done? Congratulations start construction
Move into production, please follow these instructions:
Your Photoshop functions, slice that design
Do your layout with divs make sure there is a line
Please don't use tables even though they work fine
When it comes to indexing they give searchers a hard time
Make it easy for spiders to crawl what you provide
Remove font type, font colour and font size
No background colours, keep your coding real neat
And tag your look n feel on a separate style sheet
Better results with XML and CSS,
Now you making progress, a ‘lil closer to success
Describe your doc type so the browser can relate
Make sure you do it great or it won’t validate
Check in all browsers, I do it directly
Gotta make sure that it renders correctly
Some use IE, some others use Flock
Some use AOL, I use Firefox
Title everything including links and images
Don’t use italics, use emphasis
Don’t use bold please use strong
Cos if u use bold that’s old and wrong
You use CSS your page should load quicker
Your client's satisfied like they eating on a Snicker
They stuck on ur page like u made it with stickers
And then they convert, now that's the real kicker
Make u a lil richer, your site a lil slicker
Design and code right man I hope u get the picture
What I'm telling you is true man it should be a scripture
If it's built right you’ll be the pick of the litter
Everyone will wanna follow you like twitter
Competition will get bitter
You will shine like glitter
If u tryna grow; your company will get bigger
Design and code right man can you get with it?
Friday, September 7, 2012
Top Australian Free Classifieds Sites List in 2014
Australian Free Classifieds Ads Sites List
http://www.olx.com.au/
http://www.locanto.com.au
http://post.noflies.com.au/
http://www.gumtree.com.au
http://adverspace.com.au/
http://www.muamat.com/
http://www.global-free-classified-ads.com
http://www.paiir.com.au/
https://www.allclassifieds.com.au/
http://www.adpost.com
http://www.ozads.com.au/
http://www.goodlist.com.au/index.php
http://www.dinkos.com.au/
http://australia-free-classifieds.com/
http://www.expatads.com
http://adoodau.com/
http://all2go.com.au/
http://aufreeads.com/
http://aussielocal.com.au
http://chaosads-australia.com/
http://dewalist.com
http://freeaustraliaclassifieds.com/
http://freelists.com.au/
http://inetgiant.com.au/
http://listonline.com.au/
http://ockalist.com.au/
http://overheremate.com.au/
http://postadeal.com.au/
http://postclassifiedlisting.com/
http://streetmarkets.com.au/
http://zappclassifieds.com/au/
http://localtrader.com.au
http://www.134sale.com.au/
http://tradingspace.com.au/
http://www.tradexchange.com.au/
http://ikoalaads.com.au/
http://www.accessoriesfreeads.com/
http://www.truebuy.com.au/
http://australia.anunico.com
http://www.globeslist.com.au/
Saturday, August 4, 2012
Tips for On Page Site Auditing
An SEO site audit is a critical first step that SEOs must undertake to understand the current state of a site's SEO.
There are two basic options:
- A quick-hit site audit, where you go through a top-level diagnosis of the site in a few hours.
- An exhaustive site audit that can take days.
Whichever you choose, the audit breaks down into three areas:
- On-page SEO audit
- Technical site audit
- Link audit
Let's start by looking at a few important elements of on-page SEO, why they're important, and tips on what you should look out for during your audit.
Title Tags
The purpose of a title tag is to describe what is unique about the page's content. Since each page on your site should be unique, you need a unique title tag for each page. Title tags also carry huge weight in terms of how you rank, so try to spend as much time as possible on writing great title tags. A title tag should include keywords relevant to the page content.
For example, a website that sells shoes online has a page about men’s athletic shoes. The title tag might be written in the following format:
Mens Athletic Shoes: Brand name Athletic Shoes for Men at [company name]
The above example uses variations of the same keyword to account for keyword searches.
Also, the company name appears at the end of the title. This is critical, because many sites waste their prime title tag real estate (the beginning) on their company name.
However well-known your company and its brand may be, always lead your title tag with a popular keyword rather than your company name. Ideally, keep your title tag to 40-69 characters at most.
Meta Tags
The meta description tag should be written the way you would write PPC ad copy. Just as good ad copy will increase CTRs on PPC campaigns, a good meta description will increase your click-through rate (CTR) in organic search. Make sure to describe what your page content is about and its benefit or value to readers, then finish with a good call to action. Spend some real time writing your meta descriptions. As a rule of thumb, keep your meta description tags under 200 characters.
Missing Title Tags
In addition to writing unique title tags, keep a list in your Excel sheet of pages that are missing title tags, or whose title tags contain just your company name. There are a lot of SEO software options available, but a good free option is Google's Webmaster Tools. Go to the Optimization menu and then click on HTML improvements. There you will see many suggestions, among them suggestions related to title tags.
Duplicate Title Tags
In Google Webmaster Tools, under the same HTML improvements menu, there is a quick way to look at all the duplicate title tags on your site. Again, auditing duplicate title tags and writing a unique title for each page can be a good win. These are on-page SEO factors that are low-hanging fruit but provide maximum bang for your effort.
Missing Alt Tags
Besides having benefits like making your site accessible to users with disabilities, alt tags are an important on-page SEO audit element. They play a big part in optimizing your images, as these tags are picked up by engines when including your images in image search. Audit your site well and see if alt tags are missing from your images or have been incorrectly written.
Missing Image Names
Naming your images correctly plays a big part in image optimization. Make sure that images are named correctly when doing your auditing. One common mistake companies make is when naming their company logo. Designers tend to name the logo as logo.jpg. Instead, add your company name as part of the image name, like [company name]-logo.jpg.
If you have products on your site, then ensure that you are using accurate keywords to describe each product.
Image optimization can be one of those low-hanging fruits when conducting your on-page SEO audit but can drive good amount of traffic through image searches.
H Tags
Header tags tell the search engines what the headline of your page is. While auditing your site, look out for H tags: whether you're missing any, or have too many. Ideally, your main headline should be in H1, with sub-headings in H2.
Keyword Mapping and Cannibalization
One thing that an on-page SEO audit should uncover is multiple uses of the same keyword on different pages of the site. This is not optimal because you're:
- Confusing the search engines as to which page should rank for the keyword in question, and as such letting the engines decide.
- Missing an opportunity to increase the number of keywords you're optimizing your site for.
On-Page Copy
There is no rule that says you must have a minimum number of words on your page. But always keep in mind that you should describe your content well: your visitors should easily understand what your page is about and what message you are trying to convey.
Think of it as writing good marketing copy. If you can convey your message in 50 or 100 or 200 words, then that should be your on-page copy. Perform some guerrilla usability testing and refine your copy based on the results.
Friday, July 27, 2012
Helpful Link Building Tools
1. Majestic Site Explorer (paid)
Majestic has gradually become a go-to link building tool. Its results are so fresh that you'll often be able to find new links you've earned through broken link building that went up on pages you didn't ask about. Anyhow, here's how it can be used:
- Bulk Backlink Checker: When you've found a batch of dead pages or sites, submit them all to the bulk link checker and set it to return results based on linking domains. Presto! You've got a list of dead sites or pages ordered by the number of inbound linking domains.
- Find Linkers to Dead Pages: I haven't taken the time to master their reports, so I'm stuck with the top 5,000 backlinks. And this will include all the backlinks from a single domain. Typically in batches of 5,000 there are about 1,000-2,500 unique domains. Of those, usually about 500-1,000 are still actually live with links on them.
- Find Links Pages for Dead Site Discovery: Start with a .gov site in your conceptual neighborhood (fda.gov for a health site) and then download linkers just to its home page. Then look in the URL strings for "links" or "resources." This will help you find hundreds of topical links pages that will undoubtedly contain some dead sites or pages.
- New Link Reporting: In broken link building in particular, it isn't always easy to find the results of your handiwork, primarily because the link you request doesn't always end up on the page you asked about. Since Majestic is updated so frequently, you'll often find new links on pages you didn't target, but from domains you did. For projects that are XX-links-per-month, this is vital. Note: ahrefs.com (paid) can be used for this purpose as well – their index gets updated daily.
2. Link Prospector Tool
I designed this one. This tool facilitates web research, and in particular, link prospect discovery for a particular link building tactic.
- Guest post opportunity discovery.
- Links pages as BLB seeds: When starting a broken link building campaign you need a set of links pages to scrape so you have resource-oriented URLs to check. Helping fix a broken link is a much better foot in the door than "you link to our competitors."
- Target site discovery: "Target site" sounds vague, but if sites exist at large scale (i.e., high school websites) there will be a footprint. Find it and you can find large numbers of these types of websites.
3. Keyword Combiner
I still use the heck out of a keyword combiner tool, for combining prospecting phrases rather than SEO keywords. There's one built into the Link Prospector, but here's the one in my Chrome toolbar: Keyword Combination Tool. Be warned though, some don't allow for quotation marks or other advanced operators. And the one linked to here defaults to doing a double combo.
4. Ubersuggest
This wonderful tool scrapes Google's suggestions, giving link prospectors instant access to the "problem space" around a given phrase. This is particularly useful at the early stages of a campaign.
Test it out with something like "how do I" to get an idea of what it does. Then narrow in on your space. For example, try "link building for", which leads Ubersuggest into helping you discover what people are actively searching for.
5. Outbound Link Scraper
Majestic and Google aren't the only tools for opportunity research out there... There are also lists, glorious lists, ripe and succulent for the scraping. Whether it's a list of blogs for outreach or just a list of links from a page that we're checking for URL validity, a link scraper is an awesome little tool to have handy.
8 Prospect Processing and Qualification Tools
Prospecting is only half the fun. Well, probably less than half if you're counting time. Once you have those raw opportunities, you need tools for deciding which ones make the cut and move on to the next stage.
1. The URL Status Checker
This one's dead-dumb simple. Input a list of URLs you want to check and then hit submit. You will need to check your dead URLs a couple of times, and you'll still end up weeding through a few false positives.
2. Regex Filtering Tool
Up in the Majestic section above, I describe pulling backlinks to authority sites and sifting through them for links and resource pages. This nifty little filter tool provides regular expressions for this.
3. Pick Shortest URL Per Domain
Sometimes you only need one URL from a domain, but you have hundreds (all mixed in with URLs from other domains). This free tool (login required) picks the shortest URL per domain and dumps the rest. Good for cleaning up lists of possible outreach targets.
4. List Comparison Tool
Sometimes you need to dedupe two lists without combining them. For this you need to use the list comparison tool. This thing can handle tens of thousands of URLs at a time. I've never pushed it farther than that – hopefully it doesn't get shut down after sharing it here!
5. The Super Deduper
Actually, it's called just the dedupe lists tool, but I prefer super deduper. This tool removes duplicates from lists of URLs, but be sure to remove the www. from your list first or you'll still have multiple instances of the same domain.
6. The Contact Finder
This tool (paid) finds between 20 and 80 percent of contacts from a list of URLs. Hand off the output of this tool to your human contact-finding team to handle the email address selection and to go site by site to find the rest.
7. Archive.org
While anchor text can provide ample clues about the content of a page or site, nothing beats Archive.org for piecing together the purpose of a page. Some BLB outfits "repurpose" content from Archive.org.
8. The URL Opener
Numerous tools can open multiple tabs in your browser when hand-qualifying sites. I try to hand-qualify as little as possible these days, but when I do, I use the URL opener (also made by the Super Deduper guy).
2 Focus Enabling Tools
Do you spend time "researching trends" on Reddit? With a handy work timer and site-blocking tools, you can more effectively regulate your intellectual indulgences.
1. Pomodoro Technique and App (work timer)
I'm not sure why this simple app and technique work, but they do.
2. I-AM-STUDYING BLOCKER
For those of us with chronic "I'll just check Facebook for a second"-itis, this app simply shuts down all access to problem sites. It's made for studying, but the interface is simple, and if you don't like to think of yourself as studying you could try Stay Focused or Nanny.
4 Tools On My Radar
These are tools that I simply haven't made the time to either learn fully or implement yet:
- The Link Building Toolbar (review here)
- BuzzStream (Gosh, wouldn't it be great to record who not to outreach to in the future?)
- Social Crawlytics (via BuzzStream post)
- Trello - simple, free project management tool.