Friday, July 27, 2012

Helpful Link Building Tools


1. Majestic Site Explorer (paid)

Majestic has gradually become a go-to link building tool. Its results are so fresh that you'll often be able to find new links you've earned through broken link building that went up on pages you didn't ask about. Anyhow, here's how it can be used:
  • Bulk Backlink Checker: When you've found a batch of dead pages or sites, submit them all to the bulk link checker and set it to return results based on linking domains. Presto! You've got a list of dead sites or pages ordered by the number of inbound linking domains.
  • Find Linkers to Dead Pages: I haven't taken the time to master their reports, so I'm stuck with the top 5,000 backlinks. And this will include all the backlinks from a single domain. Typically in batches of 5,000 there are about 1,000-2,500 unique domains. Of those, usually about 500-1,000 are still actually live with links on them.
  • Find Links Pages for Dead Site Discovery: Start with a .gov site in your conceptual neighborhood (fda.gov for a health site) and then download linkers just to its home page. Then look in the URL strings for "links" or "resources." This will help you find hundreds of topical links pages that will undoubtedly contain some dead sites or pages.
  • New Link Reporting: In broken link building in particular it isn't always easy to find the results of your handiwork – primarily because the page you request a link on isn't always where the link ends up. Since Majestic is updated so frequently, you'll often find new links on pages you didn't target, but from domains you did. For projects that are XX-links-per-month this is vital. Note: ahrefs.com (paid) can be used for this purpose as well – their index gets updated daily.
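Majestic's bulk checker does the heavy lifting here, but the underlying idea is simple to sketch. Here's a minimal Python illustration, using hypothetical exported rows of (source URL, dead target URL) pairs, of ranking dead pages by how many unique domains still link to them:

```python
from urllib.parse import urlparse
from collections import defaultdict

def rank_by_linking_domains(backlinks):
    """backlinks: iterable of (source_url, dead_target_url) pairs,
    e.g. rows exported from a bulk backlink report."""
    domains = defaultdict(set)
    for source, target in backlinks:
        # Count each linking domain once, no matter how many pages link.
        domains[target].add(urlparse(source).netloc.lower())
    # Sort dead targets by the number of unique domains linking to them.
    return sorted(domains.items(), key=lambda kv: len(kv[1]), reverse=True)

# Hypothetical export rows for two dead targets.
rows = [
    ("http://blog-a.com/post", "http://deadsite.com/"),
    ("http://blog-b.org/links", "http://deadsite.com/"),
    ("http://blog-a.com/other", "http://deadsite.com/"),  # same domain, counted once
    ("http://blog-c.net/resources", "http://gone.org/page"),
]
for target, doms in rank_by_linking_domains(rows):
    print(target, len(doms))
```

The targets with the most linking domains are the ones worth rebuilding or redirecting first.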

2. Link Prospector Tool

I designed this one. It facilitates web research – in particular, link prospect discovery for a given link building tactic.
  • Guest post opportunity discovery.
  • Links pages as BLB seeds: When starting a broken link building campaign you need a set of links pages to scrape so you have resource-oriented URLs to check. Helping fix a broken link is a much better foot in the door than "you link to our competitors."
  • Target site discovery: "Target site" sounds vague, but if sites exist at large scale (i.e., high school websites) there will be a footprint. Find it and you can find large numbers of these types of websites.

3. Keyword Combiner

I still use the heck out of a keyword combiner tool, for combining prospecting phrases rather than SEO keywords. There's one built into the link prospector but here's the one in my Chrome toolbar: Keyword Combination Tool.
Be warned, though: some don't allow quotation marks or other advanced operators. And the one linked here defaults to a double combo.
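The "double combo" is just a cross product of two word lists. A minimal sketch in Python (the seed phrases and footprints below are made up for illustration):

```python
from itertools import product

def combine(*word_lists, quote_first=False):
    """Cross-multiply lists of prospecting terms into query strings.
    quote_first wraps the first term in quotes (exact-match operator)."""
    queries = []
    for combo in product(*word_lists):
        head = f'"{combo[0]}"' if quote_first else combo[0]
        queries.append(" ".join((head,) + combo[1:]))
    return queries

# A double combo: every topic phrase paired with every prospecting footprint.
print(combine(["gluten free", "paleo"],
              ['intitle:"links"', "inurl:resources"],
              quote_first=True))
```

Rolling your own sidesteps the quotation-mark limitation, since you control exactly which operators survive the combination.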

4. Ubersuggest

This wonderful tool scrapes Google's suggestions, giving link prospectors instant access to the "problem space" around a given phrase. This is useful particularly at the early stages of a campaign.
Test it out with something like: "how do I" to get an idea of what it does. Then narrow in on your space. For example, try: "link building for" which leads Ubersuggest into helping you discover what people are actively searching for.
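Tools like this typically work by appending each letter and digit to your seed phrase and asking the suggest endpoint for completions of each probe. A sketch of just the probe-generation step (no scraping here, since the actual endpoint and response format are implementation details):

```python
import string

def suggest_probes(seed):
    """Ubersuggest-style probes: the seed phrase followed by each letter
    and digit, which coaxes an autosuggest service into revealing the
    long-tail phrases people actually type."""
    return [f"{seed} {ch}" for ch in string.ascii_lowercase + string.digits]

probes = suggest_probes("link building for")
print(probes[:3])  # ['link building for a', 'link building for b', 'link building for c']
```

Each probe surfaces a different slice of the suggestion space, which is why these tools return so many more phrases than a single query would.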

5. Outbound Link Scraper

Majestic and Google aren't the only tools for opportunity research out there... There are also lists, glorious lists ripe and succulent for the scraping. Whether it's a list of blogs for outreach or just a list of links from a page we're checking for URL validity, a link scraper is an awesome little tool to have handy.
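If you'd rather not depend on a hosted tool, the scraping itself is a few lines of standard-library Python. A minimal sketch (the example HTML and base URL are made up):

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkScraper(HTMLParser):
    """Collect every outbound href from a page's HTML."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href and not href.startswith("#"):
                # Resolve relative links against the page's own URL.
                self.links.append(urljoin(self.base_url, href))

scraper = LinkScraper("http://example.com/blogroll")
scraper.feed('<ul><li><a href="http://acme.com/">Acme</a></li>'
             '<li><a href="/local-page">Local</a></li></ul>')
print(scraper.links)
```

Feed it the HTML of a links page and the output drops straight into a URL status checker.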

8 Prospect Processing and Qualification Tools

Prospecting is only half the fun. Well, probably less than half the fun if you're counting time. Once you have those raw opportunities you have to have tools for deciding which ones make the cut to move on to the next stage.

1. The URL Status Checker

This one's dead-dumb simple. Input a list of URLs you want to check and then hit submit. You will need to check your dead URLs a couple times and still end up weeding through a few false positives.
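The "check a couple times" advice can be baked into the logic itself. Here's a sketch of the decision step only (the HTTP requests are left out); `confirmed_dead` is a hypothetical helper that takes the status codes from each checking pass, with `None` standing for a timeout:

```python
def confirmed_dead(status_history, required_failures=2):
    """A URL counts as dead only after it fails on repeated passes;
    a single timeout or 5xx is treated as a possible false positive."""
    failures = sum(1 for code in status_history
                   if code is None or code >= 400)
    return failures >= required_failures and status_history[-1] != 200

print(confirmed_dead([404, 404]))   # True
print(confirmed_dead([500, 200]))   # False - transient server hiccup
print(confirmed_dead([None, 404]))  # True - timeout, then a hard 404
```

Requiring a repeat failure, and trusting a final 200 over earlier errors, weeds out most of the false positives a single pass produces.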

2. Regex Filtering Tool

Up in the Majestic section above I describe pulling backlinks to authority sites and sifting through them for links and resource pages. This nifty little filter tool provides regular expressions for this.
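The sifting amounts to one regular expression over your URL list. A minimal Python version (the pattern below is my own guess at what such a filter matches, so tweak it to taste):

```python
import re

# URLs whose path ends in a links/resources-style slug.
LINKS_PAGE = re.compile(r"/(links?|resources?|bookmarks?)(\.[a-z]+)?/?($|\?)", re.I)

urls = [
    "http://fda.gov/consumers/resources/",
    "http://example.edu/health/links.html",
    "http://example.com/blog/2012/07/post",
]
links_pages = [u for u in urls if LINKS_PAGE.search(u)]
print(links_pages)  # keeps the first two, drops the blog post
```

Run the filter over a raw backlink export and you're left with only the resource-oriented pages worth checking for dead links.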

3. Pick Shortest URL Per Domain

Sometimes you only need one URL from a domain, but you have hundreds (all mixed in with URLs from other domains). This free tool (login required) picks the shortest URL per domain and dumps the rest. Good for cleaning up lists of possible outreach targets.
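The underlying logic is a one-pass reduction, easy to reproduce locally if the hosted tool disappears. A sketch with made-up URLs:

```python
from urllib.parse import urlparse

def shortest_per_domain(urls):
    """Keep only the shortest URL seen for each domain - usually the
    homepage or highest-level page, which is what you outreach to."""
    best = {}
    for url in urls:
        domain = urlparse(url).netloc.lower()
        if domain.startswith("www."):
            domain = domain[4:]  # treat www and non-www as one domain
        if domain not in best or len(url) < len(best[domain]):
            best[domain] = url
    return sorted(best.values())

urls = [
    "http://example.com/blog/2012/07/a-long-post",
    "http://example.com/",
    "http://other.org/resources/links.html",
]
print(shortest_per_domain(urls))
```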

4. List Comparison Tool

Sometimes you need to dedupe two lists without combining them. For this you need to use the list comparison tool. This thing can handle tens of thousands of URLs at a time. I've never pushed it farther than that – hopefully it doesn't get shut down after sharing it here!
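Comparing two lists without merging them is plain set arithmetic, so here's a minimal local equivalent (the prospect URLs are invented for the example):

```python
def compare_lists(list_a, list_b):
    """Dedupe two lists against each other without merging them:
    returns what's only in A, only in B, and in both."""
    a, b = set(list_a), set(list_b)
    return sorted(a - b), sorted(b - a), sorted(a & b)

prospects = ["http://a.com/", "http://b.com/", "http://c.com/"]
already_contacted = ["http://b.com/", "http://d.com/"]
only_new, only_old, overlap = compare_lists(prospects, already_contacted)
print(only_new)  # prospects you haven't touched yet
print(overlap)   # prospects you've already contacted
```

Sets handle tens of thousands of URLs without breaking a sweat, so scale isn't a concern here either.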

5. The Super Deduper

Actually it's called just the dedupe lists tool but I prefer super deduper. This tool removes duplicates from lists of URLs, but be sure to remove the www. from your list first or you'll still have multiple instances of the same domain.
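The www caveat can be handled in code by normalizing before comparing, rather than editing the list by hand. A sketch:

```python
def dedupe_urls(urls):
    """Remove duplicates, treating www and non-www forms of a URL as
    the same. Order of first appearance is preserved."""
    seen, result = set(), []
    for url in urls:
        # Normalize only for the comparison key; keep the original URL.
        key = url.lower().replace("://www.", "://", 1)
        if key not in seen:
            seen.add(key)
            result.append(url)
    return result

print(dedupe_urls([
    "http://www.example.com/page",
    "http://example.com/page",  # same page, www stripped
    "http://example.com/other",
]))
```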

6. The Contact Finder

This tool (paid) finds between 20 and 80 percent of contacts from a list of URLs. Hand the output off to your human contact-finding team to handle email address selection and to go site by site to find the rest.

7. Archive.org

While anchor text can provide ample clues about the content of a page or site, nothing beats Archive.org for piecing together the purpose of a page. Some BLB outfits "repurpose" content from Archive.org.

8. The URL Opener

Numerous tools can open multiple tabs in your browser when hand-qualifying sites. I try to hand qualify as little as possible these days, but when I do I use the URL opener (also made by the Super Deduper guy).

2 Focus Enabling Tools

Do you spend time "researching trends" on Reddit? With a handy work timer and site blocking tools, you can more effectively regulate your intellectual indulgences.

1. Pomodoro Technique and App (work timer)

I'm not sure why this simple app and technique works, but it does.

2. I-AM-STUDYING BLOCKER

For those of us with chronic "I'll just check Facebook for a second-itis," this app simply shuts down all access to problem sites. It's made for studying, but the interface is simple. If you don't like to think of yourself as studying, you could try Stay Focused or Nanny.

4 Tools On My Radar

These are tools that I simply haven't made the time to either learn fully or implement yet:
Folks, now it's your turn. Please add the tools you use, and why/when, below.

Saturday, July 21, 2012

Hottest SEO Methods to Rank Higher & Avoid Over Optimization

SEO Methods for 2012

The Penguin and Panda updates have invalidated many of the SEO techniques that once worked wonders. Now is the time for webmasters and website owners to consider what their end users would like and, ultimately, what search engines would find natural and rankable. After you build or tweak your website for the end user, your link building and SEO techniques need to follow the search engines' quality guidelines. The primary elements of quality SEO and link building that can please Google pets like Penguin and Panda are:

1) Great content that appeals to your audience
2) Proper anchor text variation
3) Finding and approaching link prospects that relate to your industry and find your website worth linking to
4) Making sure link prospects aren't connected with a bad neighborhood
5) Diversifying your link building and SEO techniques
6) Getting socially engaged with your audience


Monday, July 16, 2012

Know More about Link Building Strategies and Why They Fail


Some of these strategies I created. Others were handed to me. Over the years, I've discovered that the same reasons for failure kept cropping up. Thus, I thought it'd be helpful to outline the mistakes I've made and seen. Hopefully this helps you avoid the stress and headaches I've experienced along the way.
Here are the ways in which I've seen link building strategies fail and how you can make sure you don't get duped by these common pitfalls.

Link Building Strategy ≠ Content Strategy ≠ Link Bait

Let me break it down...
The content strategy is the road map designed to create epic pieces of content. The link building strategy is how you plan on getting links. Sometimes you use content from your content strategy to get links, but sometimes you don't. Sometimes the content from your content strategy helps generate links all by itself, such as when people find it easily in search engines and start sharing it on their own sites. This process is not part of the link building strategy.

Producing a piece of link bait isn't a content strategy, either. Link bait is a piece of content that is created with the hopes of attracting a load of links. Link bait creators hope their content will go viral, and let me tell you, it is really difficult to churn out link bait after you produce a piece that goes viral. You're lucky if one goes viral, let alone multiple. Thus, link bait isn't a content strategy as it's not really sustainable.

In addition, link bait can create a bad user experience, say in the case that you are churning out infographic after infographic all hosted on your company blog. Frequent visitors to that blog might become alienated, especially in "boring" or small niches where you often need to think of tangential topics and audiences to make a successful piece of link bait.
Why should you know (and care about) the difference? Because getting link bait and content strategy confused can set your team up for failure. It often results in expectations not being met, as they will be unrealistic from the start. Plus, each of these strategies answers different questions as they are being created — such as the target market, goals, and metrics — all of which are vital to success. Miss one part and you could be missing an important piece of the big picture.

How to Win:

Create a content strategy that includes link bait (isn't all link bait) and figure out where you can leverage the content for link building. Simultaneously, create a link building strategy that drives additional links above and beyond what your content can do. For even more win, consider how you can coordinate with your social strategy to really leverage both your content and link building efforts.
The key here is collaboration and integration, which will ensure you don't miss opportunities for a win. Think you only have the resources to pursue one strategy (which I will call BS on)? I love this controversial article on content marketing being better than link building. I'm not saying I agree or disagree, I just love when authors take an aggressive stance :)

The "Strategy" Part is Left Out

Trusty old Wikipedia defines a strategy as, "A plan of action designed to achieve a vision." Unfortunately, a lot of times the planning and designing parts are left out, leaving a grandiose vision but no real road map for getting there. I see this a lot with companies who pride themselves in being agile. Sometimes they are just too agile, too shotgun and reckless, saying "let's go, go, go" and hoping to see some wins later on.

How to Win:

No brainer here: create a strategy. Know that executing one campaign after another with no connecting thread is not a strategy. Here are the top-level pieces both a winning content and link building strategy should have.
  • What - What is the purpose? What are the goals?
  • Who - This is two part, 1) Who will be accountable for the project? 2) Who is the target audience?
  • How - Again two part, 1) How will you reach this goal(s)? 2) How will you track it?
  • Why - Why is this piece of the strategy being pursued? This should be answered from both the business and user perspective.
  • When - When will this be executed? Create timelines and deadlines to execute each piece of the strategy.
  • Where - Where does this fit in? Knowing how it integrates with other efforts and where it fits into the grand scheme of things is essential.
Rather than re-invent the wheel, here are some useful resources for how to create link building and content strategies:

No Defined Goals

Part of a strategy is having a vision, but to achieve that vision you need to set goals. Typically, the best strategies have a number of goals, broken into both short term and long term. Some will argue that "ranking #1 for [insert highly competitive short tail keyword here]" is a goal, but since it is pretty lofty you need to ask yourself what short term goals can your team action in order to be number one.

How to Win:

Set up both long-term and short-term goals. Generally, the short-term goals will be stepping stones for reaching long-term goals, creating a tree of goals with actionable milestones. I found an acronym from a UK government agency that I thought was wonderful. It says that all goals are SMART:
[Image: characteristics of SMART goals]

No Urgency or Tracking in Place

In my experience, in-house link building teams tend to fall victim to this more often than agencies. Just as a link building strategy should have goals, there should be some urgency in reaching them. This is where tracking comes into play. A lack of urgency is often correlated with not having a system in place for inspiring that urgency. To be clear, what I mean by "urgency" is a timeline for reaching goals - essentially deadlines.
Every link builder is going to find a way to track how many links he/she is getting. It's human nature - we're out trying to get something and we want to see if we won or not. While I still find that the market struggles with developing a completely automated and instant way of tracking incoming links, there are manual processes that can aid in tracking how many links are received. However, this isn't the only tracking that's important.
Knowing the number of links is great, but what are those links doing for rankings? How have those incoming links fluctuated over the duration of the project? What does the anchor text spread look like? These are bigger picture questions that some teams fail to answer in tracking. It makes it very difficult to measure the effectiveness and ROI of your link building team if you aren't looking at tracking data over time. And what is more important is making sure you share these results with everyone on the team. The first set of people who should know the results should be the ones doing the work. Too often the link builders are left out of the loop and given indirect feedback like "we need to get more links."

How to Win:

Here is a top-level road map for implementing tracking procedures:
  • After defining the goals, determine KPIs for measuring if those goals have been reached.
  • Create a system for monitoring these KPIs, such as through Analytics or third-party tracking tools.
  • Regularly analyze the data. Aggregate it into digestible visualizations to help make sense of it all.
  • Draw conclusions.
  • Share these conclusions with the whole team.
  • As more data is collected, compare new data with historical results. Be sure to share these with the team as well.
  • Reassess strategy and determine where changes need to be made if applicable.

Expectations Aren't Managed

[Image: Dilbert comic on managing expectations]
Everyone is going to have his/her own opinion, from the bosses to the link builders themselves. Those who own the project need to set a realistic bar and constantly communicate what everyone can expect. Keep in mind that some people won't be as forthcoming with their expectations, so the key is to ask, and preferably have their expectations documented somewhere for reference to hold everyone accountable.

How to Win:

It's safe to set the bar low so you can exceed expectations, but the key is simply to set the bar at all. If you want to go even further, emphasize constant and clear communication. A good workflow includes:
  1. Learning what is expected from each party.
  2. Negotiating those expectations.
  3. Consistently reporting on where your team is at in meeting those expectations.
  4. Reporting on the final results - were those expectations met or exceeded?

Resources Spread too Thin

I see this all the time. Let's put it in perspective with a hypothetical story.
The boss decides the company should try its hand at some link building, so he designates one person to give it a go. Let's say it's Susy the copywriter. Off the cuff, the boss decides he wants Susy to start getting 30-50 links a week, and of course to complete all the other copywriting tasks required of her. With a pat on the back he tells new link builder Susy good luck.
It's not difficult to guess that Susy the link builder comes back with less than stellar results. Not only is the boss disappointed, but he also decides link building isn't good for the company. Susy is left feeling like she failed, disappointed in herself, and aggravated at her boss.
Link building isn't a get-rich-quick scheme... or at least it shouldn't be viewed as one.

How to Win:

Devote ample resources to your link building project. This can be easier said than done, especially when you operate a small business. A good start is to hire one person whose sole job function is link building. Susy the copywriter shouldn't also be dabbling in link building, as nobody simply dabbles in link building.

Make sure to monitor everything he/she does in order to make a case study for obtaining more people and resources for link building. Especially when just starting out, companies can see a lot of big wins so make sure to track before and after progress to prove your case.
Depending on which strategy(ies) you pursue, the ideal team consists of:
  • Link builder(s) - Remember, results are determined by time x cost = # of wins. For more wins, you need to amp up the variables.
  • Content person(s) - A lot of link building strategies require content, so a strong team of writers and editors can help scale link building.
  • Creative person(s) - Designers and developers can really help amp up the content you produce.
  • Researcher(s) - Having someone who is an expert in data can make sure your resources are credible and your content is solid.
  • PR person(s) - This person has his/her finger on the pulse of media, knowing what is trending and what journalists are interested in.
  • Project manager - Sometimes this role isn't as explicit as this title suggests, but someone in charge of tracking and keeping an eye on the big picture can give your link builders more time to execute, rather than get caught up in monitoring.
  • Social media person(s) - Not essential, but it's great to have a dedicated person to leverage the link building opportunities social media presents.

Content is Consistently Crappy

This feels like it should be a no-brainer, but plenty of companies fail to make epic content. Sometimes this is caused by looking for quick wins, resulting in too little effort or too few resources devoted to content creation. More commonly, though, the cause isn't a lack of experience but the fact that people just don't know. People tend to fall in love with their own ideas (which are often too promotional) and have trouble seeing whether the idea will really speak to anyone.

How to Win:

First, know that the content creation process is not a sweatshop. Content should be made with a lot of blood, sweat, and tears. A case study from Salon.com illustrates this concept well. Essentially, the website was able to increase traffic by 40% by creating 33% less content. This supports the idea that content isn't a numbers game; it's a quality game.

Second, your content strategy should define accountabilities and create a workflow that implements a checks and balances system to ensure that you are creating epic content. If your content strategy is lacking, make sure you really understand how people are using the Internet and get the link builders involved with brainstorming, even if it's just to ask them. If you have minimal content building experience, it's important to have the people in the trenches involved in the discussion.
To make content creation even easier, I've created this checklist for what I believe all epic content should have. If you can't say "yes" or provide a compelling answer for each, you might need to go back to the drawing board. Print it out and put it next to your desk right now (high-res version available upon request).
[Image: good content creation checklist]

Link Building Isn't Integrated with Other Marketing Channels

This section might be a bit misleading, as link building can still be successful without being integrated into other channels. However, a lack of integration can be a serious glass ceiling on the ladder to link building awesomeness. There is only so much traditional link building can do. Big wins often encompass other divisions, such as PR, social media, the product team, and more. This is acknowledged well in Jon's link building strategy list - in the top filter, notice the checkboxes under "Dependencies on Other Resources." The bigger your ideas, the more departments will need to be involved. Without a seamless plan for integration and collaboration, working together is going to be one big mess (if it's even possible at all).

Besides being able to work harmoniously, another big reason you want integration is because you can capitalize on what everybody else is doing and make sure you capture all the wins possible from a particular campaign. A lot of the initiatives your PR team is running can be easily tweaked to fit into the SEO team's agenda as well. It's called "looking for low-hanging fruit", and it is impossible to implement this idea if you don't know what the other teams are doing.

How to Win:

Communicate, communicate, communicate. Especially at first, you are going to have to stay relevant by constantly communicating with the different marketing teams. I've found that taking the initiative and including these teams' interests in your link building initiatives shows them a clear example of how you can work together.
For example, if you are running contests with bloggers, looking out for the social media team's interests can help you make a case for why your teams should work together. It's the "I'll scratch your back if you scratch mine" philosophy, and you are initiating the "scratching". By laying out a work flow through example, you will inspire these teams to get you involved with their projects.

To do this, include them in every step from inception to execution and make it clear how you are addressing their goals. Ask questions and ask for input. Consider it internal egobait. Pretty soon, you'll have an open line of communication and your requests for regular cross-team meetings will be taken seriously. Before you know it, you will all be in the loop of what other teams are doing and collaborating on your different project calendars. Tread lightly so as not to step on any toes, and encourage hosting these calendars on a collaborative platform, such as a simple Google Doc. In the end, communication turns into collaboration, and that is where you will gain the most wins from your marketing efforts.

Link Building Team Members are Siloed

Just as marketing channels can be siloed, so can the different teams involved with link building. I mentioned above the different concentrations that should be represented in a complete link building team. Smaller teams tend to have no trouble including everyone in the strategy, as they all typically sit in the same room. But what if you have a large team? Maybe with offices in different locations?
It's easy to leave people out of the link building funnel; people figure the process will ship quicker if there are fewer chefs in the kitchen. However, this is the easiest way for a link building campaign to fail. Not every person on your team can be an expert in everything, so drawing on the experience and know-how that all team members have to offer will help you succeed.

How to Win:

Know each person's or team's core competencies. By knowing what each person or team can bring to the table, you'll be able to figure out where they fit into a particular link building campaign. The skills each team member brings include:
  • Link builder - This person spends all day trawling the Internet and trying to promote content, dealing with rejections, and taking part in observational learning along the way. He/she can tell you if the idea is promotable, as well as offer insights into how to reach different target niches.
  • Content person - This person spends all day writing. He/she will know how to use the written word to represent the data on your infographic in a comprehensible way, while capturing your brand's voice and style to ensure consistency.
  • Creative person - This person knows how to represent complex data in a beautiful, visual way. He/she will be able to tell you possible roadblocks of the infographic idea from a design perspective, or suggest better ways to represent the information.
  • Researcher - This person crunches data all day. He/she will be able to collect credible sources for your piece, ensuring that the information is statistically sound and that your link builders don't get eaten alive by critics.
  • PR person - This person speaks to the media all day. He/she knows what journalists look for and will be able to tell you if your piece will be promotable to high-value publications.
  • Project manager - This person is the glue that holds the process together. He/she should be responsible for making sure the teams communicate and collaborate.
  • Social media person - This person knows what is trending in the social sphere. He/she can help promote the piece and tell you if the idea will be shared. He/she can help push your piece viral.
Without the help from all of these people, you are much more likely to create a piece that:
  • Doesn't target a specific audience.
  • Targets an audience that is difficult to infiltrate.
  • Targets an audience that is too small, making the likelihood of success small.
  • Isn't statistically sound, making it likely that it will be ripped apart during outreach.
  • Is visually limited because your design team was incapable of executing the grand vision.
  • Doesn't take off in the social space.
Ensure your teams work collaboratively and communicate to create an inherent checks and balances system for creating winning link building campaigns.

Relationships are Left Out

I'm a big proponent of link building being renamed relationship building. The lines between traditional PR and link building are being blurred, and to get a link you need to bring it back to basics: networking. People do things for people they know. Bloggers and journalists get solicited to build links all the time, and you need to make sure you are among the people they come to know if you want that link.
The "relationship" part is left out because link builders are constantly pressured by numbers. Those who aren't in the trenches don't understand how much goes into link building. I would argue that with some link building campaigns, the back and forth with a prospect takes up to 75% of the time. That number is often lost when looking at metrics like number of hours worked versus number of successful link placements.

How to Win:

First, educate your team and manage expectations. If you are the boss, understand that link building should take time. iAcquire wrote an article on the effort required to build links and I don't think it's far off.
As far as measurement, make sure that the time spent negotiating is taken into account. If you have a large team and sense a problem among your link builders of not knowing when to let go (which I think is a big beginner pitfall), sit down with them and role play. Have them fill out a time log to capture back and forth if you have to. Host a link building hack day and get in the trenches to see for yourself just how much back and forth you'll do. No matter the method, make sure that this is taken into account and encourage relationship building. It's what makes link building scalable — as you can go back to the same people and be introduced to new contacts — so don't give up. Pretty soon they'll get to a place where they can manage a relationship quicker and more efficiently.

Everyone Forgets the End Goal

Disclaimer: this is a #tellmehowyoureallyfeel moment and is my #1 gripe with link building. I think the term has been bastardized - it is so overused that people forget where it came from in the first place. It's not about the number of links. Link building was a means created to increase conversions. Conversions should be what you care about. Not the number of links. Not even how well your SERP positioning improves. If you are increasing the number of links, improving your SERP positioning, and seeing more traffic, absolutely NONE of this matters if that traffic isn't converting.
Because of this misconception, I think a lot of SEOs will say a big part of their job is educating their clients/bosses. It's quite easy to get caught up in the minutiae; counting the number of links and watching rank changes are easy to hold onto because they are the easiest to see. These are great short-term goals, but they are not the end goal.
My co-worker Carson Ward said this well:
[Image: Carson Ward pull quote on link building]

How to Win:

Constantly remind yourself/your team of the end goal. Do this by measuring changes in conversions that are a result of the smaller wins. Especially while educating your clients/bosses, if they aren't constantly reminded of conversions, they will easily forget. This is where defining goals and managing expectations are super important.
Second, remember that link building — and SEO as a whole — is only part of the inbound marketing puzzle. The only reason people have an online business is to capture new customers on the web. SEO is only one way to do this, and link building is only one way SEO makes this happen. Know the big picture and understand how your efforts contribute to the grand vision.

Thursday, July 12, 2012

Google Panda Update 3.9 on 12th July 2012


[Image: Google Panda Update 3.9]

This would be version 3.9 and happening only a little over 2 weeks after the last update, Google Panda 3.8.

There are SEOs and webmasters claiming recoveries from previous Panda damage and other webmasters claiming major drops in their rankings. Because some webmasters who were hit by Panda are claiming recoveries, this seems to be a Panda related update and not a Penguin update.

To be honest, I am not sure if this is indeed a real update. There are a fair number of posts in the forums, but not at the levels typical of a Panda or Penguin refresh.

I will reach out to Google for a comment and update this post when I hear back.

Past Panda Updates:

Update: Google said there was no specific update.

Tuesday, July 10, 2012

Useful SEO Mozilla Firefox Add-ons


[Image: Firefox SEO add-ons]


Yes, I know there are many more great Firefox plugins, but I am sharing only those I use myself and find really useful.




Monday, July 9, 2012

Google Local Listing Optimization Tips


10 Ways to Improve Your Google Local Listing

What?

Ranking locally has become much more critical with the evolution of local listings across the web, such as Google Places, Bing BBP, Yahoo Local, and the various social media channels. Google Places seems to consistently drive more traffic than any other local channel, sometimes upwards of a 40% share of referring traffic for many of our clients.

Why?

Google has 65% market share at the moment, making Google Places the most important local listing to focus your attention on. This should be followed by Bing & Yahoo.

In October, Google updated their SERPs to showcase local listings alongside your organic result making your local listing that much more important. It’s now absolutely crucial to have your local listing 100% optimized and in sync with your on page/off page website strategy as well.

Google Places Listing

How?

So what can you do to improve your chances of ranking well locally? Below are the top 10 things you can do to improve your chances of ranking high and converting traffic from your Google local listings.
  1. On-page optimization: Use your actual geo-location in all on-page optimization, not the geo-location you would like to rank for. For example, if you are located in Sunnyvale, you should use “Sunnyvale hotel” and not “Santa Clara hotel”. It's fine to use “Hotel near Santa Clara”, but stating “Santa Clara hotel” would be neither accurate nor helpful to your local strategy.
  2. Local Submissions: Submit your website/business information to local data providers such as Acxiom, InfoUSA, and Localeze. These data providers feed your business information to other business information channels across the web, helping you build credibility. In the long run, local submissions help you gain more local relevance and trust with search engines and the public. It is critical to ensure the business information on your website is accurate and consistent across all of these sources. Incorrect business information will result in distrust from search engines and the public, and will negatively impact your local ranking and conversion.
  3. Claim Google Places listings:  The ability to optimize your listing is only possible once it is claimed and verified by you, the owner. This is a great way for you to control the information about your business.
  4. Optimize Your Local Listing Account to 100%:  Ensure each of the below buckets are optimized and completed.
    • Correct business information: Ensure name, address, and phone number  (NAP) on listing & website is correct.
    • Categories: this is how Google knows what type of business you are.
    • Hours of Operation
    • Photos
    • Videos
    • Additional Details
    • Update Place Page: Promote specials or events with specific calls to action
    • Google Reviews Management Response: Respond to Google reviews via your Google Places account.
  5. Encourage Online Reviews:  Online reviews play into the local ranking algorithm. It's not so much the number of reviews your hotel receives as their quality. Research which third-party review sites are important to your industry/location and target those first. Don't forget to encourage online reviews directly via your website as well; Google can now index your website reviews and pull them into your local listing.
  6. Respond to reviews:  Doing so can help build/improve your hotel’s online reputation as it shows that management places great importance on customer service and satisfaction. Not only will this result in higher trust, but it can also lead to return visitors/guests.
  7. Build citations from credible/authoritative sources:  Online citations for your business are the equivalent to online referrals for your website.  Citations are not links necessarily, but can be simple mentions of your NAP.  The more credible online citation sources you accumulate, the more likely it is that your Google local listing will rank well.
  8. Claim Hyperlocal listings:   It’s important to claim your Facebook Places, Foursquare, and Gowalla listings. Hyperlocal listings give you local credibility, allow you to gain local relevance, and help you build a stronger local community following.
  9. Ensure you cross-optimize and apply what works on your site to your Places Local Listing:
    • Promote successful packages
    • Promote most viewed/converting pages
    • Promote Social Media channels
  10. Google Boost: Participate in Google Boost to get some early traction to your local listing.  You cannot control which keywords you appear for but it’s a great way to gain some additional exposure.

Thursday, July 5, 2012

Your Social Media Profile Links Can Still Carry Weight Even with "nofollow"

How does nofollow work with the Social Graph API (rel="nofollow me")?

If you host user profiles and allow users to link to other profiles on the web, we encourage you to mark those links with the rel="me" microformat so that they can be made available through the Social Graph API. For example:

<a href="http://blog.example.com" rel="me">My blog</a>

However, because these links are user-generated and may sometimes point to untrusted pages, we recommend that these links be marked with nofollow. For example:

<a href="http://blog.example.com" rel="me nofollow">My blog</a>

With rel="me nofollow", Google will continue to treat the rel="nofollow" as expected for search purposes, such as not transferring PageRank. However, for the Social Graph API, we will count the rel="me" link even when included with a nofollow.

If you are able to verify ownership of a link using an identity technology such as OpenID or OAuth, however, you may choose to remove the nofollow link.

To prevent crawling of a rel="me nofollow" URL, you can use robots.txt. Standard robots.txt exclusion rules are respected by both Googlebot and the Social Graph API.

However, the following best practices for using these relationship attributes are still somewhat speculative:

rel="me"

Google now discounts reciprocal links, so they don't pass the usual link juice for SEO. Many people therefore think that using rel="me" can preserve value when they link between multiple sites they own. This tag can also be used for similar and related content.

rel="friend"

You can mark a link as a friend, which might influence how the link is treated if search engines choose to use the relationship. If you visit your public Facebook profile without logging in, you will see that your friends' profiles are linked with the rel="friend" tag. If your content depends on the content at other links (e.g., if you are reviewing a post), you can use this tag.

rel="external"

This tells the search engine that the link is neither a "friend" nor "me" (i.e., just a typical stranger), so it might carry a different value in the eyes of the search engine. But let me remind you that this is still speculation.

Monday, July 2, 2012

What is Pruning Links


Pruning Links

Once you have identified bad links pointing at your site, you need to start working to address them.
If you have 10,000 or more links to your site, this can seem like an insurmountable task – particularly if the people who acquired the bad links are no longer around to ask what they did.
Here are some things you can do to simplify the cleanup process.

Categorize Your Links

Start by pulling the link data. At a minimum, pull it from Google Webmaster Tools, because that is what Google is reporting.
If you can, it is also great to get link data from Open Site Explorer and Majestic SEO. Integrate all this data into one master list.
Combining your data (and de-duping the results) gives you the largest possible list of links, as individual link data sources can only provide a sampling of the links to your site.
Once you have your master list, it's time to start simplifying the task a bit.
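Combining the exports and de-duping the results can be sketched in a few lines of Python. This is illustrative only; the tool names and URLs are placeholders, and the lowercasing is a crude normalization, not a full URL canonicalizer:

```python
def merge_link_lists(*exports):
    """Combine linking URLs from several backlink exports (e.g. Google
    Webmaster Tools, Open Site Explorer, Majestic), dropping duplicates
    while keeping first-seen order."""
    seen = set()
    merged = []
    for export in exports:
        for url in export:
            url = url.strip().lower()  # crude normalization before de-duping
            if url not in seen:
                seen.add(url)
                merged.append(url)
    return merged

# Hypothetical exports from two tools:
gwt = ["http://a.com/page1", "http://b.com/links"]
ose = ["http://a.com/page1", "http://c.com/resources"]
master = merge_link_lists(gwt, ose)
print(len(master))  # 3 unique linking URLs
```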

1. Sort Your Links by the Linking URL

Group the links by domain. If a domain links to you from 834 pages, just check 1-3 at most – chances are that if the links are bad on any of the pages, they're bad on all of them.
This by itself is a huge simplification of the task. For example, look at these example link counts:

The blog has 201,048 links from 778 domains. So instead of checking more than 200,000 pages, you only need to check 778 domains. The workload seems a lot lighter already, doesn't it?
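The grouping step above can be sketched like this (the URLs are made up for illustration):

```python
from urllib.parse import urlparse
from collections import defaultdict

def group_by_domain(linking_urls):
    """Bucket linking URLs by host so each domain is reviewed once
    instead of once per linking page."""
    groups = defaultdict(list)
    for url in linking_urls:
        groups[urlparse(url).netloc.lower()].append(url)
    return dict(groups)

links = [
    "http://blog.example.com/post-1",
    "http://blog.example.com/post-2",
    "http://other.org/links.html",
]
by_domain = group_by_domain(links)
print(len(by_domain))  # 2 domains to review instead of 3 pages
```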

2. Separate Links From Blogs

This is hard to do without doing a little programming, but the required programming is easy.
Look for URLs that have "blog" as part of the URL, or load the pages and see if you can find strings such as "WordPress", "Movable Type", and other blog platform names on the page. This won't give you a complete list of blogs, but it will identify a lot of them for you.
You will need to look at these posts, but you can give the person doing the work simplified guidelines, such as telling them to look for:
  • Low quality posts.
  • In context links with rich anchor text.
  • Multiple links per post.
Also, if 1,100 blogs link to your site, and you look at 100 of them and see significant problems, you know you need to check them all (unfortunately!).
But, if you look at 200 or more and they are all clean, you might start to think you don't need to look at the rest. To be conservative, if you look at half of them, and they are all clean, chances are there was no bad blog campaign underway, and you can skip the rest and focus on looking at other problems.
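The "little programming" described above amounts to a simple heuristic. A minimal sketch, with a fingerprint list that is my own assumption rather than anything exhaustive:

```python
def looks_like_blog(url, page_html=""):
    """Rough heuristic from the steps above: 'blog' in the URL, or a
    known blog-platform fingerprint in the page source. Intentionally
    incomplete -- it flags many blogs, not all of them."""
    if "blog" in url.lower():
        return True
    fingerprints = ("wordpress", "movable type", "blogger", "typepad")
    html = page_html.lower()
    return any(f in html for f in fingerprints)

print(looks_like_blog("http://blog.example.com/post"))                # True
print(looks_like_blog("http://example.com/a", "<!-- WordPress -->"))  # True
print(looks_like_blog("http://example.com/a", "<html></html>"))       # False
```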

3. Look for Multiple Links Per Page

This is another hint of a possible problem. Not definitive, of course.
For example, the other day a guest post someone did on Forbes had six links to the author's site, including rich anchor text links in the body of the post. It just looked "off."
People who buy links tend to be a bit greedy. To them, one rich anchor text link is good, but several links are great – they want to get their money's worth.
The upside of this greed is it can make it easier for you to recognize potential bad links.
Tracking down pages with multiple links takes a little bit of programming, but it isn't too hard.
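Counting links to your domain on a fetched page can be done with a quick regex pass. This is a triage sketch, not a real HTML parser:

```python
import re

def count_links_to(page_html, your_domain):
    """Count href values on a page that point at your domain. Pages with
    several such links deserve a manual look."""
    hrefs = re.findall(r'href=["\']([^"\']+)["\']', page_html, re.I)
    return sum(1 for h in hrefs if your_domain.lower() in h.lower())

page = '''<a href="http://example.com/a">one</a>
<a href="http://example.com/b">two</a>
<a href="http://elsewhere.net/">other</a>'''
print(count_links_to(page, "example.com"))  # 2
```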

4. Look for Pages That Link to You With Rich Anchor Text

This is again not a definitive flag, but can focus where you look for trouble.
Consider the inverse rule too – if the only links on the page to you use your URL or business name (assuming that these aren't keyword packed), then chances are that the page in question isn't a problem.
Focus your energy on pages that smell like trouble.
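Flagging pages that link with keyword-rich anchors can be automated the same way. A minimal sketch, assuming you maintain your own list of target keywords:

```python
import re

def rich_anchor_links(page_html, keywords):
    """Return (href, anchor text) pairs whose anchor text contains one of
    the target keywords -- candidates for a closer manual look."""
    pairs = re.findall(
        r'<a[^>]+href=["\']([^"\']+)["\'][^>]*>(.*?)</a>',
        page_html, re.I | re.S)
    kws = [k.lower() for k in keywords]
    return [(href, text) for href, text in pairs
            if any(k in text.lower() for k in kws)]

page = ('<a href="http://example.com/">cheap sunnyvale hotel</a> '
        '<a href="http://example.com/">Example Inc.</a>')
print(rich_anchor_links(page, ["sunnyvale hotel"]))
# [('http://example.com/', 'cheap sunnyvale hotel')]
```

Note the inverse rule from above: anchors like "Example Inc." (the business name) fall through the filter, matching the intuition that branded links are rarely the problem.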

Sunday, July 1, 2012

How to Create Robots.txt and Use Wildcard and Dollar Pattern Matching


How to Create Robots.txt


The simplest robots.txt file uses two rules:
  • User-agent: the robot the following rule applies to
  • Disallow: the URL you want to block
These two lines are considered a single entry in the file. You can include as many entries as you want. You can include multiple Disallow lines and multiple user-agents in one entry.
Each section in the robots.txt file is separate and does not build upon previous sections. For example:
User-agent: *
Disallow: /folder1/

User-Agent: Googlebot
Disallow: /folder2/
In this example only the URLs matching /folder2/ would be disallowed for Googlebot.
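You can verify this "sections are separate" behavior with Python's standard-library robots.txt parser (which handles basic User-agent/Disallow rules, though not the Googlebot wildcard extensions covered later):

```python
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.parse("""\
User-agent: *
Disallow: /folder1/

User-agent: Googlebot
Disallow: /folder2/
""".splitlines())

# Googlebot matches its own section only, so /folder1/ stays crawlable:
print(rp.can_fetch("Googlebot", "http://www.example.com/folder1/page.html"))  # True
print(rp.can_fetch("Googlebot", "http://www.example.com/folder2/page.html"))  # False
# Any other bot falls back to the * section:
print(rp.can_fetch("OtherBot", "http://www.example.com/folder1/page.html"))   # False
```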

User-agents and bots

A user-agent is a specific search engine robot. The Web Robots Database lists many common bots. You can set an entry to apply to a specific bot (by listing the name) or you can set it to apply to all bots (by listing an asterisk). An entry that applies to all bots looks like this:
User-agent: *  
Google uses several different bots (user-agents). The bot we use for our web search is Googlebot. Our other bots like Googlebot-Mobile and Googlebot-Image follow rules you set up for Googlebot, but you can set up specific rules for these specific bots as well.

Blocking user-agents

The Disallow line lists the pages you want to block. You can list a specific URL or a pattern. The entry should begin with a forward slash (/).
  • To block the entire site, use a forward slash.
    Disallow: /
  • To block a directory and everything in it, follow the directory name with a forward slash.
    Disallow: /junk-directory/
  • To block a page, list the page.
    Disallow: /private_file.html
  • To remove a specific image from Google Images, add the following:
    User-agent: Googlebot-Image
    Disallow: /images/dogs.jpg
  • To remove all images on your site from Google Images:
    User-agent: Googlebot-Image
    Disallow: /
  • To block files of a specific file type (for example, .gif), use the following:
    User-agent: Googlebot
    Disallow: /*.gif$
  • To prevent pages on your site from being crawled, while still displaying AdSense ads on those pages, disallow all bots other than Mediapartners-Google. This keeps the pages from appearing in search results, but allows the Mediapartners-Google robot to analyze the pages to determine the ads to show. The Mediapartners-Google robot doesn't share pages with the other Google user-agents. For example:
    User-agent: *
    Disallow: /

    User-agent: Mediapartners-Google
    Allow: /
Note that the URL paths in directives are case-sensitive. For instance, Disallow: /junk_file.asp would block http://www.example.com/junk_file.asp, but would allow http://www.example.com/Junk_file.asp. Googlebot will ignore white-space (in particular, empty lines) and unknown directives in the robots.txt.
Googlebot supports submission of Sitemap files through the robots.txt file.

Pattern matching ( Wildcard and Dollar)

Googlebot (but not all search engines) respects some pattern matching.
  • To match a sequence of characters, use an asterisk (*). For instance, to block access to all subdirectories that begin with private:
    User-agent: Googlebot
    Disallow: /private*/
  • To block access to all URLs that include a question mark (?) (more specifically, any URL that begins with your domain name, followed by any string, followed by a question mark, followed by any string):
    User-agent: Googlebot
    Disallow: /*?
  • To specify matching the end of a URL, use $. For instance, to block any URLs that end with .xls:
    User-agent: Googlebot
    Disallow: /*.xls$
    You can use this pattern matching in combination with the Allow directive. For instance, if a ? indicates a session ID, you may want to exclude all URLs that contain them to ensure Googlebot doesn't crawl duplicate pages. But URLs that end with a ? may be the version of the page that you do want included. For this situation, you can set your robots.txt file as follows:
    User-agent: *
    Allow: /*?$
    Disallow: /*?
    The Disallow: /*? directive will block any URL that includes a ? (more specifically, it will block any URL that begins with your domain name, followed by any string, followed by a question mark, followed by any string).
    The Allow: /*?$ directive will allow any URL that ends in a ? (more specifically, it will allow any URL that begins with your domain name, followed by a string, followed by a ?, with no characters after the ?).
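To see how the * and $ wildcards behave, here is a small sketch of a Googlebot-style pattern matcher. It is my own approximation of the matching described above, not Google's actual implementation:

```python
import re

def rule_matches(pattern, path):
    """Check whether a robots.txt Disallow/Allow pattern matches a URL
    path, using Googlebot-style wildcards: '*' matches any run of
    characters, and a trailing '$' anchors the match to the URL's end.
    Without '$', the pattern matches as a prefix."""
    # Escape regex metacharacters, then restore the robots wildcards.
    regex = re.escape(pattern).replace(r"\*", ".*")
    if regex.endswith(r"\$"):
        regex = regex[:-2] + "$"
    return re.match(regex, path) is not None

# The Allow: /*?$ / Disallow: /*? example from above:
print(rule_matches("/*?$", "/page?"))            # True  (ends in ?)
print(rule_matches("/*?$", "/page?sessionid=5")) # False (? not at end)
print(rule_matches("/*?",  "/page?sessionid=5")) # True  (contains ?)
```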
Save your robots.txt file by downloading the file or copying the contents to a text file and saving as robots.txt. Save the file to the highest-level directory of your site. The robots.txt file must reside in the root of the domain and must be named "robots.txt". A robots.txt file located in a subdirectory isn't valid, as bots only check for this file in the root of the domain. For instance, http://www.example.com/robots.txt is a valid location, but http://www.example.com/mysite/robots.txt is not.