4 Mistakes to Avoid in Your Local SEO Campaign


Although there are a lot of articles on the subject, local search marketing can be quite tricky. What I would like to share in this article are 4 things that, while fairly simple, tend to be overlooked by many business owners or their search engine marketing teams.


1. Local Data Aggregators

TripAdvisor, Google+, Yelp. Are you optimizing your business listings correctly?

No? You should!

Yes? You shouldn’t stop there!

With Google+ having a strong influence in the search engines, I don't think I even have to say why your listing there should be optimized (okay fine, I just did!). People use Yelp and TripAdvisor all the time, so search crawlers are hanging out there too to see what's up with your business.

However, some businesses just stop here – and they shouldn't! Google doesn't take citations into account from just these three places; it crawls thousands of websites for them.

Make sure to submit your data on websites that have dedicated "local" pages. A good place to start is with these 4 influential data aggregators: Neustar Localeze, InfoUSA, Acxiom, and Factual.

Why bother? Well, here is a good way to think of it: if your business is in it for the long run and is "legit", you will want it to be found in as many places as possible. This includes offline and online spots. A "dodgy" overnight "business" might run a few PPC or CPM ads, but a legitimate business will take the time to build its presence in other ways too. And if hundreds of local aggregators mention you, chances are your business is legit and valuable.

2. 10 Stores = 10 Pages, or Something Like That

This is advice straight from Matt Cutts. Literally, he has a post on his blog titled: “SEO Advice: Make a web page for each store location”.

It’s not a long article, but for those of you who just want the main takeaway, here it is:

Don't hide your store location information behind a form or a POST request. Create a unique, easily crawlable URL for each store. Also have an HTML sitemap of the stores' web pages if you can. He also acknowledges that this might not be ideal for ALL businesses, so he proposes this solution as well:

“If you have a relatively small number of stores, you could have a single page that links to all your stores. If you have a lot of stores, you could have a web page for each (say) state that links to all stores in that state.”
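The second option he describes can be sketched as a simple HTML sitemap page. The store names and URLs below are made up for illustration:

```html
<!-- Hypothetical HTML sitemap: one crawlable URL per store, grouped by state -->
<h1>Our Store Locations</h1>
<h2>California</h2>
<ul>
  <li><a href="/stores/ca/los-angeles">Los Angeles</a></li>
  <li><a href="/stores/ca/san-diego">San Diego</a></li>
</ul>
<h2>Texas</h2>
<ul>
  <li><a href="/stores/tx/austin">Austin</a></li>
</ul>
```

Each store page then gets its own plain, crawlable URL that search engines can reach by following ordinary links.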

Oh, and if you really want to maximize your chances of ranking higher, then by all means, optimize these pages for mobile too. It should not be a surprise that people are very likely to look up your store location on the go.


An Unconventional Way to Building Links: Using Images


Guest posting. Article directories. Press releases. What do they all have in common? They all use words to land you links. But focusing on text-based link building can make you miss out on one of the most powerful link building strategies on the planet: image link building. From infographics to icons, quality images can translate into powerful backlinks.


How to Use Images for Link Building

1. Offer Premium Graphics to Other Bloggers in your Industry

This link building strategy is both easy and rewarding. Create, or purchase the rights to use and distribute, quality images that you can barter with other bloggers in exchange for a link back to your site.

Don't go around spamming other bloggers. Take the time to find blogs that could use those images – for example, a blog that always has nice images in its posts, or one with really good information but few visuals.

2. Reverse Image Search

Sometimes other webmasters will love an original image that you've put on your site – so much so that they will feature it on their own. Alas, some of them forget to link back to you. To make sure you get due credit (or at least a link) for your hard work, head over to Google image search and click the little camera button in the search field. Paste the URL of the original image on your site and click "Search by Image." Google will show you the pages in its index that contain the same image.

Visit each page in the results and look for any that don't include an attribution link. The idea isn't to be threatening, but to thank the webmasters for posting your work and remind them to link back to you.

3. Infographics

Infographics are a link builder’s Trojan horse. Armed with a quality infographic, you can get links from authority powerhouses like NYTimes.com and Business Insider.

Of course, for this to happen, you’ll have to do a bit of marketing first.

One of the best ways to get some traction for your infographic is to submit it to free infographic directories.

Other promotion methods include the classic emails, social promotion or paid advertising.

4. Memes

Memes are super-easy to create and have the potential to go viral if you can be creative.

To create a good meme:

– Create your meme using Photoshop or a free tool like Quick Meme.

– Once the meme is ready, share it on your social accounts.

– Use your meme in your content distribution campaign (in a blog post or an ebook).


How to Make Your Website SEO-Friendly | A Developer’s Guide


It's hard to find technical SEO information on the web. This cheat sheet aims to help web developers new to search engine optimization improve their sites' interaction with both search engines and users. The SEO best practices outlined below apply to sites of all sizes and types and will make your website easier to crawl and index. After all, search engine optimization often boils down to making small modifications to different parts of your website. When combined with other optimizations, these changes improve your site's user experience and subsequently its performance in the organic search results.


1. The Importance of Unique and Accurate Page Titles

For each page on your website, indicate the page title with a title tag. A title tag tells both users of your site and the search engines what topic the page covers. Ideally, you need a unique title for each page. The title for your homepage can be your business name and include other important bits of information, such as a physical location or a few of your main business focuses.

SEO Best Practices:

– Choose a title that accurately describes the topic of the page’s content.

– Each page should have a unique title tag.

– Use brief but informative titles.
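Putting these practices together, a homepage title tag might look like the snippet below (the business name and details are made up for illustration):

```html
<head>
  <!-- Example only: business name plus location and main focus -->
  <title>Acme Bakery | Fresh Bread &amp; Pastries in Brooklyn, NY</title>
</head>
```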

2. A Page’s Meta Description Tag

A page's meta description tag summarizes what the page is about. It can be one sentence long or a short paragraph. Like the title tag, the meta description tag goes in the <head> of your HTML document. Meta description tags are important because Google often uses them as snippets for your pages.

SEO Best Practices:

– Use a description that accurately summarizes the content on a page.

– Use a unique description for each page.
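For example, a meta description for the same page might look like this (the business and copy are made up for illustration):

```html
<head>
  <!-- Example only: a one- or two-sentence summary of the page -->
  <meta name="description" content="Acme Bakery bakes fresh bread and pastries daily in Brooklyn, NY. Order online or visit our shop.">
</head>
```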

3. Mind the Structure of Your URLs

Organizing the documents on your website under descriptive categories and file names keeps your website tidy and helps Google crawl the documents better. It also creates simple-to-understand, user-friendly URLs. A URL that contains relevant words provides users and search engines with information about the page, unlike one made of oddly named parameters. Also remember that the URL of a document is displayed as part of a Google search result.

SEO Best Practices:

– Use URLs with words and not session IDs and other unfriendly parameters.

– It's advisable to use hyphens to separate words, as the search engines treat them as word separators, e.g. homes-for-sale.html

– Use a simple directory structure to organize the content on your site.

– Provide only one URL to access a specific document to avoid splitting the reputation of that content.
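As a small illustration of word-based, hyphen-separated URLs, here is a sketch of a slug generator (a hypothetical helper, not any official tooling):

```python
import re

def slugify(title: str) -> str:
    """Turn a page title into a crawler-friendly, hyphen-separated URL slug."""
    slug = title.lower()
    # Collapse any run of non-alphanumeric characters into a single hyphen
    slug = re.sub(r"[^a-z0-9]+", "-", slug)
    return slug.strip("-")

print(slugify("Homes for Sale!"))  # homes-for-sale
```

A CMS would typically apply something like this to page titles so every URL reads as plain words rather than parameters.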

4. Navigation is Important

A website's navigation is important for two reasons: it helps visitors find their way around the website, and it helps the search engines understand which content the webmaster considers most important. Plan the navigation of your website around your home or "root" page, the starting point of navigation for many visitors.

SEO Best Practices:

– Develop a naturally flowing hierarchy for your content from general content to increasingly more specific documents.

– Control most of the navigation on your site through text links. Avoid navigation based entirely on JavaScript drop-down menus – you're alright if the dropdown menu is CSS-based and can be navigated by the search engines.

– You need an HTML site map page and an XML Sitemap file.

– You need a custom 404 page for the occasional user who lands on a page of your site that does not exist.
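The XML Sitemap file mentioned above follows the sitemaps.org protocol; a minimal example (with placeholder URLs) looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
  </url>
  <url>
    <loc>https://www.example.com/homes-for-sale.html</loc>
  </url>
</urlset>
```

The HTML site map page serves your human visitors, while this XML file is submitted to the search engines so they can discover every URL you want crawled.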


3 Ways to Make Your Content Smarter For Your Audience


In 2013, everyone was talking about the significance of content marketing in search marketing and the digital media business. For a start, let's see what content marketing entails. In the broader scheme of things, content marketing is the practice of creating content and then driving consumers to it in an effort to get people to talk about it, share it, or use it in some way that drives your business's sales.


The key to successful content marketing, it seems, is to make the process scalable without compromising organic authenticity. Organic viability is what best distinguishes good content from marketing noise. Truth be told, content has always been one of the most important aspects of marketing a business online, but in this era of mobile devices and multimedia channels of communication, including the social sites, the way people consume content has changed. They now consume content through their eyes, ears and hearts.

So What Distinguishes Branded Content from Noise?

1. Knowing your Consumers

This is the "who" of a content marketing strategy. If you are marketing a business online to increase its reach and impact, you want to know who your consumers are so you can create content targeted at a specific audience. In general, content is created with two types of audiences in mind. First, there are people who already know you and consume your content. Then there are the people you want to influence. Effective content marketing depends on audience research, planning and measurement.

– Audience research means gathering information about your target audience to build a kind of database. The aim is to create buyer personas based on the data you collect. These are the people you want to influence with your content.

– With your buyer personas in place, the next thing is comparing your actual audience with your buyer personas. To that end, there are a number of tools you can use to collect demographic and social audience data. These include Google Analytics, comScore and BlueKai.

– The last step is the analysis and optimization of the content marketing strategy based on audience data.

2. Engagement

Equipped with a firm understanding of what your target audience expects from you, you can now use that knowledge to create the kind of content that reaches and impacts that audience. The best content marketers are captivating storytellers. People love stories, and when used as part of a content marketing strategy, stories can help marketers build trust with their audience and deliver information that sticks.

At the same time, you need to understand that storytelling cannot be rushed or faked. This type of content cannot be automated. It is a kind of art that generates real-time engagement and feedback.

To help you create smarter content, use the data you collected and web analytics to understand what's popular with your consumers. Do they prefer infographics to written content? Do they like product reviews? Do your product reviews get better engagement in video form?


A Simple Guide to robots.txt


Crawlers, robots, user agents; it really sounds like a bunch of geeks picked those terms in an attempt to confuse the less tech-savvy users.


However, once you understand a few basic terms, a few commands, and how to access your root directory, it all becomes pretty clear.

That is the purpose of this guide: to make the famous robots.txt file easy to comprehend.

What is robots.txt?

To begin with, it’s a simple text file. A text file is simply a file that just contains plain text (no formatting: no bolding, no italicizing, no underlining, no highlighting etc.).

You can create a robots.txt file using a free text editor like Notepad (Windows) or TextWrangler (Mac).

What Does it Do?

Basically, it serves as a set of instructions for search engine crawlers and robots, telling them which directories or files they are not allowed to crawl, index, and display in the search engine results pages.

You may be wondering why you would prevent the search engines from accessing some parts of your website. Here is an example: let's say you are on a tight budget and decide to go with a cheap host that only gives you limited resources (like bandwidth). You host all your pictures in a folder named "images". Stopping the crawlers from accessing the images folder is one way to decrease bandwidth usage.

The Anatomy of a robots.txt File

There are two main instructions that you need to know about:

1. User-agent:

2. Disallow:


The User-agent defines the bot that you are targeting. If you are targeting Googlebot, then the line would look like this:

User-agent: Googlebot

If you want to target ALL the bots, then just use the wildcard *, like below:

User-agent: *


The Disallow line defines the parts of your site that you don't want the robots to crawl. For example, if you want them to stay away from your "images" folder, the line would look like this:

Disallow: /images/

Now that you understand how it works, let's see how the file itself should look if, say, you want to block every bot from crawling your "pvt" folder:

User-agent: *

Disallow: /pvt/


Want to block multiple folders? No problem! Say you want to block crawling of pvt, images, downloads, media, and audio. Here is how it's done:

User-agent: *

Disallow: /pvt/

Disallow: /images/

Disallow: /downloads/

Disallow: /media/

Disallow: /audio/

Google's crawler is called Googlebot. It understands a few more commands than most other bots. Here is one that might come in handy: Allow. This directive lets you allow Googlebot to crawl a folder, or a page within a folder, that you otherwise asked it not to crawl. The example below should help you understand:

User-agent: Googlebot

Disallow: /pvt/

Allow: /pvt/coolpage.html

The only thing Googlebot is allowed to crawl in the "pvt" folder is the "coolpage.html" file; it will skip all the rest.
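As a quick sanity check, you can test rules like these with Python's standard-library `urllib.robotparser` (a sketch, reusing the folder names from the example above). One caveat: Python's parser honors the first matching rule, while Googlebot picks the most specific one, so placing the Allow line first gives the same result under both interpretations:

```python
from urllib.robotparser import RobotFileParser

# The rules from the example above, with Allow listed first so that
# first-match parsers (like Python's) agree with Googlebot's
# most-specific-match behavior.
rules = """\
User-agent: Googlebot
Allow: /pvt/coolpage.html
Disallow: /pvt/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

print(parser.can_fetch("Googlebot", "/pvt/coolpage.html"))  # True  (explicitly allowed)
print(parser.can_fetch("Googlebot", "/pvt/secret.html"))    # False (blocked by Disallow)
```

Running a check like this before you upload the file is a cheap way to avoid accidentally blocking pages you want crawled.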
