Tuesday, September 16, 2008

Black Hole SEO, Don’t Get Sucked In

Recently a number of well-known SEO blogs have been talking about Black Hole SEO. In this post I want to look at the points raised and explain why I don’t think it’s a viable strategy for most websites.

A black hole site is created when a tier-1 authority site stops linking out to other sites. If a reference is needed, the information is rewritten and a reference page is created within the black hole. All (or virtually all) external links on the site are made nofollow.
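Whether or not you adopt the tactic, it is worth knowing what a page's external links actually look like. The sketch below (Python standard library only; the hostnames and HTML are made up for illustration) counts followed versus nofollow external links in a chunk of HTML:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkAudit(HTMLParser):
    """Collects followed vs. nofollow external links on a page."""
    def __init__(self, site_host):
        super().__init__()
        self.site_host = site_host
        self.followed, self.nofollow = [], []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href", "")
        host = urlparse(href).netloc
        if not host or host == self.site_host:
            return  # relative or same-host link: internal, ignore
        if "nofollow" in attrs.get("rel", ""):
            self.nofollow.append(href)
        else:
            self.followed.append(href)

# Hypothetical page fragment
html = """
<a href="/about">About</a>
<a href="http://example.org/" rel="nofollow">Reference</a>
<a href="http://techcrunch.com/">TechCrunch</a>
"""
audit = LinkAudit("mysite.com")
audit.feed(html)
print(len(audit.followed), "followed,", len(audit.nofollow), "nofollow")
```

A site whose external links are overwhelmingly in the nofollow bucket is behaving like the black hole described above.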

Sites such as TechCrunch use the technique very intelligently: they link both internally and externally. This keeps bloggers and startups happy by giving them a live link, while still earning good rankings for sites in the network such as CrunchBase.

If you run an authority site, linking to sites in your own network rather than to external sites can sometimes be a good strategy. The problem arises when less experienced webmasters read about Black Hole SEO and assume it is a viable strategy for them too.

Unless your site is already a massive authority, cutting off your outbound links is a bad idea. As a webmaster, nothing annoys me more than people who reference me or my clients without linking. It’s just bad manners. I remember people who don’t link and make a point of not linking to them in the future. I’m sure some people go even further and bury their stories on social sites.

Most astronomical black holes form when a star collapses. If the star is more than 3 times the mass of the Sun it forms a black hole, otherwise it becomes a neutron star or white dwarf.

The point is that people don’t like Black Hole sites and they won’t want to link to them. So cutting off your outgoing links before you reach critical mass will inhibit your chances of actually achieving Black Hole status.

Google-Yahoo Deal Suspended


Google has suspended its advertising partnership with Yahoo after European regulators started investigating the deal.

Google and Yahoo! today said they would temporarily suspend the partnership until regulators from both the US and EU have time to scrutinize the deal.

The planned partnership would see the two search engine giants combine their search advertising systems, allowing Google to sell advertising on Yahoo! in return for a share of the profits.

Sunday, September 14, 2008

Google Joins Hands With NBC To Expand TV Ad Sales


New York -- Google Inc. and NBC Universal, a unit of General Electric Co., said Monday that they are teaming up to form a strategic multi-year advertising, research and technology partnership, under which the search-advertising giant will gain access to sell NBCU cable inventory through its Google TV Ads service, the companies announced in a joint statement.

The move could be seen as a major victory in Google’s quest to sell ad time in a more targeted fashion. It expands the Internet powerhouse’s effort to become a force in television advertising, and it has a TV network giving up some of its control over the ad-sales process.

“With the addition of NBC Universal inventory, advertisers using the Google TV Ads platform can reach NBCU Cable’s national audience and gain access to viewership data at an unprecedented scale,” NBC Universal and Google said in a statement.

This latest move would likely give the Mountain View, Calif.-based company a strong foothold in the business of television advertising.

In a joint statement late on Monday, the two companies announced the multi-year deal between the GE media unit and Google. The deal calls for the Internet search giant to employ its TV Ads platform to sell advertisements on some of NBCU’s cable networks, including the Sci Fi Channel, Oxygen, MSNBC, CNBC, Sleuth and Chiller, in the coming months.

Mike Pilot, president of NBC Universal sales and marketing, and Tim Armstrong, Google’s president of advertising and commerce for North America, said that the partnership would make TV ads more accountable.

Through an existing deal with DISH Network, the Google TV Ads service can report second-by-second TV usage data, allowing advertisers to measure viewership of their ads more precisely. The NBC-Google partnership builds on the data supplied by Dish set-top boxes in millions of U.S. homes.

“We are extremely pleased to join forces with Google on this effort, which will help us develop better accountability and [return-on-investment] metrics for our advertisers and attract an entirely new group of clients to television advertising,” Pilot said in a statement. “This is another step in our commitment to trying innovative advertising approaches and testing new technologies that can help benefit our clients.”

“The Google TV Ads platform is making television advertising more accountable and measurable and we are pleased with our progress to date,” Armstrong said in a statement. “Our partnership with NBCU will help us bring the power of television to a broader set of advertisers as well as give our current advertisers increased reach through our system.”

NBC Universal and Google also plan to adapt the Google TV Ads service for use in local TV markets, and are collaborating on custom marketing and research projects using Google TV Ads to survey audience trends.

NBCU said the deal could in future expand to other NBC Universal properties, including the top-rated cable network USA Network and NBC, the fourth-largest broadcaster.

“NBC Universal is a big win for us in terms of distribution growth,” said Michael Steib, director of Google TV Ads.

Google will have access to a small slice of the advertising inventory that brings in almost $6 billion a year for NBC Universal. Advertisers will be able to buy time on Sci Fi, which reaches 1.4 million viewers in prime time, as well as lower-rated cable channels.

The amount of inventory to be made available to Google is “fairly small by NBC Universal standards,” said Ed Swindler, executive vice president and chief operating officer for advertising sales at NBC Universal.

Getting “better metrics that are clearer, richer, deeper” is an “imperative” for NBC Universal, said Swindler. “And it does not stop with set-top-box data.”

Swindler said the amount of inventory may be adjusted “to make sure this test is successful.” He said the deal was significant regardless of scope.

“Any deal that allows us to change the model in favor of the advertiser to drive return on investment is a great deal,” he said.

US Advertiser Groups Fight Google-Yahoo Alliance


Los Angeles -- The Association of National Advertisers, a trade group representing some of the country’s biggest marketers, is rallying to oppose an advertising deal between Google Inc. and Yahoo Inc., as the Justice Department considers whether to go to court to block the agreement.

The association last week sent a letter to Assistant Attorney General Thomas Barnett, stating that “a Google-Yahoo partnership will control 90 percent of search advertising inventory,” the ANA, which represents major U.S. advertisers, said in a statement.

The letter further stated that the partnership “will probably diminish competition, increase concentration of market power, limit choices currently available and potentially raise prices to advertisers for high quality, affordable search advertising,” the statement said.

The group announced the letter on its Web site on Sunday. The agreement, announced in June, gives Web-search giant Google the right to sell search and other text ads on Yahoo sites, sharing the revenue with Yahoo.

Barnett could not be reached for comment on Sunday.

Staying independent and trying to boost its search revenues by outsourcing part of the advertising to Google would yield more for shareholders than an outright acquisition at the price Microsoft was suggesting, Yahoo’s board decided.

Although the alliance does not need official antitrust clearance, the two companies said they would delay implementing it for 100 days to allow the Department of Justice to study it. The voluntary delay was designed to reduce the risk that regulators would decide later on to challenge the relationship, which links the two biggest search advertising companies, as anti-competitive.

Whether the letter will influence federal antitrust regulators remains unclear, but it is considered a blow to Yahoo and Google because of the trade group’s high profile. Until now, big marketers have been reluctant to come out against the deal publicly because of Google’s growing power in the ad business.

Google spokesman Adam Kovacevich said “numerous advertisers have recognized that this agreement will help them better match their ads to users’ interests, and that ad prices will continue to be set by competitive auction.”

“While some have raised questions about the agreements’ potential impact on ad prices, advertisers care far more about getting a good return on their advertising dollar than they do about buying cheap ads that don't bring in customers, and this agreement will clearly help advertisers reach Yahoo users more efficiently,” Kovacevich said.

Yahoo said last night it was “disappointed with the ANA board’s position.” It said prices would be determined by advertiser demand-driven auctions, and the deal would help drive a “more robust” marketplace for Yahoo’s advertisers.


The Justice Department has been reviewing the deal for months, questioning some ad executives and advertisers about what it would mean for the advertising business.

As they weigh comments from outsiders, regulators often discount the views of competitors who complain about a deal, as Microsoft has done. They are likely, however, to listen closely to customers, in this case major advertisers, so the association’s letter could be a significant hurdle.

Microsoft and Michael Kassan, a longtime advertising and media executive who is now consulting for the company, have been lobbying Madison Avenue’s advertising and media-buying executives, as well as marketers, to oppose the Yahoo-Google alliance, according to ad executives. In testimony during House and Senate hearings about the deal, Microsoft general counsel Brad Smith argued that it would lead to fewer choices and higher prices for advertisers.

A spokesman for Microsoft declined to comment.

While some individual advertisers have hinted publicly at their own concerns, the trade association’s letter represents the first attack on the deal from a highly influential group of consumer companies.

The ANA’s board includes representatives from large advertisers like General Motors, Wal-Mart and Anheuser-Busch. Bob Liodice, ANA’s president, said the submission to antitrust regulators had been made after an analysis that included “input from the board’s members,” as well as discussion with Google and Yahoo.

Google and Yahoo combined sell more than 80% of U.S. search ads, which account for the largest part of the online-advertising business. Google alone has more than 70% of that business.

The factual and legal merits of the deal, and its potential anti-competitive effects, will determine what action the states take.

The idea for the partnership first arose during Microsoft’s unsolicited buyout bid for Yahoo, which put pressure on the Internet search pioneer to show it could be just as valuable as a stand-alone company. And although Microsoft has long since withdrawn its $33-a-share buyout bid for Yahoo, the two Internet search companies are continuing their efforts to move the partnership forward.

Friday, September 5, 2008

Beginner’s Guide to Linkbait

Linkbait is the act of adding content to a website with the aim of attracting links from other sites. The content can take a variety of different forms from a unique tool or a breaking news story to a well written article or controversial image.

Sometimes linkbait is intentional but quite often the best linkbait is conceived quite by accident.

My favourite piece of linkbait is the Adobe Acrobat Reader software. The download page has over 15 million links and is probably the most-linked internal page on the web today.

Creating your linkbait

The first step to coming up with a really cool tool, unique news story or article is to find out what people want to read about. It sounds simple, but following the breaking stories on the various social networking sites, as well as Technorati and the BlogStorm Tracker, will give you a unique insight into the week’s hot topics.

Another good tip is to search the social bookmarking sites for things related to your site that have been popular in the past. You might find a nice tool that was on Digg 2 years ago and be able to use it to inspire something more modern for your site.

A quick look at some popular tools in different areas is often a good source of inspiration. For example you might see a tool on a car/auto website for valuing used cars and be able to apply it to your real estate site.

Top 10 lists


Although top 10 lists are very popular on sites like Digg they are actually the least successful kind of link bait. Most of the reputable sites won’t bother linking to a top 10 list unless it is really amazing or offers new information. You could try using a top 10 list to build a short burst of traffic but don’t expect a link from Engadget in return.

Articles


Well-written articles or tutorials are an art form and are one of the hardest linkbait methods to pull off unless you are well known or an exceptional writer. An article needs to be very well written and perfectly targeted at your audience to succeed. If you want to try linkbaiting with quality articles, concentrate on writing content that will help people: content offering such great benefits that other bloggers will want to link to your post to help their readers.

News

Most new linkbaiters should start off by trying to break the latest news in their niche before anybody else. Done right, this is an easy way to build links from authority sites and social bookmarking sites. You should also benefit from lots of new RSS subscribers who want to get more breaking news from you in the future. Gain a reputation for being the first to break the news and you will dominate your niche. If you break a big enough story you will find all the bloggers in your niche desperate to post about it, and links will flow into the hundreds or thousands.

Apart from being first to write about it, the key to a breaking news article is making sure it is of exceptional quality and includes the sort of images other bloggers will want to use on their own sites. Your article needs to be the authority that others want to link to. If somebody else explains the story better or has more details, they will attract the links.

Finding news to write about can be very hard. Building relationships with larger companies is impossible for most bloggers so you will be reliant on subscribing to press releases and news feeds in most cases.

If you can be the second site to write about something and promote it in the right way you can often overtake the first site and become the “source” yourself. The best way to find breaking stories that have not yet become mainstream is to subscribe to a load of RSS aggregator feeds. Below are some of my favourites:
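Once you are subscribed, checking feeds programmatically is straightforward. This sketch parses a minimal sample RSS 2.0 feed with Python's standard library; against real feeds you would fetch each aggregator's URL with urllib.request and poll on a schedule (the feed content here is invented):

```python
import xml.etree.ElementTree as ET

# A minimal hypothetical feed; real aggregator feeds follow the same
# <channel>/<item> structure.
sample_feed = """<?xml version="1.0"?>
<rss version="2.0"><channel>
  <title>Niche News</title>
  <item><title>Big story breaks</title><link>http://example.com/a</link></item>
  <item><title>Second story</title><link>http://example.com/b</link></item>
</channel></rss>"""

root = ET.fromstring(sample_feed)
# Pull (title, link) for every item in the feed
items = [(i.findtext("title"), i.findtext("link"))
         for i in root.iter("item")]
for title, link in items:
    print(title, "->", link)
```

Diffing the item list between polls tells you the moment something new appears, which is exactly the head start a news linkbaiter needs.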

Tools


For anybody with a bit of imagination and a talented programmer, creating a tool for your site is the easiest way to linkbait. Make the tool useful and well presented and it’s very easy to link to. Unique and useful tools can often turn a commercial site that struggles to attract any natural links into a useful resource that even competitors will link to and use on a regular basis.

Common ideas include methods to help people find information about products, test their websites, improve their skills, generate images or content for blogs and social networking profiles. Some of the best tools build millions of links without the users even knowing they are part of some clever link bait scheme.

As with any type of link bait the presentation of the tool is very important. Although it sounds like a cliché making good use of AJAX is a great way to improve the linkability of your tool.

Get sued or sue somebody


This technique is only recommended for those with deep pockets or a postal address in an obscure country where you are unlikely to ever face court.

If you decide to take on an industry giant like Google then the links will flow nice and fast. Make sure you have a good case to maximise your links.

Presenting your link bait


Although successful linkbait will attract thousands of visitors, you should not expect the visitors to click on any Adsense adverts or buy any products from your site. The best way to make your site stand out and attract links is to remove all the adverts. Yes, you read that part right. Remove your adverts. You can put them back on in a month’s time if you like, but make sure the page is clean, well laid out, easy to read and ad-free.

The next part is very important: you need a selection of bookmark buttons at the bottom of your page. Use the icons from each site to form the buttons so they are familiar to your visitors. The goal is for readers arriving from sites like Digg and StumbleUpon to like your site and bookmark it at Del.icio.us and Reddit while they are visiting.

Design is another essential skill in a linkbaiter’s arsenal. The page needs to look great, not like just another Wordpress blog. Invest the time and money required to make your site look great. People who go the extra mile and design custom graphics for a particular article or blog post will get more links because of it. If you are really creative, adding an image that’s cool enough for other bloggers to use on their blogs while linking to you will make you more memorable and maximise the potential of your linkbait.

Thursday, September 4, 2008

Three Good Reasons To Target Long-Tail Keywords!

In professional terms, what we are talking about here is the concept of targeting so-called long-tail keywords.

Long-tail keywords are those three- and four-word phrases which are very, very specific to whatever you are selling. Whenever a customer uses a highly specific search phrase, they tend to be looking for exactly what they are going to buy. In virtually every case, such specific searches are far more likely to convert to sales than general generic searches, which tend to be geared toward the kind of research consumers do before making a buying decision.

To help illustrate this phenomenon, let's take a look at the typical step-by-step path a customer travels on the way to making a purchase.

1. Consumer becomes aware of a product.
2. Consumer seeks information about that product in preparation for possible purchase.
3. Consumer evaluates alternatives to product (features, pricing, etc...).
4. Consumer makes their purchase decision.
5. Consumer pulls out their credit card and completes the transaction.
6. Consumer then evaluates the product after buying it and decides if they want to keep or return it.

Using the above six step path to a purchase as our model, you can probably already see that you want to target the consumer who is somewhere around step 4...

Consumer makes their purchase decision.

...because once they have made their decision to buy something, that's when they start using very specific search phrases to seek out their target purchase.

Now for the GOOD news...

Highly specific multi-word phrases tend to be far easier to rank well for than the more generic single keyword or double keyword phrases.

Here's a specific example. Let's say your site sells guided mountain climbing tours in California. At first, you might consider targeting a generic phrase like travel. After all, an adventure tour is generally the type of excursion people like to participate in while traveling on vacation.

However, if you tried to go after that phrase, you'd be facing direct competition from big sites like Yahoo.com, CNN.com and Travelocity.com. It's unlikely you'd be able to knock any of those sites out of the top 10 unless you're willing to invest a pile of money and a mountain of time.

But, even more important, travel isn't the best phrase for you to target anyway. That's because many people who search using that phrase are looking for items such as plane tickets, ocean cruises or just doing very general research on where they might like to go. They're probably not saying to themselves...

"I'm looking for someone who sells guided tours for beginners to climb Mount Shasta so I can take my family on a fun trip this summer."

If they were, they'd be entering something different than travel.

Even if you were to target a more specific phrase like mountain climbing you'd still be up against heavy hitters like About.com, Wikipedia.org, and the USDA forest service. And, unless you sell everything related to mountain climbing for every mountain around the world, the traffic you'd get for that keyword isn't likely to convert to many sales.

So let's look at some of the keywords that are specific to what you're selling—keywords that you can start ranking for and generating traffic and sales right away.

Here are a few highly specific keyphrases that relate to customers who are much later in the buying cycle—at least at step 3, probably at step 4 and possibly step 5:

* california mountain climbing tours
* beginner mountain climbing in california
* guided mountain climbing tours
* mount shasta family climbing tours
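Phrase lists like this can be brainstormed systematically by combining modifiers, core terms and locations. A sketch using made-up seed lists for the climbing example (always sanity-check the output against real keyword research data):

```python
from itertools import product

# Hypothetical seed lists; swap in the modifiers, core terms and
# locations that fit your own offering.
modifiers = ["guided", "beginner", "family"]
cores = ["mountain climbing tours", "mount shasta climbing"]
locations = ["", "california"]  # "" means no location suffix

# Combine every modifier/core/location triple into a phrase,
# skipping empty parts, and de-duplicate.
phrases = sorted({" ".join(p for p in (m, c, loc) if p)
                  for m, c, loc in product(modifiers, cores, locations)})
for p in phrases:
    print(p)
```

The output is a candidate list, not a final one; feed it into a keyword research tool to see which combinations real searchers actually use.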

Of course, these are just a few examples; I'm sure you could think of many more. However, the point is twofold:

1. The long tail keywords are much easier to rank for.
2. People who search by using long tail keywords are far more likely to become buyers!

More Good News...

Of course this suggests that you should be creating pages that zero in on snagging searchers who use long-tail keywords. And, since there are potentially so many different long-tail combinations that searchers may use to buy what you offer, that means you'll likely be creating more pages.

Well, the good news is that Google likes sites that have more pages. It makes the site look more substantial, more natural, and even more real in the eyes of the world's most popular search engine. Bear in mind that your "unique" pages need only be variants of your main offering(s), each focused on a specific long-tail niche.

Therefore, each and every page will have a unique title, meta description tag, H1 header tag, and body content that emphasizes your offering using the long-tail keyword you choose for that specific page. It isn't rocket science, but it sure does work well to snag consumers at the optimum stage of the buying process!
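As a rough sketch of that templating idea (the brand name and copy below are placeholders for illustration, not a recommendation for your actual pages):

```python
def page_meta(keyword, brand="Shasta Tours"):
    """Build a unique title, meta description, and H1 for one
    long-tail landing page. Brand and copy are placeholders."""
    title = f"{keyword.title()} | {brand}"
    description = (f"Looking for {keyword}? {brand} offers guided trips "
                   f"for all skill levels. Book your trip today.")
    h1 = keyword.title()
    return {"title": title, "description": description, "h1": h1}

meta = page_meta("california mountain climbing tours")
print(meta["title"])
print(meta["description"])
```

Each long-tail keyword then gets its own page whose title, description and H1 all carry that exact phrase; only the body copy needs human attention.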

So, instead of focusing on just two or three highly competitive general keywords, target the dozens or even hundreds of easy-to-rank-for long tail keywords.

Also bear in mind, however, that the downside of focusing too much effort on the long tail is, if you target phrases which are too specific, you might not get enough traffic to sustain your business. That's why it's best to have:

* a few pages sending you large amounts of less-targeted traffic, and
* a large number of pages with each sending you small amounts of highly targeted traffic.

But overall, it's best to think of it this way: would you rather rank for one keyword that sent you 1,000 visitors a day, or for 200 keyphrases, half of which sent you one buyer a day?

After you do the math you'll see that 100 buyers a day are much better than 1,000 site visitors who are only doing research. And there is no question that the use of ultra-specific keywords demonstrates greater intent to buy on the part of the customer. That simply leads to more sales, which is, of course, what you are really after.
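The arithmetic is worth spelling out. Assuming, purely for illustration, that only 0.5% of the research-stage visitors ever buy:

```python
# Head-term scenario: 1,000 research-stage visitors a day; the 0.5%
# conversion rate is an assumed figure for illustration only.
head_buyers = 1000 * 0.005   # buyers per day from the head term

# Long-tail scenario from the text: 200 keyphrases, half of which
# each send one buyer a day.
tail_buyers = 200 // 2       # buyers per day from the long tail

print("head:", head_buyers, "buyers/day; tail:", tail_buyers, "buyers/day")
```

Even if the assumed head-term conversion rate were several times higher, the long-tail scenario still wins by a wide margin.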

There's Just No Substitute for Research

In the end there is no substitute for doing your keyword research and determining which keywords have enough traffic to make them worth going after. And this effort must be dovetailed with doing your competitive intelligence research to determine which keywords you'll be able to rank for based on the sites you'd have to compete against.

Of course, Wordtracker is the hands-down best place to find a huge list of related keywords as well as learn how much traffic each is likely to provide your site. For many of the sites we manage, at least half of our customer traffic comes from these longer, more specific phrases—and such traffic tends to convert at a much higher level than generic 1 or 2-word keyword phrases.

So, now you have the tools it takes to get...

1. easier rankings
2. higher sales conversions
3. and many more pages indexed in Google

...all of which will certainly lead to a much more profitable bottom line!

Tuesday, September 2, 2008

How to SEO Your Site in Less Than 60 Minutes

I’ve been a bad blogger. I’m swamped at work and have been distracted outside of work, and I’ve been trying to get by here on SBS with list links and even some of my best Flickr photos. I can’t remember the last time I posted something helpful / educational. My bad….

Let me take a stab at making things better.

I often get asked to review a web site and give quick feedback on the site’s SEO. The issue: Is the site doing well, or in desperate need of SEO help? To answer those questions, I’ve developed a speedy system to go through a site and take a quick SEO snapshot. I’m going to give that system away here. On a smaller site, this should take about 20 minutes. Even on the biggest sites, it’s never taken me more than an hour.
SEO Your Site in Less Than an Hour

A. Visit the home page, www.domain.com.

1. Does it redirect to some other URL? If so, that’s bad.
2. Review the Page Title. Does it use relevant, primary keywords? Is it formatted correctly?
3. Review site navigation:
* Format — text or image? image map? javascript? drop-downs? Text is best.
* Page URLs — look at URL structure, path names, file names. How long are URLs? How far away from the root are they? Are they separated by dashes or underscores?
* Are keywords used appropriately in text links or image alt tags?
4. Review home page content:
* Adequate and appropriate amount of text?
* Appropriate keyword usage?
* Is there a sitemap?
* Do a select-all (Ctrl-A / Cmd-A) to find any hidden text.
* Check PageRank via SearchStatus plugin for Firefox
5. View source code:
* Check meta description (length, keyword usage, relevance).
* Check meta keywords (relevance, stuffing).
* Look for anything unusual/spammy (keywords in noscript, H1s in javascript, etc.).
* If javascript or drop-down navigation, make sure it’s crawlable.
* Sometimes I cut and paste code into Dreamweaver to get a better look at the code-to-page relationship.
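Steps A2 and the meta checks in A5 can be partly automated. This sketch (standard library only; the sample page and the 65/155-character limits are illustrative rules of thumb, not official limits) extracts the title and meta description from a page's source and flags obvious problems:

```python
from html.parser import HTMLParser

class HeadAudit(HTMLParser):
    """Pulls the title and meta description out of a page's <head>."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.description = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and attrs.get("name", "").lower() == "description":
            self.description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

# Hypothetical home page source
page = """<html><head>
<title>Guided Mount Shasta Climbing Tours</title>
<meta name="description" content="Family-friendly guided climbs.">
</head><body></body></html>"""

audit = HeadAudit()
audit.feed(page)
problems = []
if not audit.title or len(audit.title) > 65:
    problems.append("title missing or too long")
if not audit.description or len(audit.description) > 155:
    problems.append("meta description missing or too long")
print(audit.title, problems)
```

The keyword-relevance judgments in A2 and A5 still need a human eye; this only catches the mechanical failures.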

B. Analyze robots.txt file. See what’s being blocked and what’s not. Make sure it’s written correctly.
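Python's standard library can do the mechanical part of this check. The sketch below parses a sample robots.txt body directly; against a live site you would point the parser at the real file with `set_url("http://www.domain.com/robots.txt")` and `read()` instead (the rules shown are made up):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt body
rules = """User-agent: *
Disallow: /admin/
Disallow: /search
"""
rp = RobotFileParser()
rp.parse(rules.splitlines())

# Spot-check a few representative URLs against the rules
for path in ("/", "/products/widget", "/admin/login", "/search?q=seo"):
    print(path, rp.can_fetch("*", path))
```

If a page you expect to rank comes back `False` here, you have found a crawlability problem before ever opening a browser.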

C. Check for www and non-www domains — i.e., canonicalization issues. Only one should resolve; the other should redirect.
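The www/non-www check reduces to: one host answers 200, the other 301-redirects to it. A small sketch of that decision logic (the hostnames and response tuples are placeholders; in practice you would fetch each variant and record its status code and Location header):

```python
def canonical_ok(responses, preferred="www.domain.com"):
    """Given {host: (status, location)} for the two hostname variants,
    check that the preferred host returns 200 and the other
    301-redirects to it. Returns a list of issues (empty = OK)."""
    issues = []
    for host, (status, location) in responses.items():
        if host == preferred:
            if status != 200:
                issues.append(f"{host} should return 200, got {status}")
        elif status != 301 or preferred not in (location or ""):
            issues.append(f"{host} should 301 to {preferred}")
    return issues

good = {"www.domain.com": (200, None),
        "domain.com": (301, "http://www.domain.com/")}
bad = {"www.domain.com": (200, None),
       "domain.com": (200, None)}   # both resolve: duplicate content
print(canonical_ok(good), canonical_ok(bad))
```

The `bad` case above is the classic canonicalization failure: both hostnames serve the same content, splitting links and rankings between them.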

D. Look at the sitemap (if one exists).

1. Check keyword usage in anchor text.
2. How many links?
3. Are all important (category, sub-category, etc.) pages listed?

E. Visit two category/1st-level pages.

Repeat A1, A2, A3, A4, and A5 - this will be quicker since many objects (header, footer, menus) will be the same. In particular, look for unique page text, unique meta tags, correct use of H1s, H2s to structure content.

Check for appropriate PageRank flow. Also look at how they link back to the home page. Is index.html or default.php appended to the link? It shouldn’t be.

F. Visit two product/2nd-level pages.

Same steps as E.

Also, if the site sells common products, find 2-3 other sites selling same exact items and compare product pages. Are all sites using the same product descriptions? Unique content is best.

G. Do a site:domain.com search in all 3 main engines.

Compare pages indexed between the three. Is the number of pages indexed unusually high or low based on what you saw in the sitemap and site navigation? This may help identify crawlability issues. Is one engine showing substantially more or fewer pages than the others? Double-check the robots.txt file if needed.
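One way to make the "substantially more or fewer" judgment repeatable is to compare each engine's count to the median of all three. A sketch, with made-up counts:

```python
def index_count_flags(counts, tolerance=0.5):
    """Flag engines whose reported site: count strays more than
    `tolerance` (as a fraction) from the median across engines."""
    values = sorted(counts.values())
    median = values[len(values) // 2]
    return [engine for engine, n in counts.items()
            if median and abs(n - median) / median > tolerance]

# Hypothetical site: results from the three engines
counts = {"Google": 1200, "Yahoo": 1100, "MSN": 150}
print(index_count_flags(counts))
```

A flagged engine does not prove a problem by itself, but it tells you which engine's crawl of the site deserves a closer look against robots.txt and the sitemap.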

H. Use Aaron’s SEO for Firefox extension to look at link counts in Yahoo and MSN. If not in a rush, do the actual link count searches manually on Yahoo Site Explorer and MSN to confirm.

…..END…..

That’s what I do when making a quick SEO site analysis. Important: This is for identifying problems, not fixing them. And it doesn’t replace a real and complete SEO analysis. (There are several shortcomings, for example. Here’s one: Steps E and F assume that all category pages across the site will be the same, and that all product pages will be the same. This is not always the case, so you may miss problems/issues that a real, deeper analysis would reveal.)

Questions for you:

- What other flaws do you find with this?
- What other steps do you take when doing a quick SEO analysis?
