Monday, May 7, 2007

5 Article Marketing Secrets

If you are new to using article marketing to boost traffic to your website or blog, there are just five things you should do immediately. Best of all, these five things will not cost you a penny!

Article marketing is probably the most effective link building strategy in existence. Article directories allow you to include a resource box with every article you submit. Inside this resource box, you include a link to your website. If you submit just one article to 100 article directories and it is approved for publication, you will immediately receive 100 back links to your website. There are thousands of article directories on the World Wide Web!

Now, if you submit, say, 5 articles to a hundred directories, you may get 500 back links easily! How? Again, by including the link to your website or blog in the Author Box at the end of your articles like I have done below. The best part about this strategy is that quite a number of article directories have high page ranks (PR). We’re talking about PR4, PR5, and on occasion, PR7 websites! Having a link to your own website in a website which has a high page rank would give a significant boost to your search engine standing!

There are many questions about this tactic of submitting articles to directories. Here are my answers to the top 5 questions.

1. If I submit the same article to many article directories, will it violate a search engine’s policy against duplicate content?

The simple answer is - No - but see #2 (below) for a loophole. Since we’re dealing with high ranking websites, the search engines will assume that the submitted content is your original work. This is why most article directory editors are quite confident about the articles they display. The problem they are really trying to avoid concerns the credibility of their services. They don’t want to be known as a repository of junk content, which is why they have outlawed the posting of PLR (private label rights) articles as well as articles loaded with affiliate links.

PLR articles are those you buy that were written by someone other than you. These are not a good way to go because the authors are selling the same articles to hundreds of other people. Using PLR articles will create a "duplicate content" problem for you with both the search engines and with the Directory editors.

2. Should I submit articles which I have already posted in my own web pages?

If the article was posted on your website or blog first, the answer is "Yes, absolutely!". If you submit your articles to directories first, the search engines may assume that the place where the articles first appeared, the article directory, is the originator of the content. In that case, your articles could be flagged as "duplicate content". This could lead to the de-indexing of your articles. It will not, however, get your website de-indexed. The key is, you want your articles to appear on your website or blog first and be indexed first as originating with you. Wait a few days and then submit them to directories.

3. Can I submit as many articles as I want to as many article directories as possible?

Absolutely, that's what I do and I do it often! This is very much advised as a fast way to build up your link popularity and secure a great page rank for your website or blog. I have only found one directory so far that only allows one submission per day. I have not found any others that have any limits whatsoever.

Bear in mind that search engines are quite wary of any website that suddenly experiences a great increase in the number of its back links. It is advised that when you submit articles to the article directories, wait a few days between each submission. There is one exception to this. When I began submitting articles I had 10 that I submitted on the same day to get caught up, so to speak. After that I limited my submissions to one a day to each directory.

4. My page views are still quite low even after submitting my articles. Does article marketing really work?

Yes, article marketing does work! If your page views remain at low levels after submitting many articles, ask yourself, or someone you trust, the following:

a) Is the subject of your article interesting enough? Does it cater to a wide audience?

b) Is your title enticing enough to merit a reading of your entire work?

c) Is your article informative enough to merit recommendation to others?

d) Is your article readable enough to give your audience an easy time in digesting the information you are sharing?

I almost always ask my wife or my best friend for their advice on my writing. I have no ego to bruise. I want my articles to be readable and helpful. So ASK! If you get - "No" - to any of these, you have found the factors that contribute to your low page views. Doing some editing to correct the problems you have identified can dramatically increase the number of times your article is viewed.

I have some articles that were viewed more than 30 times in just a few days through just one Article Directory! Multiply that by about 300 and you can see the impact. Many of the readers clicked my Author Box link and visited my blogs for more Free Resources on a particular topic. I pride myself on providing lots of Free stuff on my blogs and you should too.

Remember, some article directories attract only a few visitors so viewing will also be low. If you are writing on a highly specialized topic, low page views would not be unusual.

The key is, your articles are published and your back link in your Author Box is pointing to your website or blog and that helps your page rank even on a low viewed directory.

5. Can I be sure that my articles will be kept intact by people who would use them in their own websites or eZines?

You can never be sure. This is why you have to run your own check from time to time. Search for unique phrases in your article and click on the results. Determine if the webmaster published your Author Box links. If he did not, you could ask him to include them. If you get no response, contact the article directory editors so that they themselves can act on the matter. Vigilance is the key. It is rare but it does happen.

A positive side of directories is that the editors will recognize "duplicate content" when a thief tries to submit your article under his own name. The article will be rejected. That leaves him with placing it on his own blog or website, and you can find that out by doing the search described above.

Do these 5 things and watch your traffic increase.

Lastly, finding Article Directories to submit to is simple - Google "Article Directories" and you will get a huge list. Or, do what I do. I use software that submits the articles with a few button clicks (see my blog). This has saved me hours upon hours of tedious hand-submission because the software has over 350 of the best directories built in so you do not have to search for them.

Keep in mind, submitting articles takes work, tons of work when submitting to one directory at a time. Check out various software like I did. The best software does cost a few bucks so do not expect to get off cheap. There is free or low cost software but "you get what you pay for".

Articles, written by you, are still the fastest and cheapest (Free!) way to get more traffic and higher search engine rankings.

Source:
Jim DeSantis

Thursday, May 3, 2007

Difference between buying backlinks

I'm sure that some of you invest in buying one-way backlinks to your website or blog. Along the way you will notice prices with a big range. You can get a link on one PR5 site for $10 per month, while on another PR5 site it would cost $40 per month. Most people think that the second site is just overpriced and its owner doesn't know the market too well. Those who think that are totally wrong; that price fits exactly the package that is included when you get your link onto that webmaster's site.

Difference

Many of you should know that the amount of traffic you receive to your site has nothing to do with your PageRank. PageRank is decided 100% by the quality of your backlinks: the more and better quality backlinks you have, the higher your PageRank will be. When you pick that $10 offer, all you're getting is a PR5 backlink, nothing else. That person will most likely have very little traffic to his site, so you're only paying the 10 bucks for a backlink.
Now, the webmaster who is offering to put up a backlink to your site for $40 per month will likely have a lot better traffic than the one with the $10 price. So you're not only getting a backlink, you're also getting additional traffic from his site.

Whichever way you look at it, I would always choose the sites that have both traffic and a high PageRank.

PPI/PPC/Affiliate - Optimization

When I'm browsing through websites, I commonly notice how a large number of webmasters choose to use their website as an extra source of income. That is great for them, but most of them share a common problem. The three most common programs used to generate revenue from their sites are:

PPI - Get paid per 1,000 impressions
PPC - Get paid for each click
Affiliate - Get paid for each sale made through your site

These three programs center around one thing: the percentage of targeted visitors to your site. Targeted visitors are visitors who have an interest in your niche. Non-targeted traffic consists of visitors who have no interest in your niche.

Problem

The problem is that most webmasters use the less efficient program for their targeted traffic percentage. Let me tell you a real life story. I know a person who receives a heavy 9k unique targeted visitors per day, all organic from the search engines, but he uses a PPC program. PPC wouldn't be that bad, but if he used an affiliate program, his earnings per day would increase three or four times. Since this traffic is so well targeted, his conversion rate would be extremely high.
Below is a scale that you should pay attention to when deciding which program to use:

Targeted Visitors / Program
0%-25% / PPI
26%-50% / PPC
51%-100% / Affiliate
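Under the assumptions of the scale above, the decision rule can be sketched as a small helper function (a hypothetical example; the cutoffs are the ones from the table, not an official rule):

```python
def pick_program(targeted_pct):
    """Suggest a revenue program based on the share of targeted visitors.

    Cutoffs follow the scale above:
    0-25% -> PPI, 26-50% -> PPC, 51-100% -> Affiliate.
    """
    if not 0 <= targeted_pct <= 100:
        raise ValueError("percentage must be between 0 and 100")
    if targeted_pct <= 25:
        return "PPI"        # mostly untargeted traffic: sell impressions
    if targeted_pct <= 50:
        return "PPC"        # mixed traffic: get paid per click
    return "Affiliate"      # well-targeted traffic: conversions pay best

print(pick_program(10))   # PPI
print(pick_program(40))   # PPC
print(pick_program(90))   # Affiliate
```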

By following that scale, you can optimize your earnings from your website. Some webmasters even use all three of these programs, though this isn't always recommended, since your site quality will decrease and it may irritate some of your visitors.

Wednesday, May 2, 2007

Changed!

As some of my return visitors may have noticed, I have now changed my blog template completely because the last one was dull and not that user friendly. This week I will be putting my effort into adding different links and information to the template to make it more professional.

Thanks,
Martin




Tuesday, May 1, 2007

Using MyBlogLog could get you banned from Adsense

I wrote yesterday about the sorry tale of the now Yahoo!-owned MyBlogLog banning Jeremy Schoemaker following Jeremy’s exposure of serious security issues with the service, but there is more from Jeremy today: MyBlogLog Tracks Your Visitors Ad Clicks.

Shortform: MyBlogLog is not only tracking visits from other MyBlogLog members to your site (if you’re using the service) they are also tracking clicks on your ads, INCLUDING Adsense ads.

I can’t speak for other users, but certainly when I signed up for the service I don’t remember anything being clearly mentioned about MyBlogLog spying on the ad clicks on my site.

Certainly, other 3rd party trackers do track this sort of behavior. I use Adlogger on a number of sites to track my clicks and to protect against click fraud, but notably I choose to do so, and I run my own copy locally so that only I have control over the data it gathers. Services such as Alexa, Google Toolbar (both of which I use via the searchstatus plugin) and Compete send data on my surfing habits back to their respective companies, but they send MY data back, not the data of visitors to my sites, and I OPT IN TO those services, knowing full well the implications of doing so. MyBlogLog, on the other hand, is tracking my visitors (which I accept) but also their activity on each site (ad clicks), which I don’t accept. Notably, I’ve also read recently (don’t have the link at hand) that prior to Yahoo’s acquisition of the company, MyBlogLog was trying to shop these stats to the highest bidder, so the privacy of the data is questionable as well.

Are you using Adsense and MyBlogLog together? Remember the rule about not being able to disclose your CTR to third parties? MyBlogLog is gathering exactly this data on your site if you’re using the service. Using Adsense with MyBlogLog could get you banned from Adsense!

As of 5 minutes ago I’ve removed MyBlogLog from my own personal blog. My Adsense account is too valuable to risk, and I’m outraged that MyBlogLog was stealing this data from my site. If you haven’t done so yet, I’d encourage everyone using the service to take it down immediately. For all its benefits (it’s still a fun idea), the risks far outweigh them.


By Duncan Riley on 02.23.07 | Syndicate 901am

Monday, April 30, 2007

Origins and Optimization of High Paying Keywords

Many webmasters build sites in a high paying niche, but most of them don't understand why it's a top paying keyword, and the other half aren't optimizing their high paying niche sites.


Reasons for high paying keywords

For webmasters in niches related to drawings, arcade games, music, etc., you're most probably wondering why your CPC is so low. It's irritating to read about those webmasters earning $2-$10 per click while you're staring at your 2 cent average CPC. Anyone you ask gives that broad answer: "It's because of the competition in your niche." Well guess what, that's not the full answer, and it cannot help you in any way.

Let me show you an example that reveals the problem:

Low CPC keyword: "Injury"
High CPC keyword: "Injury Lawyer"

Where does the difference lie? Let's say you're typing the keyword "Injury". What are you mostly looking for? Most probably pictures, stories, how to cure an injury; practically anything for your entertainment. Now let's head over to the other keyword, "Injury Lawyer". Why would you search for that keyword? You're most likely trying to find a lawyer to contact; there is no fun in searching "Injury Lawyer" unless you want to make a case. What does that mean? It means that the conversion rate for "Injury Lawyer" is high. That's the whole answer! The higher the rate of conversion on a specific keyword, the higher the CPC will be for that keyword.

Optimizing Your High Paying Keywords

This is exactly where the traffic tip of regularly updating your site's content applies. The more pages of unique information you have containing your keyword in the text, the more high CPC ads you will have on your site (increase in pages = increase in ads). You want to make sure that you use terms at the end of your keywords inside your text, so that you will get a high conversion if your keyword plus that term appears as an ad. You also want to have a wide range of different high CPC ads on your site covering the same niche; the only way to do that is to keep posting on your site or blog, so new advertiser ads can appear on your site.


Sunday, April 29, 2007

Social Bookmarking Networks are killing your CPC

Social bookmarking is the new "thing" right now. Over 90% of webmasters use it to generate traffic to their sites. Some even receive 80% of their total traffic from Social Bookmarking Networks. But few of them realize that they are harming their CPC (Cost Per Click) if they're using Google Adsense on their website.


The Truth

If you're one of those people with a low CTR (Click Through Rate) and low CPC, then take another look at your tracker to see where your traffic is coming from. If you're receiving heavy traffic from Social Bookmarking Networks and other Traffic Exchanges, then you've found your problem. Everyone knows that Social Bookmarking Networks bring a low CTR to your site, but what everyone doesn't know is that because of your low CTR, your CPC is getting reduced!

Google wants to give its high paying advertisers good service; they want to put the advertisers' ads on publisher sites that have a high CTR so their advertisers can receive quality traffic quickly. If your CTR is under 1%, then I'm sure that your average CPC is 3 cents or less.

This is a classic example of the phrase "the rich get richer and the poor get poorer". Google loves to send MFA ads, ClickBank ads, and smart-priced ads to your low CTR site. Google doesn't care much for your service; you bring them such a low CTR that they have little use for you, so they just bomb you with all the smart-priced ads. Google wants to provide the high paying ads to publisher sites that have a high CTR, knowing that those publisher sites can move ads very quickly. They want to please these high CTR sites with higher CPC ads, so the webmasters of those publisher sites stay happy and loyal to Google Adsense.

How much traffic should you receive from Social Bookmarking Networks?

It's always good to know how much traffic you're allowed to have from Social Bookmarking Networks before it affects your CPC negatively. The percentage you should be aiming for is 25%. The only time you should ever have a percentage of 50 is if your other 50% of traffic is all organic, meaning that the other half has to be all from search engines. Traffic from search engines is always the best; the webmasters with high search engine rankings always have the best CTR and CPC.
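That rule of thumb can be sketched as a quick check (a hypothetical helper; the 25% and 50% cutoffs are the guideline from this post, not anything official):

```python
def social_traffic_ok(social_visits, organic_visits, other_visits):
    """Check this post's rule of thumb: keep social bookmarking traffic
    at or below 25% of the total, or up to 50% if ALL of the remaining
    traffic is organic (search engine) traffic."""
    total = social_visits + organic_visits + other_visits
    if total == 0:
        return True
    share = social_visits / total
    if share <= 0.25:
        return True
    return share <= 0.50 and other_visits == 0

print(social_traffic_ok(200, 800, 0))    # True  (20% social)
print(social_traffic_ok(500, 500, 0))    # True  (50% social, rest organic)
print(social_traffic_ok(500, 300, 200))  # False (50% social, rest mixed)
```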

How to trick the system

There is always a way around anything, and the same applies here. Unfortunately you won't be able to apply this to your current website, but you can make a new website to apply it to. As some of you know, there are high paying keywords out there, most of them with a CPC of $1 at their lowest. What you can do is create a website in a high paying niche. Then, even if you receive a low CTR, which results in the reduction of your CPC, you will still be averaging a nice buck or so per click. You don't have to worry about using your high paying websites with Social Bookmarking Networks if your goals for them are modest.




Saturday, April 28, 2007

Optimizing your Pagerank!

0.15 + (0.85 x (a share of the PR of every webpage that links to it)) = Your webpage PR



The "share of PR of every webpage that links to it" is calculated by taking the current Page Rank of the webpage and dividing it by the total number of links on the page...


So, by removing ALL unnecessary links on a page, you will be able to pass more Page Rank to the other pages that you link to... Why is this important? Remember, the higher the Page Rank a page has, the better it will rank.

So, what are unnecessary links? Unnecessary links could be when you link to another website (link partner) and they don't link back to you. By linking out to a "link partner", you're giving them a percentage of your Page Rank that you could be passing to other pages within YOUR website. If this "link partner" isn't linking back to your website, there is no need to ever link to them. You're just hurting your ranking by doing so!

I have seen websites move up hundreds of positions by simply looking at their links pages and making sure that their link partners are all linking back. If they're not, then you should ALWAYS remove them from your links page. In doing so, you are guaranteed to benefit from this.
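A toy sketch of that share calculation (the PR value here is purely illustrative; Google's internal PageRank values are not public):

```python
def pr_share_per_link(page_pr, num_outbound_links):
    """Each outbound link passes page_pr / num_outbound_links,
    per the formula quoted above."""
    return page_pr / num_outbound_links

# A hypothetical links page with PR 4 and 10 outbound links...
before = pr_share_per_link(4.0, 10)   # 0.4 of PR passed per link
# ...after removing 5 non-reciprocal "link partner" links:
after = pr_share_per_link(4.0, 5)     # 0.8 of PR passed per link
print(before, after)
```

Fewer outbound links means each remaining link, including links to your own internal pages, carries a larger share.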

Friday, April 27, 2007

Quality Marketing Blog




Networkb4umarket

These days, it's been very difficult to find people who are able to actually write unique content that stays interesting and is informative. The webmaster of the blog (Dee) really stays down to earth with her posts. Not only that, she covers so many topics involving marketing that there is always a good read on there. On her blog, you won't see any duplicated content or dead phrases without elaboration; she really breaks everything down and can teach anyone something new. So what are you waiting for? Click her preview image above to check out her blog; you will be surprised with what it has in store for you.

Example Post:

When is just as important as What

"We all agree that email marketing is one of the most powerful methods of Internet marketing. And we all know that what's in your email is very important in order to get the results you desire, but did you know that when you send your email is just as important as what's in your email?

Tests have shown that there's one day of the week that gets high sales and another that gets the most clicks. Some tests have indicated that Wednesday was the best day to send out emails and get sales, but more tests have shown that it is actually the worst day, and that Friday outperforms all other days by at least 23%.

The summarized results of the tests are:
- Emails sent on Friday result in at least 23% more sales.
- Emails sent out on Tuesday get the worst clicks or views.
- Emails sent on Wednesday get the worst sales.
- Thursday is the best day to send out emails to get raw clicks.
[Source of info is Marketing Manuscript by Mark Joyner (aff)]

I haven't personally tested this, but I have noticed that my blog gets the highest traffic on Tuesday (as opposed to emails which get the worst clicks)… maybe one thing has to do with the other? People would be too busy browsing to check their emails?! I say this with a funny tone but it might be right!

If you would like to test those results yourself, you can use a free ad tracking service like AdTracker.biz to track the clicks of your links in real time. Tracking ads is very important; it allows you to test which email is doing better than another, and it puts the thought of "is someone actually clicking on my links?" to rest. "



Page Rank Formula!

Formula

0.15 + 0.85 x ( a share of the PR of every webpage that links to it) = Your Webpage PR




0.15 = The lowest PR a webpage could ever possibly have.

0.85 = A dampening factor... This is just the number that Google uses for their PR formula.

"share" = the PR of the page linking to you divided by the total number of links on that page.

Example:

Say we've created our website and it has 3 pages in total. For sake of making the math easy,
we'll start each webpage with a page rank of 1.
  • Page A
  • Page B
  • Page C

None of the pages link to each other. They're just pure content webpages with no links to any
other pages within our website. Let's quickly calculate the PR of our pages by using the formula
above.

0.15 + (0.85 x 0 ) = .15

Because no webpages link to our pages, there is no "share of PR"... hence it is 0.
So... If we do the math, we come up with a PR of .15 for each of our pages.

Now, let's see what happens if we link page A to page B.


Using our formula we get:

  • Page A : 0.15 + (.85 x 0) = .15
  • Page B : 0.15 + (.85 x 1/1) = 1
  • Page C : 0.15 + (.85 x 0) = .15
You can see that by just linking page A to page B, we've changed the PR of page B from .15
all the way up to 1. Big difference!
With our newly calculated Page Ranks for all of our pages, let's do the calculation again. By
doing so, we'll slowly get to the "true page rank" of all of our pages... The more iterations we
do, the more gradual the change will be between each iteration.

  • Page A : 0.15 + (.85 x 0) = .15
  • Page B : 0.15 + (.85 x .15/1) = .2775
  • Page C : 0.15 + (.85 x 0) = .15
You can see that after another iteration of our formula, the PR of page B dropped
significantly... This number is closer to the actual Page Rank that Google would give page B.

Now, let's try something different and link all pages to and from one another...



  • Page A : 0.15 + (.85 x (1/2 + 1/2)) = 1
  • Page B : 0.15 + (.85 x (1/2 + 1/2)) = 1
  • Page C : 0.15 + (.85 x (1/2 + 1/2)) = 1
After 1 iteration the Page Rank of all pages is 1. (Each page now receives a half share from both of the other pages, so the share term is 1/2 + 1/2 = 1.)
So, by linking ALL pages together we have maximized the page rank from within our website!
You can play around with your linking throughout your website and by doing so, you can send
more PR to pages you want to rank higher for, and lowering the PR of those pages you don't
care much about their ranking :-)
This can be extremely powerful.
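The iterations above can be sketched in a few lines of code. This is a minimal toy implementation of the simplified formula, reproducing the Page A to Page B example:

```python
def iterate_pagerank(links, ranks, d=0.85):
    """One iteration of the simplified formula used above:
    PR(p) = 0.15 + d * sum of (PR(q) / outlinks(q)) over every
    page q that links to p."""
    return {
        p: 0.15 + d * sum(ranks[q] / len(links[q])
                          for q in links if p in links[q])
        for p in ranks
    }

# Page A links to page B; page C stands alone (the first example above).
links = {"A": ["B"], "B": [], "C": []}
ranks = {"A": 1.0, "B": 1.0, "C": 1.0}

ranks = iterate_pagerank(links, ranks)
print({p: round(v, 4) for p, v in ranks.items()})
# {'A': 0.15, 'B': 1.0, 'C': 0.15}

ranks = iterate_pagerank(links, ranks)
print({p: round(v, 4) for p, v in ranks.items()})
# {'A': 0.15, 'B': 0.2775, 'C': 0.15}
```

Repeating the call keeps refining the values toward the "true" Page Rank, just as the text describes.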

Free Article Submitter! (Fully Free)

As some of you might have read in my last post about the free article submitter, I messed up and didn't know that it was a trial version. The one I have now is not a trial; it is the full version, but since it is free, it's not as quick as the trial one. With this one, you must create your accounts on the article directories manually. When it opens up the browser on an article directory, it fills in the login information, and you must press the login button yourself. Then it goes to the next page, where you must click "submit an article", and again the program will fill in all the article content you already added into it. So there is a lot of manual work, but it will still make you about 2x faster. What I recommend is to add the article directories that you know to the program, ones with a PR greater than 4, and stick with around 20-25 of those. The program already has around 100 article directories installed, but their PRs aren't that good. Anyway, just give it a try: Article-Submitter-Software

Thursday, April 26, 2007

Google's Giant Sandbox

What is the Sandbox?

Before we get too far into an explanation as to what Google's sandbox is, it must be noted that not everyone even agrees that the sandbox exists. The sandbox is actually nothing more than a theory developed to explain what many different SEO experts have witnessed with their listings. Whether or not the sandbox really exists is actually irrelevant when we know that the effects of the sandbox exist.

Google's sandbox is a relatively new filter that appears to have been put in place back in March of 2004, after the widely publicized Florida update and the implementation of what is known as the Austin update. If you are not sure what those are, there is no need to worry, as those updates are now for the most part in the past. The sandbox filter seems to affect nearly all new websites, placing them on an initial "probation" status. The effect is that new websites may get into Google's SERPs (search engine results pages) relatively quickly and may even perform well for a couple of weeks. When the filter is applied to a new website, the site is referred to as being put in the "sandbox". The new website will still show in the result pages, but it will not rank well regardless of how much original, well optimized content and how many quality inbound links the site may have. The filter restrains new websites from having immediate success in the search engine result pages.

The sandbox filter seems to affect almost all new websites, with very few exceptions. It is important to note that the filter is not a punishment for anything the webmaster did with their new website. The filter is merely an initiation period for new websites.

The sandbox filter also affects sites driven by competitive keywords more than sites that target less competitive keywords. If your website focuses on very competitive keywords, you are likely to remain in the sandbox longer than if you focus on relatively non-competitive keywords.

Why Does the Sandbox Exist?

There is a lot of debate as to whether the sandbox filter is a good thing for Google to implement or not. Obviously webmasters who are trying to get their sites well positioned in Google do not like the sandbox filter as it prevents them from receiving the huge levels of traffic that a top listing in Google can bring. The filter was not implemented at random, however, and there is some good reasoning for the filter existing.

As the SEO community figured out the basic elements of Google's ranking algorithm, inbound links, original content rich with keywords, and the proper use of anchor text, search engine spammers began to take advantage of these elements. Search engine spammers would setup websites that were in clear violation of Google's policies with the knowledge that eventually their website would be banned from the listings. This, however, did not matter. If a search engine spammer could get their website to rank well in Google for even one month, the profits they could make from that one month would justify the cost of building the site in the first place. All they needed to do in the future was to rebuild their spam websites with different domains and slightly different content. The idea for spammers was a simple one. Capitalize off of Google's traffic for as long as they can (before they get banned), then do it all over again with a new website. The method was extremely effective and easy to implement.

What made this all the more easy to accomplish was Google's extremely fast indexing. While other search engines would take several months to index a new website, Google could index a website in as little as one month (they are now indexing sites within a few days). Search engine spammers were living large off of Google's generosity.

To solve this problem, Google determined that it would compromise. It would still index websites quickly, attempting to get as much new, fresh content out to the general public as possible, but it would not trust new websites implicitly as it had in the past. All new websites would be put on probation. As time passed, and as a site continued to pass the spam filters, it would no longer be held back from performing well in the rankings. Eventually, after quite a bit of time had passed, a site would be allowed to "leave" the sandbox and join the rest of the established websites.

How Does This Affect My Website?

If you have a new website, there is a good chance that you will be placed in the sandbox. This should be expected, but it should not change the way you build your website or market it. You should use the sandbox filter to your advantage.

Google still ranks websites in much the same way that they had in the past. Websites are judged on the quality of their inbound links and the quality of their content. Google will continue to change how they evaluate inbound links and content, but the basic elements of their rankings will remain the same.

While your website is in the sandbox, you should use this time to build your traffic using regular traffic building methods such as writing articles, building a strong community of visitors, and partnering with websites that offer some synergy to your visitors. During your time on probation, you have an excellent opportunity to build all the elements that cause websites to perform well in the search engines. When you finally do leave the sandbox, your website should be very well positioned within Google.

Is My Website in the Sandbox?

When webmasters learn about the sandbox filter, their first question is always whether or not their website has been placed in it. Determining whether or not you are in the sandbox is a relatively easy task to do. First, being placed in the sandbox is different than having your website banned.

If you do a search for your domain in Google and they return zero results for your website (and you had been previously listed in Google), there is a chance that you have been banned. One of the best ways to determine if you have been banned is to look at your log files to see if Google is visiting your website. Banned websites typically do not see Google visit their websites, regardless of who is linking to them.
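On a typical Apache setup, checking your log files for Googlebot can be as simple as grepping for its user agent (the log path and sample entries below are illustrative; adjust for your server):

```shell
# Create a tiny sample access log for demonstration; on a real server
# you would point grep at your actual log, e.g. /var/log/apache2/access.log
cat > access.log <<'EOF'
66.249.66.1 - - [26/Apr/2007:10:00:00 +0000] "GET / HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
10.0.0.5 - - [26/Apr/2007:10:01:00 +0000] "GET /about.html HTTP/1.1" 200 1024 "-" "Mozilla/4.0"
EOF

# Count Googlebot visits; zero hits over a long period is a bad sign
grep -c "Googlebot" access.log
```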

If you have not been banned, but do not rank well with Google, you should look at the quality of your content and the quality of your inbound links. You should also see if you rank well for non-competitive keywords. Remember how the filter affects competitive keywords more than less competitive keywords? Well, you can use this to determine if you have been sandboxed. Finally, if you rank well in all the other major search engines but do not show up at all in Google's rankings, you have probably been sandboxed.

Is There A Way to Get Out of the Sandbox?

The quick answer to this is yes, there is a way out of the sandbox, but you will not like the answer. The answer is to simply wait. The sandbox filter is not a permanent filter and is only intended to reduce search engine spam. It is not intended to hold people back from succeeding. So eventually, if you continue to build your site as it should be built, you will leave the sandbox and join the other established websites.

Again, if your website has been placed in the sandbox you should use this time to your advantage. It is a great opportunity to build your traffic sources outside of the search engines. If you have a website that does well in the search engines, you may be tempted to ignore other proven methods of traffic building such as building a community, or building strong inbound links through partnerships. However, if you establish traffic sources outside of search engines, when you finally leave the sandbox, you will see a welcome increase in your traffic levels.

Conclusion

Google has been going to great lengths to cut down on search engine spam. Some have faulted them for the lengths they are going to, claiming that it is affecting legitimate sites as well as spam websites. While this is probably the case, as a website owner you need to place yourself in Google's position and ask what they are really looking for in a website. Google is looking for websites that offer quality content. Google still relies on the natural voting system that was first used to establish PageRank. They may change the way they qualify content or inbound links, but the basic elements of a quality website will always remain the same.

No website owner in their right mind will "like" Google's sandbox. However, a smart website owner will use the sandbox as an opportunity to build a website that Google simply cannot refuse.


About the Author:
Mark Doust is the owner of Site-Reference.com, which features articles that focus on Internet Marketing, Website Development, and Search Engines.

Secret Exposed: How Google Ranks Websites!

Now you will be exposed to 7 types of variables that are used in ranking your website.

Links

Links have always played a huge role in determining citation value. Incoming links help judge the value of a website: the more citations and links, the more important and valuable it must be. But Google adds some criteria to those citations.

In the past, the sheer number of incoming links scored high, but then a judgment on the quality of each incoming link was added to the mix, forcing webmasters to also pay attention to the quality of those links. If the linking page and site had a high PageRank value itself, then clearly it knew a good thing when it linked to you. Still, this link quality aspect became harder to define as so many sites joined the web and quality became diluted.

Historical factors now play an important role in addition to the number and quality of incoming links. It seems Google's method includes noting the moment a new site is discovered and applying an "aging process" to it. Google monitors each link as it changes over time, the speed at which the site adds incoming links, and the life span of each link. It isn't about having thousands of sites that link to yours, but about building those thousands of links over time.

The “aging process” that monitors the history of the links and site helps to combat spam sites. Spam sites tend to come and go very quickly, building links fast through their spamming techniques, then closing down and moving on. Thus, the older the site and the links coming into it, the more “points” the site may get. The shorter the life of the site domain, no matter how many millions of links are coming in, the less Google is interested.

Google monitors the historical value and the slow building of value of incoming links, and they also monitor the changes in the link anchor text over time and throughout the site.

Consistent link anchor text scores low; this is considered "anchor spam". What this means is that if you use the same text in a link, such as perfume sales, consistently throughout your site, it won't score very high. If you vary the link anchor text, especially over time and not just within the same page, your odds will improve as Google monitors the changes in links over time. Generally, it is recommended to rotate the keywords in your anchor text among your top 5-10 keywords, to maintain consistency with keyword rankings and link rankings.

So the perfume sales site might include link text such as:

  • cologne market
  • perfume market
  • cologne sales
  • fragrance sales
  • sales of fragrance

This changes the whole landscape. The idea of link exchanges and link spam as a method to attract Google’s search engine bots just doesn’t work. Age before linkage.
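As an illustration, a short script can show whether one anchor phrase dominates your inbound links. The "over half" threshold is my own made-up heuristic, not anything from Google's patent:

```python
from collections import Counter

def anchor_text_profile(anchors):
    """Return each anchor text's share of all inbound-link anchors."""
    counts = Counter(a.strip().lower() for a in anchors)
    total = sum(counts.values())
    return {text: count / total for text, count in counts.items()}

links = ["perfume sales", "cologne market", "perfume market",
         "fragrance sales", "perfume sales"]
profile = anchor_text_profile(links)

# If any single phrase dominates (say, more than half of all anchors),
# the link profile starts to look like "anchor spam".
print(max(profile.values()))  # → 0.4
```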

WordPress Blogroll Tip


The WordPress Links Manager allows you to set your blogroll links to change randomly with each page view. If you have a huge list of links in your blogroll, consider setting this to random so the links will change, appearing less like link-exchange spam. Check out the other options in Links Manager for showing only recently updated links and other features. Also note that getting links from documents that have no content, just links, won't work; links without content won't score high.

Now, does this mean that if you link to a page and they link back to you, your score will go down and it might be considered link spam? No. The other criteria come into effect to help offset these normal linking techniques. But it does impact the concept of blogrolls, which are sometimes considered link exchange lists. So choose your link exchanges wisely and avoid hundreds of links from your site to others, or being on a list of hundreds of links.

Domain Age

One of the other criteria is the age of the domain. Again, driven by spam sites which pop up and die off quickly, the age of a domain is usually a clue as to whether a site is in it for the long haul.

This causes some problems. If you change your domain name, then are you back at the bottom of the barrel? Well, maybe not. If the rest of the criteria stays the same and your content maintains consistency, as does your traffic and incoming links, then this might just be a temporary drop and the rise will happen again soon.

Many hosts offer special rates for long term hosting and domain registration. Consider registering your domain for at least two years, five is better. This means you need to make sure that the domain name you choose is one you can live with for two to five years. You can change hosts, but the domain registration needs to stay the same, and stay in your name over the long haul to score points with Google.

Click Through Rates

The click-through rate (CTR) of your site may play an important role in adding up good points on your Google Search Engine scorecard. The CTR is the rate at which people click "through" to your site. Referrer statistics record the numbers and methods visitors use to reach your site. This information tells the site administrator, and Google, where a visitor arrived from to land on the site. Did they click through from a search engine (and which one), arrive directly, come from another site (and which site), or, as revealed in the patent, from the cache, temporary files, bookmarks, or favorites of their Internet browser?

The click through rate is also based on the CTR of the advertising on your site. The more ads which are clicked, the higher your score.

The CTR is also monitored for fresh or stale content - in other words, are they visiting new content on your site or old posts or articles? Trends and seasons are also taken into account as certain subject matter gains precedence with the time of the year and the current fad.

Trends, Fads, and Seasons

Built into the Google page ranking technique is the ability to track current and historical trends, fads, and seasons. If your site deals with beach wear, the odds are that it will have more traffic during the summer beach wear season than into the fall and winter. This seasonal traffic is taken into account, and you may not lose rank when the traffic dies down seasonally.

It also tracks whatever is hot in trends and fads. Right now, everything to do with Hurricane Katrina is hot, hot, hot, but a couple years ago, everything and anything to do with protecting you and your home from biological terrorism was top of the list. Paris Hilton was top of the charts for a long time, doing battle with Britney Spears, but now, both of them are old news.

This is an interesting aspect of page ranking. If your site continues to push keywords long past a fad's life span, this could be seen as keyword spamming. Yet using trend and fad keywords as they come and go could attract attention. Luckily, the rest of the criteria in the page ranking evaluation can help to weed out abusers of keywords related to the current fad or current event.

Posting Frequency

How often you update your pages and add content is monitored over time. It isn’t just how much but when. If you update or add hundreds of articles within a very short time, this is suspicious, but if you rarely update your site or add content over time, then your ranking will probably drop. Finding a happy medium is still a hit and miss angle, but the information seems to point to consistency not just random spurts of energy.

If you consistently add content once a week, and it stays steady, then it is seen as stable. If you add content consistently every day, and then it drops to nothing, then this change indicates an instability. If you do hit and miss content updating and additions over time, and then suddenly post a ton of activity, this can also be seen as instability and suspicious. Steady and consistent, no matter how frequently, adds weight to the score.

Many researchers say that frequent new or updated content carries more weight than infrequent changes to the site. I could find nothing in the patent that lent proof to that theory, but showing consistent activity does work.
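To make "consistency" concrete, here is one simple way to measure the steadiness of a posting schedule: the spread of the gaps between consecutive post dates. This metric is my own illustration, not something taken from the patent:

```python
from datetime import date
from statistics import pstdev

def posting_gap_spread(post_dates):
    """Standard deviation (in days) of gaps between consecutive posts.
    Lower means a steadier posting rhythm."""
    ordered = sorted(post_dates)
    gaps = [(b - a).days for a, b in zip(ordered, ordered[1:])]
    return pstdev(gaps) if len(gaps) > 1 else 0.0

steady = [date(2007, 5, d) for d in (1, 8, 15, 22, 29)]  # one post a week
bursty = [date(2007, 5, d) for d in (1, 2, 3, 4, 30)]    # burst, then silence

print(posting_gap_spread(steady))  # → 0.0
print(posting_gap_spread(bursty) > posting_gap_spread(steady))  # → True
```

A weekly poster and a daily poster both score 0.0 here; only erratic bursts push the spread up, which matches the "steady and consistent, no matter how frequently" advice above.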

A “stale” page is one that is old and rarely attracts interest. A “fresh” page is one that is new and will be watched to see what kind of interest it attracts. By updating a stale page on your site, you may attract new interest by rewriting or restructuring the information and keywords, breathing life back into the page. Google monitors this “refreshing” of pages; showing activity and an increase in interest scores high.

Not all “old” pages on your site need updating. If it is still attracting decent traffic, then leave it alone. It is working for you.

The patent also reveals that stable, working pages which suddenly attract a “spike” in the number of incoming links or click-throughs may indicate a change of site ownership or spamming. Google evaluates not only the content but the historical changes in the content of the page and the site, and if the changes are dramatic and sudden, the site will rank lower. Stability over time scores higher.

Keywords Still Play a Role

Keywords and keyword density still play an important role in evaluating content and content history. Putting keywords in titles, links, headings, tags, and throughout the page is still critical to the success of your site's page ranking and keyword ranking results.

Changes to keywords (their arrangement, proximity, and placement inside links, titles, and headings) are also monitored, much like link anchor text. Consider reviewing and updating your keywords and checking their density and use throughout your site on a regular basis if search engine page ranking is important to you.
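For the curious, keyword density is usually measured as occurrences of a keyword per 100 words of text. Here's a minimal sketch (the sample page text is invented for illustration):

```python
import re

def keyword_density(text, keyword):
    """Percentage of words in `text` that belong to occurrences
    of a (possibly multi-word) keyword."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    kw = keyword.lower().split()
    hits = sum(1 for i in range(len(words) - len(kw) + 1)
               if words[i:i + len(kw)] == kw)
    return 100.0 * hits * len(kw) / len(words) if words else 0.0

page = ("Perfume sales are up this spring. Our perfume guide covers "
        "perfume brands, fragrance notes, and where to buy perfume online.")
print(round(keyword_density(page, "perfume"), 1))  # → 20.0
```

Note that a density this high (one word in five) would itself look like keyword stuffing on a real page.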

In upcoming posts, we’ll discuss how to maximize your keyword density in your blog posts.

Rank by Traffic, User Behavior, and You

Like other comparative search engines, Google's patent also tells how page rankings are compared across the board and monitored over time. Traffic is recorded and monitored: how much traffic each page gets, as well as the site overall.

User behavior is checked. Google keeps track of how long visitors stay on your site and from which pages they exit. You also score points when visitors bookmark your site or add it to their favorites.

Keyword search results are constantly monitored: which keywords brought the visitor to your site, and which keywords they used to search once on your site.

But “you” also play a role in determining your page ranking with Google. The domain registration information is checked and compared to the information on the site to make sure the two match. The address of the domain owner may help localize search results to that specific geographic area.

How you have your site hosted also is among the other administrative items checked off. Shared IP host addresses run a risk since they are shared. If someone else is using that server for spamming or other evils, you could also be punished. Dedicated hosting is very expensive, so make sure you choose a reputable host who is publicly and actively stopping spamming sites if you choose shared hosting.

The validity of the site’s code and structure plays a small part, but is still part of the criteria. Make sure your site’s code is validated, checked for errors, and friendly to search engines. Any errors in your page structure or code can easily thwart a search engine's progress through your site. Table-based designs rank low, while CSS-based designs are much more search engine friendly.

Spelling is still important. Google's patented page ranking process doesn't include a spell checker as such; words that are not recognized simply get dumped. If misspelled keywords are among your misspellings, your site will be hurt in the rankings.

By my-last-shot

The Importance of valid HTML code!

There are many webmasters out there who overlook a very important aspect of website promotion: the validity of their HTML code.


What is valid HTML code?


Most web pages are written in HTML. As with any other language, HTML has its own grammar, vocabulary, and syntax, and every document written in HTML is supposed to follow these rules. HTML is also constantly changing, and as it has become a rather complex language, it's easy to make a mistake. HTML code that does not follow the official rules is called invalid HTML code.


Why is valid HTML code important?



Search engines have to parse the HTML code of your website to find the relevant content. If there are errors in your HTML code, search engines might not be able to find everything on the page. Search engine crawler programs follow the HTML standard, and they can only index your website reliably if it complies with that standard. If there is a mistake in your web page code, they might stop crawling your website, and they may also lose what they have already collected because of the error. Even though most major search engines can deal with minor errors in HTML code, a single misplaced bracket can be the reason your website cannot be found in a search engine. If you do not close some of your tags properly, or if some important tags are missing, search engines might ignore the complete content of that page.


How can you check the validity of your HTML code?

The W3C Markup Validator (validator.w3.org) is a great website where you can verify whether your HTML has any errors.
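For a rough pre-check before you run the real validator, a few lines of Python can catch grossly unbalanced tags. This is only a toy sketch built on the standard library's lenient parser, not a substitute for the W3C validator:

```python
from html.parser import HTMLParser

# Tags that never take a closing tag in HTML.
VOID = {"area", "base", "br", "col", "embed", "hr", "img", "input",
        "link", "meta", "param", "source", "track", "wbr"}

class TagBalanceChecker(HTMLParser):
    """Very rough check for unclosed tags and stray closing tags."""
    def __init__(self):
        super().__init__()
        self.stack = []    # currently open tags
        self.errors = []   # problems found so far

    def handle_starttag(self, tag, attrs):
        if tag not in VOID:
            self.stack.append(tag)

    def handle_endtag(self, tag):
        if tag in self.stack:
            # Pop until this tag is closed; anything popped on the way
            # was opened but never closed.
            while self.stack:
                top = self.stack.pop()
                if top == tag:
                    break
                self.errors.append("<%s> never closed" % top)
        else:
            self.errors.append("stray </%s>" % tag)

checker = TagBalanceChecker()
checker.feed("<html><body><p>Hello <b>world</p></body></html>")
print(checker.errors)  # → ['<b> never closed']
```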

Wednesday, April 25, 2007

Adsense Alternatives

Adsense from Google

  1. Google Adsense

Adsense Alternatives

  1. 24/7 RealMedia
  2. AdBrite
  3. Advertising.com
  4. Burst Media
  5. Kanoodle
  6. Link Share
  7. ValueClick
  8. Yahoo Publisher Network
  9. Accelerator Media
  10. Ad Agency 1
  11. Ad Dynamix
  12. AdEngage
  13. Adgenta
  14. Adhearus
  15. AdKnowledge
  16. AdPepper
  17. ADServing Network
  18. Adsmart
  19. Adtegrity
  20. AdZuba
  21. Adversal
  22. Affiliate Future
  23. Affiliate Sensor
  24. Affiliate Fuel
  25. AllFeeds
  26. Auction Ads
  27. AV Nads
  28. Azoogle Ads
  29. Banner Boxes
  30. Banner Connect
  31. Bardzo Media
  32. BClick
  33. BidClix
  34. BidVertiser
  35. BlinkAds
  36. BlueFN
  37. BlogadNetwork
  38. BlogAds
  39. BlueLithium
  40. Buy.at
  41. Casale Media
  42. Chitika
  43. ClickAdsDirect
  44. Click Booth
  45. Click Share
  46. Clicksor
  47. Click Xchange
  48. CrispAds
  49. ContexWeb
  50. Cyber Bounty
  51. Cover Clicks
  52. CPX Interactive
  53. Direct Networks
  54. Enhance Interactive
  55. Esource Media
  56. Etype-Europe
  57. EtypeUSA
  58. Etargetnet
  59. ExpoActive
  60. ExoClick
  61. Fairadsnetwork
  62. FastClick/ ValueClick
  63. FluxAds
  64. HurricaneDigitalMedia
  65. Hyperbidder
  66. Hydramedia
  67. Incenta Click
  68. Industry Brains
  69. Interclick
  70. JoeTec
  71. LookSmart Adcenter
  72. Kontera
  73. LinkBLiss
  74. Mamma Media Solutions
  75. MSN adCenter
  76. MaxBounty
  77. Mirago
  78. MIVA AdRevenue Xpress
  79. MoreNiche
  80. Nixxie
  81. Oridian
  82. Oxado
  83. Paypopup
  84. PayperPost
  85. PeakClick
  86. Popup Traffic
  87. Quigo
  88. RealCastMedia
  89. RealTech Network
  90. Revenue Pilot
  91. ReviewMe
  92. RightMedia
  93. Searchfeed
  95. ShareASale
  96. Sponsored Reviews
  96. TargetPoint
  97. Text Link Ads
  98. TMP Express
  99. Tremor Network
  100. Tribal Fusion
  101. Veoda
  102. Vibrant Media IntelliTXT

General optimization tips!

1. Don't use frames

Many search engines have trouble with frames, making it very difficult to get a high search engine ranking for a website that uses frames. Google says about frames: "Reasons your site may not be included: Your page uses frames. Google supports frames to the extent that it can. Frames tend to cause problems with search engines, bookmarks, emailing links and so on, because frames don't fit the conceptual model of the web. If a user's query matches the site as a whole, Google returns the frame set. If a user's query matches an individual page on the site, Google returns that page. That individual page is not displayed in a frame -- because there may be no frame set corresponding to that page." Yahoo has a similar statement: "Yahoo! Slurp follows HREF links. It does not follow SRC links. This means that Yahoo! Slurp does not retrieve or index individual frames referred to by SRC links." Don't use frames on your web site if you want to have high search engine rankings!

2. Avoid Flash and other multimedia elements

Most search engines cannot index Flash pages. The normal text content on your web pages matters most to search engines. If you must use Flash on your web site, make sure that you also offer normal text for the search engines.

3. Don't use welcome pages

Some websites use a "Welcome to our web site" image with a link to the actual site as the index page. Just don't do this. Some search engines might not follow the link on the welcome page. In addition, most web surfers don't like these welcome pages, which costs you visitors.

4. Avoid dynamically created web pages

Databases and dynamically generated pages are great tools for managing the contents of big web sites. Imagine having to manage the web site contents of the New York Times without databases. Unfortunately, dynamically generated web pages can be a horror for search engine spiders because the pages don't actually exist until they are requested. A search engine spider is not going to be able to select all the necessary variables on a submit page.
Some search engines can index dynamic pages to a point, but even Google states that they have problems with dynamically created pages: "Reasons your site may not be included in Google: Your pages are dynamically generated. We are able to index dynamically generated pages. However, because our web crawler can easily overwhelm and crash sites serving dynamic content, we limit the amount of dynamic pages we index."

5. Make sure that you allow search engine robots to index your site

Imagine you're an Internet marketing service company and you keep trying very hard to get top rankings in the search engines for your customer. Even after several weeks, the customer's web site hasn't been listed in any search engine. Then you realize that the search engine spiders and robot programs cannot access the web site, because the customer's robots.txt file is not properly configured and blocks them. Details about the robots.txt file can be found here.
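Python's standard library ships a robots.txt parser you can use to test your rules before a spider ever hits them. Here the rules are parsed from a string for illustration; in practice you would point set_url() at your live robots.txt and call read():

```python
from urllib.robotparser import RobotFileParser

# A tiny robots.txt that blocks everyone from /private/.
rules = """
User-agent: *
Disallow: /private/
""".strip().splitlines()

rp = RobotFileParser()
rp.parse(rules)

# Can Googlebot fetch these URLs under the rules above?
print(rp.can_fetch("Googlebot", "http://www.example.com/index.html"))         # → True
print(rp.can_fetch("Googlebot", "http://www.example.com/private/data.html"))  # → False
```

If your homepage comes back False here, you have found the misconfiguration before losing weeks of crawling.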

6. Make sure that search engine spiders can access your web site

Search engine spiders don't have the functionality of full-fledged Web browsers such as Microsoft Internet Explorer, Firefox or Netscape Navigator. In fact, search engine robot programs look at your Web pages like a text browser does. They like text, text, and more text. They ignore information contained in graphic images but they can read text descriptions.
This means that search engine spider programs are not able to use Web browser technology to
access your site. If your Web pages require Flash, DHTML, cookies, JavaScript, Java or passwords to access the page, then search engine spiders might not be able to index your Web site.

7. Avoid special characters in your URL

Most search engines have problems indexing web pages when their URLs contain special
characters. The following special characters are known to be "search-engine-spider-stoppers":
* ampersand (&)
* dollar sign ($)
* equals sign (=)
* percent sign (%)
* question mark (?)
These characters are often found in dynamically generated Web pages. They signal the search
engine crawler program that there could be an infinite loop of possibilities for that page. That's why some search engines ignore web page URLs with the above characters.
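A quick way to audit your URLs for these characters (the character set below is simply the list above):

```python
# The "spider-stopper" characters listed above.
SPIDER_STOPPERS = set("&$=%?")

def risky_characters(url):
    """Return the spider-stopper characters found in a URL, sorted."""
    return sorted(set(url) & SPIDER_STOPPERS)

print(risky_characters("http://www.example.com/shop/red-tent.html"))    # → []
print(risky_characters("http://www.example.com/shop?item=42&color=red"))  # → ['&', '=', '?']
```

A URL like the second one is exactly the kind of dynamically generated address some crawlers skip.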

8. Choose a reliable and fast hosting service

Your web page should be hosted by a reliable hosting service. Otherwise, it could happen that your web server is down when a search engine spider tries to index it. If your web site fails to respond when the search engine's index software program visits your site, your site will not be indexed. Even worse, if your web site is already indexed and the search engine spider finds that your site is down, you'll possibly be removed from the search engine database. It's essential to host your web site on servers that are very seldom down.

Search engine crawler programs that index Web pages don't have much time. There are approximately 4-6 billion Web pages all over the world and search engines want to index all of them. So if the host server of your Web site has a slow connection to the Internet, your Web site may not be indexed by the major search engines at all.

You may also want to limit the size of your homepage to less than 60K. This would also benefit the still numerous users who connect to the Internet with a slow modem. For even the casual Internet user, the performance of a Web site can make the difference between pleasure and frustration.
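Checking the 60K guideline is a one-liner; the sample pages below are invented:

```python
def page_weight_ok(html, limit_kb=60):
    """True if the page's HTML weighs in under the limit (default 60K)."""
    return len(html.encode("utf-8")) <= limit_kb * 1024

small_page = "<html><body>" + "<p>hello</p>" * 100 + "</body></html>"
print(page_weight_ok(small_page))     # → True
print(page_weight_ok("x" * 100000))   # → False
```

Remember this counts only the HTML itself; images, scripts, and stylesheets add their own weight on top.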

Tuesday, April 24, 2007

Return Visitors VS 1st-Time Visitors

Many of you can see on your counters how many return visitors and how many first-time visitors you have. This plays a huge role if you're earning an income from AdSense ads or any AdSense alternative. Lots of webmasters don't understand why their earnings are so low, and likewise their CTR (click-through rate). Well, I'll be going over that.

Who's Infected

I'm sure those who own a forum have a hard time understanding why their CTR is so low, so I'll explain. On a forum, the plan is to get people to register and keep coming back to keep posting. That's exactly what's harming you. There aren't that many variations of ads out there for every niche, so if you have many return visitors, then after some time each person will have seen every single ad. They have no interest in seeing your ads again, so they don't help your situation. The more return visitors your forum has, the lower the CTR, no matter how well you place your ads. Now, I'm not saying that only forums are affected; there are many sites out there with the same problem. They keep bringing back the same people day after day, and it ends up that 65% of their traffic is return visitors.

Solution

This may not fix your situation completely, but you should see some results. As many of you should know, there are millions of advertisers out there on the web, and they are not all attracted to the same AdSense alternative you use. What you have to do is sign up with two more AdSense alternative networks; you should have at least three. Now spread these three or more different types of ads around your forum or site, so that there is a larger variation in ads. This will increase your CTR if done correctly.

Sunday, April 22, 2007

Don't get banned by Google!

It's a simple question... Are you asking Google to penalize your website?

I'm sure you're probably thinking: Well, of course not! Yet daily I see new people complaining in search engine optimization forums that their websites have been banned by Google and they "have no idea why".

These people claim they've done nothing wrong and are absolutely clueless as to why their site is no longer in Google. The purpose of this lesson is to teach you one VERY important thing.

What NOT to do when optimizing your website, to make sure you don't get banned.


How do you know if you've been banned?

First let me show you how to see if you're clearly banned by Google. Oftentimes people think they've been banned, when in reality they've just dropped in ranking and can't find their website.

There are a couple of things you can do.

1. Check Google's search results.
2. View the Google toolbar.

Check Google's search results?


Go to Google and enter your entire URL into Google's search box. In this example we'll use a made up domain name. (www.jkhljkhkjh.com). On a side note, I tried www.somerandomdomain.com and www.fakedomainname.com and both were already taken.

Anyway, we'll go to Google and enter our entire URL and click "search"
http://www.jkhljkhkjh.com

Notice that Google says there is no information available for this URL? This means that the URL is no longer in Google's database (a.k.a. the index).

If you enter a brand new website into Google, you'll always get this message until the website has been indexed. So in this case, the website we entered is either banned or brand new. If we know our website was once in Google, and we do the search I just showed you above, and Google says "Sorry, no information is available for URL [whatever your URL is]", then chances are... you're banned.

Another way you can quickly see if your website has been banned is by:


Viewing the Google Tool Bar

Download the Google Toolbar here: http://toolbar.google.com

Once it's installed, simply visit your website. If the Google toolbar's PageRank indicator is completely gray, this may mean that you have been banned by Google.

* Note: Most SEOs call this "gray barred". (So if you hear that term in an SEO forum, that's what they're referring to)

Ok, so now we know how to tell if our website has been banned.


What can cause your website to get banned?


There are many onpage ranking factors AND offpage ranking factors that can cause Google to ban your website. Today, we'll focus on only the onpage things that can cause your site to get banned.

Before I begin, I want you to know that many websites still get away with doing some of these things. They DO NOT help your rankings and are simply a waste of time, so don't try them. Sooner or later Google will catch up to these websites and will remove them.

It's just not worth the risk when doing them no longer helps your ranking to begin with.


Hidden Text

Hidden text is simply text that users can't see when they visit your webpage. Some webmasters do this so they can add keywords throughout their webpage without interfering with what visitors actually see. Yet the search engines can still see hidden text.

For example, let's say you have a white background on your website. If you wanted to hide text, you would simply make the color of your text white (#FFFFFF) and users couldn't see it.

I did a quick search in Google and quickly found an example of a website using hidden
text. Have a look below:

At first glance, you're probably wondering where the hidden text is...

Let me show you. I went to the website and pressed "Ctrl + A" on my keyboard, which highlights the entire webpage:

Now we can clearly see the hidden text at the very top left side of their website that says "fat loss body fat abs weight loss diets bodybuilding dieting tips abdominals"

These are keywords that they want to rank well for and want the search engines to see when they first visit their website. Yet, they don't want their visitors to see this text. So, they've made the text white, to blend in with the background.

This website used to be ranked #1 for "fat loss", but I just did another check to see where they're ranked, and they're no longer anywhere to be found... AND it looks like they also removed their hidden text, but it's probably too late.


Alt image tag spamming

This is another way that people will try to cram keywords into their website, allowing search engines to see their keywords, but not allowing visitors to notice any difference in their website.

The following is a website that wanted to rank well for "cabbage soup diet". What they've done is insert a graphic of a cabbage and add an alt image tag to it. When a visitor hovers their mouse over the cabbage soup graphic, a little popup appears.

Notice how many times they've repeated the word "cabbage soup" and "cabbage"? Way too many! It serves no purpose other than to cram as many keywords as possible into their webpage.

The real purpose of an alt image tag is that if a graphic will not load, or is disabled by the visitor's web browser, text appears instead of the graphic. It also helps blind visitors who rely on screen readers.

Alt image spamming is something you want to steer clear of. Using alt image tags is good, but you can overdo it, as you can see above. A good alt image tag in this case would simply be: cabbage soup diet graphic
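As a rough heuristic, you could flag alt text in which any single word repeats too often. The repeat threshold here is arbitrary, purely for illustration:

```python
from collections import Counter

def alt_looks_stuffed(alt_text, max_repeats=2):
    """Flag alt text where any single word repeats more than max_repeats times."""
    words = alt_text.lower().split()
    return bool(words) and max(Counter(words).values()) > max_repeats

print(alt_looks_stuffed("cabbage soup diet graphic"))  # → False
print(alt_looks_stuffed("cabbage soup cabbage diet cabbage soup cabbage recipe"))  # → True
```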

Meta Tag Stuffing

What I'm referring to here is when people throw in thousands of the same exact keyword into their meta tags.

For example, the following website is trying to rank well for the keyword "tents".

<meta name="KEYWORDS" content="tents, TENTS, Tents, tents tents tent supplies, tents, tents tent, tent, Tent, TENTS, tents, Tents,tents, TENTS, Tents, tents tents tent supplies, tents, tents tent, tent, Tent, TENTS, tents, Tents,tents, TENTS, Tents, tents tents tent supplies, tents, tents tent, tent, Tent, TENTS, tents, Tents,tents, TENTS, Tents, tents tents tent supplies, tents, tents tent, tent, Tent, TENTS, tents, Tents tents, TENTS, Tents, tents tents tent supplies, tents, tents tent, tent, Tent, TENTS, tents, Tents">

This is obviously ridiculous. Google and other search engines no longer use meta keyword tags to rank websites.

Google WILL penalize it, and it WILL NOT help you... so, why would anyone do something like this?

Stay away from it.

Title Tag Stuffing

The title is what appears in the title bar at the top of your browser window. Below is an example of title tag stuffing.

Don't do it... You only need to include your keyword(s) one time in your title tag.

Any more than one time will only dilute the effect, and if you overdo it as shown above, you may get severely penalized and drop in the rankings.


Those are just a few of the things that people continue to do online. These things WILL eventually get your website banned or penalized and WILL NOT help you rank well. It's a waste of time and effort, plus it's just plain foolish to spend your time on something that doesn't work and will get your website banned from the search engines anyway.



Resource: seoelite
