Obsessed with the powerful and ever-changing SEO market, Mark has a true passion for advanced SEO education and innovative internet marketing strategies that deliver long-term, sustainable business growth.
This blog is dedicated to all SEOs and business information seekers. Mark Wilson is a writer and an internet marketing professional with extensive experience in search engine optimization.

Facebook’s Version of the Retweet Has Arrived!

We’ve long speculated as to when Facebook might get its own version of Twitter’s retweet, and it appears that the time is now. This evening, the site rolled out a “via” feature that lets you repost another user’s shared items, with a “via” link attached...

Mophie is set to debut its iPhone credit card reader

We’re just a week away from the annual gadget-lover’s dream event, otherwise known as CES, and one company that everyone will have their eyes on this year is Mophie. The popular retailer of Apple iPhone and iPod accessories is set to debut its iPhone credit card reader, said to be named simply “Credit Card Reader”, along with an accompanying processing application.

Google loses Groovle domain name claim

In the complaint, Google asked the judges to rule that 207 Media transfer the domain name over to it. Google said the domain name used by the small business was too similar to its own, but the National Arbitration Forum, which mediated the dispute, disagreed. The three judges appointed by the forum refused the request, saying the name was not similar enough to confuse people and that the word "groovle" was more closely linked to "groovy" or "groove" than to "Google".

Monday, February 23, 2009

On-Page Optimization Factors

The most important on-page optimization techniques include content optimization, the title tag, several meta tags, headlines and text decoration, alt and title attributes, and keeping content regularly updated.

  • Keyword in URL: Keywords earlier in the URL carry more weight; the first word is best, the second is second best, and so on.
  • Keyword in Domain Name: This works much like keywords in the URL; hyphenated domain names that contain your keywords help, for example “automobile-truck-one”.
  • Keyword in Title Tag: Place your keyword in the title tag, close to the beginning. The title tag should be between 10 and 60 characters with no special characters.
  • Keyword in Description Meta Tag: This shows the theme of your site and should be less than 200 characters. Though Google no longer “relies” upon this tag, it may sometimes use it.
  • Keyword in Keyword Meta Tag: This should be fewer than 10 words. Every word in this tag must appear somewhere in the body text; if not, the page can be penalized for irrelevance. No single word should appear more than twice, or it may be considered spam. Google purportedly no longer uses this tag, but other engines do.
  • Keyword density in body text: Keywords shouldn’t be stuffed into the body text. Aim for roughly 5-20% for all keywords combined and 1-6% for a single keyword; heavier stuffing can get your site penalized as spam (a quick way to check this is sketched after this list).
  • Keywords in H1, H2 & H3: Many people ignore this aspect. Make sure heading tags are used effectively.
  • Keyword formatting: Italics is treated the same as emphasis, and strong the same as bold.
  • Keyword proximity (for 2+ keywords): Placing them directly adjacent to each other is best.
  • Keyword prominence: Keywords carry more weight when they appear near the top of the page and in larger fonts.
  • Domain Name Extension / Top-Level Domain (TLD): .gov sites seem to carry the highest status, .edu and .org sites are also given high status, while .com encompasses most of the spam and low-quality sites and therefore attracts the closest scrutiny from Google.
  • Fresh Pages: Search engines love fresh content and fresh pages.
  • Content: Publish unique, fresh content; search engines feed on it and simply love it.
  • Update frequency: Search engines give higher status to sites that update regularly.
  • Site Size: Search engines, particularly Google, favor big sites, but the content must be unique and fresh; duplicated pages may be discarded by Google.
  • Domain age: Old is gold; that is the mantra of Google and other search engines.
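If you want to sanity-check a page against a few of the rules of thumb above (title tag length, description length, keyword density), here is a minimal Python sketch using only the standard library. The thresholds simply mirror the figures listed above rather than any official search engine algorithm, and all names are illustrative.

```python
import re
from html.parser import HTMLParser

class HeadParser(HTMLParser):
    """Collects the <title> text and the description meta tag from raw HTML."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.description = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
        elif tag == "meta":
            a = dict(attrs)
            if (a.get("name") or "").lower() == "description":
                self.description = a.get("content") or ""

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

def keyword_density(body_text, keyword):
    """Percentage of words in the body text that equal the (single-word) keyword."""
    words = re.findall(r"[a-z0-9']+", body_text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return 100.0 * hits / len(words)

def check_page(html, body_text, keyword):
    """Run a few of the on-page checks described above and return a report dict."""
    parser = HeadParser()
    parser.feed(html)
    title = parser.title.strip()
    return {
        "title_length_ok": 10 <= len(title) <= 60,               # 10-60 characters
        "keyword_in_title": keyword.lower() in title.lower(),
        "description_length_ok": len(parser.description) < 200,  # under 200 characters
        "keyword_density_pct": round(keyword_density(body_text, keyword), 2),  # aim for 1-6%
    }
```

For example, check_page(html, visible_text, "automobile") would flag a missing keyword in the title or an over-long description before you publish the page.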

Monday, February 16, 2009

What Is Blogging and Pinging?


There is a new partnership in cyber-town called Blogging and Pinging. It is not a new comedy team or even a singing group, but a new way to build links and attract visitors to your website and to make more money. Blogging and Pinging is a marketing tool that can help anyone build their online business.

Let's start with a definition of a blog.

Blog is short for weblog. A weblog is a journal that is frequently updated and intended for general public consumption. Blogs generally represent the personality of the author of the Web site.


What about pinging?

Originally, a ping was a program that bounced a request off another computer or server over a network or the Internet to see if the remote machine was responding. That same idea is now used as a method of informing others that your blog exists and letting them know when a new post has been made.
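For the technically curious, the blog ping most platforms send is a tiny XML-RPC call, typically the weblogUpdates.ping method, carrying your blog's name and URL to a ping server. Here is a minimal sketch using Python's standard library; the server address is only an example of a public ping service, and the blog details are hypothetical.

```python
import xmlrpc.client

def ping_blog_directory(blog_name, blog_url,
                        server="http://rpc.pingomatic.com/"):
    """Announce new blog content by sending a weblogUpdates.ping XML-RPC call.

    The server URL is an example of a public ping service; substitute whichever
    service (one that forwards to directories such as Technorati or Blo.gs)
    you actually use.
    """
    proxy = xmlrpc.client.ServerProxy(server)
    # The standard ping carries just the blog's display name and its URL.
    response = proxy.weblogUpdates.ping(blog_name, blog_url)
    # A typical reply looks like {'flerror': False, 'message': 'Thanks for the ping.'}
    return response

# Hypothetical example:
# ping_blog_directory("My SEO Notes", "http://myseonotes.blogspot.com/")
```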

When you put blogging together with pinging you get a technique that is extremely effective at getting any web site, no matter how big or small, indexed by the major search engines.

Why Is Blogging and Pinging The Newest Marketing Tool?

Most of you are now saying, 'Yeah, so what? I can get my site indexed now. What's the big deal?' The big deal is that blogging and pinging gets your site indexed almost immediately and it is free!

If you have any experience trying to get a site indexed and listed in the major search engines, then you know that it is frustrating and very difficult to get done in a reasonable amount of time. After working hard to get your site together, collecting and writing content, worrying about keywords and keyword densities, you are not even close to being finished. Now comes the even harder part! Now you have to manually submit your site to all the major search engines. And finally, you get to wait.

That's right, you have to wait for the spiders to come and index your site. This could take weeks or it could take months. There are some methods that will help bring the spiders along a bit faster, but these take lots of time, effort, and money. And still, you can't be sure that the spiders will come any more quickly.

Blogging and pinging solves this problem. You can guarantee that spiders will come to your site and that you will get listed in all the major search engines with minimal effort and no money in about 48 hours.

How does it work?

The largest search engine, Google, owns Blogger.com. This is the site that many people use to create their blogs. Google frequently sends their search engine spiders through Blogger.com to find new content.

So, if you have a blogger.com account, you can add content from other websites to your blog, and when you blog the content of your own site, the URL of the page you are blogging about is automatically attached. That way, when Google's spiders index your blogger.com pages, they see your website URL within the content. If that URL isn't already in the Google database, the spiders are almost certain to follow the link and index your site!

So, rather than wait for weeks, months, or even up to a year, you can get your site listed virtually immediately! Google themselves will tell you that they would rather discover new URLs by finding links than receive them via submission. So, why not blog and ping and give them what they want!

One critical success factor in boosting your link popularity is to use the blog and ping technique to notify lots of other sites about new content on your website. This is achieved by using your blogger.com publishing function to ping the major blog directories such as Technorati and Blo.gs.

Another good piece of news is that once Google has indexed your site, other search engines that use Google's feed will list your site too. And it all happens at lightning speed!

Blogging and pinging is a really effective method to get your site spidered by the major search engines and, more importantly, to get those all-important links.

Monday, February 9, 2009

Search Engine Advice on Duplicate Content

Google has come up with helpful points on how to tackle internal and third-party duplicate content issues. Since a webmaster has no influence on third parties that scrape and redistribute content without the webmaster’s consent, Google has strong technology in place to trace the original source of content. Correctly identifying the original content source spares webmasters from negative effects on their site.

Google sees duplicate content sources in two ways:

  • Internal - Pages with the same or almost identical content that appear within the same website.
  • External - Content that appears on third-party websites, either with permission or without.

Whenever you engage in content syndication services such as articles, press releases, and so on, ensure that the syndication partners link back to the original website.

There have been incidents where scraped content ranks higher than the original. In those cases, Google recommends the following advice:

  • Check if your content is still accessible to Google’s crawlers. You might unintentionally have blocked access to parts of your content in your robots.txt file (a quick way to test this is sketched at the end of this post).
  • Look in your Sitemap file to see whether you made changes for the particular content that has been scraped.
  • Include the preferred version of your URLs in your Sitemap file.
  • Check if your site is in line with our webmaster guidelines.
For more detailed information, see Google Webmaster Central.
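To illustrate the first check on the list above, here is a minimal Python sketch that uses the standard library's robots.txt parser to confirm a page is not accidentally blocked for Googlebot; the domain is a placeholder for your own site.

```python
from urllib import robotparser

def is_crawlable(page_url, robots_url, user_agent="Googlebot"):
    """Return True if robots.txt allows user_agent to fetch page_url."""
    rp = robotparser.RobotFileParser()
    rp.set_url(robots_url)
    rp.read()                      # fetches and parses the live robots.txt
    return rp.can_fetch(user_agent, page_url)

# Placeholder example:
# is_crawlable("http://www.example.com/articles/my-post.html",
#              "http://www.example.com/robots.txt")
```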