Create an RSS Feed

RSS feeds are written in XML. Like HTML, an RSS file is built from tags enclosed in angle brackets (< >).
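As a minimal sketch, an RSS 2.0 file with a single item looks like this (the titles and URLs are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <!-- The channel describes the feed as a whole -->
    <title>Example Site News</title>
    <link>http://www.example.com/</link>
    <description>Latest headlines from Example Site</description>
    <!-- Each item is one headline; add more items as you publish -->
    <item>
      <title>First Headline</title>
      <link>http://www.example.com/first-headline.html</link>
      <description>A short summary of the story.</description>
    </item>
  </channel>
</rss>
```

New headlines are added as additional item elements at the top of the channel.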

Software to Create Feeds
If you are not confident writing XML by hand and would prefer software to create the feed, we strongly recommend FeedForAll, an extremely easy-to-use feed creation tool that lets webmasters create, edit and publish RSS feeds.

If you want to create an RSS feed in a text editor, the step-by-step walk-through at Make RSS Feeds will help.

Online Feed Creation Tools
If you just wish to create a single feed and do not need to edit or update the feed you can use an online feed creation tool.

GoArticles - Syndicates articles appearing on GoArticles, a large article repository.

BlogStreet - Online feed creation tool; only works for blogs hosted on BlogStreet.

Online RSS Feeds - Online feed creation tool.

Blog Harbor - Online JavaScript blog generator.

Create RSS - Resource for creating RSS feeds.

We recommend FeedForAll for RSS feed creation

What is RSS?

RSS is an acronym for Really Simple Syndication (it has also been expanded as Rich Site Summary). RSS is an XML-based format for content distribution. Webmasters create an RSS file containing headlines and descriptions of specific information. While the majority of RSS feeds currently carry news headlines or breaking stories, the long-term uses of RSS are broad.

RSS is a defined standard based on XML with the specific purpose of delivering updates to web-based content. Using this standard, webmasters provide headlines and fresh content in a succinct manner. Meanwhile, consumers use RSS readers and news aggregators to collect and monitor their favorite feeds in one centralized program or location. Content viewed in the RSS reader or news aggregator is known as an RSS feed.

RSS is becoming increasingly popular. The reason is fairly simple: RSS is a free and easy way to promote a site and its content without the need to advertise or create complicated content-sharing partnerships.

Definitions of RSS

RSS (n) RSS is a Web content syndication format. Its name is an acronym for Really Simple Syndication. RSS is a dialect of XML. (source Harvard)

RSS (n) RSS is a format for syndicating news and the content of news-like sites, including major news sites like Wired, news-oriented community sites like Slashdot, and personal weblogs. (source XML.com)

RSS (n) Really Simple Syndication (RSS) is a lightweight XML format designed for sharing headlines and other Web content. (source WebReference)

RSS (n) Really Simple Syndication (RSS) is an XML-based format for content distribution. (source CNET)

RSS (n) RSS is an XML-based format for syndicated content. (source IBM)

RSS (n) RSS is an acronym for Rich Site Summary, an XML format for distributing news headlines on the Web, also known as syndication. First started by Netscape as part of the My Netscape site, it expanded through Dave Winer and Userland. RSS started off in an RDF format. (source newsmonster)


Worst SEO Mistakes

By: Ivan Strouchliak

In this article we cover some of the worst SEO mistakes. Some of these will cause the search engines to ignore you; others will cause potential visitors to ignore you; still others may make your site disappear from the search engine results pages entirely. We'll go over each practice and explain why you shouldn't do it. We'll even tell you what you should be doing.

Flash-Based Websites

If you're making a new website and thinking of using Flash - stop. Though Adobe made Flash crawlable and shared the technology with Google and Yahoo (leaving out Microsoft), Flash is still a bad choice.

If you don't show much content to the search engines, you will have to invest more in links. Not only does a Flash site cost more to make, but you'll need more money to optimize it. If there is no real reason to use Flash, don't. If you want cool movies and features, consider embedding Flash videos and using cool CSS styles.

No Use of Title Tags

"The most powerful HTML tag you have at your disposal" - Ross Jones, Search Engine Ranking Factors. Put the keywords you are targeting the page for into the title tag. The title is what shows up in search results as the big blue link, so make it count.

Another mistake webmasters make is to put something like "Welcome To...." in the title tag. Aaron Wall calls this "Welcome to low rankings."
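As a sketch, here is the difference (the page names and keywords are made up):

```html
<!-- Weak: wastes the most powerful tag on a greeting -->
<title>Welcome To My Site</title>

<!-- Targeted: leads with the keywords the page is optimized for -->
<title>Handmade Blue Widgets | Example Widget Shop</title>
```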

Title Tag Duplication

Many designers create a site design without much regard for SEO, using one title tag for an entire website. Make sure to vary the title tags on your pages.

Messed up Robots.txt

It's a good idea to have a robots.txt file, if only to keep your error log clean of robots.txt requests, but you can do a lot of damage if you write it incorrectly. Here's how it looks, assuming you don't want to block ANY pages:
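A permissive robots.txt that blocks nothing looks like this; note that an empty Disallow line means "disallow nothing", while a single slash (Disallow: /) would block the entire site:

```
User-agent: *
Disallow:
```

The `User-agent: *` line applies the rule to all crawlers. Getting that one slash wrong is the classic way sites vanish from the index.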

Not Using Keywords in Internal Links and Navigation

Your internal onsite links carry as much SEO weight as some outside links. If your home page has link authority, then links from your home page to internal pages pass PageRank in the same manner as outside sites pass PageRank to your home page.

By linking from your most powerful pages to internal pages with keyword-rich links, you're doing exactly the same thing as getting keyword-rich inbound links from high PR pages.

To make the most of that link power, make sure your navigation consists of keyword-rich, targeted descriptions. Do the same with keywords scattered within content. Don't worry about your footer; Google doesn't put much weight on footer links.
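As a sketch of the difference (the page names and keywords are made up):

```html
<!-- Generic anchor text wastes the internal link's weight -->
<a href="/products.html">Click here</a>

<!-- Keyword-rich anchor text tells search engines what the target page is about -->
<a href="/blue-widgets.html">Handmade blue widgets</a>
```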

Aggressive Search Engine Optimization Firms

Aggressive may mean effective. Most effective SEO firms buy links, which technically makes them black hats. Are you okay with the risks? Many effective firms actively use spam and other black hat methods to promote their clients. Greedy ones get burned. Those who are smart watch carefully and know when to pull back -- because there is no safety net.

You should know if your SEO firm does black hat up front. Rewards are great, but so are the risks.

Automated Content Generation

Automated content generation involves a bot scraping the net, mixing articles and producing low-value pages camouflaged as a web site. It's often part of a link farm or an AdSense spam site.

Duplicate content detection has improved to the point that search engines will know in no time if you're spamming.


Frames

This is not spam, but it's a bad mistake, like building a Flash-based web site. Frames are hard for spiders to crawl; framed pages have no unique URLs to link to, and frames are horrible from a usability perspective. Sites that use frames lose on all fronts.


Cloaking

Cloaking presents a normal page to visitors but a super-optimized page to search engine spiders. It still works if done correctly, and some good SEOs still use it, but if you're new, stay away from it.

Hidden Text

Hidden text involves stuffing keywords onto a page and making them the same color as the background. This is an absolutely pointless technique, since search engines discount pages when they detect keyword stuffing.

Even putting legitimate text or an optimized article on your site as hidden text, just to make the page look shorter, doesn't make sense. There are CSS "hide/show" tricks that can tuck text away while remaining search engine friendly.
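As a sketch of a legitimate hide/show approach (the class names are made up): the text stays in the page for crawlers and screen readers, but is only revealed to visitors on interaction, rather than being disguised with background-colored text.

```html
<style>
  .faq .answer { display: none; }        /* collapsed by default */
  .faq:hover .answer { display: block; } /* revealed on interaction */
</style>
<div class="faq">
  <p>What is RSS?</p>
  <p class="answer">An XML-based format for syndicating web content.</p>
</div>
```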

Keyword Stuffing

Keyword stuffing involves putting a bunch of keywords on a page simply to get higher rankings. It's pointless. Search engines can tell when you're using too many keywords for natural content, and your visitors will find keyword-stuffed content to be difficult to read.

Doorway Pages

Doorway pages are one-page websites optimized for a specific keyword and made with one purpose - to rank in the search results. All links on the "doorway" page lead to the real website, so it essentially acts as an entrance, hence the name. Search engines do not like doorway pages, but they still work.

Alt Tag Stuffing

ALT tags are designed for accessibility. Alt text should describe images, buttons, navigation and other site elements to make browsing easier for those who can't see them. Some people stuff keywords into ALT tags in hopes of ranking better in search results. This doesn't work. If spotted, you'll be punished.

On top of being ineffective, it's a real hassle for blind people who have to listen to the BS webmasters stuff into these tags to manipulate search engines. Use alt tags for image descriptions, and don't worry about the keywords.
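As a sketch (the file name and keywords are made up):

```html
<!-- Stuffed: useless to screen readers, a red flag to search engines -->
<img src="widget.jpg" alt="widgets cheap widgets buy widgets best widgets">

<!-- Descriptive: helps users and stays search engine friendly -->
<img src="widget.jpg" alt="Blue ceramic widget, side view">
```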

Comment Spam

There are several types of comment spam.

One type uses robots to post generic BS comments with links back to the owner's site, in hopes of getting more valuable inbound links. Google caught on and doesn't count comment links anymore, and most bloggers use the nofollow attribute, so links in comments don't matter.
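A nofollowed comment link looks like this (the URL is a placeholder); the rel="nofollow" attribute tells search engines not to pass ranking credit through the link:

```html
<a href="http://www.example.com/" rel="nofollow">commenter's site</a>
```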

Another way to spam comments is to leave generic comments like "I did an article about this too, come check it out" in hopes of driving traffic and subscriptions.

Blog comments are designed for discussion, so saying something useful without intent to get something back plays a lot better in the long term. You get respect and eventually the blogger becomes more interested in you. Then you get your link.

Forum Spam

Forum spam involves leaving forum messages that are disguised as editorials but are nothing more than ads for a website.

This type of spam looks really nasty and irritates all forum users. Don't do it. If you want to drive visitors from forums, participate in discussions, offer advice and help everyone. You will gain respect and people will visit your site from your signature. This time, however, instead of saying "who the hell is that guy?!" they will land on your home page with the thought of "that guy is good, let's see what he's built here."

Reliance on Meta Tags

The keyword meta tag's days are over; it no longer affects rankings, so don't spend more than a few minutes on it. The description meta tag, on the other hand, is very useful. It often supplies the snippet shown in search results, so make sure it contains a clear call to action.
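As a sketch of a head section that puts the effort where it counts (the site name and wording are made up):

```html
<head>
  <title>Handmade Blue Widgets | Example Widget Shop</title>
  <!-- Often shown as the snippet in search results: describe the page and invite a click -->
  <meta name="description" content="Handmade blue widgets, shipped worldwide. Browse the catalog and order online today.">
</head>
```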

Submission to a Bunch of Social Media Sites

You've probably seen those one-vote articles sitting in no-man's-land with headlines like "New marketing methods" and "Super interactive website www.spamsite.com". Those are attempts by marketers to get diggs, reddits and other types of votes. Social media marketing is a craft of its own, and a link to a generic site doesn't do it. You can learn a great deal about social media marketing over at Sphinn.

Long and Dynamic URLs

Some content management systems produce long URLs that search engine spiders do not like. Keep URLs short and sweet. There are plenty of content management systems that produce search-friendly URL strings, including Joomla, WordPress and Drupal.

Some content management systems also produce dynamic URLs: different strings that lead to the same place. Those are not friendly in terms of PageRank, so make use of the SEO-friendly CMSes above.

JavaScript Links

Search engines have a hard time following JavaScript. Make your links plain HTML anchors (a href="..."). Though Google crawls some JavaScript, it's better to be on the safe side.
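As a sketch of the difference (the page name is made up):

```html
<!-- Hard for spiders to follow: the destination only exists in script -->
<span onclick="window.location='/articles.html'">Articles</span>

<!-- A plain anchor that every crawler can follow -->
<a href="/articles.html">Articles</a>
```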

Free for All Reciprocals

Link exchanges in general are not effective. Exchanging with a quality site is never bad, but getting 100 random reciprocal links can get a site punished.

Free-for-all link exchanges are very easy for search engines to spot. Once detected, all participants can get banned. If you trade a lot of low-quality links, your link profile may raise red flags.

Randomly Changing File Names

If you change a filename, you lose all the power from the links pointing to the old URL. There's no point in doing this. If you feel the need to make changes, leave the pages that already rank as they are. It's hard to get links, and it's even harder to get quality links from good, aged sites.

No Site Map

A site map is a page that has links to all other pages of your site. It can guide search engines to pages which spiders would otherwise not discover. Make sure that your site features a site map; there are a number of guides available on the Internet that explain how to build one.
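As a sketch, a simple HTML site map is just one page that links to every page on the site (the page names are made up):

```html
<ul>
  <li><a href="/index.html">Home</a></li>
  <li><a href="/articles.html">Articles</a></li>
  <li><a href="/about.html">About</a></li>
</ul>
```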

Creating Link Networks or Link Farms

Depending on how you do it, this can still work. Google places more value on authority, so creating link networks is not as effective as it used to be. If you do create a link farm, make sure that:

Sites are on different servers, preferably in different countries.

Sites have different WHOIS data and ownership.

Sites have many outside links apart from those within the network.

The network includes several older domains, since a network of new domains looks very spam-like.

Given how much work is involved to hide the nature of the link network or link farm, you have to seriously question whether it's worth it, even if it does fly under the search engines' radar.

Duplicate Content

Duplicate content involves using the same content on more than one page, or plagiarizing from other sites. Search engines are very good at detecting this, so once you're spotted, expect to be banned. You can also use CopyScape to find out if anyone has stolen your articles.