Webmasters are sometimes told to submit “bridge” pages or “doorway” pages to search engines to improve their traffic. Doorway pages are created to do well for particular phrases. They are also known as portal pages, jump pages, gateway pages, entry pages and by other names.
Doorway pages are easy to identify in that they have been designed primarily for search engines, not for human beings. This page explains how these pages are delivered technically, and some of the problems they pose.
Low Tech Delivery
There are various ways to deliver doorway pages. The low-tech way is to create and submit a page that is targeted toward a particular phrase. Some people take this a step further and create a page for each phrase and for each search engine.
One problem with this is that these pages tend to be very generic. It’s easy for people to copy them, make minor changes, and submit the revised page from their own site in hopes of mimicking any success. Also, the pages may be so similar to each other that they are considered duplicates and automatically excluded by the search engine from its listings.
Another problem is that users don’t arrive at the goal page. Say they did a search for “golf clubs,” and the doorway page appears. They click through, but that page probably lacks detail about the clubs you sell. To get them to that content, webmasters usually propel visitors forward with a prominent “Click Here” link or with a fast meta refresh command.
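To illustrate, here is a minimal sketch of the kind of doorway page described above, with a fast meta refresh plus a fallback "Click Here" link. The target URL and zero-second delay are hypothetical examples, not anything prescribed by the article.

```python
# Sketch of a doorway page that forwards visitors to the real content page.
# The URL and delay are placeholder examples.
def doorway_html(target_url, delay_seconds=0):
    """Build doorway-page HTML with a fast meta refresh and a fallback link."""
    return (
        "<html><head>"
        f'<meta http-equiv="refresh" content="{delay_seconds};url={target_url}">'
        "</head><body>"
        f'<a href="{target_url}">Click Here</a> for our golf clubs.'
        "</body></html>"
    )

page = doorway_html("http://www.example.com/golf-clubs.html")
```

A delay of 0 is the "fast" refresh the article refers to: the browser jumps to the target page immediately, so the visitor barely sees the doorway page at all.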
By the way, this gap between the entry and the goal page is where the names “bridge pages” and “jump pages” come from. These pages either “bridge” or “jump” visitors across the gap.
Some search engines no longer accept pages using fast meta refresh, to curb abuses of doorway pages. To get around that, some webmasters submit a page, then swap it on the server with the “real” page once a position has been achieved.
This is “code-swapping,” which is also sometimes done to keep others from learning exactly how the page ranked well. It’s also called “bait-and-switch.” The downside is that a search engine may revisit at any time, and if it indexes the “real” page, the position may drop.
Another note here: simply taking meta tags from a page (“meta jacking,” as Infoseek calls it) does not guarantee a page will do well. In fact, resubmitting the exact same page from another location sometimes fails to gain the same position as the original page.
There are various reasons why this occurs that go beyond the scope of this article, but the key thing to understand is that you aren’t necessarily uncovering any “secrets” by viewing source code, nor are you necessarily giving any away.
Agent Name Delivery
The next step up is to deliver a doorway page that only the search engine sees. Each search engine reports an “agent” name, just as each browser reports a name. The SpiderSpotting chart within Search Engine Watch lists these, mainly as a means to tell you if you have been visited.
The advantage to agent name delivery is that you can send the search engine to a tailored page yet direct users to the actual content you want them to see. This eliminates the “bridge” problem altogether. It also has the added benefit of “cloaking” your code from prying eyes.
Well, not quite. Someone can telnet to your web server and report their agent name as being from a particular search engine. Then they see exactly what you are delivering. Additionally, some search engines may not always report the exact same agent name, specifically to help keep people honest.
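The agent-name check described above can be sketched as follows, assuming a CGI-style setup where the request's User-Agent header is available. The spider names in the list are illustrative, and the page file names are placeholders.

```python
# A minimal sketch of agent-name delivery. The spider names are examples
# of crawler agent strings; a real list would need ongoing maintenance.
SPIDER_AGENTS = ("scooter", "slurp", "googlebot")

def page_for(user_agent):
    """Serve the tailored page to known spiders, the real page to everyone else."""
    agent = user_agent.lower()
    if any(spider in agent for spider in SPIDER_AGENTS):
        return "doorway.html"   # keyword-tailored page for the spider
    return "real-content.html"  # the content human visitors actually see
```

This is exactly the weakness the next paragraph describes: anyone who sends one of the listed agent names sees the doorway page too, and a search engine that varies its agent name slips past the check.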
IP Delivery / Page Cloaking
Time for one more step up. Instead of delivering by agent name, you can also deliver pages to the search engines by IP address, assuming you’ve compiled a list of them and maintain it.
Everyone and everything that accesses a site reports an IP address, which is often resolved into a host name. For example, I might come into a site while connected to AOL, which in turn reports an IP of 220.127.116.11 (FYI, that’s not real, just an example). The web server may resolve the IP address into a host name: ww-tb03.proxy.aol.com, for example.
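The IP-to-host-name resolution described above is an ordinary reverse DNS lookup, which can be sketched with the standard socket library:

```python
# Sketch of resolving a visitor's IP address into a host name, the way
# a web server's logs would show it. Falls back to the raw IP on failure.
import socket

def resolve_host(ip_address):
    """Return the host name for an IP, or the IP itself if lookup fails."""
    try:
        host, _aliases, _addresses = socket.gethostbyaddr(ip_address)
        return host
    except OSError:
        return ip_address
```

Note that many IPs have no reverse DNS entry at all, which is one reason IP delivery schemes work from the raw addresses rather than host names.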
If you deliver via IP address, you guarantee that only something coming from that exact address sees your page. Another term for this is page cloaking, with the idea that you have cloaked your page from being seen by anyone but the search engine spiders.
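A minimal sketch of IP delivery, assuming you have compiled and maintain the list of spider addresses the paragraph mentions. The addresses below are placeholders from the documentation range, and the file names are hypothetical.

```python
# Sketch of IP delivery ("page cloaking"): only requests from listed
# spider IP addresses receive the tailored page. Example addresses only.
KNOWN_SPIDER_IPS = {"192.0.2.10", "192.0.2.11"}

def page_for_ip(remote_addr):
    """Serve the cloaked page only to requests from known spider IPs."""
    if remote_addr in KNOWN_SPIDER_IPS:
        return "doorway.html"   # seen only by the search engine spider
    return "real-content.html"  # seen by everyone else, including snoopers
```

Unlike agent-name delivery, this cannot be fooled by someone faking a User-Agent string, which is why it is the stronger form of cloaking; the trade-off is the ongoing work of keeping the IP list current.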
This information provided by Search Engine Watch.