Rebuilding Web Sites After The Fall
By Damon Marturion
New Business News Staff Writer
After the dot-com crash of 2000, survivors began to ask, "Where did we go wrong?" and to rebuild the Web sites they were so proud of before the fall. What criteria will they follow when re-programming and re-designing? This time, they will build their Web sites with users in mind.
Entering 2001, successful Web sites will be reviewed and rebuilt according to four basic principles. The first criterion for professional webmasters is the highest possible level of usability. The second is built-in Web integration, so that the site becomes "one with the 'net," not just another Web site. The third is to generate quality traffic, and the fourth is to support the business model.
Usability
The new generation of Web sites built by dot-com survivors will pay more attention to accessibility. In the rush to launch as quickly as possible, such issues were often overlooked.
Dave Masters of Web Masters Ink recalls, "Sure, we all bent the rules for the corp. sites, because they had the money and they called the shots. So, we built the sites according to their specifications, all the while interjecting our best advice, which was voted against because it was conservative." Masters explains, "The sites were built for the executives, with high-powered systems, large monitors and T-1 access, with little regard to actual functionality."
According to WebYoda's David Lowe, "They were very happy with what they had built. They could hold it in their hand and they were satisfied. But the people who were the potential clients weren't able to utilize the material because everything was focused around the company and not focused around the visitor coming in."
Masters says, "It was hard enough to navigate the sites if you were new to the Internet and impossible if you had limited resources or disabled." Lowe expounds, "They weren't built accessible because everyone was trying to do flash interface and things that are bells-and-whistles which people with different challenges have no ability to get into those sites."
According to Lowe, the United States government now requires that webmasters "abide by these different accessibility standards if they plan on building sites for the government."
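What does accessible markup look like in practice? The article doesn't say, but as a rough sketch - all file names and link text here are hypothetical - the fixes are often simple habits: describe images, label form fields, and keep navigation in plain text so that screen readers and text-only browsers can follow along.

    <!-- Every image carries a text alternative for screen readers
         and text-only browsers -->
    <img src="products.gif" alt="View our product catalog">

    <!-- Form fields are explicitly labeled, not merely placed
         next to their captions -->
    <label for="email">Your e-mail address:</label>
    <input type="text" id="email" name="email">

    <!-- Navigation exists as plain text links, not only as a
         Flash movie or an image map -->
    <a href="catalog.html">Product catalog</a> |
    <a href="contact.html">Contact us</a>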
Web Integration
Web sites should be built to integrate with the Internet and the World Wide Web at large. Content should be accessible to the hardware and software that run the Internet - search engines, directories, browsers of every stripe - so that the Web site becomes self-perpetuating, using the power of the Internet to draw visitors and users.
Masters says, "If you have these giant advertising budgets, then you can generate traffic to your Web site that way - but if you become advertising dependent - then what happens when you run out of advertising dollars?" Masters answers, "If your site is advertising dependent, then no advertising means no traffic.
"So, properly-programmed Web sites have Web-integration built into them so that they use the tools that drive the Internet to self-propagate themselves. If your site has great content, what good does it do, if it's invisible to search engines? Delivering dynamic content is easy, making it accessible to the World requires real programming skills."
Quality Traffic
Ever wonder why you enter a query into a search engine, looking for specific results, and what comes back isn't even close? Your first reaction is to blame the search engine for being inadequate; in reality, it is more likely the result of unscrupulous Web design.
Masters says, "Webmasters armed with a basic knowledge of how search engines operate can skew results so that their sites show up on irrelevant searches. The problem is that they've promised a certain amount of traffic to their clients - or worse yet - they believe that pure visitor count is the qualifying standard of a good site."
Lowe expounds, "Just simply showing up in the search engines under some anonymous name is not only spamming the search engines, but it's also degrading the search engine and sending people to your site that really have no interest in whatever it is that you're suggesting."
"Plus," says Masters, "We have tools available to us, now, that take all the B.S. out of statistics. Tell any statistician that you want statistics that report X, and he'll generate the stats to support your cause, whatever it may be. So, I see clients coming in from other sources saying that their site is getting '40,000 hits a day' - according to their stats - but can't figure out why they're not making any sales. One look at real data - in comparison to like sites - will likely reveal some level of unscrupulous webmaster activities."
Most browsers let you view the source code of a Web page, typically through a "View Source" menu command. If you think you've been led astray, check the source and see for yourself. Don't be surprised to find "Britney Spears" or "Pokemon" among the keywords - they are popular search terms.
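A "view source" on a page rigged this way might turn up something like the following - a hypothetical example for a site that actually sells office chairs:

    <!-- Keyword stuffing: popular search terms with no relation
         to the page's actual content -->
    <meta name="keywords" content="britney spears, pokemon, mp3,
        free games, celebrities, wrestling">
    <meta name="description" content="Britney Spears Pokemon free
        mp3s - click here!">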
CyberSpacers, in cooperation with the ITAA, has developed a Webmastering Code of Ethics whereby participating webmasters warrant, among other things, that they will "not attempt to falsify, modify or trick search engines or individuals using the Internet by misleading Meta Tags, search engine gimmicks or doorway pages."
Most credible Web design courses teach ethical programming principles and agree with Lowe: "We only condone putting keywords that are specifically related to the products and services that you have."
Lowe explains the ongoing battle between webmasters and search engines, "The people who are trying to beat the search engines are continually finding new ways to do it and the people who are trying to make the search engines better are finding ways to work around it. And it's a pretty vicious cycle. But if you just go in there and you just put a balanced site together, with real good keywords and META tags, and you submit things appropriately, they should come up well in the search engines. And, as time goes on, if the site is what it says it is, the popularity of the site will push it to the top."
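The honest counterpart - again a hypothetical snippet, for the same imaginary office-chair retailer - keeps the tags scoped to what the page actually offers:

    <!-- Keywords and description limited to the products and
         services the page really sells -->
    <meta name="keywords" content="office chairs, ergonomic
        seating, desk chairs, task chairs">
    <meta name="description" content="Ergonomic office chairs and
        task seating, shipped direct from the warehouse.">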
Masters says, "I'd rather build a site that has a ratio of 50-visitors-to-every-sale, rather than 40,000-hits-to-every-sale. It's all about target marketing. Bringing people who want your product to your page in the quickest way possible. Then making it convenient for them to make the purchase."
Supporting Your Business Model
A number of first-round Web sites cashed in on the e-commerce capabilities of the World Wide Web with little regard for their existing business models, effectively shooting themselves in the foot.
Don Witt of Cylogistics contends that corporations threw their backing into Web sites that generated direct sales, bypassing their existing model of vendor-supported sales and leaving retailers and resellers high and dry. Witt says, "If you're going to have a direct sales force out there, or even resellers and distributors out there, having a Web site that sells directly to the end user undermines the channel and, basically, erodes the effectiveness of your sales organization."
Organizations that offer products through distribution channels must revamp their sites to support their distributors and retailers rather than compete with them, which creates a conflict of interest. Witt advises, "They can have a Web site that identifies all of the products, useful information, and have resellers' locations on it. Or take the leads and pass them over to your sales people. It's much more constructive in supporting how you choose your sales strategy."
Witt has worked in computer programming since 1967 and has seen it all since he made his first data transmission via the Internet in 1981. He says, "...we've come a long way in 20 years."
According to Softbank's Gary Rieschel, it takes between $15 million and $25 million to build a top-of-the-line Web site. Yet it costs at least $150 million to build a warehouse and distribution system for a consumer Web operation. "The Internet only solved 10% of the process, the front-end purchase process," says Rieschel. "What we really needed to do was fund the back end."
On a positive note, there is a great deal to learn from the dot-com crash of 2000. By examining the mistakes of crash-and-burn Web sites (the wreckage commonly referred to as "dot-carnage"), we can avoid repeating them as we charge forward to a better Internet. And we can learn from the survivors, who are upgrading to more effective Web sites. The result: a stronger, more powerful Internet, where the winners are the users.
. . . watch for more stories coming soon