On Tue, 05 Sep 2000 11:31:56 +0200, "Barathy, RamaSubramaniam" said:
> Would it not be nice to have some sort of quality control task force that
> assigns a quality level for the web sites throughout the world.
We have enough problems just *finding* all the web sites in the world.
I believe I saw a statistic that NONE of the major search engines included
more than about 20% of the actual pages out there. And you might want
to check out the sort of hardware the big engines have to use to actually
DO the indexing - this sort of thing usually doesn't happen for free, which
means that somebody will have to FUND this....
Also, notice that on *many* sites, all the *interesting* content is
active content - meaning that it's quite hard to index. Two of the
pages I visit a *lot* are the IBM and SGI bug report databases, which
are only searchable via keyword.
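To illustrate the indexing problem, here is a minimal sketch (with a hypothetical two-page site, not the real IBM/SGI databases): a naive crawler only discovers content reachable through static <a href> links, so pages that exist only as responses to a keyword search never get indexed.

```python
# Sketch under assumptions: SITE is a made-up in-memory site standing in
# for any server; a real crawler would fetch over HTTP instead.
from html.parser import HTMLParser

# Static pages link to each other; the "search results" page is generated
# per keyword query and has no static URL for a crawler to follow.
SITE = {
    "/index.html": '<a href="/docs.html">docs</a>'
                   '<form action="/search" method="post"></form>',
    "/docs.html": '<a href="/index.html">home</a>',
    # "/search" responses are dynamic -- invisible to link-following.
}

class LinkExtractor(HTMLParser):
    """Collect href targets from <a> tags only, ignoring forms."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links.extend(v for k, v in attrs if k == "href")

def crawl(start):
    """Breadth-first over static links, the way a simple indexer works."""
    seen, todo = set(), [start]
    while todo:
        url = todo.pop()
        if url in seen or url not in SITE:
            continue
        seen.add(url)
        parser = LinkExtractor()
        parser.feed(SITE[url])
        todo.extend(parser.links)
    return seen

indexed = crawl("/index.html")
print(sorted(indexed))  # the form-driven search results never appear
```

The crawler indexes both static pages but can never reach the keyword-search results, which is exactly why "active content" sites stay dark to the big engines.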
> This would make the site developers to bring in the higher quality to the
Why would it motivate them? There are several possibilities here:
1) They are technically unable to do better. A quality-control rating won't
change that.
2) They are happy with the page, and the page's users are happy. Why do they
care what quality control says?
3) They're happy, and don't care about the users. A QC rating won't
change that either.
4) They're not happy with their work, but are too time-pressed to do
better. A QC rating won't make more hours in the day.
> With more and more web sites, we are getting lost in finding quality
Yes, but quality control isn't the answer, unless you are advocating
shutting down web pages that don't meet your standards....
Operating Systems Analyst