A Way To Review The Web?
In traditional print journalism, people write reviews of books, films, music, and so on. However, there are far fewer reviews of websites. Even online magazines don't review other websites extensively; they simply link to the ones they like.
The web is probably too dynamic to be covered by traditional written reviews. I think the modern version of the review must be something like a ranking of websites. The concept of ranking is already used in printed journalism. For example, critics offer lists such as "The Top 100 Adventure Films Of All Time" or "The 100 Worst Movies Of The Past Decade".
One way to review the web would be a search engine that let users supply their own filters and rankings to influence how it sorted its results. (A way to kludge a version of this engine would be to write a program that obtained search results from Google and then re-sorted them according to the user-supplied rankings of websites.)
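The re-sorting kludge could be sketched roughly as follows. This is only an illustration under assumed inputs: the result list and the ranking dictionary are hypothetical, and a real version would fetch live results from a search API.

```python
def rerank(results, user_rankings):
    """Re-sort a list of search-result URLs, promoting URLs that appear
    in the user's ranked list (lower rank number = better)."""
    def key(item):
        position, url = item
        rank = user_rankings.get(url)
        # Sites in the user's list sort first, by their rank;
        # everything else keeps its original engine order.
        return (0, rank) if rank is not None else (1, position)
    return [url for _, url in sorted(enumerate(results), key=key)]

# Hypothetical example: the engine's order vs. the user's preferences.
engine_results = ["siteA.com", "siteB.com", "siteC.com"]
my_rankings = {"siteC.com": 1, "siteA.com": 2}
print(rerank(engine_results, my_rankings))
# ['siteC.com', 'siteA.com', 'siteB.com']
```

Sites the user has ranked float to the top in rank order, while unranked sites keep the engine's original ordering below them.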
Suppose the format of the user-defined rankings is an open standard. Then websites could offer their own lists of rankings in that format. People who trusted the judgment of the website could download these rankings and use them in the search engine.
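Such an open ranking format might look something like the JSON below. The field names are invented for illustration; no such standard exists, and the sites listed are placeholders.

```python
import json

# A hypothetical ranking file a website might publish in the
# imagined open format (all names and URLs are made up).
ranking_file = """
{
  "title": "The Top 3 Woodworking Sites",
  "publisher": "example-magazine.com",
  "rankings": [
    {"rank": 1, "url": "sawdust.example.org"},
    {"rank": 2, "url": "joinery.example.net"},
    {"rank": 3, "url": "lathe.example.com"}
  ]
}
"""

data = json.loads(ranking_file)
# Convert to the url -> rank mapping a search engine could consume.
rankings = {entry["url"]: entry["rank"] for entry in data["rankings"]}
print(rankings)
```

Because the file is plain structured text, anyone who trusted the publisher could download it and feed it straight into the search engine.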
Implementing this idea would not be a trivial computer-science problem. Algorithms would be needed to apply the rankings. For example, a typical ranking would cover a specific type of website (e.g. "The Top 100 Woodworking Sites" or "The 100 Worst Engineering Forums"). How would a search engine infer which rankings were relevant to a given search? How would it compromise when two sets of rankings disagreed?
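One naive answer to the disagreement question would be to average a site's rank across the lists that mention it, a simplistic Borda-count-style merge. The sketch below assumes rankings have already been reduced to url-to-rank dictionaries; the critic lists are hypothetical.

```python
def merge_rankings(lists):
    """Combine several {url: rank} dicts into one consensus ordering
    by average rank (lower is better)."""
    all_urls = set().union(*lists)
    def avg_rank(url):
        # Treat absence from a list as ranking just below its last entry.
        return sum(lst.get(url, len(lst) + 1) for lst in lists) / len(lists)
    return sorted(all_urls, key=avg_rank)

# Two hypothetical critics who disagree about the same sites.
critic_a = {"siteA.com": 1, "siteB.com": 2}
critic_b = {"siteB.com": 1, "siteC.com": 2, "siteA.com": 3}
print(merge_rankings([critic_a, critic_b]))
# ['siteB.com', 'siteA.com', 'siteC.com']
```

A real engine would also have to weight lists by how relevant they are to the query, which is the harder inference problem the paragraph above raises.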