Google Updates Algorithm To Prevent Spammy Sites
This would be a typical news item, but I'd like to discuss it from a CS engineering perspective. The biggest challenge in front of Google is filtering out low-value websites while ranking quality websites at the top. As mentioned on their official blog, Google's new algorithm update affects 11.8% of its total search queries. Read more: <a href="https://googleblog.blogspot.com/2011/02/finding-more-high-quality-sites-in.html" target="_blank" rel="noopener noreferrer">Official Google Blog: Finding more high-quality sites in search</a>
Now, let's try to figure out the alternative ways Google could tackle this problem. Imagine a set of about 1,000 websites, interlinked with each other. For any particular query, the search engine must return the highest-quality, most relevant websites at the top.
How would you build the logic and a simple flowchart for this task?
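One classic piece of logic for ranking interlinked websites is link analysis: a page linked to by many other pages earns a higher score, and links from high-scoring pages count for more. Below is a minimal PageRank-style sketch of that idea; the toy link graph, damping factor, and iteration count are illustrative assumptions, not Google's actual algorithm.

```python
# Minimal PageRank-style sketch over a toy link graph.
# This is an illustration of link analysis, not Google's real ranking system.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with uniform scores
    for _ in range(iterations):
        # Every page keeps a small baseline score (the "random surfer" jump).
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                # Dangling page: distribute its rank evenly to all pages.
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                # Split this page's rank equally among the pages it links to.
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Toy example: sites A and B both link to C, so C ends up ranked highest.
graph = {"A": ["C"], "B": ["C"], "C": ["A"]}
scores = pagerank(graph)
print(sorted(scores, key=scores.get, reverse=True))  # C first
```

A real search engine would combine such a link-based quality score with query relevance signals (e.g., keyword matching) before producing the final ordering.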