Search Engines
The crawler algorithms of the first search engines (Wanderer and Aliweb, dating back to 1993) relied almost exclusively on meta tags, but, as mentioned above, this procedure very quickly proved unreliable: dishonest webmasters began "inflating" their pages with fraudulent descriptions. (Attracting visitors who are in fact looking for other information is a widespread fraudulent practice that search engines pursue and punish ever more harshly; if you are not convinced, look at the case of Chevron Corp. In English, this deception is called spamming.)

Once search engines could no longer trust the information declared in the HTML, the question became how to improve the process. How do you measure the relevance of a website? These were key questions at the heart of the business, questions that set programmers and creatives to work.

A crucial contribution came in the second half of the nineties. Two Stanford University students, Sergey Brin and Larry Page, decided to count the number of times a website was linked to from other websites. This variable, known as the inbound link, remains to this day one of the most important criteria for determining the relevance of a page. With inbound links, Brin and Page aimed to measure the behavior of the network, the comings and goings of users, the very flow of information: they assumed, for instance, that a page about food would not recommend websites devoted to quantum mechanics. An inbound link is a kind of de facto recommendation and a way to gauge the popularity of a website.

The long-awaited variable did not remain infallible for long, however. Soon, very soon (because everything happens soon in the history of virtual space), dishonest webmasters managed to fool the algorithm: they created link farms, reproduced spam links, and so on.
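The inbound-link idea can be sketched as a simple count over a link graph. This is only a toy illustration with invented page names; the criterion Brin and Page actually developed, PageRank, goes further and also weights each link by the importance of the page it comes from.

```python
from collections import defaultdict

# Toy link graph: each page maps to the pages it links out to.
# All page names are hypothetical examples, not real sites.
outlinks = {
    "recipes.example": ["pasta.example", "pizza.example"],
    "pasta.example": ["pizza.example"],
    "blog.example": ["pizza.example", "recipes.example"],
    "pizza.example": [],
}

def inbound_link_counts(graph):
    """Count how many distinct pages link to each page."""
    counts = defaultdict(int)
    for source, targets in graph.items():
        for target in set(targets):  # count each linking page at most once
            counts[target] += 1
    return dict(counts)

counts = inbound_link_counts(outlinks)
# More inbound links = more "recommendations" = higher presumed relevance.
ranking = sorted(counts, key=counts.get, reverse=True)
```

The weakness the text describes is visible even here: nothing stops a webmaster from generating thousands of fake source pages (a link farm) to inflate a target's count, which is why raw inbound-link counting alone proved gameable.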