PageRank

Analyzing Google

Let’s analyse Google, the “center” of the web.

Next month, on 15 September 2017, we will mark the 20th anniversary of the registration of Google’s domain. It has been twenty years since two Stanford University students, Larry Page and Sergey Brin, completed the first step in the implementation of Google. Twenty years that brought all of us online. Years in which we have learned to entrust our curiosities and our search needs to a single tool, based on one input field and specific algorithms.

Looking back over the course Google has taken, we can identify three distinct stages in these two decades:

Stage 1: The idea 

In a few years our grandchildren will read in history books about Page and Brin and their idea of building “something” to instantly collect the network’s flow of information, something like a “shopping list”.

Legend tells of their wandering from one company to another, meeting smart-alecky, short-sighted executives who did nothing but mock their ideas and vision.

What remains of this first stage today is their visionary, decidedly original idea.


Google, the search engine, is a tool that feeds on the (online) output of humankind. It connects information and people. It is the automaton, the scribe who observes history and takes notes while everything happens.

All of this within a model of use based on a white page, without invasive advertising and without useless waiting times.

Google doesn’t invade our personal or visual space; on the contrary, it suggests to us what to do.

Stage 2: The expansion 

In the early years of its existence, Google, like a child, keeps growing and learning very fast, ingesting billions of web pages. It analyses them and improves its own ranking algorithms. The impression during this stage of technological revival is of a system able to support our knowledge: computers placed all over the world, able to store information and to rapidly acquire, index and compile lists of content.

At the same time it provides us with a personal e-mail service, always available, with generous storage space and no advertising interruptions: Gmail.

This stage of data acquisition and supply, through a single search field, has made a crucial contribution to the evolution of the Internet. It has become easy, very easy, to search and to be found.

Stage 3: The regression: new goals missed 

This third stage began with amazing announcements: a universal translator able to bring together people of different origins, glasses able to augment our reality with additional information, modular and personalised smartphones, self-driving cars.

Is this the beginning of a new era thanks to Google? Absolutely not! At least, not yet…

To this day, Google Translate, Google Glass, Project Ara and the Google car attest to a stalled stage of growth. Probably these research projects aimed too high, requiring elaborations too sophisticated for the computers available until now.

Not even the social network Google+ has reached its intended popularity and usage, compared to Facebook’s, just to be clear.

These science-fiction visions of the 21st century’s second decade failed to materialize. There was no further improvement in any possible direction. Today’s Internet is not so different from the one of five or ten years ago.

And what about Google’s search engine? 

It, too, failed to evolve as we had hoped. A few years ago the network was abuzz with talk of a semantic web, Web 3.0, able to offer correlated content, suggestions, information and intelligent support. None of this has happened so far, at least as far as Google is concerned.

Google’s result list now shows, in its central section, more advertisements than it did some years ago. It promotes geolocated and popular content, in certain cases at the expense of original, cultural content. It applies “automatic” logical assessment of that content.

The amount of information available online, grown out of all proportion, certainly doesn’t help. In response to every query, the search engine offers millions and millions of results.

In fact, the list is completely useless beyond the first or second page: ten or twenty results, each with a short description.

The search engine has not become more intelligent or more accurate, except in a few trivial, minor respects.  

It follows predefined schemes, assigning scores based on rules that computer scientists and copywriters have largely guessed at, carefully replicating or circumventing them. This leads to fierce competition between genuine information and promotional information aimed at sales, or worse, surrogate information, false and showy.

Network users haven’t become more intelligent, either.  

They are probably faster and more connected, but faced with a result list they tend to rely on what Google proposes and to think less. Paradoxically, this doesn’t improve knowledge; on the contrary, it reduces it to popular elements already chosen by others: pages and texts that follow the syntactic and semantic rules the algorithms appreciate. So much data, so many opportunities, but only one list of ten results from which the ever more passive cyberuser may choose.

Criticising Google

Multimedia texts and content shared online are centrifuged and freeze-dried down to their essence. Billions of ideas, words, images and videos become a short list with minimal descriptions. All the information resulting from our searches is contained on a “small postage stamp” on which all of us want to leave our signature: the first page of Google’s search results, the page on which today everyone wants, and has, to be.

Online information grows and evolves. Cyberusers are ever faster and ever more compulsive. Google is becoming the absolute judge of this virtual universe.

For this reason, for the central role it has assumed in our virtual society, Google can and must do better. Its algorithms must become more sensitive and less automatic; more careful to promote content useful to the community; more explanatory in their result lists, so as to provide the most receptive users with more information; more careful in analysing user behaviour, and not predominantly for commercial purposes. It has to be able to understand the user’s requests and raise his level of knowledge and self-awareness.

It is a very difficult task, almost impossible, but if there is anyone who can do it, it is Google.

WorldTwoDotZero

 


Website popularity: an ephemeral love between algorithms and culture

Imagine putting all our personal and cultural references in a large jar: photographs, books, notes, stories, poems, birthday and Christmas greeting cards, drawings made by our children when they were little and so on and so forth.
Imagine that, just after filling the jar to the top, we realize that the mouth of the jar is a bit tight, and therefore we can pull out only one object at a time. At this point we have to overcome the impasse and establish a picking rule, otherwise all our valuable objects will become unreachable.
Well, this is what happened on the Internet! The network has become our multimedia knowledge container, but not only that. As we were putting our information into the global container, we discovered that without the appropriate search tools and without a network of relationships, the information was often unreachable. After all, how many times in the past has the treasure map been lost, and with it the hidden treasure?
Today the success of a search engine is a direct consequence of its ability to collect and display information from the network, which means, as in the metaphor above, having the jar completely full. The search engine has to show this information in a logical sequence, but above all it has to guess the user’s request, basing its search on the few terms it receives.
The growing amount of information, along with the growing number of people on the network, has pushed the best web search engines to the top, able to satisfy our demands in a few instants. At the very top sit the “algorithms”, methods of calculation designed by humans and run obsessively by computers: computers able to judge, on their own, the information spread across the network.
Among these algorithms, probably the most famous is “PageRank”. This algorithm assigns different importance to web pages according to their popularity, or rather to the number of sites that link to them. The higher the number of sites linking to “our” web page, the greater our popularity.
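The idea can be sketched in a few lines: each page starts with an equal score, then repeatedly inherits a share of the score of every page that links to it. The three-page graph below is invented purely for illustration; the damping factor of 0.85 is the value commonly cited for the classic formulation.

```python
# Minimal sketch of the classic PageRank iteration (damping factor d = 0.85).
# The tiny link graph below is invented purely for illustration.
def pagerank(links, d=0.85, iterations=50):
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with a uniform score
    for _ in range(iterations):
        new_rank = {}
        for p in pages:
            # a page's score is fed by every page that links to it,
            # each contributing its own score divided by its out-degree
            incoming = sum(rank[q] / len(links[q])
                           for q in pages if p in links[q])
            new_rank[p] = (1 - d) / n + d * incoming
        rank = new_rank
    return rank

# hypothetical three-page web: A and C link to B, B links back to A
graph = {"A": ["B"], "B": ["A"], "C": ["B"]}
scores = pagerank(graph)
# B receives links from two pages, so it ends up with the highest score;
# C receives none, so it keeps only the baseline (1 - d) / n
```

Note how popularity here is purely structural: page C could hold the finest content on the web, but with no inbound links its score never rises above the baseline.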
Today many other intelligent algorithms analyse “our” web page: the content it presents, every single term used, the frequency with which we update it, the depth of navigation offered by our portal / site / blog, the interest expressed on social networks, the craft with which the page is written (title, subtitle, …), and so on.
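Conceptually, combining many such signals often comes down to a weighted score. The sketch below is a toy illustration only: the signal names and weights are invented for this example, and real engines use far more factors with far more sophisticated combinations.

```python
# Toy illustration of combining several ranking signals into one score.
# Signal names and weights are invented; real engines use many more factors.
def combined_score(signals, weights):
    """Weighted sum of normalised signals, each in the range 0..1."""
    return sum(weights[name] * value for name, value in signals.items())

# hypothetical weights: inbound links matter most, the rest share the remainder
weights = {"links": 0.4, "freshness": 0.2, "depth": 0.2, "social": 0.2}
page = {"links": 0.8, "freshness": 0.5, "depth": 0.6, "social": 0.3}
score = combined_score(page, weights)  # 0.4*0.8 + 0.2*0.5 + 0.2*0.6 + 0.2*0.3 = 0.6
```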
As in the real world, so in the virtual one: reaching high popularity means hard work. We must be well known to many people on the network, and therefore clicked, linked, posted, mentioned… And of course, just as in the real world, maintaining our popularity is extremely difficult, demanding efficiency and talent even from the most tenacious.
Sometimes, however, popularity generates further popularity without any new strategy: being on the “front page” means being featured in the “global showcase”, selected and mentioned, and this increases our ranking; in turn, our new position brings us further to the fore, pushing our ranking higher and higher.
Popularity on the web, the dream of every blogger or writer, depends on an extremely rational, algorithmic selection process, I dare say similar to the one described by Darwin: a process of evolutionary selection that grants notoriety to those already famous, yet is no less interested in new, original, lasting forms of information and culture. It is a rational, cynical process, unceasingly active, open to innovation, a “mutant”, sometimes complicated, sometimes incredibly superficial.
Given these premises, who knows what kind of love will blossom between the ranking algorithms and our culture: a true one or just an ephemeral one?

Worldtwodotzero