Month: August 2014

Me, you, them, Google and the first three search results

This article takes its cue from the news released a few weeks ago about the important milestone reached by Google: 100 billion searches per month.
The first result was the impressive spread of this news across the web, a phenomenon we had already noticed on other similar occasions. After all, it was a piece of news just waiting to be released, one of huge impact with Google as its protagonist, perfect for capturing readers' attention.
In the following days I found the news practically everywhere, proving once again that one of the web's main roles is that of a sounding board: well-known journalists, editors, bloggers, computer scientists, not one of them managed to resist the temptation to repeat the terms "Google", "100" and "billion".
The paradox I would like to underline is that the news, almost like a tape duplicated over and over again hundreds and hundreds of times, began to lose its identity copy after copy and in a short time was reduced to a bare, synthetic announcement. Unfortunately, as often happens, almost all the "loudspeakers" limited themselves to announcing the fact with only a few quick comments, a drumbeat I personally found banal. In short, this led me to check a few things and, indirectly, to write this article.
The oxymoron lies in the fact that the news reporting Google's growth has itself produced thousands of clones, so much so that if you search Google today for the three terms "Google 100 billion" you will find 1,040,000 potential results. A short circuit: the news stating that Google handles billions of searches a month has itself generated over a million occurrences, traceable precisely through a Google search and, consequently, tens of millions of clicks from users. Clicks that produce yet more clicks: the internet really is a self-sustaining system.
From the numerical point of view I agree with Larry Page: the goals Google can reach go far beyond this. With at least a billion potential internet users per day, an efficient engine that works across borders and unlimited space at its disposal, Google could be used by everyone on this planet several times a day. With the number of mobile users (smartphone and tablet) constantly increasing, I believe the milestone reached is significant, but also extremely easy to surpass.
We know from statistics that 93% of "digital natives" (age group 16-24) use the internet at least once a week (source: Eurostat 2011). So not only do all of us grown-ups use search engines; most probably young people and school-age children use these tools daily too.
At this point I wonder: where is all this potential channeled? Are we progressively abandoning other ways of searching, limiting ourselves to a single tool? Do we entrust ourselves unconditionally to search engines?
More precisely, how do we behave in front of a list of search results? Are we able to choose the most appropriate information? Do we have enough analytical capacity and patience?
I searched online for statistical data that could confirm my suppositions. I have to say the search turned out to be longer and less gratifying than I expected. Maybe, once again, we find ourselves facing an oxymoron: I was not able to find, through the main search engines, any reliable and complete statistics certifying how those same engines are used. The internet is not always a transparent universe.
I did, however, find three examples.
The first takes its cue from the AOL data leak, an "accidental" mistake, as it was described on the web: some tens of millions of search queries made by over 600,000 users. This "data pack" is probably the only dataset of significant size available on the internet.
I have briefly summarized the gathered data in the following table, which records about 20 million clicks from the AOL sample, a good number I would say (click on the image to see the details).

AOL_click_stat1

What can I say other than that I am truly impressed! In 42.3% of cases we select the first result on the list; the second result, with only 11.92% of clicks, shows a drop of 71.82%; the third result gets 8.44%, while the following results have minimal selection percentages.
I also find it interesting that, quantitatively, positions such as the 21st, the 31st and the 41st are not that far apart from one another: this shows how only an insignificant percentage of users is willing to search carefully for what they need and therefore to scroll further down the list than the first or second page.
The first page gets 89.68% of the clicks: basically 9 out of 10 users stay on the first page. I find it curious that position no. 10 is the only one that does not follow the decreasing trend, obtaining more clicks than the 9th. It is possible that our eye (or our curiosity) tends to pay more attention to what sits at the bottom of the list.
These data not only confirm my "feeling" that if, after a search, you are not on the first page, your chances of being selected are reduced to a minimum; they also show that 62.66% of selections, roughly two thirds of clicks, are concentrated on the first three results.
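To make the arithmetic explicit, here is a minimal Python sketch that uses only the figures quoted above; the printed values simply re-derive the 71.82% drop, the 62.66% share of the first three positions, and the roughly 10% of clicks left beyond page one.

```python
# Minimal sketch: re-deriving the figures quoted from the AOL click data.
# Only the percentages cited in the text above are taken as input.

ctr = {1: 42.30, 2: 11.92, 3: 8.44}   # % of all recorded clicks per result position
first_page_share = 89.68              # % of clicks landing on the first ten results

# Relative drop between the first and the second result (~ -71.82%)
drop_1_to_2 = (ctr[2] - ctr[1]) / ctr[1] * 100
print(f"Drop from position 1 to position 2: {drop_1_to_2:.2f}%")

# Share of clicks captured by the first three results (62.66%)
print(f"First three results together: {sum(ctr.values()):.2f}%")

# Clicks left for everything beyond the first page (~10%)
print(f"Beyond page one: {100 - first_page_share:.2f}%")
```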
The "relevance ranking" of search engines, that is, the algorithm that determines the position of the search results, seems to guide our clicks inexorably.
My second source is even more "unusual": data from BrandSoftech, a supplier of software solutions for online games and casinos. The sample consists of 5,357,519 clicks on 29,327 different key phrases typed into Google, collected from 63 different betting sites.
Only the data for the first page, that is, for the first ten results, are available online. It is not clear whether the remaining pages capture less than 1% of the clicks, or whether the study analyzes exclusively how clicks are distributed on the first page and the total falls short of 100% only because of rounding to two decimals. I personally go with the second hypothesis: in my opinion we have an important sample of clicks, but one that covers only the first ten results.

BrandSoftech_click_stat

The data are very similar to those in the first table and in fact reinforce the role of the first entries in the list. Here the first three entries reach 71.86%, and since the percentages are distributed only among the first ten positions, in absolute terms roughly 7% should be deducted. In this case too it is better to be tenth than ninth; indeed, here the gap between the ninth and the tenth place is even more evident, which confirms that we most probably trust our visual perception (first and last places) more than what we read, or rather that we pay more attention to what appears at certain points on the screen.
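As a rough illustration of the roughly 7% correction mentioned above, here is a minimal sketch; the 71.86% figure is the one quoted for the BrandSoftech table, while the ~89.68% first-page share is borrowed from the AOL sample purely as an assumption, since BrandSoftech publishes first-page data only.

```python
# Hedged sketch: turning a "share of first-page clicks" into a share of all clicks.
# 71.86% is the BrandSoftech figure for the top three results on page one;
# the 89.68% first-page share is an assumption borrowed from the AOL sample.

top3_of_first_page = 71.86   # % of first-page clicks going to positions 1-3
first_page_share = 89.68     # assumed % of all clicks landing on page one

top3_absolute = top3_of_first_page * first_page_share / 100
print(f"Top three results, in absolute terms: {top3_absolute:.2f}%")
print(f"Correction applied: about {top3_of_first_page - top3_absolute:.1f} points")
```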
If about two thirds of users choose one of the first three results proposed by the search engine, I can well understand why Google, following a search, always places at most three sponsored links at the top of the list: statistically, these are the three links that get looked at the most.
Positions no. 1, no. 2 and no. 3 are the ones cyber users prefer.
If it is true that most users "prefer", or are satisfied with, the first results on the list, we still have, based on the AOL data quoted earlier, about 10% of users who go on to the second page. So, if from a certain point of view the phenomenon of the "easy click" on the first three results may seem worrisome, it is important to underline that all is not lost yet. There is a category of people capable of selecting what follows on the later pages and, I would add from my own experience, a more and more skillful audience able to repeat the same search with additional terms, thereby reducing the number of results it returns.
These two evolved behaviors feed a web phenomenon called the "long tail", from the expression coined by Chris Anderson.
This phenomenon highlights the 80% of less popular needs (of any type or kind): basically it allows us (and you) to track down that film, that book, that piece of "niche" information that is of no interest to most users and would therefore be hard to find without a search engine.
In addition to these two sources which, as I mentioned, are far from institutional, I recommend reading the study carried out at Cornell University entitled "Eye-Tracking Analysis of User Behavior in WWW Search".
In this case it is not the number of clicks or the size of the user sample that matters; instead, close attention is paid to how users perceive the page, through the study of eye movements, in other words how long the user observes the page and in what manner. Again, the first three results play a leading role compared to the others.
For the most curious of you, I also recommend having a look at Google Trends, which monitors the web and proposes daily statistics on what users are most interested in. Unfortunately, at these large scales the clicks mostly reflect general interests and end up highlighting popular national events such as gossip, football and politics... From a social point of view, I believe the web is much more complex and far more receptive than these "trends" pages suggest.
In conclusion, there are reasons to be both optimistic and pessimistic, because the web gives us the opportunity to communicate, to express our opinions, to post a video or a photo, material that unequivocally documents a fact, to look for something very hard to find that sits only at the end of the "long tail"... But, at the same time, we often tend to be satisfied with what the web offers us without fully using the tools at our disposal, first of all our brain.
Do not be satisfied with the first three results given by the search engines: you may find something more interesting on the second or third page.

WorldTwoDotZero

The Spin cycle: Technologies, people, society in “global rotation”

As I have already mentioned in a previous article, one in which I was probably not too original (an opinion you can share only by reading it), the web has changed the world. But how did this change happen? What processes did it set in motion? What changes will follow? Maybe, if we try to analyze the dynamics that brought the net to the attention of our society, it will be easier to understand, or simply to guess, what the future will bring, or at least part of it.
I would like to start from one consideration: the first web, web 1.0, whose diffusion took place in the 1990s, offered static content that hardly changed as the days went by, with a graphic layout similar to that of newspapers. Pages were structured in columns with a title or subtitle and a text. Images were few, as were colors, but the most striking difference is how rarely the content was ever updated. It certainly resembled the good "old newspaper", but being on the web it turned out to be of no use, not even fit to wrap fish in. It is not hard to find online "old web pages" describing an event or a service, or even abandoned blogs whose last, and sometimes only, post is dated 1995 or 1996.
The biggest limits were technical: the creation of web pages was often delegated exclusively to computer technicians, sometimes even to improvised ones. Publication had to go through a wearing chain of steps that considerably slowed down the updating of content, not to mention the correction of glaring grammatical errors.
Do not misunderstand me, every rule has its exception: there are examples in the first web that excel in communication and navigation, though often thanks to the inspiration of a single individual rather than to teamwork.
I dare to compare those first pages to prehistoric icons carved in stone. They have many aspects in common! First of all, they are hard to erase or even to change; second, they were created with primitive tools; and last but not least, their makers had no awareness of the importance of what they were creating: a rough and primitive product from the communicative point of view, yet of inestimable value.
The stone age of the web therefore coincides with the web created in the twentieth century, and the "stone pages" were its direct expression. Every good web expert keeps in a drawer a list of links to stone pages, web pages designed at the end of the 1980s and in the early 1990s.
Continuing our cursory and general analysis, we ought to underline the incredible evolution of technology and the way it progressively succeeded in producing more dynamic and interactive tools, creating software that allows editors to insert and publish content in complete autonomy. Equally decisive was the spread of tools that simplified and sped up the creation of multimedia material, such as digital cameras, high-tech video cameras, audio equipment and so on.
Electronics and computer science certainly made a decisive contribution by proposing new tools, simple to use and affordable for everyone.
So the new web has its genesis in innovation, but such a massive and transversal process cannot be reduced to a few extra bits and chips: the real step change took place the moment people, ordinary people, the masses, became aware of these tools and began to use them daily.
This intensive use not only immediately produced new forms of communication and of social exposure; above all, it generated new and clearer demands about how these tools should work.
As an experienced computer scientist I know that the success of a project, or of a software service, depends very much on how much users actually use and like it, as well as on how many of their suggestions actually make their way into the software.
Stage two of 2.0 is characterized by the users' ability to push the means at their disposal beyond their intended use, sometimes finding innovative and original ways of using them, sometimes even suggesting new developments.
Stage three of 2.0 has been even more important, because it turned millions of individual web users into a virtual society and some of the web's tools into social networks; in other words, the first de facto virtual agora was created. Stage three surprised everybody, even the most optimistic insiders, for its intensity but above all for the course the phenomenon took. As I mentioned a few pages ago, there were a few billion people who felt alone, who had that tribal need to exchange a couple of words, to feel alive each day by bringing, in their own small way, their quick and precious contribution.
Society has been caught in this network and will not manage to get out of it.
The second and third stages have had a massive impact on the tools offered by the web, bringing new stimuli and a lot of money to those able to plan and to innovate.
Stage four of the new web is nothing but a second lap of the track: further innovation, greater use of and identification with the tools by individuals, and a further extension of the borders of an already mass phenomenon.
At this point in my monologue it becomes clear that web 2.0, which all of us are experiencing, is a circular phenomenon, rapid and in constant growth. A circular phenomenon that spins fast, exerting a centrifugal force on all the participants! Just imagine a billion paper handkerchiefs spinning at incredible speed inside a huge washing machine. That is why, when we use a new tool offered by the social networks, we sometimes feel wrung out: we are in the spin cycle.
The spin cycle has another evident side effect: it expels, throws out from its center, the heaviest objects: politicians who love the monologue, researchers who rely on complex tools and rigid publication procedures, whoever wants to keep a secret, long-winded orators, and others...
If you are not cooperative, if you don't want to take up the challenge, if you think it is better to communicate using the ways and the slang of the last century, or if you are simply too slow, if you don't live the real-time web, or perhaps are not fit for innovation at all, then, my friend, you are off the web: you exist "only" as a real person, but from the virtual point of view you are simply meaningless.

WorldTwoDotZero