
A Grain of Salt Reloaded
How I learned to stop worrying and love Statistics

(November 2004)

Introduction (Edited June 2005)


This feature is a 'technical advice' article, meant to help consumer advisors make informed decisions about which products and technologies can benefit consumers. The issue treated here is a narrow one and needs to be combined with other information to form a complete assessment of a consumer product (i.e. this paper discusses some aspects of computer security, but a computer product is 'much more' than its security alone). That said, remember that while we try to report informed opinions to the public, they are only opinions, nothing more. Use them for your benefit, and keep in mind that this feature (like the rest of this magazine) is open to peer review. You'll find our e-mail addresses at www.thinkmagazine2.org.




A recent press release by Mi2g (http://www.mi2g.co.uk), a UK security consultancy, once again singles out the poor security of Linux (and, to a lesser extent, of Windows), giving statistics about overt attacks performed by crackers (cybercriminals) on machines permanently connected to the Internet. Since such machines are very often servers, and since

"MI2G is basing part of their research job relying on Zone-H.org databases"

http://www.zone-h.org/en/winvslinux (Year 2003)

which archives defacements of web servers (a very typical kind of overt attack), we can focus on the server landscape...
More on this later.

For now, here is the raw data (for the year ending November 2004): out of a total of 235,907 successful digital breaches,

Linux accounts for 65.64 per cent
Microsoft Windows accounts for 25.19 per cent
BSD and OS X account for 4.82 per cent

of the total successful overt attacks.
If the sample studied by Mi2g is representative (and chances are it is) and covers mainly webservers (as the Zone-H connection seems to suggest), it should be compared with the automated webserver statistics by Netcraft (www.netcraft.com), which for November 2004 report:


Top Developers
Apache 38,028,642 (67.77%)
Microsoft 11,923,566 (21.25%)
Active Sites
Apache 18,255,834 (69.87%)
Microsoft 5,918,529 (22.65%)


One can object that Apache does not mean Linux. True. Perhaps the most credible contender to Linux in the Apache arena is FreeBSD. In fact, a Netcraft news release (June 2004) said:
"Nearly 2.5 Million Active Sites running FreeBSD"


This would place Linux at around 60% of the total share of active hostnames (not parked domains). To summarize briefly (please note: VERY ROUGH ESTIMATES):

Linux market share (active sites) around 60 %
Windows market share (active sites) around 22 %
FreeBSD market share (active sites) around 9-10%


Linux overt attacks 65.64%
Windows overt attacks 25.19%
Total BSDs overt attacks 4.82%
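One way to read the two lists above is to divide each platform's share of overt attacks by its share of active sites: a ratio near 1 means the platform is attacked roughly in proportion to its presence on the web. A minimal sketch, using only the article's own VERY ROUGH ESTIMATES:

```python
# Attack share vs. active-site market share, figures taken from the
# rough estimates above (Netcraft Nov. 2004 / Mi2g & Zone-H data).
platforms = {
    #           (market share %, attack share %)
    "Linux":    (60.0, 65.64),
    "Windows":  (22.0, 25.19),
    "BSDs":     (9.5,  4.82),   # FreeBSD ~9-10% of active sites
}

for name, (market, attacks) in platforms.items():
    # Ratio > 1: attacked more than its presence would suggest.
    ratio = attacks / market
    print(f"{name:8s} attack/share ratio: {ratio:.2f}")
```

Linux and Windows both come out close to 1 (about 1.09 and 1.15 respectively), while the BSDs sit near 0.5, which matches the qualitative reading given below.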


There does not seem to be a clear difference between attacks on Windows and Linux platforms in terms of sheer probability. The BSDs, on the contrary, seem to fare rather well. However, once the millions of hosts compromised by Windows server worms like SQL Slammer and others are weighed in (see our previous feature for reference), this could spell bad news for the overall 'real world' security of Windows server platforms.


Please note: we are not discussing the 'absolute security' of the various platforms here, just their 'average security' in normal conditions of use. This includes human error (misconfigurations, etc.), so don't take it as a religious matter.
I don't want to take a position here about:
- the quality of design of any given OS;
- how the popularity of a platform helps in cracking that platform.
I may give my opinions on these issues elsewhere.


One could object that since mass hosting usually happens on Linux and BSD platforms, these should be attacked far more than they are... after all, if I were a cracker I'd go only after big providers with poorly administered free homepages. This is a good argument, and if proven true it would place the BSDs in the Olympus and leave Linux quite well off in this kind of analysis, but I have no reliable data to investigate this issue further.



In a later press release, Mi2g addressed the market share issue by comparing the overall market shares of the various platforms, including desktops. Since many manual overt attacks target webservers, I find the present analysis more accurate. Mi2g went on to state that one of the most valuable indicators of the reliability of an online platform is uptime, and pointed to a Netcraft analysis showing that the longest webserver uptimes are 'owned' by BSD platforms. For reference see http://uptime.netcraft.com/up/today/top.avg.html


But be sure to have a look at the following uptime FAQ, also on Netcraft (emphasis mine):

"Which operating systems provide uptime information ?

Additionally HP-UX, Linux, NetApp NetCache, Solaris and recent releases of FreeBSD cycle back to zero after 497 days, exactly as if the machine had been rebooted at that precise point. Thus it is not possible to see a HP-UX, Linux or Solaris system with an uptime measurement above 497 days.

Why do some Operating Systems never show uptimes above 497 days ?

The method that Netcraft uses to determine the uptime of a server is bounded by an upper limit of 497 days for some Operating Systems (see above). It is therefore not possible to see uptimes for these systems that go beyond this upper limit. Although we could in theory attempt to compute the true uptime for OS's with this upper limit by monitoring for restarts at the expected time, we prefer not to do this as it can be inaccurate and error prone.

Why do you not report uptimes for Linux 2.6 or Linux alpha/ia64 ?


The Linux kernel switched to a higher internal timer rate at kernel version 2.5.26. Linux 2.4 used a rate of 100Hz. Linux 2.6 uses a timer at 1000Hz. (An explanation of the HZ setting in Linux.)
The above applies to Linux on 32-bit Intel-compatible systems (which is the most common case). Linux on other platforms uses different timer rates: the Alpha and Intel ia-64 ports already used 1000Hz, while the ports for sparc, m68k and other less common processors continue to use 100Hz.
The Linux TCP code only uses the low 32 bits of the timer. Due to the faster rate of the timer, the value wraps around every 49.7 days (whereas it used to wrap after 497 days). Because there are large numbers of Linux systems which have a higher uptime than this, it is no longer possible to report accurate uptimes for these systems."
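The 497-day and 49.7-day limits quoted in the FAQ follow directly from the arithmetic: the visible uptime counter wraps after 2^32 timer ticks, so the wrap point depends only on the timer rate. A quick sketch of that calculation:

```python
# Why Netcraft's uptime readings wrap: only the low 32 bits of the
# kernel timer are visible, so the apparent uptime resets after
# 2**32 ticks, regardless of the machine's true uptime.
SECONDS_PER_DAY = 86_400

for hz in (100, 1000):  # Linux 2.4 timer rate vs. Linux 2.6 timer rate
    wrap_days = 2**32 / hz / SECONDS_PER_DAY
    print(f"{hz:4d} Hz timer wraps after {wrap_days:.1f} days")
```

At 100 Hz this gives the familiar 497.1 days; at the 1000 Hz rate of Linux 2.6 the counter wraps after only 49.7 days, which is why Netcraft stopped reporting uptimes for those kernels.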



Mi2g also periodically gives estimates of the damage caused by the various types of attacks. Since the amount of damage a compromise does depends largely on the importance of the data on the compromised machine (i.e. government servers are usually more critical than mine), this does not seem very relevant to the present article, so I won't comment on it.

Finally, at page:

you'll find my exchanges with the Zone-H admin. He provided me with useful insights into the defacement panorama. But my opinion of him improved even further when, some time later, I read a new article of his (emphasis mine):

"(...) So far, so good except from one detail: the only exact action after watching these data is that

Why? The reason is simple.
First of all, somebody might argue that the data should be re-evaluated and proportioned to the total amount of worldwide installations.
Second, crackers are choosing OS depending of what is "leet" at that very moment (remember the Solaris Armageddon 18 months ago?)

Availability of 0days for particular OSs is also contributing to the "mumbo jumbo" curves of the above graph.
In fact, nowadays many of the intrusions are performed at database or application level.
Regardless the OS.
Regardless the web server. (...)"

Please read this interesting feature at:



A final joke

So far we have spoken about servers. Let's now speak about desktops. Common opinion holds that only Windows desktops are subject to malware (viruses, worms, spyware and the like), while Linux and Mac OS machines are immune to this threat. This is mostly true, and Mi2g too agrees that malware is practically Windows-only.
That said, let's entertain a fascinating hypothesis: what if all 235k+ recorded manual breaches had been made on Linux desktops? Stop throwing those stones, it's only a joke!
Let's do some deliberately overestimated math, however.

Let's say there are 600 million desktops in the world.
Let's say the desktop Linux market share is 0.25%.
Let's say there are 260,000 breaches of desktop Linux each year...
That is like saying that roughly 20% of desktops running Linux are compromised. But remember the assumptions I made!
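The joke arithmetic, spelled out. All the inputs below are the article's own deliberately rough assumptions, not measured data:

```python
# The 'final joke' overestimate. Every figure here is an assumption
# made for the sake of the joke, not a real statistic.
desktops_worldwide = 600_000_000
linux_share = 0.0025           # assumed 0.25% desktop Linux share
breaches_per_year = 260_000    # pretend every breach hit desktop Linux

linux_desktops = desktops_worldwide * linux_share  # 1.5 million machines
fraction = breaches_per_year / linux_desktops      # ~17.3%, rounded up
print(f"{fraction:.1%} of Linux desktops 'compromised' per year")
```

The exact result is about 17.3%, which the text rounds up to "roughly 20%"; either way the point is how absurd the hypothesis is.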


Now, back to reality

(From 'The Register')

"80 per cent of home PCs infected - survey

(...)They found that nearly all Windows PCs are infected with some form of malware(...)"

Also remember that many widespread Windows worms open backdoors into compromised desktop and server machines, effectively r00ting them for the benefit of crackers, who use them for manual or automated, mostly covert, exploits.




-Speaking of servers, good advice is to use good service providers, regardless of the OS used. Servers are very exposed to many kinds of compromise and need competent staff to handle them. Prefer paid contracts with some degree of service guarantee, if available.

-Speaking of desktops, the simple fact of using an alternative platform seems to greatly reduce the chances of being infected or otherwise compromised.

-Best practice, though, is to use the OS you prefer and know well (provided it is actively patched and updated by its maker), patching and maintaining it with care. If all deployed systems were patched and configured in a timely manner, most exploits, whether manual or automatic, simply wouldn't happen.

Simone Bianchi

Added after article completion - from IT news sources

From Mi2g site
London, UK - 12 November 2004, 14:15 GMT

"(...) On the other hand, if you still prefer a rough rule-of-thumb approach with malware and manual hacker attacks conjoined like apples and pears in one basket, the safest operating system environment would still be BSD + Apple Mac OS X. Next would be Linux and then it would be MS Windows. (...)"

Please read the whole release at: