Alexander Hanff of Privacy International has posted an opinion piece on The Register about the Google Wifi outrage. His byline and the article both mention Privacy International, so one can only assume this misguided piece was written in his official capacity. So why is this piece misguided? To the uninitiated it probably reads as a quite well written analysis of the situation. Unfortunately for Mr Hanff, The Register's readership is generally very tech literate, and as the article's comments section highlights, the piece contains a number of inaccuracies as well as a very one-sided account of the facts. So let's take a look at where Mr Hanff has gone wrong.

Mr Hanff describes a waterfall method of software development, and asserts that this is 'proof' that Google's actions were deliberate. Every large project, Mr Hanff asserts, goes through this rigid model, including thorough testing. However, if Mr Hanff had any experience of software development, he'd be aware that the industry has moved on to 'Agile' development methodologies. Unless the software is 'critical' (i.e. could affect safety), testing simply does not occur to the level that Mr Hanff asserts. Most developers will test the core functionality for bugs, but will not test whether the system does anything else. This lack of testing for non-core functionality often extends beyond deployment, another point which Mr Hanff neglects to consider.

Mr Hanff asserts that Google must have noticed they were collecting more data than expected. This is, again, incorrect. Google collected 1200 Gigabytes of data over a period of 3 years. That's very little data per day (see below for 'per network' calculations).

Mr Hanff then moves on to discuss the culpability of those who were running open networks. This, obviously, is a hotly debated topic. The overall consensus, however, is that it shouldn't be too much to ask to read a manual to find out how to secure your wireless network.
Next Mr Hanff tries to establish that a 2008 patent filing is evidence of intent. The patent concerns the collection of wireless SSIDs and MAC addresses - Google's true aim - but has been picked up by opponents on the basis that it doesn't explicitly state that payload data will be discarded. To me, this is obviously a very tenuous argument: a patent describes what an invention does, not every category of data it will throw away.
The article then refers to a code audit "which clearly showed that this data had been processed in a deliberate manner". This is obviously something of a red herring when you consider Google's claims. Google claims the code was written by one of their engineers some time ago, and that it was then included in this latest project without anyone realising it would record payloads. Clearly, then, any audit of the code would reveal exactly the above findings. The code was obviously deliberately written to record payloads; no-one is disputing that. The 'accident' was including it in the Streetview software. The finding of the report, therefore, has absolutely no bearing whatsoever on whether or not the act was deliberate.

Finally, Mr Hanff asserts that the data is incredibly rich in useful data. To understand why this is wrong, you need to appreciate just how little data was actually collected:

1200 Gigabytes x 1024 x 1024 x 1024 = 1,288,490,188,800 bytes

Collection took place over 36 months: 1,288,490,188,800 / 36 = 35,791,394,133.3 bytes a month

Averaging 30 days to a month: 35,791,394,133.3 / 30 = 1,193,046,471.11 bytes a day

Let's assume an 8 hour driving day, with a 1 hour lunch break: 1,193,046,471.11 / 7 = 170,435,210.159 bytes an hour

170,435,210.159 / 60 = 2,840,586.83598 bytes a minute

2,840,586.83598 / 60 = 47,343.113933 bytes a second

But from a network owner's point of view, there's more maths to do. The equipment changed channel 5 times a second: 47,343.113933 / 5 = 9,468.6227866 bytes per channel visit

There are 13 Wifi channels, so how many seconds to cycle through them all? 13 / 5 = 2.6 seconds

So Google would capture roughly 9.4 kilobytes from your network every 2.6 seconds that they were in range. To put that into perspective, it would take 6.8 minutes ( (((1.44MB x 1024)/9.4KB)*2.6 seconds)/60 ) just to fill a floppy disk with data from your network. Not all of the data captured will even be useful; some of it will be standard network communications. Does this sound like a 'lot of rich data' to you?
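The arithmetic above is easy to sanity check in a few lines of Python. Note that the 7 productive hours per day, 30-day months and 5 channel hops per second are this article's assumptions, not published figures, and the final floppy-disk number comes out at roughly 6.9 minutes when binary units are used consistently (the 6.8 above mixes KB and KiB slightly):

```python
# Back-of-envelope check of the Streetview data volumes quoted above.
# Assumptions: 36 months of collection, 30-day months, 7 driving hours
# per day, 5 channel hops per second across 13 Wi-Fi channels.

total_bytes = 1200 * 1024**3      # 1200 GiB collected in total
per_month = total_bytes / 36      # 36 months of collection
per_day = per_month / 30          # 30-day months
per_hour = per_day / 7            # 8-hour day minus an hour for lunch
per_second = per_hour / 3600

per_dwell = per_second / 5        # kit hops channel 5 times a second
cycle_seconds = 13 / 5            # seconds to cycle through all 13 channels

# How long to fill a 1.44 MB floppy at one dwell's worth per cycle?
floppy_bytes = 1.44 * 1024 * 1024
minutes_to_fill = (floppy_bytes / per_dwell) * cycle_seconds / 60

print(round(per_second))          # 47343 bytes a second
print(round(per_dwell))           # 9469 bytes per channel dwell
print(round(minutes_to_fill, 1))  # 6.9 minutes to fill a floppy
```

Run it with any of the assumptions changed (fewer driving days, more vans) and the per-network figure stays firmly in the single-digit-kilobytes range.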
Would you consider it rich enough to risk this level of outrage? So despite the assertions of Mr Hanff, it seems very unlikely that Google did this deliberately - there's no commercial gain in retaining the payload and the risks far outweigh any purported benefits. All that Mr Hanff's article has achieved is to portray Privacy International as the boy who cried wolf. In my opinion he's done himself, and the organisation, more harm than good. A little bit of research is all that was needed to prevent such an embarrassing excuse for an article from being published. It actually reminds me of a letter I read some time ago, from an ex-Greenpeace member. The author criticised the current leaders for getting too caught up in the 'excitement' of a cause to stop and actually consider the situation carefully. My gut feeling is that a similar phenomenon is occurring around this particular issue.
This page is copyright to me, Ben Tasker. This work is licensed under a Creative Commons Licence.
All images operate under a separate licence. Please read this page for more information. The full Image Licence can be read here.