As I am sitting here - yes, you guessed right! - on a plane, I cannot stop thinking about the book "Geekonomics" (book site), which I just finished reading (earlier impressions here and here). The way it ends, BTW, just kicks you in the balls, hard (if you are curious, look up what Mr. Petrov did on Sept 26, 1983, and why)!
Call me easily impressionable, call me naive, darn, call me "out of touch with current security issues," but this book struck a major, major chord with me. It really did.
Now, I have experienced as much poor-quality and insecure software as the next guy. I am never surprised when some feature in MS Office (or any other application, really) just flat out doesn't work, doesn't work as expected, or doesn't work every time.
I suspect that, by now, every human on Earth who ever laid their hands on a computer knows:
software = might NOT work.
Now, we expect roads, bridges, toasters, chainsaws, bicycles, cars (until they put software in them...) to work - and work they do. And if they don't, the company that manufactures them usually makes them work for us fast - or goes away, cut down by the "benevolent" axe of capitalism. Software is totally different (my thinking about this one).
And everybody knows it. But nobody was brave enough to take a hard look at this fact and analyze how it affected, affects, and will affect our society. And, for my extra-paranoid readers: "... and how it might end that very society."
Until "Geekonomics!"
This book might not reveal any secrets about how software works to an IT professional (it will reveal how law works, though!), but it will explain why bad software is everywhere, why we are stuck with it, why it will not improve by itself, and - sorry for a hysterical note here! - how we might all fucking die because of it. It then unemotionally predicts why more people will certainly die because of bad software. It studies the complicated dynamics of today's software market, such as who is more at fault for bad software - the buyers who agree to buy it or the vendors who make it (or both). It also suggests that many of today's regulations and compliance "thingies" are a little misguided (e.g. in a battle between a PCI DSS-compliant enterprise and a 0-day-wielding hacker, any sane person will bet on the 0-day). It is also very well-written; it won't bore an experienced IT or security pro, and it will not overwhelm a mere IT user.
First, it explains why software is the "foundation of our civilization" today, and how it will be even more so in the future. Next, it casts a look at "innovation" and ponders how innovation-driven software development relates to the fact that users never touch 90% of the features of a typical software product. The third chapter presents the view of the "0wned world" where "only the stupid [cybercriminals] get caught." The next chapters look at how government oversight works in other areas (e.g. the FDA), how it might work - and how it might fail (and has failed in the past). Along the way, the book dispels the "government will just make it worse" myth (basically, because some things are really bad and quickly streaming towards worse already). The amazing chapter 5 gives the clearest explanation of litigation (torts, etc.) that I have ever seen (the book is worth reading for chapter 5 alone!). Chapter 6 takes a super-pessimistic look at open-source software (no comment - just read it). Finally, several possible futures - "the way forward" - are discussed.
Another thing worth mentioning: a reader should keep in mind that this book is not only about "insecure" software. It is about bad-quality, unsafe software in general, and less about "hackable" software. The author chose not to make this distinction very clear, perhaps on purpose.
So, everybody in the software business, in the security business - in fact, just everybody who uses a computer - MUST READ THIS BOOK! Seriously, understanding the point made there might be a matter of life or death for some (all?) of us.
In conclusion, if you want a visual image of the future to end my review, here it is: it is not the "Terminator" future (where machines kill people out of evil) that we must fear and work to prevent, but the "Robocop" future (where machines kill people due to software bugs).
Go read the darn book! And support liability for software manufacturers. Also, in a few days, check this out (not yet, but hover over the link to get a preview...)