I caught a cold starting last week. I took a day off work on Thursday to head the cold off; unfortunately it got worse on Friday and I had work to be done. So instead, I am spending the weekend recovering. I can only sleep so much, and since I can't go out and play, I am stuck at home awake with nothing to do. So I've had a lot of time to kill with writing.
This post is something of a continuation of Sacha's post "Hacking the iPhone was inevitable." The iPhone, like any electronic device running software, is probably hackable, and the lengths people will go to in order to hack hardware run surprisingly deep.
I managed to meet the coordinator of CanSecWest a while back through some interesting connections. I regret that I am bad at keeping up with people, so I am not in close contact with these guys anymore. But there are people out there who know how to take the packaging off a chip and analyze its memory contents with a scanning electron microscope. There are other neat hacks where the code protection on some microprocessors can be disabled by heating up the chip to cause some bits to flip so the data can be read out. Who knows what creative avenues hackers have to get at software and hardware.
The risk of getting hacked is now pervasive. There are now services in the US that can remotely disable cars when people miss their car loan payments. A disgruntled former employee, after being fired, remotely logged into his company's network and managed to deactivate 100 cars.
So should we feel safe with software handling data or mechanical systems?
The biggest benefit and the biggest danger of software is its power to scale with its user base. With the right infrastructure, it's only marginally more expensive for a computer to handle data for 1,000,000 people than for 100. The power of software to scale is what gives our modern systems the capacity to do work for us. If 1 server handles data for 100 people and gets taken down, it isn't a huge problem, but what about a server handling 1,000,000? The negative effects are multiplied by scaling too.
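A rough back-of-the-envelope sketch of that point, in Python: the 1,000,000 figure is from the paragraph above, the rest (even splitting of users, no failover) is purely an assumption for illustration.

# Rough illustration of "blast radius": how many users a single
# server failure takes out as the same load is split across more
# machines. Assumes users are spread evenly and nothing fails over.

total_users = 1_000_000

def blast_radius(servers: int) -> float:
    """Users knocked offline if exactly one server goes down."""
    return total_users / servers

for servers in (1, 10, 100, 10_000):
    print(f"{servers:>6} server(s): one failure affects ~{blast_radius(servers):,.0f} users")

The average amount of trouble may be the same either way; what changes is how much of it can land in a single event.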
What we have here is the potential for a "black swan" kind of event, where a usually reliable system (99.999%) breaks down just once over 4~5 years and causes catastrophic damage. The black swan effect has been used to describe the recent US stock market crash, and it can also describe events such as Chernobyl or the huge power outage that hit Eastern North America some years ago. What modern society is in effect doing right now is putting all of its eggs into an electronic basket and expecting that basket to be very reliable.
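To put a number on what "five nines" actually buys you, here is a quick sketch. The only figure taken from the post is 99.999%; treating the whole failure budget as one single outage is my assumption, since that is what a black swan looks like.

# How much downtime 99.999% availability permits, and what it means
# if that budget lands all at once instead of being spread out.

availability = 0.99999
minutes_per_year = 365.25 * 24 * 60

downtime_per_year = (1 - availability) * minutes_per_year
print(f"Allowed downtime per year: {downtime_per_year:.1f} minutes")

# Over 4.5 years the budget is only about 24 minutes; a black swan is
# when far more than that arrives in one catastrophic event.
print(f"Budget over 4.5 years: {downtime_per_year * 4.5:.1f} minutes")

About five minutes a year sounds fine when it comes as scattered blips; the same reliability number tells you nothing about the one day everything goes down at once.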
A secret weapon out there that could selectively kill electronic devices over some huge blast radius (i.e. War of the Worlds style) would be crippling, like a torpedo up the exhaust pipe of the Death Star.
Deciding how far we should rely on technology is an interesting question, and the best answer I can give is "what and how much are you willing to risk?" That is... if you know that you are taking a risk in the first place.