Charlie Miller prepared his cyberattack in a bedroom office at his Midwestern suburban home.
Brilliant and boyish-looking, Miller has a PhD in math from the University of Notre Dame and spent five years at the National Security Agency, where he secretly hacked into foreign computer systems for the U.S. government. Now, he was turning his attention to the Apple iPhone.
Some critics say security is often a secondary priority for software developers, who tend to prize sales and novel applications over guarding against flaws and hackers.
At just 5 ounces and 4 1/2 inches long, the iPhone is an elegant computing powerhouse. Its microscopic transistors and millions of lines of code enable owners to make calls, send e-mail, take photos, listen to music, play games and conduct business, almost simultaneously. Nearly 200 million iPhones have been sold around the world.
The idea of a former cyberwarrior using his talents to hack a wildly popular consumer device might seem like a lark. But his campaign, aimed at winning a little-known hacker contest last year, points to a paradox of our digital age. The same code that unleashed a communications revolution has also created profound vulnerabilities for societies that depend on code for national security and economic survival.
Miller's iPhone offensive showed how anything connected to networks these days can be a target.
He began by connecting his computer to another laptop holding the same software used by the iPhone. Then he typed a command to launch a program that randomly changed data in a file being processed by the software.
The alteration might be as mundane as inserting 58 for F0 in a string of data such as "0F 00 04 F0." His plan was to constantly launch such random changes, cause the software to crash, then figure out why the substitutions triggered a problem. A software flaw could open a door and let him inside.
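What Miller describes is the technique security researchers call fuzzing. The sketch below, in Python, shows the idea in miniature under stated assumptions: "sample.mp3" and the "player" command are hypothetical stand-ins, not Miller's actual files or tools, and the loop simply flips a few random bytes, runs the target on the result, and keeps any input that makes it crash.

import random
import shutil
import subprocess

# Minimal mutation-fuzzing sketch: flip a few random bytes in a sample input
# file, hand the result to the target program, and save any input that makes
# it crash so the failure can be inspected later.

def mutate(data: bytes, n_changes: int = 4) -> bytes:
    buf = bytearray(data)
    for _ in range(n_changes):
        pos = random.randrange(len(buf))
        buf[pos] = random.randrange(256)  # e.g. swap a 0xF0 byte for 0x58
    return bytes(buf)

def fuzz_once(original: bytes, trial: int) -> None:
    with open("fuzzed_input", "wb") as f:
        f.write(mutate(original))
    try:
        # On POSIX, a negative return code means the target was killed by a
        # signal (such as a segmentation fault) -- a crash worth investigating.
        result = subprocess.run(["player", "fuzzed_input"],
                                capture_output=True, timeout=10)
    except subprocess.TimeoutExpired:
        return  # a hang, not a crash; move on to the next mutation
    if result.returncode < 0:
        shutil.copy("fuzzed_input", f"crash_{trial:06d}")
        print(f"trial {trial}: target killed by signal {-result.returncode}")

if __name__ == "__main__":
    with open("sample.mp3", "rb") as f:
        original = f.read()
    for trial in range(100_000):
        fuzz_once(original, trial)

In practice, each saved crash file is then examined in a debugger to work out why the substitutions triggered the failure and whether the underlying flaw can be exploited, which is the step Miller describes next.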
"I know I can do it," Miller, now a cybersecurity consultant, told himself. "I can hack anything."
After weeks of searching, he found what he was looking for: a "zero day," a vulnerability in the software that has never been made public and for which there is no known fix.
The door was open, and Miller was about to walk through.
Holes in the system
The words "zero day" strike fear in military, intelligence and corporate leaders. The term is used by hackers and security specialists to describe a flaw discovered for the first time by a hacker that can be exploited to break into a system.
In recent years, there has been one stunning revelation after the next about how such unknown vulnerabilities were used to break into systems that were assumed to be secure.
One came in 2009, targeting Google, Northrop Grumman, Dow Chemical and hundreds of other firms. Hackers from China took advantage of a flaw in Microsoft's Internet Explorer browser and used it to penetrate the targeted computer systems. Over several months, the hackers siphoned off oceans of data, including the source code that runs Google's systems.
Another attack last year took aim at cybersecurity giant RSA, which protects most of the Fortune 500 companies. That vulnerability involved Microsoft Excel, a spreadsheet program. The outcome was the same: A zero-day exploit enabled hackers to secretly infiltrate RSA's computers and crack the security it sold. The firm had to pay $66 million in the following months to remediate client problems.
