Investigators: Snowden Used Web Crawler to Collect NSA Files [NYT]

Snowden beat the system with a basic tool

February 10th, 2014 10:26 GMT

It was a rather basic method that Snowden used to gather the NSA files, according to intelligence officials investigating the breach.

The method was inexpensive and automated: a simple web crawler, a piece of software that searches for, indexes, and backs up any material of interest, the New York Times reports.

“We do not believe this was an individual sitting at a machine and downloading this much material in sequence,” said an unnamed official.

The investigation concluded that Snowden’s attack was not as sophisticated as originally thought and should have been detected by the agency’s special monitors.

A web crawler can be programmed to go from website to website, follow embedded links in documents, and copy anything it comes across. Internet companies, including Google, often use such tools to search websites and download content for fast search results.
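The crawling loop described above — follow links, copy pages, repeat — can be sketched in a few lines. This is a minimal illustration, not Snowden’s actual tool (which officials have not described in detail); it crawls an in-memory "site" (a hypothetical dict mapping URLs to HTML) rather than a live network so it is self-contained.

```python
from html.parser import HTMLParser
from collections import deque


class LinkExtractor(HTMLParser):
    """Collect href targets from anchor tags in an HTML page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(site, start):
    """Breadth-first crawl over `site`, a mapping of URL -> HTML.

    Starting from `start`, visit every reachable page, "back up"
    its content, and queue any embedded links not yet seen.
    """
    seen, queue, copied = {start}, deque([start]), {}
    while queue:
        url = queue.popleft()
        html = site.get(url)
        if html is None:
            continue                  # dead link, skip it
        copied[url] = html            # copy the material found
        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:     # follow embedded links
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return copied


# A toy in-memory "site" stands in for a real document store.
site = {
    "/index": '<a href="/a">A</a> <a href="/b">B</a>',
    "/a": '<a href="/b">B again</a>',
    "/b": "terminal page, no links",
}
print(sorted(crawl(site, "/index")))  # ['/a', '/b', '/index']
```

The `seen` set keeps the crawler from revisiting pages, which is the same bookkeeping that lets a real crawler walk an entire site without looping forever.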

Officials believe that the whistleblower got access to 1.7 million files.

Whether the figures coming from the officials are accurate remains to be seen, but Snowden is unlikely to provide information about his methods.
