9 Feb, 2014 05:12

Snowden used common web crawler tool to collect NSA files

Whistleblower Edward Snowden used “inexpensive” and “widely available” software to gain access to at least 1.7 million secret files, The New York Times reported, quoting senior intelligence officials investigating the breach.

The collection process was “quite automated,” a senior intelligence official revealed. Snowden used “web crawler” software to “search, index and back up” files. The program simply kept running while Snowden went about his daily routine.

“We do not believe this was an individual sitting at a machine and downloading this much material in sequence,” the official said.

Investigators concluded that Snowden’s attack was not highly sophisticated and should have been easily detected by special monitors. A web crawler can be programmed to move from website to website via the links embedded in each document, copying everything it comes across.

The whistleblower configured the crawler’s parameters, specifying subjects of interest and how far to follow links, according to the report. In the end, Snowden was able to access 1.7 million files, including documents on internal NSA networks and internal “wiki” materials used by analysts to share information across the world.
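For illustration only, here is a minimal sketch in Python of how a depth-limited, keyword-filtered crawler of the general kind described above can work. This is not Snowden’s actual tool: the `crawl` function, its parameters, and the keyword filter are hypothetical stand-ins for the “subjects and how far to follow the links” settings the report describes.

```python
# Minimal illustrative crawler: breadth-first link following with a depth
# limit and a keyword filter. Hypothetical sketch, not any real NSA tool.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen


class LinkExtractor(HTMLParser):
    """Collects href targets from anchor tags in an HTML page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(seed_url, keywords, max_depth=2):
    """Breadth-first crawl from seed_url, following embedded links up to
    max_depth and keeping a copy of pages that mention a keyword."""
    queue = deque([(seed_url, 0)])
    seen = {seed_url}
    saved = {}

    while queue:
        url, depth = queue.popleft()
        try:
            page = urlopen(url, timeout=10).read().decode("utf-8", "replace")
        except OSError:
            continue  # unreachable page: skip and move on

        # "Search, index and back up": keep a copy if the page
        # mentions one of the configured subjects of interest.
        if any(k.lower() in page.lower() for k in keywords):
            saved[url] = page

        # Follow embedded links, but only to the configured depth.
        if depth < max_depth:
            parser = LinkExtractor()
            parser.feed(page)
            for href in parser.links:
                target = urljoin(url, href)
                if target.startswith("http") and target not in seen:
                    seen.add(target)
                    queue.append((target, depth + 1))

    return saved
```

Pointed at a seed page, e.g. `crawl("http://example.com", ["budget"], max_depth=1)`, such a loop runs unattended and accumulates copies of matching pages, which is why officials described the collection as “quite automated” rather than a person downloading files one by one.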

Snowden reportedly had full access to the NSA’s files as part of his job as a technology contractor in Hawaii, where he managed computer systems at a remote outpost focused on China and North Korea.

Officials added that the files were accessible because the Hawaii outpost had not been upgraded with the latest security measures.

The web crawler used by Snowden was similar to, but less advanced than, Googlebot, the crawler Google uses to visit billions of web pages and download their contents so its search engine can return fast results.

The whistleblower did raise some red flags while working in Hawaii, prompting questions about his work, but he successfully deflected the scrutiny.

“In at least one instance when he was questioned, Mr. Snowden provided what were later described to investigators as legitimate-sounding explanations for his activities: As a systems administrator he was responsible for conducting routine network maintenance. That could include backing up the computer systems and moving information to local servers, investigators were told,” according to the report.

Snowden admitted in June to taking an undisclosed number of documents, which over the past half-year the international media have regularly relied on for a number of high-profile reports about the US National Security Agency and its British counterpart, GCHQ. He was then granted political asylum by Russia and now resides in Moscow.

The leaks have unveiled a number of previously unreported NSA operations, including those involving dragnet surveillance programs that put the digital lives of millions, if not billions, of individuals across the world into the possession of the US government.
