Intelligence officials believe that the National Security Agency should have detected that Edward Snowden was accessing and copying sensitive data regarding the agency's phone collection programs, according to a published report.

The New York Times, citing a senior intelligence official, reported that Snowden, a former NSA contractor, used inexpensive and widely available "web crawler" software meant for use in searching and backing up websites to access approximately 1.7 million files.

Investigators told the Times that the software Snowden used was rarely found on the NSA's systems, which should have been a stronger tip-off to NSA officials that something was amiss, but apparently no one was monitoring the Hawaii location for unauthorized activity.

"We do not believe this was an individual sitting at a machine and downloading this much material in sequence," the official told the Times, adding that the process was "quite automated."

Snowden was working as a contractor for the NSA in Hawaii when he leaked information on the agency's phone and Internet data collection program to The Washington Post and The Guardian this past June.

The Times reported that Snowden's own data collection activities raised relatively few alarms, since the agency outpost had not been upgraded with the security measures found at NSA headquarters at Fort Meade, Md. However, investigators have learned that Snowden was questioned on at least one occasion about the amount of data he was accessing and downloading. The Times reports that Snowden told questioners he was doing routine network maintenance.

Among the files that Snowden accessed were the NSA's shared "wiki" databases, a compendium of information about the agency's operations that enabled him to access documents detailing the agency's phone and data collection capabilities, as well as disclosures about spying on world leaders, such as current German Chancellor Angela Merkel.