Big Data

How the National Security Agency (NSA) compiles and uses metadata has been much discussed lately. Whether or not the NSA uses its gigantic (1.5-million-square-foot) data farm to store information on American citizens, the facility's sheer computational power warrants discussion. According to a recent NPR article by Howard Berkes, the Utah data farm will require 65 megawatts of power to process an estimated 5 zettabytes of data.
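To put the article's 5-zettabyte estimate in perspective, a quick back-of-envelope calculation (using the SI definition of a zettabyte and a hypothetical 10-terabyte drive size, neither of which appears in the article) shows how much conventional storage that would represent:

```python
# Scale check for the 5-zettabyte figure quoted above.
# SI units: 1 zettabyte = 10^21 bytes, 1 terabyte = 10^12 bytes.
ZETTABYTE = 10 ** 21
TERABYTE = 10 ** 12

capacity_bytes = 5 * ZETTABYTE  # the article's estimate

# How many 10 TB consumer hard drives would hold that much data?
DRIVE_TB = 10  # assumed drive size for illustration
drives = capacity_bytes // (DRIVE_TB * TERABYTE)

print(f"{drives:,} ten-terabyte drives")  # 500,000,000 drives
```

In other words, 5 zettabytes is on the order of half a billion large consumer hard drives, which is why a facility of this scale draws so much attention.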

[Image: Big Data - Volume, Velocity, Variety. Credit: BBVAtech/Flickr]

While you may already know that private companies have begun using data mining to analyze consumer trends and create targeted advertising, perhaps you were unaware of the government's use of so-called "big data" until the NSA's PRISM program moved front and center in the news. In fact, the federal government presently has many projects that "address the challenges of, and tap the opportunities afforded by, the big data revolution to advance agency missions and further scientific discovery and innovation." These programs are highlighted in the President's memo, Big Data Across the Federal Government.

Technology naturally outpaces legislation, so you can turn to the records analysts in the Government Records Section for guidance and information on best practices. As the role of government in the big data field evolves, it may be helpful to you as a North Carolina government employee to review the following documents written by the Digital Services Section:

Metadata as a Public Record in North Carolina: Best Practices Guidelines for Its Retention and Disposition

Best Practices for Cloud Computing Records Management Considerations Version 1.0