In the January edition of ITAK, we presented a broad update of the Safe Harbor principle as it applies to establishing and executing a compliant end-of-life media sanitization protocol. In that article, we noted that most government data security regulations that affected data destruction pointed to NIST Special Publication 800-88: Guidelines for Media Sanitization (henceforth: SP 800-88). If an organization could demonstrate that their IT asset disposal practices were based on recommendations contained in that document, it would be more or less off the hook in terms of compliance.
Toward the end of the article I noted two trends that would mean a changed landscape for ITAD in the near future. The developments were as follows:
- SP 800-88 was, and still is, in the process of revision, with significant ramifications for end-of-life disposal found in the expanded sections 4.7 and 4.8: Verification Methods and Documentation, respectively.
- In the real world, government regulations were becoming less important to the business than the standards being established by third-party certifying bodies such as NAID, e-Stewards, R2 and ADISA.
Here we are six months later, and the future has arrived. This article picks up where the January discussion left off, beginning with an overview of how the revised NIST document treats the verification process. We will then take a look at relevant guideline updates among the third-party certification agencies and, finally, speculate on how all of this is supposed to be executed in the real world. As we will see, it will ultimately be up to end users to initiate practical and effective verification solutions to meet these changing requirements.
As noted, the linchpin of the new verification standards is the revised version of good ole SP 800-88, which was posted for comment on the NIST website in September 2012 (Revision 1). While the public commentary period closed November 1, the new document has yet to be officially released, but there is no reason to expect significant changes from the current online version.
(Note: it is not immediately apparent on the NIST website that this document exists in a revised state. Here is a link to the correct page http://csrc.nist.gov/publications/PubsDrafts.html#SP-800-88-Rev.%201, or you can find it on the DestructData, Inc. website here: www.destructdata.com/sp-800-88-rev1.pdf)
When the venerable NIST document was issued in 2006, the practice of “erasing” data on end-of-life storage assets (primarily HDDs) was quite different, when it was performed at all. Hard drives were still mostly being thrown away, warehoused without a plan, or destroyed physically; the concept of care, custody and control was still in its infancy for most IT departments. As a result, the benefits of sanitizing hard drives in recycling, re-purposing and re-selling scenarios were just becoming apparent.
When HDD wiping was performed for high-security applications, it was generally done to the US Department of Defense (DoD) 5220.22-M wipe standard (otherwise known as the “DoD wipe”), which required three passes. Following the release of SP 800-88, this onerous process was gradually replaced by the NIST recommendations for clearing, notably Secure Erase. Today, there are a number of software and hardware solutions available which use a version of the “clear” technology in the original NIST document. It tells us something about the evolution of HDD sanitization standards that NIST left the “clear” definition mostly alone in Revision 1, while focusing far more attention on the Verification and Documentation sections.
In addition to the Verification and Documentation sections already cited, the revised Tables A-1 through A-20 in Appendix A are also worth a look. The list of storage media has been greatly expanded to include products that were either not available or not considered critical in 2006, notably SSDs, iPhones, Android OS tablets and flash drives. Of interest in many of these tables are new hyperlinked references to the revised Verification section, which clearly reflects the increased emphasis on this component.
That brings us smartly to the significantly expanded Section 4.7, Verify Methods, in Revision 1, which now runs two full pages as opposed to half a page in the 2006 version.
In the brand new subsection 4.7.3, the authors note that “the highest level of assurance of effective sanitization (outside of a laboratory) is typically achieved by a full reading of all accessible areas to verify that the expected sanitized value is in all addressable locations.” A full verification should be performed “if time and external factors permit.” No one will argue with this, but it is hard to imagine many scenarios in which “time and external factors” will permit it. We translate external factors to mean “cost” and the odds of a full verification being performed in the real world as “really low.”
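To make the idea of a full read concrete, here is a minimal sketch, assuming the drive was overwritten with zeros and is exposed as a readable block device or image file; the function name, chunk size and zero-fill pattern are our own assumptions, not anything prescribed by SP 800-88:

```python
CHUNK = 1 << 20  # read 1 MiB at a time

def full_verify(path, expected=0x00):
    """Return True only if every byte of `path` equals `expected`."""
    reference = bytes([expected]) * CHUNK
    with open(path, "rb") as dev:
        while True:
            block = dev.read(CHUNK)
            if not block:                      # end of media: everything matched
                return True
            if block != reference[:len(block)]:
                return False                   # found a byte the wipe missed
```

Even this toy version shows why time is the obstacle: the verification pass reads every addressable byte, so it takes roughly as long as the wipe itself.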
SP 800-88 then arrives at a more realistic approach to validating HDD sanitization: representative sampling. The core of subsection 4.7.3 lists the details:
- Selection of pseudorandom locations on the media each time the analysis tool is applied in order to reduce false successes for partial sanitization.
- Selection of locations across addressable space, choosing a large enough number of media subsections so that the media is well covered.
- Each consecutive sample location should cover at least 5% of the subsection and not overlap other subsections.
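Taken together, those rules can be sketched as a simple offset generator: divide the addressable space into subsections, then pick one pseudorandom, non-overlapping sample range of at least 5% of each. The subsection count, function name and use of Python's `random` module here are our own illustrative choices, not prescribed by the document:

```python
import random

def sample_offsets(media_size, n_subsections=1000, coverage=0.05, seed=None):
    """Pick one pseudorandom sample range (start, length) per subsection.

    Each range covers at least `coverage` (5%) of its subsection and is
    kept inside that subsection, so ranges never overlap one another.
    """
    rng = random.Random(seed)
    sub_len = media_size // n_subsections
    sample_len = max(1, int(sub_len * coverage))
    ranges = []
    for i in range(n_subsections):
        start = i * sub_len
        # keep the sample inside this subsection so ranges cannot overlap
        offset = rng.randrange(sub_len - sample_len + 1)
        ranges.append((start + offset, sample_len))
    return ranges
```

A verification tool would then read each returned range and apply the same expected-value check as a full read, but against only a few percent of the media, which is what makes the approach practical at production volumes.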
Section 4.8 Documentation has also been expanded to include a much larger number of job parameters than previously listed. A new sample certification form reflecting these additions is included in Appendix G of the document. However, unlike the new verification guidelines, the items listed on the certificate would usually be provided by an asset tracking system.
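As an illustration of the kind of per-device job parameters an asset tracking system would carry into such a certificate, here is a plain record sketch; the field names and example values are our own guesses at typical tracking data, not the literal Appendix G form:

```python
from dataclasses import dataclass, asdict
from datetime import date

@dataclass
class SanitizationRecord:
    # Illustrative fields only -- not the actual Appendix G certificate layout.
    manufacturer: str
    model: str
    serial_number: str
    media_type: str           # e.g. "HDD", "SSD", "flash drive"
    sanitization_method: str  # e.g. "Clear", "Purge", "Destroy"
    tool_used: str
    verification_method: str  # e.g. "full read", "representative sampling"
    operator: str
    verification_date: date

# One record per device; the tracking system would render these into
# the certificate of sanitization.
record = SanitizationRecord(
    "Acme", "X100", "SN12345", "HDD", "Clear",
    "ExampleWipe 2.0", "representative sampling",
    "J. Doe", date(2013, 7, 1),
)
```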
The additions to sections 4.7 and 4.8 reflect higher stakes and a more rigorous interpretation of what is required for effective verification and documentation. However, NIST is not in the business of specifying how practitioners are supposed to go about implementing these guidelines and it has no say in their enforcement. With these new recommendations on the record, the steps for implementation are already underway. First, the certification agencies must interpret and adapt the revisions for their programs, and then the professionals at the end of the line need to figure out how to make it all work in the real world while remaining compliant.
Since the September 2012 draft of SP 800-88 Revision 1, several certifying bodies appear to have re-evaluated or upgraded their requirements for media sanitization. To one degree or another, all of these upgrades reflect a new focus on the verification and documentation component, if only by default.
The National Association for Information Destruction (NAID) issued standards for electronic media sanitization in its January 2013 NAID Certification document. This document references quality control and redundant verification sampling. The NAID standard states that a “specified number or percentage” of sanitized drives are to be selected for quality control verification of complete data wiping. In addition, the quality control software is required to be different from the sanitization software. This document also contains the fundamental principle that personnel performing quality control should not be the same as those who performed the sanitization. NAID’s standard also contains some, but not all, of the documentation components recommended in SP 800-88.
For its part, e-Stewards has released the second edition (2.0) of the e-Stewards® Standard for Responsible Recycling and Reuse of Electronic Equipment©, directed to the electronic recycling and refurbishing marketplace. Although e-Stewards indicates that further clarification is to follow down the road, they leave 800-88 as the prevailing guidelines for the broad spectrum of media sanitization, including verification. This clarification was issued by Basel Action Network in March of 2013:
Broadly speaking, a refurbisher must demonstrate that they have the operational framework to conform to NIST 800-88 plus e-Stewards performance requirements, and they must have an information system that confirms conformance (i.e. evaluates successful data wiping) on a device-by-device basis.
The latest R2 standard, effective as of July 2013, specifically references SP 800-88 for all matters pertaining to Section 8: Data Destruction.
Finally, the Asset Disposal and Information Security Alliance (ADISA), a European certification body in the process of establishing a North American presence, released IT Asset Disposal Standard 2013 on March 20 of this year. Under 3.4.1 Processing is this line item:
- There must be a documented quality control process which will test a sample number of hard drives and all other data-carrying assets after the data sanitization process has been completed.
This process is listed as “essential”, which is ADISA’s equivalent of mandatory. Quality control in this context is the same as verification.
Verification as a Quality Control Challenge
In the final analysis, the best way to approach the verification/validation stage of media sanitization is as a quality control issue, a concept which has largely been overlooked or undervalued previously. As we have just seen, the situation is evolving rapidly with the release of new guidelines that emphasize the importance of this process. Once again, as is the case with legislative regulations and NIST guidelines, the certification organizations stop well short of providing specific solutions for the rapidly emerging practice of verification. The individual organization is therefore left to analyze the key factors in its asset disposal scenario and come up with the appropriate testing protocols and tools.
Current practice in verifying data sanitization is a patchwork of approaches that are all over the map at this point. But they are going to need to change. While working on a validation tool to meet this challenge, DestructData VP Sales and Technology Michael Cheslock identified several “points of failure” in existing processes. These points of failure relate to software, hardware and personnel.
A common misunderstanding is that using certified data erasure software automatically means that total data erasure is assured. We are by no means saying that certified data erasure software is in any way a bad idea: far from it. In order for data erasure software to receive an EAL rating (Evaluation Assurance Level, part of the Common Criteria standards), certification from CESG (the UK Government’s National Technical Authority for Information Assurance), or other industry certification, the product is subjected to intense testing and scrutiny. The software must demonstrate the ability to perform complete data sanitization on a range of storage media.
Nevertheless, while starting with certified, industry-respected sanitization software is well-advised, it is not a substitute for implementing an independent quality control process for a specific media sanitization scenario. The software’s certification indicates what it can do under ideal conditions with competent operators, but the objective of a quality control process is to validate that the entire process is being executed to the intended specification. In other words, the software isn’t equipped to evaluate itself once it is out of the box.
A second potential point of failure is hardware, simply because every data erasure scenario is different. There are virtually endless combinations of hard drive interfaces, storage platforms, interconnects, chipsets and storage formats, and these are the components that stand between the erasure software and the data on the drives. This high level of variation introduces an additional level of uncertainty that further indicates a need for a means of testing that is absolutely independent. The erasure software can only sanitize the storage it sees; it cannot make up for any limitations associated with the hardware on which it is hosted. It is for this reason that simply running a separate “verify” pass after erasure provides almost no added assurance that the sanitization was successful; it is not an independent process.
Ultimately, lab tests only reflect a portion of the product’s real world usage and they don’t account for any errors not directly related to the product. No matter how well-made a product is or what certifications it has received, it is impossible to test every usage scenario in the lab.
Finally, the need for an impartial operator during the testing process is an established verification criterion fundamental to any quality control scenario. For example, in many equipment-processing environments where retired storage assets will be resold, operators are expected to maximize production.
If an operator perceives a choice between security (including job security) and productivity, they are less likely to be objective as a quality control auditor. This truth creates the need for different personnel to do the testing.
A Verification Solution Wish List
One way to avoid the points of failure would be to implement third-party audits. Of course, as an exclusive quality control measure, this would be cost-prohibitive if it were executed against an adequate sampling of sanitized media. At $300-$500 per drive for such a service, this simply is not a scalable option. An independent internal process is clearly more desirable.
Toward this end, our ideal verification solution would have to look something like this:
It would deploy different software, different hardware and a different operator than those used in the sanitization process. It would have to be an internal solution deployable on a daily basis, and it would also be nice if it were self-contained, cost-effective, fast and easy to use. Ideally, it would also have a simplified design that reduces the potential for systemic issues and can be rigorously tested and deployed repeatedly. It must be configurable to an organization’s specific sanitization policy so that it can catch even tiny deviations from the intended outcome, but standardized to ensure compliance with published, accepted standards. That should do the trick.
When it comes to data security, hope is not a strategy. In the context of the new emphasis on independent erasure verification, expect the perception of what constitutes a valid measurement solution to undergo a rapid reassessment.