BB's simple overwrite is not explained anywhere. Why? And why no standardised erasing patterns, to ensure truer data sanitization?


Can anyone explain what happens when I enable the overwrite function in BB? How does it ensure the data cannot be recovered if I overwrite a file on a hard disk or USB stick?

I do think BB is a great solution for Linux, but how can I trust a function that just says "overwrite" with no explanation, from a technical standpoint, of what is meant by this term?

Under BB's preferences, why not enable users to choose standard secure erasing patterns like:

Gutmann method
http://en.wikipedia.org/wiki/Gutmann_method

AFSSI-5020
http://en.wikipedia.org/wiki/AFSSI-5020

DoD 5220.22-M
http://www.dtic.mil/whs/directives/corres/html/522022m.htm

How to erase data permanently from the hard drive?
http://zksti.com/how-to-erase-data-permanently-from-the-hard-drive

I find it worrying that BB does not explain what it means by the term "overwriting". No explanation is given of the overwriting pattern used. Why?

The technical reason is simple: erasing with 50 passes (and various types of passes) is no more secure than a single pass of 0x00 data, and some of this is explained in the first link you provided (the Gutmann method on Wikipedia). Gutmann's urban legend of secure erasure is so well ingrained in people's minds that dispelling it is as difficult as explaining that the Earth is round rather than flat. Rather than bow to popular demand and give people something they think they need but that actually just wastes their time and gives them a false sense of security, it has been my hope to write a well-reasoned argument against it. I still plan to write that soon.

There are some more details here: http://bleachbit.sourceforge.net/forum/how-many-passes-free-disk-space-o...
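To make the overwrite concrete, here is an illustrative sketch in Python of a single pass of 0x00 over a file. This is not BleachBit's actual code, and it deliberately ignores the file system and device complications discussed later in this thread:

    import os

    def overwrite_once(path):
        """Overwrite a file in place with one pass of zeroes, then delete it."""
        size = os.path.getsize(path)
        chunk = b"\x00" * (1 << 20)  # 1 MiB buffer of 0x00 bytes
        with open(path, "r+b") as f:  # open for in-place binary update
            written = 0
            while written < size:
                n = min(size - written, len(chunk))
                f.write(chunk[:n])
                written += n
            f.flush()
            os.fsync(f.fileno())  # push the zeroes past the OS cache to the device
        os.remove(path)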

---
Andrew, lead developer

Yes, but the BleachBit developers do not explain anywhere what pattern its overwrite function uses. And as to the various standards, which you dismiss almost out of hand: why can't they just be enabled so the USER can decide what erasing pattern they want to use, rather than BleachBit providing no official comment or options on them in its formal documentation and FAQs?

Forum comments can be made by anyone and thus have to be viewed with the scientific principle of respectful skepticism.

Not giving users a choice over which erasing patterns they wish to use would be my main caveat against BleachBit.

Effective erasing also depends on the type of medium the data is on, so a simple overwrite may be less than secure on a solid-state drive, which is common on Linux netbooks, or on USB flash drives, which distribute a file's content all over the place.

The URL posted is concerned with erasing free space after using the overwrite function on files; this secondary erasing suggests the first overwrite may not be effective without a further erasing of free space.

Put simply, BleachBit needs to bite the bullet and provide official comments, FAQs, and documentation on the effectiveness of its undisclosed overwrite function.

Why do these products use various patterns?

http://www.jetico.com/wiping-bcwipe/

http://www.blancco.com/en/product-services/blancco-file-shredder

If BleachBit can achieve the same outcomes as the above costly solutions, then why does it not shout it from the rooftops?

PS:

"I use BCWipe for Windows -- if you wanted to ensure no one could ever recover the file"
http://www.schneier.com/blog/archives/2009/09/file_deletion.html

Forum comments can be made by anyone and thus have to be viewed with the scientific principle of respectful skepticism.

I agree I need to write documentation and a FAQ. I am basically the (only/main) BleachBit developer.

why can't they just be enabled so the USER

There is no clear case where this isn't a waste of the user's time. I am trying here to make the argument that the option (Gutmann, etc.) does more harm than good. In the time I have spent writing this, though, I could have implemented the Gutmann/DoD/etc. feature.

Effective erasing also depends on the type of medium the data is on, so a simple overwrite may be less than secure on a solid-state drive, which is common on Linux netbooks, or on USB flash drives, which distribute a file's content all over the place.

You are losing the distinction between the device and the file system, and forgetting about the application. Each contributes its own complexities. DBAN and Jetico BCWipe, for example, cannot wipe the HPA (Host Protected Area) on hard disk drives, and by default DBAN doesn't wipe the remapped sectors on HDDs. As for file systems and applications, say you are editing a word processor document on NTFS (Windows) or ext3 (Linux): as you re-save the document, it may move around to different places on the disk. Then you securely wipe it (with 1 or 100 passes), but you are only wiping the last known location. Applications, such as Firefox and Microsoft Outlook, keep space allocated for databases (URL history, email archive, etc.): the records in the database are logically deleted, but the contents remain in the file.
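To illustrate the application point, here is a rough Python sketch using SQLite, the database engine behind Firefox's URL history (the file name and marker string are throwaway examples):

    import os
    import sqlite3

    path = "demo.sqlite"  # hypothetical throwaway database file
    con = sqlite3.connect(path)
    con.execute("CREATE TABLE history (url TEXT)")
    con.execute("INSERT INTO history VALUES ('http://example.com/SECRET-MARKER')")
    con.commit()

    # Logically delete the record, as an application would.
    con.execute("DELETE FROM history")
    con.commit()
    con.close()

    # The row no longer appears in queries, but its bytes may remain inside
    # the database file until it is vacuumed (and even then, stale copies
    # may survive elsewhere on the disk).
    with open(path, "rb") as f:
        raw = f.read()
    print(b"SECRET-MARKER" in raw)  # often True with SQLite's default settings
    os.remove(path)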

Back to what you mean: there is not a shred of evidence that anyone can, even at great cost or even theoretically, recover any amount of data from any modern medium that has been overwritten once with any pattern. However, other tools that make claims like "US Department of Defense security" procedures simplify these complex issues, which misleads users.

Why do these products use various patterns?

It's marketing hype pandering to Gutmann's urban legend. There are well-reasoned, documented theoretical vulnerabilities in encryption algorithms, but any attack is very expensive (requiring, say, millions of computer hours to finish). With some older, weak encryption algorithms, you could probably hire a company to break the encryption for you. There is nothing similar with regard to multiple passes. Scientists would love to get credit for theorizing a way to recover the data, and private companies would love to charge you top dollar for recovering it. However, a recovery company won't even bother trying to recover data overwritten with a single pass of any pattern.

If BleachBit can achieve the same outcomes as the above costly solutions, then why does it not shout it from the rooftops?

BCWipe appears to optionally wipe the slack space as part of the process of wiping unallocated disk space: this is tricky, dangerous, and has a high cost-to-benefit ratio, so I probably won't add it. BCWipe wipes swap on Linux; BleachBit wipes swap on Linux too, but not (yet) on Windows. Blancco File Shredder shreds the recycle bin on Windows, but BleachBit shreds the recycle bin only on Linux.
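For readers unfamiliar with the term, wiping free (unallocated) disk space generally means filling the volume with a file of zeroes and then deleting it. A minimal sketch, assuming a hypothetical file name and ignoring slack space (again, not BleachBit's actual code):

    import os

    def wipe_free_space(directory):
        """Fill the free space of the volume containing `directory` with zeroes."""
        filler = os.path.join(directory, "wipe.tmp")  # hypothetical file name
        chunk = b"\x00" * (1 << 20)  # 1 MiB of zeroes
        try:
            with open(filler, "wb") as f:
                while True:
                    f.write(chunk)
                    f.flush()  # force the write so a full disk raises OSError here
        except OSError:
            pass  # the disk is full: the formerly free space is now overwritten
        finally:
            if os.path.exists(filler):
                os.remove(filler)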

Users should evaluate these tools relative to their needs. Are you trying to hide from family members, or are you Osama Bin Laden hiding from the whole free world---including the parts of the world that are highly resourced and motivated to get you? If you are in the latter category, don't use a computer.

So to wrap things up, the security issue is much more complex than making multiple passes. If you think multiple passes are all you need, then you have a false sense of security. What I will do "soon" is document BleachBit's exact capabilities, explain them clearly, and provide users with some general, comprehensive advice regarding data remanence. Then I will link the preferences menu in the application to the online documentation.

---
Andrew, lead developer

Just give users the option to choose the other government-approved and well-researched overwrite patterns. Or just use a LiveCD so nothing ever touches your hard disk, and use TrueCrypt to keep files in an easy-to-dispose hidden volume. Simple :-)

Different people have different threat models for sensitive information such as health records, insurance, business contracts, and patent and research info. Many don't want to worry that the USB stick they left in their clothes at the dry cleaner still has data on it they thought was erased, or about all the computers left in taxis, smartphones left at airports, etc.

So giving the option of the often legally required erasing patterns seems a no-brainer, rather than taking an ideological stance against them, even if commercial file shredders use patterns that could be argued to be snake oil.

Storage wrinkle: 4,500 flash drives left at the cleaners
http://www.computerworld.com/s/article/9147100/Storage_wrinkle_4_500_fla...

PS: (edit)

Bruce Schneier must have some insight http://www.schneier.com/

Blancco File Shredder supports the following algorithms:

- HMG Infosec Standard 5, The Baseline Standard (1 pass)
- HMG Infosec Standard 5, The Enhanced Standard (3 passes)
- Peter Gutmann's algorithm (35 passes)
- U.S. Department of Defense Sanitizing (DoD 5220.22-M) (3 passes)
- Bruce Schneier's algorithm (7 passes)
- Navy Staff Office Publication for RLL (3 passes)
- The National Computer Security Center (4 passes)
- Air Force System Security Instruction 5020 (4 passes)
- US Army AR380-19 (3 passes)
- German Standard VSITR (7 passes)
- OPNAVINST 5239.1A (3 passes)
- National Security Agency (3 passes)
- U.S. Department of Defense Sanitizing (DoD 5220.22-M ECE) (7 passes)

Bruce Schneier must have some insight

I browsed/searched his site briefly and didn't find anything directly related, but afterward I found "Can Intelligence Agencies Read Overwritten Data?" by Daniel Feenberg, which makes basically the same claim as I have been making.

Many don't want to worry that the USB stick they left in their clothes at the dry cleaner still has data on it they thought was erased, or about all the computers left in taxis, smartphones left at airports, etc.

If you "securely deleted" a single spreadsheet with any of these fancy multiple-pass algorithms, and you lost the disk in your clothes at the dry cleaners, would you actually be safe? If you edited the file multiple times on the same file system, probably not because previously pieces of the spreadsheet were in different locations, so the multiple passes overwrote, at best, only the latest copy. On the other hand, if there are so many flash drives lost at the cleaners, who has the time and money to go fishing through what is no more interesting or profitable than a TPS report?

So giving the option of the often legally required erasing patterns

The US DoD standard refers to wiping the WHOLE drive, including all applications, all data, all documents, and the file system.

commercial file shredders use patterns that could be argued to be snake oil.

Great word: snake oil. I'll remember that.

rather than taking an ideological stance against them

It's not ideology: it's pragmatism. Take the analogy of using an herbal home remedy to treat depression. There is no credible scientific evidence the herbal remedy is effective in producing the desired result, but it's cheap and easy to get over the counter. If the person didn't take it, he would probably get professional help, including a proven prescription drug or counseling---and one of those would be much more likely to solve the problem.

---
Andrew, lead developer

No amount of logic will convince the tinfoil-hat crowd that they are wrong. They will just throw "science says" back in your face without understanding the basic principles of what they are saying. I have never seen a single instance where someone has even claimed to have actually recovered data which has been erased in the field (as opposed to a lab setting where they know what patterns were used).

The whole secure-erase industry is a scam. I work for the government, and we are required to do multi-pass erases; the number of passes varies with the secrecy of the documents on the hard drive. The software we use charges us for each erase that is run, in order to store the serial number and the type of erase used. I have asked the technical guys in our security department why we need to do this, and the response has been "because my boss is an idiot and he wants to cover his ass".

It is simple enough to test whether data recovery companies can recover erased data. The majority (I would say any legitimate company) will give you a quote on the cost of recovery. They will list the files that they can recover, and you are not bound to pay them a cent until they have actually recovered the data you are looking for. So, erase a drive (an old one you don't care about) containing known files. Ship it and ask them for a list of files they can recover. Tell them they can dismantle it/whatever it takes. They will not be able to recover a thing. These companies are really good at fixing mechanical defects, but they cannot recover anything that has been the target of a simple format (not a quick one).

Peter Gutmann's 35-pass method was based on research he did into the various ways in which the older MFM/RLL-encoded disks wrote data. A user does not need to find out which of Gutmann's passes is appropriate for the disk drive being erased: the theory is that if you always run all 35 passes, you can be pretty sure one of them will erase your data in the way appropriate to the type of encoding your disk drive uses.

This is no longer relevant for the newer ATA and SATA drives. Peter Gutmann himself states that companies that dogmatically apply his method to modern drives are doing no more than practising voodoo, as if his method were some mystical incantation which makes the data unrecoverable. As has been stated by various people here and elsewhere, on modern drives a single write of random data over the existing data is enough to prevent data recovery. No special algorithms are required.

I stopped using software from companies like Jetico or Blancco because their boasts about supporting Gutmann or any of the DoD methods mean they either don't know what they're talking about when it comes to data erasing on modern disks, or they're wilfully perpetuating out-of-date science in the hope of convincing customers that buying software that supports it is the only way to securely delete data. I use Mark Russinovich's SDelete v1.51 these days, which satisfies my take on modern practices and theories. It still says it adheres to DoD 5220.22-M (3 passes: zero, -1, random), which I still regard as overkill, but it's very fast and will also wipe free disk space. I'm currently building a front-end GUI for it, for my own use.

Having collated a fair bit of information on the subject recently, I've come to the following conclusions about what is required to securely delete data on modern NTFS SATA drives on Windows.
1. To securely wipe a file, you must overwrite it using a 1-pass write algorithm. To avoid what LEA might consider "suspicious deletions", it should be random data. (A sketch of steps 1-3 follows below.)
2. The file must then be renamed to another random name of the same length.
3. The file can then be deleted.
4. To allow for previous versions of a file still remaining on disk, free space must be wiped by writing to it until the disk is full.
5. If you're deleting sensitive registry entries, this must be followed by compacting the registry files, followed by a wipe of free space.
6. Before wiping free space, turn off support for Hibernation (powercfg -H off) to delete the hiberfil.sys file.
7. Either don't use a paging file at all, or encrypt it using something like BCWipe (the only use I have for Jetico software now - I'm actively looking for an open-source alternative).
8. Alternatively, turn off paging while you wipe the free space.

I think that's it - I'd welcome any further suggestions that I may have omitted from my list.
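A minimal sketch of steps 1-3 in Python, assuming a local file on a conventional (non-SSD) drive; the function name is mine, and a real tool would add error handling and possibly several renames, since NTFS can keep old names in its journal:

    import os
    import secrets

    def shred(path):
        size = os.path.getsize(path)
        # Step 1: one pass of random data over the file's current allocation.
        with open(path, "r+b") as f:
            remaining = size
            while remaining > 0:
                n = min(remaining, 1 << 20)
                f.write(secrets.token_bytes(n))  # cryptographically random bytes
                remaining -= n
            f.flush()
            os.fsync(f.fileno())  # flush the random data to the device
        # Step 2: rename to a random name of the same length, so the original
        # name is no longer visible in the directory entry.
        base = os.path.basename(path)
        newname = secrets.token_hex(len(base))[:len(base)]
        newpath = os.path.join(os.path.dirname(path), newname)
        os.rename(path, newpath)
        # Step 3: delete the renamed file.
        os.remove(newpath)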

I also have to say that, in my opinion, between them BleachBit (free), SDelete (free), Registry Defrag (free) and a paging file encryptor do more than enough to securely wipe sensitive file content and put it beyond data recovery.
What's slightly more difficult is tracking down all the detritus that Windows and Windows applications leave behind.

I documented how BleachBit securely wipes files and cleans free disk space, so now it is explained and justified in one place.

---
Andrew, lead developer

I have a question about how BB wipes free space...

Does it write zeroes or random data? I thought I read somewhere in your well-reasoned explanation that it was better to use random data, but then I thought I saw somewhere else that implied BB just writes zeroes.

Could you clarify?

Thanks

There is no credible evidence that random data is better (i.e., harder to recover) than a single pass of zeroes; this sounds like another urban legend. However, random data taxes the CPU and is slower, so BleachBit writes zeroes. You may want to read the documentation, "Shred files and wipe disks", which has more details and some references.
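As a rough illustration of that CPU cost (not BleachBit's code; the sizes are arbitrary), compare how long it takes merely to produce 256 MiB of zeroes versus 256 MiB of random data:

    import os
    import time

    N = 256 * (1 << 20)   # produce 256 MiB of data in total
    CHUNK = 1 << 20       # in 1 MiB chunks

    start = time.perf_counter()
    zero = b"\x00" * CHUNK        # one reusable all-zero buffer
    for _ in range(N // CHUNK):
        buf = zero                # zeroes cost essentially nothing to produce
    print("zeroes: %.3f s" % (time.perf_counter() - start))

    start = time.perf_counter()
    for _ in range(N // CHUNK):
        buf = os.urandom(CHUNK)   # every random chunk must be generated by the CPU
    print("random: %.3f s" % (time.perf_counter() - start))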

---
Andrew, lead developer

Ok. Now that I think about it, I think you wrote somewhere that the benefit of using random values is that it is less obvious that a cleaner was used, rather than making old data harder to recover.

Thanks for your sharing your work!

Either case (random data or zeroes) could be interpreted in several ways. Zeroes could mean that someone intentionally wiped the drive to hide something, that the space was never used, or that a program wiped the space while allocating it for some special purpose (not related to hiding anything).

Random data could mean someone intentionally wiped the drive, encryption, "garbage," or unknown data (e.g., a special file format may seem like random noise).

By itself, either case could be interpreted as hiding data (i.e., the person was doing something wrong so he wanted to hide his tracks) or not hiding (the effect is explained by normal use/non-use).

---
Andrew, lead developer

You place the same false hope in commercial anti-depressants as these guys do in commercial HD wipers. Studies have shown that commercial anti-depressants do not work as well as an exercise program, and most of them, if re-tested today, work no better than a placebo (sugar pill). Frankly, using prescription anti-depressants is one of the biggest mistakes a person can make.

OK, maybe a bad choice for the analogy. :) From my anecdotal experience, prescription anti-depressants have limited effectiveness but can have serious side effects.

I'm not a physician, but how about this analogy: for management of diabetes, "Sugar Control Tea" versus insulin, healthy eating, and exercise? The analogy may still be bad, but I hope I am still making the point.

---
Andrew, lead developer