Trying to recover a quick formatted external 1TB NTFS

A forum on data recovery using the professional data recovery software R-STUDIO.
max
Posts: 4
Joined: Mon Jun 17, 2019 3:32 pm

Trying to recover a quick formatted external 1TB NTFS

Post by max » Wed Jun 19, 2019 5:40 pm

Hi guys, I'm on a mission to recover as many files as possible from an external 1TB NTFS HD after a quick format on Windows 10. It probably held approximately 500GB of files before the incident. After reading many posts on this forum, I still have some questions that I'd like to clarify in order to avoid any additional data loss.

1) Should I plug this HD into a motherboard SATA port instead of using USB? Of course, the speed is higher over SATA, but what about the risk of unexpected and unwanted writes to this disk if it is present during the Windows boot-up process?

2) I've read on this forum that Windows (and other apps, such as antivirus software) may touch the disk without notice during the recovery process. Are there any recommended steps to avoid additional data loss?

3) I have two spare drives, a 1TB HD and a 400GB HD, to help with this recovery. I firmly believe that making an image of the source HD is best practice, but since I don't have two 1TB HDs (one to hold a 1TB image, another to save the recovered files), is there any suggestion? I can't see any option other than i) imaging with the compression function (I still don't know whether it would compress well enough) or ii) skipping the imaging step altogether, trusting that R-Studio won't write anything to the HD, so that working without an original image wouldn't be too risky.

Thanks in advance

Alt
Site Moderator
Posts: 3129
Joined: Tue Nov 11, 2008 2:13 pm
Contact:

Re: Trying to recover a quick formatted external 1TB NTFS

Post by Alt » Thu Jun 20, 2019 12:29 pm

max wrote:
Wed Jun 19, 2019 5:40 pm
1) Should I plug this HD into a motherboard SATA port instead of using USB? Of course, the speed is higher over SATA, but what about the risk of unexpected and unwanted writes to this disk if it is present during the Windows boot-up process?

2) I've read on this forum that Windows (and other apps, such as antivirus software) may touch the disk without notice during the recovery process. Are there any recommended steps to avoid additional data loss?
You can disable disk automount in Windows; read this article for details: Disable Automount of New Drives in Windows 10. As for the connection: SATA is much faster and more reliable.
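For reference, automount can also be toggled from an elevated command prompt with the built-in diskpart utility. A sketch of a typical session (the setting persists until you re-enable it with "automount enable"):

```shell
:: Windows, elevated Command Prompt. "automount disable" stops Windows
:: from automatically mounting newly attached basic volumes, which helps
:: prevent accidental writes to a disk you are about to recover.
diskpart
automount disable
exit
```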
max wrote:
Wed Jun 19, 2019 5:40 pm
3) I have two spare drives, a 1TB HD and a 400GB HD, to help with this recovery. I firmly believe that making an image of the source HD is best practice, but since I don't have two 1TB HDs (one to hold a 1TB image, another to save the recovered files), is there any suggestion? I can't see any option other than i) imaging with the compression function (I still don't know whether it would compress well enough) or ii) skipping the imaging step altogether, trusting that R-Studio won't write anything to the HD, so that working without an original image wouldn't be too risky.
If you're absolutely sure that the disk is healthy, you can work with the hard disk directly. R-Studio won't write anything to the disk unless you explicitly and deliberately instruct it to do so.
You may find this article useful: Data Recovery from a Reformatted NTFS Disk.

max
Posts: 4
Joined: Mon Jun 17, 2019 3:32 pm

Re: Trying to recover a quick formatted external 1TB NTFS

Post by max » Thu Jun 20, 2019 5:15 pm

Thanks for the reply, Alt. I read somewhere on this forum that when creating an image with R-Studio, the only difference (or rather, disadvantage) between the byte-to-byte option and the compressed option (which could let me image the drive even without the full space requirement) is that a compressed image can't be read by recovery apps other than R-Studio. Is that correct? Or is there anything else to be aware of when using compression?

In fact, is the .rdr byte-to-byte image just a plain image that could be read by any other tool (even dd, etc.)?

Using a spare 2GB SD card just to learn how to use R-Studio, I tested the imaging feature and made both a byte-to-byte image (1.89GB) and a compressed image, which I expected to be less than half that (887MB). After the images were made, I was surprised to find that both are nearly the same size: byte-to-byte (1,832,304 KB) and supposedly compressed (1,830,898 KB). What did I miss here?

Best

Alt
Site Moderator
Posts: 3129
Joined: Tue Nov 11, 2008 2:13 pm
Contact:

Re: Trying to recover a quick formatted external 1TB NTFS

Post by Alt » Fri Jun 21, 2019 10:20 am

max wrote:
Thu Jun 20, 2019 5:15 pm
In fact, is the .rdr byte-to-byte image just a plain image that could be read by any other tool (even dd, etc.)?
To the best of my knowledge, only UFS Explorer claims to support this file format.
max wrote:
Thu Jun 20, 2019 5:15 pm
Using a spare 2GB SD card just to learn how to use R-Studio, I tested the imaging feature and made both a byte-to-byte image (1.89GB) and a compressed image, which I expected to be less than half that (887MB). After the images were made, I was surprised to find that both are nearly the same size: byte-to-byte (1,832,304 KB) and supposedly compressed (1,830,898 KB). What did I miss here?
My guess is that the card was full of JPG (already compressed) images. You cannot compress already-compressed data.
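This is easy to verify with any general-purpose compressor (here gzip stands in for R-Studio's compressor, which I assume behaves comparably): a file of zeros shrinks to almost nothing, while high-entropy data, like the contents of JPGs, doesn't shrink at all.

```shell
# Compressibility demo: repetitive data vs. already-high-entropy data.
head -c 1048576 /dev/zero    > zeros.bin    # 1 MiB of zeros (very compressible)
head -c 1048576 /dev/urandom > random.bin   # 1 MiB of random bytes (incompressible)
gzip -c zeros.bin  > zeros.bin.gz
gzip -c random.bin > random.bin.gz
stat -c '%s' zeros.bin zeros.bin.gz random.bin random.bin.gz
# zeros.bin.gz ends up around 1 KB; random.bin.gz is no smaller than its input.
```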

abolibibelot
Posts: 40
Joined: Sun Jan 31, 2016 5:45 pm
Location: France

Re: Trying to recover a quick formatted external 1TB NTFS

Post by abolibibelot » Thu Jul 18, 2019 8:32 pm

1) Should I plug this HD into a motherboard SATA port instead of using USB? Of course, the speed is higher over SATA, but what about the risk of unexpected and unwanted writes to this disk if it is present during the Windows boot-up process?
You can plug in a SATA drive after the system has booted. Hot-swap bays or cages make this easier, but it can also be done by connecting the cables directly; just be very careful not to move the drive at all while it's spinning up or running.
3) I have two spare drives, a 1TB HD and a 400GB HD, to help with this recovery. I firmly believe that making an image of the source HD is best practice, but since I don't have two 1TB HDs (one to hold a 1TB image, another to save the recovered files), is there any suggestion? I can't see any option other than i) imaging with the compression function (I still don't know whether it would compress well enough) or ii) skipping the imaging step altogether, trusting that R-Studio won't write anything to the HD, so that working without an original image wouldn't be too risky.
It's indeed best practice to have a clone or image, especially if you have no backup at all (which is certainly bad practice!), but if you're certain that the source drive is perfectly healthy, you can get away with running the analysis and extraction directly, without a safety net.
Another suggestion: ddrescue (Linux) can write a volume image in “sparse” mode, meaning that empty sectors are not allocated on the destination, which can reduce the size of the image almost as effectively as actual compression when most of the files are already in compressed formats (JPG, MP4, MP3...). The advantage is that a sparse image created by ddrescue can be accessed by any software; it appears just like a “raw” image. One problem: there seems to be a significant performance hit when writing in “sparse” mode to an NTFS partition; I don't know whether it's caused by ddrescue itself or by the Linux NTFS driver. I asked about it on SuperUser but didn't get much conclusive input.
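The effect is easy to reproduce with plain dd from coreutils, whose conv=sparse option skips writing all-zero blocks (ddrescue's -S/--sparse flag does the equivalent during imaging). The copy reads back identical but occupies far less space:

```shell
# Sparse-copy demo: an 8 MiB file that is almost entirely zeros.
dd if=/dev/zero of=source.img bs=1M count=8              # fully allocated zeros
printf 'some real data' | dd of=source.img conv=notrunc  # a few non-zero bytes
dd if=source.img of=sparse.img bs=4K conv=sparse         # zero blocks left unallocated
cmp source.img sparse.img                                # byte-identical contents
du -k source.img sparse.img                              # sparse.img occupies far fewer blocks
```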

Alt
Site Moderator
Posts: 3129
Joined: Tue Nov 11, 2008 2:13 pm
Contact:

Re: Trying to recover a quick formatted external 1TB NTFS

Post by Alt » Fri Jul 19, 2019 9:46 am

abolibibelot wrote:
Thu Jul 18, 2019 8:32 pm
Another suggestion: ddrescue (Linux) can write a volume image in “sparse” mode, meaning that empty sectors are not allocated on the destination, which can reduce the size of the image almost as effectively as actual compression when most of the files are already in compressed formats (JPG, MP4, MP3...).
"Sparse" imaging has a very serious drawback: you won't be able to retrieve information from those empty sectors later. They may contain raw files.

abolibibelot
Posts: 40
Joined: Sun Jan 31, 2016 5:45 pm
Location: France

Re: Trying to recover a quick formatted external 1TB NTFS

Post by abolibibelot » Sat Jul 20, 2019 1:02 pm

Alt wrote:
Fri Jul 19, 2019 9:46 am
abolibibelot wrote:
Thu Jul 18, 2019 8:32 pm
Another suggestion: ddrescue (Linux) can write a volume image in “sparse” mode, meaning that empty sectors are not allocated on the destination, which can reduce the size of the image almost as effectively as actual compression when most of the files are already in compressed formats (JPG, MP4, MP3...).
"Sparse" imaging has a very serious drawback: you won't be able to retrieve information from those empty sectors later. They may contain raw files.
Are you sure about that? As far as I know, “sparse” writing only results in sectors that are completely empty on the source being left unallocated in the image; they are still there, referenced in the MFT record(s) (I don't know exactly how yet) and, in effect, “emulated” by the system whenever a program accesses them. Normally the checksums of the source and the image should match, so a regular image and a “sparse” image should be identical for all intents and purposes. Recently I created (with ddrescue) a ~500GB sparse image of a 4TB HDD containing about 550GB of data (a healthy drive that had been painstakingly cloned from a defective one; details in this thread if you're curious), then compared the image and the source HDD with WinHex in “synchronize and compare” mode: they were identical. (I didn't go to the trouble of computing the checksums, which would have taken 7-8 more hours, but I'm fairly sure they would have matched, since I had disabled the MBR on the HDD to prevent Windows from writing to it, and the ddrescue copy went flawlessly.)

There is, however, a free tool that can skip sectors which are unallocated on an NTFS-partitioned source, even if they are not actually empty (and may therefore contain files recoverable by the “raw carving” method): ddru_ntfsbitmap, included in ddr_utility. It parses the $Bitmap file and generates a mapfile for ddrescue, restricting the subsequent image creation (or cloning) to the areas that are actually allocated (i.e. not considered “free space” by the file system). It can also recover the MFT first, which can save a lot of trouble in some situations.
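As I understand its usage, the two-step workflow looks roughly like this (device and file names are placeholders for illustration, not commands to run verbatim against your own disks):

```shell
# Hypothetical sketch; /dev/sdb1 stands for the source NTFS partition.
# 1) Parse the NTFS $Bitmap and build a ddrescue domain mapfile covering
#    only the clusters the file system considers allocated.
ddru_ntfsbitmap /dev/sdb1 bitmap.map
# 2) Image the partition restricted to that domain, writing sparsely so
#    the skipped regions take no space on the destination.
ddrescue --sparse --domain-mapfile=bitmap.map /dev/sdb1 partition.img rescue.map
```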

Alt
Site Moderator
Posts: 3129
Joined: Tue Nov 11, 2008 2:13 pm
Contact:

Re: Trying to recover a quick formatted external 1TB NTFS

Post by Alt » Sun Jul 21, 2019 1:03 pm

abolibibelot wrote: Are you sure about that? As far as I know, “sparse” writing only results in sectors that are completely empty on the source being left unallocated in the image; they are still there, referenced in the MFT record(s) (I don't know exactly how yet) and, in effect, “emulated” by the system whenever a program accesses them.
What do you mean by "unallocated" sectors in this context?

abolibibelot
Posts: 40
Joined: Sun Jan 31, 2016 5:45 pm
Location: France

Re: Trying to recover a quick formatted external 1TB NTFS

Post by abolibibelot » Tue Jun 02, 2020 8:51 pm

Sorry, I'm replying only now, about 10 months later...
Well, I thought it was straightforward enough, and the definition of a sparse file is easy to find if needed. “Unallocated” in this context means that the empty sectors do not actually occupy space on the storage device; instead they are referenced as part of the file's metadata, and when the file is accessed those sectors are still “expanded” as if they were physically present. So if you compute the checksums of a “sparse” image file and a “non-sparse” image file of the same volume, they should match, even though the actual allocated size of the sparse image file can be much smaller.
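A minimal illustration (using truncate and du, assuming a file system with sparse-file support, e.g. ext4 or NTFS): both files read back identically, but only the non-sparse one occupies its full size on disk.

```shell
# Sparse vs. fully allocated: same logical size and contents, different footprint.
truncate -s 1M sparse.img                  # 1 MiB logical size, no blocks allocated
dd if=/dev/zero of=full.img bs=1M count=1  # 1 MiB of zeros, fully written
ls -l sparse.img full.img                  # both report 1048576 bytes
du -k sparse.img full.img                  # sparse.img occupies (almost) nothing
sha256sum sparse.img full.img              # checksums match: reads return zeros
```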
