HW impact on recovery performance

A forum on data recovery using the professional data recovery software R-STUDIO.
HTF
Posts: 1
Joined: Tue May 04, 2021 9:43 am

HW impact on recovery performance

Post by HTF » Wed May 05, 2021 7:21 am

Hi,

I have some questions about the performance of R-Studio for Linux and about ways to optimise my recovery setup.

I have to recover all files from a couple of 1.5 and 2 TB HDDs to find some lost files that can no longer be found in my backups.

For this job I installed R-Studio for Linux (bought in November 2019) on a NUC with an Intel Celeron N3450 (quad-core, 1.1 GHz, up to 2.2 GHz), 8 GB RAM, a 600 GB HDD, Debian Buster, and two external 8 TB HDDs, each attached to a dedicated USB 3 port.

My plan was to first create an image on 8 TB HDD 1 and then recover all files from the image to 8 TB HDD 2.
After that, I would transfer the recovered files to my PC for further screening and move on to the next HDD.
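
For rough planning, here is a back-of-the-envelope estimate of one full pass over a 2 TB drive; the throughput figure is only an assumption, not something I measured on this setup:

# Rough estimate of one full sequential pass over a 2 TB source drive.
# The throughput figure is an assumption, not a measurement from this setup.
size_bytes = 2 * 10**12          # 2 TB
throughput_bps = 120 * 10**6     # assumed ~120 MB/s average over USB 3 to an HDD
hours = size_bytes / throughput_bps / 3600
print(f"~{hours:.1f} hours per full pass")   # about 4.6 hours

Imaging and then extracting means at least two such passes, so I expected something in the order of a working day per disk.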

Creating the image of the first 2 TB HDD worked as expected, but the file recovery was either awfully slow or the NUC crashed.
R-Studio no longer responded. I couldn't see any progress, and after 3 days with the screen still showing 0%, I ended the recovery job via the power button.
I got a few MB of files in a folder structure on HDD 2, but most of them were 1 KB files containing garbage.

Now I have some questions before I start a lot of testing on my own.

1. What is the expected time for a full recovery (all files) of a 2 TB HDD with no defective sectors?
Is it reasonable that it might take hours for the progress to move from 0 to 1%?
Can I see somewhere that R-Studio has not crashed and is still working? (See the monitoring sketch after question 5.)

2. Performance-wise, is it best to use two HDDs, or should I extract the files to the same HDD the image is stored on?

3. Instead of the NUC, I could also use my PC with an Intel Core i5-6400 (4 × 2.70 GHz), 16 GB RAM and a 512 GB SSD.
Obviously more resources might speed up the data recovery, but how much time would the configuration above actually save?
Does R-Studio benefit much from more CPU speed, 16 GB instead of 8 GB RAM, or an SSD instead of an HDD as the boot medium at all?

4. What if I use a USB stick instead of the SSD?
Would R-Studio for Linux slow down significantly because it swaps data heavily to its boot medium during recovery? (The monitoring sketch after question 5 also shows swap usage.)

5. Does the Linux distribution (Debian, Ubuntu, Fedora) have any impact on recovery performance?
Is there a big difference in performance between a server and a desktop image, or between the desktop environments?
In theory the server image has fewer running processes, and R-Studio might benefit from that.
What time saving should I expect?
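
Regarding questions 1 and 4: to at least see from the outside whether anything is still happening, I was thinking of watching the recovery process and swap usage from a second terminal with something like the sketch below. It needs the psutil package, and the process-name match is only a guess on my part; I haven't checked what the R-Studio for Linux process is actually called.

import time
import psutil  # pip install psutil

# Find the recovery process by a substring of its name.
# "r-studio" is a guess; adjust it to whatever `ps` shows on your machine.
def find_recovery_process(name_part="r-studio"):
    for proc in psutil.process_iter(["name"]):
        if name_part in (proc.info["name"] or "").lower():
            return proc
    return None

proc = find_recovery_process()
if proc is None:
    print("recovery process not found")
else:
    before = proc.io_counters()          # on Linux this needs the same user or root
    time.sleep(10)
    after = proc.io_counters()
    print("read bytes in 10 s: ", after.read_bytes - before.read_bytes)
    print("write bytes in 10 s:", after.write_bytes - before.write_bytes)
    print("CPU %:", proc.cpu_percent(interval=1.0))
    print("swap used (MB):", psutil.swap_memory().used // 2**20)

If the byte counters keep increasing between runs, the job is presumably still alive even while the percentage sits at 0%.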

I know that these questions might be better addressed to R-Studio's technical support, but maybe someone already has test results or experience regarding performance that they can share here.

Thanks in advance.
HTF

DiskTuna
Posts: 15
Joined: Mon Oct 12, 2020 11:21 am

Re: HW impact on recovery performance

Post by DiskTuna » Fri May 07, 2021 4:10 pm

FWIW, I am at 2% of 2 TB in under 3 minutes (external drive, USB 3).
The CPU is hardly used; this may depend on what it is finding, and this is during the initial scan. It may need more CPU at some other point. I do not know how well R-Studio is optimized for multithreading; if it is, you'd generally choose cores over clock speed. If it's largely single-threaded, cores don't matter and clock speed does, so in that case those Celerons aren't the best choice.

Memory becomes an issue once R-Studio (or any file recovery tool) starts finding millions of files. In the 32-bit era my software started running out of memory when we hit around 10 million files, if I recall correctly, even after booting Windows with the /3GB switch.

If you have an image file, it's generally best to have one drive for the image file and another to save the recovered data to. Maybe put them on different SATA ports too; I'm not sure whether it matters. I have no idea what a NUC is.
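
To illustrate the memory point above, a rough back-of-the-envelope; the per-entry size is only an assumed ballpark for illustration, not R-Studio's actual figure:

# Rough estimate of RAM needed just to keep metadata for found files.
# The per-entry size is an assumption, not R-Studio's real footprint.
files_found = 10_000_000
bytes_per_entry = 300      # assumed: name, path link, size, timestamps, flags
total_gb = files_found * bytes_per_entry / 2**30
print(f"~{total_gb:.1f} GB just for file metadata")   # roughly 2.8 GB

So tens of millions of found files can eat a lot of RAM regardless of how fast the CPU is.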
