Nerds2You

Why Is Data Recovery So Expensive?

Computers may seem smart, but in reality, they are not much different from any other kind of appliance that runs on electricity.

Any appliance can malfunction: power fluctuations, heat, and component wear all take their toll. Most malfunctions are so minor their effects may never be noticed, but some can create very big problems.

Even worse problems can occur due to malicious attacks, or even innocent mistakes made by your own users. Ransomware is the most serious threat to emerge in recent years, though its impact can be greatly reduced by knowledgeable administrators who keep proper backups.

You may occasionally hear the phrase “catastrophic data loss”, but until it happens to you, the full meaning of those words may not sink in. Catastrophic data loss really is catastrophic.

How long could your organization continue to function if you lost access to mission critical data? Would you be able to serve your customers? Would you be able to do your accounting properly? Would you be able to pay your employees?

Data Recovery

Data recovery is always expensive, because it’s very labour intensive and requires highly skilled technicians to perform the task.

The first step, unless the cause is already known, is to run diagnostics to determine why the data is unavailable. Identifying the cause then points the way to possible remedies.

Simple accidental deletion is usually the easiest problem to fix, and far less expensive to recover from than file corruption or a hardware failure of some sort.

If the problem was caused by file corruption, then the data can be successfully recovered (or partially recovered) most of the time, but it requires deconstructing the file, discarding the bad bits, and then reconstructing the file. As you can imagine, that process can take some time, and you have to multiply that time by the number of affected files.
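The deconstruct-discard-reconstruct idea can be illustrated with a toy sketch. It assumes a hypothetical file format in which each block is stored alongside its SHA-256 checksum; real recovery tools work against the actual on-disk format and are far more involved, but the principle is the same: verify each block, keep the good ones, discard the bad.

```python
import hashlib


def salvage_blocks(blocks_with_checksums):
    """Keep only blocks whose stored checksum still matches their data.

    `blocks_with_checksums` is a list of (data, stored_sha256_hex) pairs
    from a hypothetical block-plus-checksum file format. Returns the
    reconstructed file contents and the number of blocks discarded.
    """
    good, dropped = [], 0
    for data, stored_sum in blocks_with_checksums:
        if hashlib.sha256(data).hexdigest() == stored_sum:
            good.append(data)        # block verifies: keep it
        else:
            dropped += 1             # block is corrupted: discard it
    return b"".join(good), dropped
```

This is why per-file recovery is slow: every block of every affected file has to be read, verified, and reassembled, and that time multiplies across the number of damaged files.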

The next level of fault is a failed hard drive controller board. This is the PCB mounted on the drive, and it can be removed and replaced. If the drive model is rare, sourcing a matching replacement board can be very expensive. Board swaps are also harder on newer drives than on very old ones, because modern boards carry drive-specific calibration data that must be transferred to the replacement.

Any fault requiring physical access to the drive platters is the most expensive and difficult to fix (typically upwards of $25,000), and the drive itself will need to be replaced, because once it has been opened it cannot be reused. Recovery at this level requires a cleanroom, specialized hardware, and skilled technicians, and even then the success rate is very low.

Real-Time Incremental Offsite Backups

Data loss should be considered an eventual inevitability. Another inevitability is that data recovery will always cost much more than keeping proper backups of your data files.

The old-fashioned way to back up data was to schedule it for a time when nobody was likely to be using the system for work purposes. This method is no longer practical: the much larger capacity of modern drives means full system backups take too long to complete, and many organizations have people working at any hour of the day or night.

A much better way to create backups exists now. This method backs up all your data files initially, and then later backs up any files you make changes to. The backup is fast and efficient, because only files that are being modified get backed up. This is how it works:

  1. Backup software detects that a user has opened a file.
  2. Backup software makes a temporary copy of the file on the local machine.
  3. When the user closes or saves the file, the backup software compares the checksum of the temporary copy to the original file.
  4. If the checksums are different, a new temporary copy of the file is made and uploaded to the cloud as a version of the file.
  5. The temporary files created in the process are deleted.

The benefits of backing up files this way include:

  • The file is only backed up if it is changed – backing up a file that is identical to an already stored version of the file would be a pointless waste of resources.
  • The backup happens instantly.
  • Very large files can be opened and worked on even if the backup hasn’t fully completed, because only a copy of the file is being uploaded.
  • Multiple files can be opened and worked on at the same time, and the process is the same for each one.
  • The software only consumes resources when it is doing something, the resource consumption while working is minimal, and the uploads can (in most cases) be throttled to avoid bandwidth congestion.
  • Multiple versions of files are available, and you can roll back to whichever version is the last known “good” version of the file.
  • The backed up files are stored offsite, so even if there was some kind of crisis in your workplace where all the computers became unusable (fire, natural disaster, theft), you would still be able to access your data once you had access to a replacement computer.

The backup software also detects when new files are downloaded or copied from external media, and backs those up too. You can disable this behaviour if you prefer.

Nerds-2-You makes it easy to protect your data

Nobody wants to pay those high data recovery costs. The best way to avoid that is by making sure your data is properly backed up. The best way to get your data backed up is with help from Nerds-2-You. Call us to find out more.
