How to avoid re-scanning many unchanged files

cmcruz007
Knowledgeable
Posts: 22
Joined: Fri Jun 11, 2010 8:18 am

How to avoid re-scanning many unchanged files

Post by cmcruz007 »

I want to configure a mirror (replica) job between a source and a destination that are currently identical.
Source and destination are huge and contain many small files (about 1 TB).
Each day 40-50 files will be added to the source and will need to be copied to the destination.
What is the most efficient way to configure the job so it avoids re-reading everything that is already there?

Thanks
Swapna
Expert
Posts: 1031
Joined: Mon Apr 13, 2015 6:22 am

Re: How to avoid re-scanning many unchanged files

Post by Swapna »

Hi,

Sorry, it's not possible to copy files without scanning and comparing the files in the Source and Destination locations.

Thank you.
reynolds_john
Newbie
Posts: 1
Joined: Tue Jun 10, 2008 6:18 pm

Re: How to avoid re-scanning many unchanged files

Post by reynolds_john »

You could consider turning on Fast Backup to reduce some overhead, but read the FAQ carefully.

If you're looking to avoid doing comparisons over a network (for example), you could try something like DeltaCopy (basically rsync for Windows).

http://www.aboutmyip.com/AboutMyXApp/DeltaCopy.jsp

It's inexpensive, but you need to know what you're doing. It works on Windows versions higher than it states; I should know, as we're using it to back up thousands of files from many different machines. It won't totally absolve you of doing file comparisons, but since it's a client/server approach, the speed and block-level transfer are very good.
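
For context, DeltaCopy is a thin wrapper around an rsync client/server pair, so under the hood a run looks roughly like the sketch below. The server name backupserver and module name Backup are placeholders, not anything from this thread:

    # Mirror D:\Data to the "Backup" module on the DeltaCopy/rsync server (names are placeholders).
    # rsync skips unchanged files after a size/modification-time check and only sends changed data.
    rsync -av --delete /cygdrive/d/Data/ backupserver::Backup/Data/

It still walks the file list on both ends, but the server side does its scan locally, so very little comparison traffic actually crosses the network.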

A properly configured Robocopy job can blow through enormous numbers of files quickly, depending on your setup.
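
As a rough sketch only (the paths and log location below are placeholders), a mirroring run that skips files whose size and timestamp are unchanged could look like this:

    rem Mirror the source to the destination; unchanged files are skipped and deletions are propagated.
    rem /FFT tolerates 2-second timestamp granularity (useful with NAS targets); /MT runs multi-threaded.
    robocopy D:\Source \\server\Destination /MIR /FFT /MT:16 /R:2 /W:5 /NP /LOG:C:\Logs\mirror.log

Robocopy still enumerates every directory on each run, but the per-file check is just size plus timestamp, so the "nothing changed" case on a large tree usually finishes quickly.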

I've used Fast Backup with SyncBack for a number of years without any issues.