  1. #21
    Join Date
    Mar 2015
    Posts
    20

    Default

I'm very satisfied with the evolution of Beyond Compare, which is why I want to revive this old thread: now I know I will be listened to.

    I want to add an idea to the queuing of operations topic (in particular massive copies).

We got concurrent background copies with version 3 and a pause/cancel button with version 4.
I think that with very little effort you could achieve serialization of copies: a single stream of writes to the devices.

    My scenario:
Sometimes I have to scroll through a very long list of files and copy a few of them (say 5-10%). I Ctrl-click the chosen ones while scrolling and copy them. But I don't Ctrl-click all of the chosen files at once, because it is all too easy to lose the selection. So I batch a dozen or so and enqueue them for copying.
I end up with many partial batches running in the background, but as you know this causes performance problems in some situations.
My habit, then, is simply to pause all of the copy operations except the last one, and when that one is done, un-pause another.
Many words for a very simple concept.

What I suggest is to make this automatic with a very simple option:
[Serialize queued operations]: add operations in a paused state and un-pause only one operation at a time.
I expect the implementation to be simple: adding operations in a paused state just involves reading the option.
Un-pausing a single operation can be approached in at least two ways. The first is hooking into the "operation complete" event (the one that removes the line from the queue window) and un-pausing a single operation.
The second is a small background thread that checks every 5 seconds how many operations are running and, if none are, un-pauses one. I'd personally go with the first, because it is more versatile and more lightweight: even with my proposed option turned on, the user can still interactively, and without any ambiguity, pause all operations or start them all. Even the race condition where two operations end at the very same instant merely un-pauses two new operations instead of one (no harm done).
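The event-hook approach above can be sketched as a toy model. All names here are hypothetical stand-ins, since Beyond Compare's internals are not public; the point is only the invariant: operations arrive paused, and the completion hook un-pauses exactly one at a time.

```python
from collections import deque

class SerializedQueue:
    """Toy model of the proposed [Serialize queued operations] option:
    operations are enqueued in a paused state, and the hypothetical
    'operation complete' hook un-pauses one pending operation at a time."""

    def __init__(self):
        self.pending = deque()   # operations waiting in a paused state
        self.running = None      # at most one operation runs at a time
        self.completed = []

    def enqueue(self, op):
        self.pending.append(op)      # added paused, per the option
        if self.running is None:
            self._start_next()       # nothing running: un-pause one

    def on_operation_complete(self):
        # Hook on the event that removes the line from the queue window.
        self.completed.append(self.running)
        self.running = None
        self._start_next()

    def _start_next(self):
        if self.pending:
            self.running = self.pending.popleft()  # un-pause one op
```

Note that because starting an operation happens only inside `enqueue` and the completion hook, the "pause all / start all" user actions remain unambiguous, as argued above.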

BTW, this is not an original idea; it is already implemented in GetRight (a very good old download manager). There the feature is called "Automatic Downloads" and it addresses not only efficiency but also servers' concurrency limits: you decide the maximum number of concurrent downloads, and pausing and resuming is automatic.

Let me know what you think about it.
    Thank you

  2. #22
    Join Date
    Oct 2007
    Location
    Madison, WI
    Posts
    11,787

    Default

    The ability to pause/resume was a new feature we just added in BC4, and we are aware and looking into what it would take to add a serial transfer like the one you describe. Thanks for the detailed suggestion; I'll add it to our notes on the subject.
    Aaron P Scooter Software

  3. #23
    Join Date
    Mar 2015
    Posts
    20

    Default

    Thank you Aaron

  4. #24
    Join Date
    Mar 2010
    Posts
    7

    Default It's been 4 years on this thread, please give it some more weight

    My issue is I often find that I start multiple folder copies containing large files that can take hours to complete. I love BC4, as it gives me a way to "resume" copies if I stop them since it can compare and just handle the files it still needs to copy.

    BC4 runs the copies in parallel, so the issues I run into are:
• I can stop each thread, but I could be halfway through, say, 2-10 copies; cancelling them loses the files that were in the middle of copying.
    • I can pause each thread, and wait for all the current file copies to complete, which could take quite a bit of time.
• When I stop or pause, I end up with multiple folders that are "half done" ... which I know I can pick back up, but until I do, they are not usable.


    If BC4 had a queuing capability, the difference would be:
    • Stopping the copy would only lose the copy data from the single current file.
    • Pausing the copy would only require waiting for the current single file to finish copying.
• When stopping or pausing multiple folder copies, the folders that completed would be complete and usable.


The only advantage I see to parallel copying is when you are copying between different source/destination storage. I would like to see BC use a single copy queue for each source/destination pair.
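The per-pair queue idea can be sketched simply: group copies by their (source device, destination device) pair, so copies on the same pair serialize while copies between different devices can still proceed in parallel. `device_of` here is a hypothetical stand-in for however the real application would map a path to its underlying storage device.

```python
from collections import defaultdict, deque

def device_of(path):
    # Hypothetical: identify the storage device for a path.
    # On Windows the drive letter is a rough approximation.
    return path.split(":", 1)[0]

# One queue per (source device, destination device) pair.
queues = defaultdict(deque)

def enqueue_copy(src, dst):
    key = (device_of(src), device_of(dst))
    queues[key].append((src, dst))   # same-pair copies serialize
```

A worker per queue would then drain its own queue one file at a time, keeping each device seeing a single stream of writes.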

  5. #25
    Join Date
    Oct 2007
    Location
    Madison, WI
    Posts
    11,787

    Default

We appreciate the feedback. Enhancing our copy functions to allow for better queue management is still on our wishlist. It isn't that we think parallel copying is always better, but that we have limited resources to tackle all the various enhancements and bug fixes. We have been pretty busy, as you can see on our changelog, and we just released BC 4.1, which includes 64-bit support. I'll add these notes to our entry on the subject.
    Aaron P Scooter Software

  6. #26
    Join Date
    Oct 2007
    Location
    Pennsylvania
    Posts
    1,772

    Default

    Quote Originally Posted by Michael Bulgrien View Post
    All we really need is the ability to pass the path of the opposite side of an orphan file in an Open With definition!
I wonder if this Open With enhancement request would have fared better in its own thread. The development effort to pass a folder path as a parameter when the path in the opposite pane exists but the file does not is so trivial that it could be added in a day. The integration possibilities it would open up to your user base, on the other hand, would be limitless.

    I'm having a hard time understanding the push back on this one. We live in the era of big data. Beyond Compare doesn't move big data well. The ironic thing is that it doesn't need to when there are numerous other utilities out there that already do it well, but lack the folder compare front end. Just add the additional parameter and let your users build integrated solutions of their own where the strengths of one tool can compensate for the weaknesses of another.

  7. #27
    Join Date
    Oct 2007
    Location
    Madison, WI
    Posts
    11,787

    Default

    Hello Michael,

No better, as I've separated it into its own tracker entry, and issues aren't tracked one per forum thread. We've been quite busy with BC 4.1, which is a very large release. The implementation might not be tricky, but we would also need to be certain it is very robust; any errors or bugs would come back to us, since we would be acting as a front end to another copy process, across the different profile types, etc. It really is a matter of bandwidth, and some items are difficult to find a schedule slot for even if they appear small. I bumped it up a bit and added your notes.
    Aaron P Scooter Software

  8. #28
    Join Date
    Oct 2007
    Location
    Madison, WI
    Posts
    11,787

    Default

Also, what types of transfers are you having trouble with when dealing with "big data"? Is this using the new 64-bit client?
    Aaron P Scooter Software

  9. #29
    Join Date
    Oct 2007
    Location
    Pennsylvania
    Posts
    1,772

    Default

By the time I began posting in this thread in April 2011, I already had more than one workable unbuffered file copy solution using alternate utilities, thoroughly tested by passing parameters to the Windows command line from a VBScript or batch file. In November 2011, I detailed how BC3 could serve as the front end to those solutions. All that was needed was for the target path to be available as an Open With parameter for orphan files.

    That being said, I'm working in the same environment that I was working with in 2011: My team moves files that range in size from hundreds of gigabytes to several terabytes on Windows Server 2008 R2 Enterprise edition (64-bit) with 96 GB memory.

    It doesn't matter what version of the BC4 client we use; normal buffered Windows copies don't cut it either. Microsoft's recommendation for large file copies in 2011 was unbuffered file I/O. 4+ years later, I still can't pass a path in a Beyond Compare Open With call to make it happen.
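The kind of integration described above amounts to handing both paths to an external copier that supports unbuffered I/O. As a hedged sketch: robocopy's `/J` switch requests unbuffered I/O (available since Windows 7 / Server 2008 R2), which matches Microsoft's recommendation for very large files. The helper below only builds the command; the missing piece, as the poster notes, is that Beyond Compare does not currently expose the target-side path for orphan files, so `target_dir` is the hypothetical parameter being requested.

```python
def build_unbuffered_copy(source_dir, target_dir, filename):
    """Build a robocopy command using unbuffered I/O (/J) for one file.

    target_dir is the value an Open With definition would need to
    supply -- the path of the opposite side of the orphan file.
    """
    return ["robocopy", source_dir, target_dir, filename, "/J"]
```

A VBScript or batch wrapper, as described in the post, would receive these arguments from the Open With call and invoke the command.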
