Suggestion: Queued transfers

  • beelsr
    Journeyman
    • Dec 2003
    • 11

    #16
    Although it doesn't need the bump, I'm adding a vote to move these three (queued sequential copies, non-buffered copy, and Michael's Open With) up the priority list.

    For years, probably the biggest issue has been system non-responsiveness during large copy/sync operations. It's been so endemic that my workflow has adapted to it - I thought it was unavoidable and "just the way it works". Having to move these operations to off hours and/or run them sequentially led me to scripting jobs (a good thing to learn, I admit, but...) to run overnight, yet some things do need to be done during the day. It can become a painful process to manage the copies and syncs manually when it would be SO much easier to set the jobs up and have them run on their own. I'm talking hours per week here...

    Besides, having multiple operations run in parallel just thrashes the dickens out of the drives. On a physical box, it's a killer.

    Comment

    • Justy
      New User
      • Dec 2012
      • 1

      #17
      Bump - any news on these matters?

      Comment

      • Chris
        Team Scooter
        • Oct 2007
        • 5538

        #18
        This is still on our wish list for a future version.
        Chris K Scooter Software

        Comment

        • RPM
          Visitor
          • Oct 2011
          • 6

          #19
          This request is still current - it was first lodged in 2008! Ref: http://www.scootersoftware.com/vbull...ead.php?t=4102

          Any update? This is particularly necessary when writing to media such as SD cards, which grind to a halt when performing multiple simultaneous copies.

          Comment

          • Aaron
            Team Scooter
            • Oct 2007
            • 16000

            #20
            It is not supported in the current version. If dealing with a target such as an SD card, I would suggest performing only one copy action at a time. If you select multiple files for a single copy, that one transfer queues them and performs them one at a time.
            Aaron P Scooter Software

            Comment

            • cyberchicken
              Enthusiast
              • Mar 2015
              • 26

              #21
              I'm very satisfied with the evolution of Beyond Compare; that's why I want to revive this old thread, because now I know I will be listened to.

              I want to add an idea to the queuing of operations topic (in particular massive copies).

              We had concurrent background copies with version 3 and a pause/cancel button with version 4.
              I think that with very little effort you can achieve serialization of copies: one single stream of writes to each device.

              My scenario:
              Sometimes I have to scroll through a very long list of files and copy a few of them (say 5-10%). I ctrl-click the chosen ones while scrolling and copy them. But I don't ctrl-click all of the chosen files, because it is easy to lose the selection. So I batch a dozen or so and enqueue them for copying.
              I end up with many partial batches running in the background, but as you know this represents a performance problem in some situations.
              My habit then is simply to pause all of the copy operations except the last one, and when that is done, un-pause another one.
              Many words for a very simple concept.

              What I suggest is to make this automatic with a very simple option:
              [serialize queued operations] add operations paused and un-pause only one operation at a time
              I expect the implementation to be simple: adding operations paused involves just reading the option.
              Un-pausing a single operation can be approached in at least two ways. The first is hooking into the "operation complete" event (the one that takes the line away from the queue window) and un-pausing a single operation.
              The second is a small background thread that, once every 5 seconds, checks how many operations are running and, if none are, un-pauses one. I'd personally go with the first because it is more versatile and more lightweight: even if my proposed option is turned on, the user can interactively and without any ambiguity pause all operations or start them all. Even the race condition where two operations end at the very same instant only produces two new operations un-pausing instead of one (no damage).
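
              A minimal sketch of the event-driven approach (the class and method names here are hypothetical, not Beyond Compare's internals):

```python
import collections

class SerializedQueue:
    """Hypothetical sketch of the "[serialize queued operations]" option
    described above: operations are added paused, and only one is
    un-paused (run) at a time."""

    def __init__(self):
        self._pending = collections.deque()  # operations added paused
        self._running = None                 # at most one active operation

    def enqueue(self, operation):
        # Add the operation paused; un-pause it only if nothing is running.
        self._pending.append(operation)
        self._start_next()

    def on_operation_complete(self, operation):
        # The "operation complete" hook: the finished operation leaves
        # the queue, and the next paused one is un-paused.
        if operation is self._running:
            self._running = None
        self._start_next()

    def _start_next(self):
        if self._running is None and self._pending:
            self._running = self._pending.popleft()
            self._running()  # "un-pause": start the copy operation
```

              As described above, this needs no polling thread, and pausing everything by hand still works: the queue only reacts to completion events.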

              BTW, this is not an original idea; it is already implemented in GetRight (a very good old download manager). There the feature is called "Automatic Downloads" and addresses not only efficiency but also servers' concurrency limits: you decide the maximum number of concurrent downloads, and the pause and resume is automatic.

              Let me know what you think about it.
              Thank you

              Comment

              • Aaron
                Team Scooter
                • Oct 2007
                • 16000

                #22
                The ability to pause/resume was a new feature we just added in BC4, and we are aware and looking into what it would take to add a serial transfer like the one you describe. Thanks for the detailed suggestion; I'll add it to our notes on the subject.
                Aaron P Scooter Software

                Comment

                • cyberchicken
                  Enthusiast
                  • Mar 2015
                  • 26

                  #23
                  Thank you Aaron

                  Comment

                  • moymike
                    Visitor
                    • Mar 2010
                    • 8

                    #24
                    It's been 4 years on this thread; please give it some more weight.

                    My issue is that I often start multiple folder copies containing large files that can take hours to complete. I love BC4, as it gives me a way to "resume" copies if I stop them, since it can compare and handle just the files it still needs to copy.

                    BC4 runs the copies in parallel, so the issues I run into are:
                    • I can stop each thread, but I could be halfway through, say, 2-10 copies; cancelling loses the files that were in the middle of copying.
                    • I can pause each thread and wait for all the current file copies to complete, which could take quite a bit of time.
                    • When I stop or pause, I end up with multiple folders that are "half done"... which I know I can pick back up, but until I do, they are not usable.


                    If BC4 had a queuing capability, the difference would be:
                    • Stopping the copy would only lose the copy data from the single current file.
                    • Pausing the copy would only require waiting for the current single file to finish copying.
                    • When stopping or pausing multiple folder copies, the folders that completed would be complete and useable.


                    The only advantage I see to parallel copying is if you are copying between different source/destination storage. I would like to see BC use a single copy queue for each source/destination pair.
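
                    To illustrate the per-pair idea (the function is purely illustrative, not a BC feature), grouping could look like this:

```python
import collections
import ntpath  # handles Windows drive letters on any platform

def plan_queues(transfers):
    """Illustrative sketch: group pending copies into one queue per
    (source drive, destination drive) pair, so copies on the same pair
    run serially while distinct pairs can still run in parallel.
    `transfers` is a list of (src_path, dst_path) tuples."""
    queues = collections.defaultdict(list)
    for src, dst in transfers:
        # ntpath.splitdrive(r"C:\a\f") -> ("C:", r"\a\f")
        key = (ntpath.splitdrive(src)[0].upper(),
               ntpath.splitdrive(dst)[0].upper())
        queues[key].append((src, dst))
    return dict(queues)
```

                    Two copies from C: to D: would land in the same queue and serialize; a copy from C: to E: would get its own queue and could run alongside them.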

                    Comment

                    • Aaron
                      Team Scooter
                      • Oct 2007
                      • 16000

                      #25
                      We appreciate the feedback. Enhancing our copy functions to allow for better queue management is still on our wishlist. It isn't that we think parallel copying is always better, but that we have limited resources to tackle all the various enhancements and bug fixes. We have been pretty busy, as you can see in our changelog, and we just released BC 4.1, which includes 64-bit support. I'll add these notes to our entry on the subject.
                      Aaron P Scooter Software

                      Comment

                      • Michael Bulgrien
                        Carpal Tunnel
                        • Oct 2007
                        • 1772

                        #26
                        Originally posted by Michael Bulgrien
                        All we really need is the ability to pass the path of the opposite side of an orphan file in an Open With definition!
                        I wonder if this Open With enhancement request would have fared better in its own thread. The development effort to pass a folder path as a parameter, when the path in the opposite pane exists but the file does not, is so trivial that it could be added in a day. The integration possibilities it would open up to your user base, on the other hand, would be limitless.

                        I'm having a hard time understanding the push back on this one. We live in the era of big data. Beyond Compare doesn't move big data well. The ironic thing is that it doesn't need to when there are numerous other utilities out there that already do it well, but lack the folder compare front end. Just add the additional parameter and let your users build integrated solutions of their own where the strengths of one tool can compensate for the weaknesses of another.
                        BC v4.0.7 build 19761
                        ¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯

                        Comment

                        • Aaron
                          Team Scooter
                          • Oct 2007
                          • 16000

                          #27
                          Hello Michael,

                          No better; I separated it into its own tracker entry, since issues aren't tracked one per forum thread. We've been quite busy with BC 4.1, which is a very large release. The implementation might not be tricky, but we would also need to be certain it is very robust; any errors or bugs would come back to us, since we would be acting as a front end to another copy process, across the different Profile types, etc. It really is a matter of bandwidth, and some items are difficult to find a schedule slot for even if they appear small. I bumped it up a bit, adding your notes.
                          Aaron P Scooter Software

                          Comment

                          • Aaron
                            Team Scooter
                            • Oct 2007
                            • 16000

                            #28
                            Also, what types of transfers are you having trouble with when dealing with "big data"? Is this using the new 64bit client?
                            Aaron P Scooter Software

                            Comment

                            • Michael Bulgrien
                              Carpal Tunnel
                              • Oct 2007
                              • 1772

                              #29
                              By the time I began posting in this thread in April 2011, I already had more than one workable unbuffered file copy solution using alternate utilities, thoroughly tested by passing parameters to the Windows command line from a vbScript or batch file. In November 2011, I detailed how BC3 could serve as the front end to those solutions. All that was needed was for the target path to be available as an Open With parameter for orphan files.

                              That being said, I'm working in the same environment that I was working with in 2011: My team moves files that range in size from hundreds of gigabytes to several terabytes on Windows Server 2008 R2 Enterprise edition (64-bit) with 96 GB memory.

                              It doesn't matter what version of the BC4 client we use; normal buffered Windows copies don't cut it either. Microsoft's recommendation for large file copies in 2011 was unbuffered file I/O. 4+ years later, I still can't pass a path in a Beyond Compare Open With call to make it happen.
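
                              For illustration, the kind of external wrapper I mean might look like the sketch below. xcopy's /J switch (Windows 7 / Server 2008 R2 and later) copies with unbuffered I/O; the function name and parameters are my own examples, not a Beyond Compare API.

```python
import subprocess

def unbuffered_copy(src_file, dst_dir, dry_run=True):
    """Sketch of handing a large-file copy off to an external tool that
    uses unbuffered I/O.  /J tells xcopy to copy without buffering,
    which Microsoft recommends for very large files; /Y suppresses the
    overwrite prompt.  Illustrative only."""
    cmd = ["xcopy", src_file, dst_dir, "/J", "/Y"]
    if dry_run:
        return cmd  # show the command an Open With hook would launch
    return subprocess.run(cmd, check=True)
```

                              All that's missing is for BC to supply the opposite-side target path as an Open With parameter so a wrapper like this could be wired up.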
                              BC v4.0.7 build 19761
                              ¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯

                              Comment
