
Hi!



I have a client who wants me to back up ALL the files on an old Ubuntu 14.04.6 LTS system to box.com. It’s a headless server (no GUI) and they need every single file.



There’s about 850 GB across 1,948,569 files in there. The largest file is only 2.4 GB. There are a few directories with a large number of files in them. The biggest count is 457,703 files in a single directory.



They say they have unlimited box.com space and do not want to set up storage with another provider.



I’m not sure I can run the Box CLI on this version of Linux, and from what I’ve read, it doesn’t seem like it would be appropriate for something this big.



I feel like FTPS would be the right thing (for Box), but I can’t log in to it from my personal account.



Any suggestions would be appreciated.



Thanks!

Hi @avi , welcome to the forum!



The Box CLI does run on Linux; the question is whether it runs on 14.04.



The minimum Node.js version is 14, but I’m not sure Ubuntu 14.04 can run Node 14, so you might need to do some testing.



FTPS is also a possibility; here is a note on Using Box with FTP or FTPS.
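As a very rough sketch, an FTPS upload from a shell could look like the line below. The ftp.box.com endpoint, the credentials and the remote path are placeholders; the article above has the authoritative details and the account requirements.

# Hypothetical single-file upload over explicit FTPS with curl.
# Endpoint, credentials and remote folder are placeholders; see the FTP/FTPS article.
curl --ssl-reqd --user "user@example.com:PASSWORD" -T /path/to/file "ftp://ftp.box.com/server-upload/"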



rclone might also be an option, and it does have a Box plugin. You might need to fiddle with the authentication.
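For a headless server, the usual pattern is to get the OAuth token on a machine that has a browser, then paste it in during setup on the server. Roughly (the remote name here is just an example):

# On any machine with a browser, grab an OAuth token for the Box backend:
rclone authorize "box"
# It prints a token blob to paste in the next step.

# On the headless server, run the interactive setup, pick the box backend,
# answer "no" to auto config, and paste the token from above:
rclone config

# Sanity check the new remote (here named box_remote):
rclone lsd box_remote: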



If you require more intelligence in your script, we do have a Python SDK that requires Python 3.6 at minimum. The next-gen Python SDK requires Python 3.8 or above.



Let us know if this helps.


@rbarbosa - thanks for your help. Here’s what happened.



The CLI ended up being a dead end for me. It wasn’t the CLI’s fault though 🙂 Some of the Node dependencies wouldn’t build on the server.



I ended up using rclone with the Box plugin. No issues with the installation, and I was able to copy files from the server to their shared folder without having to upgrade my own account.
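For anyone else on a similarly old distro: rclone ships a self-contained binary, so installation doesn’t depend on the distro’s package archives. The project’s documented install script looks like this (one option among several, not necessarily the only way):

# rclone's official install script fetches the current static binary.
curl https://rclone.org/install.sh | sudo bash
# Then confirm it runs:
rclone version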



Setting up auth was straightforward from reading the Box plugin instructions.



This was the command I put into a rclone_dir_to_box.sh shell script ($1 is the directory to copy and $2 is an exclude pattern):



rclone copy "/$1" "box_remote:/server-upload/$1" --links --log-file="/var/log/rclone/$1.log" --log-level DEBUG --exclude="$2"



This let me target specific directories to transfer instead of just copying from /. The logs were incredibly helpful; I could see that the Box plugin was smart enough to recognize Box rate limiting and back off when it hit it.
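For example, a call like this (the directory and exclude pattern are just illustrative) copies /home into the shared folder:

# Hypothetical invocation: copies /home to box_remote:/server-upload/home,
# skipping anything matching the exclude pattern; both arguments are examples.
bash rclone_dir_to_box.sh home "*.tmp"
# Logs land in /var/log/rclone/home.log per the --log-file argument above.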



There’s a --no-check-dest flag that will skip checking the destination to see if a file already exists. I didn’t use it because, as far as I could tell, rclone doesn’t save state to pick up where it left off, and I knew I’d have to start and stop a few times; the destination check is exactly what lets a rerun skip the files that already made it.
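Once the copy finally finishes, a one-way rclone check is a cheap way to confirm everything made it across. A rough sketch with placeholder paths:

# Hypothetical post-transfer verification; paths are placeholders.
# --one-way only flags files missing on the Box side, not extras on Box.
rclone check /home box_remote:/server-upload/home --one-way --log-file=/var/log/rclone/check-home.log --log-level INFO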



Time to upload 1,948,569 files totalling 850 GB? Six days and counting so far, but based on the progress I expect to be done today.
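If I do have to squeeze more speed out of it next time, rclone has a few knobs that look relevant for a pile of small files. I haven’t tested these values, so treat them as guesses:

# Untested guess at tuning for many small files: more parallel transfers and
# checkers, with a transactions-per-second cap so Box rate limiting isn't hit constantly.
rclone copy /home box_remote:/server-upload/home --transfers 8 --checkers 16 --tpslimit 10 --log-file=/var/log/rclone/home.log --log-level INFO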



I didn’t try FTPS, although maybe I will if I have to do this again, just to see if there’s any difference in speed/time.



Thank you again for your rclone suggestion. I hope my response was helpful and can be of use to someone else.


This is awesome @avi , glad it worked for you.



I’m sure the rclone folks will also appreciate the feedback.



We had a short interaction with @ncw from rclone; you can take a look here.



Always a pleasure to see folks using open source tools.



Cheers

