Speeding up CrashPlan Backups

UPDATE

See my new post here for some scripts that will edit the config files and restart the CrashPlan service for you automatically!!!

Another Update:

If my fix doesn’t work for you, try changing the server that you’re connecting to, as described HERE.  Thanks to reader Jorgen for his comment with this tip!  This will, unfortunately, mean starting your backups over from scratch.  :(

TL;DR: CrashPlan's dedup algorithm is a bottleneck on faster network connections and/or slower CPUs.  Changing a single setting inside an XML file pretty much disables the algorithm and keeps backups from slowing to a crawl over time.

So, I recently got a shiny new DSL line installed that has an awesome 10Mbps upload speed — 10x faster than my cable connection, and the fastest upload speed of any residential service available to me that doesn’t have ridiculously low usage caps (thanks TekSavvy for your unlimited plans!).

I decided to take advantage of this increased network capacity to expand the amount of stuff that I back up using CrashPlan+.  Due to my relatively slow upload speeds previously, I only backed up critical documents, leaving stuff like pictures from my SLR camera to my local backup system.  With 10Mbps up… no longer!   <insert BACKUP ALL THE THINGS meme image here>

Anywho… things went great.  At first.  I added a ton of files to my backup set, and off CrashPlan went, uploading at 9.6Mbps… then 9.4Mbps… then 9Mbps… then 8Mbps… then 7Mbps… and down and down and down.  When it hit the 3Mbps mark and kept going down (despite my tweaking the compression/deduplication/network buffer/other settings), I figured an email to CrashPlan support was in order.  The reply I received, while professional and clearly NOT just a template response, was basically "Well, 3Mbps is better than most of our users get; it's a shared network; you've already done everything I can recommend."  That was clearly not what I was hoping for.  I should also note that one of the things I noticed and mentioned to support was that the CrashPlan process was taking up 100% CPU time on one core, indicating it might be CPU-limited.

So I set about gathering real stats, and thanks to CrashPlan's built-in logging… I had a whole bunch of data points.  Using a bit of grep, awk and sed (the 3 sweetest words I know next to Perl ;) ), and a bit of Excel charting, I came up with this:

(Chart: CrashPlan upload speed over time, showing a steady logarithmic decay.)

Hmm.  Interesting.  Anyone who's studied computer science should be screaming "O(ln(n))" right now; this kind of logarithmic decay points straight at an algorithmic performance issue.  You simply do not, ever, see this pattern due to overloaded network capacity.  This type of pattern, coupled with 100% CPU use, told me there was something wrong with one of CrashPlan's features, and I strongly suspected the de-duplication functions, because I already had compression turned off (I knew most of my data wasn't compressible) and encryption algorithms don't decay like that.
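
For the curious, pulling those data points out was nothing fancier than a pipeline along these lines.  The log path and the "@ N.NMbps" session-summary format here are from memory and may vary between versions, so treat this as a sketch and eyeball your own history log first:

# pull the per-session upload speeds that CrashPlan writes to its history log
grep 'Mbps' /usr/local/crashplan/log/history.log.0 \
  | sed 's/.*@ *//; s/Mbps.*//' \
  | awk '{ printf "%d,%s\n", NR, $1 }' > speeds.csv   # session number, speed in Mbps -- ready for charting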

So… I emailed CrashPlan support again with this evidence (along with a whole bunch of CompSci geek reasoning), and was told, basically, “you’re already faster than most of our customers, if you don’t like it, find another provider”.   Yikes.

Talking with a few other people I know who use CrashPlan with large (multi-terabyte) data sets… this seems to be a common problem.

So… being a geek, I figured I’d check to see if I could do anything other than spend a bunch of money to upgrade the CPU in that system.

I navigated to the CrashPlan configuration directory, /usr/local/crashplan/conf/ (this is on Linux; Windows users, you'll have to figure out where this is yourself, sorry!), and started digging.  I stumbled across this gem in the file my.service.xml:

<dataDeDupAutoMaxFileSize>1073741824</dataDeDupAutoMaxFileSize>
<dataDeDupAutoMaxFileSizeForWan>0</dataDeDupAutoMaxFileSizeForWan>

Hmm.  Interesting.  "0" is often used as a metavalue that means "unlimited"… so maybe CrashPlan will ALWAYS dedupe EVERYTHING when going over a WAN link, but only dedupes files smaller than 1GB when going over a LAN link, presumably because they recognize that there's a balance between CPU capacity and network capacity.  It seems that they assume everyone has the ridiculously slow upload speeds that are typical of most residential Internet connections.

Being the smarty that I am, I figured I’d set it to not dedupe any files larger than 1 byte when going over the WAN:

<dataDeDupAutoMaxFileSizeForWan>1</dataDeDupAutoMaxFileSizeForWan>

(Note:  If you use backup sets, you will have more than one of these lines, one per backup set.  I suggest changing them all;  I have not done any testing to see what happens if you disable it only for some of the backup sets.)
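
If you're not sure how many of those lines your config has, a quick grep (using the Linux path from above; adjust for your platform) will list every dedup setting along with its line number:

# list every dedup-related setting, one pair per backup set
grep -n 'dataDeDup' /usr/local/crashplan/conf/my.service.xml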

I then restarted the CrashPlan engine (/etc/init.d/crashplan restart).
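
For Linux users who would rather not open an editor at all, the whole change boils down to a few commands.  This is just a sketch that assumes the default install path from above; it keeps a backup copy of the config and flips every per-backup-set occurrence of the WAN threshold from 0 to 1 before restarting the engine:

cd /usr/local/crashplan/conf
sudo cp my.service.xml my.service.xml.bak   # keep a copy of the original config, just in case
# set the WAN dedup threshold to 1 byte for every backup set
sudo sed -i 's|<dataDeDupAutoMaxFileSizeForWan>0</dataDeDupAutoMaxFileSizeForWan>|<dataDeDupAutoMaxFileSizeForWan>1</dataDeDupAutoMaxFileSizeForWan>|g' my.service.xml
sudo /etc/init.d/crashplan restart          # restart the engine so it picks up the change

The scripts linked in the update at the top of this post automate essentially this.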

And… VOILA.  My CPU usage dropped from 100% of 1 core down to ~10% of 1 core, and my upload jumped from 2.5Mbps (and dropping) to 7.5Mbps (and holding, fluctuating between 6.9Mbps and 7.5Mbps).  I'm now seeing patterns that are more consistent with the "variable network bandwidth due to shared service" theory, without the logarithmic decay in performance.
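
If you want to watch the effect yourself, something like this does the trick (pgrep -f matches against the full command line, which contains the crashplan install path; it may return more than one PID if the desktop client is running too):

# watch the CPU usage of the CrashPlan engine (it shows up as a java process)
top -p "$(pgrep -f crashplan | head -n 1)"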

Other people who had noticed slowdowns on large data sets have confirmed that this fix works for them as well, so it's not just something weird on my system.

I updated my 2nd ticket with CrashPlan support with this information, suggesting that they expose the max-file-size-for-dedupe inside the client, rather than making people go dig through XML files.

 

Update: Now that I've hit off-peak hours, I'm seeing upload speeds >9Mbps again as well.  Yay!

Update 2:  Check out my follow-up blog post with an updated traffic graph HERE


83 Responses to Speeding up CrashPlan Backups

  1. The Windows 7 x64 path is \ProgramData\CrashPlan\conf

    Thanks VERY MUCH for posting this fix.

    • alter3d says:

      @tcw – You’re very welcome! And thanks for posting the Windows path. Since it’s under “Program Files” and not “Program Files (x86)”, I’d assume that the path would be the same on 32-bit as well.

      • Just a note that it’s “ProgramData”, not “Program Files”. It’s sort of an “all users” profile folder for Windows (since Vista), whereas the “Program Files” folders contain program binaries and whatnot.

  2. ValDvor says:

    I'm never going back to CrashPlan after my latest glitch with their system, which caused half of my data to get corrupted. I'm sticking with Zoolz Unlimited.

    • alter3d says:

      That definitely sucks! :( I just took a quick look at Zoolz, and unfortunately they don’t have a Linux client, which is a deal-breaker for me.

      No matter what backup provider / software / process you use, it’s a good habit to verify your backups by doing test restores once in a while, just to make sure that the process works and the data isn’t corrupted. I’m used to doing this because I’m an IT geek :) but it’s definitely not something most people think of.

  3. Cleese says:

    Thanks a million for this! I've been running CrashPlan on my Linux server and have had the exact same issue, which I've been trying to resolve for months. Finally the support team directed me here :)

    Indeed, now my backups are running at 8+Mbit/s rather than ~2.5Mbit/s!

    Again, thanks a lot for posting this!

  4. After setting up a few new backups, I get ~ 40 Mbps speeds pretty consistently over a 65 Mbps connection, where I was lucky to get 1.2 Mbps before (it appears my 710+ is CPU limited near that rate, as CPU fluctuates between 60 and 90 percent during uploads). I have compression off and encryption, realtime monitoring and backup of open files on.

    The drawback is that analyzing files seems to take about twice as long as it did before disabling dedupe, when a share goes “Missing” and has to be re-verified when it is reconnected (this is almost certainly CPU limited). I hope CrashPlan reads some of this feedback and fixes their dedupe algorithm, so we can turn it back on someday and it actually works as designed.

  5. Remco says:

    Thank you! This little tip has quadrupled my upload speed!

  6. douglas says:

    This.post.rocks.

  7. Olivier says:

    I found the file in Windows but not the reference you mention.
    Should I just add it, and if so where exactly in the file?

    I need this a lot! Thanks !!

  8. Florent says:

    Dear Alter3D

    Thanks for your post. I'm using CrashPlan from a Fiber To The Home connection with a… 200 Mbps upload speed (and 700 Mbps down) from Paris, France. However, I still get the same capping experience at 6 Mbps and a continuous 1 to 2 Mbps. It's kind of frustrating ;-)

    I am a Mac user and I have tried to locate your file but I only come to a file which is "default.service.xml". I'm not a geek but I can make a change in a file. Here is an extract of this file:

    true

    37.5
    25
    2
    ANY

    Would you or anyone here have any knowledge on how to work this out ?

    Thanks a lot

    Florent

  9. mexicnamike says:

    Florent,

    On the mac, the my.service.xml is in /Library/Application Support/Crashplan/conf

  10. synobill says:

    Thank you so much for this!

    Does anyone have an idea how to find/access and then edit the my.service.xml file if we are using CrashPlan as a headless client on our Synology DiskStation?

    Much obliged.

    • JackDan says:

      Look under /volume1/@appstore/CrashPlan/conf; that's where I found the file.

      • Airplanedude says:

        Thanks All.
        I’m running the headless client on a Synology Diskstation 1513+.
        I modified the file per the instructions provided by the OP, at the location Dan pointed to, /volume1/@appstore/CrashPlan/conf.
        (Before I did this I had changed the file on the local machine – then realizing that it probably has to be done on the Diskstation itself. At any rate, the local machine still has the edited file.)
        I do not see an increase in effective upload speed.

        Anyone out there who has experience running Crashplan off a NAS and using the config file edits described by the OP?
        Pointers highly appreciated.

        • Bashyroger says:

          Hi Airplanedude,

          I just tried this on my Synology 211+ and it worked perfectly! My upload changed from around 1 to 5.6 Mbps. The only thing I did was change the contents of the dataDeDupAutoMaxFileSizeForWan XML tag from 0 to 1 on the Synology and restart the service…

          • potatohead says:

            Hi Bashyroger,

            I'm trying to do this via SSH with the admin account on my Synology DS1511+, and when I try to edit my.service.xml with vi, I'm told it's read-only. I'm a Linux newbie, so I don't know my way around file permissions. I've tried chmod on both the conf directory and the my.service.xml file, but I'm disallowed: permission denied.

            Any ideas?

            Thanks in advance,
            pb

  11. shaun says:

    Using a Mac, backing up my iTunes library and a few Aperture libraries (~800 GB)… edited the file and saw an increase in backup speeds… unfortunately not to the point you saw… wondering if I should unapt and start afresh..
    thx for the tip… I don't think I'll be going back to CrashPlan once my 3 yrs are up… next yr..

  12. Mikel says:

    Dude, thank you so much!! I've been uploading for about a year now and am at 7.7 TB out of 7.9 TB. I've hesitated to add media to my server because the upload would never catch up at the rate I add media, but not anymore! I was getting a solid 1.2 Mbps for months, but as soon as I made the change illustrated above, it jumped up over 8 Mbps and is holding steady. Thanks a million!

  13. I’m running crashplan on a synology NAS, with a 10mbps upload speed fiber connection.

    Some services can easily go up to 10Mbps, but CrashPlan doesn't; even with your trick it seems that nothing changed, and I'm stuck at 2Mbps max.

    My cpu is as high as 6% and my memory usage at 10%.

    Any ideas or people in the same situation ?
    Thanks !

  14. Rodney says:

    I tried setting the value to 1 on my Windows 7 x64 box, but no luck. What am I missing? Please don't laugh, I am getting 106 Kbps. Yes, K, not Mb. Please help. The speeds I get are 30Mbps down and 5Mbps up.

    • Jeff says:

      I made the change to the file as posted and CrashPlan started uploading my whole data set all over again (800 gigs). It's averaging about 6Mbps but it's going to take days to get caught up again.

    • Rodney says:

      I made the change, but I am not seeing any increase in upload speeds.

  15. Oskar says:

    Any mac users here care to elaborate where to find the right file? My /Library/Application Support/Crashplan/ contains only 1 file, ui.properties.

  16. Thanks for this tip! I went from 6 Mbps to around 30 Mbps!

  17. I made the change (windows 7 x64 box) but it made no difference. I have 8Mbps upload speed but I’m still only getting 2.5Mbps upload to CrashPlan.

    Incidentally I had to run Notepad++ as administrator in order to save the change to my.service.xml

  18. I made the change (Win7 x64 box) but it made no difference. I have 8Mbps upload speed but uploads to CrashPlan are still running at only 2.5Mbps.

    Incidentally I had to run Notepad++ as an administrator before I could save the change to my.service.xml

  19. Brent says:

    I'm running the latest Mavericks OS, and there is no directory /usr/local/crashplan/conf/; I did the ctrl-shift-G and that's what it said. In fact, in the /usr/local folder, I only have 3 items. I've got a blazing fast internet connection with 1TB of data to upload. Could you walk me through how to get to the file and then how to manipulate it? Do I need Terminal? Do I edit the file in TextEdit? Pardon my ignorance. Thanks.

  20. Hey, I am also wondering how to find and edit the mac version file; I don’t want to wait 35 days for this backup to be complete :/ I am getting 1.6mbps right now. I can see the .xml file, but will editing it now make the backup start over?

  21. Edit: on second thought, I tried editing the file and it won't let me, even though I am the admin of this computer. It says I don't own the file.

  22. Paul Carpentier says:

    HI! I loved your article and analysis and it sounded extremely plausible to me, so my disappointment was substantial when I tried it out on my new iMac running Mavericks and found no difference after applying the mod to the XML file. Any insights by anyone would be appreciated!

  23. Patrick says:

    I edited this file for a Mac. (fyi, someone listed above in the comments where to find it).

    It did improve my speed slightly, but I am running at 1.6 Mbps right now. Prior to the 2 TB level, I was getting up to 10 Mbps consistently. I’ll keep an eye on it. Anything else to try?

    • Same thing here. Running on Windows 7 (i7), upload speed started as 5-7 Mbps, then the next day dropped to 1-2Mbps. I’m uploading for third day now, got to 31GB and the speed dropped to a slow and steady ~500Kbps. CPU does not look busy at all. I have a default.service.xml file but there is no dataDeDupAutoMaxFileSizeForWan tag in it…

      I wonder what’s going on. Any clues are appreciated. I’m testing the service right now (30 days free trial), want to go with full paid CrashPlan, but need to confirm that this is going to work.

      Any help?

      • Michael says:

        I just used this on my Windows 7 x64 and it worked fine – not spectacular but I went from ~1.1Mb/s to ~2.3Mb/s so definitely an improvement. If you are seeing the default.service.xml file instead of the my.service.xml you are probably looking in the \Program Files\CrashPlan\conf directory instead of the \ProgramData\CrashPlan\conf directory. Don’t forget, you need to edit the file as an administrator and must restart CrashPlan (system tray, service, processes, etc.) before it will take effect. Good luck.

    • I was able to find and edit the file my.service.xml on my Mac (Mavericks), in folder /Library/Application\ Support/CrashPlan/conf using sudo vim.
      However, my upload speeds are still fairly slow at 2.7 Mbps, and I am on a gig connection. Are there any other settings to tweak?

  24. Justin says:

    I made the change you suggested above but am still watching my speed systematically drop to 1.5Mbps, on a 50Mbps fiber connection. So frustrated! I'm on Win 7 x64; are there any other settings you recommend changing? I have compression off and data de-duplication set to minimal.

  25. Pingback: Network Rockstar | Speed Up CrashPlan Backups: Automagic!

  26. Ronny says:

    Does the deduplication option under advanced settings need to be set to AUTOMATIC in order for this setting to work? I can also choose MINIMAL or COMPLETE deduplication in the settings. Could we just change the setting to MINIMAL instead of editing this xml file?

  27. Adrian says:

    I have changed my dataDeDupAutoMaxFileSizeForWan but it did not speed up anything; I'm still transferring below 400 Kbps. But the CPU usage went from a constant 100% to less than 10%. I'm in Sweden so maybe the distance is part of the problem. I'm using OS X Mavericks.

  28. Jorgen says:

    It works! :-) Thanks alter3d!
    I'm on a 36 Mbit / 36 Mbit internet connection in Denmark and my upload went from 200-300 Kbyte/sec. to around 5-9 Mbit/sec. :-) Nice!
    (backing up a Synology NAS DS212j)

    • Hey Jorgen, I too have a Synology NAS with Crashplan installed. Did you edit that file on the NAS itself and if so how?

      • Jorgen says:

        Hi Alex

        Yes, I enabled SSH on the NAS and used Putty as SSH client from my computer.
        Login as “root” with your admin password.
        Now go to this path: /volume1/@appstore/CrashPlan/conf/
        I edited the file my.service.xml in VI

        / Jorgen

        • Thanks Jorgen. I edited the file, restarted the service and no difference. I’ve even restarted the NAS itself and no improvement. The edit to the file is being saved though so presumably it’s reading it in correctly.
          I’m getting about 1Mbps on a line where I can often get 10Mbps upstream.

          • Jorgen says:

            Hi Alex
            Okay, try to set compression to OFF
            In the GUI go to: Settings->Backup->Advanced settings->Configure
            / Jorgen

  29. I’ve now set compression to OFF and still getting the same speed after restarting the server.

  30. synobill says:

    Thanks for the original post and thanks Jorgen — I’ve boosted my upload speed from 1.2 Mbps to 10.6 Mbps. Amazing.

  31. Hi,
    Just downloaded the trial and trying to upload 600-odd GB. I've tried the CrashPlan recommendations for changing the default settings when doing the initial backup and it helped a lot. However, I'm having trouble following the instructions on this page. I'm on a Windows 7 64-bit laptop and I can't for the life of me find the ProgramData folder.

  32. Wait, I got it…

  33. Access denied – can’t save the file!! What now? I’m already logged on as administrator in Windows 7 x64. I’m just editing the file with Notepad if that helps…

  34. Holy cr@p! That’s fast! I just peaked at over 61Mbps! Keeps dropping back down to 30-ish days though. This is for a 700Gb upload.
    So for anyone else having same problems as I did on Windows, here’s what I did in a nutshell:

    1, You need to enable hidden folders (Google that – I did). That’s how you find ‘ProgramData’ folder

    2, For administrative access to making changes to the file, open the ‘my.service’ file in Notepad and make the changes. DON’T SAVE. Instead, open Notepad again from Start menu but as administrator – right click, “Run as administrator”. Copy and paste the entire content from the original ‘my.service’ file into the new Notepad file. Save the file. When the dialog box opens, save as “all file types” and click on ‘my.service’ to write over the original file.

    3, Ctrl+Alt+Del, to start task manager. Click ‘processes’ and end Crashplan. Then restart.

    4, Check out Crashplan’s own support pages for tweaking the default settings: https://support.code42.com/CrashPlan/Latest/Configuring/Speeding_Up_Your_Backup

  35. Pingback: CrashPlan Slow Part 2 - Cloud Storage Buzz

  36. Pingback: Making CrashPlan Faster – Need Help! topic | usa2

  37. Cygnus X says:

    I have been a CrashPlan user for the last 2 weeks. In the beginning, in off hours, I would get 6-7Mbps (got a 35/8 line). Friday night I was at 400 out of 800GB… then Saturday when I woke up I noticed the backup had started over from scratch ARGHHHH. Turned out that, according to them, and I quote: "What happened was that two automated processes on our server coincided in such a way that caused your backup archive to become unlinked from your computer's CrashPlan identity"

    Hmmm. Luckily we were able to get it back to 50% complete, and get my PC linked up again (they gave me 3 free months for the trouble), BUT now my speeds vary between 800kbps and 1.2Mbps.

    I'm on Win7 Pro 64-bit; the file is located in "C:\ProgramData\CrashPlan\conf\my.service.xml". As others have noted, you can't save the edited file with Notepad; I saved it as something else, then in Explorer I deleted the original (Windows will ask for admin rights… just click OK), and the file is now gone. Rename the file you made to my.service.xml and that's it.

    I stopped the CrashPlan service and restarted it… no change, still .7 to 1.2Mbps, sigh… reboot, no change. At this speed there's 40+ days to get the other half of my backup completed.

    Comparing upload speeds, uploading stuff to YouTube I get 8.4Mbps, which is where it should be.

  38. Rich says:

    I have found the file on my Mac, but it doesn't mention deduping. The file content is below. Does anyone know what I need to change?

    (The file contents were pasted here, but the XML tags were stripped by the comment form, leaving only a long run of values.)

  39. Just viewed my.service.xml on the NAS and it has no entry for dataDeDupAutoMaxFileSizeForWan, so I'm stuck. Found it in the Mac client package, but not sure if that does anything?

    Can someone explain in dummy terms what to check/edit when using a Synology headless install with the client on OS X 10.9.1?

  40. Philip says:

    I am on a mac.

    I typed the recommended command and found the file, but when I try to save I get a read-only error. How do I resolve this so I can save the file?

  41. IainB says:

    On a Mac (Mavericks), go to Library > Application Support > CrashPlan and locate the my.service.xml file. Open it in TextEdit (don’t just double click on it), and make the line change from 0 to 1
    Save, restart.
    This has tripled my upload speed – still only 1.7mbps, but hopefully this will get better later at night. I am in the UK, on VirginMedia cable with a supposed max upload of 3mbps.

  42. Thank you so much for this; it increased my reported upload speed from 3Mb to ~40Mb, bringing my 1.5 TB upload from 40 days to 2 days!

    BUT I don't think these numbers are accurate, because my speedtest results are consistent with what Comcast says my limit is, at 10Mb. Either way though, CPU is down to 10% so I know it is working.

    • alter3d says:

      Right after you restart the CrashPlan service, or after your backup sets start recycling (if you use backup sets), you’ll notice that your backup speed seems crazy high — I’ve seen 600-700Mbps before.

      What you're seeing is the *effective* backup rate, after taking into account scanning existing files for changes, etc. CrashPlan can update just the changed parts of files that had already finished in the past, instead of re-uploading the whole file, but it counts the whole file in its speed calculations (even if it's just scanning to see whether it needs to upload changes and ultimately uploads nothing), so you see crazy speeds while it's going through the sync process.

      So basically… seeing numbers way above your line's capacity is actually normal for a while after restarting CrashPlan (and in a few other situations)… it should settle down in 30-60 minutes, depending on how many files you have, how big they are, and how far along you got previously.

  43. FWIW: I updated my linux CP client recently (they have an incompatibility in the latest release where a PC-to-PC backup won’t work until the target is updated to the same version). The default in the xml-config file looks like it’s now 1000000000.

    Not sure why they did that, but that’s what I found.

  44. nick says:

    THANK YOU! I have a very large 9TB backup set and have been suffering for 3 years. Back up to max upload speed. Seriously. Thank you!

  45. I don’t use Crashplan but that’s awesome investigative work

  46. shaun says:

    I started toying with my settings yesterday as I have 10 months left on my subscription and thought it'd be nice to get a backup set done before then! I'm on FiOS and have been stuck at the 300kbps mark for a WHILE when uploading to CPlan, with 40 days to complete the set. I did all the changes suggested up until last December on this blog and no go… so I moved to Backblaze. But last night, as I said, I thought I'd try once more. I changed a few things that sounded better and voila… 1.6-1.7 Mbps constant (12 hrs now!)… now at 7 days till completion. I'm on OSX Mavs and here are my changes: user away or present = 100% CPU (was like 20%), verify every 14 days, data de-dup minimal, compression auto, no encryption, no watch file system in RT, no b/u open files; in the frequency tab I have all sliders to the left except for remove deleted files, which is every 2 days; no limits on LAN/WAN sending rates… I think all the rest is at defaults…

  47. Brandon says:

    Great post – Thank you!! This tweak is working for me as well. I’m still migrating everything from my octopus of external drives to the new Synology, but so far I’m doing an initial backup of 3.1TB mostly raw photo data. My CPU and RAM usage hasn’t changed considerably before or after any tweaking, but wow did the speed change.

    DiskStation 1813+ (bootstrapped)
    Client app running on OS X 10.8
    Fiber 75/35mbps pipe.

    -12 hours into the backup I was sitting right where I started, around 7mbps (no noticeable slow down… yet)
    -Altered the my.service.xml file per your suggestion, turned off compression in the app, and immediately jumped to ~18mbps.
    -Just checked on it again after another 24 hours, mid-morning on the east coast (probably off-peak for CrashPlan’s servers) and I’m at 36-37mbps. That’s higher than my connection is even rated for, but could be accurate… I don’t think my 35mbps upstream is a hard cap. Amazing.

  48. chris says:

    Thanks for the tips everyone. Was able to get CrashPlan (running headless on a Synology NAS) uploading much, much faster now. Was able to go in via SSH (with no real experience doing Linux command line) and manipulate the file on the NAS, using a Mac. Instead of doing like 300kbps, am now doing between 5Mbps and 10Mbps, which is pretty much what my upload speed is rated at. It's still going to take at least ten days, but that's significantly better than the 6 months or whatever it was going to take to upload 1.2gb of data.

  49. Thank you so much for the info. I have been using CrashPlan for over 3 months and was just about to cancel because the speed was at about 900 Kbps, and when trying to upload 1.8 TB of data it was just taking too long (after 3.3 months, only at 95% completed and still 14 days to go). I made your change as described and WOW, 10.7 Mbps on average now (and only 13 hours remaining). This is with a 105/10 Mbps connection. Thank you again.

  50. Grant says:

    I have 12Mbps upstream but getting a max of 2.5 to CrashPlan. Speed tests and uploads to other servers indicate the bottleneck is not at my end. I tried the changes above (Windows 7) but no matter what I try, it maxes out at 2.5. Guess patience will have to prevail.

  51. Aaron says:

    I have to say thank you.

    I was on my 3rd data set and it had slowed to 2mbps no matter what, even though my first data sets ran at 30mbps.

    Now it's back up to 30mbps and spikes at 35mbps. Now I can finish my last 2.5 TB quickly and not in 7 years. lol

    PS: on Linux.

  52. Jambert says:

    I am also on Windows 7 x64 and saw no difference. Also, the speed in the CrashPlan GUI is wrong: if I check my router it hovers around 2 Mbps (I'm in Europe), while the GUI starts at about 400 Mbps :P but it does level off to about 2.3 Mbps after 5 min.

    I do not see the 100% CPU usage either, with or without data deduplication; it is usually about 2% with the occasional peak at 7%. Maybe this fix only works on older computers that can't handle the calculations required.

  53. Thank you so much!

    I have a nice FIOS connection (75/35), and when I first started using Crashplan, I was getting between 20Mbps and 35Mbps depending on network traffic. After a while, I started capping out at 9.5 Mbps. This computer also doubles as a Folding@home machine, and I noticed it was slowing down my work units considerably. Looking into it further, I found Crashplan was maxing out a full core on my overclocked i7 3770k. This is when I started looking for solutions via the Google.

    Found your post here, and made the change. Currently uploading at 34.5 Mbps!

  54. CrashPlan was running extremely slow at 2 Mbps on Verizon FiOS until I made this one change. It jumped to 20-100 Mbps. Backed up 3 TB in a bit over 2 weeks, so very happy. My question is: what does this one simple change really do (what am I giving up, if anything)? And why doesn't CrashPlan just make this the default setting, given such a tremendous performance improvement?
    Thanks for any info. Curt

  55. Chris says:

    You are the man! My uploads jumped from 4-5 to >40Mbps. Thank you so much!

  56. I ran the script and confirmed that it worked. For some reason, I am still getting slow upload speeds (around 2 Mbps) vs. a network speed test upload of 30 Mbps. Does anyone have any additional advice on what I can do to increase the upload speed?

    Thanks

  57. In backup settings, should Data de-duplication be set to “automatic” or “minimal”?

  58. Zalia says:

    @ Philip: To get around your read-only error, select the config folder then go to File / Get Info, then click on the lock in the bottom right corner to be able to make changes (fill in your pw), then change the admin permission to Read & Write (instead of Read only). Close the lock again and then open the xml file in TextEdit and change it.

    I've changed the 0 to a 1 and for the first couple of minutes I saw improvement, but now I'm back at a sluggish 300kbps… Any other ideas??

    • Whammy! says:

      @ Zalia: I turned off compression and that helped quite a bit. If you’re backing up media (JPG, MP4, MOV, etc.), there’s no reason to be compressing it most likely and it will free up a ton of CPU power.

      @ alter3d: You’re a true Internet Hero!

  59. Steve says:

    Thank you. I know this is an old post and I hope you get this but wow. I wish I read this earlier as I am backing up movies (large files) and it took me months. I changed the one field and I went from under 1MB to over 15MB.

  60. garth25 says:

    Thanks so much for this. Went from 3Mbps to a sustained 11.5Mbps overnight.

  61. Thanks, this does work! I had to use your script though, because the instructions here don't tell Windows users to restart their CrashPlan engine, and not just the main program application, after changing the XML file.