Speeding up CrashPlan Backups


Well… I can no longer recommend CrashPlan, even using my fix below. I recently upgraded my file server, performed a computer adoption according to CrashPlan’s instructions… and CrashPlan lost all of my backups from that machine. All 16TB or so of them. It also lost my backup set definitions after the adoption.

CrashPlan support was… not particularly helpful.  They claim they see the 16TB of data attached to the right computer GUID, but I can’t see the data in my client, and when I look at my account in the CrashPlan web interface, I see nothing.  Support said it “might work” if I forced a backup to run… and it didn’t.

My CrashPlan account expires soon, and I won’t be renewing.


See my new post here for some scripts that will edit the config files and restart the CrashPlan service for you automatically!!!

Another Update:

If my fix doesn’t work for you, try changing the server that you’re connecting to, as described HERE.  Thanks to reader Jorgen for his comment with this tip!  This will, unfortunately, mean starting your backups over from scratch.  🙁

TL;DR:   CrashPlan’s dedup algorithm is a bottleneck on faster network connections and/or slower CPUs. Changing a single setting inside an XML file pretty much disables the algorithm and keeps throughput from slowing to a crawl over time.

So, I recently got a shiny new DSL line installed that has an awesome 10Mbps upload speed — 10x faster than my cable connection, and the fastest upload speed of any residential service available to me that doesn’t have ridiculously low usage caps (thanks TekSavvy for your unlimited plans!).

I decided to take advantage of this increased network capacity to expand the amount of stuff that I backup using CrashPlan+ — due to my relatively slow upload speeds previously, I only backed up critical documents, leaving stuff like pictures from my SLR camera to my local backup system.   With 10Mbps up… no longer!   <insert BACKUP ALL THE THINGS meme image here>

Anywho… things went great. At first. I added a ton of files to my backup set, and off CrashPlan went, uploading at 9.6Mbps… then 9.4Mbps… then 9Mbps… then 8Mbps… then 7Mbps… and down and down and down. When it hit the 3Mbps mark and kept going down (despite my tweaking the compression/deduplication/network buffer/other settings), I figured an email to CrashPlan support was in order. The reply I received, while professional and clearly NOT just a template response, was basically “Well, 3Mbps is better than most of our users get; it’s a shared network; you’ve already done everything I can recommend.” That was clearly not what I was hoping for. I should also note that one of the things I noticed and mentioned to support was that the CrashPlan process was using 100% CPU time on one core, indicating it might be CPU-limited.

So I set about gathering real stats, and thanks to CrashPlan’s built-in logging, I had a whole bunch of data points. Using a bit of grep, awk and sed (the 3 sweetest words I know next to Perl 😉 ), and a bit of Excel charting, I came up with this:


Hmm. Interesting. Anyone who’s studied computer science should be thinking “O(ln(n))” right now — this type of logarithmic decay screams “algorithmic performance issue”. You simply do not, ever, see this pattern due to overloaded network capacity. This pattern, coupled with 100% CPU use, told me there was something wrong with one of CrashPlan’s features, and I strongly suspected the de-duplication functions: I already had compression turned off (I knew most of my data wasn’t compressible), and encryption algorithms don’t decay like that.
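For reference, the stats-gathering itself was just a quick pipeline. Below is a sketch of the idea, run against a made-up sample — CrashPlan’s actual log path and line format are not what’s shown here, so treat the patterns as illustrative, not as the real log syntax:

```shell
# Illustrative only: fake "completed backup" lines standing in for
# CrashPlan's real log output (the real format is different).
LOG=$(mktemp)
cat > "$LOG" <<'EOF'
01/06/14 09:00PM Completed backup: 1.2GB @ 9.6Mbps
01/06/14 11:00PM Completed backup: 1.1GB @ 8.8Mbps
01/07/14 02:00AM Completed backup: 0.9GB @ 7.1Mbps
EOF

# grep the completion lines, sed away everything but the rate,
# then let awk summarize -- the per-sample rates went into Excel.
grep 'Completed backup' "$LOG" \
  | sed 's/.*@ //; s/Mbps//' \
  | awk '{ sum += $1; n++ } END { printf "avg %.1f Mbps over %d samples\n", sum/n, n }'
```

Drop the awk summary stage and you have one rate per line, ready to paste into a spreadsheet for charting.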

So… I emailed CrashPlan support again with this evidence (along with a whole bunch of CompSci geek reasoning), and was told, basically, “you’re already faster than most of our customers, if you don’t like it, find another provider”.   Yikes.

Talking with a few other people I know who use CrashPlan with large (multi-terabyte) data sets… this seems to be a common problem.

So… being a geek, I figured I’d check to see if I could do anything other than spend a bunch of money to upgrade the CPU in that system.

I navigated to the CrashPlan configuration directory, /usr/local/crashplan/conf/ (this is on Linux; Windows users, you’ll have to figure out where this is yourself, sorry!), and started digging. I stumbled across this gem in the file my.service.xml:
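The lines in question looked something like this (a reconstruction from memory — the surrounding XML is omitted, the WAN tag name is as it appears in the file, and the LAN tag name and exact value are my best recollection):

```xml
<!-- per backup set; LAN threshold recalled as ~1GB (1073741824 bytes) -->
<dataDeDupAutoMaxFileSize>1073741824</dataDeDupAutoMaxFileSize>
<!-- 0 appears to mean "no limit": always dedupe over the WAN -->
<dataDeDupAutoMaxFileSizeForWan>0</dataDeDupAutoMaxFileSizeForWan>
```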


Hmm.  Interesting.  “0” is often used as a metavalue that means “unlimited”… so maybe CrashPlan will ALWAYS dedupe EVERYTHING when going over a WAN link, but only dedupes files smaller than 1GB when going over a LAN link, presumably because they recognize that there’s a balance between CPU capacity and network capacity.  It seems that they assume everyone has ridiculously slow upload speeds that are typical of most residential Internet connections.

Being the smarty that I am, I figured I’d set it to not dedupe any files larger than 1 byte when going over the WAN:
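That is, the edited line ends up as (same caveat as above — surrounding XML omitted):

```xml
<!-- dedupe only files of 1 byte or less over the WAN, i.e. effectively never -->
<dataDeDupAutoMaxFileSizeForWan>1</dataDeDupAutoMaxFileSizeForWan>
```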


(Note:  If you use backup sets, you will have more than one of these lines, one per backup set.  I suggest changing them all;  I have not done any testing to see what happens if you disable it only for some of the backup sets.)

I then restarted the CrashPlan engine (/etc/init.d/crashplan restart).
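For anyone who wants to script the whole thing, the edit-and-restart boils down to something like the sketch below. It operates on a scratch copy here so it’s safe to run as-is; on a real install you’d point CONF at the actual my.service.xml, run as root, and uncomment the restart line:

```shell
# Sketch: flip every WAN dedup threshold from 0 (unlimited) to 1 byte.
# Using a scratch copy here; on a real system set CONF to
# /usr/local/crashplan/conf/my.service.xml instead.
CONF=$(mktemp)
echo '<dataDeDupAutoMaxFileSizeForWan>0</dataDeDupAutoMaxFileSizeForWan>' > "$CONF"

cp "$CONF" "$CONF.bak"   # keep a backup of the config before touching it

# The /g flag handles multiple backup sets (one line per set).
sed -i 's|<dataDeDupAutoMaxFileSizeForWan>0<|<dataDeDupAutoMaxFileSizeForWan>1<|g' "$CONF"

cat "$CONF"
# /etc/init.d/crashplan restart   # uncomment on a real install
```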

And… VOILA. My CPU usage dropped from 100% of 1 core down to ~10% of 1 core, and my upload jumped from 2.5Mbps (and dropping) to 7.5Mbps (and holding, fluctuating between 6.9Mbps and 7.5Mbps). I’m now seeing patterns that more closely match the “variable network bandwidth due to shared service” theory, without the logarithmic decay in performance.

I have confirmed with other people who have noticed slowdowns on large data sets that this fix works for them as well, so it’s not just something weird on my system.

I updated my 2nd ticket with CrashPlan support with this information, suggesting that they expose the max-file-size-for-dedupe inside the client, rather than making people go dig through XML files.


Update: Now that I’ve hit off-peak hours, I’m seeing upload speeds >9Mbps again as well. Yay!

Update 2:  Check out my followup blog post with an updated traffic graph HERE 

154 Responses to Speeding up CrashPlan Backups

  1. The Windows 7 x64 path is \ProgramData\CrashPlan\conf

    Thanks VERY MUCH for posting this fix.

    • alter3d says:

      @tcw – You’re very welcome! And thanks for posting the Windows path. Since it’s under “Program Files” and not “Program Files (x86)”, I’d assume that the path would be the same on 32-bit as well.

      • Just a note that it’s “ProgramData”, not “Program Files”. It’s sort of an “all users” profile folder for Windows (since Vista), whereas the “Program Files” folders contain program binaries and whatnot.

  2. ValDvor says:

    I’m never going back to CrashPlan after my latest glitch with their system that caused half of my data to get corrupted. I’m sticking with Zoolz Unlimited.

    • alter3d says:

      That definitely sucks! 🙁 I just took a quick look at Zoolz, and unfortunately they don’t have a Linux client, which is a deal-breaker for me.

      No matter what backup provider / software / process you use, it’s a good habit to verify your backups by doing test restores once in a while, just to make sure that the process works and the data isn’t corrupted. I’m used to doing this because I’m an IT geek 🙂 but it’s definitely not something most people think of.

  3. Cleese says:

    Thanks a million for this! I’ve been running CrashPlan on my Linux server and have had the exact same issue, which I’ve been trying to resolve for months. Finally the support team directed me here 🙂

    Indeed, now my backups are running at 8+Mbit/s rather than ~2.5Mbit/s!

    Again, thanks a lot for posting this!

  4. After setting up a few new backups, I get ~ 40 Mbps speeds pretty consistently over a 65 Mbps connection, where I was lucky to get 1.2 Mbps before (it appears my 710+ is CPU limited near that rate, as CPU fluctuates between 60 and 90 percent during uploads). I have compression off and encryption, realtime monitoring and backup of open files on.

    The drawback is that analyzing files seems to take about twice as long as it did before disabling dedupe, when a share goes “Missing” and has to be re-verified when it is reconnected (this is almost certainly CPU limited). I hope CrashPlan reads some of this feedback and fixes their dedupe algorithm, so we can turn it back on someday and it actually works as designed.

  5. Remco says:

    Thank you! This little tip has quadrupled my upload speed!

  6. douglas says:


  7. Olivier says:

    I found the file in Windows but not the reference you mention.
    Should I just add it, and if so where exactly in the file?

    I need this a lot! Thanks !!

  8. Florent says:

    Dear Alter3D

    Thanks for your post. I’m using CrashPlan from a Fiber To The Home connection with a… 200 Mbps upload speed (and 700 Mbps down) from Paris, France. However, I still get the same capping experience at 6 Mbps and a continuous 1 to 2 Mbps. It’s kind of frustrating 😉

    I am a Mac user and I have tried to locate your file, but I only come to a file which is “default.service.xml”. I’m not a geek but I can make a change in a file. Here is an extract of this file:



    Would you or anyone here have any knowledge on how to work this out ?

    Thanks a lot


  9. mexicnamike says:


    On the mac, the my.service.xml is in /Library/Application Support/Crashplan/conf

  10. synobill says:

    Thank you so much for this!

    Does anyone have an idea how to find/access and then edit the my.service.xml file if we are using CrashPlan as a headless client on our Synology DiskStation?

    Much obliged.

    • JackDan says:

      Look under /volume1/@appstore/CrashPlan/conf; this is where I found the file.

      • Airplanedude says:

        Thanks All.
        I’m running the headless client on a Synology Diskstation 1513+.
        I modified the file per the instructions provided by the OP, at the location Dan pointed to, /volume1/@appstore/CrashPlan/conf.
        (Before I did this I had changed the file on the local machine – then realizing that it probably has to be done on the Diskstation itself. At any rate, the local machine still has the edited file.)
        I do not see an increase in effective upload speed.

        Anyone out there who has experience running Crashplan off a NAS and using the config file edits described by the OP?
        Pointers highly appreciated.

        • Bashyroger says:

          Hi Airplanedude,

          I just tried this on my Synology 211+ and it worked perfectly! My upload changed from around 1 to 5.6 Mbps. The only thing I did was change the contents of the dataDeDupAutoMaxFileSizeForWan XML tag from 0 to 1 on the Synology and restart the service…

          • potatohead says:

            Hi Bashyroger,

            I’m trying to do this via SSH with the admin account on my Synology DS1511+, and when I try to edit my.service.xml with vi, I’m told it’s read only. I’m a Linux newbie, so I don’t know my way around file permissions. I’ve tried chmod on both the conf directory and the my.service.xml file, but I’m disallowed: permission denied.

            Any ideas?

            Thanks in advance,

          • Thanks so much !!

            If you are having trouble editing the file, SSH to your Synology and login as “root”. Same password as the admin password.

  11. shaun says:

    Using a Mac, backing up my iTunes library and a few Aperture libraries, ~800 GB… edited the file and saw an increase in backup speeds… unfortunately not to the point you saw… wondering if I should un-adopt and start afresh…
    thx for the tip… I don’t think I’ll be going back to CrashPlan once my 3yrs are up… next yr…

  12. Mikel says:

    Dude, thank you so much!! I’ve been uploading for about a year now and am at 7.7 TBs out of 7.9 TBs. I’ve hesitated to add media to my server because the upload would never catch up at the rate I add media, but not anymore! I was getting a solid 1.2 Mbps for months, but as soon as I made the change illustrated above, it jumped up over 8 Mbps and it holding steady. Thanks a million!

  13. I’m running crashplan on a synology NAS, with a 10mbps upload speed fiber connection.

    Some services can easily go up to 10mbps, but CrashPlan doesn’t; even with your trick it seems nothing changed — stuck at 2mbps max.

    My cpu is as high as 6% and my memory usage at 10%.

    Any ideas or people in the same situation ?
    Thanks !

  14. Rodney says:

    I tried adding 1 on my Windows 7 x64 box, but no luck. What am I missing? Please don’t laugh — I am getting 106 Kbps. Yes, K, not Mb. Please help. The speeds I get are 30Mb down and 5Mb up.

    • Jeff says:

      I made the change to the file as posted and CrashPlan started uploading my whole data set all over again — 800 gigs. It’s averaging about 6mbps but going to take days to get caught up again.

    • Rodney says:

      I made the change, but I am not seeing any increase in upload speeds.

  15. Oskar says:

    Any mac users here care to elaborate where to find the right file? My /Library/Application Support/Crashplan/ contains only 1 file, ui.properties.

  16. Thanks for this tip! I went from 6 Mbps to around 30 Mbps!

  17. I made the change (windows 7 x64 box) but it made no difference. I have 8Mbps upload speed but I’m still only getting 2.5Mbps upload to CrashPlan.

    Incidentally I had to run Notepad++ as administrator in order to save the change to my.service.xml

  19. Brent says:

    I’m running the latest Mavericks OS, and there is no directory /usr/local/crashplan/conf/. I did the Ctrl-Shift-G; that’s what it said. In fact, in the /usr/local folder, I only have 3 items. I’ve got a blazing fast internet connection with 1TB of data to upload. Could you walk me through how to get to the file and then how to manipulate it? Do I need Terminal? Do I edit the file in TextEdit? Pardon my ignorance. Thanks.

  21. Hey, I am also wondering how to find and edit the mac version file; I don’t want to wait 35 days for this backup to be complete :/ I am getting 1.6mbps right now. I can see the .xml file, but will editing it now make the backup start over?

    Edit: on second thought, I tried editing the file and it won’t let me, even though I am the admin of this computer. It says I don’t own the file.

    • Hello, To edit the file for OS X (Yosemite), takes multiple steps.

      1) Open my.service.xml as text document, change string and select “Duplicate”, save to documents. Make sure to keep as .xml extension
      2) Open two finder folders. First one with documents folder and second one with location of xml file.
      3) Rename the xml file you saved to documents to my.service.xml
      4) Rename the xml file located in /Library/Application Support/Crashplan/conf to xxmy.service.xml
      5) Copy your xml file from documents folder to /Library/Application Support/Crashplan/conf
      6) /Library/Application Support/Crashplan/conf should now have both files.

      I was old fashioned and did a plain restart of the computer, then re-read the post and did the stop/start of crash plan. It looked the same to me either way, could not tell a difference on the Crashplan backup tab.

      I was 80GB through a 600GB new backup and it didn’t erase the online backup. Only concern is that the first 80GB will be recognised in the future on a restore . . .

      Hope this helps, I have been watching it for 2 hrs now, jumped from low of 800kbps to 4.5mbps, but its the weekend, hoping for better during the week.

  22. Paul Carpentier says:

    HI! I loved your article and analysis and it sounded extremely plausible to me, so my disappointment was substantial when I tried it out on my new iMac running Mavericks and found no difference after applying the mod to the XML file. Any insights by anyone would be appreciated!

  23. Patrick says:

    I edited this file for a Mac. (fyi, someone listed above in the comments where to find it).

    It did improve my speed slightly, but I am running at 1.6 Mbps right now. Prior to the 2 TB level, I was getting up to 10 Mbps consistently. I’ll keep an eye on it. Anything else to try?

    • Same thing here. Running on Windows 7 (i7); upload speed started at 5-7 Mbps, then the next day dropped to 1-2Mbps. I’m uploading for the third day now, got to 31GB, and the speed dropped to a slow and steady ~500Kbps. CPU does not look busy at all. I have a default.service.xml file but there is no dataDeDupAutoMaxFileSizeForWan tag in it…

      I wonder what’s going on. Any clues are appreciated. I’m testing the service right now (30 days free trial), want to go with full paid CrashPlan, but need to confirm that this is going to work.

      Any help?

      • Michael says:

        I just used this on my Windows 7 x64 and it worked fine – not spectacular but I went from ~1.1Mb/s to ~2.3Mb/s so definitely an improvement. If you are seeing the default.service.xml file instead of the my.service.xml you are probably looking in the \Program Files\CrashPlan\conf directory instead of the \ProgramData\CrashPlan\conf directory. Don’t forget, you need to edit the file as an administrator and must restart CrashPlan (system tray, service, processes, etc.) before it will take effect. Good luck.

    • I was able to find and edit the file my.service.xml on my Mac (Mavericks), in folder /Library/Application\ Support/CrashPlan/conf using sudo vim.
      However, my upload speeds are still fairly slow (2.7 Mbps), and I am on a gig connection. Are there any other settings to tweak?

  24. Justin says:

    I made the change you suggested above but am still watching my speed systematically drop to 1.5Mbps. On a 50Mbps fiber connection. So frustrated! I’m on Win 7 x64; are there any other settings you recommend changing? I have compression off and data deduplication set to minimal.

  25. Pingback:Network Rockstar | Speed Up CrashPlan Backups: Automagic!

  26. Ronny says:

    Does the deduplication option under advanced settings need to be AUTOMATIC in order for this setting to work? I can also choose MINIMAL or COMPLETE deduplication in the settings. Could we just change the setting to MINIMAL instead of editing this XML file?

  27. Adrian says:

    I have changed my dataDeDupAutoMaxFileSizeForWan but it did not speed up anything; I’m still transferring below 400 Kbps. But the CPU usage went from a constant 100% to less than 10%. I’m in Sweden, so maybe the distance is part of the problem. I’m using OS X Mavericks.

  28. Jorgen says:

    It works! 🙂 Thanks alter3d !
    I’m on a 36 Mbit / 36 Mbit internet connection in Denmark and my upload went from 200-300 Kbyte/sec. to around 5-9 Mbit/sec. 🙂 nice!
    ( backing up Synology NAS DS212j )

    • Hey Jorgen, I too have a Synology NAS with Crashplan installed. Did you edit that file on the NAS itself and if so how?

      • Jorgen says:

        Hi Alex

        Yes, I enabled SSH on the NAS and used Putty as SSH client from my computer.
        Login as “root” with your admin password.
        Now go to this path: /volume1/@appstore/CrashPlan/conf/
        I edited the file my.service.xml in VI

        / Jorgen

        • Thanks Jorgen. I edited the file, restarted the service and no difference. I’ve even restarted the NAS itself and no improvement. The edit to the file is being saved though so presumably it’s reading it in correctly.
          I’m getting about 1Mbps on a line where I can often get 10Mbps upstream.

          • Jorgen says:

            Hi Alex
            Okay, try to set compression to OFF
            In the GUI goto: Settings->Backup->Advanced settings->Configure
            / Jorgen

  29. I’ve now set compression to OFF and still getting the same speed after restarting the server.

    • Jorgen says:

      Hi Alex
      My upload has gone down to 1 Mbps again 🙁

      I’m now throwing away my backup and starting over again with a new one on another CrashPlan server.
      Crashplan has some info about this here: http://support.code42.com/CrashPlan/Latest/Troubleshooting/Can_I_Move_My_Backup_To_A_Server_With_Faster_Speeds

      Right now I’m back up to about 5 Mbps; if it drops again I will consider another backup provider.
      You get what you pay for… and maybe this is too cheap…

      / Jorgen

      • Jorgen says:

        Hi again
        Just a small update.
        The server change was definitely my solution! I have now uploaded more than 300 GB in less than 3 days. My current upload speed is 9.6 Mbps
        / Jorgen

        • Michael says:

          Hi Jorgen. Don’t know if you can help: I have logged into my Synology via PuTTY as root, but when I try to go to the path I keep getting “access denied”. Any suggestions would be appreciated.

  30. synobill says:

    Thanks for the original post and thanks Jorgen — I’ve boosted my upload speed from 1.2 Mbps to 10.6 Mbps. Amazing.

  31. Hi,
    Just downloaded the trial and am trying to upload 600-odd GB. I’ve tried the CrashPlan recommendations for changing the default settings when doing the initial backup and it helped a lot. However, I’m having trouble following the instructions on this page. I’m on a Windows 7 64-bit laptop and I can’t for the life of me find the ProgramData folder.

  32. Wait, I got it…

  33. Access denied – can’t save the file!! What now? I’m already logged on as administrator in Windows 7 x64. I’m just editing the file with Notepad if that helps…

  34. Holy cr@p! That’s fast! I just peaked at over 61Mbps! Keeps dropping back down to 30-ish Mbps though. This is for a 700GB upload.
    So for anyone else having the same problems as I did on Windows, here’s what I did in a nutshell:

    1, You need to enable hidden folders (Google that – I did). That’s how you find ‘ProgramData’ folder

    2, For administrative access to making changes to the file, open the ‘my.service’ file in Notepad and make the changes. DON’T SAVE. Instead, open Notepad again from Start menu but as administrator – right click, “Run as administrator”. Copy and paste the entire content from the original ‘my.service’ file into the new Notepad file. Save the file. When the dialog box opens, save as “all file types” and click on ‘my.service’ to write over the original file.

    3, Ctrl+Alt+Del, to start task manager. Click ‘processes’ and end Crashplan. Then restart.

    4, Check out Crashplan’s own support pages for tweaking the default settings: https://support.code42.com/CrashPlan/Latest/Configuring/Speeding_Up_Your_Backup

  35. Pingback:CrashPlan Slow Part 2 - Cloud Storage Buzz

  36. Pingback:Making CrashPlan Faster – Need Help! topic | usa2

  37. Cygnus X says:

    I have been a CrashPlan user for the last 2 weeks. In the beginning, in off hours, I would get 6-7Mbps (got a 35/8 line). Friday night I was at 400 out of 800GB… then Saturday when I woke up I noticed the backup had started over from scratch. ARGHHHH. Turned out that, according to them, and I quote: “What happened was that two automated processes on our server coincided in such a way that caused your backup archive to become unlinked from your computer’s CrashPlan identity”

    Hmmm. Luckily we were able to get it back to 50% complete and get my PC linked up again (they gave me 3 free months for the trouble), BUT now my speeds vary between 800kbps and 1.2Mbps.

    I’m on Win7 Pro 64-bit; the file is located in “C:\ProgramData\CrashPlan\conf\my.service.xml”. As others have noted, you can’t save the edited file with Notepad. I saved it as something else, then in Explorer I deleted the original (Windows will ask for admin rights… just click OK), renamed the file I made to my.service.xml, and that’s it.

    I stopped the CrashPlan service and restarted it… no change, still 0.7 to 1.2Mbps. Sigh… rebooted, no change. At this speed there’s 40+ days to get the other half of my backup completed.

    Comparing upload speeds: uploading stuff to YouTube I get 8.4Mbps, which is where it should be.

  38. Rich says:

    I have found the file on my Mac, but it doesn’t mention deduping. This is the file content below. Does anyone know what I need to change?

    1391693927463 1364274000353 conf/my.log.properties 3600000 900000 CONSUMER true STILL_RUNNING,INTRO /Users/Castellari LOW false IOPOL_THROTTLE 4243 0 false eAC5py7ZWV3ZPh4FeyibcR+ScuE=:+HdDDPlYKvg= true 1364274000353 upgrade 15000 4 1 0 65536 65536 524288 1310720 1048576 2621440 20971520 true 2 true false http://www.crashplan.com false 0.0 0.0 0.0 0.0 0 0 true ANY true 4320 true 7200 true 10080 false 4320 false 7200 false 1440 false false SAFE false 20 Default 1 1391693735291 true ON 1073741824 0 AUTOMATIC true 86400000 03:00 true 900000 true 0 1256016278684 1440 15 10080 43200 /Library/Caches/CrashPlan /Library/Application Support/CrashPlan/backupArchives/ false 4096 32768 131072 524288 524288 10485760 1073741824 604800000 300000 60000 1 true 2419200000 true false true 10 30000 30000 4 2 10 80 20 true true 4242538496 0.8 0.8 30000 0.8 false 61 false -1 4200 false -1 621415107467411493 false -1 622032249959219203 false -1 622033376868040760 false -1 1 false 1227212 false

  39. Just viewed my.service.xml on the NAS and it has no such entry, so I’m stuck. Found it in the Mac client package, but not sure if that does anything?

    Can someone explain in dummy terms what to check/edit when using a Synology headless install with the client on OSX 10.9.1?

  40. Philip says:

    I am on a mac.

    I typed the recommended command and found the file, but when I try to save I get a read-only error. How do I resolve this so I can save the file?

  41. IainB says:

    On a Mac (Mavericks), go to Library > Application Support > CrashPlan and locate the my.service.xml file. Open it in TextEdit (don’t just double-click on it), and change the line’s value from 0 to 1.
    Save, restart.
    This has tripled my upload speed — still only 1.7mbps, but hopefully this will get better later at night. I am in the UK, on Virgin Media cable with a supposed max upload of 3mbps.

  42. Thank you so much for this; it increased my reported upload speed from 3Mb to ~40Mb, bringing my 1.5 TB upload from 40 days to 2 days!

    BUT I don’t think these numbers are accurate, because my speedtest results are consistent with what Comcast says my limit is, at 10mb. Either way though, CPU is down to 10%, so I know it is working.

    • alter3d says:

      Right after you restart the CrashPlan service, or after your backup sets start recycling (if you use backup sets), you’ll notice that your backup speed seems crazy high — I’ve seen 600-700Mbps before.

      What this is is the *effective* backup rate, after taking into account scanning existing files for changes, etc. CrashPlan can update only the parts of files that have changed since they were last backed up, instead of re-uploading the whole file — but it counts the whole file in its speed calculations (even if it’s just scanning to see if it needs to upload changes and ultimately changes nothing), so you see crazy speeds while it’s going through the sync process.

      So basically… seeing numbers way above your line’s capacity is actually normal for a while after restarting CrashPlan or a few other situations… it should settle down in 30-60 minutes, depending how many files you have, how big they are, and how far along you got previously.
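      As a back-of-the-envelope illustration (the 10 GB and 2-minute figures here are made up, purely to show the arithmetic), here’s how scanning already-synced data inflates the reported rate:

```shell
# 10 GB "scanned" in 120 seconds, but reported as if it were uploaded:
# 10 GB * 8 bits/byte * 1000 Mb/Gb / 120 s ~= 667 Mbps "effective"
awk 'BEGIN { gb = 10; secs = 120; printf "%.0f Mbps effective\n", gb * 8 * 1000 / secs }'
```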

  43. FWIW: I updated my linux CP client recently (they have an incompatibility in the latest release where a PC-to-PC backup won’t work until the target is updated to the same version). The default in the xml-config file looks like it’s now 1000000000.

    Not sure why they did that, but that’s what I found.

  44. nick says:

    THANK YOU! I have a very large 9TB backup set and have been suffering for 3 years. Back up to max upload speed. Seriously. Thank you!

  45. I don’t use Crashplan but that’s awesome investigative work

  46. shaun says:

    I started toying with my settings yesterday, as I have 10 months left on my subscription and thought it’d be nice to get a backup set done before then! I’m on FiOS and have been stuck at 300kbps for a WHILE when uploading to CrashPlan, with 40 days to complete my set. I did all the changes suggested up until last December on this blog and no go… so moved to Backblaze. But last night, as I said, I thought I’d try once more. I changed a few things that sounded better and voila… a constant 1.6-1.7 Mbps (12 hrs now!)… now at 7 days till completion. I’m on OSX Mavericks, and here are my changes: user away or present = 100% CPU (was like 20%), verify every 14 days, data de-dup minimal, compression auto, no encryption, no watch file system in real time, no backup of open files; in the frequency tab I have all sliders to the left except for remove deleted files, which is every 2 days; no limits on LAN/WAN sending rates… I think all the rest is at defaults…

  47. Brandon says:

    Great post – Thank you!! This tweak is working for me as well. I’m still migrating everything from my octopus of external drives to the new Synology, but so far I’m doing an initial backup of 3.1TB mostly raw photo data. My CPU and RAM usage hasn’t changed considerably before or after any tweaking, but wow did the speed change.

    DiskStation 1813+ (bootstrapped)
    Client app running on OS X 10.8
    Fiber 75/35mbps pipe.

    -12 hours into the backup I was sitting right where I started, around 7mbps (no noticeable slow down… yet)
    -Altered the my.service.xml file per your suggestion, turned off compression in the app, and
    immediately jumped to ~18mbps.
    -Just checked on it again after another 24 hours, mid-morning on the east coast (probably off-peak for CrashPlan’s servers) and I’m at 36-37mbps. That’s higher than my connection is even rated for, but could be accurate… I don’t think my 35mbps upstream is a hard cap. Amazing.

  48. chris says:

    Thanks for the tips everyone. Was able to get CrashPlan (running headless on a Synology NAS) uploading much, much faster now. Was able to go in via SSH (with no real experience doing Linux command line) and edit the file on the NAS, using a Mac. Instead of doing like 300kbps, am now doing between 5Mbps and 10Mbps, which is pretty much what my upload speed is rated at. It’s still going to take at least ten days, but that’s significantly better than the 6 months or whatever it was going to take to upload 1.2TB of data.

  49. Thank you so much for the info. I have been using CrashPlan for over 3 months and was just about to cancel because the speed was at about 900 Kbps, and when trying to upload 1.8 TB of data it was just taking too long (after 3.3 months, only at 95% completed and still 14 days to go). I made your change as described and WOW, 10.7 Mbps on average now (and only 13 hours remaining). This is with a 105/10 Mbps connection. Thank you again.

  50. Grant says:

    I have 12Mbps upstream but am getting max 2.5 to CrashPlan. Speed tests and uploads to other servers indicate the bottleneck is not at my end. I tried the changes above (Windows 7) but no matter what I try, it maxes out at 2.5. Guess patience will have to prevail.

  51. Aaron says:

    I have to say thank you.

    I was on my 3rd data set and it slowed to 2Mbps no matter what, even though my first data sets were 30Mbps.

    Now it’s back up to 30Mbps and spikes at 35Mbps. Now I can finish my last 2.5TB quickly and not in 7 years. lol

    PS: on Linux.

  52. Jambert says:

    I am also on Windows 7 x64 and saw no difference. Also, the speed in the CrashPlan GUI is wrong; if I check my router it hovers around 2 Mbps (I’m in Europe). The GUI starts at about 400 Mbps 😛 but it does level off to about 2.3 Mbps after 5 min.

    I do not see the 100% CPU usage either, with or without data deduplication; it is usually about 2% with the occasional peak at 7%. Maybe this fix only works on old computers that can’t handle the calculations required.

  53. Thank you so much!

    I have a nice FIOS connection (75/35), and when I first started using Crashplan, I was getting between 20Mbps and 35Mbps depending on network traffic. After a while, I started capping out at 9.5 Mbps. This computer also doubles as a Folding@home machine, and I noticed it was slowing down my work units considerably. Looking into it further, I found Crashplan was maxing out a full core on my overclocked i7 3770k. This is when I started looking for solutions via the Google.

    Found your post here, and made the change. Currently uploading at 34.5 Mbps!

  54. CrashPlan was running extremely slow at 2 Mbps on Verizon FIOS until I made this one change. It jumped to 20-100 Mbps. Backed up 3 TB in a bit over 2 weeks, so very happy. My question is: what does this one simple change really do (what am I giving up, if anything)? And why doesn’t CrashPlan just make this the default setting, given such a tremendous performance improvement?
    Thanks for any info. Curt

  55. Chris says:

    You are the man! My uploads jumped from 4-5 to >40Mbps. Thank you so much!

  56. I ran the script and confirmed that it worked. For some reason, I am still getting slow upload speed. (around 2 Mbps) vs. a network speed test for uploading of 30 Mbps. Does anyone have any additional advice on what I can do to increase the upload speed?


  57. In backup settings, should Data de-duplication be set to “automatic” or “minimal”?

  58. Zalia says:

    @ Philip: To get around your read only error select the config folder then go to File/ Get Info, then click on the lock in the bottom right corner to be able to make changes (fill in your pw), then change the admin to Read & Write (instead of Read only). Close the lock again and then open the xml file in TextEdit and change it.

    I’ve changed the 0 to a 1 and for the first couple of minutes I saw improvement, now back at a sluggish 300kbps… Any other ideas??

    • Whammy! says:

      @ Zalia: I turned off compression and that helped quite a bit. If you’re backing up media (JPG, MP4, MOV, etc.), there’s most likely no reason to be compressing it, and turning compression off will free up a ton of CPU power.

      @ alter3d: You’re a true Internet Hero!

  59. Steve says:

    Thank you. I know this is an old post and I hope you get this, but wow. I wish I had read this earlier, as I am backing up movies (large files) and it took me months. I changed the one field and went from under 1MB to over 15MB.

  60. garth25 says:

    Thanks so much for this. Went from 3Mbps to a sustained 11.5Mbps overnight.

  61. Thanks, this does work! I had to use your script though, because the instructions here don’t tell Windows users to restart the CrashPlan engine, not just the main program application, after changing the XML file.

  62. Peter says:

    Thanks! Any chance of you sharing the grep, awk, and sed scripts you used to produce the data? Thanks!

  63. Holy crap, you saved my whole backup experience.
    Went from 6Mbps to 86, and cpu load from 1.4 to 0.2…

    You, dear Sir, are a Hero!

  64. Not working for me. It’s 24:00 in Switzerland and I have 100/100 FTTH. The worst speed test I ever ran still showed 30Mbit upstream.

    Guess CrashPlan’s network is the bottleneck here. Probably better for me to look for a European alternative. Besides Zoozle, are there any other suggestions with unlimited plans?

  65. Excellent! I’m glad I re-read the part about multiple backup sets. It didn’t work for me until I changed all the backup sets. I am now maxed out on my upload speed. When I started using CrashPlan a couple of years ago with my Synology, I thought that was just how it was. I took my Synology into work to use the high-speed upload, but it wouldn’t run any faster than it did at home.

  66. It would be awesome to know more about what logs you used and how you extracted the data points you charted. I’m contemplating writing a service to email me if my large, long-running backups stop or drop below certain upload speeds.

    • alter3d says:

      I can’t seem to find the original scripts that I used to do the data point plotting, but you can find the raw data in the CrashPlan history logs (in Linux, under /usr/local/crashplan/log/history*).

      In those logs, there are lines like

      I 11/07/14 10:20AM [BACKUP_SET_NAME] Stopped backup to CrashPlan Central in 22 minutes: 1 file (1.40GB) backed up, 400.80MB encrypted and sent @ 6.6Mbps (Effective rate: 7Mbps)

      With a bit of text parsing to pull out the date/time and the data rates, you can massage it into a usable form.
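      For example, a quick parsing sketch in Python (a hypothetical reconstruction, not the original script — the regex assumes the exact log-line format shown above and may need tweaking for other CrashPlan versions):

```python
import re

# Matches history lines like the sample above:
#   I 11/07/14 10:20AM [SET] Stopped backup ... sent @ 6.6Mbps (Effective rate: 7Mbps)
LINE_RE = re.compile(
    r"^I (\d{2}/\d{2}/\d{2}) (\d{1,2}:\d{2}[AP]M) "
    r".*sent @ ([\d.]+)Mbps \(Effective rate: ([\d.]+)Mbps\)"
)

def parse_history(lines):
    """Yield (date, time, sent_mbps, effective_mbps) for each matching line."""
    for line in lines:
        m = LINE_RE.match(line)
        if m:
            date, time, rate, eff = m.groups()
            yield date, time, float(rate), float(eff)

sample = ("I 11/07/14 10:20AM [BACKUP_SET_NAME] Stopped backup to CrashPlan "
          "Central in 22 minutes: 1 file (1.40GB) backed up, 400.80MB "
          "encrypted and sent @ 6.6Mbps (Effective rate: 7Mbps)")
print(list(parse_history([sample])))
```

      Feeding it all the history* files gives a list of (date, time, rate) tuples that plots easily.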

  67. Christian says:


  68. Running on a MacBook Air with Yosemite. Changed it as described and saw substantial improvement!
    Improvement from approx. 2.8 Mb/sec to 28-32 Mb/sec!!
    Peaks (several minutes) of 180-200 Mb/sec!

    Thanks so much!

  69. Great tips, very useful, has in fact made my backups a lot faster 😀



  70. Alex Berg says:

    will not disable block deduplication, which is awesome because it means that if the start of the file is changed but nothing has been inserted or removed, the later part of the file is still not transferred; only the blocks containing the edits are transferred.

    The dataDeDupAutoMaxFileSizeForWan setting disables the ‘rolling’ deduplication for all files larger than the given size in bytes. The rolling deduplication detects when stuff has been inserted or removed, thus moving blocks in the remainder of the file.

    To test that this is correct, pick a large file, copy it, and then edit the beginning of the file, for instance using (on your file called ‘my_test_file’):
    dd if=/dev/random of=my_test_file bs=1k count=1024 conv=notrunc

    Caution: dd will write what you tell it, where you tell it, and will happily erase your hard drive. Do not mess around with it (as in changing the arguments) before learning what it does.

    You can see what has actually been transferred in the log file crashplan\cp_bin\log\backup_files.log.0; each line concerns a file, and the syntax is:
    Crashplan log syntax
    I 42 0 (filesize in bytes) [,,,0,0,? (total file blocks?),?]

    Best, Alex

  71. Alex Berg says:

    I meant setting
    dataDeDupAutoMaxFileSizeForWan =1

  72. Noticed that the change from 0 to 1 in my.service.xml somehow went back to 0… Strange huh?
    Don’t know at what moment (maybe a restart of the MacBook, Yosemite).
    Installed the latest Java version, stopped CrashPlan and restarted CrashPlan (in the terminal! not just quit).
    Situation now is that my.service.xml is still at 1, and Java is updated to the latest version, but unfortunately I’m back to the standard 3 Mb/sec.
    However… if I pause the upload (pause button on screen) and a few seconds later start the upload again, I get speeds of 200 Mb/sec for a few seconds, going down to 3 Mb/sec again.

    • Pete says:

      Mine’s also randomly changing back to 0. It stays for a while; I didn’t figure out what makes it go back. Does anyone have any ideas?

      • Cristina says:

        on mac:

        stop the CrashPlan service:
        CrashPlan console > click the logo on right top > stop

        open command prompt:
        Applications > Utilities > Terminal

        open TextEdit as admin:
        sudo /Applications/TextEdit.app/Contents/MacOS/TextEdit

        open and edit config file:
        /Library/Application Support/CrashPlan/conf/my.service.xml


        restart crashplan service

  73. This is what I have sent to CrashPlan. I can only suggest sending a similar message to CrashPlan too. Maybe that will wake them up. Excuse me for my English, but I hope the message is clear.

    Dear people of Crashplan,

    Please, please do something about the upload speed!
    Upload is way too slow, and it’s not because of hardware, network, or internet speed.
    Internet forums are full of similar problems and lots of users experience this slow upload speed.

    Current speed is 3 Mb/sec, but should be much much higher in my opinion.

    I tried everything, including your setting adjustments, but it keeps uploading slowly.

    Come on, people of CrashPlan. It’s 2014. We’ve got cars driving on Mars, so please don’t tell me that the upload speed can’t be higher.
    Maybe it has something to do with good old-fashioned Java? I don’t know, and it’s not my job, but an upload speed of 3 Mb/sec is not 2014.

    B.t.w., my internet subscription is 180 Mb/sec, unlimited, and working fantastically. Location: xxx (only for CrashPlan)

    Sincerely hope you take my complaint seriously and get your technicians @ work.

  74. Chris says:

    When I run CrashPlanFix.bat as an admin on my Win7 machine I get this message:

    “**** Updating my.service.xml … ***
    ‘sed’ is not recognized as an internal or external command,
    operable program or batch file.
    **** Stopping CrashPlan service … ***
    The CrashPlan Backup Service service is stopping.
    The CrashPlan Backup Service service was stopped successfully.

    **** Starting CrashPlan service … ***

    The CrashPlan Backup Service service was started successfully.

    **** Done ! ****
    Press any key to continue . . .”

    Is this what I should see?

    I do see the my.service.xml file in the conf folder.

    Thank you

    • alter3d says:

      No, this isn’t quite what you should see. The ‘sed is not recognized…’ bit means that it won’t work.

      Did you run the batch file directly from within the ZIP file? You probably need to unpack the whole thing to a directory first. The ZIP file contains a Windows version of sed (sed.exe) that does the search-and-replace stuff, and it won’t get extracted/executed if you don’t extract everything first.
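      For anyone who can’t use sed at all, here’s a hedged Python sketch of the same search-and-replace. It assumes the setting is stored as an XML element like <dataDeDupAutoMaxFileSizeForWan>0</dataDeDupAutoMaxFileSizeForWan> inside my.service.xml (the file path varies per platform, as discussed in this thread), and you’d still need to stop and restart the CrashPlan service around the edit:

```python
import re

def set_dedup_max_wan(xml_text, new_value="1"):
    """Replace the numeric value inside <dataDeDupAutoMaxFileSizeForWan>.

    Operates on the file's text: read my.service.xml, run this, write the
    result back (with the CrashPlan service stopped so it isn't clobbered).
    """
    return re.sub(
        r"(<dataDeDupAutoMaxFileSizeForWan>)\d+(</dataDeDupAutoMaxFileSizeForWan>)",
        lambda m: m.group(1) + new_value + m.group(2),
        xml_text,
    )

before = ("<config><dataDeDupAutoMaxFileSizeForWan>0"
          "</dataDeDupAutoMaxFileSizeForWan></config>")
print(set_dedup_max_wan(before))
```

      Using a callable replacement (rather than a backreference string) avoids any escaping surprises if you pass a different new_value.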

  75. goslow2gofast says:

    Sorry if this seems like a stupid question, but I can’t find the /usr/local/crashplan directory, or the my.services.xml, on my Synology NAS server. It’s a Linux device, and has CrashPlan installed and running, and I can SSH in to it and see folders and files, but I can’t track down the xml file. Any thoughts?


    • alter3d says:

      Most NAS units don’t run “normal” Linux… they run a Linux kernel and many of the same user-space tools, but where they put stuff in the filesystem can be quite different.

      I don’t have a Synology, but you can probably find the file with:

      find / -name my.service.xml

      (note that it’s ‘service’ and not ‘services’ like you had in your post)

  76. goslow2gofast says:

    Okay, thanks, I think I found it at:



  77. Wout Reynaert says:

    First of all: thanks a million for sharing this information with us, you have helped out a lot by sharing your knowledge.

    I had the same problem: max upload speed never went near the 10Mbps upload limit of my ISP, while other uploads (like wetransfer) all achieved maximum speed (around 9.6 Mbps). My upload to crashplan averaged around 2 Mbps… I tried the solution I found in this blog (disabling deduplication and compression), but to no avail.

    HOWEVER: I did find the solution for my problem: besides changing the dataDeDupAutoMaxFileSizeForWan, I also had to set the following settings to a higher number:


    Those settings can be found in the same xml file, under

    By default the sizes were configured as 65536. I changed the values to 1048576 and restarted the CrashPlan service, and BAAAMMM!! 9.6 Mbps upload to CrashPlan continuously!!

    Hope this can help out others who are experiencing the same problem…


    • Wout Reynaert says:

      I see some of my comment was stripped off while posting (because it was included in brackets). The settings I set to 1048576 are:


      these can be found under servicePeerConfig

  78. James P says:

    Thanks for this tip. I was able to back up 3TB of data in 5 days!!

    Now, disaster struck and I lost my HD. Unfortunately, the dedupe settings don’t help with restore speeds. I’m now at a 2-month wait to get the 3TB of data restored :(. As before, downloads start off fast, then quickly grind to a halt.

    Has anybody figured out how to RESTORE data at a faster speed?

  79. Kristiyan says:

    Thank you so much for the post, it really, really helped a lot!

    and I have applied both




    I was restoring a few gigs of data from last night at 3Mbps and now it’s 25 Mbps average!

    Will see how the upload speed is once the restore is complete, but it should be about there as well, as I have 100Mbps fiber optic.

    Thanks guys!

    • Mike says:

      You said that you changed the properties for:



      I am still having slow speeds after just setting dataDeDupAutoMaxFileSizeForWan to 1. What did you set these other properties to in order to speed it up?

  80. Pingback:Network Rockstar | Speeding up CrashPlan Backups | FeedLab

  81. Anders says:

    The dataDeDupAutoMaxFileSizeForWan=1 did the trick on my Synology DS212+; now I am uploading with speeds up to 11 Mbps.

  82. Thanks alter3d!

    Went from 3.5 Mbps to 55-82 Mbps, averaging about 69 Mbps.
    Went from 99% CPU to 20-44% CPU depending on upload speed.

    No wonder Code42 support says that 4 Mbit is faster than average.
    They have a serious problem with the software, basically limiting the majority of people to 3.5 Mbps.

    Thanks for fixing it alter3d!

  83. Pingback:Advanced CrashPlan backup strategy | Callum Macdonald

  84. Thanks for posting this, can confirm it seems to have helped with my upload speed.

  85. Arto says:

    Great write-up! It worked wonders for me! Uploaded 7.5TB in 21 days. Unfortunately, now that the backup has completed, CrashPlan will not run. Probably has to do with RAM restrictions and 2GB not being enough.

    • Jamie says:

      Arto – this was probably due to the upgrade to 3.7. It broke for everyone. There was a temporary fix which worked for some, but platters now has an upgrade in the DSM package manager. Install that and you’ll be away again – but see my post below.

  86. Pingback:Followup: Speeding up Crashplan Backups | Network Rockstar

  87. Jamie says:

    Having now had this running for a couple of days I can say that this speed fix is brilliant. It’s cut months off my back up time and still works perfectly with the 3.7.0 upgrade. I can’t thank you enough for this. Nice one! 🙂

  88. The tweaks no longer work for me with the updated client. Now getting 1Mbps backups when previously maxing at 25Mbps.

  89. CrashPlan has the most shitty and unhelpful customer service I have ever seen.

    At the very beginning I asked their online chat representative how to set up port forwarding and open ports on firewalls for the client to work. They didn’t even tell me exactly which ports have to be open!

    What made me more frustrated is that they said network configuration for CrashPlan to work is not included in CrashPlan support because it’s not “CrashPlan related”.
    How stupid is that?!

  90. AWESOME! It worked. For some reason, when I woke it up from sleep mode it was backing up at 150 mbps!!! Then of course it went down to 3 in about 2 minutes :(.
    I tried the fix just now and it’s up to 19.5mbps.
    Hope I can get back in the hundreds at some point…. 😉
    Thanks for this!

  91. Wow!

    Backup speed went from around 800Kbps up to 12-19Mbps on my DS214se.
    CPU utilization decreased from over 85% to 77%.

    Thanks for sharing 😉

  92. Tutorial for those using Windows to gain access to the my.service.xml file in the hidden @appstore directory on a Synology NAS:


  93. Wolfgang says:

    Worked perfectly on my QNAP NAS. Upload speed changed from about 8Mbps to around 90Mbps!!!

    Thank you so much for publishing this… I have to back up about 1.3TB!!!

  94. Sherree Casusol says:

    Thanks so much for this tip. I just signed on to CrashPlan and it was taking over a week for the initial backup (the dashboard said 19+ days!). This fix is awesome! If you no longer recommend CrashPlan, what do you recommend? I am on the 30-day free trial right now so I can still switch.

    You indeed rock!

  95. Keith says:

    Worked for me as advertised; DS214+ backing up 663GB in 62,053 files. I was stuck at 2.2-2.3Mbps upload even though my Comcast cable will speedtest.net at about 5.6Mbps. After the change the backup started high, then worked down to 5.4-5.6Mbps as expected. CPU usage during backups ranges from 7-12% in top. Did the .xml change as above and set dedupe to minimal and compression off.

    • Keith says:

      Should have mentioned: most of what I back up is photos, movies and FLAC audio… so little compression would be realized, if any.

  96. This also worked for me. I have the headless CrashPlan and my speeds jumped up from the 2Mbps upload where they were stuck before. This is how my settings in (my.service.xml) look right now (after the change):

    I made this change in both .xml files (on Windows 7 and also in the headless NAS folder). For those running normal CrashPlan on Windows, just updating the one file in program data…../conf should be enough.

  97. Daniel Horande says:

    I tried this; it didn’t work. I am having NIGHTMARES with this!!!!:-((((

    I recently got Xfinity as my ISP. My service is 50 Mbps download / 5 Mbps upload.
    I am not getting more than 0.5 Mbps upload with CrashPlan at any moment. I have checked bandwidth several times with different sites from different devices and it always shows 6 Mbps upload.

    When I hook my computer to my cell phone through a hotspot, the CrashPlan upload goes up to 9 Mbps, with constant 5 Mbps uploading.

    Any ideas??

  98. Kerry says:

    FYI, CrashPlan seems to explain that these speed increases are, in essence, an illusion. See this page at their site.

    • alter3d says:

      Quite frankly, Crashplan are idiots. Their dedupe algorithm works fine for small data sets, but is horribly inefficient for large (multi-terabyte) data sets.

      The problem is multifold:
      – Their code is not multi-threaded, so you can have a billion CPU cores and still get bottlenecked on CPU quite quickly.
      – Their dedupe algorithm seems to be quite bad (poor block sizing for the data set size, poor hash lookups, etc).
      – Their dedupe algorithm is super aggressive about deduping, to the point that it will try to dedupe even if it isn’t ACTUALLY BACKING UP YOUR DATA. That is, it spends so much time figuring out if it can dedupe a file (even a brand new file it has never seen before) that your backups run so slow that they will never finish.

      Their claims in that article only hold true for small data sets, or people with specific use cases (i.e. large Outlook PST files), or people with EXTREMELY low upload bandwidth. In all other cases, their algorithm causes problems on large data sets.

      Now, having said all that, I’ll refer you to Update #2 at the top of the post; specifically that CrashPlan lost all my data and I would never recommend them as a backup provider. So the argument over their dedupe algorithm is moot anyways.

  99. Out of interest, what are you looking at as an alternative now? I have around 6TB of data on my mirrored 1513+’s that I was looking to CrashPlan to store, but it’s now looking like that’s not going to be a good idea.

    Dropbox is out at £66 per month for their Dropbox Business (it’s limited to 5TB).

    • alter3d says:

      To be honest, I just use rsync to a remote server for the small, very important stuff, and manual backups for the big, less important stuff. I have a bunch of large (4TB+) drives that I bring home, run backups, and bring offsite every few weeks. Last time I looked I hadn’t found a cloud provider that did everything I needed.

  100. PeterJ says:

    Hi, great tip.
    My upload speed went from 1.7 Mbps to 11.8 Mbps and CPU from 50% to 16%.
    The file location on Synology is, BTW, /volume1/@appstore/CrashPlan/conf

  101. austin user says:

    Hi all,

    I tried this as well and it helped, I am using a Synology NAS. The file was located in /volume1/@appstore/CrashPlan/conf

    However, I found initially that when I re-started the app in Synology the changes were clobbered. So, I removed write permissions (chmod a-w my.service.xml) to prevent this.

    My upload speed went from 2.5Mbps to ~6.5Mbps consistently. Still not close to using all the available BW, but much better!

  102. Palannov says:

    Your solution worked! For a while…
    It improved my backup speed from 1Mbps to 300Mbps. But as it progresses, over a minute, it starts dropping back down to 1Mbps.

    I found that by pausing the backup and restarting it, it jumps back up to 300Mbps. But it starts dropping back eventually to 1Mbps again.

    Any advice would be hugely appreciated.

  103. Anyone have any ideas on how to make it download faster by messing with a config setting?

  104. Craig Bryant says:

    I just had to comment to say that this solution worked amazingly for me. I have been having issues with slow backup speed when backing up from one machine to another on my home LAN (so not externally). I was getting around ~3.6Mbps, which was going to take over a month to back up ~900GB of data. After setting both the dedup flags in the config file my speed sits around ~60Mbps while even peaking at over 100Mbps sometimes! I even raised a support ticket with CrashPlan and this never came back as a solution.

    Thanks for writing the guide 🙂

  105. Jeffrey Helle says:

    I’d just like to add my thanks for this fix. It solved my problem!

    For MacOS Sierra Beta the file location is:
    Macintosh HD/Library/Application Support/CrashPlan

  106. Ryuugakusei says:

    For a headless Synology client, you can edit the “my.service.xml” file in the following folder


    If you get the “Can’t open file for writing” message when trying to save, use the sudo command for admin level editing:

    Go to: /volume1/@appstore/CrashPlan/conf
    Type in: sudo vi my.service.xml

    Change the 0 to a 1.

    My upload speed has gone from 1.3 Mbps average to 8 Mbps, shaving about 6 months off my estimated upload!

  107. Pingback:CrashPlan on Synology NAS | Gunnar Olafsson

  108. It’s 2017 now, and CrashPlan is still plagued by this problem.
    I began with a staggeringly slow 2.5Mbit max.
    By changing the value it is 30Mbit now!
    So: thank you for sharing!

  109. Remulus says:


    I set


    and your suggestion, and it really works like a charm.

    My backup and restore speeds have increased by 4x.


  110. Remulus says:

    Some text disappeared from my previous post.

    So I also changed value of inboundMessageBufferSize and outboundMessageBufferSize to 1048576.

  111. Wow, thank you! This made me go from 2MB/sec to 136MB/sec! From 30 days to 10 hours, lol. And regarding your lost data… yikes. I just signed up and now I am afraid =(

  112. Ah, that is a thing of beauty. I have a 10.5TB backup that includes some files that are over 300GB in size. This was running at <3 Mbps. I made the change (I used a value of 128k instead of 1), and my speed jumped to 63.5 Mbps. Since I expect Verizon to throttle me at that speed, I set a cap of 40Mbps. That's holding steady. Thank you for the guidance.

    BTW, I recently did an adoption. Much cursing in my house. I'm sorry to hear your troubles. I eventually got mine sorted, so I'm fine. But this thing is a nightmare if you want to do anything out of the ordinary.

  113. Kerry says:

    I saw your update 2 and I have just had the same scenario happen to me. I adopted a new iMac to replace my previous one. I have over a 9 TB backup. I just went to restore a file and all prior backups are gone. CrashPlan has lost my backup following their instructions.

    This is actually the second time they have lost my backup. The last time was just after I signed up for 4 years of service. They extended me to 5 years that time.

    I now hate CrashPlan, and they’re killing Home. Trouble is, for my volume of data on my NAS, Backblaze would cost over $600 per year…

  114. Robert says:

    For those using a Synology as a headless client, this doesn’t appear to work unless you also change to 1 as well. Example:


    This got me from 3.1 Mb/s to 17.8 Mb/s. I suspect that since it’s being transferred through my CrashPlan Windows client over my LAN, the first setting is being applied?

    • Robert says:

      FYI, it looks like the comment fields strip any kind of tags, so here it is without the tags:

      dataDeDupAutoMaxFileSize 1
      dataDeDupAutoMaxFileSizeForWan 1
