I am trying to download a large file from IPFS using the wget command. The initial download is fast, but once I have downloaded 13.2GB of the file the connection just halts and becomes extremely slow. Is there a limit in DigitalOcean?

I have checked:

  • Disk space
  • Bandwidth not maxed
  • Re-downloading: same thing, fast until it reaches 13.2GB, then super slow

command used:

wget -rc --no-parent URL/filename.tar.gz


2 answers

Hello, @lekanovic

Have you checked whether the load on the server is high while the file is downloading? You can monitor the server's performance using top or htop. I had a similar issue once: the load on the server was getting high for some reason, and I needed to perform the action when the server was not that busy. You can also quickly check the server logs for anything suspicious.
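For reference, here is a quick way to take one-shot load snapshots while a transfer is running (run in a second shell; the commands are standard Linux tools, purely illustrative):

```shell
# Non-interactive snapshots of server load while a download is in progress.
uptime                # 1/5/15-minute load averages
top -bn1 | head -n 5  # batch-mode top: a single snapshot instead of the interactive UI
free -m               # memory and swap usage in MB
```

If the load averages stay low while the transfer is stalled, the bottleneck is unlikely to be CPU contention.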

Also, are you eventually able to download the file, and the issue is just that after it reaches 13.2GB the download speed becomes slow?

Let me know how it goes.

Regards,
Alex

Thanks for your answer Alex,

Let me better describe the scenario.
I start a download with wget; it is super fast until it reaches 13.2GB, then it stalls and becomes super slow (kB/s). If I leave that command running, open a new shell, and rerun the same command into a different local folder, the same thing happens: it is super fast until it reaches 13.2GB, then this new wget session also becomes slow.
So it is not that the device is slow or busy; it seems to have something to do with the size, but I cannot tell what.

Same thing happens if I use ipfs to download the file

cd folder1
ipfs get <ipfs_path>

It is fast until it gets to 13-14GB, then it stalls. I have checked disk space and top and I see nothing strange. The same thing happens here: if I change folders, it is again fast until it gets to 13-14GB.

cd folder2
ipfs get <ipfs_path>
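Since both wget and ipfs stall at the same point, one way to rule the disk in or out is to measure raw sequential write speed directly (a sketch; the file size and path are arbitrary, and the test file should live on the same filesystem the downloads land on):

```shell
# Measure sequential write throughput to the current filesystem.
# conv=fdatasync forces the data to disk before dd reports a rate,
# so the number reflects real disk speed rather than the page cache.
dd if=/dev/zero of=./ddtest.bin bs=1M count=256 conv=fdatasync
rm -f ./ddtest.bin
```

If this write rate also collapses at some point, the problem is on the storage side rather than the network.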
  • Hello, @lekanovic

    Thanks for explaining this in detail.

    I must say that this is rather strange. We do not limit the download speed on our Droplets in any way, but since the issue always occurs at the exact same point (when it gets to 13-14GB), there might be something that is causing the issue for you.

    I’ve just tested this: I created a test Ubuntu 18.04 Droplet, created a 15GB file, and downloaded it using the wget command you’ve provided. However, I must say that the process took quite a while due to the size of the file.

    wget -rc --no-parent URL/filename.tar.gz
    

    I’ve also downloaded the file using rsync, and the download completed in 3 minutes:

    rsync -avz root@IPaddress:/home/user/filename.tar.gz .
    

    Would you mind also testing the download speed to see if there are any issues when using rsync instead of wget?

    Hope this helps!

    Let me know how it goes.

    Regards,
    Alex

    • Here is how to reproduce the issue:

      mkdir test1
      cd test1 
      wget -rc --no-parent domain.com:8080/ipns/QmNYTSutWNEhArBsdmvJM68PGwVxj8UmZynKzS5yTUXt7B/encrypted
      

      You will now see a fast download until we reach ~13GB, then it will stall and become super slow. Keep this shell open (do not close it); instead, create a new shell and do this:

      mkdir test2
      cd test2
      wget -rc --no-parent domain.com:8080/ipns/QmNYTSutWNEhArBsdmvJM68PGwVxj8UmZynKzS5yTUXt7B/encrypted
      

      You will now see that it is fast until it reaches ~13GB, then the same thing happens. So to me it is not related to download speed but to a disk throttle (or something similar).
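A couple of read-only checks that may help narrow down a size-related throttle (a sketch; run these from the download directory, and note that a quota or an unusual mount option on that filesystem could explain a stall at a fixed size):

```shell
# Inspect the filesystem the download directory lives on.
df -h .   # free space on this filesystem, human-readable
df -i .   # free inodes (IPFS trees with many small chunks can exhaust these)
df -P . | awk 'NR==2{print $6}'   # the mount point, for checking mount options
```

If df reports plenty of space and inodes at the moment of the stall, a filesystem limit is less likely than a network- or peer-side cause.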

      • Hello, @lekanovic

        I understand that. May I ask if you’ve tried downloading the file with something other than wget, e.g. rsync, as I mentioned in my previous reply? This is just for testing purposes, to see whether the issue still occurs.

        Regards,
        Alex
