Pushing large files error #99
Comments
@ermolaev94 is it S3-compatible storage (Yandex Cloud or something)? Just curious if it is something specific about them.
According to the AWS docs, it looks like …
It's Yandex S3; the single-file limit is 5 TB.
Hm, thanks, I'll run this command. I will return with an update ASAP.
I've tried your suggestion and the error is still the same. My AWS config file is the following:
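(The config itself wasn't captured in this excerpt; a plausible sketch of an `~/.aws/config` with a raised multipart chunk size, following the AWS CLI S3 configuration docs; the values are assumptions, not copied from the report:)

```
[default]
s3 =
    multipart_chunksize = 512MB
```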
I've generated a huge file with the following command:
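(The original command wasn't preserved; a hypothetical stand-in that allocates a file of that size:)

```sh
# assumed equivalent: allocate a ~1.1 TB (1100 GiB) test file
fallocate -l 1100G bigfile.bin
```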
The file is ~1.1 TB, so with a single chunk size of 512 MB the number of chunks should be under ~2300. Then I ran dvc add & push:
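(Reconstructed from context; the file and remote names are hypothetical:)

```sh
dvc add bigfile.bin
dvc push -r yandex -v
```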
and got the same issue. Then I tried to push via the AWS CLI:
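(Assumed equivalent of the working CLI upload; the bucket name is hypothetical, and the endpoint is Yandex Object Storage's public one:)

```sh
aws s3 cp bigfile.bin s3://my-bucket/bigfile.bin \
    --endpoint-url https://storage.yandexcloud.net
```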
and it works fine. I suppose that …
Overview
Pushing large files to an S3 bucket leads to the following error:
I've tried to fix the situation by setting the chunk size according to the AWS documentation:
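(The exact setting isn't shown; the AWS-documented way to raise the CLI's multipart chunk size looks like this, with the 512 MB value taken from the later comment in this thread:)

```sh
aws configure set default.s3.multipart_chunksize 512MB
```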
It does not help. I've tried to debug `dvc-s3` and checked that the argument is read, but it's not clear how it is used. I've noticed that the `s3` config stayed empty, while `self._transfer_config` was updated. The problem starts from roughly 800 GB file size.
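(Two hedged notes that may be relevant, offered as assumptions rather than conclusions from this thread: S3-compatible multipart APIs generally cap an upload at 10,000 parts, so the effective chunk size bounds the largest object you can push; and the `s3` transfer settings in `~/.aws/config` apply to the AWS CLI's `aws s3` commands, not automatically to boto3/s3fs-based clients. A quick bound check:)

```sh
# 10,000-part cap: max object size = parts x chunk size
echo "$((10000 * 8 / 1024)) GiB"    # ~78 GiB with boto3's default 8 MiB chunk
echo "$((10000 * 512 / 1024)) GiB"  # 5000 GiB (~4.9 TiB) with 512 MiB chunks
```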