Commit

s3hash: always stream bytes in legacy mode (#3903)
nl0 authored Feb 28, 2024
1 parent ef1161a commit dcdfb88
Showing 2 changed files with 4 additions and 3 deletions.
1 change: 1 addition & 0 deletions lambdas/s3hash/CHANGELOG.md
@@ -16,6 +16,7 @@ where verb is one of
 
 ## Changes
 
+- [Changed] Always stream bytes in legacy mode ([#3903](https://github.com/quiltdata/quilt/pull/3903))
 - [Changed] Compute chunked checksums, adhere to the spec ([#3889](https://github.com/quiltdata/quilt/pull/3889))
 - [Added] Lambda handler for file copy ([#3884](https://github.com/quiltdata/quilt/pull/3884))
 - [Changed] Compute multipart checksums ([#3402](https://github.com/quiltdata/quilt/pull/3402))
6 changes: 3 additions & 3 deletions lambdas/s3hash/src/t4_lambda_s3hash/__init__.py
@@ -383,11 +383,11 @@ async def compute_checksum(location: S3ObjectSource) -> ChecksumResult:
     if total_size == 0:
         return ChecksumResult(checksum=Checksum.empty())
 
-    if not CHUNKED_CHECKSUMS and total_size > MAX_PART_SIZE:
+    if not CHUNKED_CHECKSUMS:
         checksum = await compute_checksum_legacy(location)
         return ChecksumResult(checksum=checksum)
 
-    part_defs = get_parts_for_size(total_size) if CHUNKED_CHECKSUMS else PARTS_SINGLE
+    part_defs = get_parts_for_size(total_size)
 
     async with create_mpu(MPU_DST) as mpu:
         part_checksums = await compute_part_checksums(
@@ -397,7 +397,7 @@ async def compute_checksum(location: S3ObjectSource) -> ChecksumResult:
             part_defs,
         )
 
-    checksum = Checksum.for_parts(part_checksums) if CHUNKED_CHECKSUMS else Checksum.sha256(part_checksums[0])
+    checksum = Checksum.for_parts(part_checksums)
     return ChecksumResult(checksum=checksum)


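For context, the two strategies the diff now dispatches between can be sketched as follows: legacy mode streams the object and computes one plain SHA-256 over all its bytes, while chunked mode hashes fixed-size parts and then hashes the concatenation of the per-part digests (the S3 composite-checksum scheme that `Checksum.for_parts` implements). This is a minimal standalone sketch, not the lambda's actual API; the helper names and the 8 MiB part size are illustrative assumptions.

```python
import hashlib

PART_SIZE = 8 * 1024 * 1024  # assumed part size for illustration


def legacy_checksum(stream, chunk_size=PART_SIZE):
    """Plain SHA-256 over the whole object, streamed chunk by chunk."""
    h = hashlib.sha256()
    for chunk in iter(lambda: stream.read(chunk_size), b""):
        h.update(chunk)
    return h.hexdigest()


def chunked_checksum(stream, part_size=PART_SIZE):
    """Composite checksum: SHA-256 of the concatenated per-part SHA-256 digests."""
    part_digests = []
    for part in iter(lambda: stream.read(part_size), b""):
        part_digests.append(hashlib.sha256(part).digest())
    return hashlib.sha256(b"".join(part_digests)).hexdigest()
```

Note that the two schemes disagree even for objects that fit in a single part (the composite form hashes the digest, not the bytes), which is why legacy mode must take its own code path unconditionally rather than only above `MAX_PART_SIZE`.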
