
Uploads with the S3 asset storage plugin have a mime type of application/octet-stream in S3 #3184

Open
jawngee opened this issue Nov 5, 2024 · 2 comments
Labels
type: bug 🐛 Something isn't working

Comments


jawngee commented Nov 5, 2024

Describe the bug
Using the S3 asset storage plugin, all uploads have a mime type of application/octet-stream in S3. This happens with MinIO, Supabase Storage, and AWS.

The asset table in the database has the correct mime type, however.

I'm guessing that whatever is doing the uploading to S3 isn't setting the Content-Type of the request correctly.
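
For what it's worth, here is a rough sketch of what I mean, using a plain @aws-sdk/client-s3 upload. This is illustrative only, not the plugin's code; the bucket, key, and file names are made up:

import { S3Client, PutObjectCommand } from '@aws-sdk/client-s3';
import { readFile } from 'node:fs/promises';

// Hypothetical bucket/key/file names, for illustration only.
async function uploadWithContentType() {
    const client = new S3Client({ region: 'us-east-1' });
    await client.send(
        new PutObjectCommand({
            Bucket: 'my-bucket',
            Key: 'assets/cat.jpg',
            Body: await readFile('cat.jpg'),
            // Without this, S3 stores the object as application/octet-stream.
            ContentType: 'image/jpeg',
        }),
    );
}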

To Reproduce
Steps to reproduce the behavior:

  1. Fresh install of Vendure
  2. Configure the asset storage plugin to use S3
  3. Upload an asset
  4. View the upload in S3's file browser

Expected behavior
The files in S3 should have the correct mime type.

Environment (please complete the following information):

  • @vendure/core version: 3.0.5
  • Nodejs version: v20.10.0
  • Database (mysql/postgres etc): Supabase
jawngee added the type: bug 🐛 Something isn't working label Nov 5, 2024

jawngee commented Nov 6, 2024

This is my config for Vendure:

AssetServerPlugin.init({
    route: 'assets',
    assetUploadDir: path.join(__dirname, '../static/assets'),
    namingStrategy: new DefaultAssetNamingStrategy(),
    storageStrategyFactory: configureS3AssetStorage({
        bucket: process.env.S3_BUCKET!,
        credentials: {
            accessKeyId: process.env.S3_ACCESS_KEY!,
            secretAccessKey: process.env.S3_SECRET_KEY!,
        }, // or any other credential provider
        nativeS3Configuration: {
            region: process.env.S3_REGION!,
            endpoint: process.env.S3_URL!,
            forcePathStyle: true,
        },
    }),
    // For local dev, the correct value for assetUrlPrefix should
    // be guessed correctly, but for production it will usually need
    // to be set manually to match your production url.
    // assetUrlPrefix: IS_DEV ? undefined : 'https://www.my-shop.com/assets/',
}),

and then for MinIO in Docker:

version: '3.4'
services:
  minio:
    image: minio/minio
    container_name: minio
    ports:
      - "9000:9000"
      - "9001:9001"
    environment:
      MINIO_ROOT_USER: admin
      MINIO_ROOT_PASSWORD: password
    volumes:
      - ./minio/data:/data
    command: server /data --console-address ":9001"


jawngee commented Nov 8, 2024

In s3-asset-storage-strategy.ts, for MinIO and Supabase you need to specify the ContentType in the params object. See lines 246-254:

const upload = new Upload({
    client: this.s3Client,
    params: {
        ...this.s3Config.nativeS3UploadConfiguration,
        Bucket: this.s3Config.bucket,
        Key: fileName,
        Body: data,
    },
});

The issue is that there doesn't appear to be any way to find the mime type for the file being passed to the writeFile() function, as it's only given a filename key and a blob of data to upload.

You could use a package like mime, but that package infers the mime type from the file extension. To be more secure you'd need to use a magic-mime based package, which infers the mime type from the contents of the file but would introduce a native dependency (libmagic). I don't know how big of an issue this actually is; a rough sketch of the content-based approach is below.
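
To illustrate the content-based approach, here is a sketch assuming a libmagic binding such as mmmagic. This is just an illustration of the idea, not code from the plugin, and the detectMimeType helper is hypothetical:

import { Magic, MAGIC_MIME_TYPE } from 'mmmagic';

const magic = new Magic(MAGIC_MIME_TYPE);

// Detect the mime type from the file contents (magic bytes) rather than the extension.
function detectMimeType(data: Buffer): Promise<string> {
    return new Promise((resolve, reject) => {
        magic.detect(data, (err, result) => {
            if (err) {
                return reject(err);
            }
            resolve(Array.isArray(result) ? result[0] : result);
        });
    });
}

// Hypothetical usage inside writeFile(): this only works when `data` is a Buffer,
// so a stream body would need to be handled differently.
// const contentType = Buffer.isBuffer(data) ? await detectMimeType(data) : 'application/octet-stream';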

For the time being, for our own app, we are using the mime package:

npm install mime@3 @types/mime@3

and using a modified version of the writeFile function:

// Requires `import mime from 'mime';` at the top of s3-asset-storage-strategy.ts.
private async writeFile(fileName: string, data: PutObjectRequest['Body'] | string | Uint8Array | Buffer) {
    const { Upload } = this.libStorage;

    const upload = new Upload({
        client: this.s3Client,
        params: {
            ...this.s3Config.nativeS3UploadConfiguration,
            Bucket: this.s3Config.bucket,
            Key: fileName,
            Body: data,
            // Fall back to application/octet-stream when the extension is unknown.
            ContentType: mime.getType(fileName) ?? 'application/octet-stream',
        },
    });

    return upload.done().then(result => {
        if (!('Key' in result) || !result.Key) {
            Logger.error(`Got undefined Key for ${fileName}`, loggerCtx);
            throw new Error(`Got undefined Key for ${fileName}`);
        }

        return result.Key;
    });
}

We are now seeing the correct mime types show up, which is no longer confusing our processing pipelines!

Thanks!
