
Angular 19 PutObjectRequest readableStream.getReader is not a function #6834

Open
3 of 4 tasks
mike-appvision opened this issue Jan 20, 2025 · 6 comments
Assignees
Labels
bug This issue is a bug. investigating Issue is being investigated and/or work is in progress to resolve the issue. p2 This is a standard priority issue

Comments

@mike-appvision

mike-appvision commented Jan 20, 2025

Checkboxes for prior research

Describe the bug

Hi, I have an Angular application that I recently updated to version 19, and during this migration I also updated @aws-sdk/client-s3 to version 3.731.1.

Before the migration the following code worked fine, allowing me to upload files to S3. However, the same code now throws the exception TypeError: readableStream.getReader is not a function.

// Assumes PutObjectCommand and PutObjectRequest are imported from '@aws-sdk/client-s3'
store(filename: string, data: Blob, folder: string): Promise<string> {

  const weakThis = this;

  return new Promise<string>(async (resolve, reject) => {

    // Object key used for the upload, the log messages, and the resolved value
    const key = `${folder}/${filename}`;

    try {

      const input: PutObjectRequest = {
        ACL: 'public-read',
        Bucket: this.awsBucket,
        Key: key,
        Body: data,
      };

      const command = new PutObjectCommand(input);
      const resp = await weakThis.client.send(command);

      if (resp.$metadata.httpStatusCode < 200 || resp.$metadata.httpStatusCode > 299) {
        this.logger.error(`[AwsFileService][store] Error storing file at path ${key}`);
        this.logger.error(`[AwsFileService][store] HTTP Status ${resp.$metadata.httpStatusCode}`);
        reject(resp.$metadata.httpStatusCode);
        return;
      }
      resolve(key);

    } catch (error) {
      debugger;
      console.log(`[AwsFileService][store] Error storing file at path ${key}`);
      this.logger.error(`[AwsFileService][store] Error storing file at path ${key}`);
      console.log(`[AwsFileService][store] ${error}`);
      this.logger.error(`[AwsFileService][store] ${error}`);
      reject(error);
    }
  });
}

Interestingly, when I run the Angular application via ng serve, this code works as expected. However, when I build a production copy of the application via ng build and then run it as a standard Node.js application, the issue occurs.

Environment:
Node.js: 22.12.0
Angular: 19.1.2
TypeScript: 5.7.3
@aws-sdk/client-s3: 3.731.1

Regression Issue

  • Select this option if this issue appears to be a regression.

SDK version number

@aws-sdk/client-s3@3.731.1

Which JavaScript Runtime is this issue in?

Node.js

Details of the browser/Node.js/ReactNative version

22.12.0

Reproduction Steps

  1. Create a typical Angular 19 application
  2. Install aws-sdk/client-s3
  3. Add code above in description
  4. Test 1: run ng serve
  5. Test 2: run ng build
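
Roughly, the commands for the steps above look like this (illustrative only; the project name is a placeholder and the versions match the environment listed in the report):

  npm install -g @angular/cli@19
  ng new s3-upload-repro
  cd s3-upload-repro
  npm install @aws-sdk/client-s3@3.731.1
  # add the store() service code from the description, then:
  ng serve   # Test 1: dev server
  ng build   # Test 2: production build (serve the dist output and test the upload)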

Observed Behavior

Test 1: File uploads successfully
Test 2: Exception is thrown.

Expected Behavior

Test 1: File uploads successfully
Test 2: File uploads successfully

Possible Solution

No response

Additional Information/Context

No response

UPDATE

I am thinking the original Test 1 scenario is not valid. Since posting this I changed the version of @aws-sdk/client-s3 back to 3.32.0 (the last known working version), and now I'm able to replicate the issue regardless of how the app is built. I have since switched back to 3.731.1 and am able to replicate the issue with both ng serve and ng build.

@mike-appvision mike-appvision added bug This issue is a bug. needs-triage This issue or PR still needs to be triaged. labels Jan 20, 2025
@partheev

+1

I have run into this issue as well and reverted my package version to exactly 3.617.0 from the range ^3.617.0.

After debugging, I realized it's due to the minor version update in https://github.com/aws/aws-sdk-js-v3/releases/tag/v3.729.0.

Since this breaks the upload flow, could anyone please provide steps or a configuration to avoid the issue?
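
For anyone applying the same pin, this is roughly what the dependency entry looks like in package.json (other fields omitted); the exact version keeps npm from resolving the ^ range to 3.729.0 or later:

  {
    "dependencies": {
      "@aws-sdk/client-s3": "3.617.0"
    }
  }

An equivalent command is npm install @aws-sdk/client-s3@3.617.0 --save-exact.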

@mike-appvision
Author

mike-appvision commented Jan 20, 2025


Thank you for replying! While the latest version of the SDK is still broken, you pointing me to version 3.617.0 has allowed me to patch the issue so my clients can now upload photos. Short term fix for sure, but at least we can upload photos again!

@aBurmeseDev aBurmeseDev self-assigned this Jan 21, 2025
@wis-dev

wis-dev commented Jan 22, 2025

The latest version where it works is 3.726.1.

It starts failing with release 3.729.0.

I think it could be this commit: b6204f8

Code example putting an object to the bucket:

import { S3Client, PutObjectCommand } from '@aws-sdk/client-s3';

// file is a File/Blob selected in the browser
const bucket = new S3Client({});

const input = new PutObjectCommand({
    Bucket: 'bucketName',
    Key: 'key',
    Body: file,
    ACL: 'public-read',
    ContentType: file.type
});

await bucket.send(input); // throws: readableStream.getReader is not a function at getAwsChunkedEncodingStream

Error message:

readableStream.getReader is not a function at getAwsChunkedEncodingStream

@aBurmeseDev aBurmeseDev added investigating Issue is being investigated and/or work is in progress to resolve the issue. and removed needs-triage This issue or PR still needs to be triaged. labels Jan 22, 2025
@aBurmeseDev
Member

aBurmeseDev commented Jan 24, 2025

Hi @mike-appvision - thanks for reaching out and for your patience while we look into it.

I noticed you mentioned files were uploaded successfully on Test 1 and failed on Test 2. Could you elaborate on that?

During my attempt to replicate the issue, I was able to reproduce the error in the first few uploads and then it started to upload successfully. Here's my repro for reference: https://github.com/aBurmeseDev/aws-sdk-js-s3-angular

In the meantime, here are a few workarounds we'd like to suggest:

  • The root cause may be the recent change to default data integrity protections in S3. Although it's not recommended, as a workaround you can disable checksum calculation via client configuration (see the sketch after this list):

      const client = new S3({
        // ... other configuration
        requestChecksumCalculation: "WHEN_REQUIRED",
      });

  • You can convert the input file into one of these types: string | Uint8Array | ReadableStreamOptionalType | BlobOptionalType. (ref)

      const fileArrayBuffer = await file.arrayBuffer();  // Convert File to ArrayBuffer

      const command = new PutObjectCommand({
        Bucket: 'Bucket_Name',
        Key: file.name,
        Body: new Uint8Array(fileArrayBuffer),           // Convert ArrayBuffer to Uint8Array
        ContentType: file.type,
      });

  • You may also be able to provide a polyfill for the API being used.
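
For reference, here is a minimal sketch of the first workaround applied to a service like the one in the issue description; the region, bucket name, and credentials handling are placeholders rather than part of the original report:

  import { S3, PutObjectCommand } from '@aws-sdk/client-s3';

  // Sketch only: disabling checksum calculation avoids the code path that
  // wraps the Blob body via getAwsChunkedEncodingStream.
  const client = new S3({
    region: 'us-east-1',                          // placeholder
    requestChecksumCalculation: 'WHEN_REQUIRED',
  });

  async function store(filename: string, data: Blob, folder: string): Promise<string> {
    const key = `${folder}/${filename}`;
    await client.send(new PutObjectCommand({
      Bucket: 'my-bucket',                        // placeholder
      Key: key,
      Body: data,
      ACL: 'public-read',
    }));
    return key;
  }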

Hope it helps!
cc: @partheev @wis-dev

@aBurmeseDev aBurmeseDev added response-requested Waiting on additional info and feedback. Will move to "closing-soon" in 7 days. p2 This is a standard priority issue labels Jan 24, 2025
@trivikr
Member

trivikr commented Jan 24, 2025

This is happening with file uploads, since requestBody is a File object in the flexible checksums middleware:

updatedBody = getAwsChunkedEncodingStream(requestBody, {

The implementation expects it to be a ReadableStream:

https://github.com/smithy-lang/smithy-typescript/blob/fbe3c04b5627a8aea693b5bfc1598adbac0213d5/packages/util-stream/src/getAwsChunkedEncodingStream.browser.ts#L22
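
A related workaround that follows from this (an untested sketch, not something suggested elsewhere in the thread): since Blob.stream() returns a ReadableStream, converting the File before handing it to the SDK may give the middleware the shape it expects:

  // Assumption: the browser SDK accepts a ReadableStream Body here;
  // file is a File/Blob selected by the user, and the bucket name is a placeholder.
  const command = new PutObjectCommand({
    Bucket: 'my-bucket',
    Key: file.name,
    Body: file.stream(),   // Blob.stream() yields a ReadableStream with getReader()
    ContentType: file.type,
  });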

@trivikr
Member

trivikr commented Jan 24, 2025

As @aBurmeseDev mentioned, the simplest workaround at the time of this comment is to disable checksum computation by setting the following configuration during client creation:

  const client = new S3({
    // ... other params
    // ToDo: Remove workaround once https://github.com/aws/aws-sdk-js-v3/issues/6834 is fixed.
    requestChecksumCalculation: "WHEN_REQUIRED",
  });

If the files are not huge, you can convert them to an ArrayBuffer too.
Note that this reads all of the bytes into memory, as per the specification.

  // ...
  await client.putObject({
    Bucket: import.meta.env.VITE_AWS_S3_BUCKET_NAME,
    Key: file.name,
    Body: await file.arrayBuffer(),
  });
  // ...

I'll raise this issue internally with the team on Mon, Jan 27th, and provide an update.

@github-actions github-actions bot removed the response-requested Waiting on additional info and feedback. Will move to "closing-soon" in 7 days. label Jan 25, 2025