Asked 2 years, 8 months ago. Viewed 3k times. However, API Gateway seems to interpret the payload as a string, because I receive the following error when I test the method: Execution failed due to configuration error: Integration response of reported length is larger than allowed maximum of bytes.
Did I miss something in the configuration? Thank you for your answer. Regarding the hard limit, it doesn't seem to apply to binary pass-throughs.
Thank you Denis, I wasn't aware that the limit applies to binary uploads as well. I only need a reverse proxy to act as a public HTTPS front for my local server, which runs on a custom port, because of browser restrictions. No transformations, no rate limiting, etc. Which quick and easy-to-set-up service would you suggest for this use case? I would suggest you try out an ALB. Can you follow the steps in the following documentation and see whether you can set up an ALB with your Express server as the backend?
Then click on the Create Project button to go into the editor. Once the editor opens, you can see that a Lambda file named after your project has already been added to the project, with a basic code skeleton.
It will open the API Gateway configuration panel on the right side of the editor. For the deployment stage, we can either use the default stage or define a custom one. To define a custom stage, check the Custom Deployment Stage checkbox and then provide a valid name.
This stage name can only contain alphanumeric characters or underscores. Once all these fields have been configured correctly, click on the Inject button to set the API Gateway as the trigger for the Lambda function. You will now see that the trigger icon in front of the Lambda handler has turned green.
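For reference, the naming rule above can be expressed as a simple check (this helper is illustrative, not part of the Sigma editor):

```javascript
// Illustrative check for the stage-name rule: only letters, digits, and underscores.
const isValidStageName = (name) => /^[A-Za-z0-9_]+$/.test(name);

console.log(isValidStageName('prod_v1')); // true
console.log(isValidStageName('prod-v1')); // false (hyphens are not allowed)
```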
You can click it to make any changes to the API trigger, if required. A sample of such an event can be found below. If the request is a file upload, the file content is sent in the body field of the event. If the file content is in a text format such as CSV, the raw content is sent in that field as-is. If the file content is of a binary type such as an image, the content is sent Base64 encoded instead, and the isBase64Encoded field of the event is set to true.
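A trimmed sketch of such a proxy-integration event is shown below. The field names follow the API Gateway Lambda proxy format; the values here are made up for illustration:

```javascript
// Illustrative API Gateway proxy event for a binary (image) upload.
const sampleEvent = {
  resource: '/upload',
  httpMethod: 'POST',
  headers: { 'content-type': 'image/png' },
  // The raw file content arrives in `body`; binary payloads are Base64 encoded.
  body: 'iVBORw0KGgoAAAANSUhEUg...', // truncated for brevity
  isBase64Encoded: true
};
```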
Therefore, our next step is to extract this file content from the event object. Since we need to support both binary and text-formatted files, we have to identify Base64 encoded payloads and decode them accordingly. Our next task is to generate a name for the file to be stored in S3. For the file name part, we are going to use the current epoch timestamp in milliseconds, which makes it easy to identify a file uploaded at a given time. We can implement this functionality as follows.
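A minimal sketch of that extraction and naming logic (the helper names are mine, not from the generated code):

```javascript
// Decode the payload: binary uploads arrive Base64 encoded
// (isBase64Encoded === true); text uploads arrive as-is.
function extractFileContent(event) {
  return event.isBase64Encoded
    ? Buffer.from(event.body, 'base64')
    : Buffer.from(event.body || '', 'utf8');
}

// Build a file name from the current epoch timestamp in milliseconds.
function generateFileName(extension) {
  return `${Date.now()}.${extension}`;
}

const content = extractFileContent({ body: 'aGVsbG8=', isBase64Encoded: true });
console.log(content.toString('utf8')); // "hello"
```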
The trickier task is determining the extension of the file. One option is to ask the uploader to include an HTTP header with the original file name, so that we can extract the extension from that name. The other option is to determine the file extension based on the content-type header of the HTTP request.
For that approach, we can use an NPM library such as mime-types instead of implementing the lookup from scratch. To add it, click on the Add Dependencies button on the toolbar and search for the library. Once you find it in the search results, click the Add button to add it as a project dependency. The final task of our implementation is to define an S3 bucket and implement the code to store the uploaded files in that bucket.
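With mime-types, the lookup is essentially one call to `mime.extension()`. The sketch below falls back to a tiny hand-written table when the library isn't installed, so the logic can be tried standalone; the fallback entries are illustrative only:

```javascript
// Map the request's content-type header to a file extension.
let extensionFor;
try {
  const mime = require('mime-types'); // the dependency added through Sigma
  extensionFor = (contentType) => mime.extension(contentType) || 'bin';
} catch (e) {
  // Minimal fallback table for illustration when mime-types is unavailable.
  const table = { 'text/csv': 'csv', 'image/png': 'png', 'image/jpeg': 'jpeg' };
  extensionFor = (contentType) => table[contentType] || 'bin';
}

console.log(extensionFor('text/csv')); // "csv"
```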
For that, drag-n-drop the S3 block from the resources panel onto the line in the code editor where the upload snippet should go. Since we are going to create a new bucket to store the uploaded files, go to the New Bucket tab, provide a name for the bucket, and select Put Object as the operation. Then, for the content of the object field, provide the fileContent variable we defined earlier.
Once these values are set, click on the Inject button to generate the code snippet and inject it into the editor. In the successful file upload scenario, we are going to print a log line and then return the message "Successfully uploaded". The user who uploaded the file will then receive an HTTP response with this message as the body.
In the failure scenario, we are going to log the error message and then throw the error, so that the user receives an HTTP response indicating an error. Now that our project implementation is complete, we can save it to a Git repository and deploy it to our AWS account. To save the project, click on the Save button on the toolbar.
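Putting the pieces together, the upload step and its success/failure handling could look roughly like this. The bucket name and function name are placeholders, and the S3 client is passed in as a parameter so the flow can be followed without AWS credentials; in the real Lambda, `s3` would be an `AWS.S3` client from the aws-sdk, as produced by Sigma's injected snippet:

```javascript
// Sketch of the upload step with the success and failure paths described above.
async function uploadFile(s3, bucket, fileName, fileContent) {
  try {
    // putObject with Bucket/Key/Body is the standard aws-sdk v2 call shape.
    await s3.putObject({ Bucket: bucket, Key: fileName, Body: fileContent }).promise();
    console.log('Uploaded', fileName, 'to', bucket);
    return 'Successfully uploaded'; // becomes the HTTP response body
  } catch (err) {
    console.error('Upload failed:', err);
    throw err; // surfaces to the caller as an HTTP error response
  }
}
```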