S3 bucket upload task exceeds the 5GB Limit

View some of the frequently asked questions answered by our support staff. Included are some tips and tricks, making this forum ideal for users getting started with GoAnywhere MFT. Note: Users can reply to existing topics, but only our support staff can add new topics to this forum.
1 post Page 1 of 1


Posts: 7
Joined: Thu Feb 16, 2017 11:03 am

Post by Support_Andy » Thu Mar 07, 2019 9:14 am
S3 bucket upload task exceeds the 5GB Limit

Possible Error in Log:
Your proposed upload exceeds the maximum allowed size (Service: Amazon S3; Status Code: 400; Error Code: EntityTooLarge; Request ID: XXX). Full stack trace written to '#####_error_1.log'

The AWS maximum object size is 5TB, but a single PUT request is limited to 5GB. https://aws.amazon.com/blogs/aws/amazon ... ize-limit/


The S3 Task's Upload action performs a direct PUT of the entire file in a single request. The main purpose of the S3 Task is to work with an object's metadata. Although GoAnywhere supports uploading files from this task (PUT), it does not use multipart upload, which is required to support larger files.


GoAnywhere does support multipart upload to S3 buckets when using the resource:s3:// syntax (File System - Copy). The project should use the Copy Task with the destination set to the resource:s3:// syntax (or use the file chooser "..." > Resource Links > Amazon S3). This takes advantage of multipart upload and allows files over 5GB to be uploaded to S3.
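For background on why the Copy Task works where the S3 Task fails, here is a minimal sketch (outside GoAnywhere, in plain Python; the 100MiB part size is an illustrative assumption, not GoAnywhere's actual setting) of how multipart upload splits a large file into parts that each stay well under the single-PUT ceiling:

```python
# Illustration only: why a >5GB file fails a single PUT but
# succeeds as a multipart upload. Sizes use binary units (GiB/TiB).
SINGLE_PUT_LIMIT = 5 * 1024**3   # 5 GiB: max size for one PUT request
MAX_OBJECT_SIZE = 5 * 1024**4    # 5 TiB: max S3 object size overall

def plan_parts(size_bytes, part_size=100 * 1024**2):
    """Return how many parts a multipart upload would need.

    part_size of 100 MiB is an assumed example value.
    """
    if size_bytes > MAX_OBJECT_SIZE:
        raise ValueError("object exceeds the 5 TiB S3 limit")
    return -(-size_bytes // part_size)   # ceiling division

seven_gib = 7 * 1024**3
# A direct PUT of this file would be rejected (EntityTooLarge):
print(seven_gib > SINGLE_PUT_LIMIT)   # True
# But as a multipart upload it is 72 parts of 100 MiB each,
# every one far below the single-request limit:
print(plan_parts(seven_gib))          # 72
```

Outside GoAnywhere, SDKs handle this the same way; for example, boto3's transfer manager switches to multipart automatically once a file crosses a configured threshold.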