Data Sync to AWS S3 - Overview and Best Practices

Published Dec 12, 2017

Many developers and IT teams need somewhere to store data securely in a highly available, scalable, and low-cost system.

Cloud storage has become increasingly popular as a storage choice for data-driven companies, with AWS at the forefront of cloud computing services. Cloud storage is cheaper and more convenient than building on-premise systems, and Amazon S3 is one of the most widely used AWS cloud storage options, allowing concurrent access to data by many separate clients or application threads.

Below you'll find out why you'd want to work on AWS S3 to sync and upload data, and discover some best practices for doing so.

AWS S3 Sync and Upload: Use Cases

Below are some compelling reasons why you'd want to upload or sync data to Amazon S3.

  • Organizations can migrate PDFs, financial documents, employee data, and project data to S3, removing the need for on-premise storage.
  • You can host entire static websites on S3, serving HTML files, images, videos, and client-side scripts such as JavaScript.
  • You can use Amazon S3 to store and distribute static web content and media directly from S3 since each stored object has its own unique HTTP URL.
  • You can use the AWS Command Line Interface to write automated scripts that continuously migrate files such as logs and backup data from an application server to S3.
  • Users can sync data to S3 for big data analytics and then sync the results of these analyses back to on-premise systems.

AWS S3 Sync and Upload Methods

There are several methods for getting data from on-premise or other cloud storage systems into S3 buckets and vice versa, and which one you choose depends on how much data needs uploading or syncing. Note that buckets are simply units of storage for objects in S3, and each bucket can hold a virtually unlimited amount of data.

For smaller volumes of data (on the order of a petabyte or less), one of the following three options will work:

1. The AWS Command Line Interface

You can set up an IAM user with administrative access, install the AWS CLI on your machine, and, finally, create, copy, retrieve, and delete files in the cloud using commands in the CLI. Once you get to grips with this method, you can write your own scripts for backing up files to the cloud and retrieving them from the cloud.
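The workflow above can be sketched with a few CLI commands. This is a sketch that assumes the AWS CLI is installed and configured with valid credentials; the bucket name my-bucket and the file paths are hypothetical examples:

```shell
# Create a bucket to hold the files
aws s3 mb s3://my-bucket

# Copy a single file into the bucket
aws s3 cp backup.tar.gz s3://my-bucket/backups/

# Sync an entire local directory; only new or changed files are transferred
aws s3 sync /var/log/myapp s3://my-bucket/logs/

# Retrieve an object from the bucket, then delete it
aws s3 cp s3://my-bucket/backups/backup.tar.gz .
aws s3 rm s3://my-bucket/backups/backup.tar.gz
```

Because `aws s3 sync` transfers only files that differ from what is already in the bucket, it is well suited to the kind of scheduled backup scripts described above.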

2. AWS Import/Export

The Import/Export option speeds up transferring data between Amazon S3 and your portable storage devices. It is suitable for 16 terabytes or less of data, and it involves mailing your portable storage device directly to AWS for uploading or syncing. Import/Export is ideal for off-site backups and disaster recovery.

3. AWS Storage Gateway

Storage Gateway provides AWS cloud connectivity from on-premise servers. The gateway presents itself as a local disk: it transfers your data to S3 for backup while keeping a copy stored locally, making this a hybrid cloud storage option.

If you need to upload data at petabyte or even exabyte scale to S3, you'll need alternative options, both of which AWS provides.

  • Snowball is a petabyte-scale data transport solution that overcomes the usual barriers to large-scale data transfers, namely high network costs, long waiting times, and serious security concerns. Transfer over Snowball bypasses the Internet: AWS sends you a Snowball appliance, which you connect to your local network and load with data before returning it to AWS.
  • Snowmobile handles data movement at a different scale: the service ships exabytes of data using trucks, each equipped with up to 100 petabytes of storage capacity. You copy the data directly from your systems onto each Snowmobile, letting you back up or migrate an entire data center to S3.

Upload a Directory to S3 with PHP V2

You can also sync with AWS S3 using one of the available SDKs. The AWS SDK for PHP lets you call AWS services from PHP code, and similar SDKs are available for several other programming languages. This short overview shows you how to upload an entire directory of files to an Amazon S3 bucket using the AWS SDK for PHP V2.

To begin, create a client object which uses your AWS access key ID and your secret access key. You must have these security credentials to make programmatic calls to AWS API operations:

use Aws\S3\S3Client;

$client = S3Client::factory(array(
    'key'    => 'your-aws-access-key-id',
    'secret' => 'your-aws-secret-access-key',
));

To upload the contents of a directory, simply call the uploadDirectory() method with the local file path and the name of your S3 bucket, like so:

$client->uploadDirectory('/local/directory', 'my_bucket');

Code samples are referenced from the AWS blog.

The above method compares the contents of the local directory and S3, and only transfers files that have changed. You can also download from S3 to your local disk using the downloadBucket() method.
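For the reverse direction, downloadBucket() takes its arguments in the same order. This is a sketch reusing the $client object created above, with the same placeholder path and bucket name:

```php
// Download the bucket's contents to the local directory; as with
// uploadDirectory(), only objects that have changed are transferred
$client->downloadBucket('/local/directory', 'my_bucket');
```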

Closing Thoughts

There are many use cases for which you'll want to perform an AWS S3 sync or upload if your company chooses this cloud storage service for its data needs. The precise methods for uploading and syncing with S3 will vary depending on the volume of data transferred to and from the service.

Upload methods range from using the AWS CLI or SDKs to transfer lower volumes of data, to solutions such as Snowball and Snowmobile for large-scale transfers to S3.
As a developer, the AWS SDKs make it easy for you to interact with Amazon S3 and other AWS services in a language you are comfortable using.

Discover and read more posts from Limor Wainstein