Mastering the AWS SDK for PHP (TLS306) | AWS re:Invent 2013


The AWS SDK for PHP allows PHP developers to interact with AWS services in a fluid and familiar way. Learn how to use convenience features, like iterators and waiters, as well as high-level abstractions, such as the Amazon Simple Storage Service (Amazon S3) stream wrapper. We also demonstrate the powerful capabilities inherited from the underlying Guzzle library.

Transcript of Mastering the AWS SDK for PHP (TLS306) | AWS re:Invent 2013

© 2013 Amazon.com, Inc. and its affiliates. All rights reserved. May not be copied, modified, or distributed in whole or in part without the express consent of Amazon.com, Inc.

Mastering the AWS SDK for PHP

Michael Dowling & Jeremy Lindblom, AWS Developer Resources

November 15, 2013

PART 1: Using the AWS SDK for PHP by working with Amazon S3

PART 2: High-level features for common use cases with the AWS SDK for PHP

AWS SDK for PHP

https://github.com/aws/aws-sdk-php

@awsforphp

Follow Us!

AWS SDK for PHP

• Suite of HTTP clients for AWS services

• Built on top of Guzzle (guzzlephp.org)
  – Persistent connections, parallel requests
  – Event hooks, plugins, wire logging

• Helpful functionality
  – Easy pagination
  – "Waiters"
  – Automatic retries
  – etc.

AWS SDK for PHP – Installing

• Composer / Packagist

• Phar (see the sketch after this list)

• Zip

• PEAR

• Yum (Amazon Linux)
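The next slide walks through the Composer route in detail. The Phar and zip downloads are self-contained alternatives: each bundles the SDK together with an autoloader you simply require. A minimal sketch, with placeholder paths for wherever you put the downloaded files:

<?php
// The phar registers the SDK's autoloader when it is required.
require '/path/to/aws.phar';

// If you extracted the zip download instead, require its bundled autoloader:
// require '/path/to/aws-sdk-php/aws-autoloader.php';

use Aws\S3\S3Client;

$s3 = S3Client::factory();  // credentials and region configured as on later slides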

AWS SDK for PHP – Installing via Composer

In your composer.json:

{
    "require": {
        "aws/aws-sdk-php": "~2.4"
    }
}

From your terminal:

> composer.phar install

http://getcomposer.org

PART 1: Using the AWS SDK for PHP

Using the AWS SDK for PHP

CreateBucket

PutObject

ListObjects

Amazon S3 operations

Amazon Simple Storage Service (Amazon S3)

Scalable storage in the cloud

http://my-stuff.s3.amazonaws.com/photos/kittens.jpg

("my-stuff" is the bucket name; "photos/kittens.jpg" is the object key)

Using the AWS SDK for PHP

CreateBucket

PutObject

ListObjects

Creating a Bucket – Complete Example

<?php
require 'vendor/autoload.php';

use Aws\Common\Aws;
use Aws\S3\Exception\S3Exception;

$aws = Aws::factory('config.php');
$s3  = $aws->get('s3');

try {
    $result = $s3->createBucket([
        'Bucket' => 'bucket-name',
    ]);
    $s3->waitUntilBucketExists([
        'Bucket' => 'bucket-name',
    ]);
} catch (S3Exception $e) {
    echo $e->getMessage();
}

Two Ways to Instantiate a Service Client

1. Client factory method

2. Service builder (Aws\Common\Aws)

Instantiate a Client via its Factory Method

use Aws\S3\S3Client;

$s3 = S3Client::factory([
    'key'    => '{your-aws-access-key-id}',
    'secret' => '{your-aws-secret-key}',
    // 'region' => 'us-east-1',
]);

Instantiate a Client via the Service Builder

<?php
require 'vendor/autoload.php';

use Aws\Common\Aws;

$aws = Aws::factory('/path/to/config.php');

$s3      = $aws->get('s3');
$s3again = $aws->get('s3');

var_dump($s3 === $s3again); #> bool(true)
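The service builder reads its settings from the file passed to Aws::factory(). That file isn't shown on the slides; a minimal config.php in the shape the SDK 2 user guide describes looks roughly like this (the key, secret, and region values are placeholders):

<?php
// config.php – consumed by Aws\Common\Aws::factory()
return array(
    // Pull in the SDK's bundled default service definitions
    'includes' => array('_aws'),
    'services' => array(
        // Settings applied to every client the builder creates
        'default_settings' => array(
            'params' => array(
                'key'    => 'your-aws-access-key-id',
                'secret' => 'your-aws-secret-key',
                'region' => 'us-east-1',
            ),
        ),
    ),
);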

Providing Credentials

• Options array ['key' => '?', 'secret' => '?']

• Configuration file

• Environment credentials
  – AWS_ACCESS_KEY_ID
  – AWS_SECRET_KEY

• IAM instance profile credentials

See http://blogs.aws.amazon.com/php
("Providing credentials to the AWS SDK for PHP")
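In practice, the last two options keep credentials out of your code entirely: if you omit key and secret, the SDK falls back to the environment variables listed above and then, on Amazon EC2, to IAM instance profile credentials. A minimal sketch, assuming the code runs on an instance with an appropriate IAM role attached:

use Aws\S3\S3Client;

// No 'key'/'secret' supplied: the SDK resolves credentials from the
// environment variables above, or from the EC2 instance metadata service
// when an IAM role is attached to the instance.
$s3 = S3Client::factory([
    'region' => 'us-east-1',
]);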

Executing a Command

try {
    $result = $s3->createBucket([
        'Bucket' => $bucket,
    ]);
} catch (Aws\S3\Exception\S3Exception $e) {
    echo $e->getMessage();
}

Executing a Command – API Docs

Handling a Response – Modeled Responses

$result = $s3->listBuckets();

$result['Buckets'];                   // Result models implement ArrayAccess
print_r($result->toArray());          // Convert the full result to a plain array
echo $result;
$result->getPath('Buckets/0/Name');   // Pull out a single value by path
$result->getPath('Buckets/*/Name');   // Wildcards return all matching values

Handling a Response – API Docs

Wire Logging – Debug Easily

<?php
require 'vendor/autoload.php';

use Aws\Common\Aws;
use Aws\S3\Exception\S3Exception;
use Guzzle\Plugin\Log\LogPlugin;

$aws = Aws::factory('config.php');
$s3  = $aws->get('s3');

$log = LogPlugin::getDebugPlugin();
$s3->addSubscriber($log);

try {
    $result = $s3->createBucket([
        'Bucket' => 'bucket-name',
    ]);
    $s3->waitUntilBucketExists([
        'Bucket' => 'bucket-name',
    ]);
} catch (S3Exception $e) {
    echo $e->getMessage();
}

The three lines that add the wire logging:

use Guzzle\Plugin\Log\LogPlugin;

$log = LogPlugin::getDebugPlugin();
$s3->addSubscriber($log);

Wire Logging – Output

# Request:
PUT / HTTP/1.1
Host: this-is-my-test-wonderful-bucket.s3.amazonaws.com
User-Agent: aws-sdk-php2/2.4.6 Guzzle/3.7.3 curl/7.25.0 PHP/5.3.27
Date: Wed, 25 Sep 2013 23:03:13 +0000
Authorization: AWS AKIAEXAMPLEEXAMPLE:Xn2lJPREXAMPLEQkPmY0IAUng=
Content-Length: 0

# Response:
HTTP/1.1 200 OK
x-amz-id-2: eNUZEXAMPLEM4U7c/4WPIlshkfVUEXAMPLEUkyKirVgjEXAMPLEsfwZ3Mx
x-amz-request-id: C91E6E8CD19680F7
Date: Wed, 25 Sep 2013 23:03:15 GMT
Location: /this-is-my-test-wonderful-bucket

Plugins

• Plugin architecture from Guzzle

• Uses the Symfony Event Dispatcher

• Many plugins included with Guzzle (Log, Backoff, History, Mock, HTTP Cache, etc.)
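For example, the History plugin records every request and response a client exchanges, which is handy for debugging and tests. A minimal sketch, assuming Guzzle 3's bundled HistoryPlugin:

use Guzzle\Plugin\History\HistoryPlugin;

// Record each request/response pair the client sends
$history = new HistoryPlugin();
$s3->addSubscriber($history);

$s3->listBuckets();

// Inspect what actually went over the wire
echo $history->getLastRequest();
echo count($history) . " request(s) recorded\n";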

Wait for it…

<?php
require 'vendor/autoload.php';

use Aws\Common\Aws;
use Aws\S3\Exception\S3Exception;

$aws = Aws::factory('config.php');
$s3  = $aws->get('s3');

try {
    $result = $s3->createBucket([
        'Bucket' => 'bucket-name',
    ]);

    // The waiter: block until the new bucket actually exists
    $s3->waitUntilBucketExists([
        'Bucket' => 'bucket-name',
    ]);
} catch (S3Exception $e) {
    echo $e->getMessage();
}

Waiters

• Poll resources until available

• Handle asynchronous and eventually consistent operations more easily

$result = $s3->createBucket([ 'Bucket' => $bucket ]);
$s3->waitUntilBucketExists([ 'Bucket' => $bucket ]);
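Waiters cover other S3 resources too; for example, an object waiter can pause a script until a freshly uploaded object is readable. A minimal sketch, assuming the ObjectExists waiter that ships with the S3 client:

$s3->putObject([
    'Bucket' => $bucket,
    'Key'    => 'reports/today.csv',
    'Body'   => fopen('/tmp/today.csv', 'r'),
]);

// Block until the new object is visible to subsequent reads
$s3->waitUntilObjectExists([
    'Bucket' => $bucket,
    'Key'    => 'reports/today.csv',
]);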

Using the AWS SDK for PHP

CreateBucket

PutObject

ListObjects

Uploading an Object to Amazon S3

$result = $s3->putObject([
    'Bucket' => 'my-cool-photos',
    'Key'    => 'photos/photo01.jpg',
    'Body'   => fopen('/path/to/photo01.jpg', 'r'),
    'ACL'    => 'public-read',
]);

$url = $result->get('ObjectURL');

Command Syntax – Short vs. Long Form

// Short form
$result = $s3->putObject([
    // ...
]);

// Long form
$command = $s3->getCommand('PutObject', [
    // ...
]);
$result = $command->getResult();
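The long form hands you a Guzzle command object, which is useful when you need more than the parsed result. A minimal sketch, assuming Guzzle 3's command accessors for the underlying HTTP messages:

$command = $s3->getCommand('PutObject', [
    'Bucket' => 'my-bucket-name',
    'Key'    => 'my-key',
    'Body'   => 'hello',
]);

$result = $command->getResult();

// The command also exposes the raw HTTP exchange
$request  = $command->getRequest();   // the HTTP request that was sent
$response = $command->getResponse();  // the HTTP response that came back
echo $response->getStatusCode();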

Executing Commands in Parallel

$command1 = $s3->getCommand('PutObject', [
    'Bucket' => 'my-bucket-name',
    'Key'    => 'my-first-key',
    'Body'   => fopen('path/to/file1.ext', 'r'),
]);

$command2 = $s3->getCommand('PutObject', [
    'Bucket' => 'my-bucket-name',
    'Key'    => 'my-second-key',
    'Body'   => fopen('path/to/file2.ext', 'r'),
]);

$s3->execute([$command1, $command2]);
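When commands are transferred in parallel, failures arrive together rather than one at a time. A minimal sketch of collecting results and failures, assuming Guzzle 3's CommandTransferException:

use Guzzle\Service\Exception\CommandTransferException;

try {
    $s3->execute([$command1, $command2]);

    // Each command now holds its own result
    foreach ([$command1, $command2] as $command) {
        $result = $command->getResult();
        echo $result['ObjectURL'] . "\n";
    }
} catch (CommandTransferException $e) {
    // Some commands may have succeeded even though others failed
    echo count($e->getSuccessfulCommands()) . " succeeded, "
       . count($e->getFailedCommands()) . " failed\n";
}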

Using the AWS SDK for PHP

CreateBucket

PutObject

ListObjects

Listing Objects in Your Bucket

$result = $s3->listObjects([
    'Bucket' => 'my-bucket-name',
]);

$nextResult = $s3->listObjects([
    'Bucket' => 'my-bucket-name',
    'Marker' => 'photos/photo-1000.jpg',
]);

Iterators – Enumerate Your Data Easily

• Iterate through entire result sets

• No manual handling of markers or tokens

• Lazily loads results

• Compatible with SPL iterators

Iterators – Enumerating Your Data Easily

$objects = $s3->getListObjectsIterator([
    'Bucket' => 'my-bucket-name',
]);

foreach ($objects as $object) {
    echo $object['Key'] . "\n";
}

SDK 1.x – A Time Before Iterators

$sk = null;
$people = array();

do {
    $params = array('TableName' => 'people');
    if ($sk) {
        $params['ExclusiveStartKey'] = array(
            'HashKeyElement' => array(
                'S' => $sk
            )
        );
        $sk = null;
    }

    $r = $dynamo_db->scan($params);

    if ($r->isOK()) {
        foreach ($r->body->Items as $item) {
            echo (string) $item->name->S;
        }
        if ($lk = $r->body->LastEvaluatedKey) {
            $sk = (string) $lk->HashKeyElement->S;
        }
    } else {
        throw new DynamoDB_Exception('...');
    }
} while ($sk);


SDK 2.x – The Era of Iterators

$scan = $db->getScanIterator([
    'TableName'       => 'People',
    'AttributesToGet' => ['Id', 'Name'],
]);

foreach ($scan as $person) {
    echo $person['Name']['S'];
}

Advanced Iterator Usage – Example

$objects = $s3->getListObjectsIterator([
    'Bucket' => 'my-bucket-name',
]);

// Wrap the SDK iterator in an SPL iterator to take only the first 5 objects
$objects = new LimitIterator($objects, 0, 5);

foreach ($objects as $object) {
    echo $object['Key'] . "\n";
}
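Because the SDK's iterators are ordinary SPL iterators, they compose with any of PHP's iterator decorators, not just LimitIterator. A minimal sketch using CallbackFilterIterator (PHP 5.4+) to keep only .jpg keys:

$objects = $s3->getListObjectsIterator([
    'Bucket' => 'my-bucket-name',
]);

// Keep only objects whose key ends in ".jpg"
$jpegs = new CallbackFilterIterator($objects, function ($object) {
    return substr($object['Key'], -4) === '.jpg';
});

foreach ($jpegs as $object) {
    echo $object['Key'] . "\n";
}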

Using the AWS SDK for PHP

CreateBucket

PutObject

ListObjects

PART 2: High-level Abstractions in the AWS SDK for PHP

"I want to use Amazon S3 like a local filesystem."

Amazon S3 Stream Wrapper

PHP Stream Wrappers

• Provide a consistent I/O interface for various protocols
  – fopen(), fread(), file_get_contents(), mkdir(), etc.
  – file://, http://, ftp://, php://, …

PHP Stream Wrapper Protocols

echo file_get_contents('/path/to/file');

Is equivalent to…

echo file_get_contents('file:///path/to/file');

file:// is the protocol

PHP Stream Wrapper Protocols

echo file_get_contents('http://www.amazon.com');

http:// is the protocol

Amazon S3 Stream Wrapper

// First, register the wrapper
$s3 = Aws\S3\S3Client::factory();
$s3->registerStreamWrapper();

echo file_get_contents('s3://bucket_name/key');

file_put_contents('s3://bucket_name/key', 'hi!');

The URL breaks down as: s3:// (protocol) + bucket_name (bucket) + /key (object key)

Pseudo-directory

• Amazon S3 knows about buckets and keys

• There aren't real subdirectories

• Pseudo-directories can be created using a key prefix

Bucket: foo
Key:    baz/bar/bam
URL:    http://foo.s3.amazonaws.com/baz/bar/bam

Reading bytes off of a stream

$fp = fopen('s3://bucket_name/key', 'r');

while (!feof($fp)) {
    echo fread($fp, 1024);
}

fclose($fp);

Bytes are read off the stream on demand, instead of reading all bytes up front and then using them.
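Writing works through the same wrapper: opening an s3:// path in write mode streams data up to the object when the handle is flushed or closed. A minimal sketch (bucket_name/key is a placeholder, as above):

$fp = fopen('s3://bucket_name/key', 'w');

fwrite($fp, "line one\n");
fwrite($fp, "line two\n");

// The buffered data is sent to Amazon S3 when the stream is closed
fclose($fp);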

Using Stream Filters

$in  = fopen('s3://bucket_name/key.zip', 'r');
$out = fopen('/path/to/file', 'w');

stream_filter_append($out, 'zlib.inflate');
stream_copy_to_stream($in, $out);

fclose($in);
fclose($out);

Filters are used for conversion, compression, encryption, etc.

Listing Objects and Buckets

$dir = "s3://bucket_name/";

if (is_dir($dir) && ($dh = opendir($dir))) {

while (($file = readdir($dh)) !== false) {

echo "Name: {$file}\n";

}

closedir($dh);

}

Listing Objects and Buckets

$dir = 's3://bucket_name';

$iterator = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator($dir),
    RecursiveIteratorIterator::CHILD_FIRST
);

foreach ($iterator as $file) {
    echo $file->getType() . ': ' . $file . "\n";
}

The more intuitive recursive iterator iterator iterator iterator …

Listing Objects and Buckets

file: s3://bucket/aws_logo.png
file: s3://bucket/background.gif
file: s3://bucket/channel.xml
file: s3://bucket/data.txt
file: s3://bucket/destruct
dir:  s3://bucket/dirupload
file: s3://bucket/dirupload/aws-autoloader.php
file: s3://bucket/dirupload/aws.zip
file: s3://bucket/dirupload/guzzle/guzzle/.gitignore
file: s3://bucket/dirupload/guzzle/guzzle/.travis.yml
file: s3://bucket/dirupload/guzzle/guzzle/CHANGELOG.md
file: s3://bucket/dirupload/guzzle/guzzle/LICENSE

PHP Stream Wrappers: Summary

• Access Amazon S3 like a local filesystem

• Read bytes off of a stream on demand

• Use familiar methods like file_get_contents

• Access to PHP's stream filters

"I need to upload a local directory to

an Amazon S3 bucket"

For static blogs, websites, etc.

Local filesystem:

.
├── images
│   ├── dotted-border.png
│   ├── email.png
│   ├── noise.png
│   ├── rss.png
│   └── search.png
├── index.html

Amazon S3 bucket (the same structure after the upload):

.
├── images
│   ├── dotted-border.png
│   ├── email.png
│   ├── noise.png
│   ├── rss.png
│   └── search.png
├── index.html

uploadDirectory()

$client->uploadDirectory('/local/directory', 'bucket');

1. Recursively iterates over all local files
2. Checks whether each file is missing from the bucket or has changed
3. Uploads matching files (single or multipart)

public function uploadDirectory(
    $directory,
    $bucket,
    $keyPrefix = null,
    array $options = array()
)

Key Prefix?

https://bucket.s3.amazonaws.com/blogs/index.html

$client->uploadDirectory(
    '/local/dir',
    'bucket',
    'blogs/' // Key prefix
);

uploadDirectory() Options

$client->uploadDirectory('/local/dir', 'bucket', '', [
    'params'      => ['ACL' => 'public-read'],
    'concurrency' => 20,
    'debug'       => true,
]);

Debug output:

Uploading /local/dir/images/email.png -> images/email.png (8 bytes)
Uploading /local/dir/images/noise.png -> images/noise.png (12 bytes)
Uploading /local/dir/images/rss.png -> images/rss.png (100 bytes)
Uploading /local/dir/images/search.png -> images/search.png (200 bytes)
Uploading /local/dir/index.html -> index.html (1024 bytes)

"I need to download an Amazon S3 bucket to

my local filesystem"

downloadBucket()

$client->downloadBucket('/local/directory', 'bucket');

Works exactly like uploadDirectory()

public function downloadBucket(
    $directory,
    $bucket,
    $keyPrefix = null,
    array $options = array()
)
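Since the signature mirrors uploadDirectory(), the key prefix parameter works the same way. A minimal sketch that pulls down only the objects under a blogs/ pseudo-directory (the paths and prefix are placeholders):

// Mirror only the keys under "blogs/" into a local directory
$client->downloadBucket(
    '/local/backup',  // local destination directory
    'bucket',         // source bucket
    'blogs/'          // key prefix to restrict the download
);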

Sync Amazon S3 -> Amazon S3

$client->uploadDirectory(
    's3://bucket_src',
    'bucket_dest'
);

Upload the contents of one Amazon S3 bucket to another using uploadDirectory().

"I need to use the SDK with my framework."

Third-party Integrations

• Laravel 4 Service Provider

• Zend Framework 2 Module

• Silex Service Provider

https://github.com/aws

If you'd like to help us or contribute to another framework integration, please come talk to us.
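As an illustration of what these integrations look like, here is a rough sketch of the Laravel 4 route using the aws/aws-sdk-php-laravel package. The provider class, facade alias, and config keys shown here are assumptions drawn from that package's README rather than anything covered in this talk:

// composer.json of your Laravel 4 application:
//   "require": { "aws/aws-sdk-php-laravel": "~1.0" }

// app/config/app.php (assumed class names from the package README):
//   'providers' => [ ..., 'Aws\Laravel\AwsServiceProvider' ],
//   'aliases'   => [ ..., 'AWS' => 'Aws\Laravel\AwsFacade' ],

// Anywhere in your application code:
$s3 = AWS::get('s3');

$s3->putObject([
    'Bucket' => 'my-bucket-name',
    'Key'    => 'hello.txt',
    'Body'   => 'Hello from Laravel!',
]);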

Mastering the AWS SDK for PHP

Learned how to use the AWS SDK for PHP by working with some basic Amazon S3 operations.

Learned how to speed up some common development use cases by using some of the SDK's high-level abstractions.

What you will do next

• Star ★ the AWS SDK for PHP GitHub repo https://github.com/aws/aws-sdk-php

• composer install the SDK into your project https://packagist.org/packages/aws/aws-sdk-php

• Subscribe to the AWS PHP Development Blog http://blogs.aws.amazon.com/php

• Build cool PHP apps on the AWS cloud!

Please give us your feedback on this presentation.

As a thank you, we will select prize winners daily for completed surveys!

TLS306

Mastering the AWS SDK for PHP

Follow us: @awsforphp