Big Data

Article: Big Data Year In Review

A good review article about Big Data in 2012


What’s new in Linux Kernel 3.7

Linux Kernel 3.7

Version 3.7 of the Linux kernel was released just a few days ago (11 December). This version brings a lot of interesting features; I will mention only the unified support for the ARM architecture, TCP Fast Open, and the VXLAN (Virtual eXtensible LAN) tunneling protocol. For an extended list of the new features, you can check the list here.


How to use AWS SDK for PHP 2 and Symfony Console

A few days ago I wanted to write a small PHP script to back up (upload) some files to Amazon S3. And because Amazon released a new version of their SDK last month, AWS SDK for PHP 2, I decided to use this one.
I also chose to use the Console component from Symfony2, and Composer to install all the dependencies.

In the root folder of my project, I downloaded and installed Composer, like this:

curl -s "" | php

In the composer.json file I added these dependencies:

{
    "require": {
        "aws/aws-sdk-php": "*",
        "monolog/monolog": "1.0.*",
        "symfony/class-loader": "2.1.3",
        "symfony/console": "2.1.3",
        "symfony/yaml": "2.1.3",
        "doctrine/common": "2.3.0",
        "symfony/finder": "2.1.4"
    },
    "autoload": {
        "psr-0": {
            "Razvan": "src/"
        }
    }
}

and after this I ran:

php composer.phar install

After running this command, you will see in the project folder a directory called vendor, where all the dependencies are installed.
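Composer also generates vendor/autoload.php, which maps our "Razvan" PSR-0 prefix from composer.json to files under src/. As a rough illustration of what that class-to-path mapping does (a simplified sketch for this post, not Composer's actual autoloader code):

[php]<?php

// Simplified sketch of a PSR-0 class-to-path mapping, similar in spirit to
// what Composer's generated autoloader does for our "Razvan": "src/" entry.
// (Full PSR-0 also maps underscores in the class name; omitted here for brevity.)
function psr0Path($className, $baseDir = 'src/')
{
    // Namespace separators become directory separators, ".php" is appended.
    $path = str_replace('\\', DIRECTORY_SEPARATOR, $className);

    return $baseDir . $path . '.php';
}

echo psr0Path('Razvan\\BackupCommand'), "\n"; // src/Razvan/BackupCommand.php (on Linux)[/php]

This is why the class we write next has to live at src/Razvan/BackupCommand.php: the autoloader derives the file path directly from the namespace and class name.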

After that, on the same level as vendor, I created a folder called src, and inside it another folder called Razvan.

[bash]mkdir -p src/Razvan[/bash]

Inside the Razvan folder I put my BackupCommand.php.

Below is a part of the code:


[php]<?php

namespace Razvan;

use Symfony\Component\Console\Command\Command;
use Symfony\Component\Console\Input\InputArgument;
use Symfony\Component\Console\Input\InputInterface;
use Symfony\Component\Console\Input\InputOption;
use Symfony\Component\Console\Output\OutputInterface;
use Symfony\Component\Yaml\Yaml;
use Symfony\Component\Finder\Finder;

use Aws\S3\S3Client;
use Aws\S3\Enum\CannedAcl;
use Aws\S3\Exception\S3Exception;

/**
 * BackupCommand
 * eg: usage for upload:
 * php backup.php s3 /path/to/folder/ upload --relative
 *
 * @author razvan tudorica
 */
class BackupCommand extends Command
{
    private $s3;

    protected function configure()
    {
        //... get the configuration for s3

        $this
            ->setName('s3')
            ->setDescription('Synchronize an Amazon S3 folder with the local folder')
            ->addArgument('localFolder', InputArgument::REQUIRED, 'Local folder FULL path')
            ->addArgument('action', InputArgument::REQUIRED, 'upload (or download)')
            ->addArgument('bucket', InputArgument::OPTIONAL, 'The bucket name', $this->bucket)
            ->addArgument('s3Folder', InputArgument::OPTIONAL, 'Remote S3 folder path')
            ->addOption('relative', null, InputOption::VALUE_NONE, 'If set, we will use relative paths in the bucket');

        $conf = array(
            'key'    => 'YOURKEY',
            'secret' => 'YOURSECRET',
            'region' => 'us-east-1' // or your bucket's region
        );
        $this->s3 = S3Client::factory($conf);
    }

    protected function execute(InputInterface $input, OutputInterface $output)
    {
        $action      = $input->getArgument('action');
        $localFolder = $input->getArgument('localFolder');
        $bucket      = $input->getArgument('bucket');

        // check if the input parameters are correct

        $finder = new Finder();
        $finder->files()->in($localFolder);

        foreach ($finder as $file) {
            // some extra logic for setting the key
            $key = $startFolder . DIRECTORY_SEPARATOR . $file->getRelativePathname();

            try {
                $this->s3->putObject(array(
                    'Bucket' => $bucket,
                    'Key'    => $key,
                    'Body'   => fopen($file->getRealpath(), 'r'),
                    'ACL'    => CannedAcl::PUBLIC_READ
                ));
            } catch (S3Exception $e) {
                $output->writeln('The file was not uploaded: ' . $e->getMessage());
            }
        }
    }
}[/php]



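The Finder loop above just walks the local folder and turns each file's relative path into an S3 key. If you want to see that key-building idea without Symfony, the same traversal can be sketched with plain SPL iterators (a simplified stand-in for Finder, where $prefix plays the role of $startFolder from the command):

[php]<?php

// Simplified stand-in for Symfony's Finder: walk a local folder with SPL
// iterators and build an S3-style key from each file's relative path.
function buildKeys($localFolder, $prefix)
{
    $keys  = array();
    $files = new RecursiveIteratorIterator(
        new RecursiveDirectoryIterator($localFolder, FilesystemIterator::SKIP_DOTS)
    );

    foreach ($files as $file) {
        // Relative path of the file inside $localFolder
        $relative = substr($file->getPathname(), strlen(rtrim($localFolder, '/')) + 1);
        $keys[]   = $prefix . '/' . $relative;
    }

    return $keys;
}[/php]

Each key produced this way is what ends up in the 'Key' field of the putObject() call.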
And now we have to call this command. For this, in the root folder of the project I created another small script called backup.php with this content:

[php]#!/usr/bin/env php
<?php

require_once __DIR__.'/vendor/autoload.php';

use Razvan\BackupCommand;
use Symfony\Component\Console\Application;

$command = new BackupCommand();

$application = new Application();
$application->add($command);
$application->run();[/php]

Now, we can run it from the console, with:

[bash]php backup.php s3 /home/razvan/mypictures upload[/bash]

Based on this skeleton, I'm working on a more complex synchronization script between Amazon S3 and a local folder. You can find the code on GitHub.


The tutorial above may not be entirely complete or accurate.

The code is not optimized at all (e.g. for a large number of files, or for very big files).