Installing Kubeflow on AWS and running a training

If you ended up on this blog post, you probably already know what Kubeflow is and want to experiment with it.

The Kubeflow project was started by Google, and even though there is some documentation about how to use it on AWS, it is not very accurate, it is very limited and not up to date. That is why I decided to write this blog post, describing my experience creating a PoC for Kubeflow.

First of all, we need an AWS account. I will not go into detail about setting up the IAM permissions for users and, for the sake of simplicity, I will assume an account with all the necessary permissions.


docker-compose very slow

In case docker-compose is very (very) slow and you are running it on a virtual machine, check the available entropy:

cat /proc/sys/kernel/random/entropy_avail

If it is under 100, this is probably the problem.
One possible solution is to install an entropy daemon, such as haveged:

apt install -y haveged

I had this problem on a VM running at Scaleway.
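The check and the fix can be combined into a small script (a sketch; the threshold of 100 follows the rule of thumb above, and the script only prints the suggested command instead of installing anything):

```shell
#!/bin/bash
# Read the currently available entropy from the kernel
ENTROPY=$(cat /proc/sys/kernel/random/entropy_avail)

if [ "$ENTROPY" -lt 100 ]; then
    # Low entropy: suggest installing an entropy daemon
    echo "low entropy ($ENTROPY): consider 'apt install -y haveged'"
else
    echo "entropy ok ($ENTROPY)"
fi
```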

Backup database in S3

A simple cron script that backs up the database to AWS S3. Our script is called mysqlbackup.sh and it looks like this:

#!/bin/bash
DB="razvantudorica"
NOW=$(date +"%m_%d_%Y")
BACKUPFILE="${DB}_${NOW}.sql.gz"

# Dump the database and compress it on the fly
mysqldump --login-path=razvantudorica --databases "$DB" | gzip > "$BACKUPFILE"

# Upload the archive to S3, then remove the local copy
aws s3 cp "$BACKUPFILE" "s3://backup.razvantudorica.net/$BACKUPFILE" --profile backuper
rm "$BACKUPFILE"

And now a few explanations about the script.

First of all, we need to have the awscli command installed.

Afterwards, as you can see, the database password is not hardcoded into the script. We can set up the password with mysql_config_editor, which stores the authentication credentials in an encrypted login file named .mylogin.cnf.

For our example database, razvantudorica, and database user myuser, we can run

mysql_config_editor set --login-path=razvantudorica --host=localhost --user=myuser --password
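Before wiring the login path into the backup script, it is worth checking that it was actually stored. A quick sketch (assuming mysql_config_editor is available; razvantudorica is the login path created above):

```shell
# List the stored login paths and check that ours is among them.
# mysql_config_editor prints passwords obfuscated, so this is safe to run.
if mysql_config_editor print --all | grep -q '^\[razvantudorica\]'; then
    echo "login path found"
else
    echo "login path missing; re-run mysql_config_editor set"
fi
```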

The next step is to configure the AWS S3 bucket and credentials.

  • Create a bucket in S3; in our example it is called backup.razvantudorica.net.
  • Create the IAM credentials and save them in ~/.aws/credentials as
[backuper]
aws_access_key_id=AK... 
aws_secret_access_key=...
  • And in ~/.aws/config add
[profile backuper]
region=eu-west-1
output=json
  • The last step is to test our script. If no error occurs when running it and the backup file is uploaded successfully to S3, then everything is correct and we can add it to the crontab list.
  • Run crontab -e and add this line
0 1 * * * /root/mysqlbackup.sh >> /var/log/mysqlbackup.log
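For completeness, restoring from the bucket is essentially the reverse of the script above. A sketch, assuming the same bucket, profile and login path as before:

```shell
#!/bin/bash
# Find the newest backup in the bucket: `aws s3 ls` prints one line per
# object (date, time, size, name), so sorting the lines gives chronological
# order and the last line is the most recent file.
LATEST=$(aws s3 ls s3://backup.razvantudorica.net/ --profile backuper \
          | sort | tail -n 1 | awk '{print $4}')

# Download the archive and pipe the uncompressed dump back into MySQL.
# The dump was made with --databases, so it contains the CREATE DATABASE
# and USE statements and no database name is needed here.
aws s3 cp "s3://backup.razvantudorica.net/$LATEST" . --profile backuper
gunzip < "$LATEST" | mysql --login-path=razvantudorica
```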

Unknown database type enum requested

Using the Symfony (5) console command to create a new migration based on my entities, I encountered this error.

php bin/console doctrine:migrations:diff

Unknown database type enum requested, Doctrine\DBAL\Platforms\MySQL57Platform may not support it.

The simplest solution is to add mapping_types in config/packages/doctrine.yaml:

doctrine:
    dbal:
        # ...
        mapping_types:
            enum: string