Kolby Allen

Just a mac/linux guy living in a windows world

Automate S3 to EBS with PS

| Comments | AWS, Backups, Powershell

Currently we have a few clients that utilize AWS as part of their DR solution (upcoming articles). Part of this process has us store the larger base images on EBS volumes, which saves time during the restore process. The code below takes a specific set of key prefixes (the S3 equivalent of folders) and copies each one to a local path matching the folder name. The script requires that the AWS Tools for PowerShell be installed.
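
Before running it, you can sanity-check that the AWS module is reachable from PowerShell. This is just a sketch, not part of the script itself - depending on how the tools were installed the module may not be on $env:PSModulePath, which is why the script imports it by its full install path:

# Confirm the AWS Tools for PowerShell module is discoverable before running the script
if (-not (Get-Module -ListAvailable -Name AWSPowerShell)) {
    Write-Warning "AWSPowerShell not found on PSModulePath - the script will import it by full path instead"
}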

Copy from S3 to Local - CopyfromAWStoLocal.ps1
# +---------------------------------------------------------------------------
# | File : CopyfromAWStoLocal.ps1
# | Version : 1.0
# | Purpose : Copy files from S3 to Local Storage
# | Synopsis:
# | Usage : .\CopyfromAWStoLocal.ps1
# +----------------------------------------------------------------------------
# |
# | File Requirements:
# | Must have AWS S3 CLI installed & Powershell tools
# | CLI - https://s3.amazonaws.com/aws-cli/AWSCLI64.msi
# | PS Tools - http://aws.amazon.com/powershell/
# +----------------------------------------------------------------------------
# | Maintenance History
# | View GitHub notes: https://github.com/allenk1/ISO-Scripts/commits/master/BackuptoS3_Snapshots.ps1
# ********************************************************************************


# Default input params
$access = "ABC12312312312312312"
$private = "ABCDEFGHIJ1231231231ABCDEFGHIJ1231231231"
$foldernames = @("Dir1", "Dir2", "Dir3")
$bucket = "bucketname"
$downloadpath = "E:" #no trailing slash
$region = "us-west-2"

import-module "C:\Program Files (x86)\AWS Tools\PowerShell\AWSPowerShell\AWSPowerShell.psd1"

# Clear any saved credentials
# Clear-AWSCredentials -StoredCredentials

# Set credentials
Set-AWSCredentials -AccessKey $access -SecretKey $private
Set-DefaultAWSRegion $region

foreach ($dir in $foldernames){

    $store = $downloadpath + '\' + $dir
    Read-S3object -BucketName $bucket -KeyPrefix $dir -Folder $store

}
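
If you want to confirm the credentials and prefixes are correct before kicking off a large download, a quick listing like this works (a hedged example, not part of the script; "Dir1" stands in for one of the prefixes from $foldernames):

# List the first few objects under a prefix to confirm access before a full download
Get-S3Object -BucketName $bucket -KeyPrefix "Dir1" |
    Select-Object -First 5 Key, Size, LastModified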

Automating AWS Backups with PowerShell

| Comments | AWS, Backups, Powershell

A few of my clients currently run some simple Windows servers on AWS. In order to save costs we wanted to utilize some of the storage tools that AWS provides. We settled on EBS snapshots for VM-level backup and a flat file copy to S3. For this particular client we enabled versioning on S3 in order to retain multiple versions of each file.
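
Versioning can be enabled from the S3 console, but if you prefer to do it from PowerShell, something along these lines should work (a sketch only - "bucketname" is the same placeholder used for $bucket in the script below):

# Enable versioning on the backup bucket so overwritten files keep their prior versions
Write-S3BucketVersioning -BucketName "bucketname" -VersioningConfig_Status Enabled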

Here is the script to backup the files:

Backup to S3 and Snapshots - BackuptoS3_Snapshots.ps1
# +-----------------------------------------------------------------------
# | File : BackuptoS3_Snapshots.ps1
# | Version : 1.0
# | Purpose : Backs up to S3 & creates EBS snapshots
# | Synopsis:
# | Usage : .\BackuptoS3_Snapshots.ps1
# +------------------------------------------------------------------------
# |
# | File Requirements:
# | Must have AWS S3 CLI installed & Powershell tools
# | CLI - https://s3.amazonaws.com/aws-cli/AWSCLI64.msi
# | PS Tools - http://aws.amazon.com/powershell/
# +------------------------------------------------------------------------
# | Maintenance History
# | View GitHub notes: https://github.com/allenk1/ISO-Scripts/commits/master/BackuptoS3_Snapshots.ps1
# ****************************************************************************


# Default input params
$access = "AKIAJGXXXXXXXXXXXXXX"
$private = "ABC123123ABC123123ABC123123ABC123123ABCD"
$vol_id = @("vol-XXXXXXXX", "vol-XXXXXXXX")
$servername = "Server_NAME"
$region = "us-west-2"

# Sortable ISO 8601 timestamp (yyyy-MM-ddTHH:mm:ss) used in snapshot and S3 key names
$date = Get-Date -format s

import-module "C:\Program Files (x86)\AWS Tools\PowerShell\AWSPowerShell\AWSPowerShell.psd1"

# Clear any saved credentials
# Clear-AWSCredentials -StoredCredentials

# Set credentials
Set-AWSCredentials -AccessKey $access -SecretKey $private
Set-DefaultAWSRegion $region

# Loop through all volumes and create snapshots
# Naming Scheme ServerName_VOLID

foreach ($vol in $vol_id) {

    # snapshot the EBS store
    $snapshot_name = $servername + '_' + $vol + '_' + $date

    New-EC2Snapshot -VolumeId $vol -Description $snapshot_name

}

# Now flat file copy to S3
# Enable Bucket versioning in order to keep multiple versions of the file
# TODO: Version with Script

$copy_dirs = @('C:\Path\to\Backup\Directory')
$bucket = "bucketname"

foreach ($dir in $copy_dirs){

    # Key by ServerName_DATE - not used yet (see TODO above); bucket versioning handles retention for now
    $key = $servername + '_' + $date
    Write-S3Object -Folder $dir -BucketName $bucket -KeyPrefix / -Recurse

}
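
New-EC2Snapshot returns as soon as the snapshot has been requested, so if you want to confirm the snapshots actually completed you can check on them afterwards. This is only a sketch of one way to do it, reusing the $vol_id array from the script:

# Show the most recent snapshot for each backed-up volume and its completion state
foreach ($vol in $vol_id) {
    Get-EC2Snapshot -Filter @{ Name = 'volume-id'; Values = $vol } |
        Sort-Object -Property StartTime -Descending |
        Select-Object -First 1 SnapshotId, State, Progress, StartTime
}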

Please make sure to set the following Variables:

  • $access - Your Access Key (from IAM)
  • $private - Your Private Key (from IAM)
  • $vol_id = @("vol-XXXXXXXX", "vol-XXXXXXXX") - Array of EBS volumes to back up
  • $servername - Server Name for backup labeling
  • $region - The Region you'd like to back up to
  • $copy_dirs = @('C:\Path\to\Backup\Directory') - Array of local paths you'd like to back up
  • $bucket - Backup bucket name (make sure the creds above have full access)

Once this is set up, schedule the script with Windows Task Scheduler and you are ready to go.
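
For example, on Server 2012 or later you could register a nightly run with the ScheduledTasks cmdlets (the script path and start time below are just placeholders for your environment):

# Register a nightly run of the backup script - adjust the path and time to suit
$action  = New-ScheduledTaskAction -Execute 'powershell.exe' -Argument '-ExecutionPolicy Bypass -File "C:\Scripts\BackuptoS3_Snapshots.ps1"'
$trigger = New-ScheduledTaskTrigger -Daily -At 1am
Register-ScheduledTask -TaskName 'AWS Backup to S3' -Action $action -Trigger $trigger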

Welcome to KolbyAllen.com

| Comments |

Welcome to the new KolbyAllen.com. Look here for my musings on IT consulting, working with Linux/Macs in business, and wrangling the cloud.