Cloud Storage

If you'd like to use cloud storage such as DigitalOcean Spaces, AWS S3 or Azure Blob Storage, SocialStack has an installable mechanism called "cloud hosts" which automatically routes all uploaded files to, and serves them from, a remote storage system.

Getting Started

Install the cloud hosts module, then restart your API:

socialstack i Api/CloudHosts

This will create a new configuration entry called "CloudHost" where you can then drop in the configuration for the particular host you would like to use. By default the CloudHost config is created with a list of the directly supported hosts, each set to null. You can also use more than one host simultaneously.
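As a rough sketch (assuming only the two hosts documented on this page; your module version may list others), the freshly created entry looks something like the following, with each host left as null until you fill in its settings:

{
  "DigitalOcean": null,
  "AWS": null
}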

DigitalOcean Spaces Config

The space should have folders called "content" and "content-private" at its root. From there, the layout is the same as what would otherwise be in the Content folder. Upload your existing Content folder's contents into it.

[Image: Digio-space-root.png – the root of a DigitalOcean Space showing the content and content-private folders]

In the CloudHost config in the admin panel:

{
  "DigitalOcean": {
      "SpaceOriginUrl": "https://your-space.ams3.digitaloceanspaces.com",
      "SpaceKey": "DO...",
      "SpaceSecret": "..space secret.."
  }
}
  • CDN: The Spaces CDN is used by default. Turn it off by adding:
{
  "DigitalOcean": {
      ...
      "DisableCDN": true
  }
}


  • PDFs: By default, uploaded PDFs are served from DigitalOcean Spaces as raw bytes, which means displaying them inline inside e.g. an iframe on your page doesn't work in Chrome. If you'd like them to be displayable, add the following (a combined example follows this list):
{
  "DigitalOcean": {
      ...
      "DisplayPdfInline": true
  }
}
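Putting the options together, a DigitalOcean entry with the CDN disabled and inline PDFs enabled would look something like this (the credential values are the placeholders from the example above):

{
  "DigitalOcean": {
      "SpaceOriginUrl": "https://your-space.ams3.digitaloceanspaces.com",
      "SpaceKey": "DO...",
      "SpaceSecret": "..space secret..",
      "DisableCDN": true,
      "DisplayPdfInline": true
  }
}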

AWS Config

The bucket should have folders called "content" and "content-private" at its root. From there, the layout is the same as what would otherwise be in the Content folder. Upload your existing Content folder's contents into it.

{
  "AWS": {
      "S3ServiceUrl": "s3.eu-west-2.amazonaws.com",
      "S3AccessKey": "..access key..",
      "S3AccessSecret": "..access secret..",
      "S3BucketName": "bucket-name",
      "LockedDownAccess": false
  }
}
  • LockedDownAccess: Set it to true if the bucket is locked down from public access and needs to use "S3CannedACL.BucketOwnerFullControl".
  • PDFs: By default, uploaded PDFs are served from S3 as raw bytes, which means displaying them inline inside e.g. an iframe on your page doesn't work in Chrome. If you'd like them to be displayable, add the following (a combined example follows this list):
{
  "AWS": {
      ...
      "DisplayPdfInline": true
  }
}
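Likewise for AWS, the options combine into a single entry. A locked-down bucket with inline PDFs enabled would be configured like this (values are the placeholders from the example above):

{
  "AWS": {
      "S3ServiceUrl": "s3.eu-west-2.amazonaws.com",
      "S3AccessKey": "..access key..",
      "S3AccessSecret": "..access secret..",
      "S3BucketName": "bucket-name",
      "LockedDownAccess": true,
      "DisplayPdfInline": true
  }
}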

Using s3cmd to Upload from an existing server

  • Set up access using the key and secret configured in AWS/DO; the example below is for a DigitalOcean Space hosted in the ams3 region.
apt install s3cmd
s3cmd --configure

The settings should look something like the values below; note the ams3 region and the digitaloceanspaces.com endpoint.

New settings:
  Access Key: **************
  Secret Key: ******************************
  Default Region: US
  S3 Endpoint: ams3.digitaloceanspaces.com
  DNS-style bucket+hostname:port template for accessing a bucket: %(bucket)s.ams3.digitaloceanspaces.com
  Encryption password:
  Path to GPG program: /usr/bin/gpg
  Use HTTPS protocol: True
  HTTP Proxy server name:
  HTTP Proxy server port: 0

Check the access and get a list of the buckets:

s3cmd ls
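If the credentials are correct, each bucket/Space is listed along with its creation date, roughly like this (the name is purely illustrative):

2023-01-01 12:00  s3://my-space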

Once tested, you can upload the files, where ???? is the bucket name:

cd /var/www/stage/Content
s3cmd put * s3://????/ --recursive
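Once the upload completes, you can verify the result (using the same bucket name placeholder) by listing the top level of the bucket, which should now show the content and content-private folders:

s3cmd ls s3://????/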

Reference for DigitalOcean: https://docs.digitalocean.com/products/spaces/reference/s3cmd/