The Simple Order System uses your AWS account to upload the images into your S3 bucket. This may sound scary, but it isn't very difficult to set up, and it makes our system simpler for us to maintain since the images that are uploaded never end up in our system. You also pay AWS directly for the bandwidth needed to download the images and for the storage space needed to store them, rather than us having to track this and mark up the costs.
If you don't already have an AWS account, go sign up for one here: https://aws.amazon.com. Your first year will probably be free.
All that is required from AWS is an S3 Bucket, along with an IAM User to access it.
S3 is a cloud storage service offered by Amazon that is sort of like Dropbox, but only the cloud part, and without a fancy interface. Dropbox and other services like it actually use S3 to store your files (at least last we checked). You can read more about AWS S3 here: https://aws.amazon.com/s3/. Current pricing is roughly $.03 per GB per month for storage and $.09 per GB for bandwidth (used when you download an image to your local computer). So if your lab received 10 GB of files in a month, your bill for the S3 part would be around $1.20 ($.30 for 10 GB of storage, $.90 for the bandwidth used to download them once). The storage fee would probably be a lot less than $.30, but who cares, it is $.30! We recommend you delete the files as you go, maybe keeping them for 5 days or so.
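If you would rather have AWS do that cleanup for you, S3 buckets support lifecycle rules. A rule roughly like the one sketched below (the rule ID is just an example name we made up) tells S3 to delete objects 5 days after they are uploaded. You can add it later under the bucket's Lifecycle settings, so feel free to skip this during the initial setup.

```json
{
    "Rules": [
        {
            "ID": "delete-order-images-after-5-days",
            "Filter": { "Prefix": "" },
            "Status": "Enabled",
            "Expiration": { "Days": 5 }
        }
    ]
}
```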
S3 stores things in what they call a Bucket. Basically, a bucket is the top-level folder where all your files and folders will be. You can make as many buckets as you want, but our system only needs one. You could make other buckets to store whatever else you want - like images for your website - or even run your website out of an S3 bucket (this website is run out of an S3 bucket!).
Once you have signed up for an AWS account, and are logged into the AWS Console, go to the S3 page.
Click the Create Bucket button and give the bucket a name - bucket names have to be globally unique - maybe something like sos-your-lab-name. Under region, choose US Standard.
Make a note of what you named the bucket, you will need that when setting up your Lab info in our system.
Everything in AWS is secure by default. While you can use the clunky AWS interface to upload files and whatnot, our system can only interact with your bucket if you set up a user in AWS and grant that user access to PutObject into it. You then give us that user's Access Key and Secret.
So we are going to create an IAM User, and give that User a Policy that only allows it to PutObject into your bucket.
To create the User and its Policy:

1. In the AWS Console, go to the IAM page and click Users.
2. Click Create New User and give the user a name - sos-orders would be good.
3. Click the Create button.
4. Click Show User Security Credentials - you will only be shown them once. You can also download them (bottom button - `Download Credentials`). You need both of these values - the Access Key ID and the Secret Access Key. Keep these somewhere safe.
5. To attach a Policy, click on the user you just created - it should say Inline Policies - click on that line and it should expand.
6. Create a new inline policy, name it sos-policy, and paste in the IAM User Policy below, replacing yourbucketname with what you named your bucket.

IAM User Policy - you must copy everything exactly as it is below, making sure to replace yourbucketname with the name you gave your bucket.
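The policy is the standard IAM policy layout (the Version line is AWS's fixed policy-language version; the Effect, Action, and Resource lines are explained at the bottom of this page):

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "s3:PutObject",
            "Resource": "arn:aws:s3:::yourbucketname/*"
        }
    ]
}
```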
For example, if you named your bucket ronssosbucket, then it would look like:
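Everything stays the same except the Resource line, which now names your bucket:

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "s3:PutObject",
            "Resource": "arn:aws:s3:::ronssosbucket/*"
        }
    ]
}
```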
So now you have all the info our system requires, but there is one last piece you probably need. Everything we did above just set up a way to get the images into your AWS S3 account; you are also going to want to get them out of S3 and onto your computer.
To do this you need an S3 client - we recommend Cloudberry Explorer.
There are two versions: a free version, which is more than enough for our purposes, or you could splurge and get the pro version so the poor guy can make some money off of an awesome product.
Cloudberry will also need an IAM User and Policy. You should not use the same User you created for the Simple Order System, as you will want Cloudberry to do more things. We purposely only gave the Simple Order System access to put stuff into S3 - you are going to want to get stuff out, and more than likely make other buckets and whatnot, so the policy you give to Cloudberry will allow full access to S3. DO NOT DO THIS FOR THE SIMPLE ORDER SYSTEM!
You will want to install Cloudberry on any computer you want to be able to access the S3 files on.
Go back through the steps above to create a new IAM User, maybe call this one Cloudberry, and for this user's policy use this instead:
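This is the same IAM policy layout as before, but with the Action and Resource wildcards that give full S3 access (what those mean is explained below):

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "s3:*",
            "Resource": "*"
        }
    ]
}
```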
Please note that this Policy allows the user to do anything it wants on S3 - create another bucket, delete a bucket, upload anything, etc. It is very important that you keep the credentials for the user you give this Policy to secret. Make sure you do not use this policy for the user you set up in our system.
It is always important when using AWS services to secure things as much as possible. If someone were to gain access to a User with an all-access Policy, they could potentially abuse your account and cost you money.
A bit about what the Policy says:
"Effect": "Allow" - this is usually Allow, as everything is Denied by default. It Allows the following Action on the Resource.
"Action": "s3:*" - what do you want to allow? This one says: on S3, allow anything (* is a wildcard). In the Policy we made for the Simple Order System we only wanted to allow PutObject, so instead of a * wildcard the Action is s3:PutObject, which means that is all the policy will allow.
"Resource": "*" - This is a little scary: the * wildcard is allowing total access to anything in S3. But that is what you need in order to be able to do anything in S3, so it is ok. In the Policy for the Simple Order System User, we specified the Resource as "arn:aws:s3:::ronssosbucket/*", which limits it to that particular bucket, but again with a * wildcard, so it can PutObject anywhere it wants in that bucket, including into any number of folders and subfolders.