Backblaze for offsite backup - alternative to Amazon S3
Hi Ben, I have motion-detected events saving to an Amazon S3 bucket, with files deleted after one month. However, I have just signed up for Backblaze, which offers an Amazon S3-compatible API (see link), because it is significantly cheaper and my offsite storage costs for motion events across 4 cameras are mounting.
Unfortunately, when I tried to use the Backblaze S3-compatible credentials, I couldn't specify enough in SecuritySpy to make it work. Would you be able to support this? It makes a significant cost saving for offsite storage while still being essentially S3 at a lower price, and I'm sure it would appeal to your users.
https://www.backblaze.com/b2/docs/s3_compatible_api.html
Comments
We use Amazon's AWS Command Line Interface tool to do the S3 uploads - perhaps this isn't compatible with the Backblaze API for some reason. Unfortunately, if this is the case, there would be little we can do about it. But the error message may shed some light as to what is going wrong.
https://www.backblaze.com/b2/docs/
In the meantime, a screenshot of the error is here:
https://www.dropbox.com/s/abe9t9n4bsl7b3i/Screenshot 2020-05-23 at 18.56.08.png?dl=0
At 1/4 of the price of S3 it is a significant incentive for offsite event storage!
Ben, your clue that you utilize the AWS CLI led me to the root of the failure, as well as a workaround solution.
The AWS CLI uses Amazon endpoints by default. It will upload to B2 buckets ONLY IF you supply the correct endpoint via the --endpoint-url command-line parameter. I had hoped that the endpoint URL could be specified in $HOME/.aws/config, but that doesn't seem to be the case.
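For anyone who wants to see what the per-command override looks like, here is a dry-run sketch (the endpoint, bucket name and file path are placeholder values, not taken from this thread; the script only prints the command so you can inspect it before running it for real):

```shell
# Dry-run sketch: print an upload command with the B2 endpoint applied.
# ENDPOINT and BUCKET are placeholders; substitute your own values.
ENDPOINT="https://s3.us-west-002.backblazeb2.com"
BUCKET="my-b2-bucket"
echo "aws --endpoint-url=$ENDPOINT s3 cp clip.m4v s3://$BUCKET/SSpy/clip.m4v"
```

Drop the leading `echo` to actually run it once you have real credentials configured.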
I poked around until I found the location that SecuritySpy installs the AWS CLI ($HOME/.local/lib/aws). I then performed these steps to hardcode the Backblaze endpoint so that uploads from SecuritySpy would work.
1) Rename the distribution AWS application. [mv $HOME/.local/lib/aws/bin/aws $HOME/.local/lib/aws/bin/aws-dist]
2) Create a wrapper script that calls "aws-dist", saved as $HOME/.local/lib/aws/bin/aws. The contents of this wrapper script (between the "------" markers):
------
#! /bin/sh
$HOME/.local/lib/aws/bin/aws-dist --endpoint-url=https://s3.us-west-002.backblazeb2.com "$@"
exit $?
------
Note that your endpoint URL might be different. It might have been better to read it from an environment variable. Also, the quotes around $@ are critical.
3) Make the wrapper script executable. [chmod 755 $HOME/.local/lib/aws/bin/aws]
4) Configure in SecuritySpy. "Path on server" is just a top-level folder inside your bucket, and not the endpoint like in your screenshot. Mine is just "SSpy" - yours can be whatever you choose.
After that, attempt a test upload. Mine worked, and uploads have been working since.
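For anyone following along, the wrapper from steps 1–3 can be sanity-checked in a scratch directory before you install it over the real aws path (the endpoint URL below is the one from my account; yours will differ):

```shell
# Recreate the wrapper in a temp dir and syntax-check it with `sh -n`
# before installing it at $HOME/.local/lib/aws/bin/aws.
WRAP="$(mktemp -d)/aws"
cat > "$WRAP" <<'EOF'
#!/bin/sh
"$HOME/.local/lib/aws/bin/aws-dist" --endpoint-url=https://s3.us-west-002.backblazeb2.com "$@"
exit $?
EOF
chmod 755 "$WRAP"
sh -n "$WRAP" && echo "wrapper syntax OK"
```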
This is a little beyond my general Mac experience, but I will attempt to follow the instructions and see what I can make work!
Very pleased to know that it can be used with Backblaze and it would be nice if this can be introduced into a release as well!
To test, please undo your custom aws config (e.g. rename the folders so that the standard aws tool is at ~/.local/lib/aws/bin), and then test with the B2 endpoint address entered into SecuritySpy. Please confirm this works as expected.
@sachaski - assuming you haven't done the custom setup yet, could you please test this too. Enter into SecuritySpy the specific endpoint address that has been provided to you by Backblaze. It can be entered with or without the "https://" protocol specifier.
If anyone is interested in running their own S3 storage, I can recommend setting up Minio in a Docker container. I've set this up on a little Linux server running in my parents' house, which acts as a free offsite S3/backup solution.
https://min.io
I access this via a reverse proxy using my own URL, e.g. s3.myhome.com
A very useful guide to set up a reverse proxy is here: https://www.smarthomebeginner.com/traefik-2-docker-tutorial/
example docker-compose details (indentation restored):
  s3:
    image: minio/minio:latest
    hostname: minio
    container_name: minio
    restart: always
    ports:
      - "9007:9000"
    volumes:
      - /data/backups:/data
      - ${USERDIR}/docker/minio/config:/root/.minio
    command: server /data
If you want to remove old video files, you could use the following Minio Client (mc) command:
mc rm -r --force --older-than 1d2h30m myminio/mybucket
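Note that mc needs the "myminio" alias registered before that command will resolve (on recent mc releases this is `mc alias set`; older ones used `mc config host add`). A dry-run sketch with placeholder values — the URL and keys are examples, and the script only prints the commands:

```shell
# Placeholder endpoint and keys; replace with your own Minio details.
MINIO_URL="http://localhost:9007"
# One-time alias registration, then the prune command from above:
echo "mc alias set myminio $MINIO_URL ACCESS_KEY SECRET_KEY"
echo "mc rm -r --force --older-than 1d2h30m myminio/mybucket"
```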
I also use Duplicati as a backup client for various machines to my offsite S3 storage:
https://www.duplicati.com
On another note, after installing the software I am getting 4001 errors on one of my Reolink cameras. I have two Reolink cameras, and at first they were both getting these errors, but one of them settled down. The other is mostly stuck trying to connect.
My computer has 128 GB RAM, 2 × 3.46 GHz 6-core Intel Xeon CPUs and a Radeon RX 580 8 GB, running Mojave.
The Reolink model is an RLC-422
I never had a problem with it until the beta software, so I was wondering if there is a bug to be ironed out there, maybe?
Also, the Reolink seems to have settled down as well, so possibly a false alarm.
As for the Reolink cameras, this is not a problem with the beta, but rather with the cameras themselves, whereby they provide unreliable RTSP streams. Please see our notes on our list of supported cameras, where we advise against using Reolink cameras.
I'll send the crash report to you by email.
Error 1590,88794 /bin/bash: /Users/userA/.local/lib/aws/bin/aws: /Volumes/MyDrive/Users/userA/.local/lib/aws/bin/python: bad interpreter: No such file or directory
- Click the Go menu in the Finder; select "Go to Folder..."
- Enter ~/.local/lib
- You should see an "aws" folder - delete it (move to Trash)
- In SecuritySpy, go to Preferences -> Uploads and test the S3 upload (it should re-download and install the AWS software)
Does that do it?
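The same reset can also be done from Terminal, equivalent to the Finder steps above:

```shell
# Remove the bundled AWS CLI so SecuritySpy re-downloads
# a fresh copy the next time you test an S3 upload.
rm -rf "$HOME/.local/lib/aws"
```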
I was able to get uploads to work but I'm not sure why the workaround was needed.
Actions I took:
I read the AWS CLI documentation and tried (from the Mac Terminal) to create a bucket in my S3 account after running `aws configure`.
The bucket was created successfully and I was able to upload a file to it - all via the cli.
I then went into securityspy and both S3 and B2 now work given their respective credentials.
I can only guess that the CLI was blocking all uploads (including to B2) because the credentials hadn't been entered via its own configuration step. Basically, entering the credentials seems to have unblocked the API calls from SecuritySpy.
AWS CLI Documentation: https://docs.aws.amazon.com/cli/latest/userguide/cli-configure-quickstart.html
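For reference, `aws configure` just writes two small INI-style files under ~/.aws/ that the bundled CLI also reads, which would explain why entering credentials there unblocked SecuritySpy's uploads. A sketch of what those files look like — written to a temp directory here so nothing real is touched, and the key values are fake placeholders:

```shell
# Sketch of the files `aws configure` creates under ~/.aws/
# (written to a temp dir; all values are placeholders).
DEMO="$(mktemp -d)"
cat > "$DEMO/credentials" <<'EOF'
[default]
aws_access_key_id = EXAMPLEKEYID
aws_secret_access_key = EXAMPLESECRETKEY
EOF
cat > "$DEMO/config" <<'EOF'
[default]
region = us-west-002
output = json
EOF
echo "wrote demo config to $DEMO"
```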
Error 1005 Note: AWS CLI version 2, the latest major version of the AWS CLI, is now stable and recommended for general use. For more information, see the AWS CLI version 2 installation instructions at: https://docs.aws.amazon.com/cli/latest/userguide/install-cliv2.h
After that, if I install AWS CLI 2, I get the "Error 2" message again.
This is on the latest SS version which downloaded today and also on the previous version downloaded a few days ago.
Any ideas?
I just followed your lead here: I used the Terminal, ran `aws configure`, and now it is working! I hadn't noticed your post slip in earlier. Thank you!