Backblaze for offsite backup - alternative to Amazon S3
  • Hi Ben, I have motion-detected events saving to an Amazon S3 bucket and deleted after one month. However, I have just signed up for Backblaze, who have an Amazon S3 compatible API (see link), because they are significantly cheaper and my offsite storage costs for motion events across 4 cameras are mounting.

    Unfortunately, when I tried to use the Backblaze Amazon S3 compatible credentials, I don't think SecuritySpy lets me specify enough settings to make it work. Would you be able to support this? It makes a significant cost saving for offsite storage while still essentially being Amazon S3 at a cheaper price, and I am sure this would appeal to your users.
  • This is an interesting potential option for cheaper offsite storage. When you attempt a test upload via Preferences -> Uploads in SecuritySpy, what error messages do you get?

    We use Amazon's AWS Command Line Interface tool to do the S3 uploads - perhaps this isn't compatible with the Backblaze API for some reason. Unfortunately, if this is the case, there would be little we can do about it. But the error message may shed some light as to what is going wrong.
  • They do have some documentation for developers using the API.
    In the meantime, a screenshot of the error is here: 2020-05-23 at 18.56.08.png?dl=0

    At 1/4 of the price of S3 it is a significant incentive for offsite event storage!
  • I was intrigued by this, since I didn't have an offsite solution set up yet for my SecuritySpy captures. I created a Backblaze B2 bucket that was S3 compatible, and received the same errors when attempting a test upload from SecuritySpy into that bucket.

    Ben, your clue that you utilize the AWS CLI led me to the root of the failure, as well as a workaround solution.

    The AWS CLI uses Amazon endpoints by default. It will upload to B2 buckets ONLY IF you supply the correct endpoint via a command-line parameter (--endpoint-url). I had hoped that the endpoint URL could be specified in $HOME/.aws/config, but that doesn't seem to be the case.
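    For anyone following along, the standard AWS CLI config files only cover credentials, region and output format - at least in the CLI version bundled here there is no supported key for the endpoint, which is why it has to go on the command line. Every value below is a placeholder:

```ini
# ~/.aws/config
[default]
region = us-west-002
output = json

# ~/.aws/credentials
[default]
aws_access_key_id = YOUR_B2_KEY_ID
aws_secret_access_key = YOUR_B2_APPLICATION_KEY
```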

    I poked around until I found the location that SecuritySpy installs the AWS CLI ($HOME/.local/lib/aws). I then performed these steps to hardcode the Backblaze endpoint so that uploads from SecuritySpy would work.

    1) Rename the distribution AWS application. [mv $HOME/.local/lib/aws/bin/aws $HOME/.local/lib/aws/bin/aws-dist]

    2) Create a wrapper script for "aws-dist" at $HOME/.local/lib/aws/bin/aws. The contents of this wrapper script:

    #! /bin/sh

    $HOME/.local/lib/aws/bin/aws-dist --endpoint-url=https://s3.YOUR-REGION.backblazeb2.com "$@"
    exit $?

    Note that your endpoint URL will be different - the one above is a placeholder, so use the endpoint shown for your own B2 bucket. It might have been better to specify that in an environment variable. Also, the quotes around "$@" are critical.

    3) Make the wrapper script executable. [chmod 755 $HOME/.local/lib/aws/bin/aws]

    4) Configure in SecuritySpy. "Path on server" is just a top-level folder inside your bucket, and not the endpoint like in your screenshot. Mine is just "SSpy" - yours can be whatever you choose.

    After that, attempt a test upload. Mine worked, and uploads have been working since.
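    For anyone who wants to sanity-check the wrapper approach before touching the real install, here is a self-contained sketch that substitutes a stub for "aws-dist" - the temp paths and the endpoint URL are placeholders, not the real SecuritySpy layout:

```shell
#!/bin/sh
set -e
DIR=$(mktemp -d)

# Stub standing in for the renamed distribution tool: it just echoes its
# arguments so we can see exactly what the wrapper passes through.
cat > "$DIR/aws-dist" <<'EOF'
#!/bin/sh
echo "$@"
EOF
chmod 755 "$DIR/aws-dist"

# The wrapper: prepends --endpoint-url and forwards everything else.
# (The endpoint below is a placeholder; use the one for your own bucket.)
cat > "$DIR/aws" <<EOF
#!/bin/sh
"$DIR/aws-dist" --endpoint-url=https://s3.example.backblazeb2.com "\$@"
exit \$?
EOF
chmod 755 "$DIR/aws"

# The quoted "$@" keeps arguments containing spaces intact:
out=$("$DIR/aws" s3 cp "movie file.m4v" s3://mybucket/SSpy/)
echo "$out"
```

    Running it shows the stub receiving the endpoint first, with the file name (spaces and all) preserved as a single argument - which is exactly why the quotes around "$@" matter.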
  • @krose- Thank you so much!

    This is a little beyond my general mac experience but I will attempt to follow the instructions and see what I can make work!

    Very pleased to know that it can be used with Backblaze and it would be nice if this can be introduced into a release as well!

  • @sachaski - You are welcome. Please be aware that following these steps will disable your ability to use Amazon's S3 service, since the B2 endpoint is hardcoded.
  • Thanks! This also works for DreamObjects with their URL.
  • @krose - I'll not need the amazon services if I can get Backblaze going! :)
  • Many thanks @krose for outlining exactly what needs to be done in order to support this. I think this would be a useful addition to SecuritySpy, so we've added this functionality to the latest beta version of SecuritySpy (currently 5.2.3b22). Now, under the upload settings, you will find separate bucket-name and endpoint fields for S3 uploads. Leave the endpoint empty for standard AWS S3, or specify your Backblaze B2 endpoint address.

    To test, please undo your custom aws config (e.g. rename the folders so that the standard aws tool is at ~/.local/lib/aws/bin), and then test with the B2 endpoint address entered into SecuritySpy. Please confirm this works as expected.

    @sachaski - assuming you haven't done the custom setup yet, could you please test this too? Enter into SecuritySpy the specific endpoint address that has been provided to you by Backblaze. It can be entered with or without the "https://" protocol specifier.
  • @dnchen, if you can test this with DreamObjects too that would be great!
  • This is great, thank you Ben! I can confirm it works when using my own S3 endpoint with software called Minio.

    If anyone is interested in creating their own S3 storage, I can recommend setting up Minio in a Docker container. I've set this up on a little Linux server running in my parents' house, which acts as a free offsite S3/backup solution.

    I access this via a reverse proxy using my own URL.
    A very useful guide to setting up a reverse proxy is here:

    example docker-compose details:

    image: minio/minio:latest
    hostname: minio
    container_name: minio
    restart: always
    ports:
      - "9007:9000"
    volumes:
      - /data/backups:/data
      - ${USERDIR}/docker/minio/config:/root/.minio
    command: server /data

    If you want to remove old video files, you could use the following CLI:

    mc rm -r --force --older-than 1d2h30m myminio/mybucket
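    Note that "myminio" above is an mc alias that has to be registered first. If you have shell access to the box anyway, a cruder alternative that skips mc entirely is plain find against the data directory - sketched here against a throwaway directory with a backdated file so it can run anywhere (assumes GNU touch with -d; deleting files behind Minio's back works for the simple single-disk "server /data" setup where objects map to plain files, but check before relying on it):

```shell
#!/bin/sh
set -e
# Throwaway stand-in for Minio's data directory (e.g. /data/backups).
DIR=$(mktemp -d)
touch "$DIR/new.m4v"
touch -d "3 days ago" "$DIR/old.m4v"   # backdate so -mtime matches

# Delete video files not modified within the last day:
find "$DIR" -name '*.m4v' -type f -mtime +1 -delete

ls "$DIR"   # only new.m4v remains
```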

    I also use Duplicati as a backup client for various machines to my offsite S3 storage:
  • Hi @paul2020 this sounds great, thanks for posting!
  • Thanks, @Ben, for adding the endpoint-url into the GUI! I can confirm that it does work when using the Backblaze servers (after backing out my workarounds).
  • @ben Thank you and I will get on to testing it but probably at the weekend when I will have a moment to do so!
  • @krose - great to hear, and thanks for posting your solution, this made it easier for us to implement this in SecuritySpy.
  • @ben I have the beta software uploading only one of my cameras for some reason and not the others, as far as I can see. I'll look in the morning on Backblaze to see if anything there has changed.

    On another note, after installing the software I am getting 4001 errors on one of my Reolink cameras. I have 2 Reolink cameras, and at first they were both getting these errors, but one of them settled down. The other is mostly stuck trying to connect.

    My computer has 128GB RAM, 2 x 3.46 GHz 6-core Intel Xeon CPUs, and a Radeon RX 580 8GB, running Mojave.

    The Reolink model is an RLC-422

    I never had a problem with it until the beta software, so I was wondering if there is perhaps a bug to be ironed out there?
  • @ben - Backblaze seems to be accepting the other streams now - perhaps it was an initialisation issue. I did notice that the one stream it was recording had no spaces in the camera name, so I hyphenated the other camera names, and perhaps that solved it.

    Also, the Reolink seems to have settled down as well... so possibly a false alarm.
  • Hi @sachaski, good to hear the uploads are now working, yes perhaps this was an issue with the file names.

    As for the Reolink cameras, this is not a problem with the beta, but rather with the cameras themselves, whereby they provide unreliable RTSP streams. Please see our notes on our list of supported cameras, where we advise against using Reolink cameras.
  • @ben the beta crashed last night.

    I'll send the crash report to you by email.
  • @ben sorry, I just realized you sent an update with the endpoint. It works well with DreamObjects. Thanks!
  • @dnchen - great, thanks for reporting back!
  • @ben sorry for joining the thread late and, BTW, thank you all for working through this - I can also share my gratitude for BB support. I'm getting an "Error 2" when trying to test the server. Any thoughts on what this error correlates to, so that I can troubleshoot and report back?
  • Hi @JimC I think we'll need some more information here in order to diagnose this. Could you please email us a screenshot of your settings, with the Test button clicked and displaying this error?
  • email on the way thanks @ben
  • Not sure what happened, but I just realized my uploads to the S3 bucket were no longer happening. I'm getting this error:

    Error 1590,88794 /bin/bash: /Users/userA/.local/lib/aws/bin/aws: /Volumes/MyDrive/Users/userA/.local/lib/aws/bin/python: bad interpreter: No such file or directory
  • Hi @dnchen that's one we haven't seen before. Please try this:

    - Click the Go menu in the Finder; select "Go to Folder..."
    - Enter ~/.local/lib
    - You should see an "aws" folder - delete it (move to Trash)
    - In SecuritySpy, go to Preferences -> Uploads and test the S3 upload (it should re-download and install the AWS software)

    Does that do it?
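    For what it's worth, "bad interpreter" is the shell's generic complaint when a script's shebang line points at an interpreter that no longer exists - here, the bundled Python inside the old aws install, whose /Volumes/... path has evidently gone away. A tiny self-contained repro with throwaway paths:

```shell
#!/bin/sh
DIR=$(mktemp -d)
# A script whose shebang points at a Python that has been deleted or moved:
cat > "$DIR/aws" <<EOF
#!$DIR/python-that-no-longer-exists
print("never runs")
EOF
chmod 755 "$DIR/aws"

# Launching it fails before the script body ever executes:
if "$DIR/aws" 2>/dev/null; then
  echo "unexpectedly ran"
else
  echo "failed to launch, as expected"
fi
```

    That is why deleting the broken install and letting SecuritySpy re-download a fresh copy (with a correct shebang path) fixes it.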
  • @ben I am trying to set up SecuritySpy on a different computer - more powerful than the one it is replacing and also running the current OS - and my attempts to upload to Backblaze when doing the test etc. are also getting the "Error 2", preventing me from being able to do the computer changeover. The current machine running an older OS still works, however.
  • Here’s what solved this for me:

    I was able to get uploads to work but I'm not sure why the workaround was needed.

    Actions I took:

    I read the AWS CLI documentation and tried (from the Mac terminal) to create a bucket in my S3 account after running 'aws configure'.
    The bucket was created successfully and I was able to upload a file to it - all via the CLI.
    I then went into securityspy and both S3 and B2 now work given their respective credentials.

    I can only guess that the CLI was blocking all uploads (including to B2) because the credentials had not been entered via the installer. Basically, entering the credentials seems to have unblocked the API calls from SecuritySpy.

    AWS CLI Documentation:

  • @ben when I try the ~/.local/lib approach it installs the AWS CLI and then I get the following message:

    Error 1005 Note: AWS CLI version 2, the latest major version of the AWS CLI, is now stable and recommended for general use. For more information, see the AWS CLI version 2 installation instructions at:

    After that, if I install AWS CLI 2, I get the "Error 2" message again.

    This is on the latest SS version which downloaded today and also on the previous version downloaded a few days ago.

    Any ideas?
  • @ben @JimC

    I just followed your lead here - I used the terminal and ran 'aws configure', and now it is working! I hadn't noticed your post slip in earlier. Thank you!
  • Great news
  • OK it looks like Amazon has updated their CLI tools and deprecated support for the old version that SecuritySpy is trying to install and use. So we'll have to update our CLI install for the new version. I'll make sure to get this done for the next SecuritySpy update.

    The "Error 1005" message is just a warning however, and it does indicate that the AWS v1 tool has been successfully installed by SecuritySpy, which will then work for uploads...

    ...unless you manually install the CLI v2 tool, which apparently breaks the CLI v1 installation, resulting in the error 2 in SecuritySpy (which means "file not found" when it's attempting to find the v1 tool).
  • Thanks Ben !!

    Are those errors coming through from the CLI, or are they SecuritySpy error numbers?

    Not playing favorites, but the Backblaze storage has been great: cost-effective and easy to implement.
  • The note "AWS CLI version 2, the latest major version" comes from the CLI tool installer, and the error note "Error 1005" comes from SecuritySpy because it's not expecting that message from the installer. We've now implemented the new install method for the CLI tool v2 in SecuritySpy for the next update.

    Great to hear that the Backblaze service has been working well.
