Before posting, and to avoid disappointment, please read the following:
- This forum is not for 2BrightSparks to provide technical support. It's primarily for users to help other users. Do not expect 2BrightSparks to answer any question posted to this forum.
- If you find a bug in any of our software, please submit a support ticket. It does not matter if you are using our freeware, a beta version or you haven't yet purchased the software. We want to know about any and all bugs so we can fix them as soon as possible. We usually need more information and details from you to reproduce bugs and that is better done via a support ticket and not this forum.
An area of the forum to post ideas for new products or discuss things that are not related to 2BrightSparks products
- Posts: 3
- Joined: Thu Jan 03, 2019 2:17 pm
It would be great to add a Storage Class option for Amazon S3, especially now that Amazon has added the attractively priced "S3 Glacier Deep Archive"
An example of how the competition does it, using the AWS CLI to upload a new object and set the storage class:
$ aws s3 cp new.mov s3://awsroadtrip-videos-raw/ --storage-class DEEP_ARCHIVE
This would place files directly in the required storage class and avoid the need for lifecycle rules, etc.
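For comparison, this is roughly what the lifecycle-rule workaround looks like: without direct upload to a storage class, you have to attach a transition rule to the bucket so objects migrate to Deep Archive after landing in S3 Standard. A sketch using the AWS CLI; the bucket name and rule ID are placeholders.

```shell
# Hypothetical workaround: transition all objects in the bucket to
# DEEP_ARCHIVE via a lifecycle rule (bucket name and rule ID are
# placeholders). Direct upload with --storage-class avoids this step.
aws s3api put-bucket-lifecycle-configuration \
  --bucket my-backup-bucket \
  --lifecycle-configuration '{
    "Rules": [{
      "ID": "to-deep-archive",
      "Status": "Enabled",
      "Filter": {"Prefix": ""},
      "Transitions": [{"Days": 0, "StorageClass": "DEEP_ARCHIVE"}]
    }]
  }'
```

Even with "Days": 0, objects still pass through S3 Standard first and are billed there until the transition runs, which is why writing directly to the target class is preferable.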
- Posts: 1
- Joined: Wed Dec 15, 2021 12:01 pm
For performance-sensitive use cases (those that require millisecond access times) and frequently accessed data, Amazon S3 provides the following storage classes:
S3 Standard — The default storage class. If you don't specify a storage class when you upload an object, Amazon S3 assigns the S3 Standard storage class.
Reduced Redundancy — The Reduced Redundancy Storage (RRS) storage class is designed for noncritical, reproducible data that can be stored with less redundancy than the S3 Standard storage class.
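Since S3 Standard is assigned silently when no class is specified, it can be worth verifying which class an object actually landed in. A sketch using the AWS CLI; the bucket and key names are placeholders.

```shell
# Hypothetical bucket/key: head-object reports the StorageClass field
# for objects stored in any class other than S3 Standard (the field is
# omitted entirely for STANDARD objects).
aws s3api head-object \
  --bucket my-backup-bucket \
  --key new.mov \
  --query StorageClass
```

If the command prints nothing for StorageClass, the object is in S3 Standard; otherwise it prints the class name, e.g. "DEEP_ARCHIVE".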
- Posts: 13
- Joined: Sun Jan 17, 2016 9:08 pm
I have given up waiting on them to add the ability to copy direct to Deep Archive. I need to reduce my cloud storage costs, and other products have supported writing direct to Glacier Deep Archive for some time. While I may pay more for the software product, it will more than pay for itself when my monthly storage costs drop by more than 60%. I love SyncBackPro, but not having a way to write direct to AWS Deep Archive or Azure Archive is forcing me to another solution.
- 2BrightSparks Staff
- Posts: 476
- Joined: Mon Jan 05, 2004 6:51 pm
Hi, as mentioned in another reply, SyncBackPro V9 (and V10) support Glacier Deep Archive and have done since July 2021.