My goal is to fully ditch Google Photos for Immich. I have about 3 TB of photos and videos. Looking for a super simple way of backing up the library to cloud storage in case of a drive failure, without spending a ton.

Ideally, this will require nothing on my part besides copying files into a given folder. And ideally the storage will be encrypted and have basic privacy assurances.

Also, if it matters, my home server is running Debian, but I’d prefer something that runs in Docker so I can more easily check on it remotely.

  • maplesaga@lemmy.world · 39 minutes ago (edited)

    pCloud has lifetime deals with encryption.

    I’ve had it for a very long time and paid once long ago. It works on Linux as well.

  • Daniel Quinn@lemmy.ca · 3 hours ago

    Buy two 4 TB external drives. Copy your photos onto both. Leave one at your mom’s house in a closet. Leave the other in a locker at work or a safety deposit box.

    No monthly fees, no techbro cloud capitalists.

  • zorflieg@lemmy.world · 3 hours ago

    Use restic to back up to a local drive, then sync that with something like rsync to OVHcloud Cloud Archive (not Cold Archive, though that can work too). You can also skip the local copy, but it’s better to have one: syncing weekly gives you a chance to cull the photos you took too many of before they all get uploaded. There are plenty of GUI-based restic interfaces now if you want a quick check or browse. Use healthchecks.io to monitor the cron jobs and alert you if they aren’t running.
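    For the curious, the whole pipeline is roughly this (the repo paths, remote host, and healthchecks.io ping URL below are placeholders):

    ```shell
    #!/bin/sh
    # Hypothetical paths/URLs -- adjust for your setup.
    set -e

    # 1. Back up the Immich library into a local restic repository.
    restic -r /mnt/backup/restic-immich backup /srv/immich/library

    # 2. Drop snapshots outside the retention window.
    restic -r /mnt/backup/restic-immich forget --keep-weekly 8 --keep-monthly 12 --prune

    # 3. Mirror the repository to the remote archive (rsync over SSH here).
    rsync -a --delete /mnt/backup/restic-immich/ user@archive.example.com:immich-repo/

    # 4. Ping healthchecks.io; a missed ping triggers an alert.
    curl -fsS -m 10 https://hc-ping.com/your-uuid-here
    ```

    Run it from cron weekly and the cull window falls out naturally.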

  • 0x0@lemmy.zip · 3 hours ago

    And ideally the storage will be encrypted and have basic privacy assurances.

    Do it locally with cryptomator or similar so the cloud will only see encrypted data.

  • Decronym@lemmy.decronym.xyz (bot) · 37 minutes ago (edited)

    Acronyms, initialisms, abbreviations, contractions, and other phrases which expand to something larger, that I’ve seen in this thread:

    IP   Internet Protocol
    NFS  Network File System, a Unix-based file-sharing protocol known for performance and efficiency
    SMB  Server Message Block, a protocol for file and printer sharing; Windows-native
    SSH  Secure Shell, for remote terminal access
    ZFS  Solaris/Linux filesystem focusing on data integrity

    5 acronyms in this thread; the most compressed thread commented on today has 8 acronyms.

    [Thread #123 for this comm, first seen 1st Mar 2026, 17:30]

  • Saganaki@lemmy.zip · 7 hours ago (edited)

    I’ve had a good experience with pCloud. One-time lifetime fee. Just set the Immich directory in its entirety as a backup folder.

    3TB is a weird place to be with their pricing, though. You can buy 2 TB twice, iirc.

  • qjkxbmwvz@startrek.website · 10 hours ago

    Not the same, but for my Immich backup I have a raspberry pi and an HDD with family (remote).

    Backup is rsync, and a simple script to make ZFS snapshots (retaining X daily, Y weekly). Connected via “raw” WireGuard.

    Setup works well, although it’s never been needed.
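    In case it helps anyone copy the setup, a rough sketch (the pool/dataset names and WireGuard peer address are made up):

    ```shell
    #!/bin/sh
    # Hypothetical names -- tank/immich and 10.0.0.2 are placeholders.
    set -e

    # Push the library to the Pi over the WireGuard tunnel.
    rsync -a --delete /srv/immich/library/ pi@10.0.0.2:/tank/immich/

    # Snapshot after each successful sync.
    ssh pi@10.0.0.2 "sudo zfs snapshot tank/immich@daily-$(date +%F)"

    # Keep the newest 14 daily snapshots, destroy the rest.
    ssh pi@10.0.0.2 "sudo zfs list -t snapshot -o name -s creation tank/immich \
      | grep @daily- | head -n -14 | xargs -r -n1 sudo zfs destroy"
    ```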

    • yo_scottie_oh@lemmy.ml · 6 hours ago

      raspberry pi and an HDD with family (remote)

      Is this the way to go for off-site backups w/ family? In terms of low power draw, uptime, etc.

      • Imaginary_Stand4909@lemmy.blahaj.zone · 7 hours ago (edited)

        Okay, how do you get sanoid & syncoid to run, because I’ve tried, and I’m just too dummy. When it makes a backup, is it literally making a zfs data record/pool/whatever on the other machine? Or is it more like a file? I have a Proxmox running cockpit (SMB & NFS) and the machine is connected to a USB drive bay that has ZFS. My immich is saving pictures to my ZFS drive bay via SMB.

        I’ve tried to do

        syncoid pool_name/data/immich root@cockpit.service.IP.addr:mnt/samba/backups
        

        but I get hit with:

        Long ass error message
        WARNING: ZFS resume feature not available on target machine - sync will continue without resume support.
        INFO: Sending oldest full snapshot Orico2tera4/data/immich@syncoid_nova_2026-01-27:13:38:44-GMT-05:00 to new target filesystem root@192.168.0.246:/mnt/samba/backups (~ 42 KB):
        /dev/zfs and /proc/self/mounts are required.
        Try running 'udevadm trigger' and 'mount -t proc proc /proc' as root.
        44.2KiB 0:00:00 [ 694KiB/s] [===========================================] 103%            
        CRITICAL ERROR:  zfs send  'Orico2tera4/data/immich'@'syncoid_nova_2026-01-27:13:38:44-GMT-05:00' | pv -p -t -e -r -b -s 43632 | lzop  | mbuffer  -q -s 128k -m 16M | ssh      -S /tmp/syncoid-root1921680246-1772385641-845218-1784 root@192.168.0.246 ' mbuffer  -q -s 128k -m 16M | lzop -dfc |  zfs receive  -F '"'"'/mnt/samba/backups'"'"' 2>&1' failed: 256
        

        I’ve tried reading the github docs and some forums but I’m dummy. I just want to have backups that I can encrypt and keep in a cloud for cheap somewhere. Does it literally have to be two different machines (god I’m dumb)? Can I just auto run ZFS snapshots and encrypt then save those to Drive/OneDrive/Whoever?

        • ikidd@lemmy.world · 6 hours ago

          You can do a syncoid sync to another zpool or dataset on the same machine or on a remote host; they behave the same. It replicates the dataset on the other machine, then sends the snapshots taken after that point over via zfs send. You can tell sanoid to prune those snapshots after the send and start new ones for the next send, or just let them accumulate so you have points in time to revert to.

          IIRC, you can send a zfs snapshot to a file, but I can’t recall how to do that, so AFAIK you can’t just send it to a file-based service like OneDrive. You can use a service like zfs.rent: send them a hard drive with your base sync on it (encrypt it), and once they’ve brought it online, you can sync to that. Best to test out your methods with the drive hooked up locally.
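          If it helps with the error above: two likely problems there. The receive target has to be a ZFS dataset name, not a mount path like /mnt/samba/backups, and the “/dev/zfs and /proc/self/mounts are required” line suggests the receiving side can’t reach the kernel ZFS module (common if cockpit runs in an unprivileged container). A sketch of the dataset fix, assuming the remote pool is called backuppool (yours will differ):

          ```shell
          # One-time, on the target host (pool name is a placeholder):
          # create a dataset to receive into.
          ssh root@192.168.0.246 "zfs create -p backuppool/backups/immich"

          # Then point syncoid at the *dataset*, not its mountpoint:
          syncoid Orico2tera4/data/immich \
            root@192.168.0.246:backuppool/backups/immich
          ```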

          I know it’s anathema to Lemmy, but the best help you’ll get is Claude where you can paste the errors in and have it sort it out for you as you troubleshoot. It’s pretty good at shit like that.

  • IsoKiero@sopuli.xyz · 12 hours ago

    I use a Hetzner storage box for similar needs. It’s not encrypted, so you need to manage that yourself, but they support a ton of protocols and the pricing is decent, even though they’re raising prices soon.

    • Jediwan@lemy.lol (OP) · 11 hours ago

      What does the setup look like on your end? Is there like, an app? Also how would I look into managing encryption by myself?

      • Christian@feddit.org · 11 hours ago

        I use Borg Backup to back up specific folders of my hard disk to my Hetzner storage box.

        The backup is triggered by cron/systemd.
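        The job itself is only a few lines; something like this (the storage box URL, repo path, and passphrase file are placeholders):

        ```shell
        #!/bin/sh
        # Placeholders: storage box user/host, repo path, passphrase file.
        export BORG_REPO="ssh://u123456@u123456.your-storagebox.de:23/./backups"
        export BORG_PASSPHRASE="$(cat /root/.borg-passphrase)"

        # Encrypted, deduplicated archive, named by date.
        borg create --compression zstd "::immich-$(date +%F)" /srv/immich/library

        # Thin out old archives.
        borg prune --keep-daily 7 --keep-weekly 4 --keep-monthly 12
        ```

        Hook it to a cron entry or a systemd timer and it runs unattended.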

      • IsoKiero@sopuli.xyz · 11 hours ago

        I’m using Proxmox Backup Server to make copies of full virtual machines; it takes care of encryption and verification of the data, so it’s not exactly the same as your scenario. Borg Backup is commonly recommended, but restic and Déjà Dup are worth checking out too.

  • MalReynolds@slrpnk.net · 7 hours ago

    And ideally the storage will be encrypted and have basic privacy assurances.

    Why would you trust a company to encrypt for you when Cryptomator exists?

    Also, a couple of 4 TB drives for cold backup (one off-site) avoids another subscription.

  • CHOPSTEEQ@lemmy.ml · 11 hours ago

    Backblaze B2, with backup software of your choice pointed at the Immich library. Photos get put into Immich, the backup runs, and the data is encrypted and saved off-site.
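    For example, restic talks to B2 natively; something like this (the bucket name and credentials are placeholders):

    ```shell
    # Hypothetical credentials/bucket -- substitute your own.
    export B2_ACCOUNT_ID="000xxxxxxxxxx"
    export B2_ACCOUNT_KEY="K000yyyyyyyyyy"
    export RESTIC_PASSWORD_FILE=/root/.restic-password

    # One-time: create the encrypted repository in the bucket.
    restic -r b2:my-immich-backup:/ init

    # Recurring: restic encrypts client-side before anything is uploaded.
    restic -r b2:my-immich-backup:/ backup /srv/immich/library
    ```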

    • Jediwan@lemy.lol (OP) · 11 hours ago

      Backup software of your choice pointed at the Immich library

      Any recommendations? Preferably something with a Home Assistant integration or a Docker container with a web UI so I can more easily access it remotely. New to all this.

      • CHOPSTEEQ@lemmy.ml · 11 hours ago

        I use Duplicati and I THINK it has a container option? It is a web UI though.

        I have my Immich library on a network drive and I took the lazy way and have my desktop duplicati just back up the network drive instead of directly on the server 😅

        • Jediwan@lemy.lol (OP) · 10 hours ago

          Looks like it does have a container option! $100/year for Backblaze Computer Backup is above what I was hoping to spend, but it’s unlimited and I’m looking for a set-it-and-forget-it option, so I’ll probably do exactly that. Thank you.

          • vext01@feddit.uk · 10 hours ago

            There’s a plan where you pay some tiny amount per GB. That’s the one to use.

            • Jediwan@lemy.lol (OP) · 7 hours ago

              It’s $6/TB/month, which isn’t bad, but for my 3 TB it’s still more than $100/yr.

              • CHOPSTEEQ@lemmy.ml · 7 hours ago

                Yes, you just taught me I’m paying more than I needed to by using their B2 directly lol. But I have a few different backup buckets configured, and I don’t mind paying a little extra for flexibility vs. paying for each machine I want data backed up on.

              • vext01@feddit.uk · 7 hours ago

                I’d be surprised if you find cheaper, but if you do, please report back.

                Fwiw, BB have been super reliable for me over the past few years I’ve used them.

          • ToffeeIsForClosers@piefed.ca · 10 hours ago

            This is what I did, only I set it up so that my family’s computers back up to my large external drive, and that drive is connected to the computer running the unlimited BB backup. Just to get a little more benefit out of the cost.

      • vext01@feddit.uk · 10 hours ago

        I use restic to back up Immich (and everything else) to B2.

        Be sure to stop the Docker container while you back up to avoid skew.

        Backblaze saved my ass at the end of last year when I had a hardware failure.
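        In script form, roughly (the compose path and repo URL are placeholders):

        ```shell
        #!/bin/sh
        # Assumes a docker compose deployment in /opt/immich -- adjust.
        set -e
        cd /opt/immich

        # Quiesce Immich so the library and DB don't change mid-backup.
        docker compose stop

        # Restart the stack even if the backup step fails.
        trap 'docker compose start' EXIT

        restic -r b2:my-immich-backup:/ backup ./library
        ```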

  • iamthetot@piefed.ca · 8 hours ago

    My current solution is to pay for a few TBs of cloud storage, which is enough for my backup needs. My server has a few scripts I wrote, each on its own cron schedule. In general, a script shuts down the service it’s backing up, tars and compresses the service’s files, spins the service back up, then copies the compressed archive to a central backup location and to a secondary on-site external hard drive. Another script runs every day to prune old backups from the cloud storage and upload the new ones.
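    The daily prune/upload step can be as small as this (the remote name and paths are invented; rclone is just one possible transport):

    ```shell
    #!/bin/sh
    # Placeholders: "cloud:" is an rclone remote, paths are examples.
    set -e

    # Drop cloud archives older than the retention window.
    rclone delete --min-age 30d cloud:backups

    # Upload any archives the other scripts produced since last run.
    rclone copy /srv/backups cloud:backups
    ```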

  • SabMayaHai@lemmy.ml · 10 hours ago

    I’m in a similar predicament, except my backup target is off-site storage reachable via SSH. What are people’s thoughts on Kopia vs. restic for such datasets?

    • capital@lemmy.world · 6 hours ago

      I have backed up and restored several TB of data using restic. It’s been great in my experience.

      I’ve mostly used it to back up to Wasabi, but if I were setting it up now, I’d take a good look at Cloudflare R2.

      If you’ve already got a host, you might run restic’s REST server (rest-server) on it.
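      A minimal sketch of that, if it’s useful (the paths and port are placeholders; --no-auth is only sane on a trusted private network):

      ```shell
      # On the backup host: serve a repository directory over HTTP.
      rest-server --path /srv/restic-repo --listen :8000 --no-auth

      # On the client: init once, then back up against the REST backend.
      restic -r rest:http://backup-host:8000/immich init
      restic -r rest:http://backup-host:8000/immich backup /srv/immich/library
      ```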

  • epyon22@sh.itjust.works · 11 hours ago

    I just recently set up Kopia to back up my Nextcloud data and sync it to IDrive e2. Seems to be a reasonable price for S3-compatible storage.