Hi, fellow self-hosters.

Almost a year ago I experimented with Immich and found, at the time, that it was not up to par with what I was expecting from it. Basically, my use case was slightly different from the user experience Immich was built around.

After all this time I decided to give it another go and I am amazed! It has grown a lot; it now has all the features I needed that were lacking at the time.

So, in just a few hours I set it up and configured my external libraries, backups, storage template and OIDC authentication with Authelia. Everything works.
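
For reference, the external-library part boils down to one extra bind mount in the compose file. A minimal sketch (the host path /mnt/photos is a placeholder for wherever your existing photo collection lives; the rest of the official Immich compose file is omitted):

    services:
      immich-server:
        volumes:
          - ${UPLOAD_LOCATION}:/usr/src/app/upload   # Immich-managed uploads and DB dumps
          - /mnt/photos:/mnt/photos:ro               # external library, mounted read-only

The storage template and the Authelia OIDC client are then configured from the Immich admin UI rather than from the compose file.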

Great kudos to the devs, who are doing amazing work.

I have documented all the steps of the process at the link at the top of this post; I hope it can be useful for someone.

  • @jqubed@lemmy.world

    Your website has a banner that says it uses cookies and that by using it I acknowledge having read the privacy policy, but if I click More Information it takes me to a page the wiki says wasn’t created yet.

    • ShimitarOP

      Never noticed. I don’t do anything with the cookies anyway; it’s just a self-hosted DokuWiki: no ads, no data collection, nothing. I don’t even store logs.

      I might need to write the privacy policy… Will do tomorrow.

      • @starshipwinepineapple@programming.dev

        I’m not familiar with DokuWiki, but here are a few thoughts:

        • a privacy policy is good to have regardless of the rest of my comments
        • your site is creating a cookie named “dokuwiki” for user tracking.
        • the cookie is created regardless of user agreement, rather than waiting for acceptance (implied or explicit). As in: I visit the page, click nothing, and I already have the dokuwiki cookie.
        • I like Umami Analytics as a cookieless Google Analytics alternative. They have a generous free cloud option for hobby users, and Umami is also self-hostable. Then you can get rid of any banner.
      • Atemu

        If you don’t process any user data beyond what is technologically required to make the website work, you don’t need to inform the user about it.

      • @teawrecks@sopuli.xyz

        Afaik the cookie policy on your site is not GDPR compliant, at least as it is currently worded. If all cookies are “technically necessary” for the function of the site, then I think all you need to do is say that. (I think for a wiki it’s acceptable to require clients to allow caching of image data, so your server doesn’t have to pay for more bandwidth.)

    • ShimitarOP

      I have double checked, but I do not have any banner on my wiki at all… Where did you see one? The only cookie is a technical one, used only for your preferences, with no tracking.

  • Sibbo

    I’ve been using Immich for half a year or so now. The only problem is that it doesn’t do chunked uploads, so one large video just never uploaded and I had to use Nextcloud to upload it instead. Otherwise, it’s great.

    • @retro@infosec.pub

      If you’re self-hosting Immich on your local network, I’ve gotten around this by setting the Immich app to use my local IP address while on my home Wi-Fi network.

    • ShimitarOP

      Yes, I encountered this issue as well. Tweaking the NGINX settings seemed to help. Still stupid that a large upload will stall all the others.
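
      For reference, these are the NGINX directives that usually need raising for big uploads through a reverse proxy; the values below are just examples, not gospel:

          client_max_body_size 50000M;   # allow multi-gigabyte video uploads
          proxy_read_timeout   600s;     # don't drop slow transfers mid-upload
          proxy_send_timeout   600s;
          send_timeout         600s;

      They go in the server or location block that proxies to Immich.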

  • @non_burglar@lemmy.world

    I love immich. I just wish for two things:

    • synchronised deletes between client and server
    • the edit tools on mobile to actually work on the photo at hand instead of creating a new photo with new metadata. May as well not have the tools, tbh.
  • Thank you for this. I plan to look at the authentication part more closely, but that’s the part I can’t quite figure out (being an amateur at this stuff but still trying), since I’m nervous about having just a password protecting remote or phone access.

    Authelia, NGINX, there is so much that’s confusing to me, but this might help.

    • @enumerator4829@sh.itjust.works

      I’d recommend setting up a VPN, like Tailscale. The internet is an evil place where everyone hates you and a single tiny mistake will mess you up. Remove risk and enjoy the hobby more.

      Some people will argue that serving stuff on open ports to the public internet is fine. They are not wrong, but don’t do it until you know, understand and accept the risks.

      Remember, risk is ‘probability’ times ‘shitshow’, and other people can, in general, only help you determine the probability.
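
      If it helps, Tailscale’s documented quick start is basically two commands on most Linux hosts (the install URL is theirs; do read the script before piping it to a shell):

          curl -fsSL https://tailscale.com/install.sh | sh   # install the client
          sudo tailscale up                                   # authenticate and join your tailnet

      Key management and NAT traversal are handled for you after that.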

      • @gray@pawb.social

        Good general advice, until you have to explain to your SO that the VPN is required on their smart TV to access Jellyfin.

        • @enumerator4829@sh.itjust.works

          Then you expose your service on your local network as well. You can even do fancy stuff to get DNS and certs working if you want to bother. If the SO lives elsewhere, you get to deploy a Raspberry Pi to project services into their local network.

          • @pirat@lemmy.world

            deploy a Raspberry Pi to project services into their local network

            This piqued my interest!

            What’s a good way of doing it? What services, besides the VPN, would run on that RPi (or some other SBC or other tiny device…) to make Jellyfin accessible on the local network?

            • @enumerator4829@sh.itjust.works

              Well, I’d just go for a reverse proxy, I guess. If you are lazy, just expose it as an IP without any DNS. For working DNS, you can add a public A-record pointing at the local IP of the Pi. For certs, you can’t rely on the default HTTP method that Let’s Encrypt uses; you’ll need to do it via DNS validation or wildcards or something.

              But the thing is, as your traffic is on a VPN, you can fuck up DNS and TLS and Auth all you want without getting pwnd.
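
              To make that concrete, here is a rough sketch of what the Pi could serve (hostname, cert paths and the VPN address are placeholders; the certificate would come from a DNS-based challenge as described above):

                  server {
                      listen 443 ssl;
                      server_name jellyfin.example.com;   # public A-record pointing at the Pi's LAN IP

                      ssl_certificate     /etc/letsencrypt/live/jellyfin.example.com/fullchain.pem;
                      ssl_certificate_key /etc/letsencrypt/live/jellyfin.example.com/privkey.pem;

                      location / {
                          proxy_pass http://100.64.0.10:8096;   # Jellyfin reached over the VPN (placeholder tailnet IP)
                          proxy_http_version 1.1;
                          proxy_set_header Host $host;
                          proxy_set_header Upgrade $http_upgrade;     # websockets for the web client
                          proxy_set_header Connection "upgrade";
                      }
                  }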

        • @AtariDump@lemmy.world

          It’s one thing to expose a single port that’s designed to be exposed to the internet, to allow external access to items you don’t care if the entire internet sees (Jellyfin).

          It’s another thing when you expose a single port to allow access to items you absolutely do care if the entire internet sees (Immich).

          • @enumerator4829@sh.itjust.works

            If you’ve taken care to properly isolate that service, sure. You know, on a dedicated VM in a DMZ, without access to the rest of your network. Personally, I’d avoid using containers as the only barrier, but your risk acceptance is yours to manage.

    • ShimitarOP

      Feel free to ask, even in PM, if I can help. Not a guru myself, but getting a bit more experience over time.

  • @Darkassassin07@lemmy.ca

    I’m curious;

    Which ML CLIP model did you go with, and how accurate are you finding the search results?

    I found the default kinda sub-par, particularly when it came to text in images.

    Switched to “immich-app/XLM-Roberta-Large-Vit-B-16Plus” and it’s improved a bit; but I still find the search somewhat lacking.

    • @waitmarks@lemmy.world

      The best one I have found is one of the newer ones that was added a few months ago: ViT-B-16-SigLIP__webli.

      Really impressed with the accuracy, even with multi-word searches like “espresso machine”.

      • @Darkassassin07@lemmy.ca

        How well does it do with text in images?

        I often find searching for things like ‘horse’ will do a decent job bringing up images of horses, but will often miss images containing the word ‘horse’.

  • @happydoors@lemm.ee

    My only issue with it is that on my iPhone, the app constantly freezes and says I have 3 photos left to upload. It’s almost certain to freeze for a few minutes, and the upload stalls as well. This behavior made it take a long time to back up my library, and it makes it a pain in the ass to share photos quickly with people. Popping into the web UI has none of these issues (just no uploading of my photos). I still quite love the app.

    • ShimitarOP

      I back up with restic the database dumps made by Immich (not the live database itself) and the Library/library folder, which contains the actual images and videos.
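
      The restic invocation itself is a one-liner; the repository and paths below are placeholders for my real ones:

          restic -r /mnt/backupdisk/immich-repo backup \
              /path/to/immich/backups \
              /path/to/immich/Library/library

      Run it from cron (or a systemd timer) and restic takes care of deduplication and encryption.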

    • @Lem453@lemmy.ca

      I used to use a docker container that makes DB dumps of the database and drops them into the same persistent storage folder the main application uses. I use this for everything in Docker that has a DB.

      Immich has recently integrated this into the app itself, so it’s no longer needed.

      All my docker persistent data is in a top level folder called dockerdata.

      In that I have sub folders like immich which get mounted as volumes in the docker apps.

      So now I have only 1 folder to back up for everything. I use ZFS snapshots to back up locally (zfs-auto-snapshot) and borgmatic for remote backups (BorgBase).

      All my Docker compose files are in git.

      I can restore the entire server by restoring 1 data folder and 1 compose file per stack.
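
      The borgmatic side is only a handful of config lines; roughly like this (repo URL and retention numbers are placeholders, and the exact layout depends on your borgmatic version):

          source_directories:
              - /dockerdata
          repositories:
              - path: ssh://xxxx@xxxx.repo.borgbase.com/./repo
                label: borgbase
          keep_daily: 7
          keep_weekly: 4
          keep_monthly: 6

      Borgmatic then handles pruning and consistency checks whenever it runs.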

      • Ulrich

        I don’t understand how that’s helpful. If something is corrupted or my house burns down, a local backup is going to go with it. That’s why I asked for external backups.

        • ShimitarOP

          I have three tiers of backup. Never heard of the 3-2-1 rule?

          3 backups, 2 locations, 1 offsite.

          I back up once to an external disk connected to the server, a second time to another disk connected to an OpenWrt router located in the patio, and a third copy is uploaded to my VPS in the cloud.

          Not all three are symmetrical, due to disk sizes, but critical data is always backed up on all three. Daily backups.

          Restic does deduplication and encryption too, so actual disk usage is really minimal and everything is kept safe.

        • @bdonvr@thelemmy.club

          If anyone’s interested, here’s my Immich backup script. You set up rclone to use an S3 storage service like Backblaze, which is quite cheap. I also use a crypt remote, which means rclone will encrypt and decrypt all files to/from the server (see rclone’s docs for S3 configuration and crypt setup).

          Then set this up as a cron job; a sample crontab line is below the script. With the “BACKUP_DIR” option, when you delete a photo it gets moved to the “deleted” folder. You can go into your S3 provider’s lifecycle settings and have these deleted after a number of days (I do 10 days), or you can skip that and they’ll be gone forever.

          #!/bin/bash
          # Sync the Immich library to an encrypted rclone remote.
          # Remote copies of files deleted or changed locally are moved to a dated "deleted" folder instead of being removed.
          SRC_PATH="/path/to/immich/library"
          DEST_REMOTE="b2crypt:immich-photos/backup"
          BACKUP_DIR="b2crypt:immich-photos/deleted"
          RCLONE_OPTIONS="--copy-links --update --delete-during --backup-dir=$BACKUP_DIR --suffix `TZ='America/New_York' date +%Y-%m-%d`.bak --verbose"
          rclone sync $SRC_PATH $DEST_REMOTE $RCLONE_OPTIONS
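
          A crontab entry for it might look like this (the script path and schedule are whatever suits you):

              0 3 * * * /usr/local/bin/immich-rclone-backup.sh >> /var/log/immich-backup.log 2>&1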
          
          
          • Ulrich

            Yeah, I don’t know what any of these words mean. I just want to click “export” and back all the data up to a flash drive. Is that too much to ask?

            • ShimitarOP

              I think it is. It doesn’t take much to understand which folders need to be backed up. They are also pretty clear on the Immich website about how to back up the database itself. No, just an “export” wouldn’t be good enough, since the files themselves do not include the metadata.

              • Ulrich

                I think it is.

                Why is that?

                They are also pretty clear on the immich website on how to backup the database itself

                Yeah I’m pretty tired of hearing things are “pretty clear” or “not that complicated” and then being directed to an absolute word salad of technical terms no one without a computer science degree would understand.

                No, just an “export” wouldn’t be good enough since the files themselves do not include the metadata.

                They could…add them?

              • Ulrich

                There’s no way to do that for your entire library. Also I assume that would not retain the Immich-specific metadata like the ML object tags and the “people” tagged in the photos.

                • @bdonvr@thelemmy.club

                  You should have a backup solution for your server that should cover this, without that you should probably stick with managed photo backup services.

              • Ulrich

                Reading the comment I replied to, it appears to be much much more complicated. And I don’t understand how anyone can claim otherwise.

                • @catloaf@lemm.ee

                  Key word is “appears”. Choose your source and destination, run rclone. That’s it. No harder than going to the page, clicking export, picking a folder, save. It’s really not hard at all, give it a try.

                • ShimitarOP

                  You need to back up exactly two folders, which I have also pointed out in another comment and in the wiki.

                  How you back those folders up is up to you.

  • @nucleative@lemmy.world

    Haven’t checked in a while but is there any hope for cloud storage of the image library yet? I’m kind of holding out for S3 support because I don’t want to manage multiple terabytes locally.

    • @sandwichsaregood@lemmy.world

      I don’t think Immich supports this natively, but you could mount an S3 store with s3fs-fuse and put the library on there without much trouble; see the sketch below. There are many other options too, like WebDAV.
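
      Roughly, the s3fs-fuse route looks like this (bucket name, mount point and endpoint are placeholders; the credentials file holds your access key and secret):

          echo "ACCESS_KEY_ID:SECRET_ACCESS_KEY" > ~/.passwd-s3fs
          chmod 600 ~/.passwd-s3fs
          s3fs immich-library /mnt/immich-library \
              -o passwd_file=~/.passwd-s3fs \
              -o url=https://s3.example-provider.com   # endpoint, needed for non-AWS providers

      You would then point Immich’s upload location (or an external library) at the mount.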

  • kr0n

    I have a problem generating thumbnails for photos taken from summer 2023 until now (using my iPhone 12 Pro). It’s like a format problem or something. I don’t know ¯\_(ツ)_/¯

    • @ra1d3n@lemm.ee

      You might want to submit a bug report. Their pace of development is insane for OSS.