Lem453

joined 2 years ago
[–] Lem453@lemmy.ca 1 points 2 days ago

On my Unraid router, this is called DNS override.

immich.example.ca resolves to a local IP when queried from within the network. For every DNS entry on Cloudflare for my domain, I have an equivalent one on my router and Pi-hole that points to the local address.
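On Pi-hole specifically, the same effect can be had with local DNS records; a minimal sketch, assuming Pi-hole v5+ and a hypothetical internal address (entries can also be managed in the web UI under Local DNS > DNS Records):

```
# /etc/pihole/custom.list — one "IP hostname" pair per line
192.168.1.50 immich.example.ca
192.168.1.50 linkwarden.example.ca
```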

[–] Lem453@lemmy.ca 2 points 3 days ago

That's an excellent feature, thanks for the suggestion!

[–] Lem453@lemmy.ca 1 points 4 days ago

This looks fantastic! Thanks!

[–] Lem453@lemmy.ca 3 points 4 days ago (1 children)

No but it seems perfect with the chores section. Thanks!

[–] Lem453@lemmy.ca 2 points 4 days ago

This looks great, I had no idea it had household management as well. Will give this a try.

[–] Lem453@lemmy.ca 4 points 4 days ago

A web app is probably best. That makes it easy to share with other members of the family so anyone can check off a task once it's done. Having tasks pile up when they are not done is also very useful, so you can see the backlog. It's close to a calendar app, but a calendar isn't quite optimized for something like this.

Ideally, tasks could have a description section that explains how to do them; for example, flushing the water heater could have the exact steps written out for the specific heater tank you have, avoiding the need to look it up every year.

There are several good web apps like this already, but they are all paid and locked into their platforms.

 

Any ideas for a self-hosted home maintenance reminder system? Essentially something that will have reminders for things that recur regularly, i.e. yearly, monthly, etc. Ideally it would have some way of checking a task off to show it was completed, and then it would recur at the preset time next month/year.

[–] Lem453@lemmy.ca 2 points 1 week ago (1 children)

Not sure about encrypted storage on the SD card, never heard of that.

However, Reolink cameras can be set up with the app once, then connected to a VLAN that has no internet access and connects only to Home Assistant. Then you access the camera through Home Assistant only.

This requires technical knowledge to set up, but Reolink cameras work well in this configuration.

[–] Lem453@lemmy.ca 3 points 1 week ago (1 children)

This is blasphemous against the holy temple of Bcachefs.

[–] Lem453@lemmy.ca 1 points 3 weeks ago* (last edited 3 weeks ago)

Authentik handles SSO for all my apps like Immich, Linkwarden, ownCloud, etc. I use OpenID where available, but some web apps are done via forward proxy auth. Jellyfin uses LDAP via Authentik, which technically isn't SSO.

Other than me, no one else mounts Samba shares directly. All personal files are synced to the server and other devices with ownCloud (OCIS).

[–] Lem453@lemmy.ca 2 points 3 weeks ago* (last edited 3 weeks ago)

Yes, it's config file only, but if you get the File editor app, it's quite easy to just copy and paste a few lines into the editor.

Once it's setup it never changes.

[–] Lem453@lemmy.ca 1 points 3 weeks ago (3 children)

I've not looked for an LDAP solution, but stuff like this is why I went with Authentik over other options. Because Authentik has LDAP built in, I can use it when needed (Jellyfin) but use OpenID for other apps (which is superior in almost every way for home lab use).

 

This has been a hugely requested feature for many years and a huge hole in my entire self-hosted ecosystem. Every self-hosted app I have connects to my Authentik system for user management... except Home Assistant. Arguably it's one of the apps I need it for most, since the whole family uses it with their own accounts.

The devs have been resistant for some reason.

There is now a community integration that allows user management for HA to be done via any OpenID backend (Authentik, Keycloak, etc.).

I've been running it for a few days and it works perfectly. Very easy to set up if you already have a working Authentik instance and know how to use it with other apps like Immich.
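For reference, the community integration is configured through configuration.yaml; a minimal sketch, assuming a hypothetical Authentik discovery URL (the exact keys depend on the integration's docs, so double-check there):

```yaml
auth_oidc:
  client_id: homeassistant
  discovery_url: https://auth.example.com/application/o/home-assistant/.well-known/openid-configuration
```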

 

New Android TV client (Nvidia Shield and other Android devices) just dropped. Lots of nice improvements.

0
submitted 1 year ago* (last edited 1 year ago) by Lem453@lemmy.ca to c/selfhosted@lemmy.world
 

I'm trying to set up ownCloud with single sign-on using Authentik. I have it working for normal users. There is a feature that allows automatic role assignment so that admin users in Authentik become admin users in ownCloud.

This is described here: https://doc.owncloud.com/ocis/next/deployment/services/s-list/proxy.html#automatic-role-assignments.

In this document, they describe having attributes like

- role_name: admin
  claim_value: ocisAdmin

The problem I have is I don't know how to input this information into an Authentik user. As a result, owncloud is giving me this error:

ERR Error mapping role names to role ids error="no roles in user claims" line=github.com/owncloud/ocis/v2/services/proxy/pkg/userroles/oidcroles.go:84 request-id=5a6d0e69-ad1b-4479-b2d9-30d4b4afb8f2 service=proxy userid=05b283cd-606c-424f-ae67-5d0016f2152c

Any authentik experts out there?

I tried putting this under the Attributes section of the user profile in Authentik:

role_name: admin
claim_value: ocisAdmin

It doesn't work, and the attributes field won't let me format the YAML like the documentation, where claim_value is nested under role_name.
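For anyone else hitting this: as far as I can tell, per-user Attributes in Authentik are not automatically exposed as token claims; the usual mechanism is a custom Scope Mapping whose Python expression returns the claim. A sketch, assuming a claim named roles and a hypothetical admins group (the claim name oCIS expects may differ, so check the proxy's role_assignment config):

```python
# Authentik Scope Mapping expression (evaluated by Authentik, not standalone Python)
# Hypothetical: emit a "roles" claim based on group membership
if user.ak_groups.filter(name="admins").exists():
    return {"roles": ["ocisAdmin"]}
return {"roles": ["ocisUser"]}
```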

 

Technically this isn't actually a Seafile issue; however, the upload client really should have the ability to run checksums comparing the original file to the file being synced to the server (or another device).

I run Docker in a VM hosted by Proxmox. Proxmox manages a ZFS array which contains the primary storage the VM uses. Instead of making the VM disk 1TB+, the VM disk is relatively small (64GB) since it only holds the OS, and the docker containers mount a folder on the ZFS array itself, which is several TBs.

This has all been going really well with no issues, until yesterday when I tried to access some old photos and the photos would only load half way. The top part would be there but the bottom half would be grey/missing.

This seemed to be randomly present on numerous photos; some were normal and others had missing sections. Digging deeper, some files were also corrupt and would not open at all (PDFs, etc.).

Badness alert....

All my backups come from the server. If the server data had been corrupt for a long time, then all the backups would be corrupt as well. All the files on the Seafile server were originally synced from my desktop, so when I open a file locally on the desktop it works fine; only when I try to open the file on Seafile does it fail. Also, not all the files were failing, only some. Some old, some new. Even the file sizes didn't consistently predict whether a file would work or not.

It's now at the point where I can take a photo from my desktop, drag it into a Seafile library via the browser, and it shows a successful upload, but then previewing the file won't work, and downloading that very same file back again gives a file of about 44kb regardless of the original file size.

Google/DDG... can't find anyone with the same issue. Very bad.

Finally I notice an error in mariadb: "memory pressure can't write to disk" (paraphrased).

Ok, that's odd. The RAM was fine, which is what I assumed the error referred to. Disk space can't be the issue since the ZFS array is only 25% full, and both MariaDB and Seafile only have volumes on the ZFS array. There are no other volumes... or are there?

Finally, checking the volumes in Portainer, Seafile only has the two expected, data and database. But then I see hundreds of unused volumes.

A quick google reveals docker volume prune, which deleted many GBs worth of old, unused volumes.

By this point I had already created and recreated the Seafile docker containers a hundred times with test data and simplified the docker compose as much as possible, but after the prune it started working right away. MariaDB starts working, and I can now copy a file from the web interface or the client and it works correctly.

Now I go through the process of setting up my original docker compose with all the extras that I had setup, remake my user account (luckily its just me right now), setup the sync client and then start copying the data from my desktop to my server.

I've got to say, this was scary as shit. My setup uploads files from desktop, laptop, phone, etc. to the server via Seafile; from there, Borg backup takes incremental backups of the data and sends them remotely. The second I realized that the local data on my computer was fine but the server data was unreliable, I immediately knew that even my backups were now unreliable.

IMHO this is a massive problem. Seafile will happily 'upload' a file and report success, but then trying to re-download the file results in an error since it doesn't exist.

Things that really should be present to avoid this:

  1. The client should have the option to run a quick checksum on each file after upload and compare the original to the uploaded copy to ensure data consistency. There should probably be an option to run this afterwards as a check too, outputting a list of inconsistent files.
  2. The default docker compose should run with health checks on MariaDB, so that when it starts throwing errors while the interface still runs, someone gets alerted.
  3. There needs to be some kind of reminder to check in on unused docker volumes.
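The client-side checksum idea is straightforward to sketch; a minimal example in Python, assuming you can read both the original file and a freshly re-downloaded copy:

```python
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Stream a file in 1 MiB chunks and return its SHA-256 hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_upload(original_path, redownloaded_path):
    """True if the re-downloaded copy is byte-identical to the original."""
    return sha256_of(original_path) == sha256_of(redownloaded_path)
```

A check like this would have caught my truncated 44kb files immediately, since any silent truncation changes the digest.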
 

Looking for a self-hosted YouTube front end with an automatic downloader. You would subscribe to a channel, for example, and it would automatically download all existing videos and new uploads.

Jellyfin might be able to handle the front-end part, but I'm not sure about automatic downloads and proper file naming and metadata.

 

This should eventually make its way into Jellyfin. Eager to see the performance improvements.

 

Beautiful stats for Jellyfin. I just set it up in docker compose yesterday. Love it!

 

I'm wondering if I can get a device that bridges Z-Wave over Ethernet/WiFi and connect that to my Home Assistant setup.

Basically, I have a Home Assistant setup in my house. I want to add a few simple things at my parents' place, but I want it all to be on the same HA instance.

On the router at my parents' place, I can install WireGuard to connect it to my LAN. So now my parents' network is effectively part of my LAN.

I'm looking for a device that can talk to Z-Wave devices and then send that info over the LAN to my Home Assistant. Does such a thing exist? Thanks.
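One approach that may fit: Z-Wave JS UI can run at the remote site with a USB Z-Wave stick and expose a websocket server that Home Assistant's Z-Wave JS integration connects to over the LAN. A docker compose sketch, assuming a hypothetical device path and the default ports:

```yaml
services:
  zwave-js-ui:
    image: zwavejs/zwave-js-ui:latest
    restart: unless-stopped
    devices:
      # hypothetical path; use your stick's /dev/serial/by-id/... entry
      - /dev/serial/by-id/usb-zwave-stick:/dev/zwave
    ports:
      - "3000:3000"  # Z-Wave JS websocket server; point HA at ws://<remote-ip>:3000
      - "8091:8091"  # web UI
```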

 

By local control, I mean: if the Z-Wave hub is down, will the switch still work as a dumb switch and turn the lights on/off?

This is the product I would like to get, but I can't find out whether it allows 'dumb switch' operation. Does anyone have experience with these? https://byjasco.com/ultrapro-z-wave-in-wall-smart-switch-with-quickfit-and-simplewire-white

Thanks!
