
I run a 48 TB ZFS pool (RAID-Z2; 24 TB usable), and an offsite, offline 48 TB ZFS pool (mirror).

My primary server is Proxmox, where I have 8 unprivileged LXC containers with Docker nested inside (since ZFS 2.2.0, Docker on unprivileged ZFS works without detours).

My backup is ZFS send, plus Borgmatic as a fallback until I trust the ZFS-send approach I recently added (I usually give new features a "migration phase" that may last 1-2 years).
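For context, an incremental zfs-send replication cycle boils down to a couple of commands. The pool, dataset and snapshot names below are made up for illustration:

```shell
# Assumed layout: local pool "tank", remote pool "backup", offsite host "offsite".
# Take a recursive snapshot of the dataset tree.
zfs snapshot -r tank/data@2024-06-02

# First replication ever: send the full stream.
zfs send -R tank/data@2024-05-26 | ssh offsite zfs receive -u backup/data

# Every later run: send only the blocks changed between the last common
# snapshot and the new one.
zfs send -R -i tank/data@2024-05-26 tank/data@2024-06-02 \
  | ssh offsite zfs receive -u backup/data
```

`-u` keeps the received datasets unmounted on the backup box, which suits a machine that only wakes up to receive.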

I run GitLab, Nextcloud, Funkwhale, InfluxDB, Grafana, Home Assistant, Mailcow-Dockerized (as an email management and collection solution) and other things. None of this is public; it all sits on a local VLAN and VPN. The good thing with Docker + Watchtower is that I have very little administration work (<30 minutes per month), most of which is testing new features. It has worked reliably for the past 5 years.
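The Watchtower part is a single container that watches the Docker socket and auto-updates the others. A minimal invocation could look like this (the schedule here is my assumption, not the poster's):

```shell
# Watchtower polls the registry for new versions of running containers'
# images, recreates containers with the same options, and (--cleanup)
# prunes the old images afterwards.
docker run -d --name watchtower \
  -v /var/run/docker.sock:/var/run/docker.sock \
  containrrr/watchtower --cleanup --schedule "0 0 4 * * *"   # daily at 04:00
```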

For connecting households (my brother's family etc.), I use OPNsense and pfSense boxes.

My server is powered by a 90 kWp solar array - I don't have a (house) battery yet, but that will be added in 1-2 years. Current energy consumption is about 250 watts - but I also run an old Xeon from 2013.

The main cost (after the initial investment) is hard drives and (of course) learning time. It pays off in other areas, too. I just got a job that I think I only got because I could talk about my self-hosting setup.

The biggest convenience is that I can sync all devices and all data (e.g. Nextcloud auto-upload from phones, DAVx5 for calendar/contacts) to a single, secure and private place that is then automatically backed up. I don't know the correct word, but maybe "liberating" describes the feeling best.

All of this does not require super-secret knowledge, just some motivation (and you should not be completely broke). I started in 2017 with the plan to learn Linux and get away from FAAN(M)G and I am happy I started this endeavor back then.



That all sounds great, but what we desperately need are instructions on how to do things like this for the less technical, plus easy maintenance… I think those are the limiting factors for widespread adoption.


I agree. We need more and better instructions and easier-to-set-up tools - all of this got a big boost with Docker (Compose). On the other hand, we need to pair the less technical with the power users and help each other. I am responsible for about 8-10 people. This frees them from cloud providers and helps me build better and more robust solutions.


You’re doing great work then, well done!

I run a Synology NAS with Docker Compose for some services and it works well. With a young family, though, when something does go wrong it can wipe out a few days of my (very little) spare time.


Yeah, I have a family myself. This only worked for me because I have the habit of going to bed early (around 8 PM) and getting up at 4-5 AM. This gives (or: gave) me about 1-2 hours before breakfast to learn Linux and DevOps.


I had a plan once for a website where we'd all post our self-hosting architectures as blueprints of known-good configurations, plus some instructions to build them… another project dropped for lack of spare time.


Having used FreeNAS, TrueNAS Core and TrueNAS Scale, honestly it's not that complex and doesn't require any technical knowledge to get a basic system set up with backups to a cloud location of choice. The docs are very straightforward; apart from a few specific terms you might need to Google, the help tooltips generally do the trick, and everything a regular user might need is accessible through a trivial web UI.


There definitely is a market for someone who wants to turn a full-on at-home cloud like that into a product.

Just a fancy box, you give it an internet connection and power and then it Just Works. 100% local.


I spent a good number of years thinking this was the right solution and heading towards it. I had drawn up designs for my house, the solar installation and half a rack of kit. This was to run a domain for my immediate family and all services, including media storage and streaming. Solar was going to be 100% grid independent, i.e. not some crappy provider-based install. The infra was all targeted at Debian.

Then I got two sick parents and a divorce on my hands.

Then I realised exactly how much time, energy, money and headspace managing all this crap really took out of my life, because I had to spend it on other stuff instead - not out of choice but necessity. The absolute killer wasn't really the cost and complexity but the time to curate my data: dealing with it is incredibly expensive if you have a lot of it. One of the worst sub-parts was the question from the kids - "I want this music" - and then going to get it, dealing with it, and distributing it to everyone. This would take hours a month to manage.

I came out of the other side of this somewhat unscathed thankfully and with a much larger pile of cash than I thought I was going to.

So a year long project took place. I combed through several TB of data, from myself and my parents and did a huge minimisation effort. I deleted all the fuzzy ass photos, pictures of stuff no one cared about, all the films I'd already watched, deleted all the crap music, deleted scanned paperwork dating back to 1989 I'd never need, everything. I also did a physical clean out at that point. Then I scanned all the family photos as a backup.

The end game was that everything everyone in my family ever did, or found value in, could be crammed into 200 GB of iCloud with 40 GB left over. Apple got to handle the music too. So that's how it rolls. I have no infrastructure other than a laptop, router and offline backup drive now, no huge energy dependency (my energy usage is stupidly low), and my domain sits on AWS/CloudFront and Apple iCloud+.

There's a dependency in one form or another, so I'll pick the one that makes my life easier. Feeling liberated, for me, is not having the weight of this on my shoulders personally, and not leaving that weight on my kids' shoulders one day to pick through and deal with. I've been through too many dead people's stuff to want to do that to someone else now.

If I want to learn something, I'll do a formal higher education thing, because I have the time to do it now!

Edit: notably there was an advantage here at two points. Firstly, my internet connection was down for 2 days when someone cut the cable; I did not lose any services because I could tether over 4G. Secondly, I moved house and had several months of hell trying to get an internet connection sorted out. Again, 4G was fine. I was pulling 100 GB a month over that 4G connection while working from home, and there was no noticeable material difference for me.


I know what you mean. I cannot really comment (I have a sick parent now, too, unfortunately).

Just one answer to this:

> One of the worst sub-parts was the question from the kids: I want this music and then going to get it and deal with it and then distribute it to everyone. This would take hours a month to manage.

There are a million ways to set this up, I know. This is how I did it: people have their "Music" folder, which is synced to the Nextcloud server. This folder is then bind-mounted (read-only) into the Funkwhale Docker container inside an LXC container. On file updates, Funkwhale scans the files and adds any new music to people's libraries, so they can immediately listen to it. No human intervention is required - zero admin work beyond the initial setup.
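A read-only bind mount like that is one `-v` flag with an `:ro` suffix. A sketch with made-up host paths (the in-container path follows the Funkwhale all-in-one image's documented music directory, but verify against your version):

```shell
# Host path (illustrative): the user's Nextcloud-synced music folder.
# ":ro" makes the mount read-only inside the container, so Funkwhale
# can scan the files but never modify the originals.
docker run -d --name funkwhale \
  -v /tank/nextcloud/alice/files/Music:/srv/funkwhale/data/music:ro \
  -p 5000:80 \
  funkwhale/all-in-one
```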

I followed this design principle throughout all my services, and it is pretty much: if I have no time, leave it running and it will run 12+ months without requiring work.

I am at a stage now where this really _saves_ me time. All the document organization, paperless office etc. free up my spare time so I can play with my kid, help my parents and so on.

I simply lost all trust in cloud providers when Amazon once said I could back up my photos for free (5 TB). The upload took one month. 12 months later, they said they would deprecate the service. I had to pull everything down again. Never again.


Sorry to hear about the sick parent.

Regarding the music, it's mostly obtaining it in the first place where the issues appear. It's difficult to get (buy / warez / rip), and I have people with little to no technical ability or interest using it. Ergo Apple Music worked nicely.

Wait until it all breaks and you're up all night because the kids are complaining that they can't get to their music (been there). I can point at Apple now and say it's their fault ;-)

As for trusting the cloud, you don't have to. Just have an exit plan. For me it's a backup drive, and Spotify should Apple go to shit.


Could you elaborate on what you mean by off-site and offline mirror? How do you go about maintaining that?


Good question. This is a bit tricky, and you may argue about the term "offline". My offsite ZFS array sits at a remote location (my parents' house, 100 km away - just enough for a nuclear strike). It is automatically turned on by a Shelly Plug S on a timer, once per week. It then connects via VPN to my main site, runs some checks for ransomware (like a changed file count), pulls ZFS updates and then shuts down again.
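The wake-pull-shutdown cycle can be sketched as a short script run once at boot. Everything here - the VPN profile name, hosts, dataset names, and the check script - is a hypothetical stand-in, not the poster's actual setup:

```shell
#!/bin/sh
# Runs once at boot on the offsite box, after the Shelly plug powers it on.
set -e
wg-quick up offsite-vpn                   # WireGuard assumed; any VPN works

# Hypothetical sanity check; abort the pull if anything looks suspicious.
if ! /usr/local/bin/ransomware-check; then
    wg-quick down offsite-vpn
    poweroff
    exit 1
fi

# Pull the newest snapshots from the main site (placeholder snapshot names).
ssh main zfs send -R -i tank/data@last tank/data@now | zfs receive -u backup/data

wg-quick down offsite-vpn
poweroff                                  # stay off until the plug's next cycle
```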


If you’re sending zfs snapshots (and not deleting them), doesn’t that give you protection against ransomware? If so, a high number of files changed might be exactly when you want to snapshot and replicate, to minimize the worst case outcome (that the ransomware gets root and zfs destroys your local copy).
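Snapshots on the receiving side can even be pinned so a plain `zfs destroy` is refused: `zfs hold` blocks destruction until the hold is released. Names below are illustrative:

```shell
# Place a hold tagged "keep" on the received snapshot.
zfs hold keep backup/data@2024-06-02

# Destroying the snapshot is refused while the hold exists.
zfs destroy backup/data@2024-06-02

# Release the hold when the snapshot ages out of the retention window.
zfs release keep backup/data@2024-06-02
```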


Yes, but if your hypervisor is compromised, ZFS snapshots could theoretically also be deleted/modified/etc. - I wanted to cover this scenario. This is still work in progress, and my script currently just aborts early in case of anything suspicious. Also, the file-count check only applies to my Borgmatic solution; ZFS only works at the snapshot/dataset level.
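A file-count check of that kind is a few lines of shell. This demo uses a temp directory and invented threshold/state-file names to stand in for the real dataset, since the poster didn't share the script:

```shell
# Sketch of a file-count sanity check before pulling backups. A sudden
# large change in file count can indicate mass encryption by ransomware.
DATA_DIR=$(mktemp -d)        # demo: stands in for the real dataset mountpoint
STATE_FILE=$(mktemp)         # stores the count seen on the previous run
THRESHOLD=1000               # abort if more files than this changed

touch "$DATA_DIR/a.txt" "$DATA_DIR/b.txt" "$DATA_DIR/c.txt"
echo 3 > "$STATE_FILE"       # pretend the previous run also saw 3 files

count=$(find "$DATA_DIR" -type f | wc -l)
prev=$(cat "$STATE_FILE")
delta=$((count - prev))
if [ "$delta" -lt 0 ]; then delta=$((0 - delta)); fi

if [ "$delta" -gt "$THRESHOLD" ]; then
    echo "suspicious: $delta files changed, aborting pull"
    status=abort
else
    echo "$count" > "$STATE_FILE"   # remember the new count for next time
    status=ok
fi
```

A real deployment would point `DATA_DIR` at the backup source and keep `STATE_FILE` somewhere persistent across boots.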



