
To be honest, I've seen commercial 7' racks in data centres and computer rooms that were worse than the worst ones here!

I was once tasked with rejigging 3 racks in a remote computer room. The racks were arranged in an "L" pattern due to the constraints of the room. None of the doors - front or back - could close because of cables running between servers and switches. Some cables actually ran diagonally across the L shape. A lot of cables were jammed between the mounting rails, and 3 metre cables were used where a 50 cm one would have done, or 2 metre ones where 3 metres or more were needed. Almost nothing was labelled, and where it was, it was wrong. The cable colour coding scheme was ignored, and nothing was recorded.

There were servers racked on a slant - TWO nuts off on one side - and even mounted back-to-front. Others were literally sat directly on top of other kit, not bolted in at all. RAID arrays for critical servers were mounted in adjacent racks, with the cables running around the opened rear rack door.

There were also a number of suspicious, unmarked servers of odd brands hooked into the main switch that nobody could identify. One turned out to be an abandoned Nagios server, but another was never identified; nothing broke and nobody screamed when I turned it off.

Just about all the horrible things you have seen or heard about were in that room. It took weeks to sort it out.

[–] ikidd@lemmy.world 8 points 2 days ago* (last edited 2 days ago)

I had a car dealership where I was supposed to add new servers into a new rack and recable everything. I walked into a room with about half a dozen servers balanced on a pile of cat5, BNC, and serial cables about 4' high. I spent 3 weeks untangling cables, removing dead runs, decommissioning the serial and token ring networks, and re-terminating or re-running ethernet that didn't test well.

Pretty much everything was done by scream test because nothing was marked. I found an ancient server - still used occasionally for manuals - that had been drywalled into an old closet in the shop; I tracked it down after I disconnected a line and one of the mechanics asked where his manuals had gotten to. For who knows how many years, that server had been shut down every night when they turned off the shop lights and booted back up every morning when someone came in and turned them on again.

I eventually got to the point where I could set up my rack and SANs/servers, patch everything over from the network rack I'd mounted on the wall, and get the guys going on the workstations.

We had a series of meetings after that with the sales team about getting a technical appraisal before we sold our equipment into dealerships. And every dealership I worked in after that was pretty similar.

Honestly, it was an amazingly satisfying feeling to look at that room after I was done. I still get a little shiver thinking about it 20 years later.

[–] linuxguy@piefed.ca 7 points 2 days ago
[–] ApocolypticGopher@infosec.pub 4 points 2 days ago

My favorite was a switch just sitting on the panels of a drop ceiling. That it was up there wasn't documented anywhere, either. I spent an hour or so hunting around a department until somebody mentioned, "there's that tech thing in the ceiling".

[–] Alvaro@lemmy.blahaj.zone 6 points 2 days ago (1 children)

My worst rack experience was at an office I did IT in: the networking closet had 3 racks and like 20 switches/routers, each one with almost all of its ports in use.

There. Was. No. Cable. Management. None...

Every time someone changed something over the years, they would grab the nearest cable and connect it however they wanted.

You literally had to crawl between cables and follow them with your hand from one port to the other, as there was no other way to find what went where.

I'm talking 10-15 minutes to move a cable between two switches on the same rack.

I once spent 2 hours mapping out where 1(!) endpoint was connected, because the cable had died (they were all basically trash) and there was no documentation, so I had to use a line tracer (tone generator).

[–] SteveTech@aussie.zone 1 points 1 day ago* (last edited 1 day ago) (1 children)

Usually if they're all managed switches, you can look at the MAC table and map things out that way.
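
For example, here's a rough sketch of pulling the forwarding table over SNMP. Everything in it is an assumption you'd adjust for your gear: it presumes net-snmp's `snmpwalk` is installed, SNMPv2c is enabled on the switch, the standard BRIDGE-MIB is exposed, and it uses a placeholder address and community string:

```python
#!/usr/bin/env python3
"""Rough sketch: dump a switch's learned MAC table via SNMP (BRIDGE-MIB).

Assumptions (placeholders, adjust for your gear): net-snmp's `snmpwalk`
is on the PATH, SNMPv2c is enabled, and the read community is "public".
"""
import subprocess

SWITCH = "192.0.2.10"   # hypothetical switch management address
COMMUNITY = "public"    # placeholder read-only community string

# dot1dTpFdbPort: maps each learned MAC address to a bridge port number
FDB_PORT_OID = "1.3.6.1.2.1.17.4.3.1.2"

result = subprocess.run(
    ["snmpwalk", "-v2c", "-c", COMMUNITY, "-On", SWITCH, FDB_PORT_OID],
    capture_output=True, text=True, check=True,
)

for line in result.stdout.splitlines():
    # With -On each line looks like:
    # .1.3.6.1.2.1.17.4.3.1.2.<m1>.<m2>.<m3>.<m4>.<m5>.<m6> = INTEGER: <port>
    # where the last six sub-identifiers of the OID encode the MAC address.
    if " = " not in line:
        continue
    oid_part, _, value = line.partition(" = ")
    mac = ":".join(f"{int(b):02x}" for b in oid_part.split(".")[-6:])
    port = value.split()[-1]
    print(f"{mac} -> bridge port {port}")
```

Bridge port numbers aren't the labels on the faceplate; walking dot1dBasePortIfIndex (1.3.6.1.2.1.17.1.4.1.2) and the interface names gets you that mapping. Even the raw table narrows the hunt a lot, though.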

[–] Alvaro@lemmy.blahaj.zone 2 points 1 day ago

Yeah, but not if the cable died and the port has been down long enough that the last MAC was already aged out and there are no historical logs 🙃