this post was submitted on 25 Apr 2026
21 points (95.7% liked)

I just got two of these. Fully loaded. Disks, sleds, rails.
Fiber cards + 4 onboard NICs and 4 more on another card.
It's a dual-proc board with a bunch of RAM slots. (I think they're Sandy Bridge procs, DDR3.)
20 HDD bays. These things are (older) beastly storage boxen.

Board Manufacturer: Supermicro
Chassis Part Number: CSE-846BTS-R920BP
Board Part Number: X9DRi-LN4+/X9DR3-LN4+
Product Part Number: SSG-6047R-E1R24N

I got them because they were at a remote colo, and they crashed a bunch of times.
They cost us more downtime than they were worth.
I happened to be in town and made my boss an offer.
He didn't have to pay for e-waste fees, and I removed his problem for the low, low cost of $0.

So now they are my problem.
I don't need 200 TB of redundant storage. I'm gonna shop 'em around and sell 'em.
No idea if the dual 920 W PSUs will trip my apartment breakers. It takes a lot of juice to spin up 20 HDDs.
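For anyone else wondering about the breaker question, here's a back-of-the-envelope sketch. Every number in it is an assumption (typical 3.5" HDD spin-up draw, a guessed system base load, a US 15 A / 120 V circuit), not anything from the Supermicro spec sheet:

```python
# Back-of-the-envelope spin-up power estimate for a 20-bay chassis.
# All per-drive figures are assumed "typical 3.5-inch HDD" numbers,
# not taken from any particular drive's datasheet.
SPINUP_W_PER_DRIVE = 25   # assumed peak draw while platters spin up
DRIVES = 20
SYSTEM_BASE_W = 250       # assumed board + 2 CPUs + fans; a guess

CIRCUIT_VOLTS = 120       # typical US apartment circuit
BREAKER_AMPS = 15
PSU_EFFICIENCY = 0.9      # assumed 80 PLUS-class supply

spinup_w = DRIVES * SPINUP_W_PER_DRIVE + SYSTEM_BASE_W
wall_w = spinup_w / PSU_EFFICIENCY      # what the wall outlet actually sees
wall_amps = wall_w / CIRCUIT_VOLTS

print(f"Estimated spin-up draw at the wall: {wall_w:.0f} W ({wall_amps:.1f} A)")
print(f"Breaker budget: {BREAKER_AMPS * CIRCUIT_VOLTS} W ({BREAKER_AMPS} A)")
```

Under these made-up numbers, one box at spin-up sits well under a 15 A breaker; powering both up at once, plus whatever else shares the circuit, is where it gets tight. Staggered spin-up in the controller firmware helps too.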

So far, I've hauled them across half the US, up my stairs, and admired them.
I found a YouTuber, 'Art of the Server', with some helpful vids. Watched a bunch.
No real idea what I'm doing next.

I've configured them several times in the past. They always died after months of steady service.
Dead disks, etc. Maybe bad controllers?
A fault that intermittent is hard to diagnose, but they are in front of me now.
I can do whatever I need to. These are complicated devices.
My original plan of teardown and rebuild seems unwise now.
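If I do chase the intermittent fault, the first pass will probably be SMART data on every disk. A minimal sketch of the kind of triage I mean, parsing a `smartctl -A`-style attribute table and flagging the attributes that usually precede dead disks. The sample output below is fabricated for illustration, not from these machines:

```python
# Flag SMART attributes commonly associated with failing disks.
# The sample text mimics `smartctl -A` output; in practice you'd feed
# in real output captured per /dev/sdX device.
SUSPECT_ATTRS = {
    "Reallocated_Sector_Ct",
    "Current_Pending_Sector",
    "Offline_Uncorrectable",
    "UDMA_CRC_Error_Count",   # often cabling/backplane, not the disk itself
}

SAMPLE = """\
ID# ATTRIBUTE_NAME          FLAG     VALUE WORST THRESH TYPE      RAW_VALUE
  5 Reallocated_Sector_Ct   0x0033   100   100   010    Pre-fail  12
197 Current_Pending_Sector  0x0012   100   100   000    Old_age   0
199 UDMA_CRC_Error_Count    0x003e   200   200   000    Old_age   348
"""

def suspicious(table: str) -> dict:
    """Return {attribute: raw_value} for suspect attributes with nonzero raws."""
    hits = {}
    for line in table.splitlines():
        parts = line.split()
        if len(parts) >= 8 and parts[1] in SUSPECT_ATTRS:
            raw = int(parts[-1])   # raw value is the last column
            if raw > 0:
                hits[parts[1]] = raw
    return hits

print(suspicious(SAMPLE))
```

One thing worth noting for the "maybe bad controllers?" theory: climbing CRC error counts across multiple drives, without matching reallocations, tends to implicate the backplane/controller/cable path rather than the disks themselves.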

I'm interested in any practical feedback.

[–] dbtng@eviltoast.org 2 points 3 days ago* (last edited 3 days ago)

Those were a couple really good vids. I've never been a storage specialist, but I do manage all the storage for a small MSP, so I'm not ignorant. Like, I know ZFS pretty darn well, and I apparently collect storage servers for fun.
That Wendell guy tho, he really knows his shit.
I don't know that I got any final answers from him, but he left me with a lot to consider.

Honestly, a good chunk of what he had to say had me questioning my build with my Highpoint SSD7540 (a PCIe 4.0 x16 card with 8 M.2 NVMe ports) ... on a completely different machine, a build I was quite satisfied with until now. (It's on my gamer/server, my main box.)

I put a lot of research and performance testing into the Highpoint build. It's an 8x card supporting Gen 4 NVMe in an (actually) 16-lane slot. I populated 4 bays, so each stick gets 4 lanes, which is great for Gen 4. (I figured some day, when Gen 4 NVMe is dirt cheap, I'll fill the rest, and each stick will just get 2 lanes.)

After some testing, I decided to use the hardware RAID controller on the card. Considering what old Wendell had to say, I suspect that perhaps it should be software RAID instead ... still, that would mean relying on Windows to run the RAID, and I don't trust Windows. And then there's the fact that, after reviewing all the spec sheets, I've realized there's a lot I don't know about the card. But the Highpoint smokes, and I mostly just store video games there. So maybe bit-rot isn't a big deal anyway.
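The lane math above works out like this. A quick sanity check, assuming the card simply splits its x16 uplink evenly across populated ports and ~1.969 GB/s usable per Gen 4 lane (both assumptions; actual behavior depends on the card's PCIe switch / bifurcation mode):

```python
# Rough per-drive link bandwidth for an x16 card shared across M.2 ports.
# Assumes an even lane split and ~1.969 GB/s usable per PCIe Gen 4 lane;
# real cards may route lanes through a switch rather than bifurcating.
GEN4_GBPS_PER_LANE = 1.969   # approx usable throughput per lane (assumed)
UPLINK_LANES = 16

def per_drive(populated: int):
    """Return (lanes per drive, approx GB/s per drive) for N populated ports."""
    lanes = UPLINK_LANES // populated
    return lanes, lanes * GEN4_GBPS_PER_LANE

for drives in (4, 8):
    lanes, gbps = per_drive(drives)
    print(f"{drives} drives: x{lanes} each, ~{gbps:.1f} GB/s per drive")
```

Even fully populated, x2 Gen 4 (~3.9 GB/s) still outruns a typical Gen 3 x4 drive, which is why the "fill it later with cheap sticks" plan holds up.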

All very interesting stuff. Thanks.