
Any NAS/storage expert / data center person here? Need some help. SAS hard drives.

  • Last Updated:
  • Mar 29th, 2022 3:03 pm
[OP]
Deal Addict
Feb 16, 2014
1663 posts
296 upvotes
Hamilton


I have a few SAS hard drives from Seagate.

What hardware would I need to turn these into a storage unit?

I was hoping to run them as a Plex server and to back up my data.

Total of 30 drives, 4TB each. Some are SSDs as well.

This is what the hard drives look like:
https://www.amazon.ca/Seagate-Enterpris ... B00A45JFJS

Is there any type of storage hardware I can buy that's plug and play? I was hoping not to spend too much money on hardware.
14 replies
Sr. Member
Dec 6, 2020
937 posts
1043 upvotes
You'll need to spend somewhere between $1,000 and $10,000 to hook up 30 SAS drives, depending on what's available on the surplus market at the moment. This will involve buying enterprise-grade hardware that is decidedly not plug and play.

Sell the drives (used 4TB SAS drives are worth at least $50 CAD each) and use the proceeds to buy a Synology NAS.
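Rough math on the sell-and-buy-a-NAS route (a quick sketch; the $50/drive figure is the floor quoted above, actual resale prices vary):

```python
# Minimum proceeds from selling the drives, using the $50 CAD floor above.
drives = 30
floor_price_cad = 50
proceeds = drives * floor_price_cad
print(proceeds)  # 1500 CAD at minimum, enough for a decent Synology
```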
Deal Addict
Apr 4, 2017
2041 posts
657 upvotes
Toronto
The drives are brand new, never used.

I'm thinking of any used servers that companies don't need, and putting the drives inside.
Deal Addict
Apr 29, 2018
2330 posts
1694 upvotes
Vancouver
Used servers often get auctioned off or sold on surplus sites for low prices. I have seen some at ableauctions.ca go for next to nothing. They come with no HDDs, so your drives should work in the ones that have a SAS controller.
Can't Stop. Won't Stop. Game Stop
Deal Addict
User avatar
Nov 1, 2017
2566 posts
1752 upvotes
30 drives and not trying to spend too much money don't really go together.

Something like this would handle 24 drives: https://www.ebay.ca/itm/154727472948

You'd need to check ServeTheHome to tell whether this is a good deal or not.

If the SSD drives are NVMe, expect to hit bottlenecks. Older hardware can't keep up with NVMe SSDs.
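A rough sketch of why older hardware bottlenecks NVMe (the link-speed figures are ballpark assumptions, not measurements from this thread: ~0.5 GB/s usable per PCIe 2.0 lane, ~3.5 GB/s sequential for a Gen3 NVMe SSD):

```python
# Compare an old PCIe 2.0 x8 slot against a couple of modern NVMe SSDs.
lanes = 8
pcie2_gb_per_lane = 0.5                      # assumed usable bandwidth per lane
slot_bandwidth = lanes * pcie2_gb_per_lane   # 4.0 GB/s for the whole slot
nvme_drives = 2
demand = nvme_drives * 3.5                   # 7.0 GB/s of drive bandwidth
print(slot_bandwidth < demand)  # True: two NVMe drives already exceed the slot
```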
Deal Addict
Apr 29, 2018
2330 posts
1694 upvotes
Vancouver
Just checked eBay and 8-port SAS cards look to be around $25. Doesn't sound too bad, but not sure if they will work with the drives - https://www.ebay.ca/itm/334073776007

AFAIK SAS is backward compatible, so the latest gen should work, but prolly better to do a lil research before jumping in
Deal Expert
Aug 22, 2006
29848 posts
15364 upvotes
kramer1 wrote: Just checked eBay and 8-port SAS cards look to be around $25. Doesn't sound too bad, but not sure if they will work with the drives - https://www.ebay.ca/itm/334073776007
I'm not familiar with that particular card (TBH I'm not even sure what form factor that is), but budget SAS cards are pretty cheap, usually sub-$50 for 8 ports. YMMV since it's eBay.
OP definitely wants SFF-8087 ports though.

Each card can support 8 drives direct, so that's 4 cards alone. Or they could go with expanders.
There are 24-port cards, but they're expensive. It makes more sense to get a board with multiple PCI-E slots and cheap cards if money is an object.
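Quick sanity check on that card count (a sketch; assumes direct attach with no expanders, per the above):

```python
import math

# HBAs needed to direct-attach all 30 drives on 8-port cards.
drives = 30
ports_per_hba = 8
cards = math.ceil(drives / ports_per_hba)
print(cards)  # 4
```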

@PerformingAzura is on the right track with an SC846, which would make housing 24 of those 30 drives effortless.
The eBay listing they posted is basically a drop-in, ready-to-use solution. It's an "A" backplane, which means a slightly messier cabling setup, but it works.

You can DIY the same solution for better and perhaps slightly cheaper, but it won't be that much cheaper.
I prefer "E" series backplanes because of the expanders, and those CPUs (yes plural, there's 4 of them) are HUGELY overkill. But they are old, so they're cheap.

At the end of the day, if you've never done this, this is the best solution. I could give you a parts list for what I'd build, but between hunting down the parts, shipping on 12 different things, and assembling them yourself, it's easier just to buy this outright.
Do you not have anything else to do rather than argue with strangers on the internet
Nope. That's why I'm on the internet arguing with strangers. If I had anything better to do I'd probably be doing it.
Deal Expert
Aug 22, 2006
29848 posts
15364 upvotes
PerformingAzura wrote:
If the SSD drives are NVMe, expect to hit bottlenecks. Older hardware can't keep up with NVMe SSDs.
IOPS? Not a chance.
Raw transfer speed? I can get 2GB/s read/write with enough rust. Ironically with hardware that's similar to the link you posted.
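Rough math behind that claim (a quick sketch; the ~150 MB/s per-drive sequential figure is an assumption, not something measured in this thread):

```python
# Aggregate sequential throughput from a chassis full of spinning disks.
drives = 24
mb_per_sec_per_drive = 150  # assumed ballpark for an enterprise 4TB drive
aggregate_gb_per_sec = drives * mb_per_sec_per_drive / 1000
print(aggregate_gb_per_sec)  # 3.6 GB/s raw, so 2 GB/s after overhead is plausible
```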
Deal Addict
User avatar
Nov 1, 2017
2566 posts
1752 upvotes
death_hawk wrote: IOPS? Not a chance.
Raw transfer speed? I can get 2GB/s read/write with enough rust. Ironically with hardware that's similar to the link you posted.
I've never experienced it myself. Nor will I ever, given the cost of these kinds of drives.

Linus made a video where they used a GPU and special software to "accelerate" file transfers, because it was being CPU bottlenecked.

Deal Expert
Aug 22, 2006
29848 posts
15364 upvotes
PerformingAzura wrote: I've never experienced it myself. Nor will I ever given the cost of these kinds of drives.
Yeah most people won't. Supporting that many drives becomes a minor pain in the ass. Plus most people don't need hundreds of terabytes of storage.
And then there's us crazy people who see the 16TB WD Gold deal and think "Hmm.... 384TB is only $10k...."
Seriously if I had $10k on hand right now I'd actually buy 24 of them.
Linus made a video where they used a GPU and special software to "accelerate" file transfers, because it was being CPU bottlenecked.

Is there a TL;DW? I'm not sure I want to sit through 27 minutes.
I'm actually surprised he needed a GPU to accelerate though. I hit 2GB/s on some benchmarks with an OLD CPU.
Deal Addict
User avatar
Nov 1, 2017
2566 posts
1752 upvotes
death_hawk wrote: Yeah most people won't. Supporting that many drives becomes a minor pain in the ass. Plus most people don't need hundreds of terabytes of storage.
And then there's us crazy people who see the 16TB WD Gold deal and think "Hmm.... 384TB is only $10k...."
Seriously if I had $10k on hand right now I'd actually buy 24 of them.



Is there a TL;DW? I'm not sure I want to sit through 27 minutes.
I'm actually surprised he needed a GPU to accelerate though. I hit 2GB/s on some benchmarks with an OLD CPU.

I think he went from 20GiB/s to 40GiB/s after acceleration.
Deal Addict
Mar 26, 2008
1573 posts
1005 upvotes
Toronto
I recently built something like this that can run TrueNAS/Unraid.

SERVER:
Rosewill 4U case: ~$150
https://www.ebay.ca/itm/203641869614?ep ... SwLZZhXr61

Asus / Supermicro E5 v1/v2 or v3/v4 motherboard
Motherboard: ~$200
DDR3, 64GB: ~$150
CPU: dual E5-2620 6-core: ~$20
ATX 500W PSU: ~$60
SAS card (should be an LSI HBA card, NOT a pure RAID card). H200E (not H200; the E is for external connection to the disk shelf): ~$60 (needs to be flashed to IT mode)
Cable to connect to the disk shelf (NetApp DS4246, for example), QSFP (SFF-8436) to MiniSAS (SFF-8088): $50
Boot drives: 2 SSDs mirrored; 2x 128GB is fine: ~$50
You can add another SAS card like an H200, H310, or M1015 to hook up 8 SAS drives within the server. Only get those if you want plug and play, and you would need to flash them to IT mode.
SAS card: ~$50
Cables to the SAS drives x2: 2x $20
https://www.amazon.ca/gp/product/B013G4 ... UTF8&psc=1

Cost of server: >$800

Then you need a disk shelf for the 24 drives.
4U 24-disk shelf (I prefer NetApp since many people in the Unraid/TrueNAS community use them)
~$250 without caddies after shipping

24 caddies
~$200
Or you can print them with a 3D printer.

Cost of disk shelf: >$450

Total cost: >$1,200
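Those figures roughly add up; a quick tally of the approximate prices listed above (all figures CAD, taken from this post):

```python
# Tally of the parts list above (approximate prices from the post).
server = {
    "Rosewill 4U case": 150,
    "motherboard": 200,
    "64GB DDR3": 150,
    "dual E5-2620": 20,
    "500W ATX PSU": 60,
    "H200E HBA": 60,
    "SFF-8436 to SFF-8088 cable": 50,
    "2x 128GB boot SSDs": 50,
    "internal SAS card": 50,
    "2x internal SAS cables": 40,
}
shelf = {"DS4246 disk shelf": 250, "24 caddies": 200}

print(sum(server.values()))                         # 830, i.e. ">$800"
print(sum(shelf.values()))                          # 450, i.e. ">$450"
print(sum(server.values()) + sum(shelf.values()))   # 1280, i.e. ">$1,200" total
```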

While PerformingAzura listed a pretty decent package of server + shelf + caddies for $1,200 (+ mystery duty), it uses an Adaptec RAID card, which can make the setup inflexible and incompatible with TrueNAS or Unraid. I would stick with LSI HBA cards for maximum community support.

If going the eBay route for parts, I recommend Canadian sellers or US sellers that use the Global Shipping Program. I hate dealing with brokerage from shippers.

One thing to keep in mind is that if you go the disk shelf route, it is loud and power hungry.

But the alternative is to get 2 or 3 Rosewill 4U cases with all the server parts and install 8 SAS drives in each (or more if you install a 2nd HBA card, though you may not have enough SATA power cables, and at most you can fit another 2 to 4 drives in the 5.25" bays). If you get quiet fans, it is bearable to work in the same room.

One final note: don't be tempted to get the cheapest SAS card. You may accidentally buy an old one that doesn't even work with 4TB drives. Really stick with the H200, H310, or M1015 as a start. Once you learn more, you can get more modern 12Gbps cards, but they will burn a hole in your wallet.
Deal Addict
User avatar
Nov 1, 2017
2566 posts
1752 upvotes
Real_GM wrote: I recently built something like this that can run TrueNAS/Unraid ...
I wish IT mode was the default... If you do buy a card that doesn't support IT mode, I heard you can create a RAID1 with a single drive and then pass it into TrueNAS/Unraid. I haven't tried it myself though.
Deal Addict
Mar 26, 2008
1573 posts
1005 upvotes
Toronto
PerformingAzura wrote: I wish IT mode was the default... if you do buy a card that doesn't support IT mode, I heard you could create RAID1 with 1 drive and then pass them into truenas/unraid. I haven't tried it myself though.
Personally, I would stay away from hardware RAID if you don't have much sysadmin experience.

When you create a single-drive hardware RAID volume, the OS (TrueNAS/Unraid) recognizes it as a RAID volume, not a drive, when building your storage pool.

If your RAID card dies, even if you replace it with the same model, it is possible that all the RAID volumes get corrupted and you lose your whole pool.

It may be fixable if you have good records of which drive belongs to which pool, but it takes a lot of effort and time even when you know what you are doing.

My recommendation is to stay away from RAID cards that can't do IT mode.

PerformingAzura, I think your eBay recommendation can work well once you get an HBA SAS card in, although I'm not sure if you need a SAS expander for the drive backplane, and those cost $200+.
Deal Expert
Aug 22, 2006
29848 posts
15364 upvotes
Real_GM wrote: While PerformingAzura listed a pretty decent package of server + shelf + caddies for $1,200 (+ mystery duty), it uses an Adaptec RAID card, which can make the setup inflexible and incompatible with TrueNAS or Unraid. I would stick with LSI HBA cards for maximum community support.
Eh... Adaptec isn't really that rare. If it were some no-name HBA from China? Sure.
But Adaptec is a big name. I really can't see practically any OS not supporting them.

Even if it did, you can drop-in replace it with any LSI 2008-based HBA.
For the exact link above, you'd need 3x HBAs because the "A" backplane doesn't have an expander.
If you get an "E" backplane, you'd only need one.
One thing to keep in mind is that if you go the disk shelf route, it is loud and power hungry.
I cannot stress this enough.
This isn't something you put in the living room. This hides in your basement or something.
But the alternative is to get 2 or 3 Rosewill 4U cases with all the server parts
IMO this is a horrible idea and a GIANT waste of space, but if you really wanted to do this, you'd only really need one set of server parts, an expander for every node, and a couple of SFF-8087-to-SFF-8088 adapters and cables.
Unless you wanted a separate file system for each node. I prefer one large volume over a bunch of smaller volumes.

Although to be fair, I'm not even a fan of a 4U server then a 4U disk shelf. This is also a waste of space.
But you may not have enough SATA power cables

This is one reason I absolutely refuse to use consumer cases. I don't know of any power supply that has a ridiculous number of SATA cables, and using Molex-to-SATA adapters can be hazardous.
It makes far more sense to use a proper server case. As an example, with an SC846 with the right backplane, I have (effectively) zero power cables because they're built into the case at the bottom. Technically the backplane is powered by Molex. For data cables, I have one single SFF-8087 cable for 24 drives.

Expanding to another 24 drives requires 2x SFF-8087 cables, an SFF-8088 cable, and 2x SFF-8087 to SFF-8088 adapters. You can chain upwards of 1,024 drives this way if you get the fancy LSI-based HBA. The regular cheap one can only do up to 256 drives, which works out almost exactly to 10x 4U cases with 24 drives, which almost perfectly fills a 42U rack. But that's getting to be overkill even for the most extreme data hoarder.
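The capacity math above, as a quick sketch (the 256-device figure is the limit claimed in this post; real limits depend on the HBA and expander topology):

```python
# How many 24-bay 4U chassis a 256-device HBA limit supports,
# and how much rack space that occupies.
max_devices = 256
drives_per_chassis = 24
units_per_chassis = 4
rack_units = 42

full_chassis = max_devices // drives_per_chassis
print(full_chassis)                                 # 10 chassis
print(full_chassis * drives_per_chassis)            # 240 drives
print(full_chassis * units_per_chassis)             # 40U, nearly a full 42U rack
```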
If you get quiet fans, it is bearable to work in the same room.
You can swap the fans in most industrial chassis for something quieter.
Heck, I know someone who swapped the 40mm PSU fans as well. Those are the real jet engines.

One final note: don't be tempted to get the cheapest SAS card. You may accidentally buy an old one that doesn't even work with 4TB drives. Really stick with the H200, H310, or M1015 as a start.
The chipset you're after is the LSI SAS2008.


PerformingAzura wrote: I wish IT mode was the default... if you do buy a card that doesn't support IT mode,
If you have an old motherboard kicking around, it's trivial to flash.
If you don't have an old motherboard around, it's actually quite painful to flash.
Basically you want something with no trace of UEFI on it, because that screws with the card booting into flashing mode.
That's assuming you bought the right card. Any LSI 2008 card can easily swap between IR and IT mode, but if you're picking up anything that isn't, you're on your own.
I heard you could create RAID1 with 1 drive and then pass them into truenas/unraid. I haven't tried it myself though.
Obviously I haven't tested every single RAID setup out there, but most of them demand the minimum number of drives (in this case 2) to build the RAID.
Since this sounds like it's for an OS drive, it makes more sense to just hook them up directly to the motherboard. Every board has at least 2 SATA ports. Then you can do OS-based RAID, which makes compatibility easier.
Real_GM wrote: PerformingAzura, I think your ebay recommendation can work well when you get a HBA sas card in. Although I'm not sure if you need a SAS expander for the drive backplane and those costs like 200+
The built-in one is fine. It's only an issue if you want to expand past 24 drives.
Then yeah, a proper E backplane is about $200, plus you'd have to swap the HBA.
If you intend to go beyond 24 drives at any point in the future, I'd plan ahead now and find an SC846 with an E backplane, not an A (or worse, TQ) backplane.