• 0 Posts
  • 20 Comments
Joined 1 year ago
Cake day: June 17th, 2023



  • Lots of good advice here. I’ve got a bunch of older WD Reds still in service (from before the SMR BS). I’ve also had good luck shucking drives from external enclosures, as well as with decommissioned enterprise drives. If you go that route, depending on your enclosure or power supply you may run into issues with a live 3.3V SATA power pin (repurposed for the Power Disable feature on newer drives) keeping the drive from spinning up. I’ve never had this issue on mine, but it can be fixed with a little Kapton tape over the pin or a modified SATA power adapter. It’s definitely cheaper to shuck or buy used enterprise drives for capacity! I’m running at least a dozen shucked drives right now and they’ve been great for my needs.

    Also, if you start going beyond the SATA ports available on your motherboard, do yourself a favor and get a quality HBA card flashed to IT mode to connect your drives. The cheapo 4-port SATA cards I originally tried would have random dropouts in Unraid from time to time. Once I got a good HBA it’s been smooth sailing. It needs to be in IT mode so the controller passes the disks straight through instead of its hardware RAID kicking in, which lets Unraid see the individual identifiers of the disks. You can flash it yourself or use an eBay seller like The Art of Server, who will pre-flash them to IT mode.

    Finally, be aware that expanding your array is a slippery slope. You start with 3 or 4 drives, and the next thing you know you have a rack and a 15+ drive array.
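    A quick sketch of the kind of sanity check I’d run on a used or shucked drive before trusting it (my own addition, not from the comment above — the sample text just mimics the standard `smartctl -A` attribute table, and the “healthy” criteria are rough assumptions, not official guidance):

```python
# Sketch: vet a used drive by checking a few key SMART attributes.
# The sample mimics `smartctl -A /dev/sdX` output; run the real tool
# against your own drives to get actual values.

SAMPLE_SMARTCTL_OUTPUT = """\
ID# ATTRIBUTE_NAME          FLAG     VALUE WORST THRESH TYPE      UPDATED  WHEN_FAILED RAW_VALUE
  5 Reallocated_Sector_Ct   0x0033   100   100   010    Pre-fail  Always       -       0
  9 Power_On_Hours          0x0032   095   095   000    Old_age   Always       -       21340
197 Current_Pending_Sector  0x0012   100   100   000    Old_age   Always       -       0
"""

def parse_attributes(text):
    """Return {attribute_name: raw_value} from a smartctl attribute table."""
    attrs = {}
    for line in text.splitlines():
        fields = line.split()
        # Attribute rows start with a numeric ID and end with the raw value.
        if len(fields) >= 10 and fields[0].isdigit():
            attrs[fields[1]] = int(fields[9])
    return attrs

def looks_healthy(attrs):
    """Assumed red flags: any reallocated or pending sectors."""
    return (attrs.get("Reallocated_Sector_Ct", 0) == 0
            and attrs.get("Current_Pending_Sector", 0) == 0)

attrs = parse_attributes(SAMPLE_SMARTCTL_OUTPUT)
print(attrs["Power_On_Hours"], looks_healthy(attrs))  # prints: 21340 True
```

    High power-on hours alone wouldn’t scare me off a cheap enterprise drive; sectors already going bad would.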


  • Well said. I had hardware that was killed by “upgrades” or by manufacturers discontinuing its cloud features. I now install locally controllable hardware as much as possible, and it has led to a much more stable and reliable smart home over the long term. Everything ties back into Home Assistant. The only remaining things I have with a cloud-reliant integration are the robovac, our Nest Protect smoke alarms, and the smart vents. The only reason they’re cloud controlled is that there wasn’t a viable local option that met my feature and price requirements. Everything else (65+ devices) is local Wi-Fi/HomeKit, Zigbee, or Z-Wave.


  • Even on the Windows side of things they’re frustrating. My company took my perfectly working ThinkPad and replaced it last September with an “upgraded” Dell Inspiron laptop. It’s a piece of crap. It wakes up all the time in my bag, randomly drops Wi-Fi, and randomly drops ViewSonic monitors. Official IT solution: this happens sometimes, we don’t know why, and we’re going to send you Dell monitors instead.

    *Edit: I guess it’s actually a Precision, not an Inspiron. I don’t buy Dells, so I don’t know all the names!


  • Great advice from everyone here. For the transcoding side of things you want an 8th-gen or newer Intel chip so Quick Sync hardware transcodes come out at a good level of quality. I’ve been using a 10th-gen i5 for a couple of years now and it’s been great. It regularly handles multiple transcodes and has enough cores to do all the other server stuff without an issue. You need Plex Pass for hardware transcoding if you don’t already have it, or you can look at switching to Jellyfin.

    As mentioned elsewhere, using an HBA is great when you start getting to large numbers of drives. I haven’t seen the random drops I’ve occasionally seen on cheap SATA PCIe cards. If you get one that’s flashed in “IT mode”, the drives appear normally to your OS and you can then build software RAID however you want. If you don’t want to flash it yourself, I’ve had good luck with stuff from The Art of Server.

    I know some people like to use old “real” server hardware for reliability or ECC memory, but I’ve personally had good luck with quality consumer hardware and keeping everything running on a UPS. I’ve learned a lot from serverbuilds.net about how compatibility works between some of the consumer gear, and about making sense of the used enterprise gear that’s useful for this hobby. They also have good info on doing “budget” build-outs.

    Most of the drives in my rack have been running for years and were shucked from external drives to save money. I think the key to success here has been keeping them cool and under consistent UPS power. Some of mine are in a disk shelf, and some are in a Rosewill case with 12 hot-swap bays. The drives sit at 24–28 degrees Celsius.

    Moving to the rack is a slippery slope… You start with one rack-mounted server, and soon you’re adding a disk shelf and setting up 10-gigabit networking between devices. Give yourself more drive bays than you need now, if you can, so you have expansion space and don’t have to completely rearrange the rack 3 years later.

    Also, if your budget can swing it, it’s nice to keep older hardware around for testing. I leave my “critical” stuff running on one server now so that a reboot while tinkering doesn’t take down all the stuff running the house. That one only gets rebooted or has major changes made when it’s not in use (and my wife isn’t watching Plex). The stuff that doesn’t quite need to be 24/7 gets tested on the other server, which is safe to reboot.
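    Since the “8th gen or newer” cutoff comes up a lot, here’s a rough sketch (my own, not from the comment) of reading the generation out of an Intel Core model number — the naming-scheme assumption only covers the classic “iX-NNNN”-style desktop models, not mobile parts like an i5-1135G7:

```python
import re

def core_generation(model):
    """Return the generation encoded in a Core iX model string, or None.

    Assumption: 4-digit models encode a 1-digit generation (i7-8700 -> 8),
    5-digit models a 2-digit one (i5-10400 -> 10).
    """
    m = re.search(r"i[3579]-(\d{4,5})", model)
    if not m:
        return None
    digits = m.group(1)
    return int(digits[:1]) if len(digits) == 4 else int(digits[:2])

def quicksync_ok(model, cutoff=8):
    """True if the chip meets the 8th-gen-or-newer Quick Sync cutoff."""
    gen = core_generation(model)
    return gen is not None and gen >= cutoff

# i5-10400 is 10th gen, i7-7700 is 7th gen
print(quicksync_ok("i5-10400"), quicksync_ok("i7-7700"))  # prints: True False
```

    Not something you’d need a script for when buying one CPU, but handy when sorting a pile of used office machines.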




  • I’ve been using one for several years now with one of the documented switches that add multiple ports: https://docs.pikvm.org/ezcoo/#connections First with a DIY build, and then with the v3 HAT from the Kickstarter. I guess in total I’m at $270 between the Kickstarter HAT and the ezcoo switch, plus the cost of a Pi (which I already had). I can reach 4 machines over my Tailnet and jump between them reliably. I can also control power on my primary server (the others are on a network-managed PDU and can be forcibly reset that way if needed).

    I had an old KVM console from a job, but it was so old that it required an ancient version of Java to access the web interface. I’m sure there may be better options, but for my homelab setup the PiKVM has worked well at a price that fit my budget.



  • I’ve been using obsidian-livesync for a couple months now. Works great cross-platform since it runs directly out of my Vault and doesn’t cost $8/mo. Mine is running on fly.io right now but I may eventually move it to my own machine. https://github.com/vrtmrz/obsidian-livesync/

    I can’t help feeling like Obsidian really missed the mark on their pricing here for hobbyist & home users. I can’t justify paying substantially more than something like iCloud or Google Drive storage when I’m using Obsidian to just sync some text and a few documentation images. Something like $1-2/mo would have been an instant buy for me, but at $8 it was worth my time to investigate other ways of syncing.
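    For anyone who’d rather not run it on fly.io: obsidian-livesync syncs through a CouchDB database, so self-hosting it mostly means standing up CouchDB. A minimal Compose sketch (credentials and the host path are placeholders; the plugin’s README covers the extra CouchDB settings it wants):

```yaml
# Minimal CouchDB backend for obsidian-livesync — a sketch, not a hardened setup.
services:
  couchdb:
    image: couchdb:3
    restart: unless-stopped
    environment:
      COUCHDB_USER: admin           # placeholder — change both of these
      COUCHDB_PASSWORD: change-me
    ports:
      - "5984:5984"
    volumes:
      - ./couchdb-data:/opt/couchdb/data   # official image's data directory
```

    Put it behind HTTPS (reverse proxy or a Tailnet) before pointing the plugin at it.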


  • It’s pretty much always been this way with the HP ones. Years ago, when wireless printers weren’t the standard, we used to connect a printer to the family Windows PC and then share it on the network. We got a new one, set it up, and the printer refused to be shared. It turned out HP had explicitly blocked network sharing in their Windows software driver for that printer. Never purchased another one since. Brother isn’t perfect, but I have multiple 8+ year old Brother laser printers still in service right now and they “Just Work.”