In my server I currently have an Intel i7 9th gen CPU with integrated Intel video.
I don’t use or need A.I. or LLM stuff, but we use Jellyfin extensively in the family.
So far Jellyfin has always worked perfectly fine, but I could add an NVIDIA 2060 or a 1060 for free. Would it be worth it?
And as for power consumption, will the increase be noticeable? Should I do it or pass?
Look up the GPU on these charts to find out what codecs it will support: https://developer.nvidia.com/video-encode-and-decode-gpu-support-matrix-new
NVENC support tells you which codecs your GPU can encode for client devices, and NVDEC support determines which codecs it can decode.
Then compare it with the list of codecs that your Intel can handle natively.
Most Intel iGPUs are great at transcoding: reliable, widely supported, and they deliver quite a bit of transcoding throughput for very little electrical power.
I think the main thing I would check is which formats are supported. If the discrete GPU supports newer formats like AV1, it may be worth it (if you want to store your videos in these more efficient formats, or you have clients who can consume them and will appreciate the reduced bandwidth).
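To put a rough number on the "reduced bandwidth" point: the sketch below assumes a commonly cited ballpark of roughly 30–50% bitrate savings for AV1 versus H.264 at similar quality. The function name, the 8 Mbps bitrate, and the 40% figure are all illustrative assumptions, not measurements.

```python
# Rough back-of-envelope for the bandwidth argument above.
# The 30-50% savings range for AV1 vs. H.264 at similar quality is a
# commonly cited ballpark, not a measurement -- treat it as an assumption.

def stream_savings_gb(bitrate_mbps: float, hours: float, savings: float) -> float:
    """GB saved over `hours` of streaming at `bitrate_mbps` if a newer
    codec cuts the bitrate by `savings` (0.0-1.0) at similar quality."""
    baseline_gb = bitrate_mbps * 3600 * hours / 8 / 1000  # Mbit -> GB
    return baseline_gb * savings

# e.g. an 8 Mbps 1080p stream, one 2-hour movie, assuming 40% savings
print(round(stream_savings_gb(8, 2, 0.40), 2))
```

For a single two-hour movie the saving is a couple of GB; it only starts to matter if you have many remote clients or metered bandwidth.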
But overall I would say if you aren’t having any problems no need to bother. The onboard graphics are simple and efficient.
I only have a GPU because my CPU doesn’t have any graphics. I don’t use the graphics anyway, but I need it to boot. So I put our crappiest spare GPU in (GTX 750 Ti) and call it good.
I wouldn’t bother. If you end up needing it, it’ll take like 15 min to get it installed and drivers set up and everything. No need to bother until you actually need it.
QuickSync is usually plenty to transcode. You will get more performance with a dedicated GPU, but the power consumption will increase massively.
NVIDIA also limits how many streams its consumer cards can transcode at the same time, though there are driver patches to circumvent that.
If it is working for you as is, no need to make a change
I ran a 1650 super for a while. At idle it added about 10W and would draw 30-40W while transcoding. I ended up taking it out because the increased power wasn’t worth the slight performance increase for me.
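Using the draw figures from the comment above, here is a quick sketch of what that extra power costs per year. The 10 W idle and 35 W transcoding numbers come from the comment; the $0.30/kWh price and 2 hours of transcoding per day are assumptions you should swap for your own.

```python
# Estimate the yearly electricity cost of an always-on GPU in the server.
# Idle/load wattages are from the comment above; price and daily load
# hours are assumed placeholders.

def yearly_cost(idle_w: float, load_w: float,
                load_hours_per_day: float, price_per_kwh: float) -> float:
    """Yearly cost of a device drawing idle_w most of the day and
    load_w for load_hours_per_day, at price_per_kwh."""
    idle_hours = 24 - load_hours_per_day
    kwh_per_year = (idle_w * idle_hours + load_w * load_hours_per_day) * 365 / 1000
    return kwh_per_year * price_per_kwh

# 10 W idle, 35 W while transcoding 2 h/day, at an assumed $0.30/kWh
print(round(yearly_cost(10, 35, 2, 0.30), 2))
```

At those assumptions it works out to roughly $30 a year, which is the kind of number that makes "take it back out" a reasonable call.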
Yeah, that looks like a lot… probably not worth it.
Host Steam Headless and use the GPU for that, so you can have remote gaming on your phone anywhere you have 5G.