My server currently has a 9th-gen Intel i7 CPU with integrated Intel graphics.
I don't use or need AI or LLM stuff, but we use Jellyfin extensively in the family.
So far Jellyfin has always worked perfectly fine, but I could add an NVIDIA 2060 or a 1060 for free. Would it be worth it?
And as for power consumption, will the increase be noticeable? Should I do it or pass?
Look up the GPU on this chart to find out which codecs it supports: https://developer.nvidia.com/video-encode-and-decode-gpu-support-matrix-new
NVENC support tells you which codecs your GPU can encode for client devices, and NVDEC support tells you which codecs your GPU can decode from your media files.
Then compare that with the list of codecs your Intel iGPU (Quick Sync) can handle natively.
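If you also want to check what's available on the box itself, here's a minimal Python sketch that lists the hardware encoders/decoders an ffmpeg build exposes. It assumes ffmpeg is on PATH; Jellyfin normally ships its own build (jellyfin-ffmpeg), so swap in that binary's path if you want the server's exact picture. This only shows what the ffmpeg build was compiled with, not what a specific GPU model supports in silicon, so the NVIDIA matrix above is still the authority for that.

import subprocess

def ffmpeg_list(flag):
    """Run e.g. `ffmpeg -hide_banner -encoders` and return its output lines."""
    proc = subprocess.run(
        ["ffmpeg", "-hide_banner", flag],
        capture_output=True, text=True, check=True,
    )
    # Be safe and search both streams, since ffmpeg splits output between them.
    return (proc.stdout + proc.stderr).splitlines()

def matching(lines, keyword):
    """Keep only the lines that mention the given hardware keyword."""
    return [line.strip() for line in lines if keyword in line]

if __name__ == "__main__":
    encoders = ffmpeg_list("-encoders")
    decoders = ffmpeg_list("-decoders")

    report = {
        "NVENC encoders (what the GPU can produce)": matching(encoders, "nvenc"),
        "NVDEC decoders (what the GPU can ingest)": matching(decoders, "cuvid"),
        "Quick Sync encoders (Intel iGPU)": matching(encoders, "qsv"),
        "Quick Sync decoders (Intel iGPU)": matching(decoders, "qsv"),
    }
    for title, entries in report.items():
        print(f"{title}:")
        print("\n".join(entries) if entries else "  (none found)")
        print()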
Thanks!
Neither the 2060 nor the 1060 supports AV1 either way, so I guess it's pointless for me.