- cross-posted to:
- phoronix@lemmy.world
I’m interested in benchmarks to compare to my current RX 6650 XT, which is pretty similar to the 4060.
It has 12GB VRAM, which might be enough to mess around with smaller LLMs, but I really wish they’d make a high-VRAM variant for enthusiasts (say, 24GB?).
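For a rough sense of what "enough VRAM" means here: a model's weights take roughly parameter-count × bytes-per-parameter, plus some headroom for the KV cache and activations. This is only a back-of-the-envelope sketch; the 20% overhead factor and byte sizes are my assumptions, not measured numbers:

```python
def llm_vram_gb(params_billions, bytes_per_param, overhead=1.2):
    """Rough VRAM estimate (GB) to load a model's weights,
    with ~20% headroom assumed for KV cache and activations."""
    return params_billions * bytes_per_param * overhead

# 4-bit quantization is ~0.5 bytes/param, fp16 is 2 bytes/param:
print(llm_vram_gb(7, 0.5))   # 7B @ 4-bit  -> ~4.2 GB, fits in 8GB
print(llm_vram_gb(13, 0.5))  # 13B @ 4-bit -> ~7.8 GB, tight on 8GB
print(llm_vram_gb(13, 2.0))  # 13B @ fp16  -> ~31 GB, needs a big card
```

By this estimate, 12GB mostly buys you comfortable 13B-class quantized models over 8GB, which matches the sentiment later in the thread that the jump isn't dramatic.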
That said, with Gelsinger retiring, I’ll probably wait until the next CEO is picked to hear whether they’ll continue developing their GPUs. I’d really rather not buy into a dead-end product, even if it has FOSS drivers.
12GB VRAM in 2024 just seems like a misstep. Intel isn’t alone in that, but it’s really annoying they didn’t just drop at least another 4GB in there, considering the uplift in attractiveness it would have given this card.
And here I am with 8GBs in 2024 lol
The industry as a whole has really dragged ass on VRAM. Obviously it keeps their margins higher, but for a card targeting anything over 1080p, 16GB should be mandatory.
Hell, with 8GB you can run out of VRAM even at 1080p, depending on what you play (e.g. flight sims).
That’s why I avoided the 4060s and stuck to my 1060 for now
I doubt it would cost them a ton either, and it would be a great marketing tactic. In fact, they could pair it w/ a release of their own LLM that’s tuned to run on those cards. It wouldn’t get their foot in the commercial AI space, but it could get your average gamer interested in playing with it.
It wouldn’t cost much, but this way they can release a “pro” card with double the VRAM for 5x the price.
I doubt they will. Intel has proven to be incompetent at taking advantage of opportunities. They missed:
- mobile revolution - waited to see if the iPhone would pan out
- GPU - completely missed the crypto mining boom and COVID supply crunch
- AI - nothing on the market
They need a compelling GPU, since the market is moving away from CPUs as the high-margin product in a PC and the datacenter. If they produced an AI-capable chip at reasonable prices, they could get real-world testing before they launch something for datacenters. But no, it seems like they’re content missing this boat too, even when the price of admission is only a higher-memory SKU…
Got the same card, and you can definitely run smaller models on 8GB. There’s no need to pay 200-300 bucks for a 4GB RAM upgrade though. Might be a nice card for people on the lower end, but not in our cases. But yeah, I’d really like more VRAM too, especially with how expensive the higher-end cards get - which AMD won’t even bother with anymore anyway. Really hoping for something with 16+ GB for a decent price.
Yeah, I really don’t need anything higher than 6700/7700 XT performance, and my 6650 XT is still more than sufficient for the games I play. All I really need is more VRAM.
If Intel sold that, I’d probably upgrade. But yeah, 12GB isn’t quite enough to really make it make sense, the things I can run on 12GB aren’t meaningfully different than the things I can run on 8GB.
I keep eyeing these Arc cards to possibly put into my Plex system. They are looking pretty juicy.
I’ve got an A380 in my Jellyfin server and it’s a beast
Weren’t Intel cards super inefficient in regards to power draw for their performance?
It likely depends on how much they pay for power and how many users they serve.
E.g. I’d really like AV1 support on my server (helps with slow upload), but the power cost of a dedicated GPU is unacceptable in my country. The few transcoding streams I’d theoretically need in a worst-case scenario are more than met with an iGPU.
Yup. In my area, power cost is a non-issue. I pay $0.12-0.13/kWh, so my concerns around power usage are only because I don’t want to be wasteful (our energy largely comes from coal and natural gas). So even though the cost doesn’t matter much, I wouldn’t buy the A380 because it’s just too wasteful.
This new set of cards seems to be a lot more power efficient though, so maybe they’re worth a look if you need something for transcoding.
I’ve just looked it up, and the A380 seems to only draw ~17W at idle. That’s better than I thought, but still 2-3 times what an HDD draws.
I wonder whether the new generation will lower the idle power usage too, or only the performance per watt.
Yeah, my NAS uses something like 50W (measured from the wall), with two HDDs, a SATA SSD, Ryzen 1700, and an old GPU (750 Ti, won’t boot without graphics). I haven’t measured everything independently, but online sources say 6W for the GPU. So the A380 would be 3x higher. That doesn’t matter too much in my area, but it’s still extra power draw.
Hopefully the new gen is close to that 6W figure.
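To put those idle figures in money terms: a constant draw running 24/7 works out to watts × 8760 hours per year. A quick sketch using the ~$0.12/kWh rate mentioned above (the rate and wattages are the thread's numbers, not mine):

```python
def annual_cost_usd(watts, usd_per_kwh=0.12):
    """Yearly cost of a constant power draw running 24/7."""
    kwh_per_year = watts * 24 * 365 / 1000
    return kwh_per_year * usd_per_kwh

print(annual_cost_usd(17))      # A380 idling: ~149 kWh/yr, ~$17.87/yr
print(annual_cost_usd(17 - 6))  # extra vs a ~6W card: ~$11.56/yr
```

So at cheap US rates the difference is around a dollar a month, but at typical European prices (three times that or more) it starts to matter, which explains the disagreement in this sub-thread.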
Sounds like overkill tbh, can’t you run those things on a £200 mini PC?
Needs virtio support to really be a top seller!
B770 or a hypothetical B9XX is what I’m looking for. Phoronix benchmarks, because not many others are doing Linux benchmarks. An 8700-8800 XT or B700-B9XX for me next year.
If this turns out to be a solid performer, the price could make it the best midrange value since AMD’s Polaris (RX 480). I hope Intel’s build quality has improved since the A770.
If that was some OEM design for a retail PC, fine. But fuck off with shit like glued back plates on dedicated GPUs you buy.
So is this the one with the telephone sanitizers?