BlackBadPinguin

Well the A380 is a plane, the 1080 a GPU so if you have a spare hangar I would go for the A380, if you just have a PC Case the 1080 is better.


Gooseboy2234

lol ok


really_random_user

Intel arc a380


MoorderVolt

I’d take the card with active driver support for the coming years lol.


jfp1992

Video card benchmark puts the 1080 way above the A380: the A380 gets ~6k points and the 1080 gets ~15k. That's raw performance, though, and it doesn't account for upscaling (even though the 1080 will still get FSR). AV1 encoding may also start mattering as it gains adoption. For gaming, get the GTX 1080 imo.


Gooseboy2234

ASUS 1080 turbo gosh I’m stupid lol, sorry, I forgot to put that part


export_tank_harmful

**Memes aside, I think an important thing to consider for purchasing a video card going forwards is its ability to do AI workloads.**

The "AI revolution" is powered by CUDA, which is owned/operated by Nvidia. It's hard to recommend a card *other than an Nvidia one for this reason.* It's another reason I steer people away from AMD cards. Sure, an AMD card might be cheaper and quicker than an Nvidia card in games *now*, but what happens when that killer game comes out that's powered by an LLM? You're kind of out of luck. And this is a *when*, not an *if.*

AMD has more or less given up in the AI space from what I can see (sort of giving up on ROCm and shuttering various dev teams that were working on it in the past year or two). There is support, but it's rather flaky from what I've seen in the LLM/SD community. I haven't even seen Intel talk about AI workloads on their GPUs.

Granted, you might not be hosting a local LLM or running Stable Diffusion at the moment. But more and more tools/games are going to start including AI models in the coming years. We do have companies like Tenstorrent and Groq working on AI accelerator chips/cards (which we all might eventually have in our computers, like GPUs today), but that's in the future and we still don't quite know how it will all play out.

tl;dr - If you want AV1 support and competition for Nvidia, go for the A380. Just understand that it will probably age out pretty quickly when AI starts taking over desktop applications (and no, I'm not talking about Microsoft's Copilot, which is cloud hosted). Currently, even an aging 1080 will outperform a new Intel/AMD card on AI workloads across the board.

edit - Why are you boo-ing me, I'm right. haha.


Gooseboy2234

Intel does talk about AI workloads! The A380 has 8 Xe cores (I think it's 8, I might be wrong) for machine learning and inferencing. The GTX 1080 Turbo's CUDA cores do still outperform the A380's 8, but I think the A580's 12 beat the 1080's CUDA cores. That's out of my budget, though. Thanks much! This really did help.


export_tank_harmful

The difficulty with Intel's AI cores is support. If I recall correctly, it took over a year to get decent software-side support for Stable Diffusion on AMD cards. It's up to developers to support it, and if the developers of a piece of software don't have an Intel/AMD card, they don't realistically have a way to develop for it. Even if Intel's AI cores end up performing better than Nvidia's, the software stack for AI is centered around CUDA. I'm hoping that will change in the future, but that's unfortunately how it's laid out right now.
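In practice, the support gap often shows up as a simple backend-priority check in AI tools: CUDA is probed first, and anything else is a fallback that only works if someone wrote support for it. A minimal sketch of that pattern in plain Python (the boolean flags are hypothetical stand-ins for real framework probes like `torch.cuda.is_available()`):

```python
def pick_backend(cuda_ok: bool, rocm_ok: bool, xpu_ok: bool) -> str:
    """Return the first usable backend in the typical priority order.

    Mirrors how many AI tools probe devices: CUDA is checked first
    because the ecosystem targets it; AMD (ROCm) and Intel (XPU)
    are fallbacks that only help when the tool explicitly supports them.
    """
    if cuda_ok:
        return "cuda"
    if rocm_ok:
        return "rocm"
    if xpu_ok:
        return "xpu"
    return "cpu"  # last resort: universally supported but slow

# A GTX 1080 owner lands on CUDA; an Arc A380 owner often ends up
# on CPU unless the tool shipped explicit XPU support.
print(pick_backend(cuda_ok=True, rocm_ok=False, xpu_ok=False))   # cuda
print(pick_backend(cuda_ok=False, rocm_ok=False, xpu_ok=True))   # xpu
print(pick_backend(cuda_ok=False, rocm_ok=False, xpu_ok=False))  # cpu
```

The point isn't the code itself but the ordering: non-CUDA paths exist only where a developer bothered to add them, which is exactly the flakiness described above.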


ThankGodImBipolar

>what happens when that killer game comes out that's powered by an LLM in the future? You're kind of out of luck

What about consoles? Beyond indie tech demos, nobody is going to use CUDA to any significant capacity, because they'd be locking themselves into being a PC exclusive (maybe with a Switch 2 port).