☆ Yσɠƚԋσʂ ☆@lemmy.ml to Technology@lemmy.ml, English · 2 months ago
Huawei enters the GPU market with 96 GB VRAM GPU under 2000 USD, meanwhile NVIDIA sells from 10,000+ (RTX 6000 PRO) (www.alibaba.com)
I Cast Fist@programming.dev · 2 months ago
Does anyone know if it can run CUDA code? Because that's the silver bullet ensuring Nvidia's dominance in the planet-wrecking servers.
peppers_ghost@lemmy.ml, English · 2 months ago
llama.cpp and PyTorch support it right now; CUDA isn't available on its own as far as I can tell. I'd like to try one out, but the bandwidth seems to be ass, about 25% as fast as a 3090. It's a really good start for them, though.
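For anyone wondering what "PyTorch support" looks like in practice, here's a minimal sketch. It assumes Huawei's Ascend/CANN stack with the torch_npu plugin installed and the card exposed as an "npu" device; none of that is confirmed in this thread, so treat the package and device names as assumptions.

```python
# Minimal sketch: running an ordinary PyTorch op on a Huawei card through the
# torch_npu plugin, falling back to CUDA or CPU if it isn't installed.
# The torch_npu package and the "npu" device string are assumptions based on
# Huawei's Ascend/CANN stack, not something confirmed in this thread.
import torch

try:
    import torch_npu  # Huawei's PyTorch adapter for Ascend devices (assumed)
    device = "npu:0" if torch_npu.npu.is_available() else "cpu"
except ImportError:
    device = "cuda:0" if torch.cuda.is_available() else "cpu"

# The tensor code itself stays plain PyTorch; no CUDA-specific calls are needed.
x = torch.randn(4096, 4096, device=device)
y = x @ x  # matmul runs on whichever accelerator was found
print(device, y.shape)
```

llama.cpp works along the same lines as far as I know: the compute backend gets swapped out underneath while the model code stays the same, so nothing needs CUDA directly.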