Would be really cool if AMD could extend official ROCm support to RDNA2 and RDNA3 GPUs. I've tested a number of them and they all work fine. The lack of official support is keeping folks away, since they're under the impression that ROCm doesn't work on those GPUs. In my experience it does (I've tested an RX 6600 and an RX 6700 XT, and I also own an RX 7900 XTX, which is officially supported).
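For anyone who wants to check their own card: here's a minimal sketch of one way to verify it, assuming a ROCm build of PyTorch is installed. The HSA_OVERRIDE_GFX_VERSION=10.3.0 override is what people commonly use for RDNA2 parts that aren't on the official support list; it's a community workaround, not an official AMD setting.

```python
# Minimal sketch: check whether a ROCm build of PyTorch can see and use an RDNA2 card.
# Assumes PyTorch was installed from the ROCm wheels. HSA_OVERRIDE_GFX_VERSION=10.3.0
# is the override commonly used for RDNA2 parts like the RX 6600 / RX 6700 XT.
import os
os.environ.setdefault("HSA_OVERRIDE_GFX_VERSION", "10.3.0")  # must be set before importing torch

import torch

if torch.cuda.is_available():  # ROCm devices are exposed through the CUDA API in PyTorch
    print("device:", torch.cuda.get_device_name(0))
    x = torch.randn(1024, 1024, device="cuda")
    print("matmul ok:", (x @ x).sum().item())  # quick sanity check that kernels actually run
else:
    print("No ROCm device visible")
```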
Also, a VRAM-boosted consumer version of your upcoming GPU would go a long way toward attracting folks to ROCm. A 32GB variant would be a big selling point, at least for us LocalLlama enthusiasts.
Please, instead of spending resources on social media, develop ROCm support for more hardware, RX 7000 series included. Thanks.