>>4516
I used an RX6600 for a while. For gaming at 1080p it's completely adequate, and AMD's drivers are a lot better than they were in the Radeon HD days. Power consumption isn't bad either. It's a valid choice.
>>4515
Because AI, in particular image generation, is a hobby of mine, I hunted around and got a deal on a used RTX3090 on eBay last year. If you mess with Stable Diffusion at all, you absolutely need CUDA. I haven't tried any LLM stuff (my interests run in other directions) but I would not be surprised to learn it's the same.
>but muh ROCm
ROCm is beta-quality software, still experimental, with nightly builds that break everything else and force you to chase obscure Python errors for hours to get it working again every time it updates, nine years after it was introduced. And because ROCm works through a translation layer (HIP) that converts CUDA-style calls so the software thinks it's talking to CUDA cores, it tends to run slower than running natively on nVidia, if it works at all. The RX6600 would do image gen, if slowly, with very strict limits on image size. I couldn't train LoRA files with it. The Python code for those specific functions isn't fooled by the translation layer. Kohya_ss would just puke up ten thousand lines of Python errors and halt. Maybe it's been fixed since then. Maybe it hasn't.
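If you want to see which backend your PyTorch install actually exposes, here's a quick probe (a sketch, not anything from the projects above; the function name is mine). One quirk worth knowing: on ROCm builds, `torch.cuda.is_available()` also returns True, because HIP presents itself through the same "cuda" device API, which is exactly why some code half-works on AMD and then dies in the parts that aren't fooled.

```python
def gpu_backend() -> str:
    """Best-effort guess at which GPU backend PyTorch was built for."""
    try:
        import torch
    except ImportError:
        return "no torch installed"
    # ROCm builds set torch.version.hip to a version string; CUDA builds leave it None.
    if getattr(torch.version, "hip", None):
        return "rocm"
    # Note: on ROCm this would ALSO be True, hence the hip check comes first.
    if torch.cuda.is_available():
        return "cuda"
    return "cpu"

print(gpu_backend())
```

Running that before launching a trainer at least tells you whether you're on real CUDA or on HIP pretending to be CUDA.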
tl;dr it depends entirely on what you want to do