That would be very cool. I am eyeing an ARC A770 16GB for ML applications, but if I can keep it to 8–12GB of VRAM, a lot of second-hand GPUs would also work.
Maybe, but I would be worried that I couldn’t always run any application I wanted, or try out brand-new stuff, because I’d have to wait for support to be added.
Staff said 8GB will be enough.
Don’t get an Arc. ML tooling mostly relies on CUDA, which is only supported on Nvidia cards.
Apparently Intel’s oneAPI is catching up quickly to CUDA, at least compared to AMD’s ROCm mess.
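In practice the backend question shows up as a device check at the top of a training script. A minimal sketch, assuming a recent PyTorch build (the `torch.xpu` namespace for Intel Arc via oneAPI landed in newer releases and may not exist in older ones):

```python
# Hedged sketch: pick the best available PyTorch device string.
# "cuda" = Nvidia, "xpu" = Intel Arc via oneAPI; falls back to "cpu"
# when neither backend (or torch itself) is available.
def pick_device() -> str:
    try:
        import torch
    except ImportError:
        return "cpu"  # torch not installed at all
    if torch.cuda.is_available():
        return "cuda"  # Nvidia card with CUDA support
    # torch.xpu only exists in newer PyTorch builds, so probe defensively
    xpu = getattr(torch, "xpu", None)
    if xpu is not None and xpu.is_available():
        return "xpu"  # Intel Arc through oneAPI
    return "cpu"

print(pick_device())
```

The wait-for-support concern above is exactly this: until a framework ships an `xpu`-style backend, the fallback branch is all a non-Nvidia card gets.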
Microsoft and AMD have joined hands on making AI chips, so there is hope for AMD.
Yeah, staff said he fine-tuned it on a 3090.