“Phison is leading this campaign with its aiDAPTIV+ LLM Training Integrative Solution. That name is pretty long, though, so I’ll simply use the word aiDAPTIV+ from here on.
. . .
Well, the folks at Phison decided to apply this principle to AI. Their aiDAPTIV+ design uses specialized SSDs to reduce the amount of HBM DRAM required in an LLM training system.
Phison argues that the number of GPUs a system needs is determined by how much DRAM the workload requires. Their argument is that if each GPU has 80GB of HBM, and it’s running a model that needs 800GB for its parameters, then you need ten GPUs.”
Source: pcper.com
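The sizing arithmetic in that argument is straightforward to sketch. The snippet below is a minimal illustration (not Phison's actual tooling): it computes the minimum GPU count under the assumption that all model parameters must fit in HBM, which is the baseline aiDAPTIV+ aims to reduce.

```python
import math

def min_gpus(model_mem_gb: float, hbm_per_gpu_gb: float) -> int:
    """Minimum GPUs needed if the entire model must reside in HBM."""
    return math.ceil(model_mem_gb / hbm_per_gpu_gb)

# The example from the quote: 800GB of parameters, 80GB of HBM per GPU.
print(min_gpus(800, 80))  # → 10
```

Note that `math.ceil` matters for non-divisible cases: a 900GB model on 80GB GPUs needs 12 GPUs, not 11.25.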