Here’s why 100TB+ SSDs will play a huge role in ultra large language models in the near future



  • Kioxia reveals a new project called AiSAQ that wants to replace RAM with SSD for AI data processing
  • Larger SSDs (read: 100TB+) could improve RAG at a lower cost than using memory alone
  • No timeline has been given, but expect Kioxia’s rivals to offer similar technology

Large language models often generate plausible but factually incorrect outputs – in other words, they make things up. These “hallucinations” can damage reliability in critical information tasks such as medical diagnosis, legal analysis, financial reporting and scientific research.

Retrieval-Augmented Generation (RAG) mitigates this problem by integrating external data sources, allowing LLMs to access relevant information in real time during generation – reducing errors and, by grounding responses in actual data, improving contextual accuracy. Implementing RAG effectively requires substantial memory and storage resources, particularly for large-scale vector and index data. Traditionally, this data has been stored in DRAM, which, while fast, is both expensive and limited in capacity.
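The retrieval step described above can be sketched in a few lines. This is a minimal, illustrative example, not Kioxia's AiSAQ implementation: the toy character-frequency "embedding" and the linearly scanned in-memory list stand in for a learned embedding model and a real vector index (which, per the article, would traditionally live in DRAM and could instead reside on SSD).

```python
import math

def embed(text):
    # Toy embedding: normalized letter-frequency vector.
    # A real RAG system would use a learned embedding model.
    vec = [0.0] * 26
    for ch in text.lower():
        if "a" <= ch <= "z":
            vec[ord(ch) - ord("a")] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

def cosine(a, b):
    # Vectors are already unit-normalized, so the dot product
    # is the cosine similarity.
    return sum(x * y for x, y in zip(a, b))

def retrieve(query, corpus, k=2):
    # Linear scan over the corpus; in production this is a vector
    # index held in DRAM (or, with AiSAQ-style designs, on SSD).
    q = embed(query)
    ranked = sorted(corpus, key=lambda doc: cosine(q, embed(doc)), reverse=True)
    return ranked[:k]

corpus = [
    "Kioxia AiSAQ stores vector search data on SSD instead of DRAM.",
    "DRAM is fast but expensive and limited in capacity.",
    "Retrieval-Augmented Generation grounds LLM output in external data.",
]
context = retrieve("Why use SSDs for RAG vector indexes?", corpus)
# The retrieved passages are prepended to the LLM prompt to ground the answer.
prompt = "Context:\n" + "\n".join(context) + "\n\nQuestion: ..."
```

The point of designs like AiSAQ is that `corpus` and its index grow far beyond what DRAM can economically hold, so moving them to high-capacity SSDs trades some latency for much cheaper scale.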
