Zyphra ZAYA1 becomes the first large-scale Mixture-of-Experts model trained entirely on AMD Instinct™ MI300X GPUs, AMD Pensando™ networking, and ROCm open software. ZAYA1-base outperforms Llama-3 ...
HOUSTON--(BUSINESS WIRE)--Hewlett Packard Enterprise (NYSE: HPE) today announced the HPE ProLiant Compute XD685 for complex AI model training tasks, powered by 5th Gen AMD EPYC™ processors and AMD ...
AMD is in a celebratory mood after AI research firm Zyphra successfully trained its cutting-edge, large-scale Mixture-of-Experts (MoE) model, ZAYA1, entirely on AMD’s accelerated computing platform, ...
ZAYA1-base, the core version of the model, has 8.3 billion total parameters with 760 million active parameters. AMD has stated that Zyphra has developed ZAYA1, a Mixture-of-Experts (MoE) foundation ...
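The gap between ZAYA1-base's 8.3 billion total and 760 million active parameters is characteristic of MoE models: every expert's weights count toward the total, but each token is routed to only a few experts at inference time. A minimal sketch of that accounting; the expert count, layer sizes, and top-k value below are illustrative assumptions, not Zyphra's actual ZAYA1 configuration:

```python
def moe_param_counts(n_experts, expert_params, shared_params, top_k):
    """Return (total, active) parameter counts for a simple MoE model.

    All experts contribute to the total parameter count, but each
    token activates only `top_k` of them plus the shared (non-expert)
    weights, so the active count is a small fraction of the total.
    """
    total = shared_params + n_experts * expert_params
    active = shared_params + top_k * expert_params
    return total, active

# Hypothetical configuration chosen so the ratio resembles ZAYA1-base's
# reported figures (8.3B total vs 760M active); these numbers are
# assumptions for illustration only.
total, active = moe_param_counts(
    n_experts=64, expert_params=125_000_000,
    shared_params=260_000_000, top_k=4)
print(total, active)  # ~8.26B total, 760M active with this toy config
```

The roughly 11:1 total-to-active ratio is why MoE models can scale capacity well beyond the compute budget of a dense model of the same active size.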
Joint collaboration between Zyphra, AMD, and IBM delivers ZAYA1, the first large-scale Mixture-of-Experts foundation model trained entirely on an AMD platform using AMD Instinct MI300X GPUs, AMD ...
SANTA CLARA, Calif., Nov. 24, 2025 (GLOBE NEWSWIRE) -- AMD (NASDAQ: AMD) announced that Zyphra has achieved a major milestone in large-scale AI model training with the development of ZAYA1, the first ...