National Cyber Warfare Foundation (NCWF) Forums


Apple debuts OpenELM, a family of language models with 270M, 450M, 1.1B, and 3B parameters, designed to run on-device, pre-trained and fine-tuned on public data


2024-04-24 17:33:25
milo
Developers

Shubham Sharma / VentureBeat:

Apple debuts OpenELM, a family of language models with 270M, 450M, 1.1B, and 3B parameters, designed to run on-device, pre-trained and fine-tuned on public data  —  Just as Google, Samsung and Microsoft continue to push their efforts with generative AI on PCs and mobile devices …




Source: TechMeme
Source Link: http://www.techmeme.com/240424/p35#a240424p35
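Since the OpenELM weights are released openly, one quick way to try the smallest 270M variant is through the Hugging Face transformers library. The following is a minimal sketch, assuming the repository name apple/OpenELM-270M, that the checkpoint ships custom modeling code requiring trust_remote_code, and that it pairs with a LLaMA-2-style tokenizer; check the actual model card before relying on any of these details.

# Minimal sketch: loading and sampling from an OpenELM checkpoint with
# Hugging Face transformers. The repo ID, the trust_remote_code flag, and
# the tokenizer pairing are assumptions, not details confirmed by the post.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "apple/OpenELM-270M"            # assumed repo name for the 270M variant
TOKENIZER_ID = "meta-llama/Llama-2-7b-hf"  # assumed compatible tokenizer (gated repo)

def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Generate a short completion from the on-device-sized 270M model."""
    tokenizer = AutoTokenizer.from_pretrained(TOKENIZER_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        trust_remote_code=True,  # assumed: OpenELM uses custom modeling code
    )
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

if __name__ == "__main__":
    print(generate("On-device language models are useful because"))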


Comments
Nobody has commented yet. Will you be the first?
 


