22.05.2024 21:41:35 - dpa-AFX: Meta To Launch Chameleon Multi-modal LLM

WASHINGTON (dpa-AFX) - As the artificial intelligence game intensifies, Meta
Platforms (META) is working on a state-of-the-art multi-modal large language
model named Chameleon.

According to the company's research paper, the proposed LLM can on its own
perform tasks that previously required several separate models, and it
integrates information across modalities better than its predecessors.

The paper noted that Chameleon uses an 'early-fusion token-based mixed-modal'
architecture, under which the model learns from a combination of images, code,
text, and other inputs. Additionally, it uses a mix of image, text, and code
tokens to generate sequences.

'Chameleon's unified token space allows it to seamlessly reason over and
generate interleaved image and text sequences, without the need for
modality-specific components,' the research paper stated.
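The idea behind early fusion is that every modality is converted into tokens in one shared vocabulary, so a single sequence can interleave text and image content. The toy sketch below illustrates that principle only; the tokenizer, vocabulary sizes, and ID ranges are invented for illustration and are not Chameleon's actual implementation.

```python
# Illustrative sketch of "early fusion": each modality is mapped into one
# shared token ID space, then interleaved into a single flat sequence that
# a single transformer would consume. All sizes/IDs here are hypothetical.

TEXT_VOCAB_SIZE = 65_536      # hypothetical text token IDs: [0, 65535]
IMAGE_CODEBOOK_SIZE = 8_192   # hypothetical image codebook, offset after text


def text_to_tokens(text: str) -> list[int]:
    """Toy text tokenizer: one token per character (stand-in for real BPE)."""
    return [ord(c) % TEXT_VOCAB_SIZE for c in text]


def image_to_tokens(pixels: list[int]) -> list[int]:
    """Toy image quantizer: bucket each pixel value into a codebook ID,
    offset so image tokens never collide with text token IDs."""
    return [TEXT_VOCAB_SIZE + (p % IMAGE_CODEBOOK_SIZE) for p in pixels]


def fuse(segments: list[tuple[str, object]]) -> list[int]:
    """Interleave (modality, payload) segments into one flat token sequence."""
    seq: list[int] = []
    for modality, payload in segments:
        if modality == "text":
            seq.extend(text_to_tokens(payload))
        else:
            seq.extend(image_to_tokens(payload))
    return seq


sequence = fuse([
    ("text", "A photo of a cat: "),
    ("image", [12, 300, 7, 45]),   # pretend these are quantized patches
    ("text", " It is sleeping."),
])
# The model would see one uniform token stream with no
# modality-specific branches, matching the quote above.
```

Because all tokens live in one ID space, the same next-token objective covers both generating text about an image and generating image tokens after text, which is what "seamlessly reason over and generate interleaved image and text sequences" refers to.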

The model is trained in two stages on a dataset of 4.4 trillion tokens
comprising text, image-text pairs, and sequences of interleaved text and
images. The researchers trained two versions of Chameleon, one with 7 billion
parameters and one with 34 billion parameters, for more than 5 million GPU
hours on Nvidia A100 80GB GPUs.

Meanwhile, Meta's competitors have made their own moves: OpenAI recently
launched GPT-4o, and Microsoft (MSFT) introduced its MAI-1 model a few weeks
ago.



Copyright(c) 2024 RTTNews.com. All Rights Reserved
