Meta Is Developing New Multimodal AI Model Chameleon to Rival OpenAI's GPT-4o