Meta Connect 2024 kicked off on Wednesday with a series of speeches, keynotes, and product reveals delivered directly to the public.

Hosted on Meta’s Facebook, Twitter, and other social media accounts, the annual gathering unveiled a major shift in focus for the company’s Reality Labs and Meta AI divisions.

Mark Zuckerberg, Meta’s Founder and Chief Executive, rolled out a trove of new products, solutions, and teasers to a packed audience at the conference. Here are the key highlights from Zuckerberg’s big reveal.

Meta Quest 3S Joins the Family

Zuckerberg officially debuted the Meta Quest 3S, an affordable version of the Quest 3 that will ship on 15 October for $299.

According to the exec, the new headset would reach a wider customer base by offering comparable compute and functionality at a budget-friendly price.

Currently, the Quest 3S offers comparable features to the Quest 3, including the same Qualcomm Snapdragon XR2 Gen 2 processor, vivid mixed reality passthrough, seamless hand tracking, and Horizon OS. However, it will include Fresnel rather than pancake lenses.

Meanwhile, software and efficiency enhancements have boosted the device’s compute, making it better optimised than the Quest 3 was at its own launch.

He told the audience,

“In the past year, we’ve made so many improvements and optimizations to the technical stack, effective resolution, and latency that they are actually better in the Quest 3s at [two ninety-nine] today, than it was in the Quest 3 when we launched it a year ago.”

Zuckerberg added that the Quest family of headsets would also gain upgraded collaboration and integration with Microsoft, improving functionality and mixed reality viewing for the latter’s family of apps.

He also said that Horizon OS would create a “primitive” hyperlink system via Meta Portals to move between different immersive experiences, significantly enhancing interoperability between immersive worlds.

New features would also allow YouTube video co-watching through an upcoming partnership with Google.

The Meta Quest Store would also host a new collection of games such as Batman: Arkham Shadow, Square Enix’s Triangle Strategy, Just Dance, and Meta’s recently acquired Supernatural fitness game, Zuckerberg said.

Photorealistic People, Places, and Things

To fully immerse users in their virtual worlds, Meta has upgraded Horizon OS virtual spaces with more photorealistic avatars and environments.

Using Quest 3 LiDAR and camera functionality, users can scan and upload real environments, creating virtual shared spaces known as hyperscapes.

By scanning, sharing, and recreating their favourite environments, users can create bespoke immersive worlds for metaverse experiences with the Hyperscape beta app.

Meta AI: Upgraded

In a massive overhaul of Meta’s AI capabilities, Zuckerberg announced that Meta AI has reached multimodality, understanding both images and text.

The latest upgrades also include natural voice interactions for engaging with AI bots and photorealistic avatars.

Initially, AI Studio provided text-only features, but with the new live AI avatars, people can interact with virtual representations of their favourite stars, people, and personalities.

Zuckerberg demoed the hyperrealistic avatars with Don Allen Stevenson, Creative Technologist and Futurist, to showcase their realistic interactions.

Regarding Meta’s newest large language model (LLM) release, Llama 3.2, Zuckerberg championed support for open-source models, stating they offered benefits to developers compared to closed-source ones.

To date, Llama holds the world’s largest monthly active user base at 500 million, he noted.

He lauded Meta’s open-source stance, stating,

“I think open-sourcing [is] definitely going to be the most cost effective, customizable, trustworthy and performant option out there for developers to use, and I’m really proud of the work that we’re doing on this.”

Furthermore, the new LLM would empower users with real-time voice dubbing for language translations across its family of apps, breaking down language barriers on social media.

The Smartest Glasses on the Block

In a nod to Meta’s Ray-Ban Stories successes, Zuckerberg explored several key enhancements for the latest iteration of the smart glasses.

New updates included Audible and iRadio integrations. Meanwhile, the Ray-Ban Stories’ computer vision would now offer even greater functionality, such as remembering objects, scanning QR codes, calling phone numbers, setting reminders, and remembering parking spaces.

“We’re adding multimodal video AI, so Meta AI is now going to be able to give you real time help as you were doing things,” Zuckerberg said.

During his speech, he showed audiences a demo of Stories offering dressing tips and advice using Meta AI and real-time computer vision.

In a demo with UFC Flyweight Champion Brandon Moreno, the two showcased the Ray-Ban Stories’ latest real-time translation capabilities live.

The demo comes just days after Meta extended its partnership with Franco-Italian eyewear maker EssilorLuxottica for a further decade, following successful sales of the second-generation Stories.

Project Orion: The Future of AR Smart Glasses

Codenamed Orion, the holographic glasses were “the most advanced glasses the world has ever seen,” Zuckerberg said.

He noted that Meta had begun work on Orion a decade ago but, due to technical challenges, the holographic device needed to meet specific benchmarks before release.

They needed to weigh less than 100 grammes, be wireless and tetherless, carry a wide field of view (FoV), and have holographic displays “sharp enough to pick up details [and] bright enough to see in different lighting conditions,” he added.

To fully optimise the device before a wider launch, Zuckerberg stressed that Orion would first release to a handful of partners as a developer kit, before becoming the firm’s “first consumer full holographic AR glasses.”

However, what sets Orion apart from other devices is its use of a neural interface, Meta’s latest wristband solution, which reads neural signals at the wrist to control the glasses and eliminates the need for controllers.

Explaining further, Zuckerberg said,

“This isn’t just the first full screen, full wide field of view holographic AR glasses. This is also the first device that is powered by our wrist-based neural interface. So when people have gotten to try out these glasses, we’ve shown them to a handful of people [at] this point. It’s like an emotional experience.”

Meta would continue to refine the device with sharper visuals, lighter form factors, and a more seamlessly integrated interface for users.

He concluded: “When we have the next version of this hardware, it is going to be ready to be our first consumer full holographic AR glasses.”

Meta formerly codenamed the smart glasses Project Nazare.

The news comes after Zuckerberg hinted in a fireside chat with NVIDIA Founder and CEO Jensen Huang that Meta aimed to shift to AR smart glasses capable of delivering comfortable form factors, long battery life, and enhanced LLM functionality.

Meta Connect is an annual gathering that debuts some of the latest solutions from the Menlo Park-based firm. It covers the company’s advancements in virtual, augmented, and mixed reality (VR/AR/MR), artificial intelligence (AI), large language models, and its family of apps — WhatsApp, Facebook, Messenger, and Instagram — among many others.

Like this article? Be sure to like, share, and subscribe for all the latest updates from DxM!
