The 15th annual Augmented World Expo kicked off yesterday in Long Beach, CA. I reprised my role as Main Stage host, introducing keynote speakers throughout the day. I’ll share a few quick notes before heading back into the conference for Day 2. You can expect a more detailed take on AWE in next week’s newsletter.
AI ❤️ XR
“AI ❤️ XR” was a key theme from AWE co-founder Ori Inbar’s keynote, setting the stage for a message that permeated the rest of the day. For instance, Said Bakadir, Sr. Director of Product Management at Qualcomm, noted that XR is an ideal interface for interacting with generative AI. Bobby Murphy, Snap’s CTO and Co-Founder, introduced Lens Studio 5.0 Beta, which includes a generative AI suite. Its features include generating textures and face masks within Lens Studio, saving creatives time they would otherwise spend searching for or building these assets. This message builds on a key theme from Ori’s AWE 2023 keynote: XR is the interface for AI. It also demonstrates the ever-increasing overlap between AI and XR to enable new use cases, accelerate creators’ workflows, and power more natural user interactions (e.g., via multimodal input).
Lessons from XR History
Another key theme from Ori’s keynote was that we need to learn from the history of XR to effectively replace 2D computing. I’d posit that most of us at AWE have bet on XR as the next major computing paradigm. Identifying potential, desirable XR futures requires learning from that history. I’m excited to check out today’s talk by AR/VR pioneer Dr. Tom Furness, We Need a Super Cockpit for the Mind, and to visit the XR History Museum.
Come say hi!
Later today, I’ll be speaking on a panel, How Generative AI Can Make XR Creation More Accessible. Tomorrow, I’m giving a talk on Building for the Future of Human Cognition. If you’re considering coming to AWE, it’s not too late! Sign up using code SPKR24D for a 20% discount.
Check out all of the AWE news highlights from Day 1 here, and watch the Main Stage livestream here.
Human-Computer Interaction News
Distance Technologies raises $2.7M for glasses-free extended reality apps: Distance Technologies raised $2.7 million in pre-seed funding to build XR applications that don’t rely on traditional headsets or glasses, with the goal of making XR more accessible.
State of User Research Report: The annual survey by User Interviews found that researchers still rely on tried-and-tested methods (e.g., interviews, surveys, usability testing) to do their work. That said, 56% of researchers now use AI to support their work, a significant rise from last year. Other findings cover how layoffs affected respondents, how researchers acquire knowledge about their field, and stakeholder buy-in on the importance of research.
AI recognizes athletes' emotions: Researchers trained a neural network on data from actual tennis matches and were able to accurately identify players' affective states from their body language during games. The study demonstrates that AI can assess body language and emotions with accuracy similar to that of humans, though it also points to ethical concerns. Find the full paper here.
Is your team working on AR/VR solutions? Sendfull can help you test hypotheses to build useful, desirable experiences. Reach out at hello@sendfull.com
That’s a wrap 🌯 . More human-computer interaction news from Sendfull next week.