Social Sonar: spatial audio for social wayfinding

Timeline: 6 months (Feb - Aug 2025)
Methods: Multi-modal design, Rapid prototyping, Contextual inquiry
Team: 2 Product Designers, 1 Design Engineer, 1 Sound Designer
Partner: Meta Reality Labs Spatial Audio Team
My Role: Lead UX Researcher, Product Designer
Led and moderated generative and evaluative research
Architected the system design of the end-to-end experience
Designed mobile UI for the first-time user experience & Messenger integration
imagine.
You're meeting up with a friend at the beach. They've arrived first and you need to find them. This might look something like:
You have their location and a photo of their surroundings but:
Where the heck are they?
What if you could just hear where your friend is instead of playing Where’s Waldo in real life?
introducing social sonar.
What It Is
Social Sonar is a screen-free, hands-free experience on the Ray-Ban Meta Smart Glasses (RBMs). Two users find each other using real-time spatialized audio and location tracking.
Why It Works
Hearing is the fastest-reacting human sense: people respond to sound more quickly than to visual cues. Spatial audio leverages that speed to enable faster, more natural connection and presence in a screen-saturated world.
How It Works
Social Sonar uses two kinds of cues: beacons and phase notifications. To experience what they sound like firsthand, put on headphones or earbuds.
Beacon: spatialized directional pulses that communicate relative distance apart
Phase notification: non-spatialized verbal cues that help both parties stay updated on the journey
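Since our proof-of-concept ran as a React-based web app (see Key Deliverables), cues like these can be rendered with the browser's Web Audio API. Below is a minimal, hypothetical sketch rather than our actual prototype code: the HRTF panning model is a reasonable choice for headphone listening, and the frequencies, gains, and timings are illustrative stand-ins. It assumes the friend's relative bearing and distance have already been computed from GPS and heading data.

```ts
// Hypothetical sketch: spatialize one beacon pulse with the Web Audio API.
// Assumes relative bearing (radians, 0 = straight ahead) and distance (m)
// are already known; all constants here are illustrative.
const ctx = new AudioContext();

function playBeacon(bearingRad: number, distanceM: number): void {
  const osc = ctx.createOscillator();
  const gain = ctx.createGain();
  const panner = ctx.createPanner();

  panner.panningModel = "HRTF";      // binaural rendering for headphones
  panner.distanceModel = "inverse";  // quieter as the friend gets farther
  // Place the source around the listener (x = right, -z = straight ahead
  // in Web Audio's coordinate space), scaled by distance.
  panner.positionX.value = Math.sin(bearingRad) * distanceM;
  panner.positionZ.value = -Math.cos(bearingRad) * distanceM;

  osc.frequency.value = 880;         // a short, bright ping
  gain.gain.setValueAtTime(0.5, ctx.currentTime);
  gain.gain.exponentialRampToValueAtTime(0.001, ctx.currentTime + 0.3);

  osc.connect(gain).connect(panner).connect(ctx.destination);
  osc.start();
  osc.stop(ctx.currentTime + 0.3);
}
```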
🏖️🚶🏻‍♂️ System activates upon arrival at location
Your friend arrives first and tells Social Sonar that they're waiting for you.
You arrive shortly after. Social Sonar tells you it's activated but won't start until you're closer to your friend.
🦻🏻🔊 Phase 1: Audio beacons begin
Both you and your friend hear Phase Notification 1 when you're ~3 minutes away from each other. Cues for direction and distance then play at a steady cadence.
📍🚶‍♀️ Phase 2: Approaching your friend
The beacons accelerate, indicating that you're approaching your friend. Both of you hear Phase Notification 2.
👀 👋🏼 Phase 3: Within eyeshot
Social Sonar instructs both of you to look up and scan for your friend's face. A finale of beacons plays, and Social Sonar stops when the RBMs detect a hand wave via the built-in camera.
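To make the three phases concrete, here's a hypothetical sketch of how the phase logic and accelerating cadence could be driven off the live distance between the two users. The thresholds and the cadence curve are illustrative stand-ins, not the tuned values from our prototype.

```ts
// Hypothetical sketch of the phase logic; thresholds are illustrative.
type Phase = "inactive" | "beacons" | "approaching" | "eyeshot";

function phaseFor(distanceM: number): Phase {
  if (distanceM > 250) return "inactive";   // activated, but waiting
  if (distanceM > 80) return "beacons";     // Phase 1: steady pulses
  if (distanceM > 20) return "approaching"; // Phase 2: pulses accelerate
  return "eyeshot";                         // Phase 3: look up and wave
}

// Beacon cadence accelerates as the gap closes.
function beaconIntervalMs(distanceM: number): number {
  const clamped = Math.min(Math.max(distanceM, 5), 250);
  return 300 + (clamped / 250) * 1700;      // ~2s far away, ~0.3s up close
}
```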
Impact
In a final evaluative test, all 4 participating pairs of friends, each new to both spatial audio and Social Sonar, found each other successfully.

The moment when 2 friends found each other using only Social Sonar
“That was really fun…very therapeutic at the same time.”
- P4, tasked with waiting to be found by friend
"Something that Google & Apple Maps doesn't have is an emotional aspect where the closer you are to someone, the more excited you get."
- P5, tasked with looking for friend
Key Deliverables
We did a live demo at the Meta office and handed off a functional prototype to the Reality Labs Research Audio team.
Proof-of-concept: a React-based web application prototype that uses heading* and GPS data to continuously measure and update navigation for 2 users simultaneously
*Heading data: direction that a user is facing
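As a rough illustration of the proof-of-concept's core math, the sketch below derives a friend's relative bearing and straight-line distance from two GPS fixes plus the user's compass heading. These are standard great-circle formulas; the function names are mine for illustration, not our actual code.

```ts
// Hypothetical sketch: derive relative bearing and distance from GPS + heading.
interface Fix { lat: number; lon: number; } // degrees

const toRad = (d: number) => (d * Math.PI) / 180;
const toDeg = (r: number) => (r * 180) / Math.PI;

// Initial great-circle bearing from `me` toward `friend`, degrees from north.
function bearingTo(me: Fix, friend: Fix): number {
  const φ1 = toRad(me.lat), φ2 = toRad(friend.lat);
  const Δλ = toRad(friend.lon - me.lon);
  const y = Math.sin(Δλ) * Math.cos(φ2);
  const x = Math.cos(φ1) * Math.sin(φ2) -
            Math.sin(φ1) * Math.cos(φ2) * Math.cos(Δλ);
  return (toDeg(Math.atan2(y, x)) + 360) % 360;
}

// Relative bearing: 0° = friend straight ahead, 90° = to the user's right.
function relativeBearing(me: Fix, friend: Fix, headingDeg: number): number {
  return (bearingTo(me, friend) - headingDeg + 360) % 360;
}

// Haversine distance in meters, which drives beacon cadence and phase changes.
function distanceMeters(a: Fix, b: Fix): number {
  const R = 6371000; // Earth radius, m
  const Δφ = toRad(b.lat - a.lat), Δλ = toRad(b.lon - a.lon);
  const h = Math.sin(Δφ / 2) ** 2 +
            Math.cos(toRad(a.lat)) * Math.cos(toRad(b.lat)) * Math.sin(Δλ / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(h));
}
```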
How the system works: technical architecture
First-time UX, including onboarding and education, and Messenger integration
Onboarding to Social Sonar via the Meta AI app
Activating Social Sonar via Facebook Messenger
process at a glance.
Timeline
By the Numbers
30 participants in generative research
12 RITE (Rapid Iterative Testing and Evaluation) tests
4 pairs of friends (8 participants) in evaluative research
defining the problem space.
Scoping
Spatial audio’s superpowers enable a variety of use cases, including social wayfinding.
How we landed on the problem space
Why Spatial Audio?
Meta Reality Labs was exploring use cases where auditory display outperforms graphical and voice user interfaces. Our stakeholders gave us 1 constraint: to create a product for the Meta Quest or Ray-Ban Meta Smart Glasses (RBMs).
Why Social Wayfinding?
Spatial audio reduces the hassle and chaos of finding and meeting up with friends.
Spatial audio affords:
Screen-free, hands-free navigation
Precise in-situ orientation
🤝
Social wayfinding needs:
Real-time coordination
Precise navigation to the person
generative research.
Summary
To understand the status quo of social wayfinding, I led research on friend-finding behaviors, from selecting the methodology to moderating sessions. I observed how people use current tools and the workarounds they rely on when those tools fall short.
To bridge research and design, I derived action-oriented design principles.
Research Findings → Design Principles
Finding: Social wayfinding relies on mutual awareness, reassurance, and accountability.
Principle: Provide reassurance at the right times, mirroring how people naturally check in with one another.
Finding: When cognitive load is low, people anticipate and plan ahead, sketching the rest of the journey in their mind.
Principle: Unobtrusively support that anticipation by communicating distance and direction without pulling users back to a screen.
Finding: Communication is used to manage expectations and build reliability, rather than provide pinpoint accuracy.
Principle: Model the Seeker-Waiter dynamic in the experience to mirror how people already negotiate roles and timing in social wayfinding.
Finding: In the final stretch, people trust their eyes over devices, prioritizing visual recognition to close the gap.
Principle: Hand off from digital guidance to human vision at the right moment.
design.
Summary
These principles translated our social wayfinding findings into 3 major design decisions for Social Sonar.
Approach
We designed through prototyping: internal pilots, Wizard-of-Oz sessions, RITE testing, and more.
I designed the interactivity of this multi-modal experience, specifying how the Waiter's and Seeker's experiences merge into a single cohesive and shared wayfinding experience.
A diagram of how 2 users and their systems work in tandem to successfully facilitate a social meetup
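As a hypothetical sketch of what that diagram implies technically, each client might periodically share a small state update with its counterpart, for example over a realtime channel. The field names below are mine for illustration, not the prototype's actual schema.

```ts
// Hypothetical sketch of the state each client shares with its counterpart
// (e.g., over a WebSocket); field names are illustrative.
interface PeerUpdate {
  userId: string;
  role: "seeker" | "waiter";
  lat: number;        // degrees
  lon: number;        // degrees
  headingDeg: number; // compass heading, degrees from north
  sentAt: number;     // ms since epoch, for staleness checks
}
```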
The audio design evolved significantly across the 3 sprints, but here's a sample of what we had at the beginning vs. the end.
Early samples

Distance

Direction
Final samples

Distance

Direction
Key Changes
Spatialization always in direction of friend’s location
Changes in pitch help localization
Swelling pad that repeats cadence
Three mallet ‘pings’ to communicate distance
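As a rough approximation of those changes, the hypothetical sketch below plays three mallet-style pings whose pitch rises as the distance shrinks; the constants are illustrative, not the Sound Designer's actual tuning.

```ts
// Hypothetical sketch: three quick pings whose pitch rises as distance shrinks.
function playDistancePings(ctx: AudioContext, distanceM: number): void {
  // Map 250m..5m onto roughly one octave (440 Hz up to 880 Hz).
  const t = 1 - Math.min(Math.max(distanceM, 5), 250) / 250;
  const freq = 440 * Math.pow(2, t);

  for (let i = 0; i < 3; i++) {
    const osc = ctx.createOscillator();
    const gain = ctx.createGain();
    const start = ctx.currentTime + i * 0.18;
    osc.frequency.value = freq;
    gain.gain.setValueAtTime(0.4, start);
    gain.gain.exponentialRampToValueAtTime(0.001, start + 0.15);
    osc.connect(gain).connect(ctx.destination);
    osc.start(start);
    osc.stop(start + 0.15);
  }
}
```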
Area of Focus: End-to-End Experience Design
I mapped the end-to-end experience to make sure the audio experience fit into a greater interactive system including first-time user experience and activation. I clarified touch points in a system diagram before working with pixels.
First-time user experience of Social Sonar
Activating Social Sonar between 2 users
First-Time User Experience: Education
The challenge in designing education was providing a sufficient crash course on spatial audio within 1-2 minutes while inverting the conventional, map-based mental model of navigation. I collaborated closely with the Sound Designer to determine how the screens support the audio demos.
Before Concept Testing
Used maps as a complementary visual to match existing mental models.
Map-based animation
After Concept Testing
Despite Social Sonar's goal of reducing screen usage, users preferred having a visual aid during audio demos because localization ability varies across individuals, especially among spatial audio novices during onboarding.
Removed the maps to indicate that Social Sonar directs users as the crow flies, agnostic to what's visible on a map
Activation: Messenger Integration
Activating Social Sonar occurs within Facebook Messenger—Meta's social messaging app.
I spearheaded the discussion around whether Social Sonar should be a standalone mobile application or a built-in feature of an existing Meta application. I based my decision on generative research, which showed that texting and calling were the baseline for coordinating social meetups. As a result, I chose to introduce Social Sonar within Meta's social apps, such as Facebook Messenger.
Before Concept Testing
Based on an audit of other location-sharing products (e.g., Apple's Find My), I designed a flow that asked for permissions and session duration at the start of every instance.
Wireframing what it's like to start Social Sonar with a friend
After Concept Testing
I distinguished chat bubbles from system messages more clearly. I also eliminated a screen by creating a contextual menu for the duration settings, and I changed the system messages into tiles that show a preview. I assumed permissions would be a one-time task during onboarding within the Meta AI app.
The next iteration
evaluative research.
Summary
To test how novices experience Social Sonar, I conducted evaluative research with 4 pairs of participants unfamiliar with spatial audio and Social Sonar.
Methodology
I planned and moderated a qualitative usability test that probed participants' ability to localize (conceptualize direction and distance) and find their friend.
Setup: Participants began < 500 m apart and walked through the screen-based onboarding.
Task: Locate one another in a large, public space using the system, with minimal moderator guidance.
Findings
The moment when Seeker 3 finds Waiter 3
However, Social Sonar's precision is inherently lower than that of visual navigation tools, so users can't preview or plan their route in advance. Persistent, continuous information delivery also isn't feasible via audio because of listening fatigue.
outcomes.
Social Sonar
Bespoke sound assets, as played throughout a video MVP
First-Time UX
Introducing Social Sonar via the Meta AI app.
Onboarding includes educational demo tracks and visual aids to familiarize the user with spatial audio.
Hear it for yourself!
Activation
Users can access Social Sonar via Facebook Messenger.