
social sonar: pioneering social wayfinding with spatial audio
Lead UX Researcher, Product Designer
Systems thinking | Experience design | Rapid prototyping | Concept testing | Contextual inquiry
6 months (Feb - Aug 2025)
2 Product Designers, 2 Design Engineers

Sponsored by Meta Reality Labs
context.
You're meeting up with a friend at the beach, but you have no idea where they are.
You have their location, but it's crowded, your hands are full, and all the umbrellas look the same.
Where the heck are they?
overview.
Spatial audio enables faster, more natural connection and presence in a screen-saturated world. How might we leverage spatial audio technology to help people find each other in public, self-organized, and outdoor spaces?
The Answer
Social Sonar is a screen-free, hands-free experience on the Ray-Ban Meta (RBM) smart glasses. Two users find each other using real-time spatialized audio and location tracking.
How It Works
Put on some headphones and hear each end of the experience.

📍🚶‍♀️
The Waiter arrives
Waiter's experience
Seeker's experience

🏖️🤔
The Seeker arrives and begins looking for the friend
Both hear Phase Notification 1
Beacons are spatialized, directional pulses that also communicate how far apart the pair is (sketched in code below).
Always spatialized in the direction of the friend's location
Changes in pitch aid localization
3 mallet pings communicate distance
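To make the beacon concrete, here is a minimal sketch of how a browser-based prototype (ours was a React web app) could render one spatialized ping sequence with the Web Audio API. The HRTF panner supplies the directional cue; the pitch mapping and ping timing are illustrative assumptions, not the bespoke Social Sonar sound assets.

```typescript
// Illustrative sketch only: renders one beacon (three decaying, mallet-like
// pings) from a given direction using the Web Audio API's HRTF panner.
const ctx = new AudioContext();

/**
 * Play one spatialized beacon.
 * bearingRad: friend's direction relative to the listener's facing (0 = ahead).
 * distanceM:  meters between the pair.
 */
function playBeacon(bearingRad: number, distanceM: number): void {
  const panner = ctx.createPanner();
  panner.panningModel = 'HRTF'; // binaural cues so the pulse reads as directional
  // Place the virtual source on a unit circle around the listener's head
  // (in Web Audio, the listener faces -Z by default).
  panner.positionX.value = Math.sin(bearingRad);
  panner.positionZ.value = -Math.cos(bearingRad);
  panner.connect(ctx.destination);

  // Assumption: pitch rises as the friend gets closer, aiding localization.
  const baseHz = 440 + 440 * Math.max(0, 1 - distanceM / 250);

  for (let i = 0; i < 3; i++) { // three mallet-style pings per beacon
    const t = ctx.currentTime + i * 0.25;
    const osc = ctx.createOscillator();
    const env = ctx.createGain();
    osc.frequency.value = baseHz;
    env.gain.setValueAtTime(0.8, t);                       // sharp attack...
    env.gain.exponentialRampToValueAtTime(0.001, t + 0.2); // ...fast decay
    osc.connect(env).connect(panner);
    osc.start(t);
    osc.stop(t + 0.2);
  }
}
```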

💬⏳
The Seeker nears the Waiter
Both hear Phase Notification 2 and faster beacons

🙋‍♂️🙋‍♀️
The two users are united
Both hear Phase Notification 3, a non-spatialized beacon finale, and a cue that ends the experience.
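The direction feeding those beacons has to come from somewhere. The case study doesn't detail the implementation, but a minimal sketch follows, assuming each device reports a GPS fix and a compass heading (on the glasses this would come from onboard sensors; in a web prototype, from the browser's geolocation and orientation APIs). All names here are hypothetical.

```typescript
// Sketch: derive the beacon direction as the great-circle bearing from the
// Seeker to the Waiter, made relative to the Seeker's compass heading.
interface Fix {
  lat: number; // degrees
  lon: number; // degrees
}

const toRad = (deg: number): number => (deg * Math.PI) / 180;

/** Initial great-circle bearing from `from` to `to`, in radians (0 = north). */
function bearing(from: Fix, to: Fix): number {
  const phi1 = toRad(from.lat);
  const phi2 = toRad(to.lat);
  const dLon = toRad(to.lon - from.lon);
  const y = Math.sin(dLon) * Math.cos(phi2);
  const x =
    Math.cos(phi1) * Math.sin(phi2) -
    Math.sin(phi1) * Math.cos(phi2) * Math.cos(dLon);
  return Math.atan2(y, x);
}

/** Friend's direction relative to where the listener is facing. */
function relativeBearing(me: Fix, friend: Fix, headingRad: number): number {
  return bearing(me, friend) - headingRad; // feed this into the audio panner
}
```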
My Contributions
Led generative and evaluative research, including methodology, execution, analysis, and reporting
Defined key interaction points in the end-to-end experience
Designed mobile UI for the first-time user experience
Impact
In a final evaluative test, every participating pair of friends successfully found each other using only Social Sonar.
“That was really fun…very therapeutic at the same time.”
- Waiter P4
"Something that Google & Apple Maps doesn't have is an emotional aspect where the closer you are to someone, the more excited you get."
- Seeker P4
We demo'd and delivered a functional prototype and additional assets to the Reality Labs Research Audio team.
A live demo at the Meta offices to our stakeholders
Proof-of-concept: a React-based web application prototype
Onboarding to Social Sonar via the Meta AI app
Timeline

the problem space.
Scoping
Spatial audio’s superpowers enable a variety of use cases, including social wayfinding.

How we landed on the problem space
Why Social Wayfinding?
Spatial audio reduces the hassle and chaos of finding and meeting up with friends.
Spatial audio affords:
Screen-free, hands-free navigation
Precise in-situ orientation
Social wayfinding needs:
Real-time coordination
Precise navigation to the person
research.
Objective
To understand the status quo of social wayfinding, I led research on friend-finding behaviors, from selecting the methodology and writing the protocol to moderating sessions. I observed how people use current tools and the workarounds they rely on when those tools fall short.
Bridging Research and Design
To make this research actionable, I derived basic principles that the design should aim to achieve.
Research Finding → Design Implication
Finding: Social wayfinding relies on mutual awareness, reassurance, and accountability.
Implication: Provide reassurance cues at the right times, mirroring how people naturally check in with one another.
Finding: When cognitive load is low, people anticipate and plan ahead, sketching the rest of the journey in their mind.
Implication: Unobtrusively support that anticipation by communicating distance and direction without pulling users back to a screen.
Finding: Communication is used to manage expectations and build reliability, rather than provide pinpoint accuracy.
Implication: Model the Seeker-Waiter dynamic in the experience to mirror how people already negotiate roles and timing in social wayfinding.
Finding: In the final stretch, people trust their eyes over devices, prioritizing visual recognition to close the gap.
Implication: Hand off from digital guidance to human vision at the right moment.
design.
Principles in Action
These principles translated social wayfinding behaviors into concrete design decisions for Social Sonar.
The Process
Rapid prototyping and testing were our primary methods of design and iteration over four months (May - August), spanning internal pilots, Wizard-of-Oz studies, RITE testing, and more.
Iteration after iteration: click into each sprint to see what we did, learned, and achieved.
My Focus
I mapped the end-to-end experience to make sure the audio fit into a greater interactive system spanning onboarding, activation, and initiation. I clarified system interactions via a flowchart before putting pen to paper.

A diagram of how a first-time user would set up Social Sonar
My Focus: Education
Onboarding and user education occur within the Meta AI app, the default control center for the RBMs. I applied the carousel pattern so users could preview Social Sonar's features, and kept the educational content short and focused.
Initially, I explored using maps as a complementary visual; however, this defeated the purpose of a landmark-agnostic navigation tool. Although Social Sonar is meant to be screen-free, concept testing showed that participants preferred a visual aid while the audio demos played, because spatial audio is a new experience for most people.
My Focus: Activation
I audited location-sharing tools like Apple's Find My to incorporate familiar conventions, such as granting consent and setting time limits.
Audit of Find My
Wireframing what it's like to start Social Sonar with a friend
User testing revealed that these chat bubbles needed to be differentiated as system messages because they looked too similar to users' own chat messages.
In the next iteration, I eliminated an extra screen by creating a contextual menu for the duration settings and turned the system messages into tiles that also showed a preview. I assumed permissions would be granted once during onboarding in the Meta AI app rather than repeated for every instance of Social Sonar.
I also designed the conversation UI, specifying how the Waiter's and Seeker's experiences merge into a single cohesive and shared wayfinding experience.

A diagram of how the Seeker's and Waiter's experiences affect each other
impact.
Objective
Conduct final evaluative research to learn how novice users perceive Social Sonar’s spatial audio friend-finding system
Methodology
I planned and moderated qualitative usability tests that probed participants' ability to localize (conceptualize direction and distance) and to find their friend.
Setup: Participants began less than 500m apart and walked through the screen-based onboarding
Task: Locate one another in a large, public space using the system, with minimal moderator guidance.
Findings
The moment when Seeker 3 finds Waiter 3
While every pair succeeded, Social Sonar's precision is inherently lower than that of visual navigation tools: users can't preview or plan a route in advance, and persistent, continuous information delivery isn't feasible over audio because of listening fatigue.
outcomes.
Core Feature: At a Glance
Bespoke sound assets, as played throughout a video MVP
Activation: The Waiter announces arrival and the system confirms that the Seeker will be notified
Phase 1: At ~250m, Social Sonar notifies both of the approximate time remaining. The Seeker begins hearing consistent audio beacons from the direction of the Waiter.
Phase 2: At 80m, Social Sonar notifies both that they're approaching each other, and the beacons speed up.
Phase 3: At 30m, Social Sonar notifies both to begin scanning their surroundings (a minimal mapping of these thresholds is sketched below).
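Here is a minimal sketch of that distance-to-phase mapping, using the thresholds above; the beacon-interval values are illustrative assumptions (the case study only states that beacons get faster as the pair nears).

```typescript
// Map the live distance between the pair to the experience phase.
type Phase = 'activation' | 'phase1' | 'phase2' | 'phase3';

function phaseFor(distanceM: number): Phase {
  if (distanceM <= 30) return 'phase3';  // start visually scanning
  if (distanceM <= 80) return 'phase2';  // approaching: faster beacons
  if (distanceM <= 250) return 'phase1'; // consistent directional beacons
  return 'activation';                   // Waiter has arrived; Seeker en route
}

/** Seconds between beacons; assumed to tighten as the pair closes in. */
function beaconInterval(distanceM: number): number {
  return distanceM <= 80 ? 1.5 : 3.0;
}
```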
Onboarding: At a Glance
Onboarding includes demo tracks to familiarize the user with spatial audio.
Hear it for yourself!
Activation: At a Glance