Social Sonar: spatial audio for social wayfinding

6 months

Feb - Aug 2025

Multi-modal design

Rapid prototyping

Contextual inquiry

2 Product Designers

1 Design Engineer

1 Sound Designer

Meta Reality Labs Spatial Audio Team

My Role: Lead UX Researcher, Product Designer

Led and moderated generative and evaluative research

Architected system design of the end-to-end experience

Designed mobile UI for the first-time user experience & Messenger integration

imagine.

You're meeting up with a friend at the beach. They've arrived first and you need to find them. You have their location and a photo of their surroundings, but:
Aerial photo of blue-and-white patio umbrellas: everything looks identical.
A large group of people at a beach: too many people.
Someone carrying a huge bundle of beach floats: hands full.
Where the heck are they?
What if you could just hear where your friend is instead of playing Where’s Waldo in real life?

introducing social sonar.

What It Is

Social Sonar is a screen-free, hands-free experience on the Ray-Ban Meta Smart Glasses (RBMs). Two users find each other using real-time spatialized audio and location tracking.

Why It Works

Hearing is one of the fastest-processing human senses, and spatial audio enables faster, more natural connection and presence in a screen-saturated world.

How It Works

Social Sonar uses beacons and phase notifications. To get a first-hand experience of what this sounds like, make sure to wear your headphones or earbuds.

Beacon: spatialized directional pulses that communicate the relative distance between the two users

Phase notification: non-spatialized verbal cues that help both parties stay updated on the journey

🏖️🚶🏻‍♂️ System activates upon arrival at location

Your friend arrives first and tells Social Sonar that they're waiting for you.

You arrive shortly after. Social Sonar tells you it's activated but it'll start when you get closer to your friend.

🦻🏻🔊 Phase 1: Audio beacons begin

Both you and your friend hear Phase Notification 1 when you're ~ 3 minutes away from each other. Cues for direction and distance begin playing consistently.

📍🚶‍♀ Phase 2: Approaching your friend

The beacons accelerate, indicating that you're approaching your friend. Both of you hear Phase Notification 2.

👀 👋🏼 Phase 3: Within eyeshot

Social Sonar instructs both of you to look up and scan for your friend's face. A finale of beacons plays, and Social Sonar stops when the RBMs detect a hand wave via their built-in camera.
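The phase progression above can be sketched as a simple distance-driven state machine. This is an illustrative sketch, not the production logic: the threshold values are assumptions drawn from the design decisions described later (beacons begin around 250m; verbal notifications fire at 80m and 30m), and the function name is hypothetical.

```typescript
// Hypothetical sketch of Social Sonar's phase progression.
// Thresholds are assumptions taken from the design section:
// beacons begin ~250m apart; phase notifications play at 80m and 30m.
type Phase = "inactive" | "beacons" | "approaching" | "eyeshot" | "done";

const BEACON_START_M = 250; // roughly a few minutes of walking apart
const APPROACHING_M = 80;   // beacons accelerate
const EYESHOT_M = 30;       // look up and scan for your friend

function nextPhase(current: Phase, distanceM: number, waveDetected: boolean): Phase {
  if (current === "done") return "done";
  if (waveDetected) return "done"; // the glasses' camera saw a hand wave
  if (distanceM <= EYESHOT_M) return "eyeshot";
  if (distanceM <= APPROACHING_M) return "approaching";
  if (distanceM <= BEACON_START_M) return "beacons";
  return "inactive"; // both have arrived, but still too far for audio cues
}
```

Each time the system receives a fresh distance estimate, it re-evaluates the phase; phase changes are what trigger the one-shot verbal notifications, while the current phase drives the beacon behavior.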

Impact

In a final evaluative test, all 4 participating pairs of friends—novices to spatial audio and Social Sonar—found each other successfully.

The moment when 2 friends found each other using only Social Sonar

“That was really fun…very therapeutic at the same time.”

- P4, tasked with waiting to be found by friend

"Something that Google & Apple Maps doesn't have is an emotional aspect where the closer you are to someone, the more excited you get."

- P5, tasked with looking for friend

Key Deliverables

We did a live demo at the Meta office and handed off a functional prototype to the Reality Labs Research Audio team.

Proof-of-concept: a React-based web application prototype that uses heading* and GPS data to continuously measure and update navigation for 2 users simultaneously

*Heading data: direction that a user is facing
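Under the hood, continuously updating navigation for two users reduces to computing the distance and bearing between their GPS fixes, then offsetting that bearing by the listener's heading to get the direction the beacon should come from. A minimal sketch of that math (the helper names are illustrative, not taken from the actual prototype):

```typescript
// Haversine distance (meters) and initial bearing (degrees) between two GPS fixes.
const toRad = (d: number) => (d * Math.PI) / 180;
const toDeg = (r: number) => (r * 180) / Math.PI;

function distanceM(lat1: number, lon1: number, lat2: number, lon2: number): number {
  const R = 6371000; // mean Earth radius in meters
  const dLat = toRad(lat2 - lat1);
  const dLon = toRad(lon2 - lon1);
  const a =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.sin(dLon / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(a));
}

function bearingDeg(lat1: number, lon1: number, lat2: number, lon2: number): number {
  const dLon = toRad(lon2 - lon1);
  const y = Math.sin(dLon) * Math.cos(toRad(lat2));
  const x =
    Math.cos(toRad(lat1)) * Math.sin(toRad(lat2)) -
    Math.sin(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.cos(dLon);
  return (toDeg(Math.atan2(y, x)) + 360) % 360;
}

// Direction of the friend relative to where the listener is facing:
// 0 deg = straight ahead, 90 deg = to the right.
function relativeAzimuthDeg(bearingToFriend: number, headingDeg: number): number {
  return (((bearingToFriend - headingDeg) % 360) + 360) % 360;
}
```

The relative azimuth is what a spatial audio renderer needs to place the beacon in the correct direction as the listener turns their head.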

How the system works: technical architecture

First-time UX, including onboarding and education, and Messenger integration

Onboarding to Social Sonar via the Meta AI app

Activating Social Sonar via Facebook Messenger

process at a glance.

Timeline

By the Numbers

30

participants in generative research

4

design sprints


12

RITE (Rapid Iterative Testing and Evaluation) tests

8

novice participants in evaluative research


defining the problem space.

Scoping

Spatial audio’s superpowers enable a variety of use cases, including social wayfinding.

How we finally landed on the problem space

Why Spatial Audio?

Meta Reality Labs was exploring use cases where auditory display outperforms graphical and voice user interfaces. Our stakeholders gave us 1 constraint: to create a product for the Meta Quest or Ray-Ban Meta Smart Glasses (RBMs).

Why Social Wayfinding?

Spatial audio reduces the hassle and chaos of finding and meeting up with friends.

Spatial audio affords:

Screen-free, hands-free navigation

Precise in-situ orientation

🤝

Social wayfinding needs:

Real-time coordination

Precise navigation to the person

generative research.

Summary

To understand the status quo of social wayfinding, I led research on friend-finding behaviors, from selecting the methodology to moderating. I observed how people use current tools and the workarounds they rely on when those tools fall short.

Contextual Inquiry

n = 10 (5 pairs)

Protocol: shadow participants on their way to meet up with a friend

Guerrilla Interviewing

n = 20

Protocol: observe people waiting or seeking during the last leg

To bridge research and design, I derived action-oriented design principles.

Research Finding

Design Principles

Social wayfinding relies on mutual awareness, reassurance, and accountability.

Provide reassurance at the right times, mirroring how people naturally check in with one another.

When cognitive load is low, people anticipate and plan ahead, sketching the rest of the journey in their mind.

Unobtrusively support that anticipation by communicating distance and direction without pulling users back to a screen.

Communication is used to manage expectations and build reliability, rather than provide pinpoint accuracy.

Model the Seeker-Waiter dynamic in the experience to mirror how people already negotiate roles and timing in social wayfinding.

In the final stretch, people trust their eyes over devices, prioritizing visual recognition to close the gap.

Hand off from digital guidance to human vision at the right moment.

design.

Summary

These principles translated social wayfinding behaviors into 3 major design decisions for Social Sonar.

Design For Presence

🔊

Audio beacons that begin automatically at 250m

🛠️

Bespoke sound assets that update in real-time based on head-tracking, distance, and direction of the other party

Rationale


Reassure the User

💬

Verbal phase notifications that play automatically at 80m and 30m



Rationale


Embed Social Norms

📲

A complementary mobile prototype to consent to and activate Social Sonar

💬

Verbal confirmation of arrival to initiate an instance of Social Sonar

Rationale


Approach

Design via prototyping included internal pilots, Wizard-of-Oz testing, RITE testing, and more.

I designed the interactivity of this multi-modal experience, specifying how the Waiter's and Seeker's experiences merge into a single cohesive and shared wayfinding experience.

A diagram of how 2 users and their systems work in tandem to successfully facilitate a social meetup

The audio design evolved significantly across the 3 sprints, but here's a sample of what we had at the beginning vs. the end.

Early samples

Distance

Direction

Final samples

Distance

Direction

Key Changes

Spatialization always in direction of friend’s location

Changes in pitch help localization

Swelling pad that repeats cadence

Three mallet ‘pings’ to communicate distance
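The accelerating, pitch-shifting beacon behavior can be expressed as a simple mapping from distance to inter-beacon interval and pitch. The curve and values below are illustrative assumptions for the sketch, not the team's actual sound tuning:

```typescript
// Illustrative mapping from distance to beacon cadence and pitch.
// Closer friend => faster pings and a higher tone. The specific range
// endpoints are assumptions, not the tuning used in the real prototype.
const MAX_DISTANCE_M = 250; // beacons begin around this distance
const MIN_DISTANCE_M = 10;

function clamp01(x: number): number {
  return Math.min(1, Math.max(0, x));
}

// Fraction of the journey remaining: 1 at the start, 0 when nearly there.
function remaining(distanceM: number): number {
  return clamp01((distanceM - MIN_DISTANCE_M) / (MAX_DISTANCE_M - MIN_DISTANCE_M));
}

function beaconIntervalMs(distanceM: number): number {
  // Linearly interpolate between 2000 ms (far) and 400 ms (near).
  return 400 + remaining(distanceM) * (2000 - 400);
}

function beaconPitchHz(distanceM: number): number {
  // Rise one octave, from 440 Hz (far) to 880 Hz (near).
  return 880 - remaining(distanceM) * (880 - 440);
}
```

A scheduler would then play the next ping `beaconIntervalMs(d)` milliseconds after the last one, at `beaconPitchHz(d)`, spatialized toward the friend's relative direction.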

RITE Objective

Test direction and distance beacons

What We Did

Wizard-of-Oz method: Participants were Seekers, using beacons to find the Waiter (team member)

Before

  • Activate Social Sonar with 700m to go between 2 users

  • Only play audio beacons

After

  • Activate Social Sonar with 300m (~3 minutes) to go because visual maps are more conducive to navigating to a general vicinity

  • Play verbal cues for additional information and reassurance in terms of distance or time remaining to the friend

RITE Objective

Test first-time user experience, activating Social Sonar, and the Waiter's experience

What We Did

Concept testing: Participants walked through paper prototypes for the screen-based flows and audio demos that taught the updated beacons


Wizard-of-Oz: Participants were Waiters, using phase notifications and beacons to anticipate when the Seeker (team member) would arrive

Before

  • Prioritized the Seeker's experience with verbal cues designed to update users on the progress of their journey

  • Social Sonar begins manually

  • Social Sonar ends when the distance between 2 users is no longer GPS-accurate (i.e., < 30m)

After

  • Phase Notifications (verbal cues) notify both the Seeker and Waiter at distance-based thresholds

  • Social Sonar is activated manually but the beacons only start when the Waiter has confirmed arrival

  • Social Sonar ends when the RBM glasses detect a hand wave from either user

RITE Objective

Test the web prototype to evaluate success of spatial audio with live head tracking

What We Did

Usability testing to see how well participants could interpret audio and independently find each other in hero and edge cases

Before

  • Beacons didn't update in real-time

  • Paper prototypes and audio demos were separated

After

  • Beacons updated in real-time thanks to head tracking by the web prototype

  • High-fi prototypes, combining audio demos with first-time UX flows on a mobile device

Area of Focus: End-to-End Experience Design

I mapped the end-to-end experience to make sure the audio experience fit into a greater interactive system including first-time user experience and activation. I clarified touch points in a system diagram before working with pixels.

First-time user experience of Social Sonar

Activating Social Sonar between 2 users

First-Time User Experience: Education

The challenge in designing education was providing a sufficient crash course on spatial audio within 1-2 minutes while inverting the conventional mental model of map-based navigation. I collaborated closely with the Sound Designer to determine how the screens support audio demos.

Before Concept Testing

Used maps as a complementary visual to match existing mental models.




Map-based animation

After Concept Testing

Despite Social Sonar's intention to reduce screen usage, users preferred having a visual aid for audio demos because localization ability varies across individuals, especially for spatial audio novices during onboarding.

Removed the maps to indicate that Social Sonar directs users as the crow flies, agnostic to what's visible on a map

Activation: Messenger Integration

Activating Social Sonar occurs within Facebook Messenger—Meta's social messaging app.

I spearheaded the discussion around whether Social Sonar would be a standalone mobile application or a built-in feature of an existing Meta application. I based my decision on generative research, which showed that texting and calling were the baseline for coordinating social meet-ups. As a result, I chose to introduce Social Sonar within Meta's social apps, such as Facebook Messenger.

Before Concept Testing

Based on an audit of other location-sharing products (e.g., Apple's Find My), I designed a flow that asked for permissions and the duration of Social Sonar at the beginning of every instance.

Wireframing what it's like to start Social Sonar with a friend

After Concept Testing

I distinguished the chat bubbles from system messages more clearly. I also eliminated an additional screen by creating a contextual menu for the duration settings and changed the system messages into tiles that showed a preview. I assumed that permissions would be a one-time task during onboarding within the Meta AI app.

The next iteration

evaluative research.

Summary

To test how novices experience Social Sonar, I conducted evaluative research with 4 pairs of participants unfamiliar with spatial audio and Social Sonar.

Methodology

I planned and moderated qualitative usability tests that probed participants' ability to localize (conceptualize direction and distance) and find their friend.

n = 8 (4 pairs: 1 Seeker + 1 Waiter)

60-90 minute tests

Setup: Participants began < 500m away from each other and walked through screen-based onboarding

Task: Locate one another in a large, public space using the system, with minimal moderator guidance.

Findings

All 4 pairs were able to find each other successfully. Even as a lean prototype, Social Sonar outperformed existing tools by enhancing delight and engagement, reducing reliance on screens, and taking on the social burden of constant communication.


“Something that Google & Apple Maps doesn't have is an emotional aspect where the closer you are to someone, the more excited you get. When I heard the higher tone, I would get like a little bit more excited. And then as it got faster and faster, it got more exciting because I knew I was approaching [my friend].”

- Seeker 3

“I got excited when the notification told me to start looking around.”

- Seeker 4

"I wanted to experiment to see if I could replicate that sound again. When I was able to, based off the hypothesis that if I go the wrong direction, it would sound again and it did. I was like, ‘A-ha, that was great!’”

- Seeker 3

“Oh, I think I heard something change…I think I’m going the wrong way.”

After S4 started going in the opposite direction, they were still able to backtrack using the beacon's click, despite not being concretely aware of the pitch changes.

- Seeker 4

“It removes the onus and stress on your friends that you're trying to meet up with because people might move position between the time of being asked and finding them."

- Seeker 1

Waiter 4 felt “Very socially connected"…“there’s a lot of chaos in texting but this was moderated [by the system]....there’s instruction coming that will alert me when it needs my attention.”

- Waiter 4

The moment when Seeker 3 finds Waiter 3

However, Social Sonar's precision is inherently lower than that of visual navigation tools, so users can't preview or plan their route in advance. And continuous information delivery isn't feasible via audio due to listening fatigue.

“I feel like that pinpoint visual cue [through maps] is so much more accurate than just trying to survey and see where that person is.”

- Seeker 3

“I don’t like to be perceived as lost because I feel like I look vulnerable" when glued to a screen.

- Seeker 4

“It's probably not going to be best if they're [beacons] constantly playing, but definitely there was too much of a gap…I didn’t hear enough of them to feel confident.”

- Waiter 3

“When I wait normally, I check their location, adjust my position, and even call to redirect...this system removes that agency.”

- Waiter 1

outcomes.

Social Sonar

Bespoke sound assets, as played throughout a video MVP

First-Time UX

Introducing Social Sonar via the Meta AI app.

Onboarding includes educational demo tracks and visual aids to familiarize the user with spatial audio.

Hear it for yourself!

Activation

Users can access Social Sonar via Facebook Messenger.