
Neuroscience-in-the-Wild: Memory & Sensor Fusion (NIH 10792324)

This project integrates smartphone-based experiential sensing, wearables, and intracranial neural recordings to study autobiographical memory formation during real-world behavior and to inform future memory-enhancing neuromodulation.


Project Overview

The project combines continuous multimodal sensing (audio/video, GPS, IMU, eye-tracking, physiology, and self-reports) with precisely synchronized intracranial recordings from participants with chronically implanted therapeutic devices.

A core objective is to capture and analyze autobiographical memory encoding in natural settings rather than only controlled laboratory tasks. The effort includes development of the CAPTURE mobile recording app and analysis pipelines for large multimodal streams.

By linking real-world behavioral context to neural activity, the project establishes translational foundations for future closed-loop interventions aimed at improving memory outcomes in neurological disorders.

An active Trustworthy Mixed Reality thread evaluates the robustness of the visual tracking and context-alignment pipelines that support reliable in-the-wild cognitive experimentation.

Key Capabilities

  • Synchronized collection of neural, wearable, and environmental data streams during unconstrained real-world behavior
  • Multimodal alignment methods for time-locked analysis across intracranial signals and sensed context
  • Sensor-fusion and model-building workflows for context-shift and event-boundary detection
  • End-to-end mobile tooling for in-the-wild experimental capture at scale
  • Translational analysis pipeline connecting neural markers to memory-relevant behavior
  • Robustness analysis for XR tracking pipelines used in real-world cognitive sensing workflows
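As an illustration of the time-locked alignment capability above, the sketch below resamples heterogeneous timestamped streams onto one shared clock by linear interpolation. This is a minimal hypothetical example (the function name, sampling rates, and signals are illustrative assumptions, not the project's actual pipeline):

```python
import numpy as np

def align_streams(streams, rate_hz, t_start, t_end):
    """Resample timestamped streams onto a common clock via linear
    interpolation so samples can be compared time-locked.

    streams: dict mapping name -> (timestamps, values), 1-D arrays in seconds.
    Returns (common_t, dict mapping name -> resampled values).
    """
    common_t = np.arange(t_start, t_end, 1.0 / rate_hz)
    aligned = {name: np.interp(common_t, ts, vals)
               for name, (ts, vals) in streams.items()}
    return common_t, aligned

# Hypothetical streams: a 1 kHz neural trace and 10 Hz GPS-derived speed,
# both resampled to a shared 100 Hz clock.
neural_t = np.arange(0, 2, 0.001)
neural = np.sin(2 * np.pi * 5 * neural_t)
gps_t = np.arange(0, 2, 0.1)
speed = np.linspace(0.0, 1.5, gps_t.size)

t, aligned = align_streams(
    {"neural": (neural_t, neural), "speed": (gps_t, speed)},
    rate_hz=100, t_start=0.0, t_end=2.0)
```

In practice, real deployments must also handle clock drift and dropped samples across devices; interpolation onto a common grid is only the simplest building block.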

Example Use Cases

  • Studying autobiographical memory encoding and retrieval in daily-life environments
  • Detecting context shifts from multimodal sensory and neural observations
  • Characterizing neural signatures that can support future adaptive stimulation policies
  • Evaluating robustness of real-world brain-and-sensor data fusion workflows
  • Assessing trustworthy mixed reality sensing reliability for longitudinal cognitive experiments
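A context shift can be flagged wherever consecutive multimodal feature vectors change sharply. The sketch below uses cosine distance between successive context vectors as a simple change-point signal; the function name, threshold, and synthetic data are illustrative assumptions, not the project's detector:

```python
import numpy as np

def detect_context_shifts(features, threshold=0.5):
    """Flag candidate event boundaries where consecutive multimodal
    feature vectors diverge (cosine distance > threshold).

    features: (T, D) array, one context feature vector per time step.
    Returns indices of time steps flagged as boundaries.
    """
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    cos_sim = np.sum(f[1:] * f[:-1], axis=1)   # similarity of step t-1 vs t
    return np.where(1.0 - cos_sim > threshold)[0] + 1

# Hypothetical stream that jumps from one context to another at step 50.
rng = np.random.default_rng(0)
ctx_a = rng.normal([1, 0, 0], 0.05, size=(50, 3))
ctx_b = rng.normal([0, 1, 0], 0.05, size=(50, 3))
stream = np.vstack([ctx_a, ctx_b])

boundaries = detect_context_shifts(stream, threshold=0.5)
```

More realistic pipelines would smooth features over windows and combine neural and environmental channels, but the boundary criterion has the same shape: a large step change in a fused context representation.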

Project Figures

Real-world field deployment for synchronized neural and multimodal sensing.
Participant setup illustrating mobile sensing and neural recording integration.
Example multimodal context representation used for context-shift analysis.
Conceptual loop from sensed context and neural state to adaptive intervention design.
Neurosymbolic feature extraction pipeline used for adaptive visual tracking in mixed reality workloads.
Overview of traditional and learning-based SLAM tracking components analyzed for trustworthy XR experimentation.


Project Details

Agency: NIH
Award Number: NIH 10792324
Duration: 2024–2029
Status: Active
Team: L. Garcia
Public Status
Public NIH abstract, poster materials, and related Trustworthy XR publications are available. Additional datasets, models, and intervention workflows are under active development and will be released as they mature.