Dinithi Dissanayake

HCI | Applied AI | Adaptive Wearables

I am a PhD candidate at the National University of Singapore, working with Prof. Suranga Nanayakkara in the Augmented Human Lab.

My research sits at the intersection of Human–Computer Interaction and Applied AI, where I explore how intelligent systems can understand people through multimodal cues, interpret user context, and adapt their responses in real time. Ultimately, I aim to create wearable systems that feel less like tools and more like supportive partners.

More broadly, I enjoy building data-driven systems that improve decision-making and everyday user experiences, spanning ML modeling, prototyping, and deployment.

News

  • Feb 2025 – Our paper “VRSense” was accepted to CHI LBW 2025. See you in Japan!
  • Sep 2024 – Started an internship via AHLab × Meta Reality Labs collaboration.
  • Sep 2024 – Passed my Qualifying Examination. Now officially a PhD Candidate.
  • Aug 2023 – Began my PhD at the National University of Singapore (NUS).
  • Jan 2023 – Started Data Analytics consulting at LIRNEasia.
  • Oct 2022 – “3DLatNav” presented as a workshop paper at ECCV 2022.
  • Aug 2022 – Graduated with First Class Honors (Electronic & Telecommunication Engineering), University of Moratuwa.
  • Jun 2022 – “CrossPoint” presented as a full paper at CVPR 2022.
  • May 2022 – Joined Axiata Digital Labs as a Data Engineer.

Research Projects

User-Aware Adaptive Assistive Wearables

Illustration of a taxonomy for adaptive assistive wearables

We conducted a systematic literature review of 63 papers examining how adaptive wearables sense user states, trigger context-aware interventions, and support real-time cognitive or behavioral feedback. We introduce a taxonomy of sensing modalities, adaptation triggers, and intervention strategies, and highlight key design challenges and opportunities. This work provides a foundation for developing next-generation wearables that meaningfully adapt to users’ needs. (Under review.)

Sensory Spotlight: Anticipating User Attention

Example scenario showing audio-visual cues used for predicting attention shifts

Sensory Spotlight explores how AI can anticipate shifts in human attention by combining audio and visual signals, much as people orient to salient events in their environment. The model predicts attention shifts and produces per-modality saliency scores indicating which modality “got the spotlight,” supporting decisions about where and how feedback should appear (e.g., smart-glasses display vs. audio).

VRSense: An Explainable System to Help Mitigate Cybersickness in VR Games

VRSense overview image

VRSense is an explainable system to help VR game developers assess cybersickness. Instead of a black box, the system uses interpretable features to provide actionable insights into game design and user interactions. Designed to be plug-and-play, VRSense helps developers understand how effectively their game mitigates motion sickness. Read the paper.

3D Object Transformation and Regeneration for Privacy in Mixed Reality

Mixed reality privacy project image

We developed a 3D–2D correspondence technique for point clouds and a 3D vision algorithm that lets users add, delete, or modify parts of a 3D object, enabling regeneration on the receiving end. We evaluated the approach against simulated privacy attacks and implemented it on-device to demonstrate real-world feasibility. This work resulted in two papers: CrossPoint (CVPR 2022) and 3DLatNav (ECCV 2022 workshop).

Work Experience

Research Intern — Meta Reality Labs × Augmented Human Lab

Aug 2024 – Feb 2025
  • Led the design and execution of a large-scale VR gameplay user study (N=150), investigating motion sickness during active VR interactions.
  • Collected and synchronized multimodal, high-frequency data, including eye tracking, head motion, and physiological signals, in real-world XR settings.
  • Developed machine learning models to predict real-time discomfort and cybersickness onset during gameplay.
  • Built robust data pipelines for aligning heterogeneous sensor streams at scale.
  • Translated model outputs into actionable insights for evaluating games on VR gaming platforms.

Data Engineer — Axiata Digital Labs

May 2022 – Aug 2023
  • Engineered scalable ML/DL deployment pipelines for the company’s AI Factory platform, supporting model development, testing, and production integration.
  • Designed and maintained data ingestion and model integration workflows across multiple telecom business domains.
  • Developed and deployed a customer churn prediction model with end-to-end pipelines, enabling data-driven decision making for enterprise clients.
  • Worked across AWS and GCP environments to support reliable, production-grade ML systems.
  • Collaborated with product and engineering teams to bridge research prototypes and real-world business applications.

Data Analytics Consultant — LIRNEasia

Jan 2023 – Aug 2023
  • Designed and implemented ML pipelines for classifying built-up regions from satellite imagery to support a national-scale urbanization index.
  • Integrated geospatial workflows using QGIS and Google Earth Engine for automated spatial analysis and visualization.
  • Built longitudinal datasets to study household energy consumption patterns across regions and time.
  • Collaborated with policy researchers to translate data-driven findings into evidence-based insights for urban planning.