AIR Print – Art-Based Pollution Visualizer

The Art-Based Pollution Visualizer is a mobile application that renders real-time air pollution data into immersive audiovisual experiences. Using camera-based augmented reality, the app overlays the visual field with flowing color gradients and spatialized sound, offering users a direct sensory encounter with invisible environmental conditions.

Each pollutant — PM2.5, PM10, NO₂, CO, SO₂, and O₃ — is represented by a dedicated sound layer and a dynamic color component. As environmental levels rise or fall, the intensity, movement, and tone of these layers shift accordingly. This creates a continuously evolving soundscape that reflects the moment-to-moment composition of urban air.
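As a rough illustration of how such a mapping could be wired with expo-av, the sketch below drives per-layer volume from pollutant readings and uses a slight playback-rate shift for tone. The normalization scale, the looping assets, and the linear rules are illustrative assumptions, not the app's actual parameters.

```ts
// Sketch: modulating per-pollutant sound layers with expo-av.
// The normalization scale, looping assets, and linear volume/rate
// rules are illustrative assumptions, not the app's actual values.
import { Audio } from 'expo-av';

type Pollutant = 'pm25' | 'pm10' | 'no2' | 'co' | 'so2' | 'o3';

// Rough upper bounds (µg/m³) used only to normalize readings into 0..1.
const SCALE: Record<Pollutant, number> = {
  pm25: 75, pm10: 150, no2: 200, co: 10000, so2: 350, o3: 180,
};

const layers = new Map<Pollutant, Audio.Sound>();

// Load one looping sampler per pollutant, starting silent.
async function loadLayer(p: Pollutant, asset: number): Promise<void> {
  const { sound } = await Audio.Sound.createAsync(asset, {
    isLooping: true,
    volume: 0,
  });
  layers.set(p, sound);
  await sound.playAsync();
}

// As fresh readings arrive, volume tracks intensity and a small
// playback-rate shift stands in for the "tone" dimension.
async function updateLayer(p: Pollutant, level: number): Promise<void> {
  const sound = layers.get(p);
  if (!sound) return;
  const intensity = Math.max(0, Math.min(level / SCALE[p], 1));
  await sound.setVolumeAsync(intensity);
  await sound.setRateAsync(0.9 + 0.2 * intensity, true);
}
```

A startup routine would call loadLayer once per pollutant (for example, loadLayer('pm25', require('./assets/pm25-loop.mp3')) with whatever sample assets the project ships), and updateLayer whenever new measurements arrive.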

Users can capture snapshots of the AR scene. Each snapshot includes embedded air quality measurements and GPS location, creating a visual and auditory record of a specific place and time. These captured moments are saved locally and can be viewed in both a gallery and a map-based interface. This archival layer allows users to reflect on environmental conditions not just as data, but as lived, affective impressions.

Pollutants move as semi-transparent colored circles drifting across the screen, mimicking the diffusion of gases. Each component is weighted by molecular mass, making heavier gases move more slowly. The result is an intuitive, emergent behavior that invites interpretation rather than instruction — letting the user feel the presence of pollution as motion, tone, and atmosphere.
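One plausible form of this weighting, loosely following Graham's law of effusion, scales drift speed by the inverse square root of molar mass. The sketch below assumes that rule; the base speed, the placeholder masses for particulate matter, and the wrap-around motion are illustrative choices rather than the app's confirmed behavior.

```ts
// Sketch: drift speed scaled by molecular mass so heavier gases move
// more slowly. The 1/sqrt(M) rule follows Graham's law of effusion;
// the constants below are illustrative assumptions.

// Molar masses in g/mol. Particulate matter (PM2.5/PM10) has no single
// molar mass, so placeholder weights are used to slow it further.
const MOLAR_MASS: Record<string, number> = {
  no2: 46.01, co: 28.01, so2: 64.07, o3: 48.0, pm25: 120, pm10: 240,
};

const BASE_SPEED = 40; // px per second for a hypothetical 1 g/mol gas

interface Particle { x: number; y: number; vx: number; vy: number; }

// Spawn a circle at a random position with a random drift direction.
function spawnParticle(pollutant: string, width: number, height: number): Particle {
  const speed = BASE_SPEED / Math.sqrt(MOLAR_MASS[pollutant]);
  const angle = Math.random() * 2 * Math.PI;
  return {
    x: Math.random() * width,
    y: Math.random() * height,
    vx: speed * Math.cos(angle),
    vy: speed * Math.sin(angle),
  };
}

// Advance all particles by dt seconds, wrapping at the screen edges.
function step(particles: Particle[], dt: number, width: number, height: number): void {
  for (const p of particles) {
    p.x = (p.x + p.vx * dt + width) % width;
    p.y = (p.y + p.vy * dt + height) % height;
  }
}
```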

[Figure: Pollution Shots as they are saved to the camera roll]

This tool is designed not only as an environmental communication platform, but as a research probe into how affective and embodied interactions can shift public perception of air quality. It aligns with UN Sustainable Development Goal 11.6, which calls for reducing the adverse environmental impact of cities, with special attention to air quality.

The Art-Based Pollution Visualizer does not issue warnings or recommendations. Instead, it offers a form of experiential knowledge that complements numerical data — helping users internalize the presence of pollutants in ways that are perceptual, intuitive, and potentially transformative.

Artistic and Epistemic Vision

The project is grounded in Methodical Transfer Epistemology (MTE), an approach where scientific data is translated into aesthetic form through rule-based mappings. In this context, sound and image are not metaphorical but structurally derived from environmental conditions. The work proposes that when pollution becomes perceptible as sound and space, it may enter public consciousness through channels of attention that bypass abstract cognition.

By listening to the air — literally — users may begin to ask new questions, not just about what pollution is, but about how it feels to live within it.

[Figure: Screenshots of the app interface: the Home, AR, and Gallery screens]
[Figure: An AR pollution shot, and the Map and Sound Print screens]

Technical Overview

  • Platform: React Native (Expo), using camera and audio APIs
  • Sensors/Data: External API for local air quality; device GPS for geolocation (see the data-fetch sketch after this list)
  • Sound Engine: Six polyphonic samplers, each mapped to a specific pollutant
  • Visual Layer: AR overlays with animated diffusion behavior
  • Snapshot System: Combined ViewShot and MediaLibrary export with metadata (see the snapshot sketch after this list)
  • Data Representation: Volume and delay of sounds modulated by pollutant levels
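A minimal sketch of the data path, using expo-location for the device position. The air quality endpoint and response shape below are hypothetical stand-ins, since the actual provider is not named in this overview.

```ts
// Sketch: reading the device position with expo-location and querying
// an air quality API. The endpoint and response shape are hypothetical
// stand-ins; substitute the provider the app actually uses.
import * as Location from 'expo-location';

interface Reading {
  pm25: number; pm10: number; no2: number; co: number; so2: number; o3: number;
}

async function fetchLocalAirQuality(): Promise<{
  reading: Reading;
  latitude: number;
  longitude: number;
}> {
  const { status } = await Location.requestForegroundPermissionsAsync();
  if (status !== 'granted') throw new Error('Location permission denied');

  const { coords } = await Location.getCurrentPositionAsync({});

  // Hypothetical endpoint; the real provider is not specified here.
  const url = `https://air-api.example.com/v1/latest?lat=${coords.latitude}&lon=${coords.longitude}`;
  const response = await fetch(url);
  if (!response.ok) throw new Error(`Air quality request failed: ${response.status}`);

  const reading = (await response.json()) as Reading;
  return { reading, latitude: coords.latitude, longitude: coords.longitude };
}
```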
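And a sketch of the snapshot export: captureRef from react-native-view-shot renders the AR view to an image, and expo-media-library saves it to the camera roll. How the app actually embeds the measurements is not specified; here they are assumed to be stored under the asset ID via AsyncStorage.

```ts
// Sketch: capturing the AR view and saving it to the camera roll.
// captureRef and createAssetAsync are real APIs; storing the metadata
// under the asset ID in AsyncStorage is an assumption for illustration.
import { captureRef } from 'react-native-view-shot';
import * as MediaLibrary from 'expo-media-library';
import AsyncStorage from '@react-native-async-storage/async-storage';
import type { RefObject } from 'react';
import type { View } from 'react-native';

interface SnapshotMeta {
  reading: Record<string, number>; // pollutant levels at capture time
  latitude: number;
  longitude: number;
  capturedAt: string;
}

async function savePollutionShot(
  arViewRef: RefObject<View>,
  meta: SnapshotMeta,
): Promise<string> {
  const { status } = await MediaLibrary.requestPermissionsAsync();
  if (status !== 'granted') throw new Error('Media library permission denied');

  // Render the AR overlay view to an image file.
  const uri = await captureRef(arViewRef, { format: 'jpg', quality: 0.9 });

  // Export to the camera roll and keep the measurements alongside it.
  const asset = await MediaLibrary.createAssetAsync(uri);
  await AsyncStorage.setItem(`pollution-shot:${asset.id}`, JSON.stringify(meta));

  return asset.id;
}
```

The gallery and map views can then look up each shot's measurements and coordinates by asset ID when rendering.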