Overview

BOID BUGS VR is an interactive ecosystem simulation built for Meta Quest 3/3S and inspired by the concept of the insect apocalypse. It is an immersive VR demonstration of the delicate balance between natural growth and ecological collapse, as shaped by insect populations and human interaction. Players can now enter the garden in first person, surrounded by either the damage of pesticides or a bounty of flowers. BOID BUGS VR features a distinctive 2.5D style, carrying appealing 2D assets into a 3D environment by way of a 'billboarding' effect.

Introduction

BOID BUGS VR is an improved 2.5D interactive landscape that uses the Boids algorithm to simulate the movement and behavior of insect populations in VR. The project visualizes an ecological system in which the player's interactions influence the health of the environment surrounding them on all sides.

The concept is rooted in the ongoing study of the “Insect Apocalypse,” the dramatic worldwide decline in insect populations due to pesticide use and environmental change. That decline has degraded ecosystems, underscoring the essential services insects provide. BOID BUGS VR uses a 3D boids model as a symbolic and visual medium for this balance.

When the user points their right Quest Touch controller at the soil beneath them and presses the trigger, they plant flowers that attract new insects. When they point the controller above the horizon line, a trigger press instead sprays pesticide to eliminate insects. Over time, these actions visibly shape the 2.5D ecosystem, changing not only the number of insects but also the health of the environment. As the insect population declines, the sky grows grim and polluted, the leaves of the trees begin to dwindle, and the plants in the garden start to die. If the user plants more flowers, the insect population increases, the sky clears, the tree becomes healthy, and a lush garden fills their view.
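As a rough illustration of how this pointing interaction can be wired up, the sketch below raycasts from the right controller on a trigger press and either plants a flower or sprays pesticide depending on where the ray points. The class, field, and helper names (GardenInteractor, PlantFlower, SprayPesticide, soilMask) are placeholders for this write-up, not the project's actual code.

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Sketch only: plant or spray depending on where the right controller points.
public class GardenInteractor : MonoBehaviour
{
    public Transform rightController;         // right Touch controller transform
    public InputActionProperty triggerAction; // bound to the right-hand trigger
    public LayerMask soilMask;                // layer used by the soil colliders

    void Update()
    {
        // Only act on the frame the trigger is pressed.
        if (!triggerAction.action.WasPressedThisFrame()) return;

        var ray = new Ray(rightController.position, rightController.forward);

        // Pointing down at the soil plants a flower at the hit point;
        // pointing above the horizon releases a pesticide spray instead.
        if (Physics.Raycast(ray, out RaycastHit hit, 20f, soilMask))
            PlantFlower(hit.point);
        else if (Vector3.Dot(rightController.forward, Vector3.up) > 0f)
            SprayPesticide(ray);
    }

    void PlantFlower(Vector3 position) { /* spawn a flower prefab that attracts boids */ }
    void SprayPesticide(Ray ray)       { /* spawn spray particles, cull nearby boids */ }
}
```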

This system aims to evoke reflection on human impact and the interdependence of the environment: every decision has visible consequences. Placing the player directly in this perspective makes their actions all the more personal, as they choose between a view clouded with pesticide spray and a beautiful, flower-filled landscape.

Methods

The simulation is built in C# within Unity's OpenXR system: each insect acts as an instance of a Boid prefab controlled by the local rules of alignment, cohesion, and separation. The environment reacts dynamically to player interactions through several key functions.
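For reference, the three local rules can be summarized roughly as in the sketch below, assuming each insect carries a Boid component and a neighbor list gathered within some perception radius; the class layout and weights shown here are illustrative rather than the project's exact values.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Minimal sketch of the three local flocking rules for one boid.
public class Boid : MonoBehaviour
{
    public float alignmentWeight = 1f, cohesionWeight = 1f, separationWeight = 1.5f;
    public Vector3 velocity;

    public Vector3 Flock(List<Boid> neighbors)
    {
        if (neighbors.Count == 0) return Vector3.zero;

        Vector3 alignment  = Vector3.zero; // match neighbors' average heading
        Vector3 cohesion   = Vector3.zero; // steer toward the local center of mass
        Vector3 separation = Vector3.zero; // push away from crowding neighbors

        foreach (Boid other in neighbors)
        {
            alignment  += other.velocity;
            cohesion   += other.transform.position;
            separation += transform.position - other.transform.position;
        }

        alignment  = (alignment / neighbors.Count).normalized;
        cohesion   = ((cohesion / neighbors.Count) - transform.position).normalized;
        separation = (separation / neighbors.Count).normalized;

        return alignment * alignmentWeight
             + cohesion * cohesionWeight
             + separation * separationWeight;
    }
}
```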

The Boid system used in BOID BUGS VR extends the flocking model into a three-dimensional, cylindrical environment, with additional constraints that shape how insects move, cluster, and respond to the ecosystem. Each parameter contributes to flocking within the cylinder-shaped space.
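The exact constraint code isn't spelled out here, but a containment rule of the kind described, keeping boids inside a cylinder centered on the player, could look like the following sketch; the radius, height limits, and strength value are placeholders.

```csharp
using UnityEngine;

// Hypothetical containment rule: steer a boid back inside a cylinder
// of a given radius and vertical range centered on the player.
public static class CylinderBounds
{
    public static Vector3 Contain(Vector3 position, Vector3 center,
                                  float radius, float minY, float maxY, float strength = 2f)
    {
        Vector3 steer = Vector3.zero;

        // Radial constraint: push back toward the cylinder axis when too far out.
        Vector3 flat = position - center;
        flat.y = 0f;
        if (flat.magnitude > radius)
            steer += -flat.normalized * strength;

        // Vertical constraint: keep boids between the ground and the canopy.
        if (position.y > maxY)      steer += Vector3.down * strength;
        else if (position.y < minY) steer += Vector3.up * strength;

        return steer;
    }
}
```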

Beyond movement, several environmental elements provide feedback in response to player interaction.
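As one illustration of this kind of feedback, the sketch below assumes a single health value derived from the current insect population and uses it to blend the sky color and thin out the tree's leaf sprites; the fields, colors, and thresholds are hypothetical rather than the project's actual values.

```csharp
using UnityEngine;

// Illustrative sketch: drive sky color and foliage from the insect population.
public class EcosystemFeedback : MonoBehaviour
{
    public Color healthySky  = new Color(0.5f, 0.8f, 1f);
    public Color pollutedSky = new Color(0.45f, 0.4f, 0.35f);
    public SpriteRenderer[] leaves;   // tree leaves that thin out as health drops
    public int maxPopulation = 100;

    public void UpdateEnvironment(int insectCount)
    {
        float health = Mathf.Clamp01(insectCount / (float)maxPopulation);

        // Blend the sky between polluted and clear as the population changes.
        Camera.main.backgroundColor = Color.Lerp(pollutedSky, healthySky, health);

        // Enable only a fraction of the leaf sprites proportional to health.
        int visibleLeaves = Mathf.RoundToInt(leaves.Length * health);
        for (int i = 0; i < leaves.Length; i++)
            leaves[i].enabled = i < visibleLeaves;
    }
}
```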

The largest constraint when moving the project from 2D to VR was its visual assets: all sprites were originally 2D. To work around this limitation, a simulated 3D environment and billboarding system were introduced.
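A minimal billboarding behavior, in the spirit of what the project describes rather than its exact implementation, rotates each sprite every frame so it faces the headset camera while staying upright.

```csharp
using UnityEngine;

// Minimal billboarding sketch: keep a 2D sprite facing the headset camera.
public class Billboard : MonoBehaviour
{
    void LateUpdate()
    {
        Transform cam = Camera.main.transform;

        // Face the camera but stay upright so sprites don't tilt with head pitch.
        Vector3 awayFromCamera = transform.position - cam.position;
        awayFromCamera.y = 0f;
        if (awayFromCamera.sqrMagnitude > 0.0001f)
            transform.rotation = Quaternion.LookRotation(awayFromCamera, Vector3.up);
    }
}
```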

Takeaways

BOID BUGS VR required extensive knowledge of XR development and of working with the Quest 3. Both were relatively new to me, making this a valuable opportunity to expand my technical skill set.

One of the more unusual challenges involved working with 2D assets inside XR. I wanted to preserve the original BOID BUGS aesthetic while adding spatial depth and immersion. Through research, I discovered the billboarding technique, which enabled this intersection between 2D sprites and 3D XR environments. The result is a distinctive visual style that I plan to continue using in future projects.

I also developed a stronger understanding of Unity’s OpenXR system. While scene development remains similar to standard 3D workflows, handling input binding and interaction logic introduces an entirely new layer of complexity. After initial difficulties achieving consistent interactions, I explored documentation and learned how to properly bind controller inputs across XR devices.
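To give a sense of the kind of binding involved, the snippet below uses Unity's Input System to bind an action to the right-hand trigger through the generic XR controller layout; the action name and the logging are placeholders, not the project's actual interaction code.

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Sketch of binding a button action to the right-hand trigger for XR controllers.
public class TriggerListener : MonoBehaviour
{
    private InputAction trigger;

    void OnEnable()
    {
        trigger = new InputAction("PlantOrSpray", InputActionType.Button,
                                  "<XRController>{RightHand}/trigger");
        trigger.Enable();
    }

    void OnDisable() => trigger.Disable();

    void Update()
    {
        if (trigger.WasPressedThisFrame())
            Debug.Log("Right trigger pressed");
    }
}
```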

Overall, this project marked an important step forward in my Unity development practice—branching into XR workflows and broadening the possibilities for future immersive projects.