This AR project explores how hand gestures can enhance user interaction with everyday tools. Users pose their hands in specific gestures to activate handy tools.
Initially, I wanted to explore gestures as a fun experiment. However, I realized that if AR/VR technology one day becomes ubiquitous, quick, intuitive gestures linked to everyday tools could dramatically increase user convenience. That idea shaped the project's development: users access three convenient tools by making hand gestures that represent the numbers 1, 2, and 3, and each gesture activates the corresponding tool to make daily life more efficient and enjoyable.
User flow
The flow of this project is presented below.
Hand gesture design
I brainstormed a range of everyday tools that could be driven by different gestures. The key is to make sure users can easily connect specific gestures to the tools they want. Initially, I considered gestures such as a binocular gesture to activate a zoom-in function and a frame gesture to summon the camera. However, these options didn't align with my original intention of providing productivity tools, and some gestures were difficult to detect reliably. Eventually, I opted for simplicity: quick links similar to those on desktop and mobile interfaces, activated by gestures indicating one, two, and three. This design ensures that users can access the tools they need intuitively and efficiently.
UI
The project's user interface is kept simple and intuitive, reflecting the fact that it is an everyday, convenient toolset. This simplicity ensures that each tool can be used quickly and easily without distraction.
Demo Video
How I built it
Unity
The project was built using Unity’s UI Canvas feature to design the interface, along with C# scripts to enable core functionalities like the timer, to-do list, and scene transitions.
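For illustration, a tool like the timer can be a small MonoBehaviour that drives a Text element on the Canvas. The sketch below is a minimal version under that assumption; the class and field names are made up for the example and are not the project's actual code.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Minimal countdown-timer sketch driving a UI Canvas Text element.
// Class and field names are illustrative, not the project's actual code.
public class HandyTimer : MonoBehaviour
{
    [SerializeField] Text timerText;        // assigned from the Canvas hierarchy
    [SerializeField] float duration = 60f;  // countdown length in seconds

    float remaining;
    bool running;

    public void StartTimer()
    {
        remaining = duration;
        running = true;
    }

    void Update()
    {
        if (!running) return;

        remaining -= Time.deltaTime;
        if (remaining <= 0f)
        {
            remaining = 0f;
            running = false;
        }

        // Format as m:ss for display on the panel.
        int minutes = Mathf.FloorToInt(remaining / 60f);
        int seconds = Mathf.FloorToInt(remaining % 60f);
        timerText.text = $"{minutes}:{seconds:00}";
    }
}
```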
Meta Interaction SDK
After exploring the Meta Interaction SDK's sample scenes, I found many well-built components and prebuilt interactions I could take advantage of. I used its movable panel for the main UI panel and the album component from the sample scene. The SDK's micro-interactions add a polished feel, making the experience more cohesive for users. I also integrated Meta's building blocks to implement a passthrough feature. Overall, working with this SDK was exciting and underscored how rapidly the AR/VR field is advancing.
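As a rough sketch of what the passthrough piece involves: the Meta XR SDK exposes passthrough through an OVRPassthroughLayer component on the camera rig, which the building block sets up for you. Toggling it from a script might look like the snippet below; this assumes the layer already exists on the rig, and the class and method names here are illustrative.

```csharp
using UnityEngine;

// Minimal sketch: toggles the Meta XR passthrough layer added by the
// passthrough building block. Assumes an OVRPassthroughLayer component
// already exists on the camera rig; names here are illustrative.
public class PassthroughToggle : MonoBehaviour
{
    [SerializeField] OVRPassthroughLayer passthroughLayer;

    public void SetPassthrough(bool on)
    {
        // Enabling or disabling the component switches the passthrough feed.
        passthroughLayer.enabled = on;
    }
}
```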
Unity XR Hands
For hand tracking, the most important part of this project, I chose Unity XR Hands over the Meta Interaction SDK's hand-pose tooling. Unity XR Hands offers more flexibility and a streamlined setup for gesture detection. I created three gestures, representing the one, two, and three poses, that are easily recognizable and reliable for simple commands. Although more complex gestures can be challenging to detect, Unity XR Hands is still highly effective and well suited to many scenarios, and I'm eager to see future improvements such as support for dual-hand gestures.
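To give a sense of how this works, here is a simplified sketch that counts extended fingers on the right hand using the Unity XR Hands joint API. The distance heuristic, threshold value, and class name are assumptions for illustration; the package also supports authoring hand-shape assets for more robust detection.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

// Simplified sketch: maps 1/2/3 extended fingers on the right hand to tools.
// The threshold heuristic and class name are illustrative; Unity XR Hands
// also offers hand-shape assets for more robust gesture detection.
public class NumberGestureDetector : MonoBehaviour
{
    XRHandSubsystem handSubsystem;

    static readonly XRHandJointID[] fingerTips =
    {
        XRHandJointID.IndexTip, XRHandJointID.MiddleTip, XRHandJointID.RingTip
    };

    void Update()
    {
        if (handSubsystem == null)
        {
            // Grab the running hand subsystem once it is available.
            var subsystems = new List<XRHandSubsystem>();
            SubsystemManager.GetSubsystems(subsystems);
            if (subsystems.Count > 0) handSubsystem = subsystems[0];
            return;
        }

        XRHand hand = handSubsystem.rightHand;
        if (!hand.isTracked) return;

        int count = 0;
        foreach (var tip in fingerTips)
            if (IsFingerExtended(hand, tip)) count++;

        if (count >= 1 && count <= 3)
            Debug.Log($"Gesture {count} detected: activate tool {count}");
    }

    static bool IsFingerExtended(XRHand hand, XRHandJointID tipId)
    {
        // Heuristic: a finger counts as extended when its tip is far
        // enough from the wrist; 12 cm is an illustrative threshold.
        if (!hand.GetJoint(tipId).TryGetPose(out Pose tipPose)) return false;
        if (!hand.GetJoint(XRHandJointID.Wrist).TryGetPose(out Pose wristPose)) return false;
        return Vector3.Distance(tipPose.position, wristPose.position) > 0.12f;
    }
}
```

In practice, a short debounce (requiring the same count for a few consecutive frames) helps avoid false activations while the hand is transitioning between poses.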
Challenges
Hands Detection
It was a challenge to consistently detect certain hand shapes, so I chose simpler, more reliable gestures.
Gestures
Choosing gestures that naturally link to tools was difficult at first, but using 1, 2, and 3 as direct quick links helped overcome this.
Meta SDK updates
The Meta SDK evolves with each version, and I ran into trouble with older sample scenes that are no longer supported in the latest release, which limited my access to some useful features.
Accomplishments
Here are some accomplishments that I am proud of in this project:
Explore toolkits
I had the opportunity to dive into the powerful toolkits offered by Meta and Unity. I discovered new possibilities and functionality within these plugins that not only facilitated my work but also sparked ideas for future projects.
Productive AR tools
I tried to build more productivity-oriented tools in AR. While AR/VR gaming is exciting and immersive, I believe AR has the potential to go beyond gaming, and I aim to create more AR/VR tools that enhance and simplify users' everyday lives.
Play with Hand UI
AR/VR experiences rely on controllers or hand gestures to interact with virtual elements. I enjoyed experimenting with hand gestures as a core interaction method in this project and look forward to exploring more hand-based projects!
Reflection
What’s next
I plan to develop Handy Tools into a fully functional AR quick-link system similar to mobile shortcuts. Like the iPhone’s Shortcuts app, I want this project to make daily tasks more convenient by allowing users to assign specific hand gestures to their favorite tools or even applications on their devices.
Learnings
Building blocks in AR/VR
I once read an article that compared AR/VR development to constructing a house, where we’re still at the stage of building the essential pieces like tables and chairs. Unlike the well-established mobile and desktop platforms, AR/VR is still in its foundational stages. But this early phase is what makes it so thrilling; we’re setting up the future of an immersive system that was once only imagined in sci-fi. Although our progress might seem small, it’s paving the way to create astonishing “buildings” in AR/VR that will redefine user experiences.
Learning from others
While developing Handy Tools, I researched various projects for inspiration. Many creators are around my age, which is both inspiring and motivating—it reminds me that if they can accomplish impressive projects, so can I! Besides official resources from Meta and Unity, I learned a lot from YouTube tutorials and community discussions. The AR/VR field is growing rapidly, and I plan to contribute by sharing my knowledge online in the future, helping others interested in this area to get started.