Input Exploration Kit
At SoundxVision, we want to explore more use cases for our open-source input device in extended reality (XR), hence the name Input Exploration Kit (IEK). In a compact package, the device senses finger gestures and communicates wirelessly with an XR device to enhance the hand-tracking experience.
Join our Discord server for the latest updates here.
Slide, swipe and press 
Our approach with the IEK is to use finger gestures such as sliding, swiping and pressing for interacting in XR. These are the effortless gestures we have been familiar with since the mass adoption of the smartphone, and we believe they are still powerful in XR. By using the on-board sensors of the IEK, these gestures can be detected without relying on a camera, so they keep working even in a pitch-dark environment. For press sensing, the IEK uses a force-sensing linear potentiometer (FSLP) to detect the pressure applied by the thumb against another finger, with up to 2048 levels of pressure sensitivity. This analog parameter is great for object manipulation in XR.
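As an illustration of how an application might consume that 2048-level pressure reading, here is a minimal Python sketch. The function name, deadzone value and grab-strength mapping are our own assumptions for demonstration; the IEK firmware and SDK may expose this differently.

```python
# Illustrative sketch: mapping a raw FSLP pressure reading to a
# normalized grab strength for object manipulation in XR.
# Only the 2048-level resolution comes from the post; everything
# else here is a hypothetical example.

PRESSURE_LEVELS = 2048  # FSLP pressure resolution

def pressure_to_strength(raw: int, deadzone: int = 32) -> float:
    """Convert a raw FSLP reading (0..2047) to a 0.0-1.0 grab strength.

    Readings below `deadzone` are treated as no touch, so light,
    accidental contact does not trigger a grab.
    """
    raw = max(0, min(raw, PRESSURE_LEVELS - 1))
    if raw < deadzone:
        return 0.0
    return (raw - deadzone) / (PRESSURE_LEVELS - 1 - deadzone)
```

An XR app could then drive, say, how firmly a virtual object is gripped from this continuous value instead of a binary press.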
Outside XR, the finger gestures detected by the IEK can also be used as a human interface device for controlling computers, phones or tablets, as long as the target device supports Bluetooth Low Energy. We have some demos in which the IEK is used to control a tablet or a computer; check them out on our BLOG.
The hand interfaces
And to take full advantage of the finger gestures, the IEK comes with a set of hand interfaces for swiping, sliding, or combinations of the two. The goal of the hand interfaces is to provide a quick and easy way to control hand tools and appearance right on the hand.
Behind the scenes, the transformation data from a hand-tracking system (e.g. Oculus Hand Tracking) is used to position the UI elements, while the thumb-mounted controller senses the user's finger gestures and uses them to manipulate the UI. Here are some hand interface prototypes from us:

Different approaches to hand interfaces based on swiping and sliding gestures.

Horizontal UI: the selection is navigated by swiping the IEK left/right on another finger.
Vertical UI: the selection is navigated by swiping the index finger up/down on the touch slider's surface.
The combination of vertical and horizontal swiping can be used for a levelled UI: horizontal swipes navigate between the parent sections, and vertical swipes navigate between the child sections. For example, in a VR drawing application, the parent sections may include toolsets such as Pointer, Brushes, Shapes, Loupe and so on. The parent section Brushes may in turn contain tools such as Pencil, Marker and Paint Spray; by navigating the selection to one of these tools, the user's hand is assigned that tool's function, which can then be used by pressing the IEK against another finger.
Fine tuning: in applications where a characteristic of the hand tool is adjustable within a range, the real-time positioning mechanism of the FSLP (up to 800 units) can be used for fine-tuning that characteristic right on the IEK.
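As a sketch of the fine-tuning idea, the ~800-unit position reading could be mapped linearly onto an adjustable tool parameter, such as brush size. The function below is an illustrative assumption, not part of the IEK firmware or SDK:

```python
# Hypothetical sketch: mapping the FSLP's position reading
# (the ~800-unit range mentioned above) onto a tool parameter.

SLIDER_UNITS = 800  # positional resolution of the FSLP

def slider_to_param(position: int, lo: float, hi: float) -> float:
    """Map a slider position (0..800) linearly onto [lo, hi]."""
    position = max(0, min(position, SLIDER_UNITS))
    return lo + (hi - lo) * position / SLIDER_UNITS
```

For example, sliding the thumb along the full strip could sweep a brush size from its minimum to its maximum value in real time.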

In another scenario, the finger gestures can be used for a "story-based" kind of app on AR glasses or smartglasses. This interface also follows the parent-child concept for sorted content: from the user's view, their friends' stories are displayed as stacks of cards, with each stack representing one friend. Swiping up/down navigates through the stories in the centred stack, and swiping left/right navigates between stacks.
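Both the levelled tool UI and the story stacks follow the same parent-child navigation pattern: horizontal swipes move between parents (toolsets or friends), and vertical swipes move within the selected parent's children (tools or stories). A minimal Python sketch of that state machine, with menu contents and method names of our own invention:

```python
# Illustrative parent-child navigation as described above.
# The menu data and API are hypothetical; only the swipe semantics
# (horizontal = parent, vertical = child) follow the post.

MENU = {
    "Pointer": [],
    "Brushes": ["Pencil", "Marker", "Paint Spray"],
    "Shapes": ["Line", "Rectangle", "Circle"],
}

class LevelledUI:
    def __init__(self, menu):
        self.menu = menu
        self.parents = list(menu)
        self.parent = 0
        self.child = 0

    def swipe_horizontal(self, direction):  # -1 = left, +1 = right
        self.parent = (self.parent + direction) % len(self.parents)
        self.child = 0  # entering a new parent resets the child selection

    def swipe_vertical(self, direction):  # -1 = down, +1 = up
        children = self.menu[self.parents[self.parent]]
        if children:
            self.child = (self.child + direction) % len(children)

    def selection(self):
        name = self.parents[self.parent]
        children = self.menu[name]
        return (name, children[self.child] if children else None)
```

In the drawing example, one swipe right moves from Pointer to Brushes, and one swipe up then moves the child selection from Pencil to Marker; the story app would use the same structure with friends as parents and their stories as children.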
It’s open source
This part of SoundxVision will be open source. Based on the Seeed Xiao BLE Sense board and a force-sensing linear potentiometer from InterLink, the hardware is simple enough to assemble if you know a little about soldering and 3D printing (don't worry if you are not familiar; we can help you a bit with this). We minimised the complexity of the device while keeping it in a compact form, so you can make your own without dealing with too much electronics.
For further modifications, for example using the onboard microphone of the Seeed Xiao BLE Sense for voice recognition, the IEK can be programmed using the Arduino IDE or CircuitPython.
And for those who would rather not build one but still want an effortless hand-tracking experience, we plan to sell the IEK as a ready-to-use kit at an affordable price.
The SDK
So far, we have been making progress on an SDK for Unity. It will include APIs for getting input data from the IEK, as well as development guidelines on how to implement the hand interfaces and appearance to optimise the user experience. Stay tuned; we will update you as soon as it is ready.
We are looking for collaborations
Are you a hardware maker or an XR developer who wants to use the thumb-mounted controller in your project? Please don't hesitate to drop us a message; we would love to work with you to deliver effortless interactive experiences in XR.
And the future
Next, we want to make this thumb-mounted device much smaller, perhaps with a very different shape, so that it becomes an input device we can truly use outdoors, effortlessly.
Join our Discord server for the latest updates here.