With insights from Sergej Tihonov, Former Advanced Software Engineer

Can you build your own Augmented Reality glasses? What are the possibilities and limits of AR technology right now? How do users interact with virtual objects in the real world?

State-of-the-art AR glasses – that's DIY. We tested the Leap Motion sensor with Project North Star at our Zühlke Camp. In this blog post, you'll find out whether this combination is really ready for practice, and why we felt a bit like the Avengers during testing.

Hand gestures and object interaction are two extremely exciting and promising development areas in the field of augmented and mixed reality. It was therefore clear to us that we wanted to tackle these two topics at our Zühlke Camp in early May. Never ones to keep things simple, we chose Project North Star, which calls itself "the world's first serious open source headset project". An elementary component is the Leap Motion sensor, which uses two infrared cameras and three infrared LEDs to determine the position of the hands. The cameras record 200 frames per second, and the associated software calculates the 3D position of the hands from the images. With it, Project North Star includes all the functionality we needed for our experiments.

3D print, assembly & setup of Augmented Reality glasses

All the information needed for printing and assembly can be found on the Leap Motion homepage, so we were able to get started right away. The files contain a BOM (bill of materials) listing all the items to buy or print. To reduce shipping costs for the required parts, we adapted the list to the German market and the metric system. The electronic parts and lenses could be ordered from the Smart Prototyping website, which greatly simplified the whole process. (For those interested: a Project North Star kit should be available soon.) After delivery and a few days of 3D printing – which used up almost an entire roll of filament – we had all the parts and could start assembling. Here is a short video of the process:

Project North Star starts upside down

After assembling all the parts, we connected the North Star (the 3D-printed headset) directly to the computer. The first thing that struck us was that the screens were upside down and the image was mirrored in the lenses. That is fine: the depiction is corrected on the software side, so in Augmented Reality all content is displayed the right way round. Just like Microsoft's HoloLens, Project North Star relies on the game engine Unity for its implementation. Unity allows you to create 2D, 3D, VR & AR games and other experiences. The included Unity project contains its own camera script, which takes care of the rotation, the mirroring and the split across the two lenses (a minimal sketch of the idea follows at the end of this section). The necessary Unity package can be found in the same GitHub repository under "Software". It also contains a readme file with a short explanation. The most important step was to install the "Multi-Device Beta Service" from the readme; it is a newer version of the "Leap Motion Developer Kit". LeapAR.unitypackage already includes all existing Leap Motion packages: Core, Hands, GraphicRenderer and InteractionEngine. Thus, we could start directly and try out the provided examples. We found the Discord channel for Project North Star particularly helpful. It is very active and well-structured, and you quickly get help with any problems you encounter. If you want to deviate from the standard plan or add extensions, you can get great advice from other users.
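The actual camera script ships with the North Star Unity project, but the core idea can be illustrated. The following is a minimal, illustrative sketch – not the North Star code itself – of how a Unity camera's output can be flipped and mirrored via its projection matrix; the class name MirrorFlipCamera is our own:

```csharp
using UnityEngine;

// Illustrative sketch only – the real Project North Star camera script also
// handles the per-lens warping and the split across the two displays.
[RequireComponent(typeof(Camera))]
public class MirrorFlipCamera : MonoBehaviour
{
    Camera _cam;

    void Awake()
    {
        _cam = GetComponent<Camera>();
    }

    void OnPreCull()
    {
        // Negative scale on x and y mirrors and flips the rendered image,
        // compensating for screens that are mounted upside down.
        _cam.ResetProjectionMatrix();
        _cam.projectionMatrix *= Matrix4x4.Scale(new Vector3(-1f, -1f, 1f));
    }

    // Flipping the projection inverts triangle winding, so culling
    // must be inverted while this camera renders.
    void OnPreRender()  { GL.invertCulling = true; }
    void OnPostRender() { GL.invertCulling = false; }
}
```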
Implementation

During development, we wanted to implement and test the following functionality:

- A UI that hovers parallel to the user's hand
- The UI can be activated and deactivated by a hand gesture
- The UI contains several tabs with different functionality
- You can switch between the UI tabs using finger gestures
- A tab where numbers or text can be entered
- A tab with different sliders
- A tab with object interaction
- A new gesture built from scratch: the snap

The result of our Augmented Reality glasses can be seen in the following video. For our own gesture, we were inspired by Marvel's Avengers and implemented a snap gesture: with it, the user can halve the number entered in the first tab. Gathering the objects in the third tab also felt a bit like collecting the Infinity Stones. (Illustrative sketches of the hand-following UI, the gesture hysteresis and the snap detection follow at the end of this post.)

Conclusion

First things first: printing and assembling Project North Star was great fun. Although it is extremely bulky, it is quite light. The visual quality of the lenses was good, especially when the room is slightly darkened. Users who normally wear glasses can use the North Star without discomfort or loss of focus. But be aware that Project North Star is a work-in-progress prototype. It is not a marketable product, and the functionality is currently limited to the capabilities of the Leap Motion sensor built into the glasses. Regardless, it is a good way to gain experience with gesture control and object interaction, and to learn from user testing what works well and feels natural.

Our experiences and take-aways from the camp:

- The position and alignment of palms, forefingers and thumbs are detected very reliably and accurately. Detection of the little finger comes close to that.
- The position and orientation of the middle and ring fingers are not very reliable and only provide correct values in about 60% of cases. When you move your hand, there are many blind spots where these fingers are covered by others relative to the sensor. This is hard to work with, and the user experience suffers greatly.
- For gestures triggered by a rotation movement, the angle limits should not be too tight, because you cannot keep your hand 100% still. We therefore recommend that the deactivation threshold be at least 10 degrees greater than the activation threshold, so that no flickering occurs (see the hysteresis sketch below).
- Finger gestures should be very simple and tested with many different people, as fine motor skills in the fingers vary greatly from person to person. The same applies to the differences between the right and left hands.
- When interacting with objects, the user must be actively supported with visual and auditory aids. It helps a lot if the interaction elements adjust or intensify their colours relative to the finger's distance, and if the interaction itself is emphasised by a sound. But even the best aids will not help if they are covered by the hand during the interaction.
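As an illustration of the first item on our feature list, here is a minimal sketch of a UI panel that hovers parallel to the palm. It assumes the Leap Motion Core assets bundled in LeapAR.unitypackage; the names used (LeapProvider, CurrentFrame, PalmPosition, PalmNormal) follow the Leap C# API, but exact names can differ between plugin versions, and the offset and smoothing values are purely illustrative:

```csharp
using Leap;
using Leap.Unity;
using UnityEngine;

// Sketch: keeps a world-space UI panel floating a few centimetres above
// the left palm, oriented parallel to it.
public class PalmUI : MonoBehaviour
{
    public LeapProvider provider;       // e.g. the LeapServiceProvider in the scene
    public Transform uiPanel;           // the world-space canvas to position
    public float hoverDistance = 0.08f; // metres above the palm, illustrative

    void Update()
    {
        Frame frame = provider.CurrentFrame;
        foreach (Hand hand in frame.Hands)
        {
            if (!hand.IsLeft) continue;

            Vector3 palmPos    = hand.PalmPosition.ToVector3();
            Vector3 palmNormal = hand.PalmNormal.ToVector3();

            // Place the panel above the palm, parallel to it, with its
            // "up" following the direction of the fingers.
            uiPanel.position = palmPos + palmNormal * hoverDistance;
            uiPanel.rotation = Quaternion.LookRotation(-palmNormal,
                                   hand.Direction.ToVector3());
            return;
        }
    }
}
```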
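The 10-degree recommendation from the take-aways translates into simple hysteresis logic. This sketch is our own illustration rather than code from the camp project; the concrete thresholds are example values that respect the gap suggested above:

```csharp
// Sketch: hysteresis for a rotation-triggered gesture. The gesture activates
// when the measured angle drops below the activation threshold, but only
// deactivates once it exceeds a noticeably larger threshold, so sensor noise
// around the boundary cannot make the UI flicker on and off.
public class RotationGestureHysteresis
{
    const float ActivationAngle   = 25f; // degrees, illustrative value
    const float DeactivationAngle = 35f; // at least 10 degrees larger

    public bool IsActive { get; private set; }

    public void Update(float angleToTarget)
    {
        if (!IsActive && angleToTarget < ActivationAngle)
            IsActive = true;
        else if (IsActive && angleToTarget > DeactivationAngle)
            IsActive = false;
    }
}
```

Here angleToTarget could be, for example, Vector3.Angle(palmNormal, Vector3.up) for a palm-up activation gesture.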
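Finally, for the snap gesture: we will not reproduce our camp code here, but one plausible heuristic – purely illustrative, with made-up thresholds – is to watch for the middle fingertip leaving the thumb tip and arriving near the palm within a short time window:

```csharp
using Leap;
using Leap.Unity;
using UnityEngine;

// Illustrative heuristic for a snap: the middle fingertip starts pressed
// against the thumb tip, then lands near the palm centre shortly after.
public class SnapDetector : MonoBehaviour
{
    public LeapProvider provider;
    const float TouchRadius = 0.025f; // metres, illustrative
    const float MaxSnapTime = 0.25f;  // seconds, illustrative
    float _primedAt = -1f;

    void Update()
    {
        foreach (Hand hand in provider.CurrentFrame.Hands)
        {
            Vector3 thumbTip  = hand.Fingers[0].TipPosition.ToVector3();
            Vector3 middleTip = hand.Fingers[2].TipPosition.ToVector3();
            Vector3 palm      = hand.PalmPosition.ToVector3();

            // Phase 1: middle finger pressed against the thumb "primes" the snap.
            if (Vector3.Distance(thumbTip, middleTip) < TouchRadius)
                _primedAt = Time.time;

            // Phase 2: shortly afterwards the fingertip lands near the palm.
            bool primed = _primedAt > 0f && Time.time - _primedAt < MaxSnapTime;
            if (primed && Vector3.Distance(middleTip, palm) < TouchRadius * 1.5f)
            {
                _primedAt = -1f;
                Debug.Log("Snap!"); // e.g. halve the number in the first tab
            }
        }
    }
}
```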