
Augmented Reality (AR) is one of the most fascinating technologies, yet many of its possibilities remain unexplored.

 

The purpose of this website is to conduct basic research in the field of AR and, in doing so, to inspire others to use it for their own applications and research.

My first experiences with AR date back to the ARToolKit in 1999. If you want to know more about me or get in touch, click About me.

///////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
Augmented Reality - Quest 3 in action

2023

The Quest 3 is finally here. I've been testing it for a week now and here are some of my experiences summarized in a video. The Quest 3 is a great headset and a lot of fun to use.

///////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
AR Cloud - Real World Augmented Reality with crypto payment

2022

There are many different approaches to the so-called Metaverse. My favorite is an augmented reality (AR) solution that works with location-based services. In this private project, I wanted to try out use cases where you interact with your crypto wallet in such an environment.

Location-Based Donations:

You can donate to a project directly on site. The donation is thus tied to the project, and a smart contract starts the commissioning immediately as soon as the required sum has been collected.

Location-Based Entertainment:

Games and tours that integrate with the real world. In this way, information about a place can be conveyed in a playful way and participants can collect NFTs.

This is just a prototype for testing and far from a fully functional app. The MetaMask wallet is connected to the app, and the wallet's balance is retrieved via the Etherscan API. Due to SDK issues with MetaMask, the transactions are only dummy transactions that then start the process.
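As an illustration of the balance lookup, here is a minimal Swift sketch against Etherscan's public account/balance endpoint. The wallet address and API key are placeholders, and the networking code in the actual prototype may look different.

```swift
import Foundation

// Minimal sketch of the balance lookup (not the exact app code).
// The endpoint follows Etherscan's "module=account&action=balance" convention.
struct EtherscanBalanceResponse: Decodable {
    let status: String
    let message: String
    let result: String   // balance in wei, returned as a decimal string
}

func fetchBalance(address: String, apiKey: String) async throws -> Double {
    var components = URLComponents(string: "https://api.etherscan.io/api")!
    components.queryItems = [
        URLQueryItem(name: "module", value: "account"),
        URLQueryItem(name: "action", value: "balance"),
        URLQueryItem(name: "address", value: address),
        URLQueryItem(name: "tag", value: "latest"),
        URLQueryItem(name: "apikey", value: apiKey)
    ]
    let (data, _) = try await URLSession.shared.data(from: components.url!)
    let response = try JSONDecoder().decode(EtherscanBalanceResponse.self, from: data)
    let wei = Double(response.result) ?? 0
    return wei / 1e18   // convert wei to ETH for display in the AR overlay
}
```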

Since a suitable AR headset from Apple is unfortunately not yet available, I use a MagiMask headset with an iPhone as the display to test my use cases. To get smooth video for the recording, I used a gimbal to simulate the headset view.

///////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
AR Outdoor Game

2021

Proof of Concept

This is a proof of concept for building an Augmented Reality outdoor game with Apple's ARKit. For this, I used two iPhones. One of them is mounted on a customized Bluetooth rifle and serves as the blaster. The other works with a MagiMask as a headset.

I also extended a hardware buzzer with a Bluetooth transmitter to use it as a door opener for the Tatooine house.

The app uses Apple's ARKit Collaborative Session to synchronize the two iPhones in a shared augmented reality environment. Both devices launch the same app, but one is preconfigured as the headset display and the other as the blaster. The blaster iPhone is mounted on a custom-made Bluetooth gun. I removed everything from the gun except the grip, the trigger, and the Bluetooth transmitter inside, and added a phone holder to attach the iPhone to the grip. The virtual blaster is a Reality Composer scene with a notification trigger that plays the shot sound and starts the animation of the plasma bolts in front of the blaster's barrel.

When the trigger is pressed, the Bluetooth transmitter sends this signal to the app, and the Reality Composer scene's notification trigger starts the plasma bolt sound and animation.
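In code, this wiring can look roughly like the sketch below. The names Blaster, MainScene and fireBolt are assumptions: Reality Composer generates a Swift class per .rcproject, and notification triggers are fired via scene.notifications.<triggerName>.post(). The "0x01 means trigger pressed" byte is also just an assumed protocol for the Bluetooth transmitter, and the CoreBluetooth discovery and subscription steps are omitted.

```swift
import CoreBluetooth
import RealityKit

// Sketch of wiring the rifle's trigger to the Reality Composer scene.
final class TriggerHandler: NSObject, CBPeripheralDelegate {
    var blasterScene: Blaster.MainScene?    // assigned after Blaster.loadMainScene() succeeds

    // CoreBluetooth calls this when the trigger characteristic changes,
    // i.e. when the physical trigger on the gun grip is pulled.
    func peripheral(_ peripheral: CBPeripheral,
                    didUpdateValueFor characteristic: CBCharacteristic,
                    error: Error?) {
        guard error == nil, characteristic.value?.first == 0x01 else { return }
        // Fires the notification trigger: shot sound + plasma bolt animation.
        blasterScene?.notifications.fireBolt.post()
    }
}
```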

In the app, I use Apple's ARKit Collaborative Session to share the Bluetooth gun's location and show the Reality Composer scene of the blaster on the gun in real time. The Mandalorian blaster is very narrow and small, so overlaying the virtual object onto the real gun does not work as well as it should. This works much better with large virtual objects, as in my AR Shooter PoC (see below).
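For developers who want to reproduce this, the collaborative part boils down to enabling collaboration in the configuration and relaying ARKit's collaboration data between the two phones, for example over Multipeer Connectivity. A minimal sketch (my own class names; the Multipeer session is assumed to be set up and connected elsewhere):

```swift
import ARKit
import MultipeerConnectivity

// Both iPhones run this; ARKit's collaboration data is exchanged over Multipeer.
final class SharedSessionController: NSObject, ARSessionDelegate {
    let arSession: ARSession
    let mcSession: MCSession

    init(arSession: ARSession, mcSession: MCSession) {
        self.arSession = arSession
        self.mcSession = mcSession
        super.init()
        arSession.delegate = self

        let configuration = ARWorldTrackingConfiguration()
        configuration.isCollaborationEnabled = true     // share anchors between devices
        arSession.run(configuration)
    }

    // ARKit periodically emits collaboration data that must reach the other device.
    func session(_ session: ARSession, didOutputCollaborationData data: ARSession.CollaborationData) {
        guard !mcSession.connectedPeers.isEmpty,
              let encoded = try? NSKeyedArchiver.archivedData(withRootObject: data,
                                                              requiringSecureCoding: true) else { return }
        try? mcSession.send(encoded, toPeers: mcSession.connectedPeers, with: .reliable)
    }

    // Call this from the MCSessionDelegate when data arrives from the other iPhone.
    func didReceive(_ data: Data) {
        if let collaborationData = try? NSKeyedUnarchiver.unarchivedObject(
            ofClass: ARSession.CollaborationData.self, from: data) {
            arSession.update(with: collaborationData)
        }
    }
}
```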

I used ARKit's personSegmentation frame semantics so that my hand on the grip is rendered in front of the virtual object.
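This is a one-line addition to the session configuration; a short sketch:

```swift
import ARKit

// Person segmentation added to the world-tracking configuration so the real hand
// on the grip occludes the virtual blaster instead of vanishing behind it.
func makeConfiguration() -> ARWorldTrackingConfiguration {
    let configuration = ARWorldTrackingConfiguration()
    configuration.isCollaborationEnabled = true   // kept from the shared-session setup above
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentation) {
        configuration.frameSemantics.insert(.personSegmentation)   // .personSegmentationWithDepth is the depth-aware variant
    }
    return configuration
}
```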

The second part of my PoC is the village. Here I wanted to test how to interact with virtual objects outdoors. I used a Bluetooth button as the opener for the Tatooine house. Syncing the physical button with the virtual door opener of the house was complicated, so I used a buzzer because of its size: this made it much easier to press the Bluetooth button at the location of the virtual button.

The Tatooine house is also a Reality Composer scene with a notification trigger to play the door sound and start the door animation. The blur has a little animation to look more realistic.

As with all of my prototypes, this is only a test of what is possible and still far from a fully implemented app.

The goal was to test interactions between the real and the virtual world using Bluetooth transmitters, and to see how the weather influences position tracking.

The user experience with this headset solution has room for improvement, but for me this is currently the best way.

///////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
AR Shooter

2021

Proof of Concept

This is a proof of concept for building an Augmented Reality shooter with two iPhones using Apple's ARKit and the MagiMask headset.

In this PoC I'm using two iPhones with Apple's ARKit Collaborative Session to share the weapon's data and location. The weapon is a custom Bluetooth rifle with an iPhone mount on the front. The virtual gun is a Reality Composer scene with a notification trigger to play the shot sound and start the shot animation. The robot is also a Reality Composer scene with an animation that starts as soon as you approach the virtual object.

When the Bluetooth trigger is pressed, the Bluetooth transmitter sends this signal to the app, and the Reality Composer scene's notification trigger starts the shot sound and the bullet animation. With Apple's ARKit Collaborative Session, I can position the virtual gun in real time at the location of the iPhone attached to the Bluetooth gun. Apple's ARKit also features person segmentation, so my hand remains visible over the virtual gun.
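One way to realize this positioning (a sketch of the approach, not necessarily the app's exact code): with collaboration enabled, ARKit adds an ARParticipantAnchor for the peer device, and the loaded Reality Composer gun scene can be parented to that anchor so it follows the iPhone on the rifle.

```swift
import ARKit
import RealityKit

// Sketch: pin the virtual gun to the pose of the other iPhone via its ARParticipantAnchor.
final class GunPlacement: NSObject, ARSessionDelegate {
    let arView: ARView
    let gunScene: Entity & HasAnchoring   // loaded Reality Composer scene

    init(arView: ARView, gunScene: Entity & HasAnchoring) {
        self.arView = arView
        self.gunScene = gunScene
        super.init()
        arView.session.delegate = self
    }

    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for anchor in anchors where anchor is ARParticipantAnchor {
            // Anchor the virtual gun at the pose of the iPhone mounted on the rifle.
            let holder = AnchorEntity(anchor: anchor)
            holder.addChild(gunScene)
            arView.scene.addAnchor(holder)
        }
    }
}
```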

This is a prototype only and still far from a fully implemented and tested app. The headset solution also has room for improvement, especially with regard to the user experience. However, this work offers a straightforward way to gain experience with ARKit until Apple's headset is readily available.

///////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
AR Whiteboard

2020

Proof of Concept

Working from home seems to be the new normal, and collaborative online meeting tools help to make it more productive. However, an easy and affordable whiteboard collaboration tool is still missing: it should be simple to set up, easy to use, and not too expensive.

I developed a proof of concept that brings an Augmented Reality whiteboard into your home office.

I used two iPhones for this PoC: one as the headset display and the other as a remote controller. The same app runs on both devices, and they are connected via Apple's Multipeer Connectivity framework, which lets the remote controller send its data to the headset app. First, you choose which device is the headset view and which is the remote. Then you position the whiteboard on the wall. After that, you can place colored notes with text, images, or text only on the whiteboard, at the position of the small arrow in the field of view. The texts are entered with a speech-to-text function on the remote app.
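For reference, the speech-to-text part can be done with Apple's Speech framework. The sketch below is only a minimal illustration (class and callback names are my own), and it assumes microphone and speech-recognition permissions have already been granted.

```swift
import Speech
import AVFoundation

// Minimal dictation helper for the remote: streams microphone audio into a
// speech recognizer and hands the transcribed text to the caller.
final class NoteDictation {
    private let recognizer = SFSpeechRecognizer()
    private let audioEngine = AVAudioEngine()
    private var request: SFSpeechAudioBufferRecognitionRequest?
    private var task: SFSpeechRecognitionTask?

    func start(onText: @escaping (String) -> Void) throws {
        let request = SFSpeechAudioBufferRecognitionRequest()
        self.request = request

        let inputNode = audioEngine.inputNode
        let format = inputNode.outputFormat(forBus: 0)
        inputNode.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
            request.append(buffer)            // feed microphone audio to the recognizer
        }
        audioEngine.prepare()
        try audioEngine.start()

        task = recognizer?.recognitionTask(with: request) { result, _ in
            if let text = result?.bestTranscription.formattedString {
                onText(text)                  // text for the whiteboard note
            }
        }
    }

    func stop() {
        audioEngine.stop()
        audioEngine.inputNode.removeTap(onBus: 0)
        request?.endAudio()
        task?.cancel()
    }
}
```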

This app is only a prototype of the user interface and a proof of the technical possibilities; many potential features are still missing. Additional features could include manual positioning of objects and using Collaborative Sessions so that the remote iPhone can act as a pointer on the whiteboard.

However, it should be relatively easy for iOS developers to build this app themselves by using Apple's Multipeer Connectivity framework and pairing the two iPhones to send the information from the remote to the headset; the rest is ARKit basics.
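A minimal sketch of that pairing and transport layer (my own names; the service type and note format are assumptions): the headset advertises, the remote browses and invites, and dictated notes travel as JSON.

```swift
import MultipeerConnectivity
import UIKit

// Hypothetical note payload; the real app's model is not published.
struct Note: Codable {
    var text: String
    var colorName: String
}

final class WhiteboardLink: NSObject, MCSessionDelegate,
                            MCNearbyServiceAdvertiserDelegate, MCNearbyServiceBrowserDelegate {
    private let serviceType = "ar-whiteboard"
    private let peerID = MCPeerID(displayName: UIDevice.current.name)
    private lazy var session = MCSession(peer: peerID, securityIdentity: nil, encryptionPreference: .required)
    private var advertiser: MCNearbyServiceAdvertiser?
    private var browser: MCNearbyServiceBrowser?
    var onNote: ((Note) -> Void)?     // headset side: place the note on the whiteboard

    // Headset side: make the device discoverable.
    func startAsHeadset() {
        session.delegate = self
        let advertiser = MCNearbyServiceAdvertiser(peer: peerID, discoveryInfo: nil, serviceType: serviceType)
        advertiser.delegate = self
        advertiser.startAdvertisingPeer()
        self.advertiser = advertiser
    }

    // Remote side: look for the headset and connect.
    func startAsRemote() {
        session.delegate = self
        let browser = MCNearbyServiceBrowser(peer: peerID, serviceType: serviceType)
        browser.delegate = self
        browser.startBrowsingForPeers()
        self.browser = browser
    }

    // Remote side: send a dictated note to the headset.
    func send(_ note: Note) {
        guard let data = try? JSONEncoder().encode(note), !session.connectedPeers.isEmpty else { return }
        try? session.send(data, toPeers: session.connectedPeers, with: .reliable)
    }

    // MARK: Delegate callbacks (only the interesting ones do real work)
    func advertiser(_ advertiser: MCNearbyServiceAdvertiser, didReceiveInvitationFromPeer peerID: MCPeerID,
                    withContext context: Data?, invitationHandler: @escaping (Bool, MCSession?) -> Void) {
        invitationHandler(true, session)      // headset accepts the remote's invitation
    }
    func browser(_ browser: MCNearbyServiceBrowser, foundPeer peerID: MCPeerID, withDiscoveryInfo info: [String: String]?) {
        browser.invitePeer(peerID, to: session, withContext: nil, timeout: 10)
    }
    func browser(_ browser: MCNearbyServiceBrowser, lostPeer peerID: MCPeerID) {}
    func session(_ session: MCSession, didReceive data: Data, fromPeer peerID: MCPeerID) {
        if let note = try? JSONDecoder().decode(Note.self, from: data) { onNote?(note) }
    }
    func session(_ session: MCSession, peer peerID: MCPeerID, didChange state: MCSessionState) {}
    func session(_ session: MCSession, didReceive stream: InputStream, withName streamName: String, fromPeer peerID: MCPeerID) {}
    func session(_ session: MCSession, didStartReceivingResourceWithName resourceName: String, fromPeer peerID: MCPeerID, with progress: Progress) {}
    func session(_ session: MCSession, didFinishReceivingResourceWithName resourceName: String, fromPeer peerID: MCPeerID, at localURL: URL?, withError error: Error?) {}
}
```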

///////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
Ground Imager

2017

Proof of concept for marking search areas and imaging different metals in the ground.

A metal detector can recognise metal objects in the ground when you move the coil of the detector over the ground.

Based on this measurement, the detector can differentiate between different metals in the ground and produce corresponding acoustic output signals.

Google Tango technology can track the motion and position of the phone in the real world using its cameras and depth sensor. Unfortunately, Google has discontinued the Tango project.

Detect and mark objects 

The Ground Imager app combines metal detecting with Google Tango technology. Ground Imager analyses the acoustic output of the metal detector and creates a virtual object at the position of the metal in the ground in real time. Ground Imager can also use different colors for the virtual objects by analysing the frequency of the acoustic output, which differs between metals.
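Purely as an illustration of the idea (the actual app ran on Google Tango), here is a small Swift sketch of mapping the detector's tone to a marker color. The frequency bands are invented placeholders, not calibrated detector values.

```swift
import Foundation

// Illustrative sketch only: tone of the detector audio -> class/color of the marker.
enum TargetClass: String {
    case ferrous = "red"            // low tone, e.g. iron
    case midConductor = "yellow"    // mid tone
    case highConductor = "green"    // high tone, e.g. silver or copper
}

// Rough dominant-frequency estimate via zero crossings; good enough for a
// single, fairly clean detector tone in a short audio buffer.
func dominantFrequency(of samples: [Float], sampleRate: Float) -> Float {
    guard samples.count > 1 else { return 0 }
    var crossings = 0
    for i in 1..<samples.count where (samples[i - 1] < 0) != (samples[i] < 0) {
        crossings += 1
    }
    return Float(crossings) * sampleRate / (2 * Float(samples.count))
}

// Maps the tone to the class (and thus the color) of the virtual marker
// placed at the current coil position.
func classify(frequency: Float) -> TargetClass {
    switch frequency {
    case ..<300:  return .ferrous
    case ..<600:  return .midConductor
    default:      return .highConductor
    }
}
```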

Mark the search area

The Ground Imager app can mark the area the coil has covered with a transparent virtual overlay on the ground. With this feature you can always see where you have already scanned for metal and optimise your coil sweeps.

In the future, this information could also be shared with other users in the same area.

Hardware

XP Deus Metal Detector with a tablet holder for the Lenovo Phab 2 Pro with Google Tango. 

The loudspeaker output of the detector is connected to the microphone input of the Phab 2 Pro with a cable. The cable contains a voltage divider to adapt the line-out voltage to the high sensitivity of the microphone input.
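As a rough illustration of that attenuation (the resistor values here are only an example, not the ones in my cable), it behaves like a simple voltage divider: V_mic = V_line · R2 / (R1 + R2). With, for example, V_line ≈ 1 V, R1 = 100 kΩ and R2 = 1 kΩ, the microphone input sees roughly 10 mV.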

Ground Imager works with any metal detector that has a headphone output.

///////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
Bertelsmann Hackathon

2017​

Mentor for the project "Augmented Reality Commerce App" with Google Tango.

The goal was to build an app that shows products in the room with augmented reality and connects them to a shop website.

///////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
Hotels Now!

2012

The Hotels Now! service was developed for hotel reservations on the location-based augmented reality platform Layar.

The app was connected to an API to show the available hotels in the area. The hotel could be booked directly via a deep link.

The project was sold to HRS.de (Hotel Reservation Service).

Computer Bild

///////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
ARToolKit Configurator

WinAR Manual

 

1. Choose AR element

2. Threshold and video settings

3. Window and color settings

4. Optional video source 

5. Pattern for AR element 

6. 3D Model for AR element

7. Start AR​ project with all AR elements

Augmented Reality - Tools
Augmented Reality - Research

Diploma

1999-2001


"Pattern recognition in their use in the Augmented Reality"

RFH Cologne - University of Applied Sciences

 

The project consisted of two parts. The first part was an easy-to-use augmented reality project configurator based on ARToolKit; for this purpose, the configurator WinAR was developed by my fellow student Osman Keskin.

The second part dealt with pattern recognition and augmented reality use cases. In addition, various pattern tools were developed and tested.

WinAR

The WinAR application offered a user interface for creating augmented reality projects. Several AR elements could be managed in one project. Each AR element had a pattern, an OpenGL 3D object, and its own settings. The configurator then started the current project with ARToolKit.

 

Hardware

A CyberMaxx 2.0 was upgraded with a camera module into an augmented reality headset.

The Pattern Box, the Pattern Arm, and the Pattern Glove were developed for special AR use cases.

 

Augmented Reality Use Case

With the help of the augmented reality application, any user could replace a processor on a mainboard without prior knowledge. The instructions were placed in the desired position using the Pattern Arm; the user then only had to work through the individual steps one after another.

The individual components and steps were marked with signs (A1-E1) and colors. Animated arrows helped with the directions of movement.
