VR Driving Simulator - (Private Repository)
Thesis project: VR driving simulator with mid-air haptic navigation
VR Chat Room
VR room with voice chat and Leap Motion hand tracking
Mid-Air Haptic Navigation Pattern - (Private Repository)
Mid-Air Haptic Navigation Patterns using Ultrahaptics
Eye Tracking Analysis - (Private Repository)
Eye-tracking analysis in the VR driving simulator
Dynamic Navigation - (Private Repository)
Dynamic navigation in the VR driving simulator
Voice Recording and Playback
Voice Recording and Playback feature
Android Development in Unity
Menu Selection Screen
Select one of the available options
To check out more of my work, please visit my GitHub profile:
The Usability of Mid-Air Haptics in Car Navigation Systems - Abstract
Tech Stack: Unity, C#, Python, AWS, Ultrahaptics, Tableau
While several studies have examined the usability of mid-air haptic sensations within the automotive industry, previous research has not yet examined their use in in-car navigation systems. This study therefore assessed whether mid-air haptic sensations could also be of use to in-car navigation, specifically whether their inclusion improves the user experience of the navigation system and helps decrease driver distraction in terms of lane deviation and eyes-off-the-road time. The results indicate that mid-air haptic patterns associated with specific navigation prompts helped drivers understand directions and reach their destination in a timely manner. Participants also found the mid-air haptic sensations reassuring, as they provide continuous feedback on where one has to go, and the patterns made it easier to estimate the distance remaining before a turn. Moreover, integrating mid-air haptic patterns into the in-car navigation system resulted in a smaller mean lane deviation and less eyes-off-the-road time than the baseline condition, which consisted of voice-guided and visual navigation only. The results also indicated that users were the most distracted while using only voice-guided and visual navigation. Based on these findings, it can be concluded that integrating mid-air haptic sensations into in-car navigation systems serves to minimize driver distraction and optimize navigational comprehensibility.
Multi-Player VR play area with voice chat and leap motion - Abstract
Tech Stack: Unity, C#, Unity Networking (UNET)
The goal of this project was to build a VR meeting room with Leap Motion hand interaction. The meeting room was built over UNET and could be accessed from anywhere in the world. Each user could be uniquely identified, and all hand movements were captured and sent over the network. A basic inverse kinematics algorithm replicated the hand motion based on the controller's position. Users could also hold a conversation via voice chat and, with UltraLeap enabled, play with a ball in the scene.
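The project's actual IK code is in the private repository; as an illustration of the kind of "basic inverse kinematics" involved, here is a minimal planar two-bone solver (law of cosines), with all names and conventions my own rather than taken from the project:

```python
import math

def two_bone_ik(target_x, target_y, upper_len, lower_len):
    """Solve a planar two-bone IK chain (e.g. upper arm + forearm) for the
    shoulder angle and elbow bend that place the wrist at the target.
    Returns (shoulder_angle, elbow_bend) in radians; elbow_bend = 0 means
    the arm is fully extended."""
    dist = math.hypot(target_x, target_y)
    # Clamp to the reachable range so unreachable targets make the chain
    # stretch toward them instead of raising a math domain error.
    dist = max(abs(upper_len - lower_len), min(dist, upper_len + lower_len))
    # Law of cosines gives the interior angle at the elbow.
    cos_elbow = (upper_len**2 + lower_len**2 - dist**2) / (2 * upper_len * lower_len)
    elbow_bend = math.pi - math.acos(max(-1.0, min(1.0, cos_elbow)))
    # Shoulder angle = direction to target, offset by the triangle's base angle.
    cos_base = (upper_len**2 + dist**2 - lower_len**2) / (2 * upper_len * dist)
    shoulder = math.atan2(target_y, target_x) - math.acos(max(-1.0, min(1.0, cos_base)))
    return shoulder, elbow_bend
```

In a networked setup like the one described, only the controller (target) positions need to be sent; each client can run a solver like this locally to pose the remote avatars' arms.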
Benchmark Evaluation of the ThunderX2 Processor (HPC) - Abstract
Tech Stack: Linux, C++, HPC, DGEMM, HPL
The goal of this project was to benchmark the ARM ThunderX2 processor using the HPL and DGEMM benchmarks and to compare it with the Intel Core i7 and Xeon processors available at the university at that time. With the standard GCC and Clang compilers (the latest versions available then), the ThunderX2's performance was not up to the mark, but with the Arm compiler there was a significant increase in performance.
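The core idea of a DGEMM benchmark is to time a double-precision matrix multiply and divide the known flop count (about 2n^3 for n x n matrices) by the elapsed time. A minimal sketch of that measurement, here in Python/NumPy rather than the C++ used in the project:

```python
import time
import numpy as np

def dgemm_gflops(n, trials=3):
    """Time C = A @ B for n x n double-precision matrices and report the
    best GFLOP/s over a few trials (a DGEMM performs ~2*n^3 flops)."""
    a = np.random.rand(n, n)  # float64 by default, matching DGEMM
    b = np.random.rand(n, n)
    flops = 2.0 * n**3
    best = float("inf")
    for _ in range(trials):
        start = time.perf_counter()
        a @ b
        best = min(best, time.perf_counter() - start)
    return flops / best / 1e9
```

Taking the best of several trials reduces noise from caches warming up and OS scheduling, which is why benchmark suites like HPL also repeat runs.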
Wireless VR - Abstract
Tech Stack: Java, NetBeans, wireless VR research papers
The main goal of this seminar was to understand how wireless VR works: how data can be transmitted seamlessly with minimum latency. A side project called VR Interpolated Images was created to tackle the lag: when the user moves, the previous image in the frame is interpolated according to the user's head movement. The interpolation scheme replicated the row or column pixels of the image in the last frame, depending on the direction of movement.
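The replication scheme described above can be sketched as a frame shift that fills the newly exposed rows or columns by copying the nearest edge pixels. This is an illustrative reconstruction in Python/NumPy (the original side project was in Java), with function and parameter names of my own choosing:

```python
import numpy as np

def shift_with_edge_fill(frame, dx, dy):
    """Shift a 2D grayscale frame by (dx, dy) pixels and fill the newly
    exposed columns/rows by replicating the nearest edge pixels -- a crude
    stand-in image for when head motion outpaces the arrival of new frames."""
    shifted = np.roll(frame, (dy, dx), axis=(0, 1))
    if dy > 0:                                   # moved down: fill top rows
        shifted[:dy, :] = shifted[dy, :]
    elif dy < 0:                                 # moved up: fill bottom rows
        shifted[dy:, :] = shifted[dy - 1, :]
    if dx > 0:                                   # moved right: fill left columns
        shifted[:, :dx] = shifted[:, dx:dx + 1]
    elif dx < 0:                                 # moved left: fill right columns
        shifted[:, dx:] = shifted[:, dx - 1:dx]
    return shifted
```

The replicated edge pixels are only a plausible guess at the unseen content, but for the small shifts produced between consecutive frames the artifact is far less jarring than a stalled image.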
"University of Bayreuth Scholarship- Winter Semester 2019-2020"
"3rd position in Study Smarter Hackathon, June 2019"