XRD Personal Reflections – Oliwer Szefer 293139, Group X4
From a personal perspective, I consider the XRD course inspiring and useful. As I already had some knowledge from the GMD course, it was easier to work through the required tasks. The projects I worked on with the other group members were divided according to their specific topics. For the first two it was possible to work remotely, as they were mobile apps, while the third and fourth required a headset and more time to configure so that everything worked correctly.
As the first project was related specifically to augmented reality, the goal was to display and interact with a virtual object placed in the real world using a mobile device and its camera. The Vuforia SDK, alongside Unity, was used to make a ping-pong game in which a QR code displayed a virtual wall for the player to rebound the incoming ball. It was necessary to include a trackable ground plane to be able to move it within the right area. To improve the overall experience of this project, I also added audio effects the way I had learned during the GMD course, meaning serialized fields with the specific audio names and a function in the AudioManager file to play them.
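The AudioManager approach described above could look roughly like this; the class structure, field names and the `bounce` sound are my own reconstruction for illustration, not the exact project code.

```csharp
using UnityEngine;

// Hypothetical sketch of the AudioManager described in the text:
// clips and their names are assigned through serialized fields in the
// Inspector, and gameplay code plays a clip by passing its name.
public class AudioManager : MonoBehaviour
{
    [SerializeField] private string bounceSoundName; // name used by callers
    [SerializeField] private AudioClip bounceClip;   // the clip itself
    private AudioSource audioSource;

    private void Awake()
    {
        audioSource = GetComponent<AudioSource>();
    }

    // Called from gameplay code, e.g. when the ball rebounds off the wall.
    public void Play(string soundName)
    {
        if (soundName == bounceSoundName)
        {
            audioSource.PlayOneShot(bounceClip);
        }
    }
}
```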
Besides that, I added and resized invisible walls to prevent the ball from leaving the boundaries. Finally, I made the UI, specifically a main menu from which the player can start the game, change the options (volume and ball speed) or quit. This project was a good way to get started with AR in general, because the division of work was clear and everyone knew what to do. The only downside of using Vuforia was that each member needed to import and configure it locally. The project was nevertheless a good base for the further work I was required to do.
The second project was a continuation of the AR work, this time using ARCore. My group decided to make a bowling game, where the board, the ball and the UI were made in Unity. The ARCore plugin could be included directly in the project, so no further configuration was needed. With the use of anchors it was possible to recognise the surroundings and scan the surface where the bowling board would later be displayed. When I tested the overall experience after the last update, I found that it also requires some screen interaction from the user: once the room is scanned and a surface is detected, the player has to tap the screen where the surface is shown so that the bowling board appears on top of it. This gives the impression that the object has been placed in the real world. Compared to Vuforia, ARCore does not require any specific template or QR code; however, as its documentation offers fewer in-depth explanations, it also demands more research time.
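The tap-to-place step described above is commonly done in Unity through the AR Foundation wrapper around ARCore; a minimal sketch might look like this, assuming a `boardPrefab` and an `ARRaycastManager` wired up in the Inspector (both placeholders, not the actual project code).

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Sketch of tap-to-place: raycast the screen touch against detected
// planes and spawn (or move) the bowling board at the hit pose.
public class TapToPlaceBoard : MonoBehaviour
{
    [SerializeField] private ARRaycastManager raycastManager;
    [SerializeField] private GameObject boardPrefab; // the bowling board

    private static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();
    private GameObject placedBoard;

    private void Update()
    {
        if (Input.touchCount == 0) return;
        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        // Raycast from the tap position against the scanned surfaces.
        if (raycastManager.Raycast(touch.position, hits, TrackableType.PlaneWithinPolygon))
        {
            Pose hitPose = hits[0].pose;
            if (placedBoard == null)
                placedBoard = Instantiate(boardPrefab, hitPose.position, hitPose.rotation);
            else
                placedBoard.transform.SetPositionAndRotation(hitPose.position, hitPose.rotation);
        }
    }
}
```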
Moving on to the third project, this one focused specifically on virtual reality. The idea was a goalkeeper minigame in which the player has to save as many balls as possible. The XR Interaction Toolkit played a major role in configuring this project, as there are many guides explaining how to use its built-in scripts, especially for tracking the pads with the XRRayInteractor, which raycasts against an object, in this case the ball flying towards the keeper (the player). As my group spent most of the time working together in the lab configuring the project, the basic menu also had to be replaced with a VR-adapted one placed in the scene, since the project uses a VR scene with a camera driven by the headset. Moreover, the XRGrabInteractable component was used to catch the ball when it was within the player's reach, and audio was added to notify the player about each kick. Compared to the AR projects, this VR one required more time from us: the interactive features need a headset and constant testing to make sure there are no major errors. After everything was tested with the Oculus headset, I also updated the cosmetics of the surroundings to make the experience even better, adding a skybox so the game looks like it takes place at night.
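The catchable-ball setup described above can be sketched roughly as follows; the component values are assumptions about a typical XR Interaction Toolkit configuration, not the project's exact settings.

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Sketch: make the incoming ball grabbable by the keeper's hands.
// XRGrabInteractable lets an interactor (controller/hand) pick the
// object up once it is within range.
[RequireComponent(typeof(Rigidbody))]
public class CatchableBall : MonoBehaviour
{
    private void Awake()
    {
        var grab = gameObject.AddComponent<XRGrabInteractable>();
        // Velocity tracking keeps the caught ball physically plausible.
        grab.movementType = XRBaseInteractable.MovementType.VelocityTracking;
        grab.throwOnDetach = true; // lets the keeper toss the ball back out
    }
}
```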
The last project was based specifically on XR, where the point was to combine virtual reality with the real world. To do that, my group started by configuring the interaction setup with the OVRCameraRig and InputOVR components responsible for interaction with the virtual objects. This time, besides controller support, hand detection was also included, since the target was a Passthrough game using hand tracking to interact with mini-games such as a pattern-memory game, a dart game and a card game. The scripts used originate from the Oculus Interaction SDK. The ones used in every mini-game are Grabbable and HandGrabInteractable, because together they allow the player to catch and carry a virtual object by reacting to hand actions detected by the headset's cameras. From a personal perspective, I spent most of my time working on the dart game, where I used the built-in PhysicsGrabbable script, which lets a carried object be thrown and keep flying in the direction of the hand's motion. As each of the games used Passthrough, our group also added the possibility to move the mini-games around, so the player can place them anywhere they like.
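The dart throw relied on the Oculus Interaction SDK's PhysicsGrabbable; the following is only a simplified, self-written illustration of the underlying idea — track the carried object's velocity, then hand that velocity back to the rigidbody on release so the dart keeps flying in its travel direction.

```csharp
using UnityEngine;

// Simplified sketch of throw-on-release (the real project used the
// Oculus Interaction SDK's PhysicsGrabbable instead of this script).
[RequireComponent(typeof(Rigidbody))]
public class ThrowOnRelease : MonoBehaviour
{
    private Rigidbody body;
    private Vector3 lastPosition;
    private Vector3 velocity; // estimated velocity while carried

    private void Awake()
    {
        body = GetComponent<Rigidbody>();
        lastPosition = transform.position;
    }

    private void FixedUpdate()
    {
        // Estimate the dart's velocity while the hand carries it.
        velocity = (transform.position - lastPosition) / Time.fixedDeltaTime;
        lastPosition = transform.position;
    }

    // Called by the grab system when the hand lets go of the dart.
    public void Release()
    {
        body.isKinematic = false;
        body.velocity = velocity;
    }
}
```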
Summing up, it was a great experience to learn about each of these technologies during the XRD course. The VR projects required a little more time because of the headset work; however, considering how much time was available for each project, their results are all the more satisfying to me. Everything was achieved the way I imagined it as a team member: a pong game and a bowling game (AR), and a goalkeeper game and the mini-games (VR and XR).