Personal Reflections - Przemyslaw Kozik (293155)
While working on the projects in the XRD course, I learned a lot about extended reality and how to work with it in Unity. We started with the marker-based augmented reality ping pong project. Marker-based AR introduces virtual elements into the real world using appropriate markers. In our project, one marker is responsible for displaying the board and the other for controlling the player's paddle.
In this project I was responsible for preparing the environment and the repository for group work, for the script that moves the ball and bounces it off the walls, for moving the opponent's paddle, and for the mechanics that keep the player's paddle on the marker. The selection of markers turned out to be crucial, because they had to be irregular enough for the tracking points generated from them to be recognized reliably when the camera was aimed at them. Because the project used the camera provided by the Vuforia engine, the scene elements were much smaller than in a typical Unity scene. This required adjusting Unity's physics configuration so that the ball would bounce naturally off the walls.
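A minimal sketch of the kind of tuning this involved, with hypothetical values that depend on the scale of the board:

```csharp
using UnityEngine;

// Hypothetical setup script: gives the ball a bouncy, frictionless physics
// material so it rebounds off the walls instead of losing energy on contact.
public class BallPhysicsSetup : MonoBehaviour
{
    void Awake()
    {
        var bouncy = new PhysicMaterial("BouncyBall")
        {
            bounciness = 1f,                                // full rebound on impact
            bounceCombine = PhysicMaterialCombine.Maximum,  // use the bouncier of the two colliders
            dynamicFriction = 0f,
            staticFriction = 0f,
            frictionCombine = PhysicMaterialCombine.Minimum
        };
        GetComponent<Collider>().material = bouncy;

        // With very small scene elements the default contact offset is relatively
        // large, so contacts can register before the ball visually touches a wall.
        Physics.defaultContactOffset = 0.001f;
    }
}
```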
Most of the scripts used in the project were related to collisions and forces, which I had learned about during the game development course. Since marker-based AR comes with the risk of losing track of markers, or of their position changing unexpectedly, it is worth protecting against such situations. Vuforia provides a number of events, so the project listens for the loss and recovery of marker tracking and displays warnings to the user. In addition, the paddle marker is used only to change the position of the element along a single axis, which ensures that the object always stays in the correct position relative to the board.
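A minimal sketch of how such event handling can look, assuming Vuforia's ObserverBehaviour API; the warning label object is just a placeholder:

```csharp
using UnityEngine;
using Vuforia;

// Hypothetical helper attached to an image target: warns the player when the
// marker is no longer tracked and hides the warning once tracking returns.
public class MarkerTrackingWarning : MonoBehaviour
{
    [SerializeField] GameObject warningLabel;   // e.g. a world-space "marker lost" text

    void OnEnable()
    {
        GetComponent<ObserverBehaviour>().OnTargetStatusChanged += HandleStatusChanged;
    }

    void OnDisable()
    {
        GetComponent<ObserverBehaviour>().OnTargetStatusChanged -= HandleStatusChanged;
    }

    void HandleStatusChanged(ObserverBehaviour behaviour, TargetStatus status)
    {
        bool tracked = status.Status == Status.TRACKED ||
                       status.Status == Status.EXTENDED_TRACKED;
        warningLabel.SetActive(!tracked);
    }
}
```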
The next project was also based on AR, but without markers. In this project, I was responsible for the ARCore configuration and the mechanics related to rolling the ball and scoring. Using ARCore, we were able to scan the surroundings to find a flat surface on which to place items later. In our project this was a bowling board, prepared as a prefab. While the game is running, the ARSession manages the whole process, and the AR Session Origin contains the camera and has all detected trackables assigned to it. The ARRaycastManager is responsible for locating the board: it checks where the user tapped and, based on the trackables found so far, spawns the board there. After the board is spawned, the plane manager, which is responsible for displaying scanned flat surfaces, is turned off, and all previously found surfaces are hidden. At this point the board behaves as if it were in a regular Unity scene, with the difference that the camera is our phone. As in the previous project, the scripts supporting the game mechanics are mainly based on collision detection and adding force to the Rigidbody of the ball.
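A minimal sketch of the placement flow described above, assuming AR Foundation's ARRaycastManager and ARPlaneManager; the field names are placeholders:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Hypothetical tap-to-place script: raycasts against detected planes where the
// user touched the screen, spawns the board there once, then disables plane
// detection so the scanned surfaces no longer show.
public class BoardPlacer : MonoBehaviour
{
    [SerializeField] GameObject boardPrefab;
    [SerializeField] ARRaycastManager raycastManager;
    [SerializeField] ARPlaneManager planeManager;

    static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();
    GameObject spawnedBoard;

    void Update()
    {
        if (spawnedBoard != null || Input.touchCount == 0) return;

        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        if (raycastManager.Raycast(touch.position, hits, TrackableType.PlaneWithinPolygon))
        {
            Pose hitPose = hits[0].pose;
            spawnedBoard = Instantiate(boardPrefab, hitPose.position, hitPose.rotation);

            // Stop detecting new planes and hide the ones already found.
            planeManager.enabled = false;
            foreach (var plane in planeManager.trackables)
                plane.gameObject.SetActive(false);
        }
    }
}
```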
For games based on ARCore, it is worth remembering that the positions of spawned elements may change because they constantly adapt to the environment. We experienced this first-hand when resetting the ball: its position turned out to be different from the initial one. The solution was to use a position relative to the ball's parent.
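Roughly, that reset can be sketched like this (the names are hypothetical):

```csharp
using UnityEngine;

// Hypothetical reset helper: because ARCore keeps adjusting the anchored board,
// the ball's world position drifts, so the start position is stored and restored
// in the board's local space instead.
public class BallReset : MonoBehaviour
{
    Vector3 startLocalPosition;
    Rigidbody body;

    void Awake()
    {
        body = GetComponent<Rigidbody>();
        startLocalPosition = transform.localPosition;   // relative to the board (parent)
    }

    public void ResetBall()
    {
        body.velocity = Vector3.zero;
        body.angularVelocity = Vector3.zero;
        transform.localPosition = startLocalPosition;   // follows the board wherever ARCore moved it
    }
}
```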
The third project required a bit more commitment and work, as well as time spent reading documentation and various tutorials. It is a virtual reality goalkeeper simulator created using the XR Interaction Toolkit. In this project I was responsible for the configuration of the project and the simulator, the movement system, and the shooting and saving of the ball. The design is based on the XR Origin, which is the centre point of the game, has the camera assigned to it, and tracks the position and movement of the controllers. The movement of the controllers and the actions performed on them are based on the Input System, so it is possible to simulate them using the mouse and keyboard, which worked well for us in the initial version of the game. Later, due to the high dynamics needed to catch the ball, this became very difficult. Since interactions with objects in the XR Interaction Toolkit are based on interactors, an XR Ray Interactor with a shortened range and a graphical representation was assigned to the controllers, which allows catching a ball that has the XR Grab Interactable component assigned. This component also exposes many events representing user actions; in our case we used the catch and drop events to add a point and reset the game. Since it was my first contact with VR and the Oculus headset, I also spent a lot of time getting to know the equipment itself and its capabilities. With each subsequent test of the game, I learned something new and understood better how the individual elements of the XR Interaction Toolkit work.
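The catch and drop handling mentioned above can be sketched roughly like this, assuming the toolkit's selectEntered and selectExited events; the scoring here is simplified:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Hypothetical scoring hook on the ball: catching (grabbing) it adds a point,
// dropping (releasing) it puts the ball back for the next shot.
[RequireComponent(typeof(XRGrabInteractable))]
public class BallCatchScoring : MonoBehaviour
{
    int score;
    Vector3 startPosition;

    void Awake()
    {
        startPosition = transform.position;
        var grab = GetComponent<XRGrabInteractable>();
        grab.selectEntered.AddListener(_ => score++);                             // ball caught
        grab.selectExited.AddListener(_ => transform.position = startPosition);   // ball dropped, reset
    }
}
```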
After the experience from the third project, it was time for the last one. This time we decided to use Oculus Interaction, which provides many helpful built-in features. The project consists of several minigames. I was responsible for preparing the project along with configuring Oculus Integration, the passthrough API, the memory minigame, and the menu. The use of a prefab that includes OVRCamera and InputOVR gave us the hand detection mechanism. In addition, appropriate interactors had to be assigned; in my case these were a poke interactor and a button that reacts to it. After this preparation, it was time to write the algorithm that creates a sequence of clicks and verifies the correctness of the sequence performed by the user. It was a purely programming task, and the only VR-related element was listening for the user to press a button.
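The core of that algorithm is plain C# and can be sketched as follows; the class and field names are hypothetical, and only the button-press callback would be wired to the interaction SDK:

```csharp
using System.Collections.Generic;
using UnityEngine;

// Hypothetical memory-game core: builds a random sequence of button indices,
// then checks each press made by the player against the expected position.
public class MemorySequence : MonoBehaviour
{
    [SerializeField] int buttonCount = 4;
    [SerializeField] int sequenceLength = 5;

    readonly List<int> sequence = new List<int>();
    int nextExpected;

    public void StartRound()
    {
        sequence.Clear();
        nextExpected = 0;
        for (int i = 0; i < sequenceLength; i++)
            sequence.Add(Random.Range(0, buttonCount));
        // Here the sequence would be played back, e.g. by highlighting the buttons.
    }

    // Called from the poke-button's "pressed" event with that button's index.
    public void OnButtonPressed(int buttonIndex)
    {
        if (buttonIndex != sequence[nextExpected])
        {
            Debug.Log("Wrong button - round failed.");
            StartRound();
            return;
        }

        nextExpected++;
        if (nextExpected == sequence.Count)
            Debug.Log("Sequence completed!");
    }
}
```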
Components provided by Oculus Interaction were also used to create the menu. Here the main role is played by the PointableCanvas, which interacts with the RayInteractor assigned to the hands and makes it possible to control the entire game using only hands. To diversify the gameplay, the passthrough API was added, which allows virtual elements to be placed in the real world. This task mainly came down to adding a passthrough layer to the camera. Unfortunately, because this is a new feature, its operation can only be verified in a built version of the application.
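For reference, a minimal sketch of how such a setup can look, assuming the OVRPassthroughLayer component from Oculus Integration; the exact configuration in our project may have differed:

```csharp
using UnityEngine;

// Hypothetical bootstrap attached to the OVRCameraRig: adds a passthrough layer
// behind the virtual content so the real room is visible as the background.
// Passthrough support also has to be enabled on the OVRManager component.
public class PassthroughSetup : MonoBehaviour
{
    void Start()
    {
        var layer = gameObject.AddComponent<OVRPassthroughLayer>();
        layer.overlayType = OVROverlay.OverlayType.Underlay;   // render behind virtual objects
        layer.textureOpacity = 1f;                              // fully visible camera feed
    }
}
```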
This was the project I had the most fun with, thanks to the knowledge acquired while creating the previous projects and the possibility of using it to build something interesting. I found the whole period of working with XR projects very productive and interesting. I learned a lot about the mechanics involved in creating AR and VR applications, got to know tools that make the work quick and pleasant, and also got to know the Oculus headset and its capabilities.