XRD Week 12 - Final
Repository link: https://github.com/composer404/XRArcade
This week we focused on polishing the card game and bringing all the minigames together. A RayInteractable component has been added to each tab, along with an InteractableUnityEventWrapper to listen for select and hover actions. Cards prepared this way now interact with the RayInteractor, so the user can play with their hands. /Przemek
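The wiring described above can be sketched as follows. This is a minimal example, assuming the Meta Interaction SDK's InteractableUnityEventWrapper with its WhenSelect/WhenHover UnityEvents; the class and handler names are our own, not part of the project.

```csharp
using UnityEngine;
using Oculus.Interaction;

// Sketch: listen to the select/hover UnityEvents exposed by
// InteractableUnityEventWrapper and react on the card tab.
public class CardTabEvents : MonoBehaviour
{
    [SerializeField] private InteractableUnityEventWrapper eventWrapper;

    private void OnEnable()
    {
        eventWrapper.WhenSelect.AddListener(OnCardSelected);
        eventWrapper.WhenHover.AddListener(OnCardHovered);
    }

    private void OnDisable()
    {
        eventWrapper.WhenSelect.RemoveListener(OnCardSelected);
        eventWrapper.WhenHover.RemoveListener(OnCardHovered);
    }

    private void OnCardSelected() { /* e.g. play the card */ }
    private void OnCardHovered()  { /* e.g. highlight the card */ }
}
```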
| Card Game Scripts |
Since the project uses the Passthrough API, we decided to add the ability to move objects remotely using the right controller. Thanks to this, the user can move objects in the real world to any available place. We achieved this effect by assigning an ObjectManipulator and a LineRenderer to the newly created object, and colliders to the objects that can interact with it. In addition, the RayInteractor has been disabled on the right controller so that the two functionalities do not overlap. /Przemek
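A rough sketch of this remote-move behaviour, under our own assumptions: we use a custom mover script rather than the project's actual ObjectManipulator, and read the trigger through OVRInput. A ray from the right controller is drawn with a LineRenderer; while the trigger is held, the hit object follows the ray at its grab distance.

```csharp
using UnityEngine;

// Assumed sketch of remote manipulation with the right controller.
public class RemoteMover : MonoBehaviour
{
    [SerializeField] private Transform rayOrigin;   // right controller anchor
    [SerializeField] private LineRenderer line;
    private Transform grabbed;
    private float grabDistance;

    private void Update()
    {
        Ray ray = new Ray(rayOrigin.position, rayOrigin.forward);
        line.SetPosition(0, ray.origin);
        line.SetPosition(1, ray.GetPoint(5f));

        bool trigger = OVRInput.Get(OVRInput.Button.SecondaryIndexTrigger);
        if (trigger && grabbed == null && Physics.Raycast(ray, out RaycastHit hit))
        {
            grabbed = hit.transform;       // start moving the hit object
            grabDistance = hit.distance;
        }
        else if (!trigger)
        {
            grabbed = null;                // release on trigger up
        }

        if (grabbed != null)
            grabbed.position = ray.GetPoint(grabDistance);
    }
}
```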
For the dart game we decided on improvements to enhance the experience of the whole game. As the box collider on the throws did not work as intended, because it was difficult to carry a throw correctly, we switched to capsule colliders for both the front part, which hits a given area of the board, and the middle part, where the dart is meant to be carried. /Oliwer
A basket with transparent walls and box colliders has been added so that the throws can be moved around. On top of the basket we placed buttons to reset the game or to close it and open the menu. Once the reset button is pressed, all the throws are respawned: OnDestroy removes each dart object from the game, and an initialize step spawns the throws back inside the basket. /Oliwer
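The reset flow can be sketched like this; the class, field, and method names are assumptions for illustration, not the project's actual code.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Sketch of the basket reset: destroy the current darts, then
// instantiate fresh ones at spawn points inside the basket.
public class DartBasket : MonoBehaviour
{
    [SerializeField] private GameObject dartPrefab;
    [SerializeField] private Transform[] spawnPoints;   // inside the basket
    private readonly List<GameObject> darts = new List<GameObject>();

    // Hooked up to the reset button's click event.
    public void ResetThrows()
    {
        foreach (GameObject dart in darts)
            if (dart != null) Destroy(dart);
        darts.Clear();

        foreach (Transform point in spawnPoints)
            darts.Add(Instantiate(dartPrefab, point.position, point.rotation));
    }
}
```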
| A throw with the capsule collider |
To distinguish the number of points awarded for hitting a certain part of the board, the rigidbody of the throw is used to determine the points to subtract. We included all the related functionality in the DartController, which is assigned to the front part of the throw, with the rigidbody exposed as a SerializeField. /Oliwer
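A hedged sketch of that scoring idea, assuming each scoring area of the board carries a small component holding its point value (the BoardSegment helper and the 501 starting score are our assumptions, not the project's code):

```csharp
using UnityEngine;

// Sketch: the front part of the dart detects which board segment
// was hit and subtracts its value from the remaining score.
public class DartController : MonoBehaviour
{
    [SerializeField] private Rigidbody dartRigidbody;
    public static int remainingPoints = 501;

    private void OnCollisionEnter(Collision collision)
    {
        BoardSegment segment = collision.collider.GetComponent<BoardSegment>();
        if (segment != null)
        {
            remainingPoints -= segment.points;
            dartRigidbody.isKinematic = true;   // stick the dart in the board
        }
    }
}

// Assumed helper attached to each scoring area of the board.
public class BoardSegment : MonoBehaviour
{
    public int points;
}
```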
After these preparations, it was time to move all the mini-games onto one stage and connect them to the menu. There were no major problems here: when you click an entry in the menu, the selected object is activated and the others are deactivated. The one exception is CardGame, where the Destroy method is used when the game is turned off and the object is re-instantiated when it is turned on; this ensures the card rotation animation plays every time. In addition, we spent a lot of time fixing smaller bugs, mainly related to interaction and to the positions at which individual elements spawn. /Przemek
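The activate/deactivate pattern described above, with CardGame handled by destroy and re-instantiate, could look roughly like this (all names are our own illustration):

```csharp
using UnityEngine;

// Sketch of the menu: selecting a game activates its root object and
// deactivates the others; CardGame is re-instantiated instead, so its
// card rotation animation plays every time it is opened.
public class GameMenu : MonoBehaviour
{
    [SerializeField] private GameObject[] games;        // darts, hangman, ...
    [SerializeField] private GameObject cardGamePrefab;
    [SerializeField] private Transform cardGameAnchor;
    private GameObject cardGameInstance;

    public void SelectGame(int index)
    {
        for (int i = 0; i < games.Length; i++)
            games[i].SetActive(i == index);
        DestroyCardGame();
    }

    public void SelectCardGame()
    {
        foreach (GameObject game in games)
            game.SetActive(false);
        DestroyCardGame();
        cardGameInstance = Instantiate(cardGamePrefab,
            cardGameAnchor.position, cardGameAnchor.rotation);
    }

    private void DestroyCardGame()
    {
        if (cardGameInstance != null) Destroy(cardGameInstance);
    }
}
```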
Summing up, the entire project can be considered complete, as we have learned about the technical aspects of each of the XR topics. Over that time we gained hands-on experience with each technology that should prove useful in the future. We also learned how to use the XR Interaction Toolkit to build interactions with objects, using Grabbable, Physics Grabbable or Hand Grab Interaction; this allowed us to move objects with colliders from one place to another. In addition, we spent much of the initial phase adding passthrough and hand tracking, so that our own hands, tracked by the Oculus cameras, can interact with game objects, whether to move them or to press on them. As both hand tracking and passthrough are still considered experimental, the dart game may require some precision from the player when making a hit. All of these tools and code changes required us to rebuild in Unity and subsequently check the results on the Oculus headset. /Oliwer
https://youtu.be/0xnPkO1tBP8
This week we also decided to add one additional game – hangman. It is a guessing game in which the user has to guess a word knowing only its length. The game has been built around the simulated keyboard shown below.
| Keyboard buttons |
The first step was to create the keyboard object together with its buttons. Each button has a collider.
| Scripts |
The next step was to write scripts that allow the user to interact with the keyboard. The game consists of three scripts: Keyboard, KeyboardButton and TypingArea.
| Keyboard script |
The Keyboard script consists of methods that handle the insertion of characters and spaces, deletion, and the state of the caps button.
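Those responsibilities can be sketched as below; the method and field names are our assumptions, with the text written into a TextMeshPro input field as described later in the post.

```csharp
using TMPro;
using UnityEngine;

// Sketch of the Keyboard script: insert characters and spaces,
// delete the last character, and toggle caps state.
public class Keyboard : MonoBehaviour
{
    [SerializeField] private TMP_InputField inputField;
    private bool capsPressed;

    public void InsertChar(string c)
    {
        inputField.text += capsPressed ? c.ToUpper() : c.ToLower();
    }

    public void InsertSpace() => inputField.text += " ";

    public void DeleteChar()
    {
        if (inputField.text.Length > 0)
            inputField.text = inputField.text.Substring(0, inputField.text.Length - 1);
    }

    public void ToggleCaps() => capsPressed = !capsPressed;
}
```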
| KeyboardButton script |
This script holds a reference to the Keyboard object and delegates character insertion to it.
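In sketch form, assuming an InsertChar method on the Keyboard script (the names are our illustration):

```csharp
using UnityEngine;

// Sketch of KeyboardButton: each button knows its own character
// and delegates insertion to the shared Keyboard.
public class KeyboardButton : MonoBehaviour
{
    [SerializeField] private Keyboard keyboard;
    [SerializeField] private string character;   // e.g. "a"

    public void Press() => keyboard.InsertChar(character);
}
```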
| TypingArea script |
The TypingArea script handles the hand movements and detects which hand is being used.
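One plausible sketch of that detection, assuming the tracked-hand fingertips are tagged and each typing area triggers its button's press (the tags and Press call are our assumptions):

```csharp
using UnityEngine;

// Sketch of TypingArea: a trigger collider over a key that detects
// which hand's fingertip entered and fires the button press.
public class TypingArea : MonoBehaviour
{
    [SerializeField] private KeyboardButton button;

    private void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("LeftHand") || other.CompareTag("RightHand"))
            button.Press();
    }
}
```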
| Collider for typing area |
With the keyboard buttons designed, the next step was to create a collider for the typing area.
| Layers |
Four additional layers were added to the project for proper interaction.
| Input Canvas |
The input canvas consists of a TextMeshPro Input Field, which was then wired into the keyboard object's script.
/Justyna
Referenced links:
https://www.youtube.com/watch?v=Ril-5dWBOSU&t=1040s - grab and throw objects in VR
https://www.youtube.com/watch?v=H6d-hagFFNc&list=PLQMQNmwN3Fvx2d7uNxMkVOs1aUV-vxrlf - XR Interaction Toolkit setup
https://www.youtube.com/watch?v=1FRqniErAfs - how to interact in VR
Authors: Przemek & Oliwer & Justyna