Before we get too deep into everything, I’d like to show all of you the trailer for our game, League of War: VR Arena, so that you have some basis for what I’ll be referring to throughout this talk. 


Getting into the session goals: these lessons came about through creating a League of War game for the Sony PlayStation VR, and that game was League of War: VR Arena. Prior to VR Arena, the League of War brand existed solely on mobile. In this session I'll be sharing with you what worked, what didn't, and how we came to those conclusions.

What works on mobile doesn't always work directly in VR, due to the immersion the player experiences in the VR space. Player comfort was a new area we had to take into consideration and cater to, and that element of course shifted how we handled the UX and UI in the game. I hope that by the end of this, all of you will have a better understanding of how to approach creating a VR game from the UX of a mobile game. Largely this is accomplished by identifying a primary gameplay element and implementing UI/UX features that enhance that core experience.


We built our VR game around cultivating the nostalgic feeling of “playing with toys” and “playing in an arcade.” We did so by introducing elements that heightened the core experience while keeping it simple, fun, and engaging. I'll be identifying the key UI/UX elements that let us take the core experience, which is to say the fundamentals of a successful game on one platform (in this case mobile), and faithfully develop it for the VR platform while maintaining the brand's integrity.


There were three core areas where we had to problem solve the UX:

  • How the battlefield was presented
  • How players interacted with their units on and off the battlefield
  • How the UI menus were handled

Since the game revolves around battling with your vehicles, the first thing we wanted to nail down was how the battlefield was presented to the player. The two main avenues we explored were placing the battlefield on the floor or raising it up on a table in a fixed spot.


There were a lot of exciting possibilities when we thought about using the entire floor. A Pokémon battle came to mind (except with tanks instead of Pokémon), where you have a whole field to deploy your units on, or the idea of an arm-mounted deck holder similar to what's seen in Yu-Gi-Oh!, where you could use a deck of cards to play your units onto the field. But before we got in over our heads, we broke down the pros and cons of a setup that utilized the whole floor space.



PROS of the floor:

  • The battlefield is set at a larger scale, as it occupies the entire floor space, making it naturally more immersive.
  • It would allow the player to walk into and around the battlefield; they could bend down, or walk from one end to the other.
  • With more space, the player could use a remote controller, or something along the lines of an item or card they could throw onto the field.


CONS of the floor:

  • The player would need a larger safe space to play. We don't know how much space each player has, whether they're playing in a living room or a bedroom. This posed an issue because we didn't want players to feel they couldn't fully enjoy the game based on how much room they had.
  • With the need for a larger space, players face a higher possibility of running into items and furniture in their home. This is something we obviously wanted to avoid, as we don't want anyone getting hurt.
  • If the player is walking in a larger space, they also run the risk of losing their orientation. PlayStation VR does not track leg movement, so if a player walks around, looks down, and realizes their “digital” legs are not moving the same as in real life, it may cause disorientation.
  • When we looked into components such as the deck controller affixed to the player's arm, where they could pull a card and throw it on the field, we initially tested how comfortable this was by holding our arms in place and mimicking how the player would need to move. The pacing of having to turn your head and look down toward your arm, on top of needing to keep your arm raised to about chest height for over 30 seconds, proved quite draining, especially considering the Design Team set matches to be a couple minutes long. The fatigue of keeping your arm raised is the same feeling as using an easel for the first time in a life drawing class.


While we liked the thought of having the battlefield on the floor to provide a larger sense of scale and feel more immersive, some of the cons were a bit too concerning for us to see it as a viable option in the long run. In turn, we then broke down how the battlefield would function if it was on a table in front of the player.


Initially, when we thought of using a table, it wasn't as appealing as the floor idea…well, because it's a table. However, the more we looked into it, the more beautifully simplistic it turned out to be, and it helped the game feel more in line with a traditional arcade.


PROS of the table:

  • A great thing about the table was that the player would have a grounding focal object by which to navigate the virtual world. This meant we didn't have to worry about players losing their positioning, or noticing as much that they didn't have legs, since their focus would be on the table in front of them.
  • The table would allow players to safely look around the battlefield without losing their sense of orientation. The player would always be able to tell where they were in relation to their armies and the enemy's side of the table.
  • As I mentioned earlier, the table brought more of an arcade atmosphere to the game, which was a direction we wanted to embrace. The idea of pitting player versus player across a table was reminiscent of arcades in the 80s.


The 1 major CON of the table:

  • The battlefield is on a smaller scale, which limits how fully the player can immerse themselves. They could walk around the battle, but not through it like they could if it were on the floor. The player could still lean down and put their face close to the tabletop objects, but it wasn't quite the same impression as life size on the floor.


We ultimately went with the table approach over the floor direction, as we felt it was more suited to the arcade experience we were driving toward, and it accommodated a wide range of players with a low barrier to entry for VR. When we started focus testing, people who felt they were not “gamer” inclined felt more grounded having the table in front of them to pivot their movements around in the virtual world. So that was positive reinforcement that we had made the right decision with the table approach.


Now that we had decided to present the battlefield on a table, we had to sort out where the players were going to be placed. Players were initially placed on the side of the table so they would have an equal distance between their forces and their enemy's forces. This allowed the player to easily lean or side-step left and right to stick their face right up to the units and defense bases. Seemed like a good idea, right? Well, not quite, once you add in how the player is going to interact with deploying their units on the battlefield. It turned out that how the player interacted with their units actually drove their positioning around the table, rather than player placement being decided first. I'll elaborate on why in the next couple of slides, as that brings us to the second main UX component we had to solve: how the player deployed and interacted with their units.


With the player on the side of the table, we had moved toward giving the player a console on which they press a card to deploy that particular unit. Once the card is pressed, the unit comes out from the player's base at the end of the table and moves to attack the enemy at the other end of the field, as you can see in the images here. While this positioning provided the best overall view of what was happening on the battlefield, it wasn't as immersive, fun, and engaging as we wanted it to be.


Once the card was pressed, the interaction ended there and created a wall between the player and the battlefield. In order to give players more interaction with the battlefield and their units, we re-evaluated how the units were represented. We explored moving away from the notion that the player had a “deck” of cards to using the actual 3D models of the units, representing them like toys.


While presenting the 3D units was visually more appealing, it didn't solve the interaction issue: the player still tapped the unit to deploy it from the base at the end of the table, the interaction ended there, and it started feeling a little like “whack-a-mole,” which isn't what we wanted. Our first step toward breaking down this friction between the player and the battlefield was using the motion controllers to target the units with a pointer laser so they could be dragged and dropped onto the field.

It was as intuitive as grabbing a laser pointer in real life and pointing it at a penny on the ground. It was extremely precise, and that's something VR excels at: hand-eye coordination, far more than locomotion in a virtual world. It's incredibly hard to target that accurately with a game controller or mouse and keyboard. This is especially true for wider audiences who are not typically gamer oriented. So the ease of using the motion controllers like laser pointers came naturally to everyone we had test at this stage. Keep in mind, at this point the player was still placed on the sides of the table. This was a step in the right direction on interactivity, but not yet to a point we were content with.
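The laser-pointer targeting boils down to intersecting the controller's pointing ray with the table plane. As a rough sketch of that math (in Python rather than the Unity setup the game actually used; the function name and the flat-table assumption are mine):

```python
def laser_hit_on_table(origin, direction, table_height):
    """Return the point where a pointer ray hits the horizontal
    plane y = table_height, or None if it never does.
    origin/direction are (x, y, z) tuples; illustrative sketch only."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    if abs(dy) < 1e-9:          # ray is parallel to the table top
        return None
    t = (table_height - oy) / dy
    if t < 0:                   # plane is behind the controller
        return None
    return (ox + t * dx, oy + t * dy, oz + t * dz)
```

A drag-and-drop then just re-runs this intersection every frame while the trigger is held, moving the grabbed unit to the hit point.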


We liked the response from testers regarding how natural the motion controllers felt and didn't want to lose that, but we also needed to further bridge the gap between the player and the units. Ultimately, this evolved into dropping the motion-controllers-as-laser-pointer approach and providing the player with a pair of hands. This allowed the player to pick up their units with their hands and physically (virtually) place them onto the battlefield, which improved player integration and eliminated the friction between the player and the battlefield. The player could pick up a unit like a toy, inspect it up close without needing to bend down, and place it where they wanted, giving them a larger sense of control over their gameplay decisions. However, with the player placed on the side of the table, the new mechanic felt a little awkward, since the bases were at the ends of the table and the player would be placing their units from the midfield line. In turn, we moved the players to the far ends of the table, which turned out quite nicely, as…


It led to a more dramatic experience with a heightened sense of “my units” versus “your units.” Inadvertently, this also meant players were in the debris path of units exploding on the battlefield, which, excitingly, lets the player hit debris out of the air. We were quite happy with how the player placement and unit interaction developed: not only was it fun for us to play, but the positive responses we received from focus testing on how intuitive it was to pick up a unit and place it on the field solidified it for us.


In addition to picking up their units and placing them on the table where they wanted, we added a targeting mechanic that allows the player to point the unit they've picked up at an enemy unit on the field and lock onto it. This added another strategic element to the game, furthered interactivity with the units, and reinforced the sense of playing with toys.
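One way to sketch that lock-on behavior is a cone test: the unit locks onto the enemy most closely aligned with the direction it is being pointed, within some angular threshold. This is an illustrative Python sketch, not our actual implementation; the cone width and names are hypothetical:

```python
import math

def pick_lock_target(unit_pos, aim_dir, enemy_positions, cone_cos=0.9):
    """Return the index of the enemy best aligned with aim_dir from
    unit_pos, or None if nothing falls inside the aiming cone.
    cone_cos is the cosine of the half-angle (0.9 ~= 26 degrees)."""
    ax, ay, az = aim_dir
    alen = math.sqrt(ax * ax + ay * ay + az * az)
    ax, ay, az = ax / alen, ay / alen, az / alen
    best, best_dot = None, cone_cos
    for i, (ex, ey, ez) in enumerate(enemy_positions):
        vx, vy, vz = ex - unit_pos[0], ey - unit_pos[1], ez - unit_pos[2]
        vlen = math.sqrt(vx * vx + vy * vy + vz * vz)
        if vlen == 0:
            continue
        dot = (ax * vx + ay * vy + az * vz) / vlen   # cosine of angle to enemy
        if dot > best_dot:
            best, best_dot = i, dot
    return best
```

Keeping the cone fairly wide makes the lock-on forgiving for players aiming with a hand-held toy rather than a precision cursor.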


The last key UX element we had to solve was how players interacted with and used the menus in the game. Often in movies and television you see futuristic menus with someone moving their hands all across a digitally projected screen. This motif fit quite well with our game's theme, but as cool as it was, we concluded it's not efficient for long-term use: we ran into the same issue as earlier with the arm deck holder idea for deploying units. With this type of menu setup, the player's arm would be strained after a short period from having to keep it raised. In order to keep the motif to a certain extent, we developed guidelines along the way to make the player more comfortable. There were two menu setups we had to address in our game.


The table was at the center of the game, both physically and figuratively, and because of this we wanted to use it to facilitate the UI menus, to reiterate that the entire game revolves around how the player interacts with this tabletop arcade. There were two predominantly different types of menus in the game:

Floating Menus: where the player would be funneled to either the Campaign or Arcade mode of the game, or the settings screen. Initially we had set up the buttons with an animation and visual change to indicate that the player had touched and pressed a button. However, in testing, there were too many times where a player didn't realize they were interacting with a button or had actually pressed it. So we added haptic feedback through the motion controllers' vibration. When the player hovers over a button, it scales slightly and gives a soft vibration to indicate they've come in contact with an object. When the button is selected, it scales, vibrates more strongly, and changes color. The vibration was such a clear signal that it made a drastic difference in players recognizing that they were performing an action.
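The hover/select feedback can be summarized as a tiny state lookup; the scale factors and vibration labels below are placeholder values for illustration, not the ones we shipped:

```python
def button_feedback(state):
    """Map a button's interaction state to its feedback cues.
    Scale/vibration/color values are hypothetical placeholders."""
    if state == "hover":
        # light touch: small scale-up plus a soft controller rumble
        return {"scale": 1.05, "vibration": "soft", "color": "default"}
    if state == "pressed":
        # committed press: bigger scale, strong rumble, color change
        return {"scale": 1.1, "vibration": "strong", "color": "highlight"}
    # idle: no feedback
    return {"scale": 1.0, "vibration": None, "color": "default"}
```

The point of layering three cues (scale, vibration, color) is redundancy: even if a player misses the visual change, the rumble still confirms the action.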


In the case of our Campaign menu, there were several buttons we had originally laid out in a more “moviesque” way, with the buttons in a semi-circle around the player. However, there were issues with this format: the player had to side-step and turn their head to reach items at either far end of the menu, and there was noticeable discomfort in having to raise your head to see the top items.


Not to mention, after a short while there was strain in the player's arm from having to raise it higher than their shoulder to reach the topmost buttons. We had encountered this situation before with the ideas we played with for deploying units on the field, so we were confident we could resolve it here as well. To do so we had to re-evaluate how the menu setup was managed, which I will get to shortly.


The second menu setup we had was Combo Menus (Floating plus a Unit or Battlefield element): where the player was presented information in a way that required both types of elements, including a list of items the player could view, so typically there was a need for a scroll view in these cases. Scroll views in the traditional sense of horizontal and vertical lists felt a little stiff and were not as visually appealing in the VR space, at least to us.


To help make them more visually interesting and feel a bit more natural in use, we placed the scrollviews on a curve. This not only made them more visually appealing in 3D space, but it also minimized the player's arm movement when in use, as they could pivot their arm's rotation instead of needing to fully move their arm up or across their body.
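Placing the scrollview items on a curve is just distributing them along an arc centered on the player. A minimal Python sketch of that layout, assuming a flat horizontal arc with evenly spaced items (the parameter names are mine):

```python
import math

def arc_layout(n, radius, arc_degrees, height):
    """Place n items evenly along a horizontal arc of arc_degrees,
    centered in front of the player (angle 0 is straight ahead).
    Returns (x, y, z) positions; illustrative sketch only."""
    step = arc_degrees / (n - 1) if n > 1 else 0.0
    start = -arc_degrees / 2.0 if n > 1 else 0.0
    positions = []
    for i in range(n):
        a = math.radians(start + i * step)
        # sin/cos keep every item at the same distance from the player,
        # so scrolling becomes a rotation of the wrist, not a reach
        positions.append((radius * math.sin(a), height, radius * math.cos(a)))
    return positions
```

Because every item sits at the same radius, the player's hand sweeps the list with a pivot of the forearm rather than a full reach across the body.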


Going back to what I mentioned about re-evaluating how the menu setup was handled: in order to make the menus more accessible and comfortable, we developed a couple of guidelines. No menu button should require the player to reach higher than shoulder height. Chest height ended up being the highest we went, as it was a bit more comfortable than shoulder height. The strain of reaching higher than your shoulder is similar to sitting in a life drawing class and using an easel: if it's been a while, you have to keep resting your arm because your shoulder isn't used to your arm being up that high. There's a substantial amount of fatigue initially, so a person has to build up a tolerance over time. Additionally, all menu items needed to be visible in the player's immediate view without them needing to turn their head. As you can see in the examples here, button height was adjusted and menu items were moved into a much more focused view compared to what we had earlier.


Scrollviews and heavily used buttons (such as the “Battle” and “Back” buttons) were kept at waist/hip height. This was found to be the most comfortable position for extended periods, which was helpful in areas such as the Store, Arcade, and My Squad Manager. Scrollviews were arched for aesthetics and ease of motion.
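Those two height guidelines (nothing above chest height, heavy-use items near the hip) could be expressed as a simple placement check. The heights and tolerance below are hypothetical values in meters, just to make the rule concrete:

```python
def button_height_ok(button_y, hip_y, chest_y, heavy_use=False):
    """Check a button's vertical placement against our comfort guidelines.
    Heavily used buttons should sit near hip height; everything else
    must stay at or below chest height. Tolerance is a placeholder."""
    if heavy_use:
        # e.g. "Battle"/"Back": keep within ~15 cm of the hip line
        return abs(button_y - hip_y) <= 0.15
    # everything else: never force a reach above the chest
    return button_y <= chest_y
```

A check like this can run over a whole menu layout during iteration, flagging any element that would force an uncomfortable reach.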


Player hands should never go through the table or table objects. This was due in part to the design of the table (its height, where you stand, and the comfort of your reach zone), but also because it upholds the suspension of disbelief and prevents the awkward “hand through table” problem that looks so bad. Players' motions are set up so that they 'grab' and 'pull away from' the table to pick up units, as opposed to 'pushing downward' on a typical button.


One of the play tests we conducted was over at the Foundry, so I'm going to let some clips from that play test run in the background here while I talk about focus testing. There was a lot of great feedback from the focus tests, so I'll go over the items that surfaced the most.

  • Everyone grasped the idea of picking up a unit and placing it on the battlefield, along with the aiming component, immediately from the start of the game, which was great.
  • There was a tendency for people to target towers much more often than units. On the other hand, a large number of people tried to pick up or move units that were already on the field after they were deployed.
  • It was also noted that people predominantly played with their right hand, presumably because it was their dominant hand; when we asked why they used the hand they did, they often hadn't even noticed they had used one more than the other.
  • One of the more peculiar things we noticed in play tests was that people had a tendency to fiddle with the Motion Controllers like they were drumsticks, much in the same way you might tap on your desk with a pencil.
  • Overall, people seemed to have the most fun and engagement when the game was in their face during the heat of a battle.

One particular recurring item that came out of focus testing: when players first put on the headset, they need to reset the camera by pressing the “Start” button on the Motion Controller. Now, this is standard, but the interesting thing to watch was how many people would look down at their hand expecting to see it as normal, not realizing that with the headset on they actually couldn't see their hand.

To help players find the button on the Motion Controller, we gave the player a prompt with an image of the Motion Controller, the “Start” button highlighted, and directions on what to do. This is actually the longest line of text in the game, which we're perfectly fine with.


Overall, the UI content was kept as minimal as possible. If there are too many items to pay attention to, it can detract from the main action the player should be focusing on. This resulted in some statistics, progress meters, and text being removed because they weren't essential and didn't really contribute to player engagement. People don't read every line of text in games; they often gloss over it, and too much text can be overwhelming. This was something we noticed players doing in the onboarding stages of our mobile games. Less really is more, and playing with toys is more fun than reading...especially in VR. So don't be afraid to remove an element, as it might shed light on how to solve other issues.

While there is always room for tweaking and improving, we were quite happy with the end result for the UX and UI in the game. VR is growing and more people are getting exposed to it. An overwhelming number of people in our external play tests had never used VR before, and because of this, it's so important to remember your audience. Simplicity is key, especially in VR, as it's very easy to get carried away with all the possibilities. I hope that by sharing our experiences on League of War: VR Arena, I've given you ideas for how to approach developing a VR game using the core user experience of a mobile brand.

League of War: VR Arena

For LOW: VR Arena, I was UI/UX Lead for the project. My responsibilities included:

  • Developing and testing the UX flow.
  • Creating prototypes and wireframes of UI/UX elements to test within the VR space.
  • Working with the Design team to understand the design goals and needs of the project to better the UX flow.
  • Setting up all UI content in Unity and then passing it off to engineering.
  • Monitoring focus tests and applying the results to improve the UX flow.
  • Managing other artists on the team and delegating tasks to meet project deadlines.
  • Working with the Publisher to submit all marketing and asset needs for the Sony store, Paris Games Week, PAX West, PSX, and retail kiosks.