Screenshot of BayWa arcade game microsite on a MacBook Pro and an iPhone 7

Web Arcade Game with WebSockets

Cloud Under provided full-stack web development for a real-time browser arcade game with a competition and daily prizes to be won.

Once the game is started, the player navigates the snowplough through traffic using the arrow keys on a desktop computer or by tapping the screen on mobile devices. The game starts off slowly, and the speed increases slightly every 20 seconds (until a maximum speed is reached) to make the game harder. Players must collect gift parcels to gain points. When the snowplough collides with a car, the game is over; players can then register their score in the daily high scores table to win real prizes and try as often as they wish to improve their score. The high scores table is reset daily at midnight. The game was live for 9 days in the build-up to Christmas 2016.
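
As a rough sketch of that speed schedule (the 20-second interval comes from the description above, but the base speed, increment and cap here are made-up values, not our actual tuning):

```typescript
// Illustrative speed schedule; only the interval matches the real game.
const SPEEDUP_INTERVAL_MS = 20_000; // speed bump every 20 seconds
const BASE_SPEED = 1.0;             // hypothetical starting speed
const SPEED_INCREMENT = 0.15;       // hypothetical increment per step
const MAX_SPEED = 2.5;              // hypothetical cap

/** Current driving speed for a given elapsed game time. */
function speedAt(elapsedMs: number): number {
  const steps = Math.floor(elapsedMs / SPEEDUP_INTERVAL_MS);
  return Math.min(BASE_SPEED + steps * SPEED_INCREMENT, MAX_SPEED);
}
```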

Check out the demo video:

Screen recording of the website and game on a desktop browser

Front-end development

As this was a very short-term (6 days) campaign promoted via Facebook, search engine optimisation was not important, so we focussed on a fast and responsive user interface in which the DOM was rendered entirely on the client using React.

The game itself, i.e. the street, cars, gift parcels and (not to forget) the snow particle emitter in front of the snowplough, was rendered with the 2D WebGL rendering framework PixiJS, which performed great on desktop computers as well as in mobile browsers.
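
We can't show the original rendering code, but a minimal PixiJS setup along these lines (assuming a PixiJS v7-style API; texture paths and numbers are placeholders, not the original assets) illustrates the idea:

```typescript
import * as PIXI from "pixi.js";

// Minimal sketch of a PixiJS scene; assets and sizes are hypothetical.
const app = new PIXI.Application({ width: 480, height: 800 });
document.body.appendChild(app.view as unknown as HTMLCanvasElement);

const street = PIXI.Sprite.from("street.png");
const plough = PIXI.Sprite.from("snowplough.png");
app.stage.addChild(street, plough);

let speed = 2; // arbitrary scroll speed in pixels per frame

// Scroll the street each frame to simulate driving.
app.ticker.add((delta) => {
  street.y = (street.y + speed * delta) % app.screen.height;
});
```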

Our client presented us with their basic idea of the game and provided all content and design, but it was up to us to figure out the detailed gameplay rules and algorithms. Because we wanted the gameplay to be as fair as possible, the visible length of street, including gifts and obstacles, had to be exactly the same on all devices, from a long and slim mobile phone in portrait mode to a widescreen desktop computer. It was quite tricky to find a ratio that gave a good length of street while letting our client's beautifully designed winter scene fit all possible screen formats, but in the end it worked quite well.
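
One way to keep the visible street length identical everywhere is to fix the playfield in abstract world units and derive the scale from the viewport height, letting the decorative winter scene fill whatever width remains. A sketch, with illustrative numbers only:

```typescript
// Fixed playfield in world units so every player sees the same
// length of street (values are illustrative, not the originals).
const WORLD_WIDTH = 300;   // width of the street in world units
const WORLD_HEIGHT = 800;  // visible street length in world units

function fitPlayfield(viewportW: number, viewportH: number) {
  // Scale so the full street length always fits the screen height.
  const scale = viewportH / WORLD_HEIGHT;
  // Centre the street horizontally; the winter scene fills the rest.
  const offsetX = (viewportW - WORLD_WIDTH * scale) / 2;
  return { scale, offsetX };
}
```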

Real-time game with WebSockets

Another aspect we tried to address as well as possible in terms of fairness is the fact that, in theory, pretty much every client-based game can be cheated. We won’t go into too much detail here, but the basic problem is this: when a game is played and computed entirely on the client, the server, which collects the high scores, has no option but to trust the client when it submits the player’s score. For greedy players with the right skills, it is relatively easy to figure out how to submit bogus scores to the server and push themselves to the top of the leaderboard.

Obfuscation is one way to make a cheater’s life harder, but because JavaScript is only minified rather than compiled into machine or byte code, its protective value for browser games is pretty much non-existent (a compiled mobile app, for example, at least takes a bit more effort to cheat).

To make cheating in our little game as hard as possible, we decided to let the (trusted) server rule the game. This means our server web application would run a game for each player live in real time, communicating with the client over a two-way WebSocket connection.

The client would only receive instructions from the server to render the scene onto the screen, and would transmit the player’s intention to change lanes back to the server, i.e. whether the user pressed an arrow key or tapped the mobile screen.

The server would decide when and where obstacles and gifts are dropped into the scene, decide whether a lane change is possible at any given time, calculate collisions (with cars or gifts), announce speed increases and, of course, determine when the game is over and the final score.
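
Conceptually, the message vocabulary looked something like the following; the names and fields here are an illustrative reconstruction, and the real protocol was binary (more on that below):

```typescript
// Conceptual message shapes; the real wire format was binary and
// the names and fields here are illustrative reconstructions.
type ClientMessage =
  | { type: "changeLane"; direction: "left" | "right" };

type ServerMessage =
  | { type: "spawn"; kind: "car" | "gift"; lane: number; y: number }
  | { type: "laneChange"; permitted: boolean; lane: number }
  | { type: "speed"; level: number }
  | { type: "score"; points: number }
  | { type: "gameOver"; finalScore: number };
```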

This way the server has authority over everything that happens in the game and how the score is calculated. The only way a user could possibly cheat is by developing their own player bot that understands our WebSocket language and simulates lane changes in an optimised way. Although not impossible, it would require considerable effort within a very short space of time.

At first this strategy may seem pretty straightforward, but it was quite a challenge, albeit a really fun one. A limiting factor is bandwidth and network lag. We cannot render the graphics on the server, so this is still the job of the client (JavaScript) application. We kept WebSocket communication to an absolute minimum and even rolled our own binary protocol. Both the server and the client application must share a lot of constants, such as the dimensions of each car, the dimensions and positions of collision rectangles (which are not the same as the images themselves), the driving speed at any given stage of the game, the speed of the snowplough’s lane changes at any given driving speed and so on.
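
As an illustration of how compact such a binary protocol can be, a lane-change intent fits into two bytes. The opcode value and layout below are invented for illustration, not our actual wire format:

```typescript
// Hypothetical compact encoding of a lane-change intent (2 bytes):
// byte 0 = opcode, byte 1 = direction. The real layout differed.
const OP_CHANGE_LANE = 0x01;

function encodeChangeLane(direction: "left" | "right"): ArrayBuffer {
  const buf = new ArrayBuffer(2);
  const view = new DataView(buf);
  view.setUint8(0, OP_CHANGE_LANE);
  view.setUint8(1, direction === "left" ? 0 : 1);
  return buf;
}

// On a WebSocket configured with binaryType = "arraybuffer":
// socket.send(encodeChangeLane("left"));
```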

Play Framework server web app

Ultimately we needed two separate implementations of the game for client and server. For the server we decided to use the Play Framework for Scala. The server web application is compiled and runs on the Java Virtual Machine. We already knew that Play Framework web applications perform great, but this one exceeded our expectations once again. We had prepared for auto-scaling of medium-sized EC2 instances on AWS, but a single, relatively small server could handle it all, hardly even scratching the 1 percent CPU utilisation mark. We cannot go into detail about user numbers, but to give you a rough idea, a six-digit number of games was played around the clock in less than 6 days, with peak times at noon and in the evenings.

Server monitoring graph showing CPU utilisation averaging 0.6%

The Play application handled all WebSocket and REST API connections, the real-time games with all their live computations, the leaderboards (read and write) and registration for the competition. We used DynamoDB as the database back-end. The client application was served statically via S3 and CloudFront. An Application Load Balancer was used for HTTPS termination, but as mentioned before, it turned out there was never a reason to balance any load between multiple servers. Still, it’s better to be prepared than to risk downtime.

Responsiveness

Responsiveness is vital for fast arcade games, but it’s not easy to achieve, as the following example demonstrates.

Two rules of the game were: a player cannot change into a lane while a car is passing by, and the snowplough has to complete a lane change before it can change direction again. Now, if the player wants to change lanes and presses the arrow key, this intention is submitted to the server, which decides whether the lane change is permitted according to the rules and the current state of the game, and responds with a message.

All this can take a noticeable number of milliseconds and, worst of all, the lag is not predictable. For users on a fast broadband connection the delay might be minimal, but on a 3G connection it can have a bigger impact.

To compensate for this, the client starts the animation of the snowplough changing lanes slowly but immediately. If the server eventually gives the green light, the animation continues at normal speed as expected; but if the server denies the change, the snowplough moves back into its original lane, so it looks as if the snowplough only made a little swerve without a full lane change.

The client won’t even bother to ask the server if it detects right from the start (when the user presses the arrow key) that a lane change is impossible, because it already knows that there’s an obstacle.
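
Put together, the client-side logic might be sketched like this (the function and message names are illustrative, not our actual code):

```typescript
// Hypothetical helpers, declared only to keep the sketch self-contained:
declare function laneBlockedLocally(dir: "left" | "right"): boolean;
declare function startLaneAnimation(dir: "left" | "right", slow: boolean): void;
declare function continueLaneAnimation(): void;
declare function revertLaneAnimation(): void;
declare function encodeChangeLane(dir: "left" | "right"): ArrayBuffer;
declare const socket: WebSocket;

let pendingChange: "left" | "right" | null = null;

function requestLaneChange(direction: "left" | "right") {
  // Skip the round trip if the client already knows the lane is blocked.
  if (laneBlockedLocally(direction)) return;
  pendingChange = direction;
  startLaneAnimation(direction, /* slow= */ true); // tentative start
  socket.send(encodeChangeLane(direction));
}

function onLaneChangeResponse(permitted: boolean) {
  if (pendingChange === null) return;
  if (permitted) {
    continueLaneAnimation(); // speed up to the normal animation
  } else {
    revertLaneAnimation();   // sway back into the original lane
  }
  pendingChange = null;
}
```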

Conclusion

The project was a big hit with the client and with the users, who engaged with the German brand for (probably) thousands of hours. It was another example of how full-stack web development can achieve great all-round results, covering all technical aspects of a web project.

Cloud Under provided consultation with regard to gameplay and graphics and engineered all of the back-end, the front-end and the entire AWS hosting stack in less than 4 weeks before launch.

The key technologies used in this project included PixiJS, React, Play Framework, Scala, WebSocket, Webpack, S3, CloudFront CDN, Elastic Beanstalk, EC2, DynamoDB, the Application Load Balancer and, of course, good old HTML5 and JavaScript.