Wednesday, 3 December 2014

Game engines blog 5

This past week was very busy, with many deadlines quickly approaching. I spent most of the week working on my water effect for homework, so this blog post will be about that. I'll start by showing off the final result:



I'm really happy with the way it turned out. Maybe next semester I'll implement it in our game, but I would have to change the style a bit; this water looks too realistic for our game's art style.

The entire process requires 3 rendering passes.

The first pass renders the reflection texture. To do that, a second camera is added. The reflection camera is placed at the regular camera's position with its Y coordinate scaled by -1, and the point the camera is looking at is mirrored the same way. All objects below the water are clipped from the scene, and the scene from that camera's perspective is rendered into a framebuffer object.
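In plain C++ terms (this isn't the actual tloc code, just a sketch using glm to show the idea), mirroring the camera looks something like this:

    // Sketch: mirror the main camera across the water plane.
    // Uses glm; names are illustrative only.
    #include <glm/glm.hpp>

    struct CameraParams
    {
        glm::vec3 position;
        glm::vec3 lookAt;
    };

    CameraParams MakeReflectionCamera(const CameraParams& mainCam, float waterHeight)
    {
        CameraParams reflCam = mainCam;

        // Mirror the position and the look-at point across the water plane
        // by negating their height relative to the water surface.
        reflCam.position.y = 2.0f * waterHeight - mainCam.position.y;
        reflCam.lookAt.y   = 2.0f * waterHeight - mainCam.lookAt.y;

        return reflCam;
    }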

The second pass renders the refraction texture. It is similar to the reflection pass, except it uses the main camera's perspective and all objects above the water are clipped from the scene. The result is stored in another framebuffer object.
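Our engine handles the render-to-texture setup, but in raw OpenGL terms a pass like this looks roughly like the sketch below (function and uniform names are made up):

    // Sketch: one render-to-texture pass with a clip plane, in raw OpenGL.
    // fbo, sceneShader and DrawScene() are placeholders for whatever the engine provides.
    void RenderRefractionPass(GLuint fbo, GLuint sceneShader, float waterHeight)
    {
        glBindFramebuffer(GL_FRAMEBUFFER, fbo);
        glViewport(0, 0, 512, 512);                       // texture size is just an example
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

        glUseProgram(sceneShader);

        // Clip everything above the water (keep y <= waterHeight).
        // The vertex shader has to write gl_ClipDistance[0] for this to take effect.
        glEnable(GL_CLIP_DISTANCE0);
        glUniform4f(glGetUniformLocation(sceneShader, "u_clipPlane"),
                    0.0f, -1.0f, 0.0f, waterHeight);

        DrawScene();                                      // draw everything; the clip removes what's above the water

        glDisable(GL_CLIP_DISTANCE0);
        glBindFramebuffer(GL_FRAMEBUFFER, 0);             // back to the default framebuffer
    }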

The final pass is when the water is actually rendered. The reflection and refraction textures are passed into the water fragment shader, along with a bunch of other uniforms, including two more textures (a normal map and a DuDv map) and the camera direction. The vertex shader doesn't do much besides preparing variables for the fragment shader and displacing vertices based on the colours sampled from the normal map. The fragment shader is where everything happens. The reflection and refraction maps are sampled for their colour, and the UVs are distorted based on a constant distortion value set in the shader. The camera direction is used in the Fresnel reflection, so that when you are looking straight down at the water it looks more transparent than when you are looking at it from a grazing angle.
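Stripped down, the core of the fragment shader is something like this (a simplified GLSL sketch stored as a C++ string; the uniform names are illustrative and the real shader does a bit more):

    // Simplified sketch of the water fragment shader, stored as a C++ string.
    // Uniform and varying names are illustrative, not the actual homework code.
    const char* waterFragSrc = R"(
    #version 330 core

    in vec2 v_uv;
    in vec3 v_toCamera;         // surface-to-camera vector from the vertex shader

    uniform sampler2D u_reflection;
    uniform sampler2D u_refraction;
    uniform sampler2D u_dudv;
    uniform float     u_time;

    const float distortionStrength = 0.02; // constant distortion value

    out vec4 fragColour;

    void main()
    {
        // Scroll the DuDv map over time and use it to offset the UVs.
        vec2 distortion = texture(u_dudv, v_uv + vec2(u_time * 0.03, 0.0)).rg;
        distortion = (distortion * 2.0 - 1.0) * distortionStrength;

        vec4 reflColour = texture(u_reflection, v_uv + distortion);
        vec4 refrColour = texture(u_refraction, v_uv + distortion);

        // Fresnel: looking straight down gives more refraction (transparency),
        // a grazing angle gives more reflection.
        float fresnel = clamp(dot(normalize(v_toCamera), vec3(0.0, 1.0, 0.0)), 0.0, 1.0);

        fragColour = mix(reflColour, refrColour, fresnel);
    }
    )";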

A few mistakes that I made while trying to complete this question were:

The reflection camera position. At first I was just turning the camera upside down instead of mirroring its position by -1 in Y. With the camera only flipped, the objects ended up on the wrong side. I also forgot to mirror the look-at position, so reflections didn't show up where they should have.

Rendering order. Sometimes when I ran the program, the objects in the reflection would end up behind the skybox or disappear when the skybox passed over them. I fixed this by making sure the objects were the last things rendered during the render-to-texture passes.

Overall I am quite happy with the water effect I was able to create. Hopefully I can implement it into our game sometime next semester.

Friday, 28 November 2014

Game Engines Blog 4

In the past couple of weeks we managed to implement a lot of new things into our prototype. The main things we added were power-ups, AI, and the Bullet Physics engine.

The Bullet physics engine was fairly simple to implement and use. One thing we need to change, though, is the way the environment rigid bodies are generated. When I first implemented the engine, I just wanted to get it working, and since our first level is simple enough, I created boxes manually and placed them around the map for collision. In the future we'll probably try to generate the rigid bodies from meshes instead of by hand. Below is some of our Bullet code.
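To give a rough idea of what that setup involves, here's a minimal sketch of creating one static box collider with Bullet (the sizes and positions are placeholders, not our actual level values; cleanup is omitted):

    // Sketch: one static box collider added to a Bullet world.
    // Half-extents and position are placeholder values.
    #include <btBulletDynamicsCommon.h>

    void AddStaticBox(btDiscreteDynamicsWorld* world)
    {
        btCollisionShape* boxShape = new btBoxShape(btVector3(5.0f, 1.0f, 5.0f));

        btTransform transform;
        transform.setIdentity();
        transform.setOrigin(btVector3(0.0f, -1.0f, 0.0f));

        // Mass of 0 makes the body static (the environment doesn't move).
        btScalar mass = 0.0f;
        btVector3 inertia(0.0f, 0.0f, 0.0f);

        btDefaultMotionState* motionState = new btDefaultMotionState(transform);
        btRigidBody::btRigidBodyConstructionInfo info(mass, motionState, boxShape, inertia);

        world->addRigidBody(new btRigidBody(info));
    }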



Our game, Kinetic Surge, is meant to be a multiplayer game, but we implemented a simple AI for demonstration purposes. The AI just moves in random directions, but that's all we really need to demonstrate the main mechanics of our game.

Lastly, we added a power-up system to our game. Currently we only have one type of power-up, which gives the player a 1.2x speed boost for 5 seconds. We have two locations on the map where power-ups can spawn, and they spawn every 30 seconds. In the future we plan on adding more power-ups, for example invisibility. Below is a screenshot of a power-up in our prototype.
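The timing side of it is simple; it's roughly something like this (class and member names are made up for illustration, not our actual code):

    // Sketch of the power-up timing logic; names are made up.
    struct SpeedBoost
    {
        float multiplier = 1.2f;   // 1.2x speed boost
        float duration   = 5.0f;   // lasts 5 seconds
    };

    struct PowerUpSpawner
    {
        float respawnTime = 30.0f; // a power-up spawns every 30 seconds
        float timer       = 0.0f;
        bool  active      = false;

        void Update(float dt)
        {
            timer += dt;
            if (!active && timer >= respawnTime)
            {
                timer  = 0.0f;
                active = true;     // spawn a power-up at this spawner's location
            }
        }
    };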



I'm currently working on my water shaders for the homework questions. I'll probably post more about them in next week's blog post.

Friday, 7 November 2014

Game Engines Blog 3

For the past couple of weeks we have been learning about several things related to graphics. Some of the topics I already knew about, such as deferred rendering and motion blur, but I did learn about some new things. One of the new things I learned about was stencil buffers. Simply put, the stencil buffer is used to limit the area of the screen that gets rendered to. It can be useful in many situations, such as improving shadows and reflections. I plan on looking into stencil buffers further, but probably not during this semester.
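The basic usage in OpenGL is pretty small. For example, to only render where some mask shape was drawn (an illustrative sketch; DrawMaskShape and DrawScene are placeholders):

    // Sketch: limit rendering to the area covered by a mask shape using the stencil buffer.
    void RenderMaskedScene()
    {
        glEnable(GL_STENCIL_TEST);

        // Pass 1: draw the mask shape, writing 1 into the stencil buffer (no colour output).
        glStencilFunc(GL_ALWAYS, 1, 0xFF);
        glStencilOp(GL_KEEP, GL_KEEP, GL_REPLACE);
        glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
        DrawMaskShape();

        // Pass 2: draw the scene, but only where the stencil value is 1.
        glStencilFunc(GL_EQUAL, 1, 0xFF);
        glStencilOp(GL_KEEP, GL_KEEP, GL_KEEP);
        glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
        DrawScene();

        glDisable(GL_STENCIL_TEST);
    }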

I upgraded my water shaders this past week. I added vertex displacement based on samples from a height map.
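The displacement itself is only a few lines in the vertex shader; roughly like this simplified GLSL sketch (stored as a C++ string, with made-up names):

    // Simplified sketch of the height-map displacement in the water vertex shader.
    const char* waterVertSrc = R"(
    #version 330 core

    layout(location = 0) in vec3 a_position;
    layout(location = 1) in vec2 a_uv;

    uniform sampler2D u_heightMap;
    uniform mat4      u_mvp;
    uniform float     u_displaceAmount;

    out vec2 v_uv;

    void main()
    {
        // Sample the height map and push the vertex up along Y.
        float height = textureLod(u_heightMap, a_uv, 0.0).r;
        vec3 displaced = a_position + vec3(0.0, height * u_displaceAmount, 0.0);

        v_uv = a_uv;
        gl_Position = u_mvp * vec4(displaced, 1.0);
    }
    )";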

Now all that's missing is some cool lighting and some reflections. I plan on working on that next week to get the homework out of the way. I started looking into Fresnel reflections and I think they shouldn't be too difficult.

In terms of GDW, development has slowed down due to the approaching deadline of the homework questions, but after the MIGS trip next week, game development should continue at 100%. I'm currently getting first-person camera controls working, and after that I'll start on some fluid character movement.

Friday, 17 October 2014

Game Engines Blog 2

Most of the lectures over the past couple of weeks have been reviewing topics we've already learned, to get them fresh in our minds. We went over things like vectors, matrices, and the math between them. We also went over the different spaces, such as screen space, tangent space, etc. I'm glad we went over these topics because it's been a while since I've reviewed them myself.


A couple weeks ago I began working on rendering water for a homework question. I finally got it working after several problems with updating uniform values in tloc; the problem was that I wasn't using shader operators in my code. I plan on continuing to work on the water questions and hopefully implement some really cool-looking water in our game. Currently my water is basically just a scrolling texture on a quad. Below is a screenshot of my simple water so far.
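In shader terms, the whole effect so far is just offsetting the UVs over time; stripped down it's basically this (a GLSL sketch as a C++ string, not the actual tloc shader operator code):

    // The "scrolling water" effect reduced to its GLSL core (illustrative only).
    const char* scrollFragSrc = R"(
    #version 330 core
    in vec2 v_uv;
    uniform sampler2D u_waterTexture;
    uniform float u_time;
    out vec4 fragColour;

    void main()
    {
        // Offset the UVs a little every frame so the texture appears to drift.
        fragColour = texture(u_waterTexture, v_uv + vec2(u_time * 0.05, 0.0));
    }
    )";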





In yesterday's tutorial we learned how to expand on and create our own component systems. In the tutorial we made a simple health system. Right after the tutorial I started thinking about how what we did in class could translate to our game. I think a simple system, just like the health system we made, could be used for the stamina mechanic in our game.
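A stamina version could look roughly like this sketch (plain C++ with made-up names, not tloc's actual component API):

    // Sketch: a stamina component and a system that updates it, in the spirit of
    // the health system from the tutorial.
    #include <algorithm>
    #include <vector>

    struct StaminaComponent
    {
        float current   = 100.0f;
        float maximum   = 100.0f;
        float regenRate = 10.0f;   // stamina regained per second
        bool  exhausted = false;
    };

    struct StaminaSystem
    {
        std::vector<StaminaComponent*> components;

        void Update(float dt)
        {
            for (StaminaComponent* s : components)
            {
                s->current   = std::min(s->current + s->regenRate * dt, s->maximum);
                s->exhausted = (s->current <= 0.0f);
            }
        }
    };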

Friday, 26 September 2014

Game Engines Blog 1

One of the most important topics we learned during these first three weeks was entity component systems. We were shown two images to help understand entity component systems a little better. Entities can be seen as a key with the teeth of the key being its different components. Systems can be seen as locks which require an entity (key) to begin working.
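The analogy maps pretty directly to code: the entity's components are the teeth of the key, and a system only "turns" for entities that have every component it requires. A toy bitmask version (purely illustrative, not a real engine's ECS):

    // Toy illustration of the key/lock analogy with bitmasks.
    #include <cstdint>

    enum ComponentBits : std::uint32_t
    {
        TransformBit = 1 << 0,
        MeshBit      = 1 << 1,
        HealthBit    = 1 << 2,
    };

    struct Entity
    {
        std::uint32_t componentMask = 0;   // the "teeth" of the key
    };

    // A system is the lock: it only accepts entities whose key has the right teeth.
    bool SystemAcceptsEntity(std::uint32_t requiredMask, const Entity& e)
    {
        return (e.componentMask & requiredMask) == requiredMask;
    }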







We also looked at scene graphs, which are used for node parenting. Scene graphs can be extremely useful for understanding how objects in games should interact with each other. For example, when a character uses a bow and arrow, the arrow would go from being parented to the character, to the bow, and then to the environment.
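A bare-bones node for this might look like the sketch below (illustrative names): each node keeps a local transform, its world transform is the parent's world transform times its own, and reparenting the arrow is just changing its parent pointer.

    // Bare-bones scene graph node: world transform = parent's world * local.
    #include <glm/glm.hpp>
    #include <vector>

    struct SceneNode
    {
        glm::mat4  localTransform = glm::mat4(1.0f);
        SceneNode* parent         = nullptr;
        std::vector<SceneNode*> children;

        glm::mat4 WorldTransform() const
        {
            return parent ? parent->WorldTransform() * localTransform
                          : localTransform;
        }

        void SetParent(SceneNode* newParent)
        {
            // (Removing this node from the old parent's child list is omitted for brevity.)
            parent = newParent;
            if (newParent) { newParent->children.push_back(this); }
        }
    };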


I started playing around with TortoiseHg. Last year we actually made a repository for our game, but we never ended up using it. After making a test repository and doing very simple commits, pushes, pulls, and updates, I now understand how useful having a repository is. I actually wish we had used a repository for our code last year, because I feel like we wasted way too much time trying to resolve issues that would have been way easier to fix with simple version control. I'm actually kind of looking forward to keeping a repository of our game. It will be nice for keeping our project clean and neat, and for not having several folders of different versions of the game on our desktops.


Friday, 11 April 2014

Week 12 - game con

There was no lecture on Friday due to Level Up. Our group wasn't chosen to showcase our game; all of the games this year were really impressive, and I hope that UOIT represented well at Level Up. Gamecon was on Monday and it was really tiring but fun as well. I enjoyed looking at and playing all of the other games developed by UOIT game dev students.
As expected, we spent the night before Gamecon touching up our game. We added some nice visual effects, including bloom and better lighting. Here are a couple of screenshots for comparison.

old

new

Unfortunately, implementing the better graphics introduced other bugs to deal with. Our unit selection no longer worked correctly, and we had to implement a difficult-to-use temporary fix. This selection issue should be fixed before the final submission.


Saturday, 22 March 2014

Week 11 - Motion Blur

On Monday a lot of the studios got to show off their games to the rest of the class. Everybody's games are starting to come together and look really great, a huge step up from first semester.


Above is a screenshot of the new lighting system in our game. By next week I'll have screenshots of our entire first level instead of just one model.

Motion Blur:

Motion blur is an effect that is caused when the image being recorded or rendered changes during the recording of a single frame. This can be caused either by rapid movement or long exposure. 

There are many ways of implementing motion blur. There is an easy way that involves the use of the accumulation buffer; the downside to this method is that it requires rendering each frame multiple times, so if you want a lot of blur or have a lot of geometry, it can cause slowdowns. The basic steps are below, with a rough sketch after the list.

- draw the frame
- load the accumulation buffer with a portion of the current frame
- loop and draw the last n frames and accumulate.
- display the final scene
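In legacy OpenGL those steps look roughly like this (the accumulation buffer API is old and deprecated, so this is just to show the idea; DrawScene is a placeholder for rendering the geometry at a given sub-frame time):

    // Sketch: accumulation-buffer motion blur in legacy OpenGL.
    void RenderWithAccumulationBlur()
    {
        const int   numFrames = 8;              // how many frames to blend together
        const float weight    = 1.0f / numFrames;

        // Draw the current frame and load a portion of it into the accumulation buffer.
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
        DrawScene(0);
        glAccum(GL_LOAD, weight);

        // Loop over the previous n-1 frames and accumulate them.
        for (int i = 1; i < numFrames; ++i)
        {
            glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
            DrawScene(i);
            glAccum(GL_ACCUM, weight);
        }

        // Copy the accumulated (blurred) result back to the visible framebuffer.
        glAccum(GL_RETURN, 1.0f);
    }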

The more modern and effective way of doing motion blur involves the use of motion vectors. To do motion blur with motion vectors, you calculate each pixel's screen-space velocity, and then that velocity is used to do the blur. The calculation is done in the fragment shader on a per-pixel basis.
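In GLSL, the per-pixel blur step looks something like this simplified sketch (stored as a C++ string; computing the velocity texture from the current and previous frames' transforms is assumed to happen elsewhere, and the names are illustrative):

    // Simplified sketch of motion blur using a per-pixel velocity texture.
    const char* motionBlurFragSrc = R"(
    #version 330 core

    in vec2 v_uv;

    uniform sampler2D u_sceneColour;   // the rendered frame
    uniform sampler2D u_velocity;      // per-pixel screen-space velocity
    uniform int       u_numSamples;    // e.g. 8

    out vec4 fragColour;

    void main()
    {
        vec2 velocity = texture(u_velocity, v_uv).rg;

        // Step along the velocity vector and average the samples.
        vec4 colour = vec4(0.0);
        for (int i = 0; i < u_numSamples; ++i)
        {
            vec2 offset = velocity * (float(i) / float(u_numSamples) - 0.5);
            colour += texture(u_sceneColour, v_uv + offset);
        }

        fragColour = colour / float(u_numSamples);
    }
    )";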