Game Project 6 - Spite Bloodloss


Download Link

Specifications

  • Genre: 3D Action RPG
  • Duration: 14 weeks part-time, 4 hours/day
  • Engine: IronWrought Engine
  • Team: SoftBlob, 13 people

My Contributions


Ability Architecture

For this project I designed the architecture surrounding abilities in the game: player abilities, but also enemy attacks and boss abilities. Every ability was itself a game object, connected to the player, enemy, or boss game object via a separate AbilityComponent. Each ability had its own VFX, collider, and so-called behavior, defining its movement pattern where relevant. Each ability also had its own instance pool of instantiable objects, which mattered most for abilities that could have multiple instances active at any given time. Abilities were activated and updated by the AbilityComponent, which kept track of its “parent” game object to make behaviors like a boomerang movement pattern possible.
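A minimal sketch of this relationship, with all names and signatures invented for illustration (the real IronWrought code is not shown here): each ability owns a pool of instances, and an AbilityComponent on the owning game object activates and updates them while keeping a pointer back to its parent.

```cpp
#include <cassert>
#include <memory>
#include <string>
#include <vector>

// Hypothetical game object; stands in for the player, an enemy, or a boss.
struct GameObject { std::string name; };

// One pooled instance of an ability (projectile, slash, etc.).
struct AbilityInstance {
    bool active = false;
    float lifetime = 0.0f;
};

class Ability {
public:
    explicit Ability(size_t poolSize) : pool(poolSize) {}

    // Grab a free instance from the pool, if any.
    AbilityInstance* Activate() {
        for (auto& inst : pool) {
            if (!inst.active) {
                inst.active = true;
                inst.lifetime = 0.0f;
                return &inst;
            }
        }
        return nullptr;  // pool exhausted: no new instance can fire yet
    }

    // Age active instances and return them to the pool when they expire.
    void Update(float dt, float maxLifetime) {
        for (auto& inst : pool) {
            if (inst.active) {
                inst.lifetime += dt;
                if (inst.lifetime >= maxLifetime) inst.active = false;
            }
        }
    }

    std::vector<AbilityInstance> pool;
};

// Connects abilities to their "parent" game object, which movement
// behaviors like a boomerang path need to reference.
class AbilityComponent {
public:
    explicit AbilityComponent(GameObject* owner) : parent(owner) {}

    void AddAbility(std::shared_ptr<Ability> ability) {
        abilities.push_back(std::move(ability));
    }

    AbilityInstance* Use(size_t index) { return abilities[index]->Activate(); }

    void Update(float dt) {
        for (auto& a : abilities) a->Update(dt, 1.0f);
    }

    GameObject* parent;
    std::vector<std::shared_ptr<Ability>> abilities;
};
```

The pool size bounds how many instances of one ability can be live at once; single-instance abilities simply get a pool of one.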

Navigation

The navigation mesh used by characters was loaded as an .obj file exported from Unity. We used Unity to generate the mesh, which left a lot to be desired in terms of accuracy, but this was accepted due to time constraints. The character controllers used funneling to smooth their paths, and each had its own method for choosing a destination: the player's destination was determined by raycasting against the mesh, while enemies used the current player position for their seek behavior, or were given locally randomized positions when executing their wander behavior.
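The two enemy destination strategies can be sketched roughly like this; the function names and the wander radius are illustrative, not the engine's actual API, and a real implementation would snap the wander point back onto the navigation mesh.

```cpp
#include <cassert>
#include <cmath>
#include <random>

struct Vec3 { float x, y, z; };

// Seek: head straight for where the player currently is.
Vec3 SeekDestination(const Vec3& playerPos) {
    return playerPos;
}

// Wander: pick a randomized point within a radius around the enemy,
// offset on the ground plane.
Vec3 WanderDestination(const Vec3& enemyPos, float radius, std::mt19937& rng) {
    std::uniform_real_distribution<float> angleDist(0.0f, 6.2831853f);
    std::uniform_real_distribution<float> radiusDist(0.0f, radius);
    float angle = angleDist(rng);
    float r = radiusDist(rng);
    return { enemyPos.x + std::cos(angle) * r,
             enemyPos.y,
             enemyPos.z + std::sin(angle) * r };
}
```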

Render Pipeline

This game was developed concurrently with our custom engine, IronWrought. I implemented the basis of our forward rendering pipeline, building on our graphics programming assignments, and extended it to encompass all the systems necessary for the game.

Sprite and Text Renderers

Screen-space quads were used for the sprite-based UI, generated from points in screen space by a geometry shader. Glyphs were generated with the help of DirectXTK's SpriteFont utility. Sprites were scaled based on resolution to preserve their relative area on screen.
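One way to express that area-preserving scaling, as a hedged sketch: scale sprites authored for a reference resolution by the square root of the area ratio, so they cover the same fraction of the screen at any resolution. The 1920x1080 reference is an assumption for the example.

```cpp
#include <cassert>
#include <cmath>

// Uniform scale factor for a sprite authored at a reference resolution.
// Taking the square root of the area ratio means applying the factor to
// both axes preserves the fraction of screen area the sprite covers.
float SpriteScaleForResolution(float width, float height,
                               float refWidth = 1920.0f,
                               float refHeight = 1080.0f) {
    return std::sqrt((width * height) / (refWidth * refHeight));
}
```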

VFX Architecture

The VFX consisted of two main elements, “VFX meshes” and particle emitters. VFX meshes were defined as models using shaders to scroll several textures over their geometry. The particle emitters used sprites for particles, generating a screen space quad in their geometry shader. Both meshes and particle emitters were rendered with alpha blending in a late render pass.
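The scrolling itself happened in shaders; the following is a CPU-side sketch of the same idea, with illustrative names: each texture layer offsets its UVs by a per-layer scroll speed over time, wrapping into [0, 1) the way a repeating sampler would.

```cpp
#include <cassert>
#include <cmath>

struct UV { float u, v; };

// Offset a layer's UVs over time and wrap like a repeating sampler.
UV ScrollUV(UV base, float scrollU, float scrollV, float time) {
    float u = base.u + scrollU * time;
    float v = base.v + scrollV * time;
    u -= std::floor(u);
    v -= std::floor(v);
    return { u, v };
}
```

Layering several such scrolled samples with different speeds and scales is what gives the VFX meshes their motion.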

Outline Depth Stencil Pass

I implemented an outline effect for the player, and for the enemy currently hovered by the mouse cursor, using a depth stencil state with stenciling and a scaled-up second draw of the character. This also revealed the player model when it was concealed by walls in the environment. Simply scaling the transform up and down for the outline is not optimal, but it was very fast to get up and running.
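The two-pass trick can be summarized as follows; the helper below is a simplified, hypothetical stand-in for the transform scaling, with the stencil steps described in comments rather than real Direct3D state objects.

```cpp
#include <cassert>

// Outline via stenciling, in two passes:
//   1. Render the character normally, writing a reference value into the
//      stencil buffer wherever it covers the screen.
//   2. Render the character again with its transform scaled up, with the
//      stencil test set to pass only where the buffer does NOT hold the
//      reference value, so only the enlarged rim survives as the outline.
struct Transform { float scale = 1.0f; };

Transform OutlineTransform(const Transform& original, float outlineThickness) {
    Transform t = original;
    // Scaling the whole transform is crude: the apparent thickness varies
    // with mesh shape and camera distance, but it is trivial to implement.
    t.scale = original.scale * (1.0f + outlineThickness);
    return t;
}
```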

UI Architecture

I designed the UI architecture for this project, consisting of a Canvas and a Button class as its base. There was one canvas per state in the game, and it held collections of buttons, sprites, and “animated UI elements”. The canvas ran the button logic but also subscribed to messages posted by the buttons. Each button had one or more messages associated with it, which were sent when its OnClickUp() method was called. The buttons themselves each had an array of sprites, one for each of the Idle, Hover, and Click states. The buttons could calculate their own bounding box based on the sprite data they were initialized with, but this could also be overridden. The canvases were loaded from JSON documents, and the player HUD was specialized to listen to gameplay messages, for example when an ability went on cooldown.
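The canvas/button relationship might look roughly like this; all names, the message type, and the callback style are assumptions made for the sketch, not the project's actual interfaces.

```cpp
#include <cassert>
#include <functional>
#include <string>
#include <vector>

struct Rect {
    float x, y, w, h;
    bool Contains(float px, float py) const {
        return px >= x && px <= x + w && py >= y && py <= y + h;
    }
};

// One sprite per visual state, as described above.
enum class ButtonState { Idle, Hover, Click };

class Button {
public:
    Button(Rect bounds, std::vector<std::string> msgs)
        : boundingBox(bounds), messages(std::move(msgs)) {}

    // Called by the canvas when the mouse button is released; posts every
    // associated message if the cursor is inside the bounding box.
    void OnClickUp(float mx, float my,
                   const std::function<void(const std::string&)>& post) {
        if (boundingBox.Contains(mx, my)) {
            for (const auto& msg : messages) post(msg);
        }
        state = ButtonState::Idle;
    }

    Rect boundingBox;  // derived from sprite data, but overridable
    ButtonState state = ButtonState::Idle;
    std::vector<std::string> messages;
};

class Canvas {
public:
    void AddButton(Button b) { buttons.push_back(std::move(b)); }

    // The canvas both drives the button logic and subscribes to the
    // messages the buttons post.
    void HandleClickUp(float mx, float my) {
        for (auto& b : buttons) {
            b.OnClickUp(mx, my,
                        [this](const std::string& m) { received.push_back(m); });
        }
    }

    std::vector<Button> buttons;
    std::vector<std::string> received;  // messages picked up this frame
};
```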

Animated Resource Orbs

Inspired by a Diablo 3 talk at GDC, I wrote shaders that multiply and scroll different textures across the surface of a sprite. The shader handled one to four scrolling textures for the body, one or two alpha masks defining the visible area, and a static overlay sprite. I used this for the health and resource globes on the player HUD.

A fog texture is sampled three times with differently scaled UVs to get the roiling effect of the liquid, and a fourth texture is used to colorize it. One mask gives the sphere shape, and another, tiling mask is scrolled horizontally to create the wave effect. I highlighted the glow on the very surface of the liquid by interpolating between the existing color and a defined glow color based on the UV distance to the “level” of the surface.
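The surface-glow interpolation reduces to a distance-based blend factor; here is a CPU-side sketch of the same math the shader would do per channel, where the falloff width is an assumption for the example.

```cpp
#include <cassert>
#include <cmath>

// Blend weight toward the glow color: 1 exactly at the liquid's fill
// level, fading to 0 over the falloff distance in UV space.
float GlowFactor(float v, float fillLevel, float falloff = 0.05f) {
    float dist = std::fabs(v - fillLevel);
    float t = 1.0f - dist / falloff;
    return t < 0.0f ? 0.0f : t;
}

// Plain linear interpolation between the existing liquid color and the
// defined glow color, per channel.
float BlendChannel(float existing, float glow, float factor) {
    return existing + (glow - existing) * factor;
}
```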

This same principle is used on the health and experience bars, as well as on the cooldown effects on the ability icons.

Dialogue System

I extended the dialogue system used in project 5 with features such as speaker portraits and titles. I also specialized it for use as the intro sequence of the game, scrolling voice line text across the whole screen.

Floating Damage Numbers

I developed a system for spawning floating numbers above the player, for that genuine action RPG experience. Numbers were pooled and colored differently for varying strengths of critical hits and for healing. I used an analytical function defined in code to animate the size of some of the numbers and make them visually pop. I also applied a force toward the bottom of the screen and randomized their starting directions, so they traveled in an arc over the center of the screen.
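Both animation pieces can be sketched with simple closed-form functions; the exact curve shape and constants here are assumptions, chosen only to illustrate the overshoot-and-settle scale and the arc from a downward force.

```cpp
#include <cassert>
#include <cmath>

struct Vec2 { float x, y; };

// Analytical scale curve: peaks above 1 shortly after spawn, then decays
// back toward 1, making the number visually pop.
float PopScale(float t) {
    return 1.0f + 0.5f * std::sin(3.14159f * t) * std::exp(-3.0f * t);
}

// Ballistic motion under a constant downward force. In screen space +y
// points down, so a positive gravity pulls toward the bottom of the screen;
// a randomized initial velocity produces the arc.
Vec2 NumberPosition(Vec2 start, Vec2 initialVelocity, float gravity, float t) {
    return { start.x + initialVelocity.x * t,
             start.y + initialVelocity.y * t + 0.5f * gravity * t * t };
}
```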

The floating damage numbers were part of a larger utility called the PopupTextService. Apart from the damage numbers, it could also spawn fading description cards when the player unlocked new skills, spawn warning text when the player’s resource ran out, and give instructions such as “Press Space to Continue” during dialogue breaks.

Each of these instances had their own animation data, and all of these parameters were set in JSON documents.
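As an illustration only, such a JSON document might have looked something like this; the field names and values are invented, since the actual format is not shown here.

```json
{
  "damageNumber": {
    "lifetime": 0.9,
    "startScale": 1.0,
    "popScale": 1.5,
    "gravity": 600.0,
    "critColor": [255, 200, 40, 255]
  },
  "skillUnlockCard": {
    "lifetime": 3.0,
    "fadeOutTime": 0.5
  }
}
```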

FMOD Wrapper and Audio Manager

Since we built the engine from scratch, we needed to implement an audio interface. We decided on FMOD, and I wrote the basic interface toward FMOD as well as the Audio Manager built on it. As in the previous project, the logic for playing sounds was based on messages sent by the gameplay logic, to which the audio manager subscribed. All sound resources were held by the manager and requested from the FMOD wrapper at load time. I set up FMOD channel groups which the manager could use to group sounds and set levels for, as well as to duck channels less important than others.
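The grouping and ducking logic, stripped of the actual FMOD calls, might look like this; the class, group names, and duck factor are all illustrative, and the real code would forward these levels to FMOD channel groups.

```cpp
#include <cassert>
#include <map>
#include <string>

// Sketch of grouped sound levels with ducking. The manager sets a level
// per group, and can attenuate every group except a prioritized one
// (e.g. ducking music and SFX while voice lines play).
class AudioGroups {
public:
    void SetLevel(const std::string& group, float level) {
        levels[group] = level;
    }

    // Duck every group except the prioritized one by a fixed factor.
    void Duck(const std::string& priorityGroup, float factor) {
        for (auto& entry : levels) {
            if (entry.first != priorityGroup) entry.second *= factor;
        }
    }

    float Level(const std::string& group) const { return levels.at(group); }

private:
    std::map<std::string, float> levels;
};
```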