Monday, December 2, 2024

Medusa Week 15

  •  My focus for this next turn-in was to solidify the gaze function, add in sounds, and tighten details like volume, attenuation, and timing.
  • To lock in the gaze function, I improved the HUD texture and placement, added a lose state that takes the player back to the start, added a second color for Medusa's eyes to use before the player makes eye contact, and added a sound effect to play with it, along with some debugging to make sure the sound doesn't play at unwanted times.
    Adding in the lose state within the player's blueprint

    New texture for the HUD that appears when the gaze is triggered

    Material for Medusa's eyes: the top section makes her eyes green when the player isn't making eye contact, and the bottom makes them red-orange when they are. A lerp node blends from one to the other using the alpha value from the gaze material function.
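
    Just to spell out what that lerp is doing, here's the gist in Unreal C++ (the real thing is a material graph, and these names and color values are placeholders, not the actual parameters):

        // Rough math behind the eye material: blend green -> red-orange by the gaze alpha
        FLinearColor EyeIdle    = FLinearColor::Green;                         // before eye contact
        FLinearColor EyeContact = FLinearColor(1.0f, 0.3f, 0.05f);             // red-orange once the gaze triggers
        FLinearColor EyeColor   = FMath::Lerp(EyeIdle, EyeContact, GazeAlpha); // GazeAlpha (0-1) comes from the gaze material function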

    How the new eyes look on Medusa

    Within Medusa's blueprints, I added checks to make sure that the blueprints only start grabbing references to her once the player is inside the proper level. This helped prevent errors within the starting area.
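
    The idea is simple enough that a quick Unreal C++ sketch gets it across (the level and class names here are made up; the actual check lives in the Blueprints):

        // Only look up Medusa once the player is in the right level, so the starting area doesn't throw errors
        if (UGameplayStatics::GetCurrentLevelName(this) == TEXT("MedusaTemple"))    // "MedusaTemple" is a placeholder level name
        {
            AActor* Medusa = UGameplayStatics::GetActorOfClass(this, AMedusa::StaticClass()); // AMedusa is a hypothetical class
            // ... safe to start the gaze/sound logic here
        }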

    Added some nodes to fade out the stoned SFX once the player looks away.
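
    For the fade itself, the Blueprint nodes boil down to something like this in C++ (the component name and duration are placeholders, not the tuned values):

        // Fade the turning-to-stone SFX out instead of cutting it when the player looks away
        if (StonedAudioComp && StonedAudioComp->IsPlaying())
        {
            StonedAudioComp->FadeOut(1.0f, 0.0f);   // fade to silence over ~1 second (placeholder duration)
        }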

    A sound cue that produces random snake SFX, attached to Medusa, which uses attenuation so the sounds get louder as she gets closer to the player.
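
    For reference, the same attach-a-cue-with-attenuation setup done from code would look roughly like this (the cue and component names are placeholders; in the project it's just attached in the Blueprint):

        // Attach the snake sound cue to Medusa so its attenuation follows her position
        UAudioComponent* SnakeAudio = UGameplayStatics::SpawnSoundAttached(SnakeCue, MedusaMesh);
        // The attenuation settings on the cue handle the louder-when-closer falloff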

Thursday, November 14, 2024

Medusa VR Week 12

  •  This week I really started to focus on the post processing effect that I wanted to trigger when the player looks at Medusa. To do this, I utilized this tutorial to get me started and amended it to fit my needs.
  • I also worked on a HUD display that slowly appears along with the post processing effect, by linking their alpha values together.
  • With the effect, I want to add an SFX that sounds like rocks cracking, but during my testing, I realized the way I triggered it made the SFX play every tick that the player was looking at Medusa, so it played about 5 times a second and sounded incredibly bass-boosted. So, I'll need to continue looking into that next week.
Within the blueprints of the HUD Widget, we get Medusa from her blueprint and set her alpha value to the widget value, 'A'

Create a bind within the widget menu, and link the value 'A' to the alpha channel

Now within our Medusa blueprints, if 'Should Glow' is true, it triggers a timeline to gradually increase the alpha value to 1, making the post processing effect visible
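
Put together, the Blueprint logic amounts to something like this (an Unreal C++ sketch with placeholder variable names; the actual version uses a Timeline node in Medusa's blueprint):

    // Each update of the glow timeline (its float track plays from 0 to 1 while 'Should Glow' is true), roughly:
    MedusaAlpha = TimelineValue;                                            // the HUD widget's 'A' bind reads this, so the overlay fades in with it
    StoneEffectMID->SetScalarParameterValue(TEXT("Alpha"), TimelineValue);  // the post process effect becomes visible as alpha approaches 1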

To link up Medusa's eyes with her animations, set the parent socket to a joint. Braden helped set me up for this by preparing joints located at her eyes.

To create distortion, I used a normal map for a stone texture and plugged it into the WorldAlignedTexture node, which spreads it along all surfaces.

This offsets the normals to add variations in the texture. Afterwards are parameters to control the brightness and desaturation of the effect. Adding the marble node from the previous section just adds a bit more variation.

This controls the color of the effect, and gradually changes based on the distance away from the player.

These nodes control how the effect interacts with the original colors in the scene. The first blend mixes the scene's colors into the effect. Then the top nodes give us a way to transition from the original scene's colors - when alpha is 0 - to the effect fully on - when alpha is 1. Finally, a desaturation node is used to make the scene monochrome, to make it appear as if everything around you is turning to stone.
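
Written out as math instead of nodes, the final pixel ends up being roughly this (just the gist; the tint and parameter names are placeholders for what's in the material graph):

    // Gist of the final color logic in the post process material
    FLinearColor StonedColor = SceneColor * StoneTint;                      // scene colors blended into the effect
    FLinearColor Blended     = FMath::Lerp(SceneColor, StonedColor, Alpha); // alpha 0 = untouched scene, alpha 1 = full effect
    FLinearColor FinalColor  = Blended.Desaturate(Alpha);                   // goes monochrome as the effect turns on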

This is the overlay I'm using in the HUD, which appears with the post processing effect. I will do another pass on the image to add more details; this is currently being used as a placeholder.

My work logged into Perforce

Thursday, November 7, 2024

Medusa VR Week 11

  •  First I worked on adding some ambiance to the world.
    • I mixed together birds and wind ambiance, fire burning, torch whooshes, and falling stones
    • The birds might have too much of a positive association for the environment we're in, so I may need to look into a different wind ambiance.
    • The fire burning SFX are there to match the torches that light up the area
    • The falling stones are representative of the decrepit nature of Medusa's temple, fragments of fallen victims chipping off
    • I panned some SFX to the left and others to the right to create a bit of audible depth
      • Hopefully I can replace these with events that cue SFX in place of the ones baked into the ambiance

  • Next, I started to work on a material that would become Medusa's eyes.
    • Simply put, a Fresnel function does most of the work, with some vector parameters to set the color, and scalar parameters to control the scaling of the Fresnel, the glow amount, and the emissive multiplier.
    • I quite like how creepy it looks! Since the Fresnel is made up of annular sections, when placed on a sphere its appearance remains the same from all angles, creating an illusion of a set of pupils following you. (There's a rough sketch of the math after the screenshots below.)
    • I'd love to see if I can amend it to stretch the center to make her eyes appear more snake-like.
Material Function Graph of Glowing Eye Material

Testing Material

Testing Material on a Proxy Medusa
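
Here's roughly the math the eye material is doing, written as Unreal C++ (the parameter names are stand-ins for the ones in the graph, and the real thing is a material, not code):

    // Fresnel term: brightest where the surface faces away from the camera, which makes the glowing rings
    float NdotV       = FMath::Clamp(FVector::DotProduct(SurfaceNormal, ViewDirection), 0.f, 1.f);
    float Fresnel     = FMath::Pow(1.f - NdotV, FresnelExponent);               // FresnelExponent = the scaling parameter
    FLinearColor Glow = EyeColor * Fresnel * GlowAmount * EmissiveMultiplier;   // feeds the emissive channel
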
  • Now, for the gaze functionality, I've been working with my professor on a new methodology that should hopefully be a bit more optimized.
    • Essentially, I'll be working with two vectors: the direction from the VR camera to Medusa, and the camera's forward vector. To get the first one, we take Medusa's world location and subtract the VR camera's location from it, component by component, then normalize the result; we also normalize the camera's forward vector. Between these two vectors is an angle we'll call theta. Taking the dot product of the two normalized vectors gives us the cosine of theta, a value between -1 and 1. If the value is close to 1, it means the angle is really small, and you're most likely looking at or close to Medusa. So if, say, the dot product is greater than 0.9, we'll make that the trigger for Medusa to turn you into stone. (There's a sketch of this check after the list below.)
  • I've also been working on some destruction deformations for our stone head
    • To create this, I first attached an Attribute Expression node and left the constant value at (0,0,0), then added an Attribute Paint node, which allowed me to paint areas that will be more sensitive to fracturing. From there, a Scatter node was added, which creates points along the mesh, with points being more condensed in the areas I painted in the Attribute Paint node. Then I added a Voronoi Fracture node and, for the sake of testing, exposed the fragments using an Exploded View node.
    • A bit after, I felt it looked a bit off that all the points stemmed from the outside of the statue, which seemed a little unrealistic. To curb that, I appended an IsoOffset node off of the Attribute Paint node and took in the original FBX file. I merged that with the Scatter node and fed the result into the Voronoi node. That allowed me to get some geo clusters inside of the statue head!
    • From there, I added some nodes for UVs and materials, appended an RBD Bullet Solver node, and ended with a Null node that I reference with the RBD to FBX node in the outputs network.
  • Lastly, I got a stone tablet modeled
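
And here's the rough sketch of the gaze check mentioned above, written as Unreal C++ (the final version will live in Blueprints, and the variable names are placeholders):

    // Dot product gaze check: is the player looking at Medusa?
    FVector ToMedusa = (MedusaLocation - CameraLocation).GetSafeNormal();  // normalized direction from the camera to Medusa
    FVector LookDir  = CameraForwardVector.GetSafeNormal();                // where the headset is pointing
    float CosTheta   = FVector::DotProduct(LookDir, ToMedusa);             // cos(theta): 1 means looking straight at her
    bool bPetrify    = CosTheta > 0.9f;                                    // threshold from the write-up above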

Here's my work in perforce


Thursday, October 31, 2024

Medusa VR Week 10

 This week I worked on:

  • Starting our organization of files and creating a system of naming conventions in a way that satisfies our needs as a team
  • Getting another pass on the gaze interaction to replace the rotation with the activation of a dynamic material
  • Creating proxy parent and child materials for our active props
I based the folder structure off of this style guide, with alterations to fit our team's needs. This will help us organize assets based on their functions (art, programming, VFX) and, within those, separate them into subsections that will automatically sort as long as we stick to our naming conventions.

This is how our source art folder looks in perforce:

My goal for the next pass of my gaze interaction was to move away from rotation and onto working with materials, as that will be necessary for the final product. I had a bit of trouble with it this round, as I had to change both the logic of how the gaze's activation works and the kind of asset it affects.

In our 'Line Trace Forward' function, I added two functions for the material, 'Create Dynamic Material Instance' and 'Set Scalar Parameter Value on Materials'. We grab the parameter that holds the static mesh object which has the material assigned to it in BP_GazeInteractableCube. I've done the same within the 'Basic Gaze' function. Then in the 'Activate' function, instead of the ShouldRotate value setting the rotation rate, I used it to swap between 0 and 1 depending on whether it's true or false, and that value gets sent into the material's scalar parameter, which I multiply against the base color.
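
For reference, the same idea in Unreal C++ would look something like this (a sketch; the mesh variable and scalar parameter names are stand-ins for the ones in my Blueprint and material):

    // Build the dynamic material once, then drive its scalar parameter from the gaze state
    UMaterialInstanceDynamic* MID = CubeMesh->CreateDynamicMaterialInstance(0);   // 'Create Dynamic Material Instance'
    float GlowValue = bShouldRotate ? 1.f : 0.f;                                   // swap between 0 and 1
    MID->SetScalarParameterValue(TEXT("Glow"), GlowValue);                         // multiplied against the base color in the material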

Here are some proxy materials for our active props:

And finally this is the result of the blueprints during gameplay!

Here's the stuff in perforce!

Thursday, October 24, 2024

Medusa VR - Setting up gaze interaction

 For our VR project, I want to create an interaction with Medusa based on the player's gaze. Working with a Meta Quest 2, I was a bit limited since I'm unable to track eye movement, but fortunately, I'm able to cast a ray from the position of the VR headset that triggers an interaction if the ray hits a specific object.


The first step is to get the Pawn Blueprint for the VR player. We'll create a function that casts the ray from the player's camera, called 'Line Trace Forward'. Into the 'Line Trace By Channel' function, we bring in the camera's world location as the starting position, and calculate the end point by adding the forward vector from the camera's world rotation (scaled out to the trace length) to that location. To check if the rays are casting correctly, open the 'Draw Debug Type' dropdown menu in the 'Line Trace By Channel' function and select 'For Duration.'

Line Trace Forward Function in VR_Pawn
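
A rough C++ equivalent of what those nodes do (the trace distance and collision channel here are assumptions; the Blueprint version is what's actually in the project):

    // Cast a ray forward from the VR camera and see what it hits
    FVector Start = Camera->GetComponentLocation();
    FVector End   = Start + Camera->GetComponentRotation().Vector() * 1000.f;   // forward from the headset; 1000 units is a placeholder
    FHitResult Hit;
    FCollisionQueryParams Params;
    Params.AddIgnoredActor(this);                                                // ignore the pawn itself
    bool bHit = GetWorld()->LineTraceSingleByChannel(Hit, Start, End, ECC_Visibility, Params);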

Next, we're gonna create another function for the VR player that tracks the gaze and where the ray hits. We'll feed the 'Line Trace Forward' result through 'Break Hit Result': if we hit something, we send the object it hit along to another blueprint I'll discuss later, and if the object the ray cast hit is something we can interact with, we'll send that data to our next function, 'Interact.'

Basic Gaze Function in VR_Pawn

This function double-checks for us that the object is valid and, if so, passes it to the 'Activate' function in our next blueprint class.

Interact Function in VR_Pawn


Next we need a parent blueprint class that will allow us to create multiple children of different interactable objects. I've titled it BP_GazeInteractable. Inside this parent class, all we need to do is create an 'Activate' function. From there, we create a child of that blueprint. In our child blueprint, we'll bring in the static mesh of the object we want to be interactable under the 'Components' tab. Additionally, under that tab, we'll add a 'Rotating Movement' component as the action we'll take when activated.

In our 'Activate' function, we'll create a boolean to hold whether the cube can rotate, with true activating the rotation and false halting it. In the 'Select' node, leave 'False' as it is, and change the Z value under 'True' to 180; the selected rotator feeds the Rotating Movement component's rotation rate.

Activate Function in BP_GazeInteractive_Child
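
In plain Unreal C++, the 'Activate' logic boils down to something like this (a sketch that assumes the boolean simply gets flipped each activation; the exact wiring is in the Blueprint screenshot):

    // Toggle the rotation: True selects a Z (yaw) rate of 180, False selects zero
    bShouldRotate = !bShouldRotate;                                            // flip the flag (assumption: Activate toggles it)
    RotatingMovement->RotationRate = bShouldRotate ? FRotator(0.f, 180.f, 0.f) // Pitch, Yaw (the Z value) = 180, Roll
                                                   : FRotator::ZeroRotator;    // halt the spin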

Back to our pawn blueprint, make sure that our 'Basic Gaze' function is connected to the 'Event Tick' node, and we should be good to go!

Event Graph in VR_Pawn

Here's a demo of how it looks with the draw debug lines on!


Work in perforce:


Monday, October 14, 2024

Week 7 - Let's Make it Stick

 I chose to prep stickers of my turn-in from Week 4 - Basic Lighting. 

This is the trim sheet inside of the Cricut application.

The stickers all printed out!


Gods of War: Sprint 6

 This sprint consisted of gathering all the clips together, lighting and rendering them, and stuffing them all together into one video. I wi...