Collaborative Project Submission Post

Completed Gameplay Capture

Finalising all the work completed with the MA VR team in the week prior, we were able to produce a five-minute virtual reality experience adapted from a comic book. With interactions heavily inspired by and directly referenced from the work of Joshua Barkman (False Knees), the experience places the player as a seagull and follows the unravelling day at the seaside of two other seagulls. Across these scenarios, interactions such as an ice-cream-collecting mini-game, throwing beach balls and hitting seagulls with sticks give the player ways to engage with the world. Because of how immersive and in-depth the virtual reality space is for players, it felt important that sound was integrated effectively, so the birds' interactivity with the soundscape became central to player engagement. Below is the final five-minute gameplay capture, displaying these features.

Final Gameplay Capture

Final Gameplay Capture with Commentary from VR Students

The video below includes the gameplay capture with commentary from the VR students, outlining more clearly the interactions and gameplay features the player can get involved with.

Critically reflecting on the project, as well as my personal participation in it, the limited time span, the ambitiousness of the project and the number of scenes that required animating created quite a few hurdles, given our lack of experience. Despite these limitations, we were successful in creating a five-minute virtual reality experience that included several interactive elements the player could engage with. While two of the three environments were completed, the last city environment proved to have too many elements requiring additional work that we had not factored into our timeline due to technical issues. Reflecting on my own engagement with the project, while I was able to help with early development including storyboarding, concept art and modelling, I felt my animations could have been pushed to a higher quality, similar to those I created last term. Due to several importing/exporting issues that occurred between Maya and Unity, as well as recurring issues with the rig, the time spent animating was severely limited compared to the initial plan. Because of polygon loss during the exporting process, a lot of the expression and motion in the Maya scene was lost when placed into Unity. The animation itself was also stripped of detail due to rigging issues and last-minute directional changes, forcing my team members and me to focus on meeting our deadline while sacrificing nuanced quality. However, taking stylisation into consideration, the minimal amount of animation and lack of realistic detail arguably add to the character design and comedic effect.

Showreel Of Participation

Below is a showreel displaying some of the work I produced over the length of the project. Taking my reflection into consideration, the areas that needed more development were my asset modelling and character animation, as I feel both suffered in quality due to time constraints. To improve my asset modelling I would take more time with texturing, as I felt some of the textures and detail did not match the overall world they were imported into. Initially, I explored using software called Nomad (an iOS sculpting application), but it caused some export issues between Maya and Nomad and also drastically raised the number of polygons, which would be inefficient to put into a Unity scene and would overload real-time rendering. A process that helps retain high-poly detail on a low-poly model is the use of a 'normal map', which is "a special type of texture that tells the 3D program or game engine to display details on a polygon surface as though they were geometry" (Totten, 2012, p.106). When working with game engines in the future, I fully intend to incorporate this into my modelling work. In Nomad I was creating realistic textures, which contradicted the textures used in the False Knees game file, which were very minimal and cel-shaded in style.
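As a note for my own future pipeline, below is a rough sketch (Maya Python, maya.cmds) of how a normal map bake from a high-poly sculpt onto a low-poly game mesh can be scripted with Maya's surface sampler; the mesh names, resolution and output path are placeholders rather than files from this project.

```python
import maya.cmds as cmds

HIGH = "seagull_sculpt_high"   # dense sculpt exported from Nomad (assumed name)
LOW = "seagull_game_low"       # retopologised low-poly mesh intended for Unity (assumed name)

# Bake the high-poly surface detail into a tangent-space normal map for the low-poly mesh.
cmds.surfaceSampler(
    source=HIGH,               # geometry the detail is sampled from
    target=LOW,                # geometry the map is baked onto
    mapOutput="normal",
    mapWidth=1024, mapHeight=1024,
    filename="sourceimages/seagull_normal",
    fileFormat="tga",
)
```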

Miro Board

Cal from the VR team kept a record of each group meeting, noting what was discussed and resolved each week:

Meeting Notes (Collaborative Unity) – Google Docs

Throughout the entirety of the project, we were able to communicate and convey our progress and ideas very visually to one another through the Miro board below. This proved particularly useful during group meetings and helped convey ideas efficiently during class presentations. Below is the completed Miro board.

https://miro.com/app/board/uXjVOTsXnB8=/?invite_link_id=965546776052

Future research areas to improve and engage with the relationship between game engines and computer animation:

Blackman, S. 2011. Beginning 3D Game Development with Unity: The World's Most Widely Used Multi-platform Game Engine. Apress.

de Byl, P. 2012. Holistic Game Development with Unity: An All-in-one Guide to Implementing Game Mechanics, Art, Design, and Programming. Waltham, MA: Focal Press.

Botz-Bornstein, T. 2015. Virtual Reality. Leiden: Brill.

Cotton, M. 2021. Virtual Reality, Empathy and Ethics. Basingstoke: Palgrave Macmillan.

Ryan, M.-L. 2003. Narrative as Virtual Reality: Immersion and Interactivity in Literature and Electronic Media.

Collaborative Blog Posts

Module Introduction: Collaborative Unit: Initial Considerations – Esme’s Blog (arts.ac.uk)

Module Introduction: Collaborative Project: Idea and Skill Pitch – Esme’s Blog (arts.ac.uk)

Workshop: Camera Sequencer – Esme's Blog (arts.ac.uk)

Collaborative Seminar 1: Collaborative Seminar – Esme's Blog (arts.ac.uk)

Collaborative Seminar 2: Workshop: Group Seminar – Esme's Blog (arts.ac.uk)

Week 1: Collaborative Project – Esme’s Blog (arts.ac.uk)

Week 2: Collaborative Project – Esme’s Blog (arts.ac.uk)

Week 3: Collaborative Project – Esme’s Blog (arts.ac.uk)

Week 4: Collaborative Project – Esme’s Blog (arts.ac.uk)

Week 5: Collaborative Project – Esme’s Blog (arts.ac.uk)

Week 6: Collaborative Project – Esme’s Blog (arts.ac.uk)

Week 7: Collaborative Project – Esme's Blog (arts.ac.uk)

Week 8: Collaborative Project – Esme’s Blog (arts.ac.uk)

Week 9: Collaborative Project – Esme’s Blog (arts.ac.uk)

Week 10: Collaborative Project – Esme’s Blog (arts.ac.uk)

References

Totten, C. 2012. Game Character Creation with Blender and Unity. Indianapolis, IN: John Wiley & Sons.

Week 10: Collaborative Project

Team Summary:

We all prepared for the final critical presentation with the VR students and developed our own slides. Marianna and I also completed all animations for the beach scenes and imported them successfully into Unity with few further issues, apart from slight mesh and aesthetic setbacks that do not directly affect gameplay.

Final Critical Presentation with VR students:

Collaborative Project – Final Crit Presentation – Google Slides

As it is the final week of development on our project, we compiled presentations with the virtual reality students for their final critical presentation, in order to be given constructive feedback by their professors. During this process, Marianna and I compiled some slides to contribute to the overall presentation, in which we showcased and examined the different sections of our collaboration, such as initial concept art and modelling, the character rigging, the speech bubble interactive design and the animation process. This was a very useful exercise, as we were able to compile all the work we had completed together as a team and present it to an audience with knowledge of the areas of potential improvement. While this presentation was more centred around the VR students and their Unity build, so we did not have the presentational space to go into great depth about our work on the 3D animation side of the project, we were given the chance to summarise how our contribution was vital to the completion of the project, and it gave us a chance to critically reflect on every aspect of production.

Modelling and Animation Slide
Rigging Slide
Speech Bubble Design
Animation and Modelling

Critical Reflection and Feedback from VR Lecturers

Starting by noting that we were able to produce a five-minute gameplay sequence over the course of only ten weeks, I would argue we achieved a lot that was very useful and educational in the field of 3D-produced work. I feel our particular successes were in the overall design and story-driven elements, due to our concentration on the narrative drive. Exploring the method of storyboarding within the 3D environment itself proved massively beneficial on a personal level, as it gave me an understanding of exactly the realm in which we were immersing ourselves and our skillsets.

Reading the work of Bucher, he brings to attention two questions to consider when approaching storytelling (Bucher, 2017):

Who is the audience?

What is one thing you want them to walk away with?

When critically reflecting on these questions in regard to our narrative experience, there are two answers regarding what we have produced. The experience is intended for people over 15, but more specifically people who enjoy the narrative experience of a comic book. The addition of swearing and comedic violence raises the question of which age group this exactly falls into, but I feel the real audience is vaguer than a specific age. The game's ability to draw older audiences into more classically childlike narratives points to VR and animation's potential as a storytelling medium for more than pure video game entertainment. The game's purpose, in my opinion, when reflecting on what we achieved, was predominantly educational: teaching us to collaborate effectively across disciplines. In spite of this, I think it serves a grander purpose of pushing the exploration of non-gameplay-heavy experiences that can be created in virtual reality, experiences that underline the value of there being no expectation or pressure placed on a player to do well in order to succeed.

In terms of collaboration, one of the primary pieces of feedback we received during this critical presentation was how well coordinated we were and how well we communicated as a team in order to create what we did. I feel this came down to our regular communication and weekly meetings. On many occasions I also called teammates using Microsoft Teams so we could keep the workflow moving without having to be present in the same room.

An interesting comment which changed my perspective on our gameplay narrative was that the speech bubbles seemed ineffective in their current state, especially with the audio present. Going forward with the project, it seems imperative to make their design more effective, and perhaps to integrate elements of the bubbles following the player so they are always readable and facing the viewer.

Relevant tutorial to explore this:

How to make any object looking at and facing towards the player in Unity – YouTube

The major point of reflection regarding the entire gameplay sequence is the lack of progress we were able to make on the last city scene. If the workflow had been more researched and understood prior to starting the project, I feel this would have been achieved much faster, as the primary issues that set us back were the Maya-to-Unity conversions. Reflectively though, the entire process was extremely beneficial educationally and inspires me to continue progressing in the field of games animation in future. Since the rest of the team and I mutually agree on combining our efforts to finish the project outside of graded assessment, I feel this will push us all to create a finalised, finished piece that can be used as strong showreel material.

References

Bucher, J. 2017. Storytelling for Virtual Reality: Methods and Principles for Crafting Immersive Narratives. Oxford: Taylor & Francis Group. Available from: ProQuest Ebook Central. [Accessed 7 February 2022].

Week 9: Collaborative Project

Meeting Notes and Weekly Goals:

This week's meeting covered how the rig issues that previously prevented the animation stage from starting are now at a workable stage, so all beach scene animations are a priority and should be completed within the week. Alongside this, good progress was made on the ice cream mini-game and on the Unity/Maya rig tests.

This week’s Aims:

• Complete all personally assigned beach scene files (Interaction 4)

• Resolve any last issues that may prevent gameplay function in Unity

Issue Resolution: Non-Functioning Rigs in Unity

An issue Marianna encountered when animating one of the interactions was that the animation, when imported into Unity, would not apply to the models. After meeting with the VR students, it became apparent how important it is to finalise the rig and model before animating sequences intended for Unity.

Anim File Export

Using a process I discovered last term, when I had a similar issue with a non-functioning rig containing well-developed animation information, I exported Marianna's animation for Interaction 3 as an .anim file and imported it onto the most up-to-date version of the rig, in which I had fixed some issues with the rigging and skin weighting of the beak. While most of the animation transferred safely, I had to reanimate the beak speaking in time with the relevant audio, as that information could not be transferred onto controls that did not previously exist. Despite this, the issue was resolved and the animation functioned fine in Unity after a Microsoft Teams meeting with our teammate Cal, in which we tested directly how well the process worked. Unlike animated films, which can use a referenced rig that updates alongside the animation, animated games require a finalised rig before the animation information is applied in the game engine. Proceeding with this knowledge, in future I will help create a cleaner pipeline in which the other animators and I are definitely working from the same rig. Issues like this will also be prevented as my and my future collaborators' rigging knowledge becomes more in-depth, especially regarding how rigs function in engines such as Unity.
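For reference, this is roughly how the .anim transfer can be scripted in Maya Python; it is a hedged sketch rather than the exact steps I took, and the control naming and file paths are placeholders.

```python
import maya.cmds as cmds

cmds.loadPlugin("animImportExport", quiet=True)      # enables the .anim file type

# In the scene containing the original animation: select the animated controls
# and export their curves to an .anim file.
cmds.select("oldRig:*_ctrl")
cmds.file("interactions/interaction3.anim", force=True,
          type="animExport", exportSelected=True)

# In the scene with the updated rig: select the matching controls and import the
# curves back on. Controls that did not exist before (the new beak set-up) have
# nothing to receive, which is why that section had to be re-animated by hand.
cmds.select("newRig:*_ctrl")
cmds.file("interactions/interaction3.anim", i=True, type="animImport")
```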

Anim File Imported on Updated Rig

Animation

As discussed by Fothergill and Flick in their work on the ethics of human-chicken relationships in video games, the idea of chickens as symbols of cowardice and domesticity is strongly associated with their comedic effect (2016). In the context of our project, seagulls are widely viewed as irritating and inconvenient to humans on a daily basis, which gives a further implicit suggestion for the use of violence in-game. A key aspect of the gameplay for our project is the ability to pick up objects in the environment and use them to attack the seagulls, especially in this interaction, and this brings an important aspect of gameplay to attention. Due to society's embedded attitudes, it seems obvious to a player, without specific instruction, that the intention of the game is to inflict violence on the bird causing the most personal irritation. This is expanded by Bucher, who suggests that a key element of a player's ability to advance in a non-linear narrative is the "continued reliance on linear logic", which in its own way implies that the player, rather than a camera, is the key driver of the narrative (2017, p.84). I feel this societal association works to our benefit for this particular project; however, it leaves questions for more ethical consideration in future. Violence in video games is taken as a given, even in its comedic form, and especially in a game whose aesthetic appearance seems harmless; it insinuates underlying factors that could make the product ethically problematic. This is especially true in the first-person, fully immersive experience of the Oculus Rift, where violence feels far less distant than simply watching a narrative play out through the confines of a screen. As discussed by Matthew Cotton, immersive VR systems are considered "empathy-arousal" tools that can be utilised to 'stimulate empathetic engagement towards marginalised or vulnerable peoples' (2021, p.113). In this instance, using violence for comedic effect brings to attention a line that perhaps should be drawn, which will be beneficial to consider, especially as the anthropomorphic characters embody some kind of human experience. That being said, seagull populations are currently in some danger, as environmentalists have brought to attention, which may mean the game spreads a negative environmental image to players.

File 1: Bird Dialogue Interaction

The nature of this interaction is to engage the viewer in their place as a 'seagull' who is part of the conversation at hand. The use of the 'stick', a piece of washed-up driftwood for the player to pick up first-hand, was something interesting to consider when approaching this piece, as the constraints used in Maya would not necessarily function the same way in Unity. Taking this into account, I broke the animation down into five separate FBX files that the VR students could trigger based on player interactivity: File 1, the triggered initial dialogue between Seagull A and B; File 2, the looping animation encouraging the viewer to take the stick; File 3, the interactions between Seagull A and B after the stick has been taken; File 4, a looping animation that waits for the player to hit Seagull A with the stick; and File 5, the interactions between Seagull A and B once this action has been triggered.
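As an illustration of the export step, below is a hedged Maya Python sketch of how the five trigger files could be written out, baking each segment's frame range into its own FBX for the VR team; the frame ranges, group names and paths are assumptions, not the project's actual values.

```python
import maya.cmds as cmds
import maya.mel as mel

segments = [
    ("file1_initial_dialogue", 1, 180),
    ("file2_take_stick_loop", 181, 240),
    ("file3_after_stick", 241, 420),
    ("file4_hit_wait_loop", 421, 480),
    ("file5_final_dialogue", 481, 650),
]

cmds.select("seagullA_grp", "seagullB_grp")          # assumed top groups of both rigs

for name, start, end in segments:
    mel.eval("FBXResetExport")
    mel.eval("FBXExportBakeComplexAnimation -v true")          # bake keys into the FBX
    mel.eval("FBXExportBakeComplexStart -v {}".format(start))
    mel.eval("FBXExportBakeComplexEnd -v {}".format(end))
    mel.eval('FBXExport -f "exports/{}.fbx" -s'.format(name))  # -s exports the selection
```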

File 2: Looping Animation

Breaking the animation down into these separate files proved very useful for the VR students when coding the different interactions into the Unity file. However, I feel this is something we should have considered earlier in the project, as the structural 'shot' planning could then have been produced more effectively.

File 3: Interaction and Dialogue

File 4: Looping Animation Awaiting Player Engagement

File 5: Last Dialogue Sequence of the Interaction

Speech Bubble Design for Sky Scene

For the sky environment scene, the only 'animation' required was the speech bubbles the player will interact with in the sky space. For this, I used my previous designs and worked in cooperation with Marianna to create bubbles that appear rudimentary in concept but, paired with movement, have a charm to them. Reflecting on our overall use of speech bubbles in this project, I feel we perhaps could have considered character and emotional portrayal more through the linework involved. For example, for the stupid seagull we could have drawn the lines in a way that encapsulates his 'simplicity', such as lots of easy, curved lines, whereas sharper, more dynamic lines for Seagull A could accentuate his frustration and greater depth of thought and feeling.

I think that by incorporating a more 'foul' linguistic approach, the experience separates itself from a younger audience despite its inherently childlike aesthetic, which will perhaps assist in pushing animated media further into adult-world associations.

Critical Reflection on Animation Process

During this week we were productive in our endeavours to complete the first three scenes, the Beach, the Pier and the Sky, which was our minimum aim for the ten-week project. That being said, we still needed further development to even begin animation in the city environment. Going forward with games-student collaborations, I feel I have a much deeper understanding of how Maya works in relation to game engine requirements; I will allow much longer, more developed contingency time for aspects such as rigging, and will produce animated FBX file tests much earlier to identify issues sooner.

References

Cotton, M. 2021. Virtual Reality, Empathy and Ethics. Basingstoke: Palgrave Macmillan.

Fothergill, B.T. and Flick, C. 2016. The Ethics of Human-Chicken Relationships in Video Games: The Origins of the Digital Chicken. SIGCAS Computers and Society, 45(3) (September 2015), pp. 100-108. DOI: https://doi.org/10.1145/2874239.2874254

Week 8: Collaborative Project

Meeting and Notes:

Interactive features such as teleportation and grabbing were completed. The rig has also been predominantly fixed, and animation has begun in full swing. Music for the mini-games is complete, and scripting on the VR side is going well.

Personal Goals:

• Create shot list

• Animate Interaction 2

Considering the animation required for this project, and also considering time efficiency, we collectively agreed that the best approach to the animation would be minimal, giving the overall emotional impression without extensive detail. Despite my best efforts to adjust and fix the rig, time constraints no longer allow us to keep adjusting it, so facial animation will also be kept to a minimum. This, however, appears to suit the process, as conversion from .mb to FBX compresses and readjusts the model so that it displays at a lower poly count and looks rather different from the Maya scene in terms of mesh presentation.

Animation Development

Developing last week's ideas, we began to structure the animation files and rigs into different layers to help us animate in a more organised and structured manner. This will also help us make any changes quickly without directly affecting the rest of the animation. Reflecting further on industry best practice in this respect, layers are a fantastic way to explore alternate versions of your animation without heavy commitment. Roy expands on how the "weight" of an animation layer can be adjusted in a similar way to the opacity of layer groups found in Adobe Photoshop (2014, p.284).
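A small Maya Python sketch of the kind of layer set-up Roy describes, assuming hypothetical layer and control names: the exaggerated motion sits on its own additive layer whose weight can be dialled back like layer opacity.

```python
import maya.cmds as cmds

cmds.select("seagullB:*_ctrl")                       # controls to be animated (assumed naming)
sway = cmds.animLayer("musicSway")                   # new layer on top of the base animation
cmds.animLayer(sway, edit=True, addSelectedObjects=True)

# ...key the exaggerated sway on this layer, then blend it back in:
cmds.animLayer(sway, edit=True, weight=0.6)          # 0 = off, 1 = full strength, like opacity
```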

Animation Layers and Organisation

Shot Lists and Planning

I thought an imperative first step in the animation process was creating a shot list, in order to evenly distribute the work into clear, distinguishable sequences based on dialogue. This in itself proved a challenge: compared to my prior experience with scenes and shots, it was difficult to pin down how to separate the different aspects of animation because of the non-linear narrative and structure. In the end, I broke it down into the key environments and the key interactions between characters, in the order they were storyboarded. Following on from this, I assigned the shots between myself and Marianna, prioritising those in the beach scene as that was our main goal for the project.

Shot List for Gameplay Interactions

I felt it important to also add idle animations such as walking and 'breathing', due to the constant nature of real-time rendering; these animations will be used in cycles to help the flow of gameplay.

Animation

Thinking about the stylistic interpretation of the speech bubbles, the way they move in relation to the bird models, and player point of view, their motion will have to be relatively controlled within Unity's camera. Because of this, the interactions will potentially be structured in Unity by 'teleporting' to different squares, so that visibility is relatively confined within the game space. Beginning the animation process for this VR experience, I had to adopt a different mindset to the one I had harboured for past projects, as the animation itself depends on player interactivity, with different sequences 'triggered' by gameplay elements. The animation is also created with the intention of being viewed from multiple angles, with less regard for cinematography than an animated film.

Cycled Animation at the End of Narrative Sequence

Bearing in mind player interactivity, in theory the interaction should 'cycle' until the player has clicked or progressed through the narrative to trigger the next one. In the event of that not happening, the seagulls need to keep moving continuously to maintain a level of realism and character believability. If the characters remained static after every interaction, it would ruin the world-building and draw attention to flaws in our immersive experience.
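In Maya this can be sketched by setting the idle curves to cycle outside their keyed range (a hedged example with placeholder control names); the actual looping in-game is handled by the VR team triggering the exported clip.

```python
import maya.cmds as cmds

idle_controls = cmds.ls("seagullA:*_ctrl")           # assumed control naming convention
# Repeat the keyed idle motion before and after its range so the bird never freezes.
cmds.setInfinity(idle_controls, preInfinite="cycle", postInfinite="cycle")
```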

Reflecting on my finished shot, I feel I was able to produce a character-driven performance suitable for the lower-poly models imported into Unity; however, there are several aspects I would improve on if given the opportunity to fine-tune our project at a later date. The first is the lack of 'finger' and wing movement detail, which adds a flatness to the characters that makes them appear more like inanimate figurines than anthropomorphised seagulls. I also feel more follow-through, offset and secondary action could have been included during this animated interaction, especially in Seagull B as he sways to the music, where weight could be distributed in the offset between his head and his body. This would also accentuate the 'birdlike' flexibility between the head and neck, as birds have more neck vertebrae, which allows for sudden neck and head movements.

Gull Skeletal Structure

The facial expressions are also lacking in blinks and well-developed eye darts; however, the limited detail in these expressions is predominantly due to rigging issues with the eyelids, which continue to be offset from the rig despite revisions to the mesh hierarchies. With these limitations and a lack of contingency time to fix them, I think the monotonous lack of expression in its own way suits the characters' blunt emotional depth and only exemplifies Seagull B's 'stupidity'. In facial animation studies, a lack of upper facial animation (brows, eyes, eyelids) can create an uncanny effect that strips movement of its naturalism and creates viewer discomfort (Tinwell, 2015). However, due to the simplistic design of the seagulls, which do not strive for photorealism, the limited expression is passable, as less animation is required to sell the characters' performance and the aesthetic design avoids falling into the realms of the Uncanny Valley.

Animation Exported into Unity

With last week's animation import issues resolved, the sections Marianna and I animated over the past week seem to function well in Unity. The major issue that still needs to be tackled is the mesh export problem. While there are ways to improve the polygon intersection, the overall smoothing seems to be lost, specifically on Seagull A and on the wings of Seagull B. In spite of this, I feel the ruggedness of the birds adds a rough element reminiscent of childlike drawings, suggesting a more 'undeveloped consciousness' that contrasts visually with the conversations at hand. Nevertheless, I intend to look further into solutions around Unity's polygon processing in the following week.

Potential Video Guidance for Next Week

How to Actually optimize your game in Unity – Complete Game Optimization Guide – YouTube

Unity Probuilder: Smoothing Tool – YouTube

Animation Interaction (with voice-over audio included)

References

Roy, K. 2014. How to Cheat in Maya 2014. Abingdon: Focal Press.

Tinwell, A. 2015. The Uncanny Valley in Games and Animation. Boca Raton, FL: CRC Press.

Fothergill, B.T. and Flick, C. 2016. The Ethics of Human-Chicken Relationships in Video Games: The Origins of the Digital Chicken. SIGCAS Computers and Society, 45(3) (September 2015), pp. 100-108. DOI: https://doi.org/10.1145/2874239.2874254

Week 7: Collaborative project

Meeting and Notes:

We discussed major setbacks and the progress each of us had made. One of the scenes in Unity was fully built and ready to be played and interacted with in VR space. Marianna and I were still encountering issues with our rig and with import/export into Unity, which we aim to resolve next week.

Goals:

• Finalise the rig

• Resolve any further issues that prevent animation

Speech Bubble Exploration

Exploring the ways the comic bubbles could be interpreted and translated into 3D space, I wanted to think about player interactivity as well as aesthetic considerations that emphasise gameplay. Balancing design elements with gameplay elements, the VR team intend to create interactable speech bubbles that a player can grab and use as potential weaponry against the seagulls. Thinking about this as a graphic element, my initial idea was to create one single mesh containing both 3D text and a larger, almost spherical bubble that the player can grab. Looking at the original False Knees comics as a reference, the bubbles are very basic in design and did not inherently inspire much creativity from me, apart from the varying line weight, which adds a hand-drawn aesthetic I wanted to replicate in 3D.

False Knees Comic Bubble Reference

Below is the initial test I produced to explore the speech bubbles' aesthetics and movement. The test proved mildly unsuccessful: despite the 3D environment, I felt the bubbles' appearance was still flat. Going forward from this, I looked at references of different artistic interpretations of speech bubbles in older animated films and games to find design aspects I could translate into 3D space.

Initial Speech Bubble Test

One particular design that stood out to me is the credit sequence of One Hundred and One Dalmatians (Geronimi, Luske and Reitherman, 1961), as I found the abstract outlines and shapes an interesting format that replicated the spots on the dogs. Taking this animalistic concept into account, I thought it might be interesting to explore something similar with the seagulls. The key element I took from it, however, was the thick, abstract lines and shapes with rougher, hand-made edges that do not appear as perfect as my initial bubble.

Speech Bubble Inspiration

Developing from this, I created a speech bubble with several different layers that can be interacted with. With the intention of creating a more 2D effect in 3D space, I added a heavy outline replicating traditional comic book line art. Using the Pencil Curve Tool, I was able to give the line work a 'drawn', more imperfect quality, which seemed, in my opinion, to fit the comedic, comic-derived goal of the production.

Pencil and Curve Toolsets
Developed Speech Bubble with Layers

Below, I recorded and shared this process with my team members so that the style and process of the speech bubble I developed were easily accessible and could be recreated in the same style, keeping our animations consistent. This process also allowed me to explore different methods of NURBS-to-polygon conversion within Maya.
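One curve-to-polygon route, sketched in Maya Python with placeholder names and values: close the hand-drawn pencil curve, fill it as polygons and give it a slight extrusion so the flat shape reads in 3D space. This is an illustration of the general method rather than the exact script I used.

```python
import maya.cmds as cmds

outline = "bubbleOutline_crv"                  # curve drawn with the Pencil Curve Tool (assumed name)
cmds.closeCurve(outline, ch=False, replaceOriginal=True)

# planarSrf fills a closed planar curve; polygon=1 asks for polygon output.
bubble = cmds.planarSrf(outline, ch=False, degree=3, polygon=1)

# A small extrude gives the flat bubble a little depth for the 3D space.
cmds.polyExtrudeFacet(bubble[0] + ".f[*]", localTranslateZ=0.05)
```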

Process of Creating Stylised Speech Bubble Text

Thinking critically about integrating these bubbles into an interactive, three-dimensional space, there may be some impracticalities in the design, as the layered, spaced elements may not work well in a 360-degree environment in which players can explore our objects from all angles; this in itself presents challenges for camera and object tracking within the virtual camera. Taking accessibility into account, we also wanted to pick a font that would be easily readable for viewers who may not be able to hear or easily understand the audio and voice acting of the birds, so we explored a style that was simple to read and well spaced out, but that also fitted the 'hand-drawn' visual of the 3D lines.

Video References

How to convert/change curve into a polygon in Maya 2020 for beginner – YouTube

Mesh Collision Issues

Expanding on last week's issue of missing mesh faces on Unity import, and in an attempt to explore the cause of the deleted faces, I looked at the original Maya model in unsmoothed mode to replicate its appearance in Unity. I found that some vertices seemed to collide unnaturally with one another, so I adjusted them to be more in line with each other, in the hope that they would no longer intersect on import.

Wing Mesh Intersection Issues on Animation Export

Mesh Intersection
Vertex Manipulation

The method eventually proved effective; however, it caused a few issues in which the wing looked abnormally big and threw off the overall volume and proportions of the model. This naturally takes away from any sense of 'realism' we may have wanted to achieve in character believability, but it temporarily seems to fix the mesh collision issues.

Resolved Mesh issues

Scene File Setup and Professionalism

Now that the rigs are in a functioning state and ready to be animated, I thought it imperative for production to set up a scene that both Marianna and I can use as a base for our animations, so they are scaled exactly the same and sit in exactly the same spatial position. This will improve the production pipeline: when the different interactions are coded into the Unity project, the characters will not drastically jump apart from one another, preventing any jarring cuts between character and gameplay interactions.
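A minimal sketch (Maya Python) of that shared set-up, assuming hypothetical paths, namespaces and positions: both animators open a scene that references the same finalised rigs at agreed world positions, so the exported interactions line up in Unity.

```python
import maya.cmds as cmds

cmds.file(new=True, force=True)
cmds.file("sharedDrive/rigs/seagullA_final.ma", reference=True, namespace="seagullA")
cmds.file("sharedDrive/rigs/seagullB_final.ma", reference=True, namespace="seagullB")

# Place each bird's global control at the agreed spot for the beach interaction.
cmds.xform("seagullA:global_ctrl", worldSpace=True, translation=(-1.5, 0, 0))
cmds.xform("seagullB:global_ctrl", worldSpace=True, translation=(1.5, 0, 0))

cmds.file(rename="sharedDrive/scenes/beach_interaction_setup.ma")
cmds.file(save=True, type="mayaAscii")
```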

Project Folders
Beach Interaction Set Up
Restaurant Scene Set Up

Week 6: Collaborative Project

Meeting and Notes:

Everyone discussed their progress. The sound students already had sound files that we could begin experimenting with, matching movement to music. Marianna and I explained the setbacks we were experiencing with the rig and showed the IK developments we had been able to make.

Personal Goals:

• Complete and fix issues with the rigs on both birds

• Explore the import/export relationship between Maya and Unity

Rigging

During the Maya-to-Unity workflow and animation exporting, we came across several issues that required me to research further and expand my understanding of rigging. Initially, looking at the rigging files shared between Marianna and myself, the hierarchy of the controls, geometry and joints was messy, which seemingly caused issues during Unity exports. Looking into methods of exporting and importing animation from Maya, the video below made some points indicating to me that our structure was not cohesive or clear, and that controls were attached to several elements of the geometry.

Animation Import/Export in Maya – YouTube

Revising the rigs from the previous week, there seemed to be underlying issues in the way the IK in the wings was attached to the rest of the rig, which caused exporting problems: the wings did not move inside Unity's game engine even when the animation was baked into the FBX file.

Import Tests in Unity

To effectively understand Maya and Unity's relationship to one another, I wanted to produce several tests to see which issues we would encounter and to establish a time-efficient Maya-to-Unity workflow before we properly began producing character animation.

Because the previous model import tests were static, in order to test a working import I took the Mr Bony rig, animated all the different IK controllers in unison, baked the keyframes onto the rig, and exported it as an FBX file for Unity to get it functioning. As seen below, this test was successful and led me to consider two different aspects: the hierarchy of Mr Bony's mesh and joint structure, and the export preset options in Maya.

Successful IK Movement in Unity
Bony Rig and Mesh Hierarchies

After studying the structure of the Mr Bony rig, I noticed that it was sectioned into different groups that all fed into each other, containing the controllers, mesh and skeleton in separate groups so there is clean access to each of them. Taking this into account, I replicated a similar file structure with a similar joint order, and after clearing the history, rebinding the skin and exporting, I was able to create a working FBX file of Seagull A.

Initial Rig Hierarchy
Revised Rig Hierarchy

On earlier exports from Maya and imports into Unity, we experienced issues with the wings not moving, and earlier on there were skin-binding issues due to the separation of the wings from the main body mesh. As stated by Watkins, reducing the number of objects attached individually to the mesh reduces the number of "draw calls" required, which in turn reduces joint deformations and therefore the number of "cost processor cycles" (2012, p.280). This, in essence, optimises playback speed, which is imperative for games due to their real-time rendering.
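The fix this implies, sketched in Maya Python with assumed mesh names, is to combine the separate wing meshes into the body before binding so the character is a single skinned mesh:

```python
import maya.cmds as cmds

combined = cmds.polyUnite("body_geo", "wing_L_geo", "wing_R_geo",
                          name="seagullA_geo", constructionHistory=False)
cmds.polyMergeVertex(combined[0], distance=0.001)    # weld the seams where the wings meet the body
cmds.delete(combined[0], constructionHistory=True)   # clean history before skin binding
```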

Below is a video evidencing that the FBX export of the animation test from Maya functions in Unity, with working wings and IK legs, which helped me reassess the pipeline my team and I would implement when producing animations. Going forward with our import and export issues, we will use the following method:

Ensure you are animating with a clean hierarchy/rig > Animate on layers > Bake > Export

I found that the additional step of animation layers creates a smoother and more efficient pipeline to use later in the animation process. As stated by Roy, animation layers "increase the ease of creative tasks like trying different approaches and variations" and also simplify the use of Graph Editor curves by compartmentalising them (2014, p.273). This is something we intend to keep using going forward, as it allows animation baking to be easily reverted in older files if we need to extract or add elements, such as facial expressions, separately from the body movements.
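For my own notes, the bake and export steps of the pipeline above look roughly like this in Maya Python (control names, ranges and paths are placeholders; the bake collapses the layered animation into plain keys before the FBX is written):

```python
import maya.cmds as cmds
import maya.mel as mel

controls = cmds.ls("seagullA:*_ctrl")
start, end = 1, 240

# Bake the composited result of all animation layers onto the controls.
cmds.bakeResults(controls, time=(start, end), simulation=True)

cmds.select("seagullA_grp")                          # assumed export group (geometry + skeleton)
mel.eval("FBXExportBakeComplexAnimation -v true")
mel.eval('FBXExport -f "exports/seagullA_shot01.fbx" -s')
```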

Video References

Maya to Unity Workflow – Campaign Update – YouTube

Mesh Issues in Unity

Polygons Missing in Mesh

An issue we encountered during this process was a loss of polygons in the mesh when exporting animated rig FBX files. Researching this, we found a few solutions that can assist us going into next week, one being the smooth option, which essentially bakes the polygons into a higher count so that once the file is converted into an FBX, the polygon information should not be 'corrupted' when imported into Unity.

Another alternative I looked into with a member of the VR team was polygon smoothing within Unity itself, to see if the issue could be rectified directly in that software.

Smoothing Options in Maya

Across the multiple tests we conducted using both animated and static models of the character, it appears that only the animated models cause the mesh issues, even when using the above technique. Using the smoothing tools within Unity had little to no effect, only making the rougher edges and mesh displacements slightly less drastic in appearance. Going forward with our research into fixing this particular issue next week, it may be important to consider whether the mesh in Maya is directly responsible, perhaps due to polygon collisions, which can make the mesh appear 'defaced' in Unity.

Import and Export Tests in Unity

Video References

ProBuilder Unity | Smooth Object Tool – YouTube

Animation and Music Relationship Test

Working with the music students, I wanted to use their audio to create an animation that interacts with the soundscape of the game, potentially adding extra character and comedic effect. The animation was created with the idea that it could act as an 'idle' animation for Seagull B, helping to accentuate his 'simplicity' and general happiness or naivety. This will also help improve player immersion, as the characters interact more with their environment.

Dance Animation Test

Importing the animation into Unity, while reflecting on the technical progress of Unity and Maya's relationship, highlights how far we have come while also indicating several things that are not working. The main sign of an underlying mesh/rigging issue still needing resolution is the 'floating' eyelids, which offset themselves from their frozen transformation whenever any body movement is made. The other obvious issue is the mesh deformity in the right wing, which I will have to look into further to understand why exactly the polygons appear to delete themselves once imported into Unity.

Dance Animation in Unity

Eye Lid Rigging Issues

Going into Maya once more to inspect the eyelid rigging issues, it seems that there is once again a hierarchical issue within the mesh structure, in which the eyelids are separate from the main mesh and pivot around the 'blink' controllers rather than the mesh itself.

Eyelid Offset
Eyelid and Eye Offset

To resolve this issue, I followed the same method I applied previously to attach the wings to the mesh for skin binding, and grouped the eyelids in a way that enables them to follow the eye mesh without causing an offset. Due to the rebinding required during this process, I will also have to re-rig the set driven keys to create an effective blink in the week ahead. Reflecting on this, in future I will ensure that the mesh is neat and comprehensible before exporting to additional software.

Organisation and File structure

To effectively manage and structure our files as a team, I thought it best to create an organised file structure to allow accessibility for all team members and reduce the number of unnecessary WeTransfer files sent over email, especially since these were causing Maya file corruptions. This will be particularly useful when sharing the scene set-up files and finalised rigs, so that both Marianna and I are animating with the same versions of everything, preventing inconsistencies.

File Structure on a Shared Google Drive

References

Roy, K. 2014. How to Cheat in Maya 2014. Abingdon: Focal Press.

Watkins, A. 2012. Creating Games with Unity and Maya: How to Develop Fun and Marketable 3D Games. Burlington, MA: Routledge. Available at: https://search-ebscohost-com.arts.idm.oclc.org/login.aspx?direct=true&db=nlebk&AN=376905&site=ehost-live&scope=site (Accessed: 27 February 2022).

Workshop: Group Seminar

Following this week’s seminar, we participated in a group discussion with other team members and created a mind map discussing ideas of how to progress and develop our project with differing teams.

In these teams, we had to discuss a question that would make us critically reflect on our project and the way we approach it. Our team’s question was: If you were to remove one element from each of your projects completely, how would this affect the work?

For our project, we took several approaches to this question, thinking about software changes, narrative changes and character changes. For narrative, we discussed how, without the use of comic speech bubbles, the project loses its comic-adaptation style, which takes away from the intended experience. This was challenged in class, however, as someone suggested that adapting the game's aesthetic to a cel-shaded style could retain the comic book feel without inherently needing the speech bubbles.

For software, we discussed how switching from Unity to an engine such as Unreal Engine would make the processes and overall rendering look very different, and would perhaps ease the relationship between Maya and the engine thanks to its easier import and export options. We also discussed that perhaps the most significant change would be swapping the characters from anthropomorphic seagulls to humans, as it would take away the entire comedic value of our project and call the whole concept into question, given how flat and mundane it would feel as a 'human' experience.

In Class Mind Map

One of the major pieces of feedback we received from a classmate was to think about how the entire experience, if not in VR, could be adapted in a way that still feels immersive. They suggested making it a 360-degree experience using physical screens and 360-degree sound through speakers to immerse viewers in the world without the need for a headset, integrating the real and virtual realms in a more literal sense.

Week 5: Collaborative Project

Group Meeting and Notes:

During this meeting, we met the sound design students who would be helping develop the soundscape and effects for the game.

Personal Goals:

• Set up the scenes for the interactions

• Animate cycles

Scene References and Organisation

While beginning the scene set-up process to organise our scenes effectively, I encountered several issues that created some major setbacks in our animation progress. When moving the seagull's global control, the main body mesh separated and offset itself from the main rig controllers and the eyes. This issue occurred in both rigs, and looking into solutions I found that the root cause was the rigs' overall hierarchical structure, which needed to be edited to make more coherent sense.

Lacking an understanding of the rigging process, when I opened the referenced file to investigate the issue I tried to delete the history, which removed several elements and destroyed aspects of the rig such as the wing IKs. In Creating Games with Unity and Maya: How to Develop Fun and Marketable 3D Games, Watkins (2012) stresses the importance of deleting the mesh history before beginning the rigging process, to minimise file size and prevent an excess of nodes; critically reflecting on this, our rigging process will have to be much cleaner in future. In this light, I unbound the skin to reset the mesh and deleted its history, as I had found several issues to tackle.
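The reset itself is only a few commands; a hedged Maya Python sketch with a placeholder mesh name:

```python
import maya.cmds as cmds

mesh = "seagullA_geo"                                         # placeholder mesh name
skin = cmds.ls(cmds.listHistory(mesh), type="skinCluster")
if skin:
    cmds.skinCluster(skin[0], edit=True, unbind=True)         # detach the existing bind
cmds.delete(mesh, constructionHistory=True)                   # clear construction history
cmds.makeIdentity(mesh, apply=True, translate=True, rotate=True, scale=True)  # freeze transforms
```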

Rigging Issues and Solutions

One of the key elements we had to consider when improving the rig was the addition of IK controllers. When creating a test walk cycle with our characters to provide to the VR students, I noticed that, without IK controls in the legs, full-body animation would prove excessively difficult for us in the long run. Taking this into consideration, I researched and followed some YouTube tutorials to gain an understanding of how leg hierarchies work in an IK format.

Following this, I created a test that showed my teammates how this works and how I intended to adjust the lower half of both rigs to improve the mobility and fluidity of the animation.

Seagull A

Applying this to the first rig, Seagull A, whose legs are longer and more reminiscent of real human legs, the implementation was fairly smooth and easy to work out. The real difficulty in achieving 'realism', or anthropomorphised seagull behaviour, will be in the skin weighting and the directional knee bends.

Seagull Movement Study
Skeleton and Joint Creation

I lined up the 'knee' controllers and placed the IK skeleton into the seagull in order to correctly position the joints and attach the new IK legs to the rest of the rig. In getting to grips with joint hierarchies and IK applications, a key element I learned that can affect joint rotation is the 'rotate-plane solver', which changes the way a joint chain follows a pole vector control.

IK Implementation, ‘Rotate-Plane Solver’

While it is not vital for gaining directional control, it is possible to set the IK direction early in the rigging process by bending the 'knee' joint slightly towards the direction of intended influence, which improved my understanding of knee and pole vector rigging practices and should make them smoother in future.

Once the rig was successfully connected in a sensible hierarchical order, the next key step was to create a pole vector constraint and a control to drive the knee pivot. This is a key element of rigging with IK controllers that, in retrospect, could also have been used to create directional control for the wings, and it is something I will strongly consider in future rig builds, especially for more animalistic bipedal rigs.
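Sketching the leg set-up in Maya Python (joint and control names are assumptions about this rig rather than its actual naming): a rotate-plane solver handle runs from hip to ankle, and a pole vector constraint aims the knee at its control.

```python
import maya.cmds as cmds

ik_handle, effector = cmds.ikHandle(
    startJoint="L_hip_jnt", endEffector="L_ankle_jnt",
    solver="ikRPsolver", name="L_leg_ik")

# Build a knee control, snap it to the knee joint and push it out in front.
knee_ctrl = cmds.circle(name="L_knee_ctrl", normal=(0, 0, 1))[0]
cmds.xform(knee_ctrl, worldSpace=True,
           translation=cmds.xform("L_knee_jnt", q=True, ws=True, t=True))
cmds.move(0, 0, 2, knee_ctrl, relative=True)

cmds.poleVectorConstraint(knee_ctrl, ik_handle)      # knee now aims at the control
cmds.parent(ik_handle, "L_foot_ctrl")                # foot control carries the IK handle
```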

Set Driven Key Implementation

A learning step for me was understanding how to use and implement set driven keys. Going back to the seagull movement reference, their feet roll in a similar fashion to human feet; typically in CG rigs, extra attributes are created in the Attribute Editor for the animator's ease of access, to create the most realistic animation possible without breaking elements of the rig itself. So, in creating different controllers for the foot, ball and individual toe joints, I was able to add set driven keys to that group to create foot and toe rolls that can be manipulated in the Attribute Editor. Following the earlier tutorial, creating toe and foot rolls proved an extremely useful tool that will help Marianna and me key aspects of the rig in a clearer, less 'messy' way without animating directly onto the character's joints, whose transformations cannot be frozen.
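A hedged sketch of one such driven attribute in Maya Python, with placeholder names and angles: a custom footRoll attribute on the foot control drives the ball joint's rotation between two keyed poses.

```python
import maya.cmds as cmds

cmds.addAttr("L_foot_ctrl", longName="footRoll", attributeType="double",
             minValue=-10, maxValue=10, defaultValue=0, keyable=True)

# footRoll 0 -> ball flat; footRoll 10 -> ball rolled forward 40 degrees.
cmds.setDrivenKeyframe("L_ball_jnt.rotateX",
                       currentDriver="L_foot_ctrl.footRoll",
                       driverValue=0, value=0)
cmds.setDrivenKeyframe("L_ball_jnt.rotateX",
                       currentDriver="L_foot_ctrl.footRoll",
                       driverValue=10, value=40)
```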

Toe and Feet Controls

Expanding on clean rigging pipelines, I thought it worthwhile to limit the translation and rotation values in order to restrict the motions available to the character so they make logical sense in the world. For example, limiting the X-axis translation across the rig's controls keeps the character proportional and stops the joints being excessively stretched and contorted, which would break the skinned mesh. This will help build consistency between Marianna and me during the animation process, as the 'volume' and shape of the character will stay the same, and it also adds to character believability.
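As a small example of what this looks like in practice (Maya Python, with placeholder control names and ranges):

```python
import maya.cmds as cmds

# Clamp the spine control's translation so the body cannot be over-stretched.
cmds.transformLimits("seagullA:spine_ctrl",
                     translationX=(-2.0, 2.0), enableTranslationX=(True, True),
                     translationY=(0.0, 3.0), enableTranslationY=(True, True))
```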

Transformation Limitations

In addition to the IK leg controls, I also found issues with the way the eye controls were functioning on the seagull character: they were directly parented to the eyes rather than aim constrained, which is a much neater method of rigging eyes as it does not directly affect the translation of the eye sphere, only its rotation.
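A sketch of the aim-constraint approach in Maya Python, assuming hypothetical names and a forward axis of +Z: each eye rotates to look at its own control, and its translation is left untouched.

```python
import maya.cmds as cmds

for side in ("L", "R"):
    eye = "seagullA:eye_{}_geo".format(side)
    ctrl = "seagullA:eye_{}_aim_ctrl".format(side)   # one control per eye keeps the offset stare
    cmds.aimConstraint(ctrl, eye,
                       aimVector=(0, 0, 1),          # eye's forward axis (assumed +Z)
                       upVector=(0, 1, 0),
                       worldUpType="scene",
                       maintainOffset=True)
```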

Old Eye Rigging Example (Before Adjustment)

Below showcases all the rigging adjustments I was able to make on the Seagull A rig, including IK controls and eye rigging controls.

Eye Rigging References

Rigging for Beginners: Eyes and More in Maya – YouTube

https://forums.autodesk.com/t5/maya-animation-and-rigging/eye-aim-constrain-with-rigged-mesh/td-p/8183105

Skin Weight Paint Influence Issues

One of the initial problems I went in to fix was the skin weighting on Seagull A, as both feet had influence over one another, so one could not be lifted without dragging the polygons of the other along with it. Experimenting with different approaches, my first attempt was to paint negative influence onto the affected faces of the opposite foot; however, despite my best efforts, the influence would not remove itself. After trying several solutions, I discovered that the most effective method in this case was flooding the affected vertices with either 100% or 0% influence to get them to behave more realistically in motion. While this worked and I was able to paint the weights into an even and realistic distribution more efficiently, it pointed out several issues in the mesh hierarchy: in flooding these vertices, the wings were also entirely affected and followed the feet rather than the main body mesh. This caused further setbacks in the rigging process but offered an opportunity for some critical reflection on rigging pipelines. Watkins states that during skin binding Maya will attach vertices to the nearest joints on the initial bind, without the physical and biological context we have; since the wings were not attached to the body via a combined mesh, the vertices must have attached themselves to the feet without spatial context (2012, p.303).
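The flooding fix can be sketched in Maya Python as follows (skinCluster, joint and vertex names are placeholders): the problem vertices are forced to 100% weight on the correct ankle joint, and normalisation drops the wrong foot's influence to zero.

```python
import maya.cmds as cmds

skin = "seagullA_skinCluster"                 # skinCluster node on the body mesh (assumed name)
verts = "seagullA_geo.vtx[220:260]"           # vertices that were following the wrong foot

cmds.skinPercent(skin, verts,
                 transformValue=[("L_ankle_jnt", 1.0)],
                 normalize=True)
```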

Skin Weighting

Seagull B Rigging Issues and Corrections

Taking the same developments from my rigging work on Seagull A, I applied them to the Seagull B rig. Having learned the process with Seagull A, I was able to apply these changes faster and with more accuracy, which led to a cleaner overall rig.

Old Eye Rigging (Before Adjustment)

As seen in the video above, the older rigged version of the eyes was not very effective for the animation process, so adjusting them to aim constraints gave us much better control over eye rotation and looks much more realistic. To maintain and preserve the style and character of the rig, instead of creating a single aim constraint that moved both eyes at the same time, I added controls that match the distance offset between the eyes so they are controlled separately, keeping the confused and 'stupid' look of the bird.

Eye Rig Adjustments

Added IK and Set Driven Keys to Rig B

References

Watkins, A. 2012. Creating Games with Unity and Maya: How to Develop Fun and Marketable 3D Games. Burlington, MA: Routledge. Available at: https://search-ebscohost-com.arts.idm.oclc.org/login.aspx?direct=true&db=nlebk&AN=376905&site=ehost-live&scope=site (Accessed: 5 April 2022).

Workshop: Collaborative Seminar

Summary

  • What is it? A VR comic adaptation of the work of the artist 'False Knees'.
  • What is its purpose? To entertain audiences with blunt and existential humour.
  • What are the key features? The player interacts with two seagulls differing in personality and follows them as they converse through a day at the seaside.
  • Role assignments and group responsibilities? Yiran, Callum and Lin are all working on the virtual reality and Unity scripting elements of the project; Marianna and I are doing the modelling and character animation.

Achievement

  • Concept – what is its novelty? It explores a relatively untouched area of virtual reality experience that is not inherently related to games or gameplay, and provides a more accessible way to experience VR.
  • What is driving the narrative? The comic-inspired dialogue and interchanging environments.
  • What is the development process? Developing a relationship between comic art and virtual 3D space in relation to experiences.
  • What is the practical scope? Due to the 3D nature of virtual reality and 3D animation, there is a pre-existing understanding of workflows and pipelines that require animation, modelling and game engine scripting to come together in a finished piece. This collaboration mirrors industry-standard practice.

Target Audience

  • Who is the work aimed at? Young adults who enjoy comic humour
  • What aspects of the work have been chosen due to this? There are elements of language that target an older audience, but also cultural references more likely to be understood by a more recent generation.
  • Why have you chosen this target audience demographic? It is much easier to gain access to and exchange ideas with people within the 18-25 age bracket due to the nature of the university experience. This also means that our group has a shared understanding of the humour involved and of the cultural references.

Technical

  • What techniques are you and your collaborators employing to achieve your goal? We are trying to integrate as much use of virtual reality technology as we can, which includes changing narrative planning such as storyboarding in 3D space. We have also been modelling and animating with more game-like thought processes, keeping things lower poly and animating in loop cycles.

Plans and Timelines

  • What is your timeline for the finalization of your project?

  • How will you test aspects of your project as you progress? Every week we have been having group meetings, in which the VR students bring their own headsets and equipment for us to preview the progress of collating all the scene files together.

Week 4: Collaborative Project

Group Meeting and Notes:

The VR students showcased the midterm critical PowerPoint they intend to show their lecturers, with all our progress to date on it. The Unity environments have been developed, and they showed us their city and beach progress. The storyboard script was also brought to a finalised point.

Personal Goals this Week:

• VR storyboarding

• Adjusting any models

VR Storyboarding

Following on from Week 2's research, in order to effectively plan our gameplay sequence in 3D space, the members of the MA VR course again set up Tilt Brush for us to draw the characters in key moments of gameplay interaction. During this process, Marianna and I (alongside Cal from MA VR) effectively utilised the 3D space to plan the character interactions spatially and aesthetically, which not only gave us a clear outline of how the VR headset works but also gave us a closer insight into how the characters will look and move, which will help with the animation process later in the project.

In the context of non-linear narrative storytelling tools, when approaching the storyboarding for this project we went in with the mindset of an unconventionally structured plot, in which "effects are the direct result of causality" (Bucher, 2017, p.84). This engaged and challenged us in a different way, as it meant that the key moments, while able to follow a chain of differing triggers that allow space for narrative structure, ultimately have to make sense as sequences on their own. As Bucher states, because many VR experiences rely on this non-linear narrative form, they often include elements of traditional storytelling such as the introduction of empathetic characters (Bucher, 2017, p.84). This in itself helped Marianna and me translate our own storyboarding experience of character-driven plots into several broken-down sections that build an overall narrative arc. Viewer engagement with these interactions, rather than gameplay elements, rewards and cultivates the incentive to continue following the narrative, adding a uniqueness to the project's artistic form.

Researching further into the world of games to better understand how many-sided this is, there are several examples of non-linear, narrative-heavy games that depend on player engagement, such as the Life is Strange series (Square Enix) and Telltale's comic-adapted games. A key example is the recent Life is Strange: True Colors (Square Enix, 2021), which gives the player multiple options that change the overarching narrative of its very story-heavy plot, implying intensive story development that is perhaps even more thought out than that of an entire animated film. This game particularly brings to attention the different ways narrative can be explored in a more optional, non-linear fashion, such as interacting with objects in the world (journals, letters, text messages) and even NPC characters, which add depth and development to the world in a way that is entirely dependent on the individual's personal engagement. Taking this into consideration for future projects, as a computer animation artist I think it is important to think about all the different ways experience and story can be created in the realms of 3D.

Player Engagement through Non-linear, Choice-Based Gameplay
Drawing in VR Space

Drawing in 3D Space

Taking traditional artistic elements into three-dimensional space has proved very educational and thought-provoking in terms of technological progression. As stated by Schkolne, incorporating the "unfiltered physicality" of spontaneously hand-drawn art into a traditionally indirect and mathematically enveloped world adds a human element and creates a more artistic impression within technology (2002, p.1). This particular element of the VR world especially drew my interest and has helped me engage more with the medium.

Examples of Tiltbrush Drawings
Example of Tiltbrush Drawing

Capture of the Final Storyboard

Below is the final VR storyboard, and its progression of different narrative segments that build up the overall story, depending on player engagement and triggers. This includes the integration of key objects such as beachballs, sticks, fries and ice cream, which are all created with the intention of being interactable for the player.

References

Bucher, J. 2017. Storytelling for Virtual Reality: Methods and Principles for Crafting Immersive Narratives. Oxford: Taylor & Francis Group. Available from: ProQuest Ebook Central. [Accessed 5 April 2022].

Schkolne, S. (2002) ‘Drawing with the Hand in Free Space: Creating 3D Shapes with Gesture in a Semi-Immersive Environment’, Leonardo, 35(4), pp. 371–375. doi: 10.1162/002409402760181132.