Week 9: Animatic

Following on from last week's storyboard, I created an animatic which pairs the visuals with audio elements such as sound effects and music sourced from freesound.org to build a strong sense of musical timing within my film, as, like the relevant 1930s cartoons, "this performance is primarily audio-visual" and re-created in a "self-sufficient diegetic world" (Crafton, 2013, p. 17). Expanding on that, I focused on timing the sounds to coincide with the movements, in a similar fashion to stop-motion animation. This helped me map out the general movements I intend to animate going into next term.

Ideas and considerations for 3D adaptation:

  • I wish to push further on the animated style explored rhythmically in the animatic, where I time the frames to the music in a more minimal and controlled way.
  • I want to see if I am able to collaborate with sound students to push the audio quality to a more professional level.
  • I may explore potential cinematography changes, especially regarding the “VS” shot.

References

Crafton, D. (2013). Shadow of a Mouse: Performance, Belief, and World-Making in Animation. United States: University of California Press.

Week 8: Storyboarding

Storyboarding and key Story Beats

Perhaps the most essential step of my animation process for this film will be the storyboarding. Heading into this task, I set out with the intention of keeping my cinematography flat, without many varying camera angles, to give it a more '2D appearance' and mimic the view of a real theatre, relating back to the ideas of Vaudeville comedy. A key question Kenny Roy raises in his book How to Cheat in Maya 2014 when considering staging is "how you are going to maintain the high level of communication throughout the life of the shot?" (2014, p. 8). Considering this, I decided that the main element of my story that carries audience communication and directs attention is the spotlight. Due to the lack of changing shots within my piece, the spotlight will act as a cue for the main changes of action between the characters, which will benefit me in the process of creating my shot list.

Story Beats

Beat 1: An introductory black screen which will include the finalised title card designs, such as the skeleton head or a Warner Brothers-style 'Merrie Melodies' card. I may also add effects that mimic rolling film to give a contextual feel to the piece.

Beat 1

Beat 2: [Wide shot] A pin-hole transition opens the scene on the two skeletons standing on a stage with low, minimal lighting. Both skeletons hold a still, automaton-like pose that lacks even the minimal movement of a stationary human being (e.g. breathing).

Beat 2

Beat 3: [Wide shot] The pin-hole transition ends and the two skeletons stand in the centre of the screen. There is very low or no sound, creating a feeling of anticipation and tension. Then, on cue with a beat of sound, two spotlights fall directly onto the skeletons, setting up the ensuing theatrics.

Beat 3

Beat 4: [Close up] Cuts to a close up of Skeleton A's face, looking concerned and worried about what is to come.

Beat 4

Beat 5: [Close up] Cuts to Skeleton B's face, filled with anger and competitiveness; he is more prepared for this battle and determined to win.

Beat 5

Beat 6: [Wide shot] The spotlight falls on Skeleton B, who is raring and ready to go. He dances with fluidity and flair.

Beat 6

Beat 7: [Wide shot] Skeleton B avoids the ‘trap door’ with ease and glides over effortlessly, winning his round of the dance battle thus far.

Beat 7

Beat 8: [Wide Shot] Skeleton A is now at the centre of the spotlight and acts hesitant and confused at the attention.

Beat 8

Beat 9: [Wide Shot] Skeleton A is startled by the trap door that begins stalking him. He jumps over the hole with panic and several of his bones detach from his body in surprise.

Beat 10: [Wide shot] Skeleton A's head detaches and he catches it, gets down on his knees and mimics the famous Shakespearean Hamlet scene.

Beat 9
Reference to Shakespeare’s Hamlet

Using this reference within the theatre environment felt entirely appropriate, and utilising the most famous line of Hamlet, "To be or not to be", makes a clearer and more accessible reference than "Alas, poor Yorick".

Beat 11: [Close up] A still frame shows the skull in the spotlight; text appears reading "To be…".

Beat 11

Beat 12: [Close up] The skull turns to face towards the camera and text appears on screen stating “or not to be”.

Beat 12

Beat 13: [Wide shot] The camera falls back on Skeleton B; he dances with grace once more and with even greater confidence than in his last round.

Beat 14: [Wide Shot] Skeleton B falls into the trap door and falls apart, accompanied by crowd booing sound effects.

Beat 14

Beat 15: [Wide Shot] Skeleton A is winning and starts to dance with more confidence and grace than previously (perhaps utilising a John Travolta 'Saturday Night Fever' reference).

Beat 16: [Wide Shot] Skeleton A falls into the trapdoor and his limbs detach themselves to over-exaggerated comedic effect.

Beat 17: [Wide Shot] Shows the empty stage with a single spotlight left on the trap door/ hole in the floor.

Beat 18: [Pan shot] Pans down through the floor to show piles of previous skeletons, highlighting Skeleton A's hand jutting out of the top of the pile.

Beat 19: [Wide shot] Skeleton A’s hand waves as the pinhole transition closes in on it.

Beat 19

Beat 20: The screen fades to black and the credits start to roll.

Beat 20

Thoughts and Processes for the Following Week

Going into the next pre-production stage of my film, I will complete the project animatic so that I can find the relevant sound materials and see how they time with the overall pace and action of my storyboard thus far, giving me a more solid and stable idea of the dances and movements of my film.

References:

Shakespeare, W. (1948). Hamlet. Cambridge: Cambridge University Press.

Roy, K. (2014). How to Cheat in Maya 2014. Abingdon, Oxon: Focal Press.

Collaborative Project Submission Post

Completed Gameplay Capture

Finalising all the work completed with the MA VR team in the week prior, we were able to produce a 5-minute-long virtual reality experience from a comic book adaptation. With interactions heavily inspired by and directly referencing the works of Joshua Barkman (False Knees), the experience places the player as a seagull amid the unravelling antics of two other seagulls during their day at the seaside. Going through these scenarios, different interactions such as an ice-cream-collecting mini-game, throwing beachballs and hitting seagulls with sticks add interactive options for the player. Due to the immersive and in-depth experience players are afforded within the virtual reality space, it felt important that sound was integrated effectively, so the birds' interactivity with sound, announcing their presence, felt important to player engagement. Below is the final five-minute gameplay capture, displaying these features.

Final Gameplay Capture

Final Gameplay capture with commentary from VR Students

The video below includes the gameplay capture with VR student commentary, outlining more clearly the interactions and gameplay features the player can get involved with.

Critically reflecting on the project, as well as my personal participation in it, the limited time span, the ambitiousness of the project and the number of scenes that required animating created quite a few hurdles due to our lack of experience. Despite these limitations, we were successful in creating a 5-minute-long virtual reality experience that included several interactive elements the player could engage with. While two out of three of these environments were completed, the last, city environment proved to have too many elements requiring additional work that we had not factored into our timeline due to technical issues. Reflecting on my own engagement with the project, while I was able to help with early development including storyboarding, conceptual art and modelling, I felt that my animations could have been pushed to a higher quality, similar to those I created last term. Due to several importing/exporting issues that occurred between Maya and Unity, as well as recurring issues with the rig, the time spent animating was severely limited compared to the initial plan. Due to polygon loss during the exporting process, a lot of the expression and motion in the Maya scene was lost when placed into Unity. The animation itself was also stripped of detail due to rigging issues and last-minute directional changes, forcing me and my team members to focus on meeting our deadline whilst sacrificing nuanced quality. However, taking stylisation into consideration, the minimal amount of animation and lack of realistic detail add to the character design and comedic effect.

Showreel Of Participation

Below is a showreel displaying some of the work I produced over the course of the project. Taking my reflection into consideration, the areas that needed more development were my asset modelling and character animation, as I feel both suffered a lack of quality due to time constraints. To improve my asset modelling I would take more time with texturing, as I felt some of the textures and detail did not match the overall world they were imported into. Initially, I explored using software called Nomad (an iOS sculpting application), but it caused some export issues between Maya and Nomad; it also drastically raised the number of polygons, which would be inefficient to put into a Unity scene and would risk overloading real-time rendering. One process that helps retain high-poly detail on a low-poly model is the use of a 'normal map', which is "a special type of texture that tells the 3D program or game engine to display details on a polygon surface as though they were geometry" (Totten, 2012, p. 106). When working with game engines in the future, I fully intend to incorporate this into my modelling work. In Nomad I was also creating realistic textures that contradicted the textures used in the False Knees game file, which were very minimal and cel-shaded in style.
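
On the normal-map point above, the bake itself can be scripted through Maya's Transfer Maps backend. The sketch below is only an illustration of the idea, not the exact process I used: the mesh names are hypothetical and the settings are placeholder values.

```python
import maya.cmds as cmds

# Hedged sketch: bake detail from a high-poly sculpt onto a low-poly game mesh
# as a normal map via Maya's Transfer Maps backend (surfaceSampler).
# "gull_highPoly" and "gull_lowPoly" are hypothetical object names.
cmds.surfaceSampler(
    source="gull_highPoly",     # detailed sculpt (e.g. imported from Nomad)
    target="gull_lowPoly",      # optimised mesh intended for the Unity scene
    mapOutput="normal",         # bake a normal map rather than diffuse/AO
    filename="gull_normal",     # output image name
    fileFormat="tga",
    mapWidth=1024,
    mapHeight=1024,
)
```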

Miro Board

Cal from the VR team kept a record of each group meeting and of what was discussed and resolved each week:

Meeting Notes (Collaborative Unity) – Google Docs

Throughout the entirety of the project, we were able to communicate and convey our progress and ideas very visually to one another through the use of the Miro board below. This proved to be particularly useful during all group meetings and helped convey ideas very efficiently during class presentations. Below is the completed Miro board.

https://miro.com/app/board/uXjVOTsXnB8=/?invite_link_id=965546776052

Future research areas to improve and engage with the relationship between games engines and computer animation:

Blackman, S. (2011). Beginning 3D Game Development with Unity: The World's Most Widely Used Multi-Platform Game Engine. Apress.

de Byl, P. (2012). Holistic Game Development with Unity: An All-in-One Guide to Implementing Game Mechanics, Art, Design, and Programming. Waltham, MA: Focal Press.

Botz-Bornstein, T. (2015). Virtual Reality. Leiden: BRILL.

Cotton, M. (2021). Virtual Reality, Empathy and Ethics. Basingstoke: Palgrave Macmillan.

Ryan, M.-L. (2003). Narrative as Virtual Reality: Immersion and Interactivity in Literature and Electronic Media.

Collaborative Blog Posts

Module Introduction: Collaborative Unit: Initial Considerations – Esme’s Blog (arts.ac.uk)

Module Introduction: Collaborative Project: Idea and Skill Pitch – Esme’s Blog (arts.ac.uk)

Workshop: Camera Sequencer – Esme’s Blog (arts.ac.uk)

Collaborative Seminar 1: Collaborative Seminar – Esme’s Blog (arts.ac.uk)

Collaborative Seminar 2: Workshop: Group Seminar – Esme’s Blog (arts.ac.uk)

Week 1: Collaborative Project – Esme’s Blog (arts.ac.uk)

Week 2: Collaborative Project – Esme’s Blog (arts.ac.uk)

Week 3: Collaborative Project – Esme’s Blog (arts.ac.uk)

Week 4: Collaborative Project – Esme’s Blog (arts.ac.uk)

Week 5: Collaborative Project – Esme’s Blog (arts.ac.uk)

Week 6: Collaborative Project – Esme’s Blog (arts.ac.uk)

Week 7: Collaborative project – Esme’s Blog (arts.ac.uk)

Week 8: Collaborative Project – Esme’s Blog (arts.ac.uk)

Week 9: Collaborative Project – Esme’s Blog (arts.ac.uk)

Week 10: Collaborative Project – Esme’s Blog (arts.ac.uk)

References

Totten, C. (2012). Game Character Creation with Blender and Unity. Indianapolis, Indiana: John Wiley & Sons, Inc.

Week 10: Collaborative Project

Team Summary:

We all prepared for the final critical presentation with the VR students and developed our own slides. Marianna and I also completed all animations for the beach scenes and imported them successfully into Unity with few further issues, apart from slight mesh and aesthetic setbacks that do not directly affect gameplay.

Final Critical Presentation with VR students:

Collaborative Project – Final Crit Presentation – Google Slides

As it was the final week of the development of our project, we compiled presentations with the virtual reality students for their final critical presentation, in order to be given constructive feedback by their professors. During this process, Marianna and I compiled some slides to contribute to the overall presentation, in which we showcased and examined the different sections of our collaboration, such as the initial concept art and modelling, the character rigging, the speech bubble interactive design and the animation process. This was a very useful exercise, as we were able to compile all the work we had completed together as a team and present it to an audience with knowledge of the areas of potential improvement. While the presentation was more centred around the VR students and their Unity build, so we did not have the space to go into great depth about our work on the 3D animation side of the project, we were given the chance to summarise how our contribution was vital to the completion of the project, and it gave us a chance to critically reflect on every aspect of production.

Modelling and Animation Slide
Rigging Slide
Speech Bubble Design
Animation and Modelling

Critical Reflection and Feedback from VR Lecturers

Starting from the fact that we were able to produce a 5-minute gameplay sequence over the course of only 10 weeks, I would argue we achieved a lot that was very useful and educational in the field of 3D work. I feel our particular successes were in the overall design and story-driven elements, due to our concentration on the narrative drive. I feel exploring the method of storyboarding within the 3D environment itself proved massively beneficial on a personal level, helping me gain an understanding of exactly the realm in which we were immersing ourselves and our skillsets.

Reading the work of Bucher, he brings to attention two questions to consider when approaching storytelling (Bucher, 2018):

Who is the audience?

What is one thing you want them to walk away with?

When critically reflecting on these questions in regard to our narrative experience, there are two answers regarding what we have produced. The intended audience is people over 15, but more specifically people who enjoy the narrative experience of a comic book. The addition of swearing and comedic violence raises the question of exactly which age group this falls into, but I feel the real audience is broader than a specific age. The game's ability to draw older audiences into more classically childlike narratives raises the question of VR and animation's potential as a storytelling medium for more than pure video game entertainment. The game's purpose, in my opinion, reflecting on what we achieved, was predominantly as a learning exercise to teach us effective collaboration across disciplines. Beyond this, I think it serves a grander purpose of pushing the exploration of non-gameplay-heavy experiences that can be created in virtual reality: experiences that embody and underline the value of there being no expectation or pressure placed on the player to do well in order to succeed.

In terms of collaboration, one of the primary pieces of feedback we received during this critical presentation was how well coordinated we were as a team and how well we communicated in order to create what we did. I feel this came from our regular communications and weekly meetings. On many occasions I also called teammates using Microsoft Teams to keep the workflow effective without having to be present in the same room.

An interesting comment which changed my perspective on our gameplay narrative was that the speech bubbles seemed ineffective in their current state, especially with the audio present. Going forward with the project, it seems imperative to design these more effectively, and perhaps to have the bubbles follow the player so they are always readable and facing the viewer.

Relevant tutorial to explore this:

(51) How to make any object looking at and facing towards the player in Unity – YouTube

The major point of reflection in regard to the entire gameplay sequence is the lack of progress we were able to make on the last city scene. If the workflow had been more researched and understood prior to starting the project, I feel this would have been achieved much faster, as the primary issues that set us back were the Maya-to-Unity conversions. On reflection, though, the entire process was extremely beneficial educationally and inspires me to continue progressing in the field of games animation in future. Since the rest of the team and I mutually agree on combining our efforts to finish the project outside of graded assessment, I feel this will push us all to create a finished piece that can be used as strong showreel material.

References

Bucher, J. (2017). Storytelling for Virtual Reality: Methods and Principles for Crafting Immersive Narratives. Oxford: Taylor & Francis Group. Available from: ProQuest Ebook Central. [Accessed 7 Feb 2022].

Week 9: Collaborative Project

Meeting Notes and Weekly Goals:

This week's meeting covered how the rig issues that previously prevented the animation stage from starting are now at a workable stage, meaning all beach scene animations are a priority and should be completed within the course of the week. As well as this, good progress was made on the ice cream mini-game and on the development of the Unity/Maya rig tests.

This week’s Aims:

• Complete all personally assigned beach scene files (Interaction 4)

• Resolve any last issues that may prevent gameplay function in Unity

Issue Resolution: Non-Functioning Rigs in Unity

An issue Marianna encountered when animating one of the interactions was that the animation, when imported into Unity, would not apply to the models. After meeting with the VR students, the importance of finalising the rig and model before importing and animating sequences intended for Unity became apparent.

Anim File Export

Using a process I discovered last term, when I had a similar issue with a non-functioning rig containing well-developed animation information, I exported Marianna's animation for interaction 3 as an .anim file and imported it onto the most updated version of the rig, in which I had fixed some issues regarding the rigging and skin weighting of the beak. While most of the animation transferred safely, I had to re-animate the beak speaking in time with the relevant audio, as that information could not be transferred onto controls that did not previously exist. Despite this, a resolution was reached and the result functioned just fine in Unity after a Microsoft Teams meeting with our teammate Cal, during which we tested directly how well the process worked. Unlike animated films, which can use a referenced rig that updates alongside the animation information, games require a finalised rig before animation information can be applied in the engine. Proceeding with this knowledge in future, I will help create a cleaner pipeline in which I and the other animators are definitely working with the same rig. Such issues will also become less frequent as my and my future collaborators' rigging knowledge grows more in-depth, especially regarding how rigs function in engines such as Unity.
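
For reference, the rough shape of that transfer can be scripted. This is only a sketch of the idea, assuming hypothetical control-group and file names, and that Maya's animImportExport plug-in is available.

```python
import maya.cmds as cmds

# Sketch only: move keyframe data from an old rig onto an updated copy of the
# same rig via a .anim file. Control-group and file names are hypothetical.
cmds.loadPlugin("animImportExport", quiet=True)

# 1. In the scene containing the original animation, select the animated
#    controls and export their curves to a .anim file.
cmds.select("oldRig_controls", hierarchy=True)
cmds.file("D:/collab/interaction3.anim", force=True,
          type="animExport", exportSelectedAnim=True)

# 2. In the scene containing the updated rig, select the matching controls and
#    import the curves. Controls that did not exist on the old rig (e.g. the
#    reworked beak) receive nothing and have to be re-keyed by hand.
cmds.select("newRig_controls", hierarchy=True)
cmds.file("D:/collab/interaction3.anim", i=True, type="animImport")
```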

Anim File Imported on Updated Rig

Animation

As mentioned by Fothergill and Flick in their work on the ethics of human-chicken relationships in video games, the idea of chickens as symbols of cowardice and domesticity is strongly associated with their comedic effect (2015). In the context of our project, seagulls are widely viewed as irritating and inconvenient to humans on a daily basis, which gives a further implicit justification for the use of violence in-game. A key aspect of the gameplay for our project is the ability to pick up objects in the environment and use them to attack the seagulls, especially in this interaction, and this brings to attention an important aspect of the design. Due to society's embedded behaviours, it seems obvious to a player, without specific instruction, that the intention of the game is to inflict violence on the bird causing you the most personal irritation. This is expanded on by Bucher, who suggests that a key element in a player's ability to advance in a non-linear narrative is the "continued reliance on linear logic", which in its own way implies that the player themselves, rather than a camera, is the key driver of the narrative (2018, p. 84). I feel this societal association works to our benefit for this particular project; however, it leaves questions for more ethical consideration in future. Violence in video games is taken as a given, even in its comedic form, and especially in a game whose aesthetic appears harmless, it hints at underlying factors that could give the product a problematic ethical reading. This is especially true in the first-person, fully immersive experience of the Oculus Rift, where violence feels far less distant than simply watching a narrative play out through the confines of a screen. As discussed by Matthew Cotton, immersive VR systems are considered "empathy-arousal" tools that can be utilised to 'stimulate empathetic engagement towards marginalised or vulnerable peoples' (2021, p. 113). In this instance, using violence for comedic effect brings to attention a line that perhaps should be drawn, and that will be beneficial to consider, especially given what it represents in anthropomorphic characters that embody some kind of human experience. This being said, seagulls in our current climate are in some danger, as environmentalists have brought to attention, and the game may therefore spread a negative environmental image to players.

File 1: Bird Dialogue Interaction

The nature of this interaction is to engage the viewer in their place as a 'seagull' who is part of the conversation at hand. The use of the 'stick', a piece of washed-up driftwood for the player to take first-hand, was something interesting to consider when approaching this piece, as the use of constraints in Maya would not necessarily function the same way in Unity. Taking this into account, I broke the animation down into 5 separate FBX files that the VR students could then trigger based on player interactivity. File 1: the triggered initial dialogue between Seagull A and B. File 2: the looping animation, inviting the viewer to take the stick. File 3: the interactions between Seagull A and B after the stick has been taken. File 4: a looping animation that awaits the player hitting Seagull A with the stick. File 5: the interactions between Seagull A and B once this action has been triggered.
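
A sketch of how those splits might be exported in a repeatable way is below. The frame ranges, rig group names and paths are placeholders, and it assumes Maya's fbxmaya plug-in is loaded; it illustrates the approach rather than the exact script used.

```python
import maya.cmds as cmds
import maya.mel as mel

# Sketch only: export one FBX per interaction segment so each clip can be
# triggered independently in Unity. Ranges, names and paths are placeholders.
clips = [
    ("file1_intro_dialogue", 1, 180),
    ("file2_stick_loop", 181, 240),
    ("file3_stick_taken", 241, 420),
    ("file4_hit_loop", 421, 480),
    ("file5_final_dialogue", 481, 650),
]

cmds.loadPlugin("fbxmaya", quiet=True)
cmds.select("seagullA_rig", "seagullB_rig")  # hypothetical rig group names

for clip_name, start, end in clips:
    # Bake only this clip's frame range into the exported FBX.
    mel.eval("FBXExportBakeComplexAnimation -v true")
    mel.eval("FBXExportBakeComplexStart -v {}".format(start))
    mel.eval("FBXExportBakeComplexEnd -v {}".format(end))
    cmds.file("D:/collab/beach/{}.fbx".format(clip_name),
              force=True, type="FBX export", exportSelected=True)
```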

File 2: Looping Animation

Breaking these down into different files proved very useful for the VR students when coding the different interactions into the Unity file. However, I feel this is something we should have considered earlier in the project, as the structural 'shot' planning could then have been produced more effectively.

File 3: Interaction and Dialogue

File 4: Looping Animation Awaiting Player Engagement

File 5: Last Dialogue Sequence of the Interaction

Speech Bubble Design for Sky Scene

For the sky environment scene, the only 'animation' required was the speech bubbles that will be interacted with in the sky space. For this, I used my previous designs and worked in cooperation with Marianna to create bubbles that appear rudimentary in concept but, paired with movement, have a charm to them. Reflecting on our overall use of speech bubbles in this project, I feel we could perhaps have considered character and emotional portrayal more in the linework involved. For example, for the 'stupid' seagull we could have created the lines in a way that encapsulates his simplicity, such as lots of soft, curved lines, whereas we could have used sharper, more dynamic lines for Seagull A to accentuate his frustration and greater depth of thought and feeling.

I think that by incorporating a more 'foul' linguistic approach, the piece tries to separate itself from a younger audience despite its inherently childlike aesthetics, which will perhaps assist in pushing the association of animated media into the adult world.

Critical Reflection on Animation Process

During this week we were productive in our endeavours to complete the first three scenes, the Beach, the Pier and the Sky, which was our minimum aim for the 10-week project. This being said, we still had not progressed enough to begin animation in the city environment. Going forward with games student collaborations, I feel I now have a much deeper understanding of how Maya works in relation to game engine requirements; I will allow much longer, more developed contingency time for aspects such as rigging, and will produce animated FBX file tests much earlier on to identify issues sooner.

References

Cotton, M. (2021). Virtual Reality, Empathy and Ethics. Basingstoke: Palgrave Macmillan.

Fothergill, B. T. and Flick, C. (2016). The Ethics of Human-Chicken Relationships in Video Games: The Origins of the Digital Chicken. SIGCAS Computers and Society, 45(3), 100-108. DOI: https://doi.org/10.1145/2874239.2874254

Week 7: Title Card Development

Inspiration

Taking inspiration from 1930s cartoon title cards, I wanted to add both an opening and a credit sequence to my animation that bring a nostalgic, old-world theme. I feel this will be a succinctly relevant aesthetic tool for aiding the contextual associations of my project.

https://youtu.be/LPW70q4w5pw

Test 1

There are heavy nostalgic associations with the early Warner Brothers cartoons and their famous circular title card, and my rendition almost makes a direct reference to the 'Merrie Melodies' cartoon openings. Historically these are very similar to the Silly Symphonies in the way they explored music and sound in relation to animation. The use of the 'tunnel' within 3D space will also help signal the piece's status as a 3D animation without being overtly obvious, due to the angle from which it is rendered.

Initial Render Tests

Finalised Look Tests

Test 2

An interesting point made by Goldmark is that the 1930s musical cartoon genre was so heavily associated with Disney that the "synchronisation of music and action" was referred to as "mickey-mousing" (2005, p. 6). Taking this into consideration, I looked at the heavy branding around Mickey Mouse and noticed that the title cards of a few early 1930s Disney cartoons feature stills of Mickey Mouse's face. To replicate this in a way that encourages nostalgic association, I attempted to use my own character's face in a way that both celebrates and mocks the characterised branding of this era. I also wanted to include movement that mimics how his head rotates, similar to a globe, in a way that accentuates the 3D nature of the film early on.

Example of Mickey Mouse Branding Title Card
Recreated Scene
Playblast

Editing

In order to fully utilise the contextual setting, I have taken these clips, added a slow cross-fading transition, and colour graded them black and white. This editing style mimics that of the clips above, such as Wild Waves (1929), and places the piece within the genre, while also making use of the smooth, clean nature that computer-generated animation can provide, as well as post-editing effects such as motion blur, to give a more visually pleasing look to a genre associated with outdated technology.

Sound

https://www.dailymotion.com/video/xjri6w

A major consideration regarding the music is the context of the year. As stated in Tunes for 'Toons, "Carl Stalling's extraordinary influence on cartoon music as a whole suggests a host of possible avenues to explore" (Goldmark, 2005, p. 7). Looking into Stalling's work, he predominantly used traditional orchestral instruments such as flutes, violins and brass. Comparing this to my main source of inspiration, Disney's The Skeleton Dance, there are several things in the sheet music I considered going forward.

A key aspect here is the Allegretto tempo, which implies the piece should be played fairly quickly and briskly. The 4/4 time signature also allows the pacing of the animation to be timed more simply, due to the even number of frames each bar allocates musically. There is also heavy use of percussion within this piece, and the creative use of a xylophone to replicate the sound of bones.
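
To make that timing concrete, frames per beat is a simple calculation. The sketch below assumes a 24 fps animation and a working tempo of 120 bpm; both figures are planning assumptions rather than final values.

```python
# Sketch: frames per beat (and per 4/4 bar) for timing animation to the score.
# 24 fps and 120 bpm are assumed working values, not final ones.
fps = 24
bpm = 120

frames_per_beat = fps * 60 / bpm      # 12 frames per beat at these values
frames_per_bar = frames_per_beat * 4  # 48 frames per 4/4 bar

print(frames_per_beat, frames_per_bar)
```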

One of my initial ideas for the opening composition was to make a direct audience association with black-and-white cinema through the famous tune 'The Entertainer', which I have adapted in my own rendition below and transposed into the same key as the music of my film. However, I did not think this fitted my opening as effectively, as it was very slow and seemed to contrast with the overall soundtrack I had initially planned, making the film seem more relaxed in nature.

When composing the opening, I wanted to place more stress on the key instrumentation and the Allegretto tempo, to make a musical 'clash' (symbolic of the skeletons' 'battle') that arrives at a harmonious conclusion. I felt this was more audibly symbolic of my piece and also more original as a piece of work.

References

Goldmark, D. (2005). Tunes for 'Toons. Berkeley and Los Angeles, California: University of California Press.

Workshop Week 8: Case Study Recreation

This week's workshop involved looking at and trying to recreate ideas and techniques from the work of Lucas Zanotto, who creates simplistic looping animations that are very effective in their conveyance.

https://www.visualatelier8.com/art/2020/5/lucas-zanotto-yatatoy-kinetic-characters

I started by creating a MASH network to generate cylinders spaced an equal distance from each other, as sketched below.
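
The rough steps can be scripted; the following is only a sketch of the set-up, assuming the MASH plug-in is available and using placeholder values for the count and spacing.

```python
import maya.cmds as cmds
import MASH.api as mapi

# Sketch only: build a MASH network that repeats a cylinder at equal spacing,
# similar to the evenly distributed shapes in Zanotto's loops. Counts and
# distances are placeholder values.
cmds.loadPlugin("MASH", quiet=True)

cylinder = cmds.polyCylinder(radius=0.5, height=4, name="loopCylinder")[0]
cmds.select(cylinder)

network = mapi.Network()
network.createNetwork(name="cylinderMASH")

# The default Distribute node lays instances out in a line; set an even count
# and spacing along X (assuming MASH's "<name>_Distribute" naming pattern).
cmds.setAttr("cylinderMASH_Distribute.pointCount", 6)
cmds.setAttr("cylinderMASH_Distribute.distanceX", 10)
```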

Week 8: Collaborative Project

Meeting and Notes:

Interactivity such as teleportation and grabbing was completed. The rig has also been predominantly fixed, and animation has begun in full swing. Music for the minigames is complete and the scripts on the VR side are going well.

Personal Goals:

• Create shot list

• Animate Interaction 2

Considering the animation required for this project, and also considering time efficiency, we collectively agreed that the best approach would be minimal animation that gives the overall emotional impression without extensive detail. Despite my best efforts to adjust and fix the rig, time constraints no longer allow us to keep adjusting it, so facial animation will also be kept to a minimum. This, however, appears to suit the process, as conversion from MB to FBX compresses and readjusts the model until it displays at a lower poly count and looks rather different from Maya in terms of mesh presentation.

Animation Development

Developing from last week's ideas, we began to structure the animation files and rigs into different layers to help us animate in a more organised and structured manner. This will also assist us if we need to make any changes quickly without directly affecting the rest of the animation. Reflecting further on best industry practice in this respect, the use of layers is a fantastic way to explore alternate versions of your animation without heavy commitment. Roy expands on how the "weight" of an animation layer can be adjusted in a similar way to the opacity of layer groups found in Adobe Photoshop (2014, p. 284).
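
As a small sketch of that idea (control and layer names are hypothetical, and this illustrates the principle rather than our exact set-up), an animation layer can be created, populated and blended down in Maya like so:

```python
import maya.cmds as cmds

# Sketch only: put an alternate version of a movement on its own animation
# layer and blend it in or out with the layer's weight, much like layer
# opacity in Photoshop. Names are hypothetical.
layer = cmds.animLayer("headBob_alt")
cmds.animLayer(layer, edit=True, addSelectedObjects=True)  # add the selected controls

# Keys made while this layer is active land on the layer; its weight then
# scales how strongly the layered motion blends over the base animation.
cmds.animLayer(layer, edit=True, weight=0.5)
cmds.setKeyframe(layer + ".weight")  # the weight itself can also be keyed
```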

Animation Layers and Organisation

Shot Lists and Planning

I thought an imperative first step in the animation process was creating a shot list, in order to evenly distribute the work into clear, distinguishable sequences based on dialogue. This in itself proved a challenge: compared with my prior experience of scenes and shots, it was difficult to pin down how to separate the different aspects of the animation due to its non-linear narrative and structure. In the end, I broke it down into the key environments and the key interactions between characters, in the order they were storyboarded. Following on from this, I assigned the shots between myself and Marianna, prioritising those in the beach scene as that was our main goal for the project.

Shot List for Gameplay Interactions

I felt it important to also add the idle animations such as walking and ‘breathing’ due to the constant nature of real-time rendering, and these animations will be used in cycles to help the flow of gameplay.

Animation

Thinking about the stylistic interpretation of the speech bubbles, how they move in relation to the bird models, and the player's POV when adjusting motion, much of this will have to be controlled through Unity's camera. Because of this, the interactions will potentially be structured in Unity by 'teleporting' the player to different spots where visibility is relatively confined within the game space. Beginning the animation process for this VR experience, I had to adopt a different mindset from the one I had harboured for past projects, as the animation itself depends on player interactivity, with different sequences 'triggered' by gameplay elements. The animation is also created with the intention of being viewed from multiple angles, with less regard for cinematography than in an animated film.

Cycled Animation at the End of Narrative Sequence

Bearing in mind player interactivity, in theory the interaction should 'cycle' until the player has clicked or progressed through the narrative to trigger the next one. In the event of that not happening, the seagulls need to keep up continuous movement to maintain a level of realism and character believability. If the characters remained static after every interaction, it would ruin the world-building believability and draw attention to flaws in our immersive experience.
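
In Maya, the simplest way I know to set a clip up to repeat like this is with curve infinity; a small sketch with hypothetical control names is below. The first and last keys should match so the loop does not pop when it wraps.

```python
import maya.cmds as cmds

# Sketch only: make the idle/sway animation repeat past its last keyframe so
# the exported clip can loop while it waits on player input. Control names
# are hypothetical.
idle_controls = ["seagullB_body_ctrl", "seagullB_head_ctrl"]

cmds.select(idle_controls)
cmds.setInfinity(preInfinite="cycle", postInfinite="cycle")
```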

Reflecting on my finished shot, I feel I was able to produce a character-driven performance that is suitable for the lower-poly models imported into Unity; however, there are several aspects I would improve on if given the opportunity to fine-tune our project at a later date. The first would be the lack of 'finger' and wing movement detail, as it adds a flatness to the characters that makes them appear more like inanimate figurines than anthropomorphised seagulls. I also feel more follow-through, offset and secondary action could have been included in this animated interaction, especially in Seagull B as he sways to the music, where weight could be conveyed by offsetting his head from his body. This would also accentuate the 'birdlike' flexibility between the head and neck, as birds have more neck vertebrae, which allows for sudden neck and head movements.

Gull Skeletal Structure

The facial expressions are also lacking in blinks and well-developed eye darts; however, the limited detail in these expressions is predominantly due to rigging issues experienced with the eyelids, which continue to be offset from the rig despite revisions to the mesh hierarchies. With these limitations and the lack of contingency time to fix them, I think the monotonous lack of expression in its own way suits the characters in their blunt emotional depth, and only exemplifies Seagull B's 'stupidity'. In facial animation studies, a lack of upper-facial animation (brows, eyes, eyelids) can create an uncanny effect that strips the naturalistic element from movement and creates viewer discomfort (Tinwell, 2015). However, due to the simplistic design of the seagulls, which does not strive for photorealism, the lack of expression is passable, as less animation is required to sell the characters' performance, and the aesthetic design avoids falling into the realms of the Uncanny Valley.

Animation Exported into Unity

With last week's animation import issues resolved, the sections Marianna and I animated over the past week seem to function well in Unity. The major issue that still needs to be tackled is the mesh export problem. While there are ways to improve the polygon interpenetration, the overall smoothing seems to be lost, specifically on Seagull A and the wings of Seagull B. That said, I feel the ruggedness of the birds adds a rough element reminiscent of childlike drawings, suggesting a more 'undeveloped consciousness' that contrasts visually with the conversations at hand. Even so, I intend to look further into solutions surrounding Unity's polygon processing in the following week.
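
One avenue I want to test is the FBX exporter's smoothing settings, so Unity receives smoothing information rather than recalculating or discarding it. The sketch below lists the export options I plan to try; it assumes the fbxmaya plug-in, and the mesh name and path are placeholders.

```python
import maya.cmds as cmds
import maya.mel as mel

# Sketch only: FBX export settings to test so smoothing data survives the trip
# into Unity. Mesh name and path are placeholders.
cmds.loadPlugin("fbxmaya", quiet=True)

mel.eval("FBXExportSmoothingGroups -v true")  # write smoothing groups
mel.eval("FBXExportSmoothMesh -v true")       # export the smooth mesh preview
mel.eval("FBXExportTangents -v true")         # include tangents and binormals

cmds.select("seagullA_geo")                   # hypothetical mesh name
cmds.file("D:/collab/seagullA_smoothTest.fbx", force=True,
          type="FBX export", exportSelected=True)
```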

Potential Video Guidance for Next Week

(51) How to Actually optimize your game in Unity – Complete Game Optimization Guide – YouTube

(51) Unity Probuilder : Smoothing Tool – YouTube

Animation Interaction (with voice-over audio included)

References

Roy, K. (2014). How to Cheat in Maya 2014. Abingdon: Focal Press.

Tinwell, A. (2015). The Uncanny Valley in Games and Animation. Florida: CRC Press.

Fothergill, B. T. and Flick, C. (2016). The Ethics of Human-Chicken Relationships in Video Games: The Origins of the Digital Chicken. SIGCAS Computers and Society, 45(3), 100-108. DOI: https://doi.org/10.1145/2874239.2874254

Week 6: Sound Development

Due to the heavy inspiration my project takes from early animated features such as the Silly Symphonies, there is already a direct reference to be made to the instrumentation generally used in the classical music genre. As stated by Goldmark, many people in America "attribute their first conscious memory of the classical repertoire to cartoons" (2005, p. 107). This in itself could benefit my animated project's overall aesthetic simply through contextual association and theme. An interesting point that Goldmark mentions is the way cartoons of the period integrated and commercialised the popular concert music and opera often featured in them, which subtly created timeless pieces that span several eras and have been sustained through popular animation (2005, p. 108).

Musical References

Starting to develop a musical motif, I made note of the traditional instruments heard in The Skeleton Dance, as well as using a similar key to replicate the tonality. The key instruments I wanted to explore were the bassoon and the flute, as they add a realistic, non-digitally-enhanced clarity that is often heard in the early Disney Silly Symphonies.

Due to time limitations and a general lack of musical knowledge, I have taken these segments of experimentation and contacted a friend who has more experience in musical composition.

With all of the above practised and considered, Goldmark makes a compelling point that Disney's overuse of the same classical canon does not expand or test the limits of the relationship between moving image and score (2005). It may also be relevant to consider MGM's historical use of classical instruments to convey the sounds and motions of violence and chases, and how this could later be applied to my own work.

References

Goldmark, D. (2005). Tunes for 'Toons. Berkeley and Los Angeles, California: University of California Press.

Week 7: Collaborative Project

Meeting and Notes:

We discussed the major setbacks and progress each of us had made. One of the scenes in Unity was entirely created and ready to be played and interacted with in VR space. Marianna and I still encountered issues with our rig, as well as import/export issues in Unity, which we aim to resolve for next week.

Goals:

• Finalise the rig

• Resolve any further issues that prevent animation

Speech Bubble Exploration

Exploring the ways in which the comic bubbles could be interpreted and translated into 3D space, I wanted to think about player interactivity as well as aesthetic considerations that emphasise gameplay. Balancing design elements with gameplay elements, the VR team intend to create interactable speech bubbles that a player can grab and use as potential weaponry against the seagulls. Thinking about this as a graphic element, my initial idea was to create one singular mesh containing both 3D text and a larger, almost spherical bubble that the player can grab. Looking at the original False Knees comic as a reference, the bubbles are very basic in design and did not inspire much creativity in me, apart from the varying line weight, which added a hand-drawn aesthetic that I wanted to replicate in 3D.

False Knees Comic Bubble Reference

Below is the initial test I produced to explore the speech bubbles' aesthetics and movement. The test proved mildly unsuccessful, as despite the 3D environment I felt the bubbles' appearance was still flat. Going forward from this, I looked at references of different artistic interpretations of speech bubbles in old animated films and games to find design aspects I could translate into 3D space.

Initial Speech Bubble Test

One particular design that stood out to me is the credit sequence of One Hundred and One Dalmatians (Geronimi, C., Luske, H. and Reitherman, W., 1961), as I found the abstract outlines and shapes an interesting format that replicated the spots on the dogs. Taking this animalistic concept into account, I thought this might be an interesting idea to explore with the seagulls. The key element I took from it, however, was the thick abstract lines and shapes with hand-made, rough edges that do not appear as perfect as my initial bubble.

Speech Bubble Inspiration

Developing from this, I created a speech bubble with several different layers that can be interacted with. With the intention of creating a more 2D effect in 3D space, I added a heavy line replicating traditional comic book line art. Using the pencil curve tool, I was able to give the linework a 'drawn', more imperfect quality, which seemed, in my opinion, to fit the comedic, comic-derived goal of the production.

Pencil and Curve Toolsets
Developed Speech Bubble with Layers

Below, I recorded and shared this process with my team members so that the style and process of the developed speech bubble were easily accessible and could be recreated in the same style to keep consistency across our animations. This process also allowed me to explore different methods of NURBS-to-polygon conversion within Maya.
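
The core of that conversion can be sketched in a few commands. The curve name below is a placeholder, and the snippet assumes the bubble outline is a closed, planar NURBS curve; it illustrates the idea rather than the exact steps recorded in the video.

```python
import maya.cmds as cmds

# Sketch only: turn a drawn, closed NURBS curve (e.g. a speech-bubble outline
# made with the pencil tool) into polygon geometry. Names are placeholders.
outline = "bubbleOutline_curve"

# Fill the closed curve with a planar NURBS surface...
surface = cmds.planarSrf(outline, name="bubbleSurface")[0]

# ...then convert that surface to quads, keeping the result light enough to
# stay friendly to a real-time Unity scene.
bubble_poly = cmds.nurbsToPoly(surface, polygonType=1)[0]
bubble_poly = cmds.rename(bubble_poly, "bubble_geo")
```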

Process of Creating Stylised Speech Bubble Text

Critically thinking about the process of integrating these bubbles into an interactive and three-dimensional space, there may be some impracticalities in the design aspect as the layered spaced elements of the design could not work well in a 360-degree environment in which all angles of our objects can be explored by players, so this in itself presents challenges on camera and object tracking within the virtual camera. Taking accessibility into account also, we wanted to pick a font that would be easily readable for viewers that may not be able to hear or easily understand the audio and voice acting of the birds, so we explored a style that was simple to read and well-spaced out, but also fitted the ‘hand drawn’ visual to match that of the created 3D lines.

Video References

(49) How to convert/change curve into a polygon in Maya 2020 for beginner – YouTube

Mesh Collision Issues

Expanding on last week's issues involving missing mesh faces upon Unity import, in an attempt to explore the problems surrounding the deleted faces, I looked at the original Maya model in unsmoothed mode to replicate its appearance in Unity. I found that the vertices seemed to collide unnaturally with one another, and I adjusted them so that they were more in line with each other, in the hope that they would not collide upon import.

Wing Mesh Interpenetration Issues on Animation Export

Mesh interpenetration
Vertex Manipulation

The method eventually proved effective; however, it caused a few issues in which the wing looked abnormally big and upset the overall volume and proportions of the model. This naturally takes away from any sense of 'realism' we may have wanted to achieve in character believability, but it temporarily seems to fix the issues regarding mesh collision.
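
If the problem resurfaces, a scriptable alternative I may try is a standard pre-export clean-up pass, welding near-coincident vertices and removing non-deformer history, rather than repositioning vertices by hand. The sketch below uses a placeholder mesh name and is an idea to test, not a confirmed fix.

```python
import maya.cmds as cmds

# Sketch only: an alternative clean-up pass to try before FBX export, instead
# of moving colliding vertices manually. Mesh name is a placeholder.
mesh = "seagullB_wing_geo"

cmds.polyMergeVertex(mesh + ".vtx[*]", distance=0.001)  # weld near-coincident vertices
cmds.bakePartialHistory(mesh, prePostDeformers=True)    # delete non-deformer history,
                                                        # keeping the skinCluster intact
```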

Resolved Mesh issues

Scene File Setups and Professionalism

Now that the rigs are in a functioning state and ready to be animated, I thought it imperative for production to set up a scene that both Marianna and I can use as a base for our animations, so they are scaled exactly the same and sit in the exact same spatial position. This will improve the production pipeline: when the different interactions are coded into the Unity project, the characters will not drastically jump apart from one another, preventing any jarring cuts between character and gameplay interactivity.
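
One way to keep that base consistent is to reference the shared set-up file into each animation scene rather than copying it; the path and namespace below are placeholders, and this is a sketch of the approach rather than our finalised pipeline.

```python
import maya.cmds as cmds

# Sketch only: bring the shared beach set-up into a new animation scene as a
# reference, so every animator works against the same scale and positions.
# Path and namespace are placeholders.
cmds.file("D:/collab/scenes/beach_interaction_setup.ma",
          reference=True, namespace="beachSet")

# If the shared file is updated later, existing animation scenes pick up the
# change on load instead of each animator editing a private copy.
```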

Project Folders
Beach Interaction Set Up
Restaurant Scene Set UP