Experiential Design Final Task

Melvin Yung Khun Yew | 0357241 | Bachelor of Design (Hons) in Creative Media

MMD 60204 | Experiential Design
Week 12 — Week 15


Task 4: Final App Compilation

Carrying on from our MVP progress on the BodyBuddy app in Task 3, my teammate Lin Si Yan and I proceeded with our original plan to complete the full app experience by adding the quiz feature. It aims to challenge fellow organ learners to test their knowledge of the organs they've learned about during the interactive AR organ learning experience with the physical model.

App mockup



These are the jump links to each part of this portfolio



Instructions

Mr Razif Mohammed, my lecturer for the Experiential Design module this April 2025 semester, gave us a heads-up on the upcoming tasks and exercises.



MIB - Module Information Booklet


    Work Process

    In our final task, Lin Si Yan and I joined forces to further refine our product and compile our work into one complete, working mobile app. This is also where we built our Unity app for our own mobile platforms, iOS and Android, to make our educational app universal and accessible to all.

    In this blog, I'm going to share my contributions to our final task. To finalize our app into its full intended experience, I took on the final feature of our app, the Quizzes mode, as well as improving other aspects of the app, such as screen-scale responsiveness and a more user-centric design.

    What are the changes/improvements/additions?

    Before we began, I discussed with my groupmate and decided to tackle the final app feature, the quizzes mode, while Si Yan would compile our progress into one single project file, combining her organ learning feature with my organ assembly and quiz features, and ensuring our features work coherently together.

    I separated the quiz feature into three Unity scenes: the tutorial pages, the AR image target quiz mode, and the quiz result page.

    The three scenes for Quizzes


    Tutorial Page

    As with my previous assembly feature, on-screen tutorials and guides are shown when the user enters the quiz feature, guiding users through what will be conducted in this session. Reusing the scripts and components from my MVP process, I altered the tutorial guide and the animations presented on screen as visual cues.

    Animation process using keyframes to change alpha, position, and rotation

    AR Quizzes

    Taking reference from my assembly feature, I set up the image target function required for answering the quizzes.

    Image targets used
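
    As a rough sketch of how a scanned image target can feed into the quiz logic, assuming Vuforia 10's ObserverBehaviour API (the QuizTargetListener name and the UnityEvent wiring are my own illustration, not our exact script):

    ```csharp
    using UnityEngine;
    using UnityEngine.Events;
    using Vuforia;

    // Serializable UnityEvent so the quiz logic can be wired in the inspector.
    [System.Serializable]
    public class TargetScannedEvent : UnityEvent<string> { }

    // Attached to each image target: reports the target's name to the quiz
    // logic whenever Vuforia starts tracking it.
    public class QuizTargetListener : MonoBehaviour
    {
        public TargetScannedEvent onTargetScanned;

        void Start()
        {
            var observer = GetComponent<ObserverBehaviour>();
            if (observer != null)
                observer.OnTargetStatusChanged += HandleStatusChanged;
        }

        void HandleStatusChanged(ObserverBehaviour behaviour, TargetStatus status)
        {
            // TRACKED / EXTENDED_TRACKED means the target is currently visible.
            if (status.Status == Status.TRACKED || status.Status == Status.EXTENDED_TRACKED)
                onTargetScanned.Invoke(behaviour.TargetName);
        }
    }
    ```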

    Before I began, I noted down the new mechanisms needed in this scene for it to work as intended, which are:

    • Responsive progression bar

    • Adjustable question list carousel

    • Option to retry the question if incorrect

    • Detect questions answered correctly on the first try, for the result page

    For the question list, I set it up so I can add and remove quiz questions as I like. Alongside each question, I also included a field for the different progress bar sprites; for example, the progress bar indicates to the user how many questions are left in the quiz mode at 0%, 50%, and 100% progression.

    Sprites for the progress bar

    To reduce the future workload of adding more questions to the list, I scripted a C# script with a public list where I can add more questions, set the correct scanned answer (the image target name), and assign the progress bar sprites.

    Question lists
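
    As a sketch of what each entry in that list might look like (the QuizQuestion name and its fields are my guesses at the data shape, not the exact script):

    ```csharp
    using UnityEngine;

    // One serializable entry per question, editable straight from the inspector.
    [System.Serializable]
    public class QuizQuestion
    {
        [TextArea] public string questionText;   // shown on the top panel
        public string correctTargetName;          // image target name that counts as correct
        public Sprite progressSprite;             // progress bar sprite for this step
    }
    ```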

    The question text entered in the list is pushed to the TextMeshPro gameObject on the top panel of the screen.
    Question text placement

    Apart from letting users retry a question, as in the organ model assembly mode, I also included a button that lets users continue to the next question even if they answered incorrectly. However, only first-try attempts are counted towards the final result page; users can still retry the questions to redeem themselves.

    Feedback panel for wrong answers

    Feedback panel for correct answers
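
    Putting the mechanisms above together, a minimal manager along these lines could drive the question text, progress sprites, retries, and first-try tracking. It builds on the QuizQuestion sketch earlier, and all names here are assumptions rather than our exact implementation:

    ```csharp
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.UI;
    using TMPro;

    public class QuizManager : MonoBehaviour
    {
        public List<QuizQuestion> questions = new List<QuizQuestion>();
        public TMP_Text questionLabel;   // TextMeshPro text on the top panel
        public Image progressBar;        // progress bar image on the HUD

        int index;
        bool attemptedCurrent;
        readonly HashSet<int> firstTryCorrect = new HashSet<int>();

        void Start() => ShowQuestion(0);

        void ShowQuestion(int i)
        {
            index = i;
            attemptedCurrent = false;
            questionLabel.text = questions[i].questionText;
            progressBar.sprite = questions[i].progressSprite;
        }

        // Called with the target name when an image target is scanned
        // (see the QuizTargetListener sketch earlier).
        public void SubmitAnswer(string scannedTargetName)
        {
            bool correct = scannedTargetName == questions[index].correctTargetName;
            if (correct && !attemptedCurrent)
                firstTryCorrect.Add(index);  // only first tries count on the result page
            attemptedCurrent = true;
            // Show the correct/wrong feedback panel here; the user may retry
            // or press the continue button, which calls NextQuestion().
        }

        public void NextQuestion()
        {
            if (index + 1 < questions.Count)
                ShowQuestion(index + 1);
            // else: load the result scene, passing firstTryCorrect.Count along
        }
    }
    ```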

    Result Page

    This is where users can review their quiz performance: how many questions they answered correctly without retrying. Different mascots are shown according to the number of questions answered correctly, each paired with a different encouragement text to boost users' morale and confidence, which matters especially since our target audience is mainly young children.

    Different encouragement texts and mascots for different results
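
    A hypothetical sketch of how the result page could pick its mascot and encouragement line from the first-try score (the tier maths and all names are my own illustration):

    ```csharp
    using UnityEngine;
    using TMPro;

    public class ResultPage : MonoBehaviour
    {
        public TMP_Text encouragementLabel;
        public GameObject[] mascots;        // one mascot per score tier
        public string[] encouragements;     // matching encouragement lines, same length

        // 'score' would be the first-try-correct count from the quiz scene;
        // 'total' the number of questions (passed over, e.g., via a static field).
        public void Show(int score, int total)
        {
            int tier = Mathf.Min(mascots.Length * score / Mathf.Max(total, 1),
                                 mascots.Length - 1);
            for (int i = 0; i < mascots.Length; i++)
                mascots[i].SetActive(i == tier);   // show only the matching mascot
            encouragementLabel.text = encouragements[tier];
        }
    }
    ```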

    Mr. Razif's Feedback & Further User-Centric Design

    When I consulted Mr Razif in the week 14 class and let him take a look at our outcomes, he noticed the lack of sufficient feedback to users in the assembly and quiz modes, where the visual feedback on right and wrong answers was easy to miss.

    This is where Mr Razif reminded me of Fitts's Law: in our case, users' attention has to constantly move up and down when checking answers, since their sight is focused on the centre of the screen while scanning the image targets, and they are then forced to move their eyes down the screen to see the feedback panels.

    Mr Razif's constructive feedback inspired a fix for this issue: incorporating our app's mascot in the centre of the screen and adding particle effects behind the mascot to indicate the right and wrong states to users.

    To make this possible, I adjusted the script that controls the active state of the UI gameObjects to accommodate the new visual improvements, adding public fields for the respective mascot UI image and the VFX particle gameObjects.

    Public fields for mascots and VFX particles
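
    A sketch of how that adjusted script might look, with the new public fields alongside the existing panel toggles (all names here are assumptions):

    ```csharp
    using UnityEngine;
    using UnityEngine.UI;

    public class AnswerFeedback : MonoBehaviour
    {
        public GameObject correctPanel;        // existing feedback panels
        public GameObject wrongPanel;
        public Image mascotImage;              // new: centre-screen mascot
        public Sprite happyMascot, sadMascot;  // new: correct / wrong states
        public ParticleSystem correctVfx;      // new: particles behind the mascot
        public ParticleSystem wrongVfx;

        public void ShowFeedback(bool correct)
        {
            correctPanel.SetActive(correct);
            wrongPanel.SetActive(!correct);
            mascotImage.gameObject.SetActive(true);
            mascotImage.sprite = correct ? happyMascot : sadMascot;
            (correct ? correctVfx : wrongVfx).Play();  // burst behind the mascot
        }
    }
    ```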

    This is where I spent hours fixing the particle effect issues, going through trial and error to make the particles work as intended. The issues included the particle gameObject being destroyed after playing, and the particles failing to show in the overlay screen space.

    The fix for the particle effect being destroyed after playing is to set the clear behaviour, found below the Particle System component in the inspector, from destroy to none. This is how I solved the issue.

    Set clear behaviour to none
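
    If the same setting ever needs to be applied from code, Unity's built-in Particle System exposes it as the main module's Stop Action, which I believe is the setting behind this inspector field:

    ```csharp
    using UnityEngine;

    public class KeepParticleAlive : MonoBehaviour
    {
        void Awake()
        {
            // Stop Action = None: the gameObject is neither disabled nor
            // destroyed when the (non-looping) particle system finishes.
            var main = GetComponent<ParticleSystem>().main;
            main.stopAction = ParticleSystemStopAction.None;
        }
    }
    ```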

    For the particle system not showing on the screen, I found out that particle effects have trouble displaying on overlay canvases (a Unity-side issue), as the particle systems for my desired effects are not optimised for the UI overlay. The workaround I found was to change the canvas's screen space from overlay to camera, after which the particle effect started showing on screen when I play-tested the app.

    Screen space from overlay to camera for the particle effect to show
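
    The same workaround can be sketched in code, switching the canvas to camera space at startup (the component name is my own):

    ```csharp
    using UnityEngine;

    public class CanvasCameraSpace : MonoBehaviour
    {
        public Camera uiCamera; // e.g. the AR camera

        void Awake()
        {
            var canvas = GetComponent<Canvas>();
            canvas.renderMode = RenderMode.ScreenSpaceCamera;
            canvas.worldCamera = uiCamera;
            canvas.planeDistance = 1f; // keep the canvas close to the camera
        }
    }
    ```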

    Here is the final look of the newly added visual feedback for correct and wrong answers/image targets.

    Correct visual feedback

    Wrong visual feedback

    Final look rendering


    With this improvement, users receive immediate visual feedback at the centre of the screen, replacing the more subtle cues that previously caused brief confusion.


    Final Updates: Issues found during build & run

    Si Yan compiled the final Unity file and provided it to me, and I built and ran the app on my personal Android phone. I realized that some of the scenes were not responsive to my screen size, and the interaction handler on her organ learning feature broke: I was unable to get past the tutorial pages because the input handling wasn't working as intended. I believe this is because the input handling used by iOS and Android may differ, causing this experience-breaking bug on my Android phone. Regardless, my app features were built for Android compatibility first, so they still worked as before.

    Responsive Design Issue on teammate's part

    As I built the Unity project onto my phone, I noticed the app screen didn't fit my phone's screen scale. I went back to the compiled Unity file only to see that the homepage, as well as the other pages of the organ mode, had a constant pixel-size canvas. Thus, I changed the canvas scaler to the Scale With Screen Size mode and positioned each game object using the Rect Transform component on each game object under the Canvas parent gameObject.

    Screen Scale Issue

    Scale with the screen size canvas
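
    For reference, the same change can be made from code; the reference resolution below is an assumed value, not our exact setting:

    ```csharp
    using UnityEngine;
    using UnityEngine.UI;

    public class ResponsiveCanvas : MonoBehaviour
    {
        void Awake()
        {
            var scaler = GetComponent<CanvasScaler>();
            scaler.uiScaleMode = CanvasScaler.ScaleMode.ScaleWithScreenSize;
            scaler.referenceResolution = new Vector2(1080, 1920); // assumed portrait reference
            scaler.matchWidthOrHeight = 0.5f; // balance width/height matching
        }
    }
    ```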


    Tutorial Page Tap to Navigate Issue

    At first glance, I suspected this had something to do with the interaction script, so I went on to check Si Yan's script and modified it to include the mobile touch input system. This proved to be a quick fix, and touch worked again as intended.

    Input system script
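
    A minimal sketch of such a fix, assuming the legacy Input Manager: accept both mouse clicks (for the editor/desktop) and touch taps (for Android/iOS) to advance the tutorial pages:

    ```csharp
    using UnityEngine;

    public class TutorialTapHandler : MonoBehaviour
    {
        public System.Action onTap; // assigned by the tutorial controller

        void Update()
        {
            bool tapped = Input.GetMouseButtonDown(0) ||
                          (Input.touchCount > 0 && Input.GetTouch(0).phase == TouchPhase.Began);
            if (tapped)
                onTap?.Invoke();
        }
    }
    ```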

    The particle effect is not showing up in assembly and quiz mode

    After confirming with my teammate, I found that both of us faced the missing particle effect in the assembly and quiz modes, where the particle effects had played as intended in my previous Unity build before our project compilation.

    At first, I thought it was an incompatibility in the script that handles the UI active state, so I dug through the script for hours looking for errors and searched the internet for solutions. Countless fixes were tried, but all failed in the end. However, I kept pushing to find the answer, and that's when I noticed a forum thread about render pipelines. I delved further into this area, discussing it with ChatGPT to determine the final fix.

    And that's it! After hours of attempted script fixes, the solution to the missing particle system turned out to be rather simple (after I confirmed that a newly created particle system worked in the build): we need to tick the Depth Texture option, so that Unity renders the particle system as intended on our phones.

    Tick the Depth Texture option in the inspector of the Render Pipeline Asset (URP)
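
    For completeness, the ticked checkbox corresponds, as far as I know, to URP's supportsCameraDepthTexture property, so the same fix could be scripted like this:

    ```csharp
    using UnityEngine;
    using UnityEngine.Rendering;
    using UnityEngine.Rendering.Universal;

    public static class EnableDepthTexture
    {
        // Runs once on startup and flips the Depth Texture toggle on the
        // active URP asset, mirroring the inspector fix above.
        [RuntimeInitializeOnLoadMethod]
        static void Apply()
        {
            if (GraphicsSettings.currentRenderPipeline is UniversalRenderPipelineAsset urp)
                urp.supportsCameraDepthTexture = true;
        }
    }
    ```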


    Submission

    Google Drive
    Unity File, APK + iOS app file, and project compilation included

    Presentation Video

    Task 4 BodyBuddy Full App Presentation

    Walkthrough Video
    BodyBuddy App Walkthrough


    My reflections

    Completing the Experiential Design module has been an insightful journey into creating interactive learning experiences with Unity and Vuforia AR. This project taught me how to go beyond building a simple app by integrating augmented reality to elevate the way users interact with our educational content. I gained hands-on experience with Unity’s app functions, AR image targets, and the workflow needed to compile a functional app on both iOS and Android platforms.

    One of the most significant learning moments came from solving bugs and optimizing our app for mobile responsiveness. Challenges like particle system issues and screen scaling required deliberate problem-solving and repeated testing to ensure the app worked as intended across different devices. Through this, I became more confident in debugging Unity apps and handling platform-specific challenges.

    Working in a two-person team also shaped how I approached the project. Collaborating with Lin Si Yan allowed for the exchange of ideas and constructive feedback, which improved both the functionality and the user experience of the app. While combining separate Unity files came with compatibility challenges, the process strengthened my collaboration and troubleshooting skills.

    Ultimately, this module has deepened my understanding of experiential and interactive design. I’ve learned how crucial visual feedback and user-centric interactions are in keeping users engaged—especially in AR-based educational apps. This experience has not only improved my technical skills but also reinforced the value of thoughtful, user-focused design in creating meaningful digital experiences.



