Experiential Design Task 1

Melvin Yung Khun Yew | 0357241 | Bachelor of Design (Hons) in Creative Media

MMD 60204 | Experiential Design
Week 1 — Week 4


Task 1: Trending Experience

As I delved into the world of augmented reality (AR), I was amazed by how far this technology has come and the wonderful experiences it can achieve. I believe it deserves further exploration so it can be incorporated well into real life.


Image by Blippar






Instructions

Mr Razif Mohammed, my lecturer for the Experiential Design module this April 2025 semester, gave us a heads-up on the upcoming tasks and exercises.



MIB - Module Information Booklet

To guide us step by step into AR technology, Mr Razif gave us a series of in-class exercises that helped us explore the current, popular trends, so that we could better understand this technology and how to utilize it in our upcoming project.



Lecture

In Week 1, Mr Razif gave us a short lecture on the Minimum Viable Product (MVP), which is a single flow of user interaction, and on the concept of a vertical slice.

Week 1
In the first week, Mr Razif showed us the differences between Augmented Reality (AR), Mixed Reality (MR), and Virtual Reality (VR). He emphasized that AR is a technology that overlays digital content onto the real world using smartphones and AR glasses. One example of AR implemented in an app is Pokémon GO, where the game overlays Pokémon creatures onto the camera view for players to interact with.

Mixed Reality (MR) is a more advanced technology that blends physical and virtual content so they can be interacted with in real time. One key example of MR technology is the Microsoft HoloLens, which has since provided benefits to various expert fields like healthcare and manufacturing. However, Mr Razif mentioned that AR and MR are very closely related, as both overlay digital content onto the real world. The main difference is how users interact with the content: AR relies on physical buttons or controllers, while MR can use more intuitive hand gestures to control the content.

Virtual Reality (VR) immerses users in a fully digital environment, severing their connection to the physical world.


Mr Razif also went further, guiding us through the project deliverables for this module by showing past student works on YouTube about AR experiences, giving us ideas of what to expect and what the product outcome for this semester could look like.

For the first time in my university life, I was surprised to find that Mr Razif is the first lecturer to incorporate and effectively utilize AI technology, providing us a real-time chatbot to aid us in the assignment down the line.

Experiential Design AI Assistant by Mr Razif



Mr Razif also spent time explaining the current trends in AR:
  • Marker-based
    Image by Krit Salah-ddine

  • Markerless
    Image from 3D Cloud


  • Location-based 
    Image from WebAR Studio


Week 2
What is User Mapping?



Image by Nielsen Norman Group, User Map Cheat Sheet 2017

Lecturer's Recommendation
Refer to the Nielsen Norman Group for the latest information on these topics, including current AI and experience trends.


An Empathy Map is a tool used to articulate what we know about a particular type of user. It externalizes user knowledge to create a shared understanding and aid decision-making.




    Work Process

    This work process section is divided into 3 parts, consisting of:

    • In-class exercises
    • Self research
    • Idea proposal


    In-class exercises

    Week 2

To have us understand and utilize what we learned in the lecture class, Mr Razif started a group activity in which we created a user journey map of a person's journey through a specific location, such as a cafe or even a theme park. I worked closely with my team to select Tokyo Disneyland as our location focus, and we then brainstormed the possibilities when someone travels to Disneyland. To set up a standard user journey map, we made sure to include gain points, pain points, and solutions to the pain points at each stage of the journey.

    A collaborative work with my group during the tutorial class

    Week 3

For the first half of the exercise, Mr Razif initiated a session where we imagined a scenario in which an AR experience could be applied and what extended visualizations would be useful for it. For this activity, I teamed up with other students to come up with a hospital scenario.

This exercise enabled me to visualize and ideate possible ways to implement an AR experience in hospital settings by accounting for different audiences: patients, who are often more visually oriented than conversation-oriented, and doctors, who can utilize AR technology to reach a consensus with patients and deliver their messages more effectively.


Later, Mr Razif introduced us to Unity, one of the key applications we will use to create an AR experience, to equip us with the necessary knowledge about creating AR experiences, starting with marker-based AR. He also taught us the required plugins, such as Vuforia, which brings AR functionality into Unity so we can create a working, responsive AR experience.

During Mr Razif's Experiential Design tutorial, he guided us step by step through the process of setting up augmented reality using Vuforia in Unity. After we completed registration, our first task was to download the Vuforia Engine package and import it into our Unity project. From there, we accessed the Vuforia Developer Portal, where we created a brand-new database. In that database, we uploaded the images we wanted to use as markers; these act as the visual triggers the AR camera detects to bring digital content to life.

One key takeaway Mr Razif emphasized was the importance of choosing high-quality images. He pointed out that Vuforia gives each image a star rating based on how well it can be tracked; factors like contrast, detail, and overall sharpness affect this. We were told to make sure our images had at least a 3-star rating, because anything lower might cause recognition issues during the AR experience, which would affect how smooth or immersive the interaction feels for users.

    Image rating on Vuforia

I imported the same image I had uploaded to Vuforia as a recognition target for AR, and added a 3D cube model on top of the image, nested as a child of the ImageTarget (added from the Vuforia Engine options), so that camera scanning would later trigger the cube's visibility whenever the image is in the camera view.

    Import the image target from Vuforia

    Image Target features from Vuforia Engine

    Adding a 3D cube

    Group and nest the cube on the image target

    To add the camera scanning AR features to Unity, Mr Razif guided us to add the Vuforia license key in the Vuforia Engine configuration in Unity. 

    App license key requirement under the Vuforia Engine configuration tab


Creating a license profile, then copying and pasting the license key from Vuforia into Unity, enables the AR features to be used in Unity.

    License Key on Vuforia


But as the webcam on my laptop is broken, I tried using the Camo app as a substitute, turning my Android phone into a camera. During the process, I ran into many problems with that option, such as the camera not registering in Unity. I had to dig into the project files to find webcamprofiles.xml and register Camo with a default webcam profile for it to work properly in Unity.

    File Path and Webcam Profile coding
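For reference, the profile entry I added followed roughly this shape. This is only a sketch: the deviceName string must match the camera name Unity reports (I have used Camo's name as an assumption here), the resolution values are illustrative, and the exact element names can vary between Vuforia versions.

```xml
<!-- Sketch of a webcamprofiles.xml entry; deviceName and resolutions are assumptions -->
<webcam deviceName="Reincubate Camo">
  <windows>
    <requestedTextureWidth>640</requestedTextureWidth>
    <requestedTextureHeight>480</requestedTextureHeight>
    <resampledTextureWidth>640</resampledTextureWidth>
    <resampledTextureHeight>480</resampledTextureHeight>
  </windows>
</webcam>
```

Once a matching profile exists, Vuforia's Play Mode can pick up the virtual camera like any recognized webcam.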

    And then, voila!



    Week 4
To continue the exercise from Week 3, Mr Razif helped us explore how to include UI in the AR experience in Unity. In short, it is essential to create a new Canvas layer and set the reference resolution to the desired setting; for this exercise, a 1920x1080 resolution was used.

    Canvas features in the UI panel

In UI design, a button is the simplest element of a working UI, enabling users to interact with the screen. Thus, I added a button nested under the Canvas and attached commands to each button to hide and show the cube.

    Adding a button on the canvas

    Adding commands for the buttons to set the active state of the cube model
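In the Inspector, this can be done by dragging the cube into each button's OnClick() list and choosing GameObject.SetActive. The same behaviour can be sketched as a small Unity script; the class and field names here are my own, not from the exercise.

```csharp
using UnityEngine;

// Hypothetical helper script: attach to any GameObject, drag the cube
// (nested under ImageTarget) into the "cube" slot in the Inspector,
// then wire each button's OnClick() event to ShowCube() or HideCube().
public class CubeToggle : MonoBehaviour
{
    [SerializeField] private GameObject cube; // the cube to show/hide

    // Called by the "Show" button's OnClick() event
    public void ShowCube() { cube.SetActive(true); }

    // Called by the "Hide" button's OnClick() event
    public void HideCube() { cube.SetActive(false); }
}
```

Pointing OnClick() directly at GameObject.SetActive(bool) avoids any scripting at all; the script version is useful once the buttons need to do more than one thing per click.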

For users to have visual feedback when hovering over the button, the button's colour states can be edited in the button configuration panel.

    Button state

    Here is the outcome of hiding and showing the model using the button:


Later, Mr Razif guided us in creating a keyframe animation on the cube and using the animation component's enabled (bool) property on the button to start and stop the animation.
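The start/stop wiring can be sketched as a script as well. This assumes the keyframe animation created an Animator component on the cube (Unity's Animation window does this by default); the class and field names are my own.

```csharp
using UnityEngine;

// Hypothetical sketch: wire the buttons' OnClick() events to
// PlayAnimation() / StopAnimation(), or point OnClick() directly at the
// Animator's "enabled" checkbox, which is what toggling in the Inspector does.
public class AnimationToggle : MonoBehaviour
{
    [SerializeField] private Animator cubeAnimator; // Animator on the animated cube

    // Resumes evaluation of the animation clip
    public void PlayAnimation() { cubeAnimator.enabled = true; }

    // Freezes the cube in its current pose
    public void StopAnimation() { cubeAnimator.enabled = false; }
}
```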

    Key frame animation


    Here is the outcome of starting and stopping the animation using the button:


    Ideation Proposal

My ideas for including an AR experience in an app are:
    • Maps (Navigation-based app)
    • Store (Shopping-based app, when in store, scan an item to show its 3D model/price/any related description)
    • Education-based app

After reviewing the problems I have faced in my past experiences, I came to the conclusion that AR technology can also be incorporated into one of my interests: Pop Mart blind-box figurines.

    Pop Mart, Image by Shuango Zhang

The problem I noticed in this franchise is the limited options for customers to preview and inspect the 3D figurines and search for ones to their liking. Pop Mart currently displays only certain popular blind-box series, and often only part of a series' models (for example, 6 out of 12 are shown).

The issue lies in the fact that each Pop Mart branch has a different store area, so a store cannot display all the models at once without sacrificing space.

The next idea that came to mind is a PC builder tool for beginners, utilizing AR technology to aid in visual education for first-time PC builders. I came up with this idea after noticing video skits on social media about the rookie mistakes PC builders make when building a PC from the ground up. This made me realize that there are few official manuals or tutorials to help builders check compatibility and assemble a complete, functional PC, considering the many manufacturers behind the various electronic parts in a desktop PC.

    PC components

    Here are the 3 idea proposals:



    My reflections




