Dahye Kim’s personal interest in the field of interaction design is immersive digital art and augmented reality.
In her Honours project, she presents the communicative relationship between the city and its people, visualizing the sensory experience that people gain from communicating with the city as evocative feedback. She used still and moving images to create immersive digital art, and used projection mapping and augmented reality as materials for exploring the subject matter.
Communicate with the city, visualizing the sensory experience collected from the city.
A collaborative project combining projection mapping and augmented reality.
Every ordinary day, people are in continuous communication with the city through the evocative components that compose it, such as texture, form, light and sound. In this communicative relationship, people affect the city, and the city reflects and delivers the result of that influence back to people as sensory experience. Through this communicative process, people can collect their own sensory experiences, and the city can become more narratively evocative.
This work explores the communicative relationship between people and the city, and the result of that communication: sensory experience.
It focuses on visualizing the way people communicate with the city and collect diverse sensory experiences, just as they do in their ordinary days in the real world.
To visualize the communicative activity and the collected sensory experience that reflects people’s actions, this project was created as a screen-based 3D environment, using images trained with Runway ML as resources, together with augmented reality. Through this project, people encounter visualized sensory experiences and can collect each one within a single view, and the work evokes the narrative sensory experience that people receive from the city.
Communicative City _ Projection Mapping
Runway ML, Autodesk MAYA
A projection mapping work that captures the features of the city that trigger sensory experience in people; by converting the trained images into animated video, it creates an immersive feeling.
This work was created with texture-highlighted images trained with Runway ML, which assembled over 2,000 city and landscape images to generate a new, texture-focused aspect of the city. To convert these images into an evocative 3D city that triggers sensory experience, Autodesk MAYA was used to rebuild the 3D city scene and texture it with the Runway ML-trained images. The camera movement of the city-scene animation and the changing sky evoke the feeling of being surrounded by the city, while the passage of time expresses a day. The city in digital space, created from textured images and 3D figures, allows people to recall their own experience of being in the city and evokes their sensory memories before the visualized sensory experience is triggered by the projection mapping.
Sensory Experience _ Augmented Reality
Autodesk MAYA, Unity
Sensory experiences triggered by the communicative city projection mapping.
This augmented reality work is used to explore the visualized sensory experience of the city. The series of AR works represents each of the components that compose the city and are evocative to people within it: texture, form and light. By triggering the communicative city projection mapping work, people can see each sensory object and detach it from its image target, which is contained in the projection mapping, in order to collect the AR object and create their own visualized sensory experience. Each AR object has animated movement, floating around the city, and these movements can be paused through button input in Unity so that the audience can collect them. Because the audience can pause an AR object whenever they want, the result of communicating with the projected city varies with each audience member.
Texture – transparent floating texture / Light – RGB and light values of the city / Form – multiple perspectives on the city
The 3D objects for the AR were built in Autodesk MAYA, which was used to convert image resources into 3D objects with depth for the floating AR area; in Unity, the AR objects were animated, and buttons were added so the audience could control them.