Practice Log 8: Dalton Mill Project Version 1

Date: 23/07-14/10/2024

Project Showcase

Project file

Please download to a PC from here.

Download instructions

  1. You will need a VR headset that can be connected to the PC that you will use to view the project.
  2. Download the original file.
  3. Download the Epic Games Launcher. Launch the software and download Unreal Engine version 5.3.
  4. Open Unreal Engine, go to Browse, find the project file and click Open.
  5. Click the Play button on the toolbar (usually in the middle of the top of the screen, looks like a small triangle ▶️).
  6. Navigate the game with physical movement, and interact with the model using hand gestures (shown below).

Hand gesture demonstration for game interaction

Left Hand

Pinch your index finger with your thumb to select which room to go to from the menu.

Right Hand

Pinch your middle finger with your thumb to go back to the menu.
Pinch your index finger with your thumb to teleport to the location you would like to be.

If you cannot open it or do not have a VR headset, please view the video demonstration below.

Video demonstration (first half of the video)

If you would like to try it out yourself but do not have suitable equipment, please contact the researcher, Yuan Gao, from the “Contact Me” section in the navigation bar at the top of the screen. Yuan will arrange an in-person showcase with you at the University of Leeds (only available before 30th October 2025).

Reflective Diary

This practice-based project, funded by the Science Museum Group’s Congruence Engine, sought to explore how multisensory techniques, particularly the combination of sound and 3D modelling, might help reconstruct and reimagine Dalton Mills, a now-destroyed industrial site. My key responsibilities were the recording and editing of sound and the creation and integration of 3D-scanned elements; I collaborated with Alexander Neish (an expert in Unreal Engine) and Gyannateet Dutta (an expert in AI).

Due to the loss of physical access to Dalton Mills (destroyed by fire twice and now abandoned), the project adopted an exploratory approach, using AI-based reconstruction, ambient sound editing, and visual 3D scanning to create a VR experience with four virtual rooms (including indoor and outdoor areas of the Mills). Each room aimed to represent Dalton Mills’ visual and auditory textures, reflecting its industrial function and historical atmosphere.

In the Entrance Room, I used site-matching and environmental replication to embed a realistic street ambience. Based on an analysis of Google Maps, I recorded ambient sound in Brighouse to simulate the roadside location of Dalton Mills’ gate. The audio was configured as an ambisonic source in Unreal Engine, with dynamic spatial cues assigned to simulate movement and orientation shifts (the sound fades as the user passes through the gate).
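The distance-based fade described above can be sketched as a simple attenuation curve. This is only an illustration of the principle, not Unreal Engine’s actual attenuation system; the function name and parameter values are hypothetical.

```python
import math

def ambient_gain(listener_pos, source_pos, inner_radius=5.0, falloff=20.0):
    """Illustrative distance-based attenuation: full volume within
    inner_radius of the source (the gate), fading linearly to silence
    over a further `falloff` metres. A simplified stand-in for an
    engine's sound attenuation settings."""
    dx = listener_pos[0] - source_pos[0]
    dy = listener_pos[1] - source_pos[1]
    distance = math.hypot(dx, dy)
    if distance <= inner_radius:
        return 1.0
    return max(0.0, 1.0 - (distance - inner_radius) / falloff)

# Standing at the gate: full volume
print(ambient_gain((0, 0), (0, 0)))   # 1.0
# Moving inside, away from the road: the street ambience fades
print(ambient_gain((0, 15), (0, 0)))  # 0.5
```

In the actual project this curve is handled by the engine’s spatialisation; the sketch only makes the “sound fades when going inside the gate” behaviour concrete.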

In the Textile Loom Room, I recreated the sound of multiple moquette looms from recordings collected at the Calderdale Industrial Museum. To reflect their asynchronous operation, each loom’s sound was edited to play on a staggered loop, producing a polyphonic industrial soundscape. The ambisonic configuration allowed participants to experience shifting sound density depending on their movement. However, auditory masking still occurred, just as it does in real life (see practices 1 and 3): louder looms drowned out the others unless the listener paused and reoriented.
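The staggered looping described above can be sketched as a simple scheduler: each loom’s loop begins at a different offset, so the cycles drift in and out of alignment rather than firing in unison. This is an illustrative sketch of the idea, not the actual Unreal Engine setup; the function and timings are hypothetical.

```python
def staggered_schedule(loop_length, num_looms, horizon):
    """Return, for each loom, the start times of its loop within
    `horizon` seconds. Each loom is offset by an equal fraction of
    the loop length, so no two looms reach the same point of their
    cycle at the same moment, giving a polyphonic texture."""
    offset = loop_length / num_looms
    schedule = []
    for i in range(num_looms):
        starts = []
        t = i * offset
        while t < horizon:
            starts.append(round(t, 2))
            t += loop_length
        schedule.append(starts)
    return schedule

# Four looms with an 8-second loop over the first 20 seconds
for i, starts in enumerate(staggered_schedule(8.0, 4, 20.0)):
    print(f"loom {i}: {starts}")
```

In practice the offsets were set by ear while editing the recordings; the sketch simply shows why staggered starts prevent the looms from synchronising.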

The visual side employed a hybrid model: AI-enhanced photogrammetry for low-detail structural elements (done by Gyannateet and Alex), and direct 3D scanning for machinery. While some artefacts lacked surface accuracy, they provided enough spatial anchoring to enhance the audiovisual alignment needed for immersion.

Reflective Methodological Note

In this version of the Dalton Mill VR experience, I worked not just as a content creator but as a reconstructor of sensorial memory. I wasn’t trying to digitally resurrect Dalton Mills as it once was, but to explore how partial media (e.g. recordings, approximations, fragments) could collectively produce affective coherence.

This approach was shaped in close collaboration with my team members, Alexander Neish and Gyannateet Dutta, during a series of experimental planning sessions. One of the early concepts I proposed was the idea of Sonic Wanderability: designing sound not as background but as a mobile, navigable structure that users could move through and assemble into meaning (explored further in practice 10). Unlike traditional audio-visual alignment strategies, where visuals take precedence, here sound came first in both production order and experiential logic. It set the emotional tone, defined temporal flow, and, in some cases, obscured or compensated for low-fidelity visuals (see participants’ feedback paragraph below).

My hypothesis in this project was not that we could simulate Dalton Mills “accurately”, but that we might evoke its presence through a sensorial patchwork. This was less about digital archaeology and more about Affective Synthesis, which is layering approximated fragments to suggest a historical atmosphere. In this sense, what we constructed was not a replica, but what I would call a plausible fiction generated by Affective Synthesis: a narrative space held together by resonance rather than precision.

This practice reminded me that sound does not need to be accurate to feel true. What matters is not fidelity, but situated resonance. In this project, sound filled in what visuals could not. It triggered associations, evoked embodied memory, and offered temporal depth to an otherwise spatially incomplete environment. Some participants said they felt sad and regretted the loss of Dalton Mills, while others were amazed that it could be recreated digitally. Several also said the experience reminded them of past visits to Dalton Mills.

Participant feedback affirmed this. At the Science Museum Group’s London showcase, I conducted mini interviews and observations. Most users spent the longest time in the Loom Room, reporting feelings of “tension”, “movement”, and “presence”, even though the visuals were relatively low resolution. Many participants reacted with surprise, even though I had told them the sound would start immediately after they entered the room, and a few instinctively raised their voices in response to the loom sounds, which mirrored real-world factory behaviour and hinted at behavioural immersion induced by auditory design.

This challenged my assumption that visual realism drives immersion. In fact, the sound’s role as a narrative texture proved more powerful than surface detail. As Daniela Angelina Jelinčić, Marta Šveb, and Alan E. Stewart (2022) argue, sound “is in fact a powerful elicitor of emotions and can markedly enhance the emotional experience” in audiovisual media (Jelinčić, Šveb & Stewart, 2022, p.525).

Furthermore, this experience raised ethical questions: what are we “authenticating” when we reconstruct lost environments using simulated data and AI inference? Are we preserving history, using it as a testbed for how accurate newly generated narratives can be, or delivering a personalised understanding of the past?

In sum, Dalton Mill Version 1 demonstrated that immersive heritage is not just reconstruction but reconfiguration. And in this process, sound is not supplementary. It is foundational.


References

Chion, M. 2019. Audio-vision: Sound on screen. New York: Columbia University Press.

Jelinčić, D.A., Šveb, M. and Stewart, A.E. 2022. Designing sensory museum experiences for visitors’ emotional responses. Museum Management and Curatorship. [Online]. 37(5), pp.513–530. [Accessed 25 September 2025]. Available from: https://doi.org/10.1080/09647775.2021.1954985
