Computation in Design 4
BA Design Communication
Live Project
2021

DJ Hear That!?

Lau Jun Hui Matthew,
Goh Sing Hong,
Siti Nur Azri Binte Abdul Razak,
Aditi Venkat,
2021.

Exploring ambient type in our surroundings, DJ Hear That!? investigates the relationship between ambient sound and vernacular typography, probing the environment to create an object of play.

Summary

This project is the manifestation of a cool idea I had in the shower, made possible by the support of my groupmates and the help and feedback of Andreas and Joanne. This microsite contains our ideation, process, artefact and testing, made with our blood, sweat and tears (mainly a lot of tears).

Our ideation consists of the mindmaps and sketches that led up to the “Ambient Type” probes. My main contribution during ideation was integrating Sing Hong’s and my shared interests in sound and typography to form the term Ambient Type (our other groupmates weren’t around during the first session). My random shower thought then pushed us towards searching for and recording vernacular typography in the environment, which gave the project background and context instead of just recording sounds in generic environments. I also suggested using a MIDI controller (which Sing Hong luckily had at home) to manipulate and play with the sounds and elements.

The process includes some of our sound recordings as well as our attempts at code. I mainly focused on trying to get the code to work, though I also contributed a couple of letters to our soundbank. Getting the code to work was day after day of trial and error: on my computer, the links on the first five pages of Google results for anything related to p5.js and MIDI would be purple. Even then, things remained largely broken without Andreas’s help.

Artefact features our project in its entirety as of now, as I hope to develop it further in the future. Sing Hong and I did the documentation for the artefact with some help from the seniors, who had a photo studio set up; we took advantage of the opportunity to shoot a nice feature video for the project.

Testing features some of the friends and classmates who came by to help test our project, and the feedback they gave us. I contributed by managing and setting up the space and carrying the wooden pedestal all the way over from D301, haha! We were lucky to have that many people show up, which provided us with a whole bunch of exciting results and feedback.

Anything Probes

Ambient Type probes aim to record the ambient sounds “heard” by the vernacular typography we find out in the world. They also explore the relationship between scenes and soundscapes: looking and hearing through collecting, recording, and sampling sound from our environment, both urban and natural.

What does a typeface hear from the location?
What would a typeface sound like?
What would it look like in specific areas of our city?

This will then inform and influence the form of the type as we experiment with our findings to create a typographic system that is ever-changing with the pace of the city.

Project Description

DJ Hear That!? utilises recordings from the “Ambient Type” probes to create a playful artefact: a MIDI controller driving a p5.js sketch, inviting users to create visual typographic compositions and ambient soundscapes by manipulating knobs and keys.
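For readers curious how a MIDI controller can drive a p5.js sketch, the snippet below is a minimal, hypothetical sketch of the general approach, not the project’s actual code: all function names and the note-to-letter convention are our own illustration. A raw MIDI message arrives as three bytes (a status byte plus two data bytes); note-on messages can trigger letters and their paired sounds, while control-change messages from the knobs adjust parameters.

```javascript
// Decode a raw 3-byte MIDI message into a simple object.
// Status byte: upper nibble is the message type (0x9 = note on,
// 0x8 = note off, 0xB = control change); lower nibble is the channel.
function parseMidiMessage([status, data1, data2]) {
  const type = status & 0xf0;
  if (type === 0x90 && data2 > 0) {
    return { kind: 'noteOn', note: data1, velocity: data2 };
  }
  if (type === 0x80 || (type === 0x90 && data2 === 0)) {
    return { kind: 'noteOff', note: data1 };
  }
  if (type === 0xb0) {
    return { kind: 'controlChange', control: data1, value: data2 };
  }
  return { kind: 'other' };
}

// Map a MIDI note number to a letter, assuming (purely for this
// example) that keys from middle C (60) upward are assigned A-Z.
function noteToLetter(note) {
  const index = note - 60;
  if (index < 0 || index > 25) return null;
  return String.fromCharCode(65 + index);
}

// In the browser, the Web MIDI API wires the parser to the sketch:
if (typeof navigator !== 'undefined' && navigator.requestMIDIAccess) {
  navigator.requestMIDIAccess().then((midi) => {
    for (const input of midi.inputs.values()) {
      input.onmidimessage = (e) => {
        const msg = parseMidiMessage(e.data);
        if (msg.kind === 'noteOn') {
          // e.g. draw the letter and play its ambient sound here
          console.log('letter:', noteToLetter(msg.note));
        }
      };
    }
  });
}
```

Web MIDI is only available in the browser, hence the guard; in the actual project, wiring the controller to p5.js was the part that needed Andreas’s help.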

Process

I initially focused on developing the idea and pushing it as far as possible, thinking of ways to give the concept more depth and background and to make things interesting. As we shifted to the making phase, I turned my focus to getting the project to work: I worked primarily on coding the programme and integrating the hardware and features while my groupmates collated our soundbank. At the end of the project, I focused on guiding our users and watching them interact with the project, taking down pointers and observations. I also tidied up the project write-ups to ensure we had an accurate depiction of what we had created, and handled the video recording and editing of the outcome and user testing to control the art direction of the documentation.

The ideation process started with Sing Hong and me flipping through the reference books brought by Andreas. We realised we could attempt something that interested both of us by combining elements of sound and typography. We started developing the Ambient Type probe idea before sharing it with our other group members, who were not in class that day. Luckily, they seemed mostly on board with the idea. It was pushed further when, one day in the shower, I noticed that the showerhead holder formed the letter “T”. That got me wondering whether we could incorporate found vernacular typography into the idea of ambient type: what if letterforms could hear, and were shaped by the environment around them?

The outcome would be a series of letters shaped by ambient sound waves, each ambient sound forming a unique letter and together creating the sounds of our city. However, we realised that we had lost the idea of play: the project had no interactive element besides the act of searching for and collecting vernacular typography and ambient sounds. After some brainstorming, I remembered coming across generative graphics controlled by unconventional tools such as console controllers and even MIDI controllers, meaning it was possible to control p5.js-generated graphics with them. After discussing it with the group, we realised that Sing Hong had a MIDI controller, so we decided to use it to introduce an element of interactivity and play to the project.

Getting the MIDI controller to work with p5.js was quite tricky. At one point, we also attempted to build a set of controls using an Arduino board and some knobs, which didn’t work either because I couldn’t figure out how to set up the serial communication between the board and p5.js, especially since we had no prior experience with it. Despite many, many attempts to get things to work, it seemed like we were moving backwards instead of progressing. We were reminded that we had to balance our ambition against our skill level, and that reminder came in time: we were too ambitious in our execution and really did not have the skills to pull it off. We decided to simplify the outcome, mainly to feature our soundbank of collected sounds and to streamline how the letters were manipulated.

However, we were lucky enough to have Andreas help us out with the code to get things working. I attempted to play around with the functions after watching a bunch of tutorials online, but I ended up breaking things and causing more errors. Despite the many, many failures, it was a good learning experience; through it, I think I’ve progressed further in my computation journey and have come to realise my weaknesses and what I have to work on in the future.

By the end of the project, we had many different iterations of our project’s abstracts and ideas. I focused on tidying up the write-ups, taking pieces from each version and making sure they accurately described the project. We also got a chance to document our outcome with a proper setup, thanks to the seniors who had studio lights set up for their own documentation. I seized the opportunity to borrow the setup and got Sing Hong to help record two extra videos: one featuring the different features and our soundbank, and the other of me creating the DJ Hear That!? logotype using the tool we had created.



Artefact

DJ Hear That!? incorporates computation in design and is composed of a p5.js sketch controlled through a MIDI controller. Borrowing the concept of DJ’ing, users are able to “play” with sound and type simultaneously using the various controls on the MIDI controller. Each letter has been allocated an abstract ambient sound collected through vernacular typography. The controller allows users to type in letters to form words or graphical compositions.

Users are then able to manipulate these sounds by playing with the knobs, adjusting the sound volume, frequency and pitch. These controls simultaneously adjust the size of the letters and shift them around the screen. The pads on the controller allow users to toggle between letters as well as clear the screen.
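Concretely, each knob sends a control-change value between 0 and 127, which can be rescaled into whatever range a parameter needs. The helper below is a hypothetical sketch of that kind of mapping: the ranges and names are our illustration, not the project’s exact values. The key idea is that one knob value drives the sound parameter and the letter size together, which keeps sound and visual in step.

```javascript
// Linearly rescale a value from one range to another,
// the same idea as p5.js's map() function.
function rescale(value, inMin, inMax, outMin, outMax) {
  return outMin + ((value - inMin) / (inMax - inMin)) * (outMax - outMin);
}

// Turn one knob's 0-127 control-change value into the paired
// parameters it drives: a volume (0-1) and a letter size in pixels.
// Both output ranges are illustrative assumptions.
function knobToParams(ccValue) {
  return {
    volume: rescale(ccValue, 0, 127, 0, 1),
    letterSize: rescale(ccValue, 0, 127, 12, 240),
  };
}
```

In a p5.js draw loop, letterSize would feed something like textSize() while volume feeds the sound object’s volume control, so turning a single knob moves both the type and the sound at once.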

Visual typographic compositions and ambient soundscapes emerge from the permutations and combinations the artefact allows. The project encourages users to play with the composition of letters and sounds, which affects each individual differently and allows for a variety of unique compositions.

The Ambient Type probes draw elements from the surrounding environment and pair them with typography, creating a multimedia tool that offers a unique experience: visually striking and sonically intriguing compositions in which design and soundscape are created in relation to each other.





Mic test 1, 2.
Oh wait, I'm not an MC, I'm a DJ!

Testing

We got a bunch of really interesting results through our user testing session. DJ Hear That!? can be considered an immersive experience since it fully engages the user, requiring them to focus in order to create something. The project remains open-ended, allowing users to explore numerous sound and visual compositions.

While some experimented with typographic compositions with complete disregard for the sounds, others chose to focus on soundscaping. The influence of sound, of visuals, or of both (an abstract juxtaposition of the two) can be read from the various outcomes.

Using a large television screen for display and audio, a MacBook to run the programme and a MIDI controller set up on a wooden pedestal, we converted a section of DW103 for our user testing.

Initially, we decided to conceal the purpose of each button and let users play around with the whole MIDI controller. After the first test, however, we realised that users would mess up some of the MIDI settings, causing the programme to stop working. So we provided parameters for the users, informing them which buttons they could control without telling them what each one did. This approach gave users a little less clarity, but it made them want to return and give it another go after getting a better idea of it, which serves as an example of learning from experience.

We thought there would be more button mashing, but surprisingly only one person, Toby, did so. Our guess was that the loud noises scared some people and made them more cautious about the keys they were pressing, discouraging them from mashing buttons.

One of the more interesting outcomes came from Jodi, who layered type into a massive composition that turned the screen almost entirely white while simultaneously generating a loud white-noise sound. The type correlated with the sound: as both layers added up, the white screen and the white noise came to match, sound to visual.

Johnson figured out that the vowels “A, E, I, O, U” had special sounds and spelt his name, Yuqi spelt out “YEET”, others like Joanne and Vanessa tried to draw faces using the letters, and Annabelle seemed to focus more on the sound.

Most users found the project interesting and said they had always wanted to try being a DJ, although not in this sense. Most of them managed to figure out that the sounds were ambient city sounds, and users were split 50-50 on whether they realised the correlation between the sound and the typography. While most said their final outcomes were influenced by both type and sound, or by type alone, only a small percentage were influenced more by sound.

Users found it interesting that a narrative formed as they typed, sort of telling a story, and they enjoyed being able to create a musical soundscape from everyday sounds. They also said the project enhanced typographic meaning through the adjustments of sound and type, and that it created a kind of noise music. The project could also be thought of as a multimedia tool for creating design, and it sparked curiosity in most of our users.

Some feedback on improvements was that users would like more variation in type and form, such as colour and rotation. One of the most requested improvements was a backspace button, along with a better interface for easier interaction. Users also wanted a more standardised volume and more consistent rhythms and beats to help with creating a soundscape.

Some in-class feedback was that we should invite other people, non-designers, to join the user testing, as they might have different takeaways; musicians, for example, might focus more on the sound. Feedback from Andreas was that the interface could be further improved, as some people might get overwhelmed by the MIDI controller; we could also explore using an Arduino to create our own simplified MIDI controller. The sounds could be better controlled too: introducing more silent or quieter sounds would help, as things could get quite overwhelming, which pushed people to focus less on the sound and more on the visuals.

Feedback from Joanne was that we could improve the relationship between sound and type, and that she was actually more interested in the sound than the type because she likes noise music. She also mentioned that we could explore the on-screen overlaps between the white and black forms that emerge, as overlaps sometimes cause some weird distortions. Also, we should put this on a website!



Reflection & Conclusion

This project made me realise that computation isn’t just about being able to code or work with technology. It is also about finding ways to compound and develop ideas and, where possible, to use machines and code to make unconventional design and change the way we think about and approach design. It is about understanding what’s possible and then finding ways to use it. The computer doesn’t drive the idea; instead, we should let the idea develop and show us ways to integrate computation. I feel like that’s also the role of a designer with an interest in computation: it’s not just about being able to do the computation, but about being able to envision ideas that integrate it, and not being afraid to ask for help when working on projects that are out of your depth.

There’s still a lot I need to learn about computation. Still, I’m glad I had some knowledge of computational possibilities, which came in handy throughout the project when thinking about ideas and possible executions. The project opened my eyes to what can be executed within such a short time frame, and taught me that it’s not about the outcome at this point: with each project there’s always the possibility of doing more, but we have to match our ambition to our skill level. We were still able to create something exciting, and if I furthered my skills I could push the project further and improve on it with time. With the feedback we got, there are so many possible ways to improve from here. Should we add more features, bring the interface online, or switch to a more straightforward interface through physical computing? I’m pretty excited to try out different improvements and implementations over the break and see where the project can go!