Music is a popular form of entertainment. Instruments are often played by experts in studios or at live performances, and digital or digitised (computer) music is also popular amongst today’s listeners. However, there are subtle differences between the sound of a real instrument and digital music, and the physical movement involved in playing a real instrument creates a different atmosphere. Hence, this project addresses the development of a robotic pan flute that enables a real instrument to be played by non-expert musicians. The various subsystems of the low-cost prototype have been developed and its performance evaluated. Experimental results indicate that the system can produce the desired (theoretical) music notes and is robust to variations in compressed air pressure over the tested range.
An overview of the robotic pan flute system is shown in Figure 1. A laptop running LabVIEW and a connected camera are used primarily for user input and automatic calibration of the pan flute system. An Arduino Mega 2560 microcontroller receives commands from the laptop to control the nozzle position and air pressure, thereby producing the desired note. During the automatic calibration process, the camera detects the centre location of each pipe and a microphone provides sound frequency feedback for air pressure adjustment.
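The exact command format exchanged between LabVIEW and the Arduino is not detailed here. Purely as an illustration of the control split described above, the short Arduino-style C++ sketch below assumes simple ASCII commands over USB serial ('M' to move the nozzle to an absolute step position on the rail, 'P' to set the drive level fed to the E/P transducer), a stepper-driven rail using the standard Stepper library, and a PWM output conditioned into the transducer's voltage input; all pin numbers and constants are assumptions rather than details of the actual prototype.

```cpp
// Illustrative sketch only: command set, pins and motor parameters are assumed.
#include <Arduino.h>
#include <Stepper.h>

const int STEPS_PER_REV = 200;                  // assumed motor resolution
Stepper rail(STEPS_PER_REV, 8, 9, 10, 11);      // assumed driver pins for the linear rail

const int PRESSURE_PIN = 5;                     // PWM pin conditioned into the E/P transducer input
long currentStep = 0;                           // tracked nozzle position in steps

void setup() {
  Serial.begin(115200);
  rail.setSpeed(60);                            // rail motor speed in rpm (assumed)
  pinMode(PRESSURE_PIN, OUTPUT);
}

void loop() {
  if (Serial.available() > 0) {
    char cmd = Serial.read();
    long value = Serial.parseInt();             // numeric argument following the command letter
    if (cmd == 'M') {                           // move the nozzle to an absolute step position
      rail.step(value - currentStep);
      currentStep = value;
    } else if (cmd == 'P') {                    // set the air pressure drive level (0-255 PWM duty)
      analogWrite(PRESSURE_PIN, constrain(value, 0, 255));
    }
  }
}
```

In this arrangement the laptop would issue one move command and one pressure command per note, with song timing presumably handled on the LabVIEW side.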
Utilising a single nozzle and voltage-to-pressure (E/P) transducer combination that moves linearly reduces component costs and allows different-sized pan flutes to be played. Figure 2 illustrates the nozzle mounted on a linear rail and connected to the E/P transducer. Figure 3 shows the location of the web camera used to detect the pipes and their centre locations. Figure 4 gives an overview of the pipe location process, and Figure 5 illustrates the sound calibration process.
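In the actual system the pipe location step (Figure 4) is implemented with LabVIEW's vision tools, so the OpenCV (C++) snippet below is only a rough sketch of the underlying idea: detect the circular pipe openings in a captured frame and report their centre coordinates, which can then be mapped to nozzle positions along the rail. The image file name and the blur/Hough parameters are placeholders, not values from the prototype.

```cpp
// Rough sketch of pipe-opening detection; not the LabVIEW method used in the prototype.
#include <opencv2/opencv.hpp>
#include <iostream>
#include <vector>

int main() {
    cv::Mat frame = cv::imread("panflute.jpg");         // placeholder camera frame
    if (frame.empty()) {
        std::cerr << "could not load image" << std::endl;
        return 1;
    }

    cv::Mat gray;
    cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
    cv::GaussianBlur(gray, gray, cv::Size(9, 9), 2.0);  // smooth before circle detection

    // Detect circular pipe openings; parameters depend on camera distance and pipe size.
    std::vector<cv::Vec3f> circles;
    cv::HoughCircles(gray, circles, cv::HOUGH_GRADIENT, 1.0, 30.0, 100.0, 30.0, 10, 60);

    for (const auto& c : circles) {
        std::cout << "pipe centre (x, y) = (" << c[0] << ", " << c[1]
                  << "), radius = " << c[2] << std::endl;
    }
    return 0;
}
```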
Video of the robotic pan flute in action.
Figure 1. Overview of robotic pan flute system.
Figure 2. Nozzle mounted on linear rail.
Figure 3. Camera location.
Figure 4. Image processing for pipe location.
Figure 5. Sound calibration process.
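The sound calibration process (Figure 5) can be summarised as a simple feedback loop: sound a pipe, estimate its pitch from the microphone, and nudge the pressure command until the measured frequency falls within tolerance of the target note. The plain C++ sketch below illustrates this loop with a simulated pitch measurement standing in for the microphone; the tolerance, step size, and the assumption that pitch rises with blowing pressure are illustrative and are not taken from the published results.

```cpp
// Simplified calibration loop; the real logic runs in LabVIEW with microphone feedback.
#include <cmath>
#include <cstdio>

// Hypothetical stand-in for the microphone pitch estimate: pretend pitch rises
// roughly linearly with the pressure command around a pipe's natural note.
double measureFrequencyHz(double pressureCommand) {
    return 400.0 + 2.5 * pressureCommand;
}

double calibratePressure(double targetHz, double startCommand) {
    const double toleranceHz   = 2.0;   // assumed acceptable pitch error
    const double stepSize      = 0.5;   // assumed pressure-command increment
    const int    maxIterations = 50;

    double command = startCommand;
    for (int i = 0; i < maxIterations; ++i) {
        double measured = measureFrequencyHz(command);
        double error = targetHz - measured;
        if (std::fabs(error) <= toleranceHz) {
            std::printf("tuned to %.1f Hz at command %.2f\n", measured, command);
            return command;
        }
        // Higher blowing pressure generally raises the sounded pitch, so move
        // the command in the direction of the remaining error.
        command += (error > 0.0 ? stepSize : -stepSize);
    }
    return command;                     // best effort after maxIterations
}

int main() {
    calibratePressure(440.0, 10.0);     // e.g. tune one pipe towards A4
    return 0;
}
```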
For more details on this project, refer to the following:
Journal Paper:
Chand, P., Kumar, K. and Kumar, K. Development of a Low-Cost Robotic Pan Flute. International Journal of Intelligent Machines and Robotics, Inderscience, 2018, 1(2), 153-170.
Conference Proceedings Paper:
Kumar, K., Kumar, K., Chand, P. and Carnegie, D. A. Towards an Automated Pan Flute Player. In Proceedings of the 6th International Conference on Automation, Robotics and Applications, 2015, pp. 192-197, Queenstown, New Zealand.
Kumar, K., Kumar, K., Chand, P. and Carnegie, D. A. Pan Piper 1.0: An Overview of a Robotic Pan Flute for Pacific Music. In Proceedings of IEEE Asia-Pacific World Congress on Computer Science and Engineering, 2014, Nadi, Fiji.
News Article (Phys.org - Science X Network):
How to dance to a synthetic band (2018, November 16) retrieved 1 October 2022 from https://phys.org/news/2018-11-synthetic-band.html