In recent years, collaborative robots have attracted attention for their ability to work alongside humans, and the ability to share tools is an important factor in their development. In this study, we investigated the methods required to realize a robot that can manipulate a tool, here a musical saw. To understand the tool's characteristics, we modeled it and performed finite element analysis; based on the analysis results, we determined the initial command values given to the robot. Under the constraint of the robot's range of motion, we designed a hand jig and a tool arrangement that expand the controllable sound range. The robot uses acoustic information to judge the situation and assigns different roles to its two arms. The left hand controls the generated frequency by applying a forced displacement to the instrument. With the right hand, the robot monitors the difference in sound pressure between the main and secondary frequencies and controls the bowing position in the plane of the instrument to increase this difference. Introducing this control based on acoustic recognition makes it possible to teach the musical-saw playing posture in real time. Using this method, we demonstrated the possibility that a tool-manipulating robot can work in an environment where it coexists with humans.
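The right-hand control described above can be illustrated as a simple feedback loop: estimate the sound-pressure difference between the main frequency and the strongest secondary frequency from the microphone signal, then nudge the bowing position in whichever direction increases that difference. The following is a minimal sketch of that idea; the function names, the 20 Hz exclusion band, and the hill-climbing update rule are illustrative assumptions, not the implementation used in the study.

```python
import numpy as np


def partial_pressure_diff(signal, fs, f0):
    """Sound-pressure difference (dB) between the main frequency f0
    and the strongest secondary partial in `signal` (assumed sketch)."""
    windowed = signal * np.hanning(len(signal))
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(signal), 1.0 / fs)
    fundamental = spectrum[np.argmin(np.abs(freqs - f0))]
    # Exclude a band around f0 (20 Hz is an assumed width),
    # then take the loudest remaining spectral peak as "secondary".
    secondary = spectrum[np.abs(freqs - f0) > 20.0].max()
    return 20.0 * np.log10(fundamental / secondary)


def update_bow_position(pos, prev_pos, diff, prev_diff, step=1.0):
    """One hill-climbing step for the bowing position: keep moving in
    the direction that increased the pressure difference, else reverse."""
    direction = np.sign(pos - prev_pos) or 1.0
    if diff < prev_diff:
        direction = -direction
    return pos + step * direction
```

In practice the loop would run continuously: record a short audio frame, compute the difference, move the bow, and repeat, so the robot converges on the bowing position that makes the fundamental dominate.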