While several best-practice standards are available for minimizing the energy requirement of compressed air use in an industrial context, moving to best practice often requires investment and operational change. In production facilities, there is often a reluctance to commit to this type of change without a clear view of the benefit. Furthermore, there is very little detailed information available in the open literature that allows even a qualitative assessment of priorities. In order to address this shortcoming, analyses of two industrial compressed air systems already installed in manufacturing plants have been conducted in the context of energy usage. The installations are quite different in their compressed air needs: one is focused on actuation and drying, while the other uses compressed air primarily for material handling. At both sites, the energy of the compressed air is evaluated at each key element of the system and the typical end-use application profile is assessed. Simple models of the consumption rates are used to relate duty cycle and device count to the actual total consumption. A new way of assessing the leak rate of the entire system has been developed, based on the pressure decay time, and has been implemented at one site. In this way, the energy balance of the entire system has been analyzed quantitatively, with the effect of distribution leaks accounted for directly. It is found that at both sites, open blowing operations (e.g. drying) are the largest consumers that are amenable to optimization. It is also found that the measured leak rate at one site represented 23% of the compressed air generated, with an energy input of 455 kWh per day. It is concluded that this approach can help to identify priorities for optimizing compressed air use at an industrial site.
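The abstract does not give the details of the pressure-decay leak assessment, but a common formulation of this type of test estimates the leak flow rate (expressed as free air) from the system volume, the observed pressure drop, and the decay time, assuming an isothermal process. The sketch below illustrates that standard relation; the function name, the example volume, pressures, and timing are illustrative assumptions, not values from the paper.

```python
def leak_rate_free_air(volume_m3, p_start_kpag, p_end_kpag, decay_time_s,
                       p_atm_kpa=101.325):
    """Estimate system leak rate in free-air m^3/min from a pressure-decay test.

    Assumes the compressors are off, demand is isolated, and the decay is
    slow enough to treat as isothermal (ideal gas at constant temperature).

    volume_m3    : total receiver + pipework volume under pressure [m^3]
    p_start_kpag : gauge pressure at start of decay [kPa]
    p_end_kpag   : gauge pressure at end of decay [kPa]
    decay_time_s : observed decay time [s]
    """
    dp = p_start_kpag - p_end_kpag          # pressure drop [kPa]
    decay_time_min = decay_time_s / 60.0    # [min]
    # Mass of leaked air, re-expressed as an equivalent free-air volume
    # at atmospheric pressure, per minute of decay.
    return volume_m3 * dp / (p_atm_kpa * decay_time_min)


# Hypothetical example: 10 m^3 system dropping from 700 to 600 kPag in 10 min.
q_leak = leak_rate_free_air(10.0, 700.0, 600.0, 600.0)
print(f"Estimated leak rate: {q_leak:.2f} m^3/min free air")
```

Multiplying such a leak rate by the compressor's specific energy (kWh per free-air m³) and the hours of pressurization per day yields a daily leak energy figure of the kind reported in the abstract (455 kWh per day at one site).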
