End-user interface
Q: How is speech signalling implemented and what is needed for it to work?
Re: Speech signalling, like the other signalling methods, is part of the "User interfaces" subsystem and is implemented in the modules of the visual control area (VCA): the VCA engine UI.VCAEngine and the visualisers UI.Vision and UI.WebVision. To synthesize speech you need a synthesizer that supports the desired language, such as RHVoice, festival or espeak. Playing the synthesised sound usually needs no extra tools in typical environments; in specific environments you also need the play program, which is normally shipped in the sox package. For the precise dependencies see the corresponding external notification methods.
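The chain described above (synthesizer produces audio, a player sends it to the sound card) can be checked outside the VCA. Below is a minimal Python sketch, assuming espeak and the sox play utility are installed; it only illustrates the synthesizer-to-player pipeline and is not how UI.VCAEngine itself invokes the configured notification command.

```python
import subprocess

def speak(text, voice="en"):
    """Synthesize `text` with espeak and play it through sox's `play`.

    Illustrative only: the VCA engine runs the synthesizer configured
    for the corresponding notification method itself.
    """
    # espeak --stdout writes the synthesized speech as a WAV stream to stdout
    synth = subprocess.Popen(
        ["espeak", "-v", voice, "--stdout", text],
        stdout=subprocess.PIPE,
    )
    # sox's `play` reads the WAV stream from stdin and outputs it to the sound card
    subprocess.run(["play", "-q", "-t", "wav", "-"], stdin=synth.stdout, check=True)
    synth.stdout.close()
    synth.wait()

if __name__ == "__main__":
    speak("Alarm: temperature limit exceeded")
```

If the test phrase is audible, the synthesizer and player dependencies for speech signalling are in place; the remaining configuration is done in the notification properties of the VCA project.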