Work currently being done in voice-based automation comes in two flavours: recognition technology and interface design. Machine voice systems are moving on from the traditional menu hierarchy ("For Customer Support, say 'Yes' now") towards more intuitive, familiar designs. A lot of the lessons being learnt there apply equally to Conversational User Interfaces [CUI] (and, I imagine, vice versa). See AT&T's How May I Help You? voice research, and especially the research paper "Designing User Interfaces for Spoken Dialog Systems" [PDF, at the bottom of the page].
It's at this point you should also go away and read The Jack Principles -- in short, how to direct the conversation so a user will never be in a position to ask a question the machine can't answer.
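To make that idea concrete, here's a minimal sketch in Python of a directed dialog. Everything in it (the states, prompts, and choices) is my own invention, not taken from The Jack Principles or the AT&T work. The point it illustrates is the one above: every prompt enumerates its acceptable replies, so the user is never left free to ask something the machine can't handle.

```python
# A toy directed-dialog state machine. All states and prompts here are
# hypothetical examples; the technique is what matters: the system only
# ever asks questions whose valid answers it has already enumerated.

DIALOG = {
    "start": {
        "prompt": "Do you need Customer Support or Billing?",
        "choices": {"customer support": "support", "billing": "billing"},
    },
    "support": {
        "prompt": "Is your problem with hardware or software?",
        "choices": {"hardware": "done", "software": "done"},
    },
    "billing": {
        "prompt": "Would you like your balance or a payment plan?",
        "choices": {"balance": "done", "payment plan": "done"},
    },
}

def run_dialog(state="start"):
    while state != "done":
        node = DIALOG[state]
        # The prompt itself names every acceptable answer, so the
        # recogniser only has to distinguish a handful of phrases.
        answer = input(node["prompt"] + " ").strip().lower()
        if answer in node["choices"]:
            state = node["choices"][answer]
        else:
            # Re-prompt rather than fail: steer the user back towards
            # the options the machine actually understands.
            print("Sorry, please say one of: " + ", ".join(node["choices"]))
    print("Thanks, transferring you now.")

if __name__ == "__main__":
    run_dialog()
```

The re-prompt branch is where the Jack Principles show up most clearly: when the user strays, the system doesn't try to interpret an open-ended utterance, it just restates the closed set of options and keeps the conversation on rails.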