PhD Defense: Periodicity, Surprisal, Attention: Skip Conditions for Recurrent Neural Networks
24 March 2021
Abstract: Recurrent neural networks are powerful models for sequence learning. This thesis addresses a typical limitation of recurrent neural networks: they fully process their input and update their state at every timestep of a sequence, even when the presented information is redundant or irrelevant to the given task. Building on the conditional computation framework, we introduce and investigate periodicity, surprisal, and attention as skip conditions. We evaluate the presented methods in the context of natural language processing and explore how to balance the trade-off between model performance and the amount of skipping.
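For readers unfamiliar with conditional computation, the following is a minimal sketch of the general idea of a skip condition, assuming a periodicity-based condition on a vanilla RNN cell; all function names, parameters, and the condition itself are illustrative assumptions, not the models defended in the thesis.

```python
import numpy as np

def rnn_cell(x_t, h_prev, W_xh, W_hh, b_h):
    """One vanilla tanh RNN update (hypothetical, for illustration)."""
    return np.tanh(x_t @ W_xh + h_prev @ W_hh + b_h)

def periodic_skip_rnn(xs, h0, W_xh, W_hh, b_h, period=2):
    """Run an RNN over a sequence, updating only every `period`-th step.

    On skipped steps the previous hidden state is carried forward
    unchanged, so no cell computation happens for that timestep
    (a periodicity-based skip condition).
    """
    h = h0
    states = []
    for t, x_t in enumerate(xs):
        if t % period == 0:   # skip condition: update periodically
            h = rnn_cell(x_t, h, W_xh, W_hh, b_h)
        states.append(h)      # copied state on skipped steps
    return np.stack(states)

# Toy usage: 6 timesteps of 4-dim input, 3-dim hidden state.
rng = np.random.default_rng(0)
xs = rng.normal(size=(6, 4))
h0 = np.zeros(3)
W_xh = rng.normal(size=(4, 3)) * 0.1
W_hh = rng.normal(size=(3, 3)) * 0.1
b_h = np.zeros(3)
print(periodic_skip_rnn(xs, h0, W_xh, W_hh, b_h, period=2).shape)  # (6, 3)
```

With period=2, half of all cell updates are saved at the cost of reacting to the input only every other step; surprisal- or attention-based conditions would instead decide per step from the input whether an update is worthwhile.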
To participate in the PhD defense, please contact Tayfun via email. The Zoom link will then be emailed to you shortly before the meeting.
Wednesday, 24 March 2021, 14:00, Online Meeting (Zoom)
Speaker: Tayfun Alpay, Informatics Dept., Univ. Hamburg