The Emergence of Chunking Structures with Hierarchical HRNN

Authors

  • Zijun Wu Dept. Computing Science & Alberta Machine Intelligence Institute (Amii), University of Alberta
  • Anup Anand Deshmukh University of Waterloo
  • Yongkang Wu Huawei Poisson Lab
  • Jimmy Lin University of Waterloo
  • Lili Mou Dept. Computing Science & Alberta Machine Intelligence Institute (Amii), University of Alberta

Abstract

In Natural Language Processing (NLP), predicting linguistic structures, as in parsing and chunking, has mostly relied on manual annotations of syntactic structures. This paper introduces an unsupervised approach to chunking, a syntactic task that groups words in a flat, non-hierarchical manner. We present a Hierarchical Recurrent Neural Network (HRNN) designed to model word-to-chunk and chunk-to-sentence compositions. Our approach involves a two-stage training process: pretraining with an unsupervised parser and finetuning on downstream NLP tasks. Experiments on multiple datasets show a notable improvement in unsupervised chunking performance in both the pretraining and finetuning stages. Interestingly, we observe that the emergence of chunking structures is transient during the neural model's downstream-task training. This study contributes to the advancement of unsupervised syntactic structure discovery and opens avenues for further research in linguistic theory.
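
To make the two-level composition in the abstract concrete, here is a minimal sketch (not the authors' released model) of a hierarchical RNN in which a word-level RNN composes the words of each chunk into a chunk embedding and a chunk-level RNN composes those embeddings into a sentence representation. The class name, the use of GRUs, the dimensions, and the assumption that chunk boundaries are given as a list of lengths are all illustrative choices, not details from the paper.

```python
# Minimal sketch of a two-level hierarchical RNN: words -> chunks -> sentence.
# All names and hyperparameters are assumptions for illustration.
import torch
import torch.nn as nn


class TwoLevelRNN(nn.Module):
    def __init__(self, vocab_size: int, emb_dim: int = 64, hid_dim: int = 128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.word_rnn = nn.GRU(emb_dim, hid_dim, batch_first=True)   # word-to-chunk composition
        self.chunk_rnn = nn.GRU(hid_dim, hid_dim, batch_first=True)  # chunk-to-sentence composition

    def forward(self, token_ids: torch.Tensor, chunk_lengths: list[int]) -> torch.Tensor:
        # token_ids: (seq_len,) ids of one sentence; chunk_lengths partitions the sentence.
        words = self.embed(token_ids)                   # (seq_len, emb_dim)
        chunk_vecs = []
        for span in torch.split(words, chunk_lengths):  # iterate over the given chunks
            _, h = self.word_rnn(span.unsqueeze(0))     # final hidden state = chunk embedding
            chunk_vecs.append(h.squeeze(0))             # (1, hid_dim)
        chunks = torch.stack(chunk_vecs, dim=1)         # (1, n_chunks, hid_dim)
        _, sent = self.chunk_rnn(chunks)                # compose chunks into a sentence vector
        return sent.squeeze(0).squeeze(0)               # (hid_dim,)


# Usage: "the old man / sat / on the bench" -> chunk lengths [3, 1, 3]
model = TwoLevelRNN(vocab_size=1000)
sentence = torch.randint(0, 1000, (7,))
print(model(sentence, chunk_lengths=[3, 1, 3]).shape)   # torch.Size([128])
```

In the unsupervised setting the paper targets, the chunk boundaries would not be given but inferred; this sketch only illustrates the compositional structure, with the boundary-induction and two-stage training left out.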

Published

2025-09-09