Research

My work focuses on sentence structure and how it is processed by humans and machines. I am especially interested in sentence patterns where words are missing (ellipsis) or appear far from where they belong (extraction). My research combines various techniques:

  • Statistical modelling
  • Large corpora (collections of text)
  • Computer models (both grammar-based models and neural networks)
  • Experiments (controlled studies with human informants or language models)

My theoretical work is based on frameworks that focus on the surface form of sentences, such as HPSG/SBCG, and I have built medium-sized computational models of these grammars for research, including an NLTK feature-based grammar for Triqui, in collaboration with Christian DiCanio.
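To give a sense of what an NLTK feature-based grammar looks like, here is a minimal illustrative sketch. This toy English agreement grammar is not the Triqui grammar mentioned above; the lexical entries and features are invented purely for demonstration.

```python
import nltk

# A toy feature-based grammar enforcing number agreement between
# determiner, noun, and verb. Feature variables like ?n unify across
# the rule, in the style of NLTK's FeatureGrammar formalism.
GRAMMAR = nltk.grammar.FeatureGrammar.fromstring("""
% start S
S -> NP[NUM=?n] VP[NUM=?n]
NP[NUM=?n] -> Det[NUM=?n] N[NUM=?n]
VP[NUM=?n] -> V[NUM=?n]
Det[NUM=sg] -> 'this'
Det[NUM=pl] -> 'these'
N[NUM=sg] -> 'dog'
N[NUM=pl] -> 'dogs'
V[NUM=sg] -> 'barks'
V[NUM=pl] -> 'bark'
""")

parser = nltk.parse.FeatureChartParser(GRAMMAR)

# The agreeing sentence parses; the non-agreeing one is rejected,
# because NUM=sg on the determiner fails to unify with NUM=pl.
print(len(list(parser.parse('these dogs bark'.split()))))  # → 1
print(len(list(parser.parse('this dogs bark'.split()))))   # → 0
```

The same unification machinery scales to richer feature structures (case, person, subcategorization), which is what makes it a convenient testbed for surface-oriented frameworks like HPSG/SBCG.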

PhD Supervision

I'm also currently serving as the Director of Graduate and Undergraduate Studies for the Computational Linguistics programs.

Representative research

Book published by Oxford University Press

Chaves, Rui P. and Michael T. Putnam. (2021) Unbounded Dependency Constructions: theoretical and experimental perspectives, Oxford Surveys in Syntax and Morphology 10, Oxford University Press.

This volume offers a comprehensive overview of unbounded dependency constructions and their constraints. It provides a detailed empirical and theoretical comparison of movement-based and non-movement-based accounts, and reports new data and experimental findings that challenge long-standing theoretical assumptions. This work argues for an exemplar-based, construction-based conception of extraction and of grammatical theory that is consistent with the behavioral facts of incremental sentence processing, and it showcases how linguistic phenomena can be shaped by the interplay of syntactic, semantic, pragmatic, phonological, and cognitive factors.

A review of the book in the Journal of Linguistics can be found here.
There is also a list of errata.