
On directionality of phrase structure building

CHESI C
2012

Abstract

The goal of this talk is to provide empirical arguments in favor of a derivational view of the grammar in which structure building occurs incrementally, top-down (Chesi 2004, 2007) and from left to right (Phillips 1996, 2003). Following the spirit of Minimalist research (Chomsky 1995, 2008), I will show that the bottom-to-top orientation of phrase structure building is not a “virtual conceptual necessity” and that we can gain in descriptive adequacy if we move away from the idea that phrases are created by the recursive application of transformations like Merge. In a nutshell, I propose reversing the tree-structure building procedure. If phrases are expanded rather than created by Merge, we can interpret Chomsky’s notion of Phase as the minimal domain in which a given set of syntactic features must be linearized and processed (either lexicalized or further expanded). In this way, we better characterize the distinction between nested expansion domains (syntactic strong islands), which are highly computationally demanding, and cost-free recursive expansions such as the last selected complement of a verb. I argue that Movement (the creation of a long-distance filler-gap dependency) is triggered by the insertion of unexpected features into the computation. This is the case for wh-elements like “who”, which are inserted in the left periphery of the matrix phase to express an interrogative feature; but “who” also bears argument features, which are not licensed in that peripheral wh-position. In order to reconstruct any non-local dependency in top-down, left-to-right terms, I propose using a Memory Buffer where unexpected feature bundles are stored until the relevant (thematic) licensing position is found. By regulating the access and inheritance mechanisms of the memory buffer in terms of phases, we succeed in capturing islandhood (Chesi 2004) and parasitic gap constructions (Bianchi & Chesi 2006).
We also account for the intermediate status, in terms of transparency, of certain adjuncts (Pollard & Sag 1994), and for the reconstruction effects of some subjects of unaccusatives and passives (Bianchi & Chesi 2012). I want to stress that this grammatical perspective does not provide a processing account of these phenomena (the grammar is not the parser from this perspective) but shows how a formal grammatical model that includes these directionality constraints is empirically more adequate in unifying a set of facts otherwise only mysteriously related. However, I will also try to show that this grammatical model is explanatorily more adequate than the standard Minimalist one, and that memory load and feature confusion can account for locality constraints such as Relativized Minimality (Rizzi 1990) and for many asymmetries in the processing of minimal pairs of sentences, e.g., subject vs. object relative clauses (Traxler et al. 2002; Belletti & Chesi 2011).
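As a rough illustration of the Memory Buffer idea described above, the toy sketch below (not from the talk; the token representation and feature labels are my own assumptions) scans a sentence left to right, stores a wh-element whose argument features are not licensed in its left-peripheral position, and discharges it when a thematic (gap) position is reached:

```python
# Toy sketch of a Memory Buffer for filler-gap dependencies.
# Assumption: each token is a (word, feature-set) pair; the labels
# "wh", "argument", and "gap" are illustrative, not the model's own.

def scan(tokens):
    """Scan left to right; return (filler, gap_index) dependencies."""
    buffer = []        # Memory Buffer for unexpected feature bundles
    dependencies = []
    for i, (word, features) in enumerate(tokens):
        if "wh" in features and "argument" in features:
            # A wh-element in the left periphery: its argument
            # features are not licensed here, so store the filler.
            buffer.append(word)
        elif "gap" in features and buffer:
            # Thematic licensing position found: discharge the filler.
            dependencies.append((buffer.pop(), i))
    return dependencies

# "Who did Mary see _?" — "who" is stored until the object gap of "see".
sentence = [
    ("who",  {"wh", "argument"}),
    ("did",  {"aux"}),
    ("Mary", {"argument"}),
    ("see",  {"verb"}),
    ("_",    {"gap"}),
]
print(scan(sentence))  # [('who', 4)]
```

This leaves out everything that makes the actual proposal work (phase-based access and inheritance regulating the buffer, which is what derives islandhood and parasitic gaps); it only shows the store-then-discharge shape of the dependency.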

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.12076/1791