### 15.03.15 - 20.03.15, Seminar 15122

# Formal Models of Graph Transformation in Natural Language Processing

### This seminar description was published on our web pages before the seminar and was used in the invitation to the seminar.

## Motivation

In natural language processing (NLP) there is an increasing interest in formal models for processing graphs rather than more restricted structures such as strings or trees. Classic feature structures can be seen as graphs. Recent work in dependency parsing produces graphs rather than trees. New work in deep semantic annotation organizes logical meanings into directed graph structures, and several ongoing annotation efforts will soon yield large amounts of linguistic data in these representations. **Graph transformation** (graph-to-graph, graph-to-tree, graph-to-string, and vice versa) is thus becoming central to our thinking about natural language understanding, natural language generation, and machine translation.

Formal models of graph transformation have previously been studied and applied in various other areas of computer science, including formal language theory, term rewriting, theory and implementation of programming languages, concurrent processes, and software engineering. Generally speaking, the theory of graph transformation systems studies rule-based mechanisms for the manipulation of graphs. These mechanisms can be used to generate graphs (by means of graph grammars), to analyze them (by means of graph reduction systems), and to compute input–output relations (by means of general graph transformation systems). A particularly well-studied subject within the area of graph transformation, and one that has received quite some attention recently within the NLP community, is that of context-free graph grammars.
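The rule-based view of graph transformation described above can be illustrated with a toy edge-replacement step, loosely in the spirit of context-free (hyperedge replacement) graph grammars. This is a minimal sketch, not any standard formalism: the graph encoding as labeled edge triples, the function names, and the rule format are all illustrative assumptions.

```python
# A graph is represented here as a list of labeled edges (src, label, tgt).
# A rule replaces one "nonterminal" edge with a subgraph that attaches to
# the same endpoints -- the core idea behind context-free graph rewriting.

def apply_rule(edges, lhs_label, rhs):
    """Replace the first edge labeled `lhs_label` with the edges produced
    by `rhs`, a function mapping the edge's endpoints to new edges."""
    for i, (src, label, tgt) in enumerate(edges):
        if label == lhs_label:
            return edges[:i] + rhs(src, tgt) + edges[i + 1:]
    return edges  # no occurrence of lhs_label: graph is unchanged

# Toy rule: an edge labeled "S" is rewritten into a two-edge path
# through a fresh intermediate node.
def s_to_path(src, tgt):
    mid = object()  # fresh node identity
    return [(src, "a", mid), (mid, "b", tgt)]

g = [(0, "S", 1)]
g = apply_rule(g, "S", s_to_path)
# g now consists of two terminal edges forming a path from node 0 to node 1
```

Repeatedly applying such rules to nonterminal edges generates a graph language, analogously to how a context-free string grammar generates a string language.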

Few researchers from NLP are familiar with the theory of graph transformation, and at the same time, few researchers from the theory of graph transformation are aware of the specific desiderata, possibilities and challenges that one faces when applying this theory to NLP problems. This Dagstuhl Seminar will therefore bring researchers from the two areas together. The goal is to assess the state of the art, identify areas of common interest, and pave the way for future collaborations. We expect the seminar to create a new interdisciplinary research community, and its proceedings to become a standard reference for future work on graph transformation in NLP.

The specific research questions to be addressed include:

- What linguistic data is modeled using graphs, and what computations on this data are required for practical NLP applications?
- What formal models of graph transformation are available, and what are their properties with respect to expressiveness and algorithmic complexity?
- How well do the models meet other desiderata such as determinization, compositionality, incrementality, and learnability?
- How can formal models of graph transformation be combined with the standard machine learning techniques that are being used in NLP?
- What implementations of graph transformation frameworks exist, and how can they be used for applications within natural language processing?