Advancements in Natural Language Processing

Natural language processing research is moving toward more accurate and efficient handling of legal and formal texts, as seen in new libraries and frameworks built for complex sentence structures and specialized citations. Coreference resolution is also drawing increased attention, with progress in end-to-end neural systems, the use of pretrained language models to improve accuracy, cross-document contextual coreference resolution in knowledge graphs, and tighter integration of syntax and semantics. Noteworthy papers include:

  • NUPunkt and CharBoundary, which provide high-precision sentence boundary detection for legal texts (a simplified boundary-detection heuristic is sketched after this list),
  • End-to-End Dialog Neural Coreference Resolution, which balances efficiency and accuracy in large-scale systems,
  • Enhancing Coreference Resolution with Pretrained Language Models, which bridges the gap between syntax and semantics for more accurate referential relationships (see the embedding-based sketch below).
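
To make the sentence-boundary challenge concrete, the following is a minimal, illustrative heuristic that avoids splitting on periods inside legal abbreviations and citations. It is a sketch of the underlying problem only and does not reflect the NUPunkt or CharBoundary APIs; the abbreviation list and regex are assumptions chosen for this example.

```python
import re

# Lowercased tokens whose trailing period usually does not end a sentence
# in legal text (illustrative, not exhaustive).
LEGAL_ABBREVIATIONS = {
    "v.", "id.", "cf.", "e.g.", "i.e.", "no.", "art.", "sec.",
    "u.s.", "u.s.c.", "s.ct.", "cir.", "corp.", "inc.",
}

def split_legal_sentences(text: str) -> list[str]:
    """Split on sentence-final periods, skipping periods that belong to
    legal abbreviations or citations."""
    sentences, start = [], 0
    # Candidate boundaries: a period followed by whitespace and a capital letter.
    for match in re.finditer(r"\.(?=\s+[A-Z])", text):
        end = match.end()
        prev_token = text[start:end].split()[-1].lower()
        if prev_token in LEGAL_ABBREVIATIONS:
            continue  # the period is part of an abbreviation, not a boundary
        sentences.append(text[start:end].strip())
        start = end
    tail = text[start:].strip()
    if tail:
        sentences.append(tail)
    return sentences

print(split_legal_sentences(
    "See Roe v. Wade, 410 U.S. 113 (1973). The court held otherwise."
))
# ['See Roe v. Wade, 410 U.S. 113 (1973).', 'The court held otherwise.']
```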

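Similarly, the sketch below illustrates one way pretrained language model embeddings can inform referential links: candidate mentions are embedded with a generic encoder and scored against an anaphor by cosine similarity. The model choice, character spans, and similarity-based scoring are assumptions made for this example, not the architecture of the cited coreference papers; the Hugging Face transformers and PyTorch libraries are assumed to be available.

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Any pretrained encoder works for this illustration; bert-base-cased is an
# arbitrary choice, not the model used in the cited papers.
tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
model = AutoModel.from_pretrained("bert-base-cased")

def span_embedding(text: str, span: tuple[int, int]) -> torch.Tensor:
    """Mean-pool the contextual token embeddings that overlap a character
    span, yielding one vector per candidate mention."""
    inputs = tokenizer(text, return_tensors="pt", return_offsets_mapping=True)
    offsets = inputs.pop("offset_mapping")[0]
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]
    mask = [(s < span[1] and e > span[0]) and e > s for s, e in offsets.tolist()]
    return hidden[torch.tensor(mask)].mean(dim=0)

text = "The appellant filed a motion. She argued the court erred."
mentions = {"The appellant": (0, 13), "a motion": (20, 28), "She": (30, 33)}

# Score each earlier mention as an antecedent of "She" by cosine similarity.
anaphor = span_embedding(text, mentions["She"])
for name in ("The appellant", "a motion"):
    candidate = span_embedding(text, mentions[name])
    score = torch.cosine_similarity(anaphor, candidate, dim=0).item()
    print(f"score(She, {name}) = {score:.3f}")
```

Real end-to-end systems replace the raw cosine score with a learned pairwise scoring function over span representations, which is where the efficiency-versus-accuracy trade-offs discussed above come into play.
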
Sources

Precise Legal Sentence Boundary Detection for Retrieval at Scale: NUPunkt and CharBoundary

Intrinsic Verification of Parsers and Formal Grammar Theory in Dependent Lambek Calculus (Extended Version)

Proposing TAGbank as a Corpus of Tree-Adjoining Grammar Derivations

End-to-End Dialog Neural Coreference Resolution: Balancing Efficiency and Accuracy in Large-Scale Systems

Cross-Document Contextual Coreference Resolution in Knowledge Graphs

Enhancing Coreference Resolution with Pretrained Language Models: Bridging the Gap Between Syntax and Semantics

Reducing Formal Context Extraction: A Newly Proposed Framework from Big Corpora
