EmergentMethods/gliner_large_news-v2.1

This model is a fine-tune of GLiNER aimed at improving accuracy across a broad range of topics, especially for long-context news entity extraction.

Architecture: DeBERTa
Parameters: 435M
Task: Extract
Outputs: Entities
License: apache-2.0
Languages: en
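Because GLiNER models accept arbitrary label strings at inference time, extraction reduces to a single call to `predict_entities`. A minimal sketch using the `gliner` Python package (`pip install gliner`); the label set and example text below are illustrative, not part of the model card:

```python
labels = ["person", "organization", "location", "date"]


def extract_entities(text: str, threshold: float = 0.5) -> list[dict]:
    """Run zero-shot NER with the gliner library.

    Downloads the ~435M-parameter checkpoint on first use, so the
    import and model load are deferred until the function is called.
    """
    from gliner import GLiNER  # pip install gliner

    model = GLiNER.from_pretrained("EmergentMethods/gliner_large_news-v2.1")
    # Returns a list of dicts with "text", "label", "start", "end", "score".
    return model.predict_entities(text, labels, threshold=threshold)


if __name__ == "__main__":
    text = "Reuters reported that Apple opened a new office in Berlin on Monday."
    for ent in extract_entities(text):
        print(ent["text"], "->", ent["label"])
```

Raising `threshold` trades recall for precision, which matters for noisy long-context news input.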

Benchmarks

CoNLL-2003 (news · NER · en)

Named entity recognition on Reuters newswire text.

Corpus: 3,453 · Queries: 3,453
F1: 0.5527 · Precision: 0.5704 · Recall: 0.5361
