PREQUEL-2025-0095

Elasticsearch field limit exceeded causing Logstash indexing failures
Severity: High
Impact: 7/10
Mitigation: 5/10

Description

Logstash is failing to index events into Elasticsearch because the index's total fields limit (1000 by default) has been exceeded. Once an index reaches this limit, Elasticsearch rejects any document that would add a new field mapping, so those events fail to index.


Cause

  • Elasticsearch index has reached the default limit of 1000 total fields per index.
  • New documents being indexed contain fields that don't exist in the current mapping.
  • Dynamic mapping is enabled, causing Elasticsearch to create new field mappings automatically.
  • The index has accumulated too many unique field names over time.
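To see how close an index is to the limit, you can count the mapped fields in the output of GET /&lt;index&gt;/_mapping. A minimal sketch in Python (the index name and mapping are hypothetical; note that object fields and multi-fields both count toward index.mapping.total_fields.limit):

```python
def count_fields(properties: dict) -> int:
    """Recursively count mapped fields, including object children and multi-fields."""
    total = 0
    for field in properties.values():
        total += 1
        # object/nested fields declare their children under "properties"
        total += count_fields(field.get("properties", {}))
        # multi-fields (e.g. a keyword sub-field of a text field) live under "fields"
        total += count_fields(field.get("fields", {}))
    return total

# Example shaped like the response of GET /<index>/_mapping (hypothetical index name)
mapping = {
    "logstash-2025.01.01": {
        "mappings": {
            "properties": {
                "message": {"type": "text", "fields": {"keyword": {"type": "keyword"}}},
                "host": {"properties": {"name": {"type": "keyword"}}},
            }
        }
    }
}

props = mapping["logstash-2025.01.01"]["mappings"]["properties"]
print(count_fields(props))  # message, message.keyword, host, host.name -> 4
```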

Mitigation

  • Increase the index.mapping.total_fields.limit setting in Elasticsearch.
  • Review and clean up unused fields in the index mapping.
  • Consider using field aliases or nested objects to reduce field count.
  • Implement field mapping templates to control field creation.
  • Monitor field count and set up alerts for approaching limits.
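The first mitigation above is a dynamic index setting, so it can be applied without reindexing. For example, in Kibana Dev Tools (the index name and new limit are illustrative):

```
PUT /logstash-2025.01.01/_settings
{
  "index.mapping.total_fields.limit": 2000
}
```

Raising the limit is a stopgap rather than a fix: every mapped field adds cluster-state and memory overhead, so pair it with mapping cleanup or stricter index templates.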
