PREQUEL-2025-0095
Elasticsearch field limit exceeded causing Logstash indexing failures
Severity: High · Impact: 7/10 · Mitigation: 5/10
Description
Logstash is failing to index events into Elasticsearch because the index's total fields limit has been exceeded. Elasticsearch caps the number of fields an index mapping may contain via the index.mapping.total_fields.limit setting (default: 1000). Once the cap is reached, any document that would add a new field to the mapping is rejected with an error such as "Limit of total fields [1000] has been exceeded", and Logstash fails to index it.
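The limit counts every field in the mapping, including object parents and multi-fields, so deeply nested or dynamically mapped event data can hit it faster than expected. A minimal sketch of counting fields from a mapping's `properties` tree (the recursion over `properties` and `fields` mirrors how the mapping JSON nests; fetching the mapping itself, e.g. with a GET on the index's `_mapping` endpoint, is left out, and the sample mapping below is illustrative):

```python
def count_fields(properties: dict) -> int:
    """Recursively count fields in an Elasticsearch mapping's `properties`,
    including object parents and sub-fields declared under `fields`
    (e.g. keyword multi-fields)."""
    total = 0
    for field_def in properties.values():
        total += 1  # the field (or object parent) itself
        total += count_fields(field_def.get("properties", {}))  # nested objects
        total += count_fields(field_def.get("fields", {}))      # multi-fields
    return total

# Illustrative mapping fragment: message, message.keyword, user, user.id -> 4 fields.
mapping = {
    "message": {"type": "text", "fields": {"keyword": {"type": "keyword"}}},
    "user": {"properties": {"id": {"type": "keyword"}}},
}
print(count_fields(mapping))  # → 4
```

A count like this, run periodically against the live mapping, is one way to alert before the limit is reached rather than after indexing starts failing.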
Mitigation
- Increase the index.mapping.total_fields.limit setting on the affected index.
- Review and clean up unused fields in the index mapping.
- Consider using field aliases or nested objects to reduce the field count.
- Implement field mapping templates to control field creation.
- Monitor the field count and set up alerts before the limit is approached.
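The first mitigation is a PUT to the index's `_settings` API. A stdlib-only sketch that builds the request (the index name `logstash-2025.09` and the new limit of 2000 are illustrative assumptions; in practice you would send this via curl or an Elasticsearch client):

```python
import json
from urllib import request

def raise_field_limit(es_url: str, index: str, new_limit: int) -> request.Request:
    """Build a PUT <index>/_settings request that raises
    index.mapping.total_fields.limit to new_limit."""
    body = json.dumps({"index.mapping.total_fields.limit": new_limit}).encode()
    return request.Request(
        f"{es_url}/{index}/_settings",
        data=body,
        method="PUT",
        headers={"Content-Type": "application/json"},
    )

# Illustrative values; send with urllib.request.urlopen(req) against a live cluster.
req = raise_field_limit("http://localhost:9200", "logstash-2025.09", 2000)
print(req.get_method(), req.full_url)
```

Raising the limit is a stopgap rather than a fix: very large mappings consume more cluster memory and can degrade performance, so pair this with the mapping cleanup and template controls above rather than increasing the limit indefinitely.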
References
- https://stackoverflow.com/questions/55372330/what-does-limit-of-total-fields-1000-in-index-has-been-exceeded-means-in
- https://www.elastic.co/guide/en/elasticsearch/reference/current/mapping.html#mapping-limit-settings
- https://www.elastic.co/guide/en/elasticsearch/reference/current/indices-put-mapping.html