Logstash output error when using SSL certificates

Hi,

I’m trying to use Logstash with an ODFE cluster and I get the error summarized below (see the Error section).

Error summary

Got response code '500' contacting Elasticsearch at URL 'https://odfe-master-01:9200/_xpack'

NOTE: for more detail, see the Error section further down in this ticket.

I tried with different certificates; the ODFE cluster is working fine, and so is Kibana.

I created the SSL certificates following the method described in the SSL certificates generation section below.

Versions

Config

Elasticsearch SSL part

opendistro_security.ssl.transport.pemcert_filepath: node.pem
opendistro_security.ssl.transport.pemkey_filepath: node.key
opendistro_security.ssl.transport.pemtrustedcas_filepath: root-ca.pem
opendistro_security.ssl.transport.enforce_hostname_verification: false
opendistro_security.ssl.transport.resolve_hostname: false

opendistro_security.ssl.http.enabled: true
opendistro_security.ssl.http.pemcert_filepath: node.pem
opendistro_security.ssl.http.pemkey_filepath: node.key
opendistro_security.ssl.http.pemtrustedcas_filepath: root-ca.pem
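
Not part of the configuration itself, but a quick sanity check I run against the REST layer once a node is up. It assumes the demo admin user and the same root-ca.pem; substitute your own credentials and paths:

# Check the HTTPS layer against the same root CA used to sign the node certificate
curl --cacert root-ca.pem -u admin:admin "https://odfe-master-01:9200/_cluster/health?pretty"

# If the call above fails with a certificate error but this one works,
# the node certificate does not chain to root-ca.pem
curl -k -u admin:admin "https://odfe-master-01:9200/_cluster/health?pretty"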

Kibana

# {{ ansible_managed }}
server.name: {{ odfe_node_name }}
server.host: {{ odfe_kibana_server_host }}

elasticsearch.hosts:
{% for master_node in groups['odfe_master_nodes'] %}
  - "https://{{ master_node }}:{{ odfe_master_port }}"
{% endfor %}

elasticsearch.requestTimeout: 360000
elasticsearch.username: kibanaserver
elasticsearch.password: kibanaserver
elasticsearch.requestHeadersWhitelist: ["securitytenant","Authorization"]

opendistro_security.multitenancy.enabled: true
opendistro_security.multitenancy.tenants.preferred: ["Private", "Global"]
opendistro_security.readonly_mode.roles: ["kibana_read_only"]

elasticsearch.ssl.verificationMode: certificate
elasticsearch.ssl.certificateAuthorities: ["/usr/share/kibana/config/root-ca.pem"]

server.ssl.enabled: true
server.ssl.key: /usr/share/kibana/config/kibana.key
server.ssl.certificate: /usr/share/kibana/config/kibana.pem

newsfeed.enabled: false
telemetry.optIn: false
telemetry.enabled: false

logging.rotate:
  enabled: true
  everyBytes: 10485760
  keepFiles: 10
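
As a side check (not in my original notes), the Kibana server certificate can be verified against the same root CA; the host name and the default port 5601 are assumptions from my setup:

# Confirm the Kibana TLS certificate chains to root-ca.pem
# Look for "Verify return code: 0 (ok)" in the output
openssl s_client -connect odfe-kibana-01:5601 -CAfile root-ca.pem </dev/null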

Logstash output part

          output {
            elasticsearch {
              hosts => ["https://odfe-master-01:9200"]
              index => "my_index_name-%{+YYYY.MM.dd}"
              user => "{{ logstash_odfe_username }}" # ansible vault var
              password => "{{ logstash_odfe_password }}"  # ansible vault var
              ssl => true
              ssl_certificate_verification => false
              cacert => "/usr/share/logstash/config/root-ca.pem"
              #cacert => "/usr/share/logstash/config/admin.pem"
            }
          }
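
Note that with ssl_certificate_verification => false the server certificate is not validated at all, so the cacert line adds nothing until verification is turned back on. A rough curl equivalent of the plugin's health check, with placeholder variables standing in for the vaulted credentials:

# What the output does today (verification disabled)
curl -k -u "$LOGSTASH_ODFE_USERNAME:$LOGSTASH_ODFE_PASSWORD" "https://odfe-master-01:9200/"

# Stricter variant: validate the node certificate against the root CA instead
curl --cacert /usr/share/logstash/config/root-ca.pem \
  -u "$LOGSTASH_ODFE_USERNAME:$LOGSTASH_ODFE_PASSWORD" "https://odfe-master-01:9200/"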

SSL certificates generation


mkdir -p ~/tmp/odfe

cd ~/tmp/odfe/

# --> Generate a private key

openssl genrsa -out root-ca.key 2048

# --> Generate a root certificate

openssl req -new -x509 -sha256 -key root-ca.key -out root-ca.pem -days 3650 -subj "/C=SP/ST=Catalunya/L=Barcelona/O=My Company S.L./OU=Engineering/CN=root"

# --> Generate an admin certificate

openssl genrsa -out admin-temp.key 2048

openssl pkcs8 -v1 PBE-SHA1-3DES -in admin-temp.key -topk8 -nocrypt -out admin.key

openssl req -new -key admin.key -out admin.csr -days 3650 -subj "/C=SP/ST=Catalunya/L=Barcelona/O=My Company S.L./OU=Engineering/CN=admin"

openssl x509 -req -in admin.csr -CA root-ca.pem -CAkey root-ca.key -CAcreateserial -sha256 -out admin.pem

# --> Generate node certificates

openssl genrsa -out node-temp.key 2048

openssl pkcs8 -v1 PBE-SHA1-3DES -in node-temp.key -topk8 -nocrypt -out node.key

openssl req -new -key node.key -out node.csr -days 3650 -subj "/C=SP/ST=Catalunya/L=Barcelona/O=My Company S.L./OU=Engineering/CN=*"

openssl x509 -req -in node.csr -CA root-ca.pem -CAkey root-ca.key -CAcreateserial -sha256 -out node.pem

# --> Generate kibana certificates

openssl genrsa -out kibana-temp.key 2048

openssl pkcs8 -v1 PBE-SHA1-3DES -in kibana-temp.key -topk8 -nocrypt -out kibana.key

openssl req -new -key kibana.key -out kibana.csr -days 3650 -subj "/C=SP/ST=Catalunya/L=Barcelona/O=My Company S.L./OU=Engineering/CN=kibana"

openssl x509 -req -in kibana.csr -CA root-ca.pem -CAkey root-ca.key -CAcreateserial -sha256 -out kibana.pem
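
Before deploying the files I also verify them against the root CA and check their validity dates. Worth noting: openssl x509 -req signs for 30 days by default when -days is not given, so the -days 3650 on the CSR commands above does not carry over to the signed certificates:

# --> Verify that each signed certificate chains to the root CA

openssl verify -CAfile root-ca.pem admin.pem node.pem kibana.pem

# --> Inspect subject and validity dates (e.g. for the node certificate)

openssl x509 -in node.pem -noout -subject -dates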

Error

[2020-07-17T15:24:10,099][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[https://admin:xxxxxx@odfe-master-01:9200/]}}
[2020-07-17T15:24:10,110][DEBUG][logstash.outputs.elasticsearch][main] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>https://admin:xxxxxx@odfe-master-01:9200/, :path=>"/"}
[2020-07-17T15:24:10,803][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"https://admin:xxxxxx@odfe-master-01:9200/"}
[2020-07-17T15:24:10,941][INFO ][logstash.outputs.elasticsearch][main] ES Output version determined {:es_version=>7}
[2020-07-17T15:24:10,995][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2020-07-17T15:24:11,012][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["https://odfe-master-01:9200"]}
[2020-07-17T15:24:11,194][INFO ][logstash.outputs.elasticsearch][main] Using default mapping template
[2020-07-17T15:24:11,216][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>1, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>125, "pipeline.sources"=>["/usr/share/logstash/pipeline/logstash.conf"], :thread=>"#<Thread:0x58a61414 run>"}
[2020-07-17T15:24:11,298][ERROR][logstash.outputs.elasticsearch][main] Failed to install template. {:message=>"Got response code '500' contacting Elasticsearch at URL 'https://odfe-master-01:9200/_xpack'", :class=>"LogStash::Outputs::ElasticSearch::HttpClient::Pool::BadResponseCodeError", :backtrace=>["/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.5.1-java/lib/logstash/outputs/elasticsearch/http_client/manticore_adapter.rb:80:in `perform_request'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.5.1-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:332:in `perform_request_to_url'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.5.1-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:319:in `block in perform_request'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.5.1-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:414:in `with_connection'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.5.1-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:318:in `perform_request'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.5.1-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:326:in `block in Pool'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.5.1-java/lib/logstash/outputs/elasticsearch/http_client.rb:162:in `get'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.5.1-java/lib/logstash/outputs/elasticsearch/http_client.rb:378:in `get_xpack_info'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.5.1-java/lib/logstash/outputs/elasticsearch/ilm.rb:57:in `ilm_ready?'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.5.1-java/lib/logstash/outputs/elasticsearch/ilm.rb:28:in `ilm_in_use?'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.5.1-java/lib/logstash/outputs/elasticsearch/template_manager.rb:14:in `install_template'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.5.1-java/lib/logstash/outputs/elasticsearch/common.rb:205:in `install_template'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.5.1-java/lib/logstash/outputs/elasticsearch/common.rb:49:in `block in setup_after_successful_connection'"]}
warning: thread "Ruby-0-Thread-6: :1" terminated with exception (report_on_exception is true):
LogStash::Outputs::ElasticSearch::HttpClient::Pool::BadResponseCodeError: Got response code '500' contacting Elasticsearch at URL 'https://odfe-master-01:9200/_xpack'
                    perform_request at /usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.5.1-java/lib/logstash/outputs/elasticsearch/http_client/manticore_adapter.rb:80
             perform_request_to_url at /usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.5.1-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:332
                    perform_request at /usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.5.1-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:319
                    with_connection at /usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.5.1-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:414
                    perform_request at /usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.5.1-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:318
                               Pool at /usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.5.1-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:326
                                get at /usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.5.1-java/lib/logstash/outputs/elasticsearch/http_client.rb:162
                     get_xpack_info at /usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.5.1-java/lib/logstash/outputs/elasticsearch/http_client.rb:378
                         ilm_ready? at /usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.5.1-java/lib/logstash/outputs/elasticsearch/ilm.rb:57
                        ilm_in_use? at /usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.5.1-java/lib/logstash/outputs/elasticsearch/ilm.rb:28
  setup_after_successful_connection at /usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.5.1-java/lib/logstash/outputs/elasticsearch/common.rb:50
[2020-07-17T15:24:11,394][FATAL][logstash.runner          ] An unexpected error occurred! {:error=>#<LogStash::Outputs::ElasticSearch::HttpClient::Pool::BadResponseCodeError: LogStash::Outputs::ElasticSearch::HttpClient::Pool::BadResponseCodeError>, :backtrace=>["/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.5.1-java/lib/logstash/outputs/elasticsearch/http_client/manticore_adapter.rb:80:in `perform_request'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.5.1-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:332:in `perform_request_to_url'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.5.1-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:319:in `block in perform_request'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.5.1-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:414:in `with_connection'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.5.1-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:318:in `perform_request'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.5.1-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:326:in `block in Pool'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.5.1-java/lib/logstash/outputs/elasticsearch/http_client.rb:162:in `get'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.5.1-java/lib/logstash/outputs/elasticsearch/http_client.rb:378:in `get_xpack_info'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.5.1-java/lib/logstash/outputs/elasticsearch/ilm.rb:57:in `ilm_ready?'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.5.1-java/lib/logstash/outputs/elasticsearch/ilm.rb:28:in `ilm_in_use?'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.5.1-java/lib/logstash/outputs/elasticsearch/common.rb:50:in `block in setup_after_successful_connection'"]}
[2020-07-17T15:24:11,402][ERROR][org.logstash.Logstash    ] java.lang.IllegalStateException: Logstash stopped processing because of an error: (SystemExit) exit
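
The failing request can be reproduced outside Logstash. The backtrace shows the output plugin calling get_xpack_info (via ilm_in_use? / ilm_ready?) while installing its template, i.e. it probes /_xpack to decide whether ILM is available, and this ODFE cluster answers that endpoint with a 500. Same credentials and CA as in the checks above, adjusted to my setup:

# Reproduces the call from the backtrace; on this cluster it returns HTTP 500
curl --cacert root-ca.pem -u admin:admin "https://odfe-master-01:9200/_xpack?pretty"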

Fixed by adding ilm_enabled => false to the output config:

          output {
            elasticsearch {
              ilm_enabled => false
              hosts => ["https://odfe-master-01:9200"]
              index => "my_index_name-%{+YYYY.MM.dd}"
              user => "{{ logstash_odfe_username }}" # ansible vault var
              password => "{{ logstash_odfe_password }}"  # ansible vault var
              ssl => true
              ssl_certificate_verification => false
              cacert => "/usr/share/logstash/config/root-ca.pem"
              #cacert => "/usr/share/logstash/config/admin.pem"
            }
          }
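
After restarting Logstash with ilm_enabled => false the pipeline starts cleanly. A quick way to confirm documents are arriving, using the same index pattern and credentials as above:

# List the daily indices created by the output
curl --cacert root-ca.pem -u admin:admin "https://odfe-master-01:9200/_cat/indices/my_index_name-*?v"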

For Filebeat, the equivalent fix is to add the following in /etc/filebeat/filebeat.yml:

setup.ilm.enabled: false
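
To sanity-check the Filebeat side after that change (assuming the default config path), Filebeat can validate its configuration and test connectivity to the configured output:

# Validate the configuration and test the connection to the output
filebeat test config -c /etc/filebeat/filebeat.yml
filebeat test output -c /etc/filebeat/filebeat.yml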