logstash setting

I have a question.
Can I find an example of the elasticsearch output setting in logstash?

Hey - I tried this and it worked with the demo certificates on a basic installation of Open Distro for Elasticsearch:

output {
  elasticsearch {
    hosts => ["https://localhost:9200"]
    index => "myindex"
    cacert => "c:/Users/Thi/Downloads/logstash/config/root-ca.pem"
    user => "logstash"
    password => "logstash"
    ssl => true
    ssl_certificate_verification => false
  }
}

Thi


thank you!
I have solved the problem.

Hi Guys,

did that solve your issue? I have this pipeline:

output {
  elasticsearch {
    hosts => "https://myelasticscluster:9200"
    index => "logstash-%{+YYYY.MM.dd}"
    document_type => "log"
    user => "admin"
    password => "admin"
  }
}

And I am still getting this error: [2019-05-14T14:28:08,984][WARN ][logstash.outputs.elasticsearch] Attempted to resurrect connection to dead ES instance, but got an error. {:url=>"https://admin:xxxxxx@myelasticserver:9200/", :error_type=>LogStash::Outputs::ElasticSearch::HttpClient::Pool::HostUnreachableError, :error=>"Elasticsearch Unreachable: [https://admin:xxxxxx@myelasticserver:9200/][Manticore::ClientProtocolException] PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target"}
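That PKIX error means the Java truststore Logstash runs with cannot build a chain from the certificate your cluster presents to a CA it trusts, which is exactly what the `cacert` option is for. The same trust-chain check can be reproduced with openssl; a minimal self-contained sketch using a throwaway CA (all filenames and the `myelasticscluster` name here are placeholders, not your real certificates):

```shell
# Create a throwaway CA and a server certificate signed by it (illustration only).
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
  -subj "/CN=demo-ca" -keyout ca.key -out ca.pem
openssl req -new -newkey rsa:2048 -nodes \
  -subj "/CN=myelasticscluster" -keyout server.key -out server.csr
openssl x509 -req -in server.csr -CA ca.pem -CAkey ca.key \
  -CAcreateserial -days 1 -out server.pem

# The chain builds when the CA file matches the certificate's issuer:
openssl verify -CAfile ca.pem server.pem
# ...and fails, analogous to the PKIX error, when it does not:
openssl verify -CAfile server.pem server.pem || true
```

If the equivalent check fails against your real cluster certificate, pointing `cacert` at the correct root CA PEM (rather than disabling verification) is the usual fix.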

Try to ping the myelasticscluster hostname from the machine where your instance is running.

I have tried it from the master and the ping was successful. Any ideas? I have also tried adding this to the output:

ssl => false
ssl_certificate_verification => false

It could also be a firewall issue.

I'm using Docker, running on the same machine. My Logstash configuration:

cat logstash.conf

input { 
  beats {
    port => 5044
  }
}

output {
  elasticsearch {
    hosts => ["odfe-node1:9200"]
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
    user => "logstash"
    password => "logstash"
    ssl => false
    ssl_certificate_verification => false
  }
}

Just wondering why you set ssl to false if you use https?
Have you tried to curl Elasticsearch from the Logstash machine?
It might be a firewall issue.
I used the default certificates myself, so I added them.

Thi

Hi guys, thanks for the reply, I really appreciate it. Yes, I have tried:

Insecure:
[root@serverxxx ~]# curl --insecure https://clusterserverxxx:9200
Unauthorized[root@serverxx ~]#

Secure:
[root@serverxxx ~]# curl https://clusterserverxxx:9200
curl performs SSL certificate verification by default, using a "bundle"
of Certificate Authority (CA) public keys (CA certs). If the default
bundle file isn't adequate, you can specify an alternate file
using the --cacert option.
If this HTTPS server uses a certificate signed by a CA represented in
the bundle, the certificate verification probably failed due to a
problem with the certificate (it might be expired, or the name might
not match the domain name in the URL).
If you'd like to turn off curl's verification of the certificate, use
the -k (or --insecure) option.

Ogulman: Do you really use Open Distro for Elasticsearch? I did not set anything specific. The only thing I changed was the security plugin configuration: I bound it to our domain so we can authenticate with our AD accounts.

But when I set up the output as shown, I got the error I posted. I think the whole problem is rooted in SSL certificate verification. Or should I disable SSL?

ThiBaudF:

I changed SSL to true and this is what I get now:

[2019-05-14T22:38:36,377][WARN ][logstash.outputs.elasticsearch] Attempted to resurrect connection to dead ES instance, but got an error. {:url=>"https://admin:xxxxxx@elkcluster:9200/", :error_type=>LogStash::Outputs::ElasticSearch::HttpClient::Pool::HostUnreachableError, :error=>"Elasticsearch Unreachable: [https://admin:xxxxxx@elkcluster:9200/][Manticore::ClientProtocolException] PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target"}

To make sure that Elasticsearch is responding, run:
curl -XGET https://localhost:9200 -u admin:admin --insecure

Elasticsearch should respond with something like:

{ 
  "name" : "MIU0_tc",
  "cluster_name" : "odfe-cluster",
  "cluster_uuid" : "2ffW1Y_fQTCRtYvgInNRKQ",
  "version" : {
    "number" : "6.6.2",
    "build_flavor" : "oss",
    "build_type" : "tar",
    "build_hash" : "3bd3e59",
    "build_date" : "2019-03-06T15:16:26.864148Z",
    "build_snapshot" : false,
    "lucene_version" : "7.6.0",
    "minimum_wire_compatibility_version" : "5.6.0",
    "minimum_index_compatibility_version" : "5.0.0"
  },
  "tagline" : "You Know, for Search"
}

Hi,

the problem was with the SSL certificate. I have modified the output to use our public CA certificate (PEM).

output {
  elasticsearch {
    hosts => "https://ourescluster:9200/"
    index => "logstash-%{+YYYY.MM.dd}"
    document_type => "log"
    user => "admin"
    password => "admin"
    ssl => true
    cacert => "/pem/logstash.pem"
  }
}
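Before wiring a PEM file into `cacert`, it can help to confirm the file really is the CA certificate (for a root CA, subject and issuer are identical) rather than the server's own leaf certificate. openssl can print this; a self-contained sketch using a throwaway cert (the `demo-root-ca` name and filenames are illustrative, not the poster's real `/pem/logstash.pem`):

```shell
# Generate a throwaway self-signed certificate standing in for the real CA PEM.
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
  -subj "/CN=demo-root-ca" -keyout demo.key -out demo-ca.pem

# Print subject, issuer, and validity window; for a root CA the
# subject and issuer lines name the same entity.
openssl x509 -noout -subject -issuer -dates -in demo-ca.pem
```

If subject and issuer differ on your real file, you are probably pointing `cacert` at the server certificate instead of the CA, which reproduces the PKIX failure.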

Thanks for the help! :slight_smile:

The latest Logstash version compatible with Elasticsearch 6.6.2 or 6.7.1 is logstash-oss:6.8.0.
https://www.elastic.co/support/matrix#matrix_compatibility

logstash:
  image: docker.elastic.co/logstash/logstash-oss:6.8.0
  container_name: logstash
  volumes:
    - ./logstash.conf:/usr/share/logstash/pipeline/logstash.conf
  ports:
    - "5044:5044"
  networks:
    - odfe-net
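Tying this Docker setup back to the SSL discussion above: since Logstash and the odfe-node1 container share the odfe-net network, the output can reach the node by container name over HTTPS instead of disabling SSL. A sketch of a matching output block; the in-container CA path is an assumption and presumes you add a second volume mount for the demo root-ca.pem, which the compose snippet above does not include:

```
output {
  elasticsearch {
    # "odfe-node1" resolves via the shared odfe-net Docker network.
    hosts => ["https://odfe-node1:9200"]
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    user => "logstash"
    password => "logstash"
    ssl => true
    # Assumed path: mount root-ca.pem into the container yourself.
    cacert => "/usr/share/logstash/config/root-ca.pem"
  }
}
```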