Data Prepper not starting

Hi

I have been following the instructions on the OpenSearch site to start Data Prepper.
I have pulled the Data Prepper container, but when I try to run it I get the error below.

Command run:

docker run --expose 21890 opensearchproject/data-prepper:latest

Error:

cli-plugins-data-prepper-1  | Exception in thread "main" java.lang.IllegalArgumentException: Invalid DataPrepper configuration file.
cli-plugins-data-prepper-1  |   at com.amazon.dataprepper.parser.model.DataPrepperConfiguration.fromFile(DataPrepperConfiguration.java:44)
cli-plugins-data-prepper-1  |   at com.amazon.dataprepper.DataPrepper.configure(DataPrepper.java:64)
cli-plugins-data-prepper-1  |   at com.amazon.dataprepper.DataPrepperExecute.main(DataPrepperExecute.java:27)



Do we need to have a Data Prepper configuration file locally, or can the command use the one already available in the container?

Hi @ammujgd,

Data Prepper must take in two configuration files as shown below: pipelines.yaml and data-prepper-config.yaml.

docker run --name data-prepper --expose 21890 -v /full/path/to/pipelines.yaml:/usr/share/data-prepper/pipelines.yaml -v /full/path/to/data-prepper-config.yaml:/usr/share/data-prepper/data-prepper-config.yaml opensearchproject/opensearch-data-prepper:latest

While there are some example configs in the data-prepper/examples/config folder, it is best to use these as guidelines to create your own and pass them in the docker run command. Let me know if you have any more questions about setting up the configs and Data Prepper. Thanks!
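For reference, a minimal data-prepper-config.yaml for local testing can be as small as the sketch below (the exact fields depend on your Data Prepper version, and disabling SSL is only appropriate for local testing):

# data-prepper-config.yaml (minimal sketch; disable SSL only for local testing)
ssl: false
serverPort: 4900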

Thanks

When we pass our own YAML files, I am getting the error below. Could you please help?

Command run:
PS C:\Program Files\Docker\cli-plugins> docker run --expose 21890 -v /c:/program files/docker/cli-plufins/pipelines.yaml:/usr/share/data-prepper/pipelines.yaml -v /c:/program files/docker/cli-plugins/data-prepper-config.yaml:/usr/share/data-prepper/data-prepper-config.yaml opensearchproject/data-prepper:latest

Error: docker: invalid reference format.

@ammujgd you are in a Windows environment. You should use single quotes (' ') when there is a space in a folder or file name.

Try this:

PS C:\Program Files\Docker\cli-plugins> docker run --expose 21890 -v '/c:/program files/docker/cli-plufins/pipelines.yaml:/usr/share/data-prepper/pipelines.yaml' -v '/c:/program files/docker/cli-plugins/data-prepper-config.yaml:/usr/share/data-prepper/data-prepper-config.yaml' opensearchproject/data-prepper:latest

Thanks

Now I get this error

docker: Error response from daemon: invalid mode: /usr/share/data-prepper/pipelines.yaml.

@ammujgd

As noted in the previous reply, this is a Windows environment, and PowerShell is tricky.
The first issue is the typo "plufins":

/c:/program files/docker/cli-plufins/pipelines.yaml

The second issue is the colon after the drive letter; you have to remove it. The command below should work for you.

docker run --expose 21890 -v '/c/program files/docker/cli-plugins/pipelines.yaml:/usr/share/data-prepper/pipelines.yaml' -v '/c/program files/docker/cli-plugins/data-prepper-config.yaml:/usr/share/data-prepper/data-prepper-config.yaml' opensearchproject/data-prepper:latest

I am sorry, this is the error now:

Failed to find the plugin with name elasticsearch. Please ensure that plugin is annotated with appropriate values
2021-10-22T05:45:22,858 [main] ERROR com.amazon.dataprepper.parser.PipelineParser - Construction of pipeline components failed, skipping building of pipeline [service-map-pipeline] and its connected pipelines
com.amazon.dataprepper.plugins.PluginException: Failed to find the plugin with name [elasticsearch]. Please ensure that plugin is annotated with appropriate values

My pipeline.yaml is as below:

entry-pipeline:
  delay: "100"
  source:
    otel_trace_source:
      health_check_service: true
      ssl: false
  sink:
    - pipeline:
        name: "raw-pipeline"
    - pipeline:
        name: "service-map-pipeline"
raw-pipeline:
  source:
    pipeline:
      name: "entry-pipeline"
  prepper:
    - otel_trace_raw_prepper:
  sink:
    - elasticsearch:
        hosts: 
          - "http://localhost:9200"
        insecure: true
        aws_region: "us-east-1"
        trace_analytics_raw: true
service-map-pipeline:
  delay: "100"
  source:
    pipeline:
      name: "entry-pipeline"
  prepper:
    - service_map_stateful:
  sink:
    - elasticsearch:
        hosts: 
          - "http://localhost:9200"
        username: "admin"
        password: "admin"
        insecure: true
        aws_region: "us-east-1"
        trace_analytics_service_map: true

@ammujgd

The sink plugin is no longer named elasticsearch. It should work if you change it to opensearch as shown in the following config.

entry-pipeline:
  delay: "100"
  source:
    otel_trace_source:
      health_check_service: true
      ssl: false
  sink:
    - pipeline:
        name: "raw-pipeline"
    - pipeline:
        name: "service-map-pipeline"
raw-pipeline:
  source:
    pipeline:
      name: "entry-pipeline"
  prepper:
    - otel_trace_raw_prepper:
  sink:
    - opensearch:
        hosts: 
          - "http://localhost:9200"
        insecure: true
        aws_region: "us-east-1"
        trace_analytics_raw: true
service-map-pipeline:
  delay: "100"
  source:
    pipeline:
      name: "entry-pipeline"
  prepper:
    - service_map_stateful:
  sink:
    - opensearch:
        hosts: 
          - "http://localhost:9200"
        username: "admin"
        password: "admin"
        insecure: true
        aws_region: "us-east-1"
        trace_analytics_service_map: true

Let me know if you have any more problems after making this change.

Thanks, I have made the change and now I get this error:

2021-10-25T06:21:33,826 [main] INFO com.amazon.dataprepper.plugins.sink.opensearch.OpenSearchSink - Starting OpenSearch sink
2021-10-25T06:21:34,125 [main] INFO com.amazon.dataprepper.plugins.sink.opensearch.ConnectionConfiguration - Using the trust all strategy
2021-10-25T06:21:34,351 [main] ERROR com.amazon.dataprepper.plugins.PluginFactory - Encountered exception while instantiating the plugin OpenSearchSink
java.lang.reflect.InvocationTargetException: null
at jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:?]
at jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:64) ~[?:?]
at jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:?]
at java.lang.reflect.Constructor.newInstanceWithCaller(Constructor.java:500) ~[?:?]
at java.lang.reflect.Constructor.newInstance(Constructor.java:481) ~[?:?]
at com.amazon.dataprepper.plugins.PluginFactory.newPlugin(PluginFactory.java:35) ~[data-prepper.jar:1.0.0.0-rc1]
at com.amazon.dataprepper.plugins.sink.SinkFactory.newSink(SinkFactory.java:23) ~[data-prepper.jar:1.0.0.0-rc1]
at com.amazon.dataprepper.parser.PipelineParser.buildSinkOrConnector(PipelineParser.java:160) ~[data-prepper.jar:1.0.0.0-rc1]
at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:195) [?:?]
at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1625) [?:?]
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:484) [?:?]
at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:474) [?:?]
at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:913) [?:?]
at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234) [?:?]
at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:578) [?:?]
at com.amazon.dataprepper.parser.PipelineParser.buildPipelineFromConfiguration(PipelineParser.java:109) [data-prepper.jar:1.0.0.0-rc1]
at com.amazon.dataprepper.parser.PipelineParser.parseConfiguration(PipelineParser.java:75) [data-prepper.jar:1.0.0.0-rc1]
at com.amazon.dataprepper.DataPrepper.execute(DataPrepper.java:108) [data-prepper.jar:1.0.0.0-rc1]
at com.amazon.dataprepper.DataPrepperExecute.main(DataPrepperExecute.java:31) [data-prepper.jar:1.0.0.0-rc1]
Caused by: java.lang.RuntimeException: Connection refused
at com.amazon.dataprepper.plugins.sink.opensearch.OpenSearchSink.<init>(OpenSearchSink.java:85) ~[data-prepper.jar:1.0.0.0-rc1]
… 19 more
Caused by: java.net.ConnectException: Connection refused
at org.opensearch.client.RestClient.extractAndWrapCause(RestClient.java:892) ~[data-prepper.jar:1.0.0.0-rc1]
at org.opensearch.client.RestClient.performRequest(RestClient.java:296) ~[data-prepper.jar:1.0.0.0-rc1]
at org.opensearch.client.RestClient.performRequest(RestClient.java:283) ~[data-prepper.jar:1.0.0.0-rc1]
at org.opensearch.client.RestHighLevelClient.internalPerformRequest(RestHighLevelClient.java:1394) ~[data-prepper.jar:1.0.0.0-rc1]
at org.opensearch.client.RestHighLevelClient.performRequest(RestHighLevelClient.java:1364) ~[data-prepper.jar:1.0.0.0-rc1]
at org.opensearch.client.RestHighLevelClient.performRequestAndParseEntity(RestHighLevelClient.java:1334) ~[data-prepper.jar:1.0.0.0-rc1]
at org.opensearch.client.ClusterClient.getSettings(ClusterClient.java:106) ~[data-prepper.jar:1.0.0.0-rc1]
at com.amazon.dataprepper.plugins.sink.opensearch.IndexStateManagement.checkISMEnabled(IndexStateManagement.java:36) ~[data-prepper.jar:1.0.0.0-rc1]
at com.amazon.dataprepper.plugins.sink.opensearch.OpenSearchSink.start(OpenSearchSink.java:92) ~[data-prepper.jar:1.0.0.0-rc1]
at com.amazon.dataprepper.plugins.sink.opensearch.OpenSearchSink.<init>(OpenSearchSink.java:83) ~[data-prepper.jar:1.0.0.0-rc1]
… 19 more
Caused by: java.net.ConnectException: Connection refused
at sun.nio.ch.Net.pollConnect(Native Method) ~[?:?]
at sun.nio.ch.Net.pollConnectNow(Net.java:660) ~[?:?]
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:875) ~[?:?]
at org.apache.http.impl.nio.reactor.DefaultConnectingIOReactor.processEvent(DefaultConnectingIOReactor.java:174) ~[data-prepper.jar:1.0.0.0-rc1]
at org.apache.http.impl.nio.reactor.DefaultConnectingIOReactor.processEvents(DefaultConnectingIOReactor.java:148) ~[data-prepper.jar:1.0.0.0-rc1]
at org.apache.http.impl.nio.reactor.AbstractMultiworkerIOReactor.execute(AbstractMultiworkerIOReactor.java:351) ~[data-prepper.jar:1.0.0.0-rc1]
at org.apache.http.impl.nio.conn.PoolingNHttpClientConnectionManager.execute(PoolingNHttpClientConnectionManager.java:221) ~[data-prepper.jar:1.0.0.0-rc1]
at org.apache.http.impl.nio.client.CloseableHttpAsyncClientBase$1.run(CloseableHttpAsyncClientBase.java:64) ~[data-prepper.jar:1.0.0.0-rc1]
at java.lang.Thread.run(Thread.java:832) ~[?:?]
2021-10-25T06:21:34,357 [main] ERROR com.amazon.dataprepper.parser.PipelineParser - Construction of pipeline components failed, skipping building of pipeline [raw-pipeline] and its connected pipelines
com.amazon.dataprepper.plugins.PluginException: Encountered exception while instantiating the plugin OpenSearchSink
at com.amazon.dataprepper.plugins.PluginFactory.newPlugin(PluginFactory.java:45) ~[data-prepper.jar:1.0.0.0-rc1]
at com.amazon.dataprepper.plugins.sink.SinkFactory.newSink(SinkFactory.java:23) ~[data-prepper.jar:1.0.0.0-rc1]
at com.amazon.dataprepper.parser.PipelineParser.buildSinkOrConnector(PipelineParser.java:160) ~[data-prepper.jar:1.0.0.0-rc1]
at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:195) ~[?:?]
at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1625) ~[?:?]
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:484) ~[?:?]
at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:474) ~[?:?]
at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:913) ~[?:?]
at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234) ~[?:?]
at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:578) ~[?:?]
at com.amazon.dataprepper.parser.PipelineParser.buildPipelineFromConfiguration(PipelineParser.java:109) [data-prepper.jar:1.0.0.0-rc1]
at com.amazon.dataprepper.parser.PipelineParser.parseConfiguration(PipelineParser.java:75) [data-prepper.jar:1.0.0.0-rc1]
at com.amazon.dataprepper.DataPrepper.execute(DataPrepper.java:108) [data-prepper.jar:1.0.0.0-rc1]
at com.amazon.dataprepper.DataPrepperExecute.main(DataPrepperExecute.java:31) [data-prepper.jar:1.0.0.0-rc1]
Caused by: java.lang.reflect.InvocationTargetException
at jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:?]
at jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:64) ~[?:?]
at jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:?]
at java.lang.reflect.Constructor.newInstanceWithCaller(Constructor.java:500) ~[?:?]
at java.lang.reflect.Constructor.newInstance(Constructor.java:481) ~[?:?]
at com.amazon.dataprepper.plugins.PluginFactory.newPlugin(PluginFactory.java:35) ~[data-prepper.jar:1.0.0.0-rc1] … 13 more
Caused by: java.lang.RuntimeException: Connection refused
at com.amazon.dataprepper.plugins.sink.opensearch.OpenSearchSink.<init>(OpenSearchSink.java:85) ~[data-prepper.jar:1.0.0.0-rc1]
at jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:?]
at jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:64) ~[?:?]
at jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:?]
at java.lang.reflect.Constructor.newInstanceWithCaller(Constructor.java:500) ~[?:?]
at java.lang.reflect.Constructor.newInstance(Constructor.java:481) ~[?:?]
at com.amazon.dataprepper.plugins.PluginFactory.newPlugin(PluginFactory.java:35) ~[data-prepper.jar:1.0.0.0-rc1] … 13 more
Caused by: java.net.ConnectException: Connection refused
at org.opensearch.client.RestClient.extractAndWrapCause(RestClient.java:892) ~[data-prepper.jar:1.0.0.0-rc1]
at org.opensearch.client.RestClient.performRequest(RestClient.java:296) ~[data-prepper.jar:1.0.0.0-rc1]
at org.opensearch.client.RestClient.performRequest(RestClient.java:283) ~[data-prepper.jar:1.0.0.0-rc1]
at org.opensearch.client.RestHighLevelClient.internalPerformRequest(RestHighLevelClient.java:1394) ~[data-prepper.jar:1.0.0.0-rc1]
at org.opensearch.client.RestHighLevelClient.performRequest(RestHighLevelClient.java:1364) ~[data-prepper.jar:1.0.0.0-rc1]
at org.opensearch.client.RestHighLevelClient.performRequestAndParseEntity(RestHighLevelClient.java:1334) ~[data-prepper.jar:1.0.0.0-rc1]
at org.opensearch.client.ClusterClient.getSettings(ClusterClient.java:106) ~[data-prepper.jar:1.0.0.0-rc1]
at com.amazon.dataprepper.plugins.sink.opensearch.IndexStateManagement.checkISMEnabled(IndexStateManagement.java:36) ~[data-prepper.jar:1.0.0.0-rc1]
at com.amazon.dataprepper.plugins.sink.opensearch.OpenSearchSink.start(OpenSearchSink.java:92) ~[data-prepper.jar:1.0.0.0-rc1]
at com.amazon.dataprepper.plugins.sink.opensearch.OpenSearchSink.<init>(OpenSearchSink.java:83) ~[data-prepper.jar:1.0.0.0-rc1]
at jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:?]
at jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:64) ~[?:?]
at jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:?]
at java.lang.reflect.Constructor.newInstanceWithCaller(Constructor.java:500) ~[?:?]
at java.lang.reflect.Constructor.newInstance(Constructor.java:481) ~[?:?]
at com.amazon.dataprepper.plugins.PluginFactory.newPlugin(PluginFactory.java:35) ~[data-prepper.jar:1.0.0.0-rc1] … 13 more
Caused by: java.net.ConnectException: Connection refused
at sun.nio.ch.Net.pollConnect(Native Method) ~[?:?]
at sun.nio.ch.Net.pollConnectNow(Net.java:660) ~[?:?]
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:875) ~[?:?]
at org.apache.http.impl.nio.reactor.DefaultConnectingIOReactor.processEvent(DefaultConnectingIOReactor.java:174) ~[data-prepper.jar:1.0.0.0-rc1]
at org.apache.http.impl.nio.reactor.DefaultConnectingIOReactor.processEvents(DefaultConnectingIOReactor.java:148) ~[data-prepper.jar:1.0.0.0-rc1]
at org.apache.http.impl.nio.reactor.AbstractMultiworkerIOReactor.execute(AbstractMultiworkerIOReactor.java:351) ~[data-prepper.jar:1.0.0.0-rc1]
at org.apache.http.impl.nio.conn.PoolingNHttpClientConnectionManager.execute(PoolingNHttpClientConnectionManager.java:221) ~[data-prepper.jar:1.0.0.0-rc1]
at org.apache.http.impl.nio.client.CloseableHttpAsyncClientBase$1.run(CloseableHttpAsyncClientBase.java:64) ~[data-prepper.jar:1.0.0.0-rc1]
at java.lang.Thread.run(Thread.java:832) ~[?:?]
2021-10-25T06:21:34,361 [main] ERROR com.amazon.dataprepper.DataPrepper - No valid pipeline is available for execution, exiting

@ammujgd

It looks to me like you don't have OpenSearch running on localhost:9200 when you start Data Prepper, which is causing this exception: java.net.ConnectException: Connection refused.

If you are just trying to get Data Prepper started and working without errors, you could start by changing the opensearch sink to an stdout sink for testing, but if you want to send data to OpenSearch you will need a running cluster to send it to.

Thanks… How do we add an OpenSearch sink?

@ammujgd ,
Do you have OpenSearch running?

I’m unsure if OpenSearch runs on Windows or not, but you may wish to check this Quick Start guide for running OpenSearch. If you have it running on your machine, Data Prepper should be able to connect to it with the default pipeline configuration.
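If you don't have OpenSearch running yet, the Quick Start essentially boils down to a single-node Docker run along these lines (a sketch only; adjust the image tag and ports to your setup):

docker run -p 9200:9200 -p 9600:9600 -e "discovery.type=single-node" opensearchproject/opensearch:latest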

Regarding @graytaylor0's suggestion, you can change both sinks in the configuration:

sink:
    - opensearch:
        hosts: 
          - "http://localhost:9200"
        username: "admin"
        password: "admin"
        insecure: true
        aws_region: "us-east-1"
        trace_analytics_service_map: true

to

sink:
  - stdout:

Then you can run Data Prepper and the output will write to stdout instead of OpenSearch.


Thanks, Data Prepper is starting now with the stdout sink. But I don't get any data from the collector to Data Prepper.

The problem is that I have my own OpenSearch, exporter, collector, and Data Prepper running locally. So I need the opensearch sink itself so that the data flows through properly.

Now my question is: how can I point my Data Prepper properly to my OpenSearch sink, since I was getting the error shown in my previous post?

Quoting what @graytaylor0 mentioned, "but if you want to send to opensearch you will need to have a sink to send data to": how can I achieve this?

I already have OpenSearch running on "https://localhost:9200", so that sink should have worked, right? Why is Data Prepper not able to connect to the OpenSearch sink?

https://localhost:9200/ is up and running

entry-pipeline:
  delay: "100"
  source:
    otel_trace_source:
      ssl: false
  sink:
    - pipeline:
        name: "raw-pipeline"
    - pipeline:
        name: "service-map-pipeline"
raw-pipeline:
  source:
    pipeline:
      name: "entry-pipeline"
  prepper:
    - otel_trace_raw_prepper:
  sink:
    - opensearch:
        hosts: 
          - "https://localhost:9200"
        username: "admin"
        password: "admin"
        trace_analytics_raw: true
service-map-pipeline:
  delay: "100"
  source:
    pipeline:
      name: "entry-pipeline"
  prepper:
    - service_map_stateful:
  sink:
    - opensearch:
        hosts: 
          - "https://localhost:9200"
        username: "admin"
        password: "admin"
        trace_analytics_service_map: true

@ammujgd Thanks for your interest in Data Prepper. This is George. Looking at your last post, we probably need to confirm:

How is your OpenSearch backend launched? Is it through (1) docker run, (2) docker-compose, or (3) running the binary (jar) on your local machine?

Note that if the OpenSearch backend is listening on your local machine's network instead of on the same Docker network as your Data Prepper container, then you probably need to change the Data Prepper container's network mode to host:

docker run ... opensearchproject/data-prepper:latest --network="host"

Ref: Docker run reference | Docker Documentation
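As a rough sketch (service names, file paths, and image tags here are just examples, not your exact setup), a docker-compose file that puts both containers on the same network lets the sink reach OpenSearch by service name instead of localhost:

version: "3"
services:
  opensearch:
    image: opensearchproject/opensearch:latest
    environment:
      - discovery.type=single-node
    ports:
      - "9200:9200"
  data-prepper:
    image: opensearchproject/data-prepper:latest
    volumes:
      - ./pipelines.yaml:/usr/share/data-prepper/pipelines.yaml
      - ./data-prepper-config.yaml:/usr/share/data-prepper/data-prepper-config.yaml
    ports:
      - "21890:21890"

In that case the sink hosts entry in pipelines.yaml would point at https://opensearch:9200 rather than https://localhost:9200.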

Others might reply further

I am launching OpenSearch using the command:

docker run -p 9200:9200 -p 9600:9600 -e "discovery.type=single-node" opensearchproject/opensearch

Also, as per your suggestion @qchea, I tried running the command below:


docker run --expose 21890 -v `/c/program files/docker/cli-plugins/pipeline.yaml:/usr/share/data-prepper/pipelines.yaml' -v `/c/program files/docker/cli-plugins/data-prepper-config.yaml:/usr/share/data-prepper/data-prepper-config.yaml' opensearchproject/data-prepper:latest --network="host"

Getting below error now

docker: Error response from daemon: OCI runtime create failed: container_linux.go:380: starting container process caused: exec: "--network=host": executable file not found in $PATH: unknown.


You seem to be mixing single quotes and back ticks in your command.

Thanks, I changed my command to:

docker run --expose 21890 -v '/c/program files/docker/cli-plugins/pipeline.yaml:/usr/share/data-prepper/pipelines.yaml' -v '/c/program files/docker/cli-plugins/data-prepper-config.yaml:/usr/share/data-prepper/data-prepper-config.yaml' opensearchproject/data-prepper:latest --network="host"

Still I see the same error:

docker: Error response from daemon: OCI runtime create failed: container_linux.go:380: starting container process caused: exec: "--network=host": executable file not found in $PATH: unknown.

Try:
docker run --expose 21890 -v '/c/program files/docker/cli-plugins/pipeline.yaml:/usr/share/data-prepper/pipelines.yaml' -v '/c/program files/docker/cli-plugins/data-prepper-config.yaml:/usr/share/data-prepper/data-prepper-config.yaml' --network "host" opensearchproject/data-prepper:latest

The --network="host" flag was being treated as a command to run inside the image because it came after the image name.
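For reference, docker run treats everything after the image name as the command and arguments to run inside the container, so all options have to appear before the image:

docker run [OPTIONS] IMAGE [COMMAND] [ARG...]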

Thanks

2021-11-08T12:19:13,112 [main] INFO  com.amazon.dataprepper.plugins.source.oteltrace.OTelTraceSource - Started otel_trace_source...
2021-11-08T12:19:13,112 [main] INFO  com.amazon.dataprepper.pipeline.Pipeline - Pipeline [entry-pipeline] - Submitting request to initiate the pipeline processing
2021-11-08T12:19:13,116 [main] INFO  com.amazon.dataprepper.pipeline.Pipeline - Pipeline [service-map-pipeline] - Initiating pipeline execution
2021-11-08T12:19:13,116 [main] INFO  com.amazon.dataprepper.pipeline.Pipeline - Pipeline [service-map-pipeline] - Submitting request to initiate the pipeline processing
2021-11-08T12:19:13,117 [main] INFO  com.amazon.dataprepper.pipeline.Pipeline - Pipeline [raw-pipeline] - Initiating pipeline execution
2021-11-08T12:19:13,117 [main] INFO  com.amazon.dataprepper.pipeline.Pipeline - Pipeline [raw-pipeline] - Submitting request to initiate the pipeline processing
2021-11-08T12:19:13,119 [main] INFO  com.amazon.dataprepper.pipeline.server.DataPrepperServer - Data Prepper server running at :4900
2021-11-08T12:19:13,226 [entry-pipeline-prepper-worker-1-thread-1] INFO  com.amazon.dataprepper.pipeline.ProcessWorker -  entry-pipeline Worker: No records received from buffer
2021-11-08T12:19:13,226 [service-map-pipeline-prepper-worker-3-thread-1] INFO  com.amazon.dataprepper.pipeline.ProcessWorker -  service-map-pipeline Worker: No records received from buffer
2021-11-08T12:19:16,120 [raw-pipeline-prepper-worker-5-thread-1] INFO  com.amazon.dataprepper.pipeline.ProcessWorker -  raw-pipeline Worker: No records received from buffer

Looks like Data Prepper has started…

But it's not receiving anything from my collector… Here are my collector logs. The collector is getting data from my exporter, as shown below:

2021-11-08T12:53:27.158Z        INFO    loggingexporter/logging_exporter.go:56  MetricsExporter {"#metrics": 5}
2021-11-08T12:53:27.158Z        DEBUG   loggingexporter/logging_exporter.go:66  ResourceMetrics #0
Resource labels:
     -> host.arch: STRING(amd64)
     -> host.name: STRING(DESKTOP-GK9AKCD)
     -> os.description: STRING(Windows 10 10.0)
     -> os.type: STRING(windows)
     -> process.command_line: STRING(C:\Program Files\Java\jre1.8.0_271;bin;java.exe -Xms128m -Xmx600m -javaagent:C:\Users\AnjanaAsok\Downloads\opentelemetry-javaagent-all.jar -Dotel.traces.exporter=otlp -Dotel.metrics.exporter=otlp -Dotel.exporter.otlp.endpoint=http://localhost:4317 -Dotel.javaagent.debug=true -Dfile.encoding=Cp1252)
     -> process.executable.path: STRING(C:\Program Files\Java\jre1.8.0_271;bin;java.exe)
     -> process.pid: INT(9564)
     -> process.runtime.description: STRING(Oracle Corporation Java HotSpot(TM) 64-Bit Server VM 25.271-b09)
     -> process.runtime.name: STRING(Java(TM) SE Runtime Environment)
     -> process.runtime.version: STRING(1.8.0_271-b09)
     -> service.name: STRING(unknown_service:java)
     -> telemetry.auto.version: STRING(1.6.2)
     -> telemetry.sdk.language: STRING(java)
     -> telemetry.sdk.name: STRING(opentelemetry)
     -> telemetry.sdk.version: STRING(1.6.0)
InstrumentationLibraryMetrics #0
InstrumentationLibrary io.opentelemetry.javaagent.shaded.instrumentation.runtimemetrics.MemoryPools
Metric #0
Descriptor:
     -> Name: runtime.jvm.memory.pool
     -> Description: Bytes of a given JVM memory pool.
     -> Unit: By
     -> DataType: Gauge
NumberDataPoints #0
Data point attributes:
     -> pool: STRING(Code Cache)
     -> type: STRING(max)
StartTimestamp: 1970-01-01 00:00:00 +0000 UTC
Timestamp: 2021-11-08 12:53:24.681 +0000 UTC
Value: 251658240
NumberDataPoints #1
Data point attributes:
     -> pool: STRING(Metaspace)
     -> type: STRING(used)
StartTimestamp: 1970-01-01 00:00:00 +0000 UTC
Timestamp: 2021-11-08 12:53:24.681 +0000 UTC
Value: 20388912
NumberDataPoints #2
Data point attributes:
     -> pool: STRING(PS Old Gen)
     -> type: STRING(max)
StartTimestamp: 1970-01-01 00:00:00 +0000 UTC
Timestamp: 2021-11-08 12:53:24.681 +0000 UTC
Value: 419430400
NumberDataPoints #3
Data point attributes:
     -> pool: STRING(PS Survivor Space)
     -> type: STRING(used)
StartTimestamp: 1970-01-01 00:00:00 +0000 UTC
Timestamp: 2021-11-08 12:53:24.681 +0000 UTC
Value: 5226512
NumberDataPoints #4
Data point attributes:
     -> pool: STRING(Code Cache)
     -> type: STRING(committed)
StartTimestamp: 1970-01-01 00:00:00 +0000 UTC
Timestamp: 2021-11-08 12:53:24.681 +0000 UTC
Value: 2686976
NumberDataPoints #5
Data point attributes:
     -> pool: STRING(Metaspace)
     -> type: STRING(committed)
StartTimestamp: 1970-01-01 00:00:00 +0000 UTC
Timestamp: 2021-11-08 12:53:24.681 +0000 UTC
Value: 21233664
NumberDataPoints #6
Data point attributes:
     -> pool: STRING(Compressed Class Space)
     -> type: STRING(committed)
StartTimestamp: 1970-01-01 00:00:00 +0000 UTC
Timestamp: 2021-11-08 12:53:24.681 +0000 UTC
Value: 3145728
NumberDataPoints #7
Data point attributes:
     -> pool: STRING(Code Cache)
     -> type: STRING(used)
StartTimestamp: 1970-01-01 00:00:00 +0000 UTC
Timestamp: 2021-11-08 12:53:24.681 +0000 UTC
Value: 2614016
NumberDataPoints #8

Any idea why Data Prepper and the collector are not communicating?

I am starting the collector as below:

docker run --rm -p 13133:13133 -p 14250:14250 -p 14268:14268 -p 55678-55679:55678-55679 -p 4317:4317 -p 8888:8888 -p 9411:9411 --name otelcol otel/opentelemetry-collector 


I've never used Data Prepper, but the port you are exposing (21890) doesn't seem to be the one the logs show Data Prepper using (4900).
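One thing to double-check (a general Docker point, not specific to Data Prepper): --expose only marks a port as exposed; it does not publish it to the host. If the collector is running outside the Data Prepper container's network and needs to reach the OTel trace source on 21890, publishing the port with -p is usually what you want, for example (paths reused from earlier in the thread, shown only as a sketch):

docker run -p 21890:21890 -v '/c/program files/docker/cli-plugins/pipelines.yaml:/usr/share/data-prepper/pipelines.yaml' -v '/c/program files/docker/cli-plugins/data-prepper-config.yaml:/usr/share/data-prepper/data-prepper-config.yaml' opensearchproject/data-prepper:latest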