Recommand · October 22, 2021

How to configure Filebeat to read log files, using ELK stack?

I am new to the ELK stack.

I need to be able to read logs from a path using Elasticsearch, Kibana, and Filebeat.
I have tried to configure them step by step following the ELK guides, but I still cannot see my logs in Kibana.

For now I am working only with localhost.

Here is how my .yml files are configured:

elasticsearch.yml:

xpack.security.enabled: true
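For reference, this is roughly how the built-in user passwords can be set once security is enabled (a sketch assuming a default 7.x Windows zip install; paths and credentials may differ in your setup):

```shell
# Run once from the Elasticsearch installation directory to set
# passwords for the built-in users (elastic, kibana_system, ...):
bin\elasticsearch-setup-passwords.bat interactive

# Verify that authentication works with the credentials used below:
curl -u elastic:elastic1 http://localhost:9200
```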

kibana.yml:

elasticsearch.username: "elastic"
elasticsearch.password: "elastic1"

filebeat.yml:

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - C:\\logs\\*.log


- type: filestream
  enabled: false
  paths:
    - C:\logs\*

filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false

setup.template.settings:
  index.number_of_shards: 1


setup.kibana:
  host: "localhost:5601"
  username: "kibana_system"
  password: "kibana_system1"


output.elasticsearch:
  hosts: ["localhost:9200"]
  username: "elastic"
  password: "elastic1"
setup.kibana:
  host: "localhost:5601"

processors:
  - add_host_metadata:
      when.not.contains.tags: forwarded
  - add_cloud_metadata: ~
  - add_docker_metadata: ~
  - add_kubernetes_metadata: ~
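The configuration above can be sanity-checked with Filebeat's built-in test commands before starting it (a sketch; the `.\filebeat.exe` path assumes the Windows zip install run from the Filebeat directory):

```shell
# Validate the filebeat.yml syntax:
.\filebeat.exe test config

# Check the connection to the configured Elasticsearch output,
# including the credentials:
.\filebeat.exe test output
```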

So I start Elasticsearch and Kibana, and they run fine. I set up Filebeat using PowerShell as described in the guide. Many dashboards are loaded, but I can't see anything related to my logs in the Discover tab…
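One way to narrow this down is to check whether Filebeat created any index in Elasticsearch at all (a sketch using curl with the credentials from the question; the `filebeat-*` index pattern is the default, so adjust it if you changed `setup.template` or the output index):

```shell
# List any Filebeat indices that exist:
curl -u elastic:elastic1 "http://localhost:9200/_cat/indices/filebeat-*?v"

# If an index exists, count the documents in it:
curl -u elastic:elastic1 "http://localhost:9200/filebeat-*/_count"
```

If no index appears here, the problem is on the Filebeat side (input paths or output); if documents are present but Discover is empty, the problem is likely the index pattern or time range in Kibana.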

Please tell me if I am doing anything wrong, or whether I need to configure the files further.