Load consumer properties from file #5088

Closed
jsargiot wants to merge 1 commit

Conversation

jsargiot commented Sep 7, 2018

Load custom Kafka Consumer properties from a file

Description

This change allows defining a custom properties file that is loaded to override the default configuration of the KafkaTransport, while also allowing other Consumer properties to be set (such as TLS certificates and passwords).

The loading order of the properties also changes to give default values the lowest precedence: the properties from the consumer properties file are loaded next and override the defaults, and the input configuration is applied last and overrides the consumer properties. The consumer client.id is also moved to the final stage to avoid being overridden.

Empty values in the consumer properties file are ignored.

Failing to read the consumer properties file stops the KafkaTransport, on the assumption that missing properties could affect the connection in unexpected ways.
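
For illustration, here is a minimal sketch of that loading order in plain Java; the class, method, and parameter names are hypothetical and not the actual KafkaTransport code:

```java
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

// Sketch of the described precedence: defaults < consumer properties file < input
// configuration, with client.id set last so it cannot be overridden.
final class ConsumerPropertiesSketch {
    static Properties buildConsumerProperties(Properties defaults,
                                              String consumerPropertiesPath,
                                              Properties inputConfiguration,
                                              String clientId) throws IOException {
        final Properties props = new Properties();
        props.putAll(defaults);                       // lowest precedence

        if (consumerPropertiesPath != null && !consumerPropertiesPath.isEmpty()) {
            final Properties fromFile = new Properties();
            try (InputStream in = new FileInputStream(consumerPropertiesPath)) {
                fromFile.load(in);                    // a failure here should stop the transport
            }
            // Empty values from the file are ignored.
            fromFile.forEach((key, value) -> {
                if (value != null && !value.toString().trim().isEmpty()) {
                    props.put(key, value);
                }
            });
        }

        props.putAll(inputConfiguration);             // input configuration overrides the file
        props.put("client.id", clientId);             // set last to avoid being overridden
        return props;
    }
}
```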

Note: I'm creating the PR against branch 2.5; please let me know if it should target a different branch. I was initially going for 2.4, but I guess that ship has sailed (happy to move the PR to 2.4 if you say so).

Motivation and Context

The fields provided in the Kafka Input configuration do not cover TLS configuration. We could add those, but there are other settings people need to tweak depending on their environment, and adding a field every time someone has a specific need is a never-ending story. Instead, I propose loading the Kafka Consumer configuration from a config file, just like many other tools do (for example, kafka-console-consumer.sh with the --consumer.config argument). This allows full configuration of the Kafka Consumer (https://kafka.apache.org/documentation/#newconsumerconfigs) without modifying Graylog.

There are a number of issues that could be solved by this approach (a sample consumer properties file illustrating these settings follows the list):

#5001: By setting fetch.message.max.bytes in the consumer properties file, the Consumer would be able to fetch larger messages.

#4481: Same as the previous issue.

#3960: Setting the ssl.* family of configuration options would automatically add SSL support to the client (once the new client is merged; the old client does not support SSL).

#5073: Because of the order in which properties are loaded, the group.id can be modified from the consumer.properties file. PR #5073 would still be a nice addition, but the same can be achieved without touching the UI.

#4770: The use of a consumer properties file would serve as a base for this PR. The SSL-specific additions would have to be removed (in my opinion), because there are several security.protocol options, not just SSL, and having dedicated fields for each one is just mad.
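
For illustration only, a consumer properties file covering the settings mentioned above might look like the following; the keys come from the Kafka consumer documentation, and all paths, passwords, and values are placeholders:

```properties
# Fetch larger messages (#5001, #4481)
fetch.message.max.bytes=10485760

# TLS support (#3960) -- relevant once the new consumer client is in place
security.protocol=SSL
ssl.truststore.location=/etc/graylog/kafka/truststore.jks
ssl.truststore.password=changeit
ssl.keystore.location=/etc/graylog/kafka/keystore.jks
ssl.keystore.password=changeit

# Override the consumer group (#5073)
group.id=graylog-custom-consumers
```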

How Has This Been Tested?

Created a Travis build to ensure the build is OK:
https://travis-ci.org/jsargiot/graylog2-server/builds/427290269

Tested a locally-built package against:
  • Kafka 0.10
  • Kafka 1.1.1

Screenshots (if appropriate):

Types of changes

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to change)

Checklist:

  • My code follows the code style of this project.
  • My change requires a change to the documentation.
  • I have updated the documentation accordingly.
  • I have read the CONTRIBUTING document.
  • I have added tests to cover my changes.
  • All new and existing tests passed.


mpfz0r (Contributor) commented Jun 8, 2020

This got implemented in #7504.

mpfz0r closed this Jun 8, 2020