How to set the file_fingerprint_lines option via --log-opt in the docker run command for the Docker AWS log driver
"But I didn't find any resources explaining how to set the file_fingerprint_lines option with the docker run command."
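For context, the awslogs Docker log driver only accepts its own set of --log-opt keys (such as awslogs-region, awslogs-group, and awslogs-stream); file_fingerprint_lines is not one of them. A sketch of a typical invocation (the group and stream names below are placeholders):

```shell
# Sketch: the awslogs driver's documented --log-opt keys.
# "my-log-group" and "my-stream" are placeholder values.
docker run \
  --log-driver=awslogs \
  --log-opt awslogs-region=us-east-1 \
  --log-opt awslogs-group=my-log-group \
  --log-opt awslogs-stream=my-stream \
  alpine echo hello
```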
I think you have to set it in the CloudWatch Logs agent configuration file instead, since the awslogs log driver doesn't expose that option:
From the Amazon CloudWatch docs:
file_fingerprint_lines
Specifies the range of lines for identifying a file. The valid values are one number or two dash-delimited numbers, such as '1' or '2-5'. The default value is '1', so the first line is used to calculate the fingerprint. Fingerprint lines are not sent to CloudWatch Logs unless all the specified lines are available.
But I think the interesting part comes here:
What kinds of file rotations are supported?
The following file rotation mechanisms are supported:
Renaming existing log files with a numerical suffix, then re-creating the original empty log file. For example, /var/log/syslog.log is renamed /var/log/syslog.log.1. If /var/log/syslog.log.1 already exists from a previous rotation, it is renamed /var/log/syslog.log.2.
Truncating the original log file in place after creating a copy. For example, /var/log/syslog.log is copied to /var/log/syslog.log.1 and /var/log/syslog.log is truncated. There might be data loss for this case, so be careful about using this file rotation mechanism.
Creating a new file with a common pattern as the old one. For example, /var/log/syslog.log.2014-01-01 remains and /var/log/syslog.log.2014-01-02 is created.
The fingerprint (source ID) of the file is calculated by hashing the log stream key and the first line of file content. To override this behavior, the file_fingerprint_lines option can be used. When file rotation happens, the new file is supposed to have new content and the old file is not supposed to have content appended; the agent pushes the new file after it finishes reading the old file.
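To make the fingerprint idea concrete, here is an illustrative sketch (not the agent's actual code, and MD5 is my assumption; AWS doesn't document the hash) of hashing the log stream key together with the first N lines of a file. Note how renaming a file during rotation keeps its first line, and therefore its fingerprint, unchanged:

```python
import hashlib

def file_fingerprint(stream_key, lines, fingerprint_lines=1):
    """Illustrative sketch only: hash the log stream key together with
    the first `fingerprint_lines` lines of the file's content."""
    h = hashlib.md5()
    h.update(stream_key.encode("utf-8"))
    for line in lines[:fingerprint_lines]:
        h.update(line.encode("utf-8"))
    return h.hexdigest()

# Rotation by renaming: /var/log/syslog.log becomes syslog.log.1 but
# keeps the same first line, so its fingerprint does not change and the
# agent knows it has already read it.
original = file_fingerprint("logstream1", ["boot: kernel 5.4", "line 2"])
renamed = file_fingerprint("logstream1", ["boot: kernel 5.4", "new line"])
print(original == renamed)  # True: same key and same first line
```

With file_fingerprint_lines = 1-3, three lines would feed the hash, which helps when many files share an identical first line (for example, a common header).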
And here is how to override it:
You can have more than one [logstream] section, but each must have a unique name within the configuration file, e.g., [logstream1], [logstream2], and so on. The [logstream] value, along with the first line of data in the log file, defines the log file's identity.
[general]
state_file = value
logging_config_file = value
use_gzip_http_content_encoding = [true | false]

[logstream1]
log_group_name = value
log_stream_name = value
datetime_format = value
time_zone = [LOCAL|UTC]
file = value
file_fingerprint_lines = integer | integer-integer
multi_line_start_pattern = regex | {datetime_format}
initial_position = [start_of_file | end_of_file]
encoding = [ascii|utf_8|..]
buffer_duration = integer
batch_count = integer
batch_size = integer

[logstream2]
...
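As a sketch, a minimal agent configuration that sets file_fingerprint_lines could look like this (the state file path, group name, and log path below are placeholder assumptions, not values from the docs):

```ini
; Sketch only: placeholder paths and names.
[general]
state_file = /var/lib/awslogs/agent-state

[syslog]
log_group_name = my-log-group
log_stream_name = {instance_id}
file = /var/log/syslog.log
; Hash the first three lines instead of just the first one.
file_fingerprint_lines = 1-3
initial_position = start_of_file
```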