Setting ICAgent Structuring Parsing Rules
You can use ICAgent to collect logs to LTS. When creating a log ingestion configuration, you can customize settings such as parsing, whitelist, and blacklist rules, and raw log uploading. An ICAgent collection configuration defines the process of collecting logs of the same type from a server, parsing them, and sending them to a specified log stream.
ICAgent supports only RE2 regular expressions. For details, see Syntax.
Advantages
- Non-intrusive collection of log files: You do not need to modify application code, and log collection does not affect how your applications run.
- Exception handling during log collection: Protective measures such as proactive retries and local caching are taken when a network or server exception occurs.
- Centralized management: After installing ICAgent, you only need to set configurations such as host groups and ICAgent collection on the LTS console.
- Comprehensive self-protection mechanisms: ICAgent is designed with strict limits and protective measures regarding CPU, memory, and network usage, to minimize its impact on other services sharing the same server.
LTS enables combined parsing, allowing you to configure different structuring parsing rules for a log stream. Before configuring log ingestion, familiarize yourself with the structuring parsing rules of ICAgent collection.
LTS supports the following log structuring parsing rules:
- Single-Line - Full-Text Log: Each log line is displayed as a single log event.
- Multi-Line - Full-Text Log: Multiple lines of an exception log, such as a Java exception, are merged into a single log event, while regular single-line logs remain unchanged.
- JSON: parses JSON logs into key-value pairs.
- Delimiter: applicable to logs separated by fixed symbols (such as spaces, commas, and colons).
- Single-Line - Completely Regular: uses a regular expression to extract fields from single-line logs in any format. After entering a regular expression, click Verify to verify it.
- Multi-Line - Completely Regular: uses a regular expression to extract fields from multi-line logs in any format. The regular expression of the first line can be automatically generated or manually entered. After entering a regular expression, click Verify to verify it.
- Combined Parsing: applicable to logs in multiple nested formats, for example, JSON logs with delimiters. For logs with complex structures that cannot be parsed using a single parsing mode (for example, completely regular or JSON), use this mode. It allows you to enter JSON code on the console to define the pipeline logic for log parsing. You can add one or more plug-in processing configurations, and ICAgent executes them one by one in the specified sequence.
Single-Line - Full-Text Log
If you want to display each line of log data as a single log on the LTS page, select Single-Line - Full-Text Log.
- Select Single-Line - Full-Text Log.
- Enable Log Filtering (disabled by default) as required and add a maximum of 20 whitelist or blacklist rules.
Figure 1 Log filtering rules
- Add a whitelist rule (available only when Log Filtering is enabled).
You can add filtering rules to retain valuable log data by applying regular expressions to the values of specified keys. A filtering rule acts as a matching criterion, collecting and reporting only logs that match the specified regular expression. When adding multiple whitelist filtering rules, you can select the And or Or relationship. This means a log will be collected when it satisfies all or any of the rules.
- Click Add and enter a key and a filtering rule (regular expression). In single-line and multi-line full-text modes, the full text is stored in a field named content, which is used as the key by default. For example, to collect log events containing hello from the log source file, set the filtering rule to .*hello.*.
- Click the verification icon in the Operation column. In the displayed dialog box, enter a field value, and click Verify to verify the rule.
Figure 2 Verification
- After the verification is successful, click OK or Close to exit the dialog box.
- Add a blacklist rule (available only when Log Filtering is enabled).
You can add filtering rules to discard unneeded log data by applying regular expressions to the values of specified keys. A filtering rule acts as a discarding criterion: logs that match the specified regular expression are not collected. When adding multiple blacklist filtering rules, you can select the And or Or relationship. This means a log will be discarded when it satisfies all or any of the rules.
- Click Add and enter a key and a filtering rule (regular expression). In single-line and multi-line full-text modes, the full text is stored in a field named content, which is used as the key by default. For example, if you do not want to collect log events containing test from the log source file, set the filtering rule to .*test.*.
- Click the verification icon in the Operation column. In the displayed dialog box, enter a field value, and click Verify to verify the rule.
- After the verification is successful, click OK or Close to exit the dialog box.
Multi-Line - Full-Text Log
If the collected logs include multi-line exception logs, such as Java exceptions or memory faults, you can select Multi-Line - Full-Text Log. This mode merges multiple lines of an exception log into a single log event, while keeping each line of regular logs displayed as an individual log event, facilitating log storage and search.
- Select Multi-Line - Full-Text Log.
- Select a log example from existing logs or paste it from the clipboard.
- Click Select from Existing Logs, filter logs by time range, select a log event, and click OK.
- Click Paste from Clipboard to paste the copied log content to the Log Example box.
- A regular expression can be automatically generated or manually entered under Regular Expression of the First Line. The regular expression of the first line must match the entire first line, not just the beginning of the first line.
Figure 3 Regular expression of the first line
- Enable Log Filtering (disabled by default) as required and add a maximum of 20 whitelist or blacklist rules.
Figure 4 Log filtering rules
- Add a whitelist rule (available only when Log Filtering is enabled).
You can add filtering rules to retain valuable log data by applying regular expressions to the values of specified keys. A filtering rule acts as a matching criterion, collecting and reporting only logs that match the specified regular expression. When adding multiple whitelist filtering rules, you can select the And or Or relationship. This means a log will be collected when it satisfies all or any of the rules.
- Click Add and enter a key and a filtering rule (regular expression). In single-line and multi-line full-text modes, the full text is stored in a field named content, which is used as the key by default. For example, to collect log events containing hello from the log source file, set the filtering rule to .*hello.*.
- Click the verification icon in the Operation column. In the displayed dialog box, enter a field value, and click Verify to verify the rule.
Figure 5 Verification
- After the verification is successful, click OK or Close to exit the dialog box.
- Add a blacklist rule (available only when Log Filtering is enabled).
You can add filtering rules to discard unneeded log data by applying regular expressions to the values of specified keys. A filtering rule acts as a discarding criterion: logs that match the specified regular expression are not collected. When adding multiple blacklist filtering rules, you can select the And or Or relationship. This means a log will be discarded when it satisfies all or any of the rules.
- Click Add and enter a key and a filtering rule (regular expression). In single-line and multi-line full-text modes, the full text is stored in a field named content, which is used as the key by default. For example, if you do not want to collect log events containing test from the log source file, set the filtering rule to .*test.*.
- Click the verification icon in the Operation column. In the displayed dialog box, enter a field value, and click Verify to verify the rule.
- After the verification is successful, click OK or Close to exit the dialog box.
JSON
This option is applicable to JSON logs and splits them into key-value pairs.
- Choose JSON.
- Enable Log Filtering (disabled by default) as required and add a maximum of 20 whitelist or blacklist rules.
Figure 6 Log filtering rules
- Add a whitelist rule (available only when Log Filtering is enabled).
You can add filtering rules to retain valuable log data by applying regular expressions to the values of specified keys. A filtering rule acts as a matching criterion, collecting and reporting only logs that match the specified regular expression. When adding multiple whitelist filtering rules, you can select the And or Or relationship. This means a log will be collected when it satisfies all or any of the rules.
- Click Add and enter a key value and a filtering rule (regular expression). A key value is a log field name. For example, to collect log events containing hello, enter hello as the key value, and set the filtering rule to .*hello.*.
- Click the verification icon in the Operation column. In the displayed dialog box, enter a field value, and click Verify to verify the rule.
Figure 7 Verification
- After the verification is successful, click OK or Close to exit the dialog box.
- Add a blacklist rule (available only when Log Filtering is enabled).
You can add filtering rules to discard unneeded log data by applying regular expressions to the values of specified keys. A filtering rule acts as a discarding criterion: logs that match the specified regular expression are not collected. When adding multiple blacklist filtering rules, you can select the And or Or relationship. This means a log will be excluded when it satisfies all or any of the rules.
- Click Add and enter a key value and a filtering rule (regular expression). A key value is a log field name. For example, if you do not want to collect log events containing test, enter test as the key value, and set the filtering rule to .*test.*.
- Click the verification icon in the Operation column. In the displayed dialog box, enter a field value, and click Verify to verify the rule.
- After the verification is successful, click OK or Close to exit the dialog box.
- Raw Log Upload:
After this function is enabled, raw logs are uploaded to LTS as the value of the content field.
- Upload Parsing Failure Log:
After this function is enabled, raw logs are uploaded to LTS as the value of the _content_parse_fail_ field.
- The following describes how logs are reported when Raw Log Upload and Upload Parsing Failure Log are enabled or disabled.
This function is available only in region AP-Singapore.
Figure 8 Structuring parsing
Table 1 Log reporting description
Parameter | Description
---|---
Raw Log Upload enabled; Upload Parsing Failure Log enabled | Parsing succeeded: the parsed log and the raw log's content field are reported. Parsing failed: to avoid redundancy, only the raw log's content field is reported; the _content_parse_fail_ field is not reported.
Raw Log Upload enabled; Upload Parsing Failure Log disabled | Parsing succeeded: the parsed log and the raw log's content field are reported. Parsing failed: the raw log's content field is reported.
Raw Log Upload disabled; Upload Parsing Failure Log enabled | Parsing succeeded: the parsed log is reported. Parsing failed: the _content_parse_fail_ field is reported.
Raw Log Upload disabled; Upload Parsing Failure Log disabled | Parsing succeeded: the parsed log is reported. Parsing failed: only the system built-in and label fields are reported.
- Custom Time:
Enabling this lets you specify a field as the log time. Otherwise, the time set during ingestion configuration is used.
- JSON Parsing Layers: Configure the JSON parsing layers. The value must be an integer ranging from 1 (default) to 4.
This function expands the fields of a JSON log. For example, for raw log {"key1":{"key2":"value"}}, if you choose to parse it into 1 layer, the log will become {"key1":{"key2":"value"}}; if you choose to parse it into 2 layers, the log will become {"key1.key2":"value"}.
- JSON String Parsing: disabled by default. After this function is enabled, escaped JSON strings can be parsed into JSON objects. For example, {"key1":"{\"key2\":\"value\"}"} can be parsed into key1.key2:value.
Delimiter
Logs can be parsed by delimiters, such as commas (,), spaces, or other special characters.
- Select Delimiter.
- Select or customize a delimiter.
- Select a log example from existing logs or paste it from the clipboard, click Verify, and view the results under Extraction Results.
- Click Select from Existing Logs, select a log event, and click OK. You can select different time ranges to filter logs.
- Click Paste from Clipboard to paste the copied log content to the Log Example box.
Figure 9 Delimiter
- Enable Log Filtering (disabled by default) as required and add a maximum of 20 whitelist or blacklist rules.
Figure 10 Log filtering rules
- Add a whitelist rule (available only when Log Filtering is enabled).
You can add filtering rules to retain valuable log data by applying regular expressions to the values of specified keys. A filtering rule acts as a matching criterion, collecting and reporting only logs that match the specified regular expression. When adding multiple whitelist filtering rules, you can select the And or Or relationship. This means a log will be collected when it satisfies all or any of the rules.
- Click Add and enter a key value and a filtering rule (regular expression). A key value is a log field name. For example, to collect log events containing hello, enter hello as the key value, and set the filtering rule to .*hello.*.
- Click the verification icon in the Operation column. In the displayed dialog box, enter a field value, and click Verify to verify the rule.
Figure 11 Verification
- After the verification is successful, click OK or Close to exit the dialog box.
- Add a blacklist rule (available only when Log Filtering is enabled).
You can add filtering rules to discard unneeded log data by applying regular expressions to the values of specified keys. A filtering rule acts as a discarding criterion: logs that match the specified regular expression are not collected. When adding multiple blacklist filtering rules, you can select the And or Or relationship. This means a log will be excluded when it satisfies all or any of the rules.
- Click Add and enter a key value and a filtering rule (regular expression). A key value is a log field name. For example, if you do not want to collect log events containing test, enter test as the key value, and set the filtering rule to .*test.*.
- Click the verification icon in the Operation column. In the displayed dialog box, enter a field value, and click Verify to verify the rule.
- After the verification is successful, click OK or Close to exit the dialog box.
- Raw Log Upload:
After this function is enabled, raw logs are uploaded to LTS as the value of the content field.
- Upload Parsing Failure Log:
After this function is enabled, raw logs are uploaded to LTS as the value of the _content_parse_fail_ field.
- The following describes how logs are reported when Raw Log Upload and Upload Parsing Failure Log are enabled or disabled.
This function is available only in region AP-Singapore.
Figure 12 Structuring parsing
Table 2 Log reporting description
Parameter | Description
---|---
Raw Log Upload enabled; Upload Parsing Failure Log enabled | Parsing succeeded: the parsed log and the raw log's content field are reported. Parsing failed: to avoid redundancy, only the raw log's content field is reported; the _content_parse_fail_ field is not reported.
Raw Log Upload enabled; Upload Parsing Failure Log disabled | Parsing succeeded: the parsed log and the raw log's content field are reported. Parsing failed: the raw log's content field is reported.
Raw Log Upload disabled; Upload Parsing Failure Log enabled | Parsing succeeded: the parsed log is reported. Parsing failed: the _content_parse_fail_ field is reported.
Raw Log Upload disabled; Upload Parsing Failure Log disabled | Parsing succeeded: the parsed log is reported. Parsing failed: only the system built-in and label fields are reported.
- Custom Time:
Enabling this lets you specify a field as the log time. Otherwise, the time set during ingestion configuration is used.
Single-Line - Completely Regular
This option is applicable to single-line logs in any format and uses a regular expression to extract fields.
- Select Single-Line - Completely Regular.
- Select a log example from existing logs or paste it from the clipboard.
- Click Select from Existing Logs, select a log event, and click OK. You can select different time ranges to filter logs.
- Click Paste from Clipboard to paste the copied log content to the Log Example box.
- Enter a regular expression for extracting the log under Extraction Regular Expression, click Verify, and view the results under Extraction Results.
Alternatively, click automatic generation of regular expressions. In the displayed dialog box, extract fields based on the log example, enter the key, and click OK to automatically generate a regular expression. Then, click OK.
Figure 13 Extracting a regular expression
- Enable Log Filtering (disabled by default) as required and add a maximum of 20 whitelist or blacklist rules.
Figure 14 Log filtering rules
- Add a whitelist rule (available only when Log Filtering is enabled).
You can add filtering rules to retain valuable log data by applying regular expressions to the values of specified keys. A filtering rule acts as a matching criterion, collecting and reporting only logs that match the specified regular expression. When adding multiple whitelist filtering rules, you can select the And or Or relationship. This means a log will be collected when it satisfies all or any of the rules.
- Click Add and enter a key value and a filtering rule (regular expression). A key value is a log field name. For example, to collect log events containing hello, enter hello as the key value, and set the filtering rule to .*hello.*.
- Click the verification icon in the Operation column. In the displayed dialog box, enter a field value, and click Verify to verify the rule.
Figure 15 Verification
- After the verification is successful, click OK or Close to exit the dialog box.
- Add a blacklist rule (available only when Log Filtering is enabled).
You can add filtering rules to discard unneeded log data by applying regular expressions to the values of specified keys. A filtering rule acts as a discarding criterion: logs that match the specified regular expression are not collected. When adding multiple blacklist filtering rules, you can select the And or Or relationship. This means a log will be excluded when it satisfies all or any of the rules.
- Click Add and enter a key value and a filtering rule (regular expression). A key value is a log field name. For example, if you do not want to collect log events containing test, enter test as the key value, and set the filtering rule to .*test.*.
- Click the verification icon in the Operation column. In the displayed dialog box, enter a field value, and click Verify to verify the rule.
- After the verification is successful, click OK or Close to exit the dialog box.
- Raw Log Upload:
After this function is enabled, raw logs are uploaded to LTS as the value of the content field.
- Upload Parsing Failure Log:
After this function is enabled, raw logs are uploaded to LTS as the value of the _content_parse_fail_ field.
- The following describes how logs are reported when Raw Log Upload and Upload Parsing Failure Log are enabled or disabled.
This function is available only in region AP-Singapore.
Figure 16 Structuring parsing
Table 3 Log reporting description
Parameter | Description
---|---
Raw Log Upload enabled; Upload Parsing Failure Log enabled | Parsing succeeded: the parsed log and the raw log's content field are reported. Parsing failed: to avoid redundancy, only the raw log's content field is reported; the _content_parse_fail_ field is not reported.
Raw Log Upload enabled; Upload Parsing Failure Log disabled | Parsing succeeded: the parsed log and the raw log's content field are reported. Parsing failed: the raw log's content field is reported.
Raw Log Upload disabled; Upload Parsing Failure Log enabled | Parsing succeeded: the parsed log is reported. Parsing failed: the _content_parse_fail_ field is reported.
Raw Log Upload disabled; Upload Parsing Failure Log disabled | Parsing succeeded: the parsed log is reported. Parsing failed: only the system built-in and label fields are reported.
- Custom Time:
Enabling this lets you specify a field as the log time. Otherwise, the time set during ingestion configuration is used.
Multi-Line - Completely Regular
This option is applicable to multi-line logs in any format and uses a regular expression to extract fields.
- Select Multi-Line - Completely Regular.
- Select a log example from existing logs or paste it from the clipboard.
- Click Select from Existing Logs, select a log event, and click OK. You can select different time ranges to filter logs.
- Click Paste from Clipboard to paste the copied log content to the Log Example box.
- A regular expression can be automatically generated or manually entered under Regular Expression of the First Line. The regular expression of the first line must match the entire first line, not just the beginning of the first line.
The regular expression of the first line is used to identify the beginning of a multi-line log. Example log content:
2024-10-11 10:59:07.000 a.log:1 level:warn
no.1 log
2024-10-11 10:59:17.000 a.log:2 level:warn
no.2 log
A complete log event:
2024-10-11 10:59:07.000 a.log:1 level:warn
no.1 log
Its first line:
2024-10-11 10:59:07.000 a.log:1 level:warn
Example of the regular expression of the first line: ^(\d+-\d+-\d+\s+\d+:\d+:\d+\.\d+). Only the first line of each log event begins with a date, so the regular expression can be generated based on the date.
- Enter a regular expression for extracting the log under Extraction Regular Expression, click Verify, and view the results under Extraction Results.
Alternatively, click automatic generation of regular expressions. In the displayed dialog box, extract fields based on the log example, enter the key, and click OK to automatically generate a regular expression. Then, click OK.
The extraction result is the execution result of the extraction regular expression instead of the first line regular expression. To check the execution result of the first line regular expression, go to the target log stream.
If you enter an incorrect regular expression for Regular Expression of the First Line, you cannot view the reported log stream data.
Figure 17 Setting a regular expression
- Enable Log Filtering (disabled by default) as required and add a maximum of 20 whitelist or blacklist rules.
Figure 18 Log filtering rules
- Add a whitelist rule (available only when Log Filtering is enabled).
You can add filtering rules to retain valuable log data by applying regular expressions to the values of specified keys. A filtering rule acts as a matching criterion, collecting and reporting only logs that match the specified regular expression. When adding multiple whitelist filtering rules, you can select the And or Or relationship. This means a log will be collected when it satisfies all or any of the rules.
- Click Add and enter a key value and a filtering rule (regular expression). A key value is a log field name. For example, to collect log events containing hello, enter hello as the key value, and set the filtering rule to .*hello.*.
- Click the verification icon in the Operation column. In the displayed dialog box, enter a field value, and click Verify to verify the rule.
Figure 19 Verification
- After the verification is successful, click OK or Close to exit the dialog box.
- Add a blacklist rule (available only when Log Filtering is enabled).
You can add filtering rules to discard unneeded log data by applying regular expressions to the values of specified keys. A filtering rule acts as a discarding criterion: logs that match the specified regular expression are not collected. When adding multiple blacklist filtering rules, you can select the And or Or relationship. This means a log will be excluded when it satisfies all or any of the rules.
- Click Add and enter a key value and a filtering rule (regular expression). A key value is a log field name. For example, if you do not want to collect log events containing test, enter test as the key value, and set the filtering rule to .*test.*.
- Click the verification icon in the Operation column. In the displayed dialog box, enter a field value, and click Verify to verify the rule.
- After the verification is successful, click OK or Close to exit the dialog box.
- Raw Log Upload:
After this function is enabled, raw logs are uploaded to LTS as the value of the content field.
- Upload Parsing Failure Log:
After this function is enabled, raw logs are uploaded to LTS as the value of the _content_parse_fail_ field.
- The following describes how logs are reported when Raw Log Upload and Upload Parsing Failure Log are enabled or disabled.
This function is available only in region AP-Singapore.
Figure 20 Structuring parsing
Table 4 Log reporting description
Parameter | Description
---|---
Raw Log Upload enabled; Upload Parsing Failure Log enabled | Parsing succeeded: the parsed log and the raw log's content field are reported. Parsing failed: to avoid redundancy, only the raw log's content field is reported; the _content_parse_fail_ field is not reported.
Raw Log Upload enabled; Upload Parsing Failure Log disabled | Parsing succeeded: the parsed log and the raw log's content field are reported. Parsing failed: the raw log's content field is reported.
Raw Log Upload disabled; Upload Parsing Failure Log enabled | Parsing succeeded: the parsed log is reported. Parsing failed: the _content_parse_fail_ field is reported.
Raw Log Upload disabled; Upload Parsing Failure Log disabled | Parsing succeeded: the parsed log is reported. Parsing failed: only the system built-in and label fields are reported.
- Custom Time:
Enabling this lets you specify a field as the log time. Otherwise, the time set during ingestion configuration is used.
Combined Parsing
This option is applicable to logs in multiple nested formats, for example, JSON logs with delimiters. You can customize parsing rules based on the syntax.
- Select Combined Parsing.
- Select a log example from existing logs or paste it from the clipboard and enter the configuration content under Plug-in Settings.
- Customize the settings based on the log content by referring to the following plug-in syntax. After entering the plug-in configuration, you can click Verify to view the parsing result under Extraction Results. You can learn how to use the plug-in configurations from the log example provided in the walkthrough below.
This verification function is available only in regions CN North-Beijing4, CN North-Ulanqab1, CN East-Shanghai1, CN-Hong Kong, and AP-Singapore.
- processor_regex
Table 5 Regular expression extraction
Parameter | Type | Description
---|---|---
source_key | string | Original field name.
regex | string | Regular expression. () indicates a field to be extracted.
keys | string array | Field names for the extracted content.
keep_source | boolean | Whether to retain the original field.
keep_source_if_parse_error | boolean | Whether to retain the original field when a parsing error occurs.
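For reference, a minimal processor_regex configuration sketch is shown below. The field names level, module, and message are illustrative assumptions, not part of this document's example; adjust the regular expression to your own log format. A complete worked example of this plug-in appears in the walkthrough below.
[
  {
    "detail": {
      "source_key": "content",
      "regex": "(\\w+)\\s+(\\w+)\\s+(.*)",
      "keys": ["level", "module", "message"],
      "keep_source": true,
      "keep_source_if_parse_error": true
    },
    "type": "processor_regex"
  }
]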
- processor_split_string
Table 6 Parsing using delimiters
Parameter | Type | Description
---|---|---
source_key | string | Original field name.
split_sep | string | Delimiter string.
keys | string array | Field names for the extracted content.
keep_source | boolean | Whether to retain the original field in the parsed log.
split_type | char/special_char/string | Delimiter type: char (single character), special_char (invisible character), or string.
keep_source_if_parse_error | boolean | Whether to retain the original field when a parsing error occurs.
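For reference, a minimal sketch that splits a comma-separated content field into three fields. The names time, level, and message are illustrative assumptions:
[
  {
    "detail": {
      "source_key": "content",
      "split_sep": ",",
      "split_type": "char",
      "keys": ["time", "level", "message"],
      "keep_source": false,
      "keep_source_if_parse_error": true
    },
    "type": "processor_split_string"
  }
]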
- processor_split_key_value
Table 7 Key-value pair segmentation
Parameter | Type | Description
---|---|---
source_key | string | Original field name.
split_sep | string | Delimiter between key-value pairs. The default value is the tab character (\t).
expand_connector | string | Delimiter between the key and value in a key-value pair. The default value is a colon (:).
keep_source | boolean | Whether to retain the original field in the parsed log.
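For reference, a minimal sketch using the default delimiters. Assuming content holds tab-separated pairs such as class:main and level:warn (hypothetical data), this configuration expands them into the fields class and level:
[
  {
    "detail": {
      "source_key": "content",
      "split_sep": "\t",
      "expand_connector": ":",
      "keep_source": false
    },
    "type": "processor_split_key_value"
  }
]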
- processor_add_fields
Table 8 Adding a field
Parameter | Type | Description
---|---|---
fields | json/object | Name and value of the field to be added, in key-value pair format. Multiple key-value pairs can be added.
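For reference, a minimal sketch that adds two fixed fields to every log. The names service and env are illustrative assumptions:
[
  {
    "detail": {
      "fields": {
        "service": "frontend",
        "env": "prod"
      }
    },
    "type": "processor_add_fields"
  }
]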
- processor_drop
Table 9 Discarded fields
Parameter | Type | Description
---|---|---
drop_keys | string array | List of discarded fields.
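For reference, a minimal sketch that discards the key2 and key3 fields extracted by an earlier plug-in (field names taken from the walkthrough below):
[
  {
    "detail": {
      "drop_keys": ["key2", "key3"]
    },
    "type": "processor_drop"
  }
]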
- processor_rename
Table 10 Renaming a field
Parameter | Type | Description
---|---|---
source_keys | string array | Original names.
dest_keys | string array | New names.
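For reference, a minimal sketch that renames the nowtime and level fields (names taken from the walkthrough below) to the hypothetical names time and severity:
[
  {
    "detail": {
      "source_keys": ["nowtime", "level"],
      "dest_keys": ["time", "severity"]
    },
    "type": "processor_rename"
  }
]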
- processor_json
Table 11 JSON expansion and extraction
Parameter | Type | Description
---|---|---
source_key | string | Original field name.
keep_source | boolean | Whether to retain the original field in the parsed log.
expand_depth | int | JSON expansion depth. The default value 0 indicates that the depth is not limited. A value such as 1 indicates that the JSON is expanded only to that level.
expand_connector | string | Connector for expanding JSON. The default value is a period (.).
prefix | string | Prefix added to field names when JSON is expanded.
keep_source_if_parse_error | boolean | Whether to retain the original field when a parsing error occurs.
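For reference, a minimal sketch assuming a field jsonmsg that holds {"key1":{"key2":"value"}}. Based on the parameter descriptions above, expanding one level with the prefix msg_ should produce a field such as msg_key1 whose value is the unexpanded object {"key2":"value"}:
[
  {
    "detail": {
      "source_key": "jsonmsg",
      "expand_depth": 1,
      "expand_connector": ".",
      "prefix": "msg_",
      "keep_source": false
    },
    "type": "processor_json"
  }
]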
- processor_filter_regex
Table 12 Filters
Parameter | Type | Description
---|---|---
Include | json/object | The key indicates the log field, and the value indicates the regular expression to be matched.
Exclude | json/object | The key indicates the log field, and the value indicates the regular expression to be matched.
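For reference, a minimal sketch assuming the usual include/exclude semantics (a log matching Include is retained; a log matching Exclude is discarded). The level and thread field names are taken from the walkthrough below:
[
  {
    "detail": {
      "Include": {
        "level": "WARN|ERROR"
      },
      "Exclude": {
        "thread": "thread1"
      }
    },
    "type": "processor_filter_regex"
  }
]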
- processor_gotime
Table 13 Time extraction
Parameter | Type | Description
---|---|---
source_key | string | Original field name.
source_format | string | Original time format.
source_location | int | Original time zone. If the value is empty, the time zone of the host or container is used.
set_time | boolean | Whether to set the parsed time as the log time.
keep_source | boolean | Whether to retain the original field in the parsed log.
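For reference, a minimal sketch that parses the nowtime field from the walkthrough below (for example, 2025-03-19:16:49:03) and sets it as the log time. The Go-style reference layout (2006-01-02:15:04:05) and the UTC+8 offset are assumptions suggested by the plug-in name, not confirmed by this document:
[
  {
    "detail": {
      "source_key": "nowtime",
      "source_format": "2006-01-02:15:04:05",
      "source_location": 8,
      "set_time": true,
      "keep_source": true
    },
    "type": "processor_gotime"
  }
]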
- processor_base64_decoding
Table 14 Base64 decoding
Parameter | Type | Description
---|---|---
source_key | string | Original field name.
dest_key | string | Target field after parsing.
keep_source_if_parse_error | boolean | Whether to retain the original field when a parsing error occurs.
keep_source | boolean | Whether to retain the original field in the parsed log.
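For reference, a minimal sketch assuming a hypothetical Base64-encoded field named payload:
[
  {
    "detail": {
      "source_key": "payload",
      "dest_key": "payload_text",
      "keep_source": false,
      "keep_source_if_parse_error": true
    },
    "type": "processor_base64_decoding"
  }
]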
- processor_base64_encoding
Table 15 Base64 encoding
Parameter | Type | Description
---|---|---
source_key | string | Original field name.
dest_key | string | Target field after parsing.
keep_source | boolean | Whether to retain the original field in the parsed log.
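For reference, a minimal sketch that is the inverse of the decoding example above; payload_text and payload_b64 are hypothetical field names:
[
  {
    "detail": {
      "source_key": "payload_text",
      "dest_key": "payload_b64",
      "keep_source": true
    },
    "type": "processor_base64_encoding"
  }
]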
- Parse the following log example by combining the plug-ins provided. This function is available only in region AP-Singapore.
Raw log example (for reference only):
2025-03-19:16:49:03 [INFO] [thread1] {"ref":"https://www.test.com/","curl":"https://www.test.com/so/search?spm=1000.1111.2222.3333&q=linux%20opt%testabcd&t=&u=","sign":"1234567890","pid":"so","0508":{"sign":"112233445566 English bb Error, INFO, error bb&&bb","float":15.25,"long":15},"float":15.25,"long":15}
You can parse fields by using combined plug-ins. The following is an example of a complete plug-in configuration. You can copy it to Plug-in Settings and click Verify to view the parsing result.
[{ "detail": { "keys": ["nowtime", "level", "thread", "jsonmsg"], "keep_source": true, "regex": "(\\d{4}-\\d{2}-\\d{2}:\\d{2}:\\d{2}:\\d{2})\\s+\\[(\\w+)\\]\\s+\\[(\\w+)\\]\\s+(.*)", "source_key": "content" }, "type": "processor_regex" }, { "detail": { "expand_connector": ".", "expand_depth": 4, "keep_source": true, "source_key": "jsonmsg" }, "type": "processor_json" }, { "detail": { "keep_source": true, "keep_source_if_parse_error": true, "keys": [ "key1", "key2", "key3" ], "source_key": "0508.sign", "split_sep": ",", "split_type": "char" }, "type": "processor_split_string" }, { "detail": { "keep_source": true, "keep_source_if_parse_error": true, "keys": [ "a1", "a2", "a3" ], "regex": "^(\\w+)(?:[^ ]* ){1}(\\w+)(?:[^ ]* ){2}([^\\$]+)", "source_key": "key1" }, "type": "processor_regex" } ]
The following describes the plug-in configuration and parsing results of the example.
- Copy the raw log example to the Log Example box under Combined Parsing.
Raw log example (for reference only):
2025-03-19:16:49:03 [INFO] [thread1] {"ref":"https://www.test.com/","curl":"https://www.test.com/so/search?spm=1000.1111.2222.3333&q=linux%20opt%testabcd&t=&u=","sign":"1234567890","pid":"so","0508":{"sign":"112233445566 English bb Error, INFO, error bb&&bb","float":15.25,"long":15},"float":15.25,"long":15}
Figure 21 Log example
- In the raw log example, the log time is 2025-03-19:16:49:03, the log level is [INFO], the thread ID is [thread1], and the log content is a complete JSON object as follows:
{"ref":"https://www.test.com/","curl":"https://www.test.com/so/search?spm=1000.1111.2222.3333&q=linux%20opt%testabcd&t=&u=","sign":"1234567890","pid":"so","0508":{"sign":"112233445566 English bb Error, INFO, error bb&&bb","float":15.25,"long":15},"float":15.25,"long":15}
- Extract fields from the log based on the log time, log level, thread ID, and log content. These parts are separated by spaces, but the JSON text may also contain spaces, so the delimiter plug-in is not suitable here. Use the regular expression extraction plug-in (processor_regex) instead.
Regular expression examples:
For extracting the log time as nowtime: (\d{4}-\d{2}-\d{2}:\d{2}:\d{2}:\d{2})
For extracting the log level as level: \[(\w+)\]
For extracting the thread ID as thread: \[(\w+)\]
For extracting the log content as jsonmsg: (.*)
Joining these expressions with the whitespace pattern \s+ yields the complete extraction regular expression: (\d{4}-\d{2}-\d{2}:\d{2}:\d{2}:\d{2})\s+\[(\w+)\]\s+\[(\w+)\]\s+(.*)
The processor_regex plug-in is configured as follows. Replace the value of regex with your regular expression, and replace each \ in the regular expression with \\ for escaping.
[
  {
    "detail": {
      "keys": ["nowtime", "level", "thread", "jsonmsg"],
      "regex": "(\\d{4}-\\d{2}-\\d{2}:\\d{2}:\\d{2}:\\d{2})\\s+\\[(\\w+)\\]\\s+\\[(\\w+)\\]\\s+(.*)",
      "source_key": "content"
    },
    "type": "processor_regex"
  }
]
- Copy the configuration content to the Plug-in Settings box and click Verify. The extraction result will be displayed under Extraction Results, showing that the keys and values are correctly extracted.
- You can use the JSON plug-in (processor_json) to expand the log content (jsonmsg) in JSON format. (Separate consecutive plug-in objects with commas.)
,
{
  "detail": {
    "expand_connector": ".",
    "expand_depth": 4,
    "keep_source": true,
    "source_key": "jsonmsg"
  },
  "type": "processor_json"
}
Add the JSON plug-in (processor_json) to the end of the regular expression extraction plug-in (processor_regex) configuration, and click Verify. The extraction result will be displayed.
- To split the 0508.sign field extracted from JSON, you can use the delimiter parsing plug-in (processor_split_string) to split the field using commas (,) and extract the key1, key2, and key3 fields.
,{ "detail": { "keys": [ "key1", "key2", "key3" ], "source_key": "0508.sign", "split_sep": ",", "split_type": "char" }, "type": "processor_split_string" }
Add the delimiter parsing plug-in (processor_split_string) to the end of the JSON plug-in (processor_json) configuration, and click Verify. The key1, key2, and key3 fields will be displayed under Extraction Results.
- You can also use the regular expression extraction plug-in processor_regex to extract fields such as a1, a2, and a3 from key1.
, { "detail": { "keep_source": true, "keep_source_if_parse_error": true, "keys": [ "a1", "a2", "a3" ], "regex": "^(\\w+)(?:[^ ]* ){1}(\\w+)(?:[^ ]* ){2}([^\\$]+)", "source_key": "key1" }, "type": "processor_regex" }
Add the regular expression extraction plug-in (processor_regex) configuration to the end of the delimiter parsing plug-in (processor_split_string) configuration, and click Verify. The a1, a2, and a3 fields will be displayed under Extraction Results.
Custom Time
Enable Custom Time and set parameters by referring to Table 16.
- If the time format is incorrect or the specified field does not exist, the log time is the time set during ingestion configuration.
- The time field must be verified again after operations such as renaming fields, deleting fields, or changing field types in the structuring parsing configuration.
Table 16 Custom time parameters
Parameter | Description | Example
---|---|---
Key Name of the Time Field | Name of an extracted field. You can select an extracted field from the drop-down list. The field must be of the string or long type. | test
Field Value | For an extracted field, after you select a key, its value is automatically filled in. | 2023-07-19 12:12:00
Time Format | For details, see Common Log Time Formats. | yyyy-MM-dd HH:mm:ss
Operation | Click the verification icon to verify that the time format matches the field value. | -
Common Log Time Formats
Table 17 lists common log time formats.
By default, log timestamps in LTS are accurate to seconds. You do not need to configure information such as milliseconds and microseconds.
Table 17 Common log time formats
Format | Description | Example
---|---|---
EEE | Abbreviation for a day of the week. | Fri
EEEE | Full name of a day of the week. | Friday
MMM | Abbreviation for a month. | Jan
MMMM | Full name of a month. | January
dd | Day of the month, ranging from 01 to 31 (decimal). | 07, 31
HH | Hour, in 24-hour format. | 22
h | Hour, in 12-hour format. | 11
MM | Month, ranging from 01 to 12 (decimal). | 08
mm | Minute, ranging from 00 to 59 (decimal). | 59
a | a.m. or p.m. | am or pm
h:mm:ss a | Time, in 12-hour format. | 11:59:59 am
HH:mm | Hour and minute. | 23:59
ss | Second, ranging from 00 to 59 (decimal). | 59
yy | Year without century, ranging from 00 to 99 (decimal). | 04 or 98
yyyy | Year (decimal). | 2004 or 1998
d | Day of the month, ranging from 1 to 31 (decimal). | 7 or 31
%s | Unix timestamp. | 147618725
Examples
Table 18 lists common time standards, examples, and expressions.
Table 18 Common time standards, examples, and expressions
Example | Time Expression | Time Standard
---|---|---
2022-07-14T19:57:36+08:00 | yyyy-MM-dd'T'HH:mm:ssXXX | Custom
1548752136 | %s | Custom
27/Jan/2022:15:56:44 | dd/MMM/yyyy:HH:mm:ss | Custom
2022-07-24T10:06:41.000 | yyyy-MM-dd'T'HH:mm:ss.SSS | Custom
Monday, 02-Jan-06 15:04:05 MST | EEEE, dd-MMM-yy HH:mm:ss Z | RFC850
Mon, 02 Jan 2006 15:04:05 MST | EEE, dd MMM yyyy HH:mm:ss Z | RFC1123
02 Jan 06 15:04 MST | dd MMM yy HH:mm Z | RFC822
02 Jan 06 15:04 -0700 | dd MMM yy HH:mm Z | RFC822Z
2023-01-02T15:04:05Z07:00 | yyyy-MM-dd'T'HH:mm:ss Z | RFC3339
2022-12-11 15:05:07 | yyyy-MM-dd HH:mm:ss | Custom
2025-02-24T05:24:07.085Z | yyyy-MM-dd'T'HH:mm:ss.SSSZ | Custom