Keyword Log Listener
This is a short introduction on how to install a Keyword Log Listener. This Log Listener watches for specified keywords in the text of error messages and sends a notification email where applicable. The body of this email contains only the job number, to ensure that forwarding to an SMS gateway works (text length limitation).
Setting up a configuration file
Create a file ./conf/log_events.properties.
The file must contain at least the following entries (comments begin with "#").
# Sets the sender of the email
sender.address=john.doe@example.com
# Time in hours (Integer); Errors after that time are ignored
time.start=8
# Time in hours (Integer); Errors before that time are ignored
time.end=18
# If "false", the time settings are used, i.e. no emails are sent from Monday to Friday between 8am and 6pm
# If "true", the time settings are ignored - an email is sent, no matter what time it is (e.g. because Friday is a holiday)
ignoreTimeSettings=false
# To ignore errors for 60 minutes, starting on Thursdays at 2.30am, use the following setting
exclude.thursday.time=2:30;60
# Keywords for the receiver; keywords are separated by commas. Different keywords can be assigned to
# different receivers
john.doe@example.com=mandatory,invalid
Note: If the configuration file is modified at runtime, the new settings will be read.
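To illustrate how these entries fit together, the following sketch parses such a properties file and matches error texts against the configured keywords. This is an illustration only, not the listener's actual implementation; the function names (parse_config, receivers_for) are made up, and the rule that every non-reserved key is treated as a receiver address is an assumption based on the entries shown above.

```python
# Illustrative sketch only - NOT the listener's real implementation.
# Assumption: every key that is not a known setting is a receiver address.
RESERVED = {"sender.address", "time.start", "time.end", "ignoreTimeSettings"}

def parse_config(lines):
    """Parse properties lines into settings and a receiver -> keywords map."""
    settings, receivers = {}, {}
    for line in lines:
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blank lines and comments
        key, _, value = line.partition("=")
        key, value = key.strip(), value.strip()
        if key in RESERVED or key.startswith("exclude."):
            settings[key] = value
        else:
            # assumed convention: receiver address = comma-separated keywords
            receivers[key] = [kw.strip() for kw in value.split(",")]
    return settings, receivers

def receivers_for(error_text, receivers):
    """Return all receivers whose keywords occur in the error text."""
    return [addr for addr, kws in receivers.items()
            if any(kw in error_text for kw in kws)]
```

With the example file above, an error text containing "invalid" would be matched to john.doe@example.com, while an error text containing none of the keywords would trigger no email.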
Activating the Keyword Log Listener
Please include the following line in section "Datawizard" in file ./etc/startup.xml.
<Call name="addLogListener"><Arg><New class="com.ebd.hub.datawizard.plugin.LogListener"/></Arg></Call>
This activates the Keyword Log Listener at the next start of the Integration Server.
Note: The Log Listener accumulates notifications in a timeframe of 5 minutes to avoid sending an email for every error.
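The accumulation behaviour can be pictured roughly as follows. This is a hypothetical sketch (the class name and the flush logic are illustrative assumptions), not the listener's actual code; it only shows the idea of collecting job numbers for up to 5 minutes and sending them in a single email.

```python
import time

class NotificationBatcher:
    """Illustrative sketch (NOT the listener's real code): collect job
    numbers and hand them to a send callback once the window has elapsed."""
    WINDOW_SECONDS = 300  # 5 minutes, as described above

    def __init__(self, send, clock=time.time):
        self.send = send            # callback receiving a list of job numbers
        self.clock = clock          # injectable clock for testing
        self.pending = []
        self.window_start = None

    def on_error(self, job_number):
        now = self.clock()
        if self.window_start is None:
            self.window_start = now  # first error opens the window
        self.pending.append(job_number)
        if now - self.window_start >= self.WINDOW_SECONDS:
            self.flush()

    def flush(self):
        if self.pending:
            self.send(self.pending)  # one email for all accumulated errors
        self.pending, self.window_start = [], None
```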
Unknown Segment Log Listener
If input files contain data segments that cannot be parsed into the existing source structure, this data will be lost (without triggering a profile job error).
However, it is possible to log such data, evaluate it and, if necessary, post-process it.
Activating the Unknown Segment Log Listener
Please add the following entry in section "Datawizard" in configuration file ./etc/startup.xml.
<Call name="addLogListener">
<Arg>
<New class="com.ebd.hub.datawizard.plugin.UnknownSegmentLogListener"/>
</Arg>
</Call>
This will activate the Unknown Segment Log Listener after the next start of the Integration Server.
Adjusting log level
Now, you have to set option "Unknown segments" in the "Profile Logging" of the respective profile (or generally in the "System Logging") for phase 2, so that the corresponding messages are written to the job log in the first place.
Created log file
The Unknown Segment Log Listener reads the relevant messages from all job logs and creates a file named "UnknownSegmentData.work". Each line of this file has the following structure.
<job_number>,<date>,"<job_log_message>"
Example: 12636,25.09.19 09:07:38.184,"Skipping data 'somematchcode,somedata', no matching node found."
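Since the third field is quoted and may itself contain commas, a CSV parser handles such a line reliably. A minimal sketch, assuming the line format shown above (the function name is illustrative):

```python
import csv
import io

def parse_unknown_segment_line(line):
    """Split one UnknownSegmentData line into job number, timestamp and
    message. The quoted third field may contain commas, hence the CSV parser."""
    job, date, message = next(csv.reader(io.StringIO(line)))
    return int(job), date, message

# With the example line from above:
# parse_unknown_segment_line(
#     '12636,25.09.19 09:07:38.184,"Skipping data \'somematchcode,somedata\', '
#     'no matching node found."')
# -> (12636, '25.09.19 09:07:38.184',
#     "Skipping data 'somematchcode,somedata', no matching node found.")
```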
The Unknown Segment Log Listener internally keeps track of the age of the file. If it is older than 2 hours, it will be renamed to "UnknownSegmentData.log" the next time it is touched.
Note: If there already is a file "UnknownSegmentData.log" in the ./logs directory, the file "UnknownSegmentData.work" will not be renamed until the existing "UnknownSegmentData.log" file is fetched/removed.
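The rotation rule can be sketched as follows. This is an illustration of the described behaviour, not the listener's actual code; the function name and parameter are made up.

```python
import os
import time

def rotate_work_file(log_dir, max_age_seconds=2 * 3600):
    """Illustrative sketch of the rotation rule described above: rename the
    .work file to .log once it is older than 2 hours, but only if no
    previous .log file is still waiting to be fetched."""
    work = os.path.join(log_dir, "UnknownSegmentData.work")
    log = os.path.join(log_dir, "UnknownSegmentData.log")
    if not os.path.exists(work) or os.path.exists(log):
        return False  # nothing to rotate, or previous .log not yet fetched
    if time.time() - os.path.getmtime(work) < max_age_seconds:
        return False  # file is not old enough yet
    os.rename(work, log)
    return True
```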
You can then create another profile to process this log file (time-driven Input Agent of type "File", possibly with option "React to file events"). This way, you can either post-process the unknown data segments or simply send a notification email.
Document type
Please note that the Unknown Segment Log Listener does not work for document types "XML" and "JSON". The reason is that, for these formats, the data in the input file is looked up based on the source structure itself. Unknown elements are therefore never discovered, since (due to their unknown nature) they do not exist in the source structure.
For all other document types, however, it works. In these cases, the parser tries to place each data segment in the source structure. If a data segment cannot be placed, a log entry for this segment will be created (provided the corresponding log level option is set, see above).