Compare commits

...

18 Commits

Author SHA1 Message Date
Zane C. B-H ed247fdbbd add repo to Makefile.PL 2021-11-08 20:32:55 -06:00
Zane C. B-H eb0d2f5e28 fix srcx and hostx for syslog and bump for release 2021-11-08 20:23:42 -06:00
Zane C. B-H feb144fdfe ready to release 0.4.3 2021-11-04 04:01:24 -05:00
Zane C. B-H 48e360c825 add back in postfix geoip processing 2021-11-04 03:57:32 -05:00
Zane C. B-H 5c596e54cc remove the geoip mutate from fail2ban... don't need mapping for geopoint now 2021-11-04 03:44:27 -05:00
Zane C. B-H cb1810e24f learned about make dist 2021-10-28 06:32:44 -05:00
Zane C. B-H 8cf6c6ee63 ready to release 0.4.2 2021-10-21 23:11:20 -05:00
Zane C. B-H 57f34f8b96 apparently github does not display/handle symbolic links in a handy manner for showing going to a file in a different dir 2021-10-21 22:57:45 -05:00
Zane C. B-H c0e408121b symlink changes 2021-10-21 22:55:07 -05:00
Zane C. B-H 373535dcf2 add a readme for the logstash examples 2021-10-21 22:51:33 -05:00
Zane C. B-H b328e1891a update to the newest postfix stuff, aggregate default to off as that appears to be buggy, resulting in lots of lines being ignored 2021-10-21 22:30:03 -05:00
Zane C. B-H 9936df5321 add config examples for injesting 2021-10-21 22:08:34 -05:00
Zane C. B-H bc6e2b2594 update postfix to reflect how it looks upon install 2021-10-21 22:03:08 -05:00
Zane C. B-H ff581a589b remove a extra ` 2021-10-21 11:55:35 -05:00
Zane C. B-H 7664e5f352 note the expected results if Elasticsearch has recache/index/etc. 2021-10-21 09:11:10 -05:00
Zane C. B-H ad756a4fe8 add a short description of what the nagius style check does 2021-10-20 21:41:44 -05:00
Zane C. B-H 4521e88071 add a how to for nagius style checks 2021-10-20 21:40:05 -05:00
Zane C. B-H bb4e1beb66 make README.md largely lint happy and expand on configuring it 2021-10-20 20:33:11 -05:00
30 changed files with 390 additions and 196 deletions

46
Changes Normal file

@ -0,0 +1,46 @@
Revision history for Search-ESsearcher
0.4.4 2021-11-04/20:30
-Fix srcx and hostx for syslog.
0.4.3 2021-11-04/04:00
-Remove mutate from geoip on fail2ban.
This removes the need for mappings in Elasticsearch.
-Add back in GeoIP for Postfix.
0.4.2 2021-10-21/23:15
- Include logstash examples.
- Update Postfix logstash bits.
0.4.1 2019-12-08/04:05
- Remove accidentally included emacs save.
- Correct datestamp on previous change log entry.
0.4.0 2019-12-08/04:00
- Make host searching work better. Thanks, Kevin Greene.
- Add the aonHost.
0.3.1 2019-06-05/05:0
- Add missing options to postfix pod.
0.3.0 2019-06-05/01:30
- Add postfix support.
- Add repo info.
0.2.0 2019-06-03/04:30
- The bf2b template now properly processes --ip
- Add the httpAccess template.
- Add a missing flag to the help for bf2b.
- Added the option for pretty printing -S via -p
0.1.0 2019-06-02/09:00
- Add bf2b, beats fail2ban support.
- Actually set the output template now.
- name validation no longer chokes on numbers.
- Now prints the proper help info instead of the
one for the default, syslog.
0.0.0 2019-06-02/04:40
- Initial release.


@ -17,6 +17,10 @@ t/manifest.t
t/pod-coverage.t
t/pod.t
bin/essearcher
logstash/postfix/50-filter-postfix.conf
logstash/postfix/README.md
logstash/postfix/postfix.grok
logstash/patterns.d/postfix.grok
logstash/conf.d/50-filter-postfix.conf
logstash/conf.d/syslog.conf
logstash/conf.d/rsyslog.conf
logstash/conf.d/beats.conf
logstash/conf.d/51-filter-postfix-aggregate.conf.off
logstash/README.md

62
Makefile.PL Normal file

@ -0,0 +1,62 @@
use 5.006;
use strict;
use warnings;
use ExtUtils::MakeMaker;
my %WriteMakefileArgs = (
    NAME             => 'Search::ESsearcher',
    AUTHOR           => q{Zane C. Bowers-Hadley <vvelox@vvelox.net>},
    VERSION_FROM     => 'lib/Search/ESsearcher.pm',
    ABSTRACT_FROM    => 'lib/Search/ESsearcher.pm',
    LICENSE          => 'artistic_2',
    MIN_PERL_VERSION => '5.006',
    INST_SCRIPT      => 'bin',
    CONFIGURE_REQUIRES => {
        'ExtUtils::MakeMaker' => '0',
    },
    TEST_REQUIRES => {
        'Test::More' => '0',
    },
    PREREQ_PM => {
        'JSON'                   => '4.02',
        'Error::Helper'          => '1.0.0',
        'Search::Elasticsearch'  => '6.00',
        'Template'               => '2.29',
        'Template::Plugin::JSON' => '0.08',
        'Time::ParseDate'        => '2015.103',
        'Term::ANSIColor'        => '4.06',
        'Data::Dumper'           => '2.173',
    },
    dist  => { COMPRESS => 'gzip -9f', SUFFIX => 'gz', },
    clean => { FILES => 'Search-ESsearcher-*' },
    META_MERGE => {
        "meta-spec" => { version => 2 },
        resources   => {
            repository => {
                type => 'git',
                url  => 'git@github.com:VVelox/Search-ESsearcher.git',
                web  => 'https://github.com/VVelox/Search-ESsearcher',
            },
        },
    },
);

# Compatibility with old versions of ExtUtils::MakeMaker
unless ( eval { ExtUtils::MakeMaker->VERSION('6.64'); 1 } ) {
    my $test_requires = delete $WriteMakefileArgs{TEST_REQUIRES} || {};
    @{ $WriteMakefileArgs{PREREQ_PM} }{ keys %$test_requires } = values %$test_requires;
}

unless ( eval { ExtUtils::MakeMaker->VERSION('6.55_03'); 1 } ) {
    my $build_requires = delete $WriteMakefileArgs{BUILD_REQUIRES} || {};
    @{ $WriteMakefileArgs{PREREQ_PM} }{ keys %$build_requires } = values %$build_requires;
}

delete $WriteMakefileArgs{CONFIGURE_REQUIRES}
    unless eval { ExtUtils::MakeMaker->VERSION('6.52'); 1 };
delete $WriteMakefileArgs{MIN_PERL_VERSION}
    unless eval { ExtUtils::MakeMaker->VERSION('6.48'); 1 };
delete $WriteMakefileArgs{LICENSE}
    unless eval { ExtUtils::MakeMaker->VERSION('6.31'); 1 };

WriteMakefile(%WriteMakefileArgs);


@ -9,6 +9,10 @@ template.
Search::ESsearcher largely exists for the purpose of that script.
Example logstash configs to help get you started using this are
included under the directory logstash. Just set the host and port
variables as desired and you should be good to go.
INSTALLATION
To install this module, run the following commands:
@ -44,7 +48,7 @@ You can also look for information at:
LICENSE AND COPYRIGHT
This software is Copyright (c) 2019 by Zane C. Bowers-Hadley.
This software is Copyright (c) 2021 by Zane C. Bowers-Hadley.
This is free software, licensed under:

103
README.md

@ -2,42 +2,117 @@
![essearcher](essearcher.png)
It provides a dynamic system for searching logs stored in Elasticsearch. Currently it has out of the box support for the items below.
It provides a dynamic system for searching logs stored in
Elasticsearch. Currently it has out of the box support for the items below.
* [syslog](https://metacpan.org/pod/Search::ESsearcher::Templates::syslog)
* [postfix](https://metacpan.org/pod/Search::ESsearcher::Templates::postfix)
* [fail2ban via filebeat](https://metacpan.org/pod/Search::ESsearcher::Templates::bf2b)
* [HTTP access via filebeat](https://metacpan.org/pod/Search::ESsearcher::Templates::httpAccess)
# Configuring
If Elasticsearch is not running on the same machine, then you will
need to set up the elastic file. By default this is
~/.config/essearcher/elastic/default. If it is not configured, the default
is as below.
```
{ "nodes": [ "127.0.0.1:9200" ] }
```
So if you want to set it to use ES on the server foo.bar, it would be
as below.
```
{ "nodes": [ "foo.bar:9200" ] }
```
The elastic file is JSON that will be converted to a hash and passed to
[Search::Elasticsearch](https://metacpan.org/pod/Search::Elasticsearch)->new.
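To illustrate, here is a minimal sketch, assuming only the default path and
the default config above, of how such a file could be read, decoded into a
hash, and handed to Search::Elasticsearch->new. This is a hypothetical
example, not the actual code of this module.

```
#!/usr/bin/perl
# Hypothetical sketch: read the elastic file, decode the JSON into a hash,
# and pass it to Search::Elasticsearch->new. Not the module's actual code.
use strict;
use warnings;
use JSON;
use Search::Elasticsearch;

my $path = $ENV{'HOME'} . '/.config/essearcher/elastic/default';

# fall back to the documented default if the file does not exist
my $json = '{ "nodes": [ "127.0.0.1:9200" ] }';
if ( -f $path ) {
    open( my $fh, '<', $path ) or die 'unable to open ' . $path . ': ' . $!;
    local $/ = undef;
    $json = <$fh>;
    close($fh);
}

my $options = decode_json($json);
my $es      = Search::Elasticsearch->new( %{$options} );
```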
# As A Nagios Style Check
This requires three options: -n, -w, and -c.
```
-n <check>
-w <warn>
-c <critical>
Check is the equality to use when comparing the number of hits found
for the search.
gt >
gte >=
lt <
lte <=
Critical and warn are the thresholds to use.
```
So, for example, for httpAccess, if we want to alert on the number of times
robots.txt is requested, we would do it like below.
```
essearcher -m httpAccess --dgte -5m --req robots.txt -n gt -w 2 -c 5
```
This will search for requests containing 'robots.txt' within the last
5 minutes, warn if the number of hits is greater than 2, and go
critical if it is greater than 5.
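A nagios style check reports its result via its exit code (0 for OK, 1 for
warning, 2 for critical). Purely as an illustration of how the hit count and
-n, -w, and -c relate to those exit codes, here is a sketch with made up
values; it is not the actual code of essearcher.

```
#!/usr/bin/perl
# Illustrative sketch of the -n/-w/-c comparison; the values are made up.
use strict;
use warnings;

my $hits     = 3;       # number of hits the search returned
my $check    = 'gt';    # -n
my $warn     = 2;       # -w
my $critical = 5;       # -c

my %compare = (
    gt  => sub { $_[0] > $_[1] },
    gte => sub { $_[0] >= $_[1] },
    lt  => sub { $_[0] < $_[1] },
    lte => sub { $_[0] <= $_[1] },
);

if ( $compare{$check}->( $hits, $critical ) ) {
    print "CRITICAL: $hits hits\n";
    exit 2;
}
elsif ( $compare{$check}->( $hits, $warn ) ) {
    print "WARNING: $hits hits\n";
    exit 1;
}
print "OK: $hits hits\n";
exit 0;
```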
# Extending
It has 5 parts that are listed below.
* options : [Getopt::Long](https://perldoc.perl.org/Getopt/Long.html) options that are parsed after the initial basic options. These are stored and used with the search and output template.
* elastic : This is a JSON that contains the options that will be used to initialize [Search::Elasticsearch](https://metacpan.org/pod/Search::Elasticsearch).
* search : This is a [Template](https://metacpan.org/pod/Template) template that will be fed to [Search::Elasticsearch](https://metacpan.org/pod/Search::Elasticsearch)->search.
* output : This is a [Template](https://metacpan.org/pod/Template) template that will be used on each found item.
* options : [Getopt::Long](https://perldoc.perl.org/Getopt/Long.html)
options that are parsed after the initial basic options. These are
stored and used with the search and output template.
* elastic : This is a JSON that contains the options that will be used
to initialize [Search::Elasticsearch](https://metacpan.org/pod/Search::Elasticsearch).
* search : This is a [Template](https://metacpan.org/pod/Template)
template that will be fed to [Search::Elasticsearch](https://metacpan.org/pod/Search::Elasticsearch)->search.
* output : This is a [Template](https://metacpan.org/pod/Template)
template that will be used on each found item.
It will search for the specified part in the following order (a rough sketch of this lookup follows the list).
1. $ENV{'HOME'}.'/.config/essearcher/'.$part.'/'.$name
1. $base.'/etc/essearcher/'.help.'/'.$name
1. $base.'/etc/essearcher/'.$part.'/'.$name
1. Search::ESsearcher::Templates::$name->$part (except for elastic)
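Below is a rough sketch of that lookup. The values for $name, $part, and
$base are hypothetical, and the bundled template fallback is only noted in a
comment; this illustrates the order above rather than the module's actual code.

```
#!/usr/bin/perl
# Hypothetical sketch of the lookup order above; not the module's actual code.
use strict;
use warnings;

my $name = 'syslog';     # template name, e.g. from -m
my $part = 'output';     # options, elastic, search, or output
my $base = '/usr/local'; # install base

my @places = (
    $ENV{'HOME'} . '/.config/essearcher/' . $part . '/' . $name,
    $base . '/etc/essearcher/' . $part . '/' . $name,
);

# the first existing file wins
my ($found) = grep { -f $_ } @places;
if ($found) {
    print "using $found\n";
}
else {
    # otherwise fall back to Search::ESsearcher::Templates::$name->$part
    # (except for elastic)
    print "falling back to Search::ESsearcher::Templates::$name->$part\n";
}
```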
# INSTALLING
# FreeBSD
## FreeBSD
pkg install perl5 p5-JSON p5-Error-Helper p5-Template p5-Template-Plugin-JSON p5-Time-ParseDate p5-Term-ANSIColor p5-Data-Dumper
cpanm Search::ESsearcher
```
pkg install perl5 p5-JSON p5-Error-Helper p5-Template p5-Template-Plugin-JSON p5-Time-ParseDate p5-Term-ANSIColor p5-Data-Dumper
cpanm Search::ESsearcher
```
## Linux
### CentOS
yum install cpanm
cpanm Search::ESsearcher
```
yum install cpanm
cpanm Search::ESsearcher
```
### Debian
apt install perl perl-base perl-modules make cpanminus
cpanm Search::ESsearcher
```
apt install perl perl-base perl-modules make cpanminus
cpanm Search::ESsearcher
```
# Caveat
Please be aware that if a similar search has not been run for a while,
Elasticsearch will likely return a buggy/empty result that can't be
used. The usual return when this happens is empty JSON containing just
the key 'hits', which can be viewed via the switch -R. When this happens,
just wait a few minutes and try again, and Elasticsearch should have
reindexed/cached it.


@ -1,34 +0,0 @@
Revision history for Search-ESsearcher
0.4.1 2019-12-08/04:05
- Remove accidentally included emacs save.
- Correct datestamp on previous change log entry.
0.4.0 2019-12-08/04:00
- Make host searching work better. Thanks, Kevin Greene.
- Add the aonHost.
0.3.1 2019-06-05/05:0
- Add missing options to postfix pod.
0.3.0 2019-06-05/01:30
- Add postfix support.
- Add repo info.
0.2.0 2019-06-03/04:30
- The bf2b template now properly processes --ip
- Add the httpAccess template.
- Add a missing flag to the help for bf2b.
- Added the option for pretty printing -S via -p
0.1.0 2019-06-02/09:00
- Add bf2b, beats fail2ban support.
- Actually set the output template now.
- name validation no longer chokes on numbers.
- Now prints the proper help info instead of the
one for the default, syslog.
0.0.0 2019-06-02/04:40
- Initial release.


@ -1,52 +0,0 @@
use 5.006;
use strict;
use warnings;
use ExtUtils::MakeMaker;
my %WriteMakefileArgs = (
NAME => 'Search::ESsearcher',
AUTHOR => q{Zane C. Bowers-Hadley <vvelox@vvelox.net>},
VERSION_FROM => 'lib/Search/ESsearcher.pm',
ABSTRACT_FROM => 'lib/Search/ESsearcher.pm',
LICENSE => 'artistic_2',
MIN_PERL_VERSION => '5.006',
INST_SCRIPT => 'bin',
CONFIGURE_REQUIRES => {
'ExtUtils::MakeMaker' => '0',
},
TEST_REQUIRES => {
'Test::More' => '0',
},
PREREQ_PM => {
'JSON' => '4.02',
'Error::Helper' => '1.0.0',
'Search::Elasticsearch' => '6.00',
'Template' => '2.29',
'Template::Plugin::JSON' => '0.08',
'Time::ParseDate' => '2015.103',
'Term::ANSIColor' => '4.06',
'Data::Dumper' => '2.173',
},
dist => { COMPRESS => 'gzip -9f', SUFFIX => 'gz', },
clean => { FILES => 'Search-ESsearcher-*' },
);
# Compatibility with old versions of ExtUtils::MakeMaker
unless (eval { ExtUtils::MakeMaker->VERSION('6.64'); 1 }) {
my $test_requires = delete $WriteMakefileArgs{TEST_REQUIRES} || {};
@{$WriteMakefileArgs{PREREQ_PM}}{keys %$test_requires} = values %$test_requires;
}
unless (eval { ExtUtils::MakeMaker->VERSION('6.55_03'); 1 }) {
my $build_requires = delete $WriteMakefileArgs{BUILD_REQUIRES} || {};
@{$WriteMakefileArgs{PREREQ_PM}}{keys %$build_requires} = values %$build_requires;
}
delete $WriteMakefileArgs{CONFIGURE_REQUIRES}
unless eval { ExtUtils::MakeMaker->VERSION('6.52'); 1 };
delete $WriteMakefileArgs{MIN_PERL_VERSION}
unless eval { ExtUtils::MakeMaker->VERSION('6.48'); 1 };
delete $WriteMakefileArgs{LICENSE}
unless eval { ExtUtils::MakeMaker->VERSION('6.31'); 1 };
WriteMakefile(%WriteMakefileArgs);


@ -1,18 +0,0 @@
Makefile
Makefile.old
Build
Build.bat
META.*
MYMETA.*
.build/
_build/
cover_db/
blib/
inc/
.lwpcookies
.last_cover_stats
nytprof.out
pod2htm*.tmp
pm_to_blib
Search-ESsearcher-*
Search-ESsearcher-*.tar.gz


@ -1,47 +0,0 @@
Logstash grok patterns for postfix logging
==========================================
A set of grok patterns for parsing postfix logging using grok. Also included is a sample Logstash config file for applying the grok patterns as a filter.
Usage
-----
- Install logstash
- Add `50-filter-postfix.conf` to `/etc/logstash/conf.d`
- Add `postfix.grok` to `/etc/logstash/patterns.d`
- Restart logstash
The included Logstash config file requires two input fields to exist in input events:
- `program`: the name of the program that generated the log line, f.i. `postfix/smtpd` (named `tag` in syslog lingo)
- `message`: the log message payload without additional fields (program, pid, etc), f.i. `connect from 1234.static.ctinets.com[45.238.241.123]`
This event format is supported by the Logstash `syslog` input plugin out of the box, but several other plugins produce input that can be adapted fairly easy to produce these fields too. See [ALTERNATIVE INPUTS](ALTERNATIVE-INPUTS.md) for details.
Tests
-----
[![Build Status](https://travis-ci.org/whyscream/postfix-grok-patterns.svg?branch=master)](https://travis-ci.org/whyscream/postfix-grok-patterns)
In the `test/` directory, there is a test suite that tries to make sure that no previously supported log line will break because of changing common patterns and such. It also returns results a lot faster than doing `sudo service logstash restart` :-).
The test suite needs the patterns provided by Logstash, you can easily pull these from github by running `git submodule update --init`. To run the test suite, you also need `ruby 2.2` or higher, and the `jls-grok` gem. Then simply execute `ruby test/test.rb`.
Adding new test cases can easily be done by creating new yaml files in the test directory. Each file specifies a grok pattern to validate, a sample log line, and a list of expected results.
Also, the example Logstash config file adds some informative tags that aid in finding grok failures and unparsed lines. If you're not interested in those, you can remove all occurrences of `add_tag` and `tag_on_failure` from the config file.
Contributing
------------
I only have access to my own log samples, and my setup does not support or use every feature in postfix. If you miss anything, please open a pull request on github. If you're not very well versed in regular expressions, it's also fine to only submit sample unsupported log lines.
License
-------
Everything in this repository is available under the New (3-clause) BSD license.
Acknowledgement
---------------
I use postfix, logstash, elasticsearch and kibana in order to get everything working.
For writing the grok patterns I depend heavily on [grokdebug](https://grokdebug.herokuapp.com/), and I looked a lot at [antispin's useful logstash grok patterns](http://antisp.in/2014/04/useful-logstash-grok-patterns/).


@ -17,11 +17,11 @@ Search::ESsearcher - Provides a handy system for doing templated elasticsearch s
=head1 VERSION
Version 0.4.1
Version 0.4.4
=cut
our $VERSION = '0.4.1';
our $VERSION = '0.4.4';
=head1 SYNOPSIS


@ -10,11 +10,11 @@ Search::ESsearcher::Templates::sfail2ban - Provicdes support for fail2ban logs s
=head1 VERSION
Version 0.0.1
Version 0.0.2
=cut
our $VERSION = '0.0.1';
our $VERSION = '0.0.2';
=head1 LOGSTASH
@ -38,9 +38,6 @@ This uses a logstash configuration like below.
geoip {
source => "clientip"
}
mutate {
convert => [ "[geoip][coordinates]", "float" ]
}
}
}


@ -10,11 +10,11 @@ Search::ESsearcher::Templates::syslog - Provides postfix support for essearcher.
=head1 VERSION
Version 0.1.0
Version 0.1.1
=cut
our $VERSION = '0.1.0';
our $VERSION = '0.1.1';
=head1 LOGSTASH
@ -42,7 +42,8 @@ The important bit is "type" being set to "syslog". If that is not used,
use the command line options field and fieldv.
Install L<https://github.com/whyscream/postfix-grok-patterns> for pulling apart
the postfix messages. These files are included with this as well.
the postfix messages. These files are included with this as well. You will likely
not want to use 51-filter-postfix-aggregate.conf as that is a bit buggy.
=head1 Options


@ -10,11 +10,11 @@ Search::ESsearcher::Templates::syslog - Provides syslog support for essearcher.
=head1 VERSION
Version 1.1.0
Version 1.1.1
=cut
our $VERSION = '1.1.0';
our $VERSION = '1.1.1';
=head1 LOGSTASH
@ -189,14 +189,14 @@ return '
[% IF o.hostx %]
{"query_string": {
"default_field": "host.keyword",
"query": [% o.host.json %]
"query": [% o.hostx.json %]
}
},
[% END %]
[% IF o.srcx %]
{"query_string": {
"default_field": "logsource.keyword",
"query": [% o.src.json %]
"query": [% o.srcx.json %]
}
},
[% END %]

17
logstash/README.md Normal file

@ -0,0 +1,17 @@
# Installing
Just copy the files into your logstash directory, update the host setting
to the IP to listen on, and set the ports as desired.
# Notes
## Postfix
These come from
[whyscream/postfix-grok-patterns](https://github.com/whyscream/postfix-grok-patterns).
51-filter-postfix-aggregate.conf is set to off by default, as in
testing I found it to be buggy; it will often result in lines
being skipped.
This one does have GeoIP processing though.


@ -167,10 +167,6 @@ filter {
}
}
# process key-value data if it exists
if [postfix_keyvalue_data] {
kv {
@ -269,12 +265,9 @@ filter {
# add geoip for postfix
if [program] =~ /.*postfix.*/ {
geoip {
source => "postfix_client_ip"
}
mutate {
convert => [ "[geoip][coordinates]", "float" ]
}
geoip {
source => "postfix_client_ip"
}
}
}


@ -0,0 +1,38 @@
filter {
  if ![postfix_queueid] {
    drop {}
  } else if [program] == "postfix/qmgr" and [postfix_from] {
    aggregate {
      task_id => "%{postfix_queueid}"
      code => "
        map['postfix_from'] = event.get('postfix_from')
        map['postfix_size'] = event.get('postfix_size')
        map['postfix_nrcpt'] = event.get('postfix_nrcpt')
      "
    }
  } else if [program] == "postfix/smtpd" {
    aggregate {
      task_id => "%{postfix_queueid}"
      code => "
        map['postfix_client_hostname'] = event.get('postfix_client_hostname')
        map['postfix_client_ip'] = event.get('postfix_client_ip')
      "
    }
  } else if [program] == "postfix/cleanup" {
    aggregate {
      task_id => "%{postfix_queueid}"
      code => "
        map['postfix_message-id'] = event.get('postfix_message-id')
      "
    }
  } else if [program] == "postfix/smtp" {
    aggregate {
      task_id => "%{postfix_queueid}"
      code => "
        map.each do |key, value|
          event.set(key, value)
        end
      "
    }
  }
}


@ -0,0 +1,67 @@
input {
  beats {
    host => "192.168.14.3"
    port => 5044
    type => "beats"
  }
}

filter {
  if [type] == "beats" {
    mutate {
      remove_field => [ "[host]" ]
    }
    mutate {
      add_field => {
        "host" => "%{[beat][hostname]}"
      }
    }
  }

  if [fields][log] == "fail2ban" {
    grok {
      match => {
        "message" => "%{TIMESTAMP_ISO8601:timestamp} %{WORD:log_src}.%{WORD:src_action} *\[%{INT:fail2ban_digit}\]: %{LOGLEVEL:loglevel} *\[%{NOTSPACE:service}\] %{WORD:ban_status} %{IP:clientip}"
      }
    }
    geoip {
      source => "clientip"
    }
  }

  if [fields][log] == "apache-access" {
    grok {
      match => {
        "message" => "%{HTTPD_COMBINEDLOG}+%{GREEDYDATA:extra_fields}"
      }
      overwrite => [ "message" ]
    }
    mutate {
      convert => ["response", "integer"]
      convert => ["bytes", "integer"]
      convert => ["responsetime", "float"]
    }
    geoip {
      source => "clientip"
      target => "geoip"
      add_tag => [ "apache-geoip" ]
    }
    date {
      match => [ "timestamp" , "dd/MMM/YYYY:HH:mm:ss Z" ]
      remove_field => [ "timestamp" ]
    }
    useragent {
      source => "agent"
    }
  }
}

output {
  if [type] == "beats" {
    elasticsearch {
      hosts => [ "127.0.0.1:9200" ]
    }
  }
}


@ -0,0 +1,18 @@
input {
  udp {
    host => "192.168.14.3"
    port => 10514
    codec => "json"
    type => "rsyslog"
  }
}

filter { }

output {
  if [type] == "rsyslog" {
    elasticsearch {
      hosts => [ "127.0.0.1:9200" ]
    }
  }
}


@ -0,0 +1,17 @@
input {
  syslog {
    host => "192.168.14.3"
    port => 11514
    type => "syslog"
  }
}

filter { }

output {
  if [type] == "syslog" {
    elasticsearch {
      hosts => [ "127.0.0.1:9200" ]
    }
  }
}


@ -1,5 +1,7 @@
# Version: 1.0.0
# common postfix patterns
POSTFIX_QUEUEID ([0-9A-F]{6,}|[0-9a-zA-Z]{12,})
POSTFIX_QUEUEID ([0-9A-F]{6,}|[0-9a-zA-Z]{12,}|NOQUEUE)
POSTFIX_CLIENT_INFO %{HOSTNAME:postfix_client_hostname}?\[%{IP:postfix_client_ip}\](:%{INT:postfix_client_port})?
POSTFIX_RELAY_INFO %{HOSTNAME:postfix_relay_hostname}?\[(%{IP:postfix_relay_ip}|%{DATA:postfix_relay_service})\](:%{INT:postfix_relay_port})?|%{WORD:postfix_relay_service}
POSTFIX_SMTP_STAGE (CONNECT|HELO|EHLO|STARTTLS|AUTH|MAIL( FROM)?|RCPT( TO)?|(end of )?DATA|RSET|UNKNOWN|END-OF-MESSAGE|VRFY|\.)
@ -26,6 +28,7 @@ POSTFIX_COMMAND_COUNTER_DATA (helo=(%{INT:postfix_cmd_helo_accepted}/)?%{INT:pos
# helper patterns
GREEDYDATA_NO_COLON [^:]*
GREEDYDATA_NO_SEMICOLON [^;]*
GREEDYDATA_NO_BRACKET [^<>]*
STATUS_WORD [\w-]*
# warning patterns
@ -37,12 +40,15 @@ POSTFIX_WARNING %{POSTFIX_WARNING_WITH_KV}|%{POSTFIX_WARNING_WITHOUT_KV}
POSTFIX_SMTPD_CONNECT connect from %{POSTFIX_CLIENT_INFO}
POSTFIX_SMTPD_DISCONNECT disconnect from %{POSTFIX_CLIENT_INFO}( %{GREEDYDATA:postfix_command_counter_data})?
POSTFIX_SMTPD_LOSTCONN %{POSTFIX_LOSTCONN:postfix_smtpd_lostconn_data}( after %{POSTFIX_SMTP_STAGE:postfix_smtp_stage}( \(%{INT} bytes\))?)? from %{POSTFIX_CLIENT_INFO}(: %{GREEDYDATA:postfix_smtpd_lostconn_reason})?
POSTFIX_SMTPD_NOQUEUE NOQUEUE: %{POSTFIX_ACTION:postfix_action}: %{POSTFIX_SMTP_STAGE:postfix_smtp_stage} from %{POSTFIX_CLIENT_INFO}:( %{POSTFIX_STATUS_CODE:postfix_status_code} %{POSTFIX_STATUS_CODE_ENHANCED:postfix_status_code_enhanced})?( <%{DATA:postfix_status_data}>:)? (%{POSTFIX_DNSBL_MESSAGE}|%{GREEDYDATA:postfix_status_message};) %{POSTFIX_KEYVALUE_DATA:postfix_keyvalue_data}
POSTFIX_SMTPD_NOQUEUE %{POSTFIX_QUEUEID:postfix_queueid}: %{POSTFIX_ACTION:postfix_action}: %{POSTFIX_SMTP_STAGE:postfix_smtp_stage} from %{POSTFIX_CLIENT_INFO}:( %{POSTFIX_STATUS_CODE:postfix_status_code} %{POSTFIX_STATUS_CODE_ENHANCED:postfix_status_code_enhanced})?( <%{DATA:postfix_status_data}>:)? (%{POSTFIX_DNSBL_MESSAGE}|%{GREEDYDATA:postfix_status_message};) %{POSTFIX_KEYVALUE_DATA:postfix_keyvalue_data}
POSTFIX_SMTPD_PIPELINING improper command pipelining after %{POSTFIX_SMTP_STAGE:postfix_smtp_stage} from %{POSTFIX_CLIENT_INFO}: %{GREEDYDATA:postfix_improper_pipelining_data}
POSTFIX_SMTPD_PROXY proxy-%{POSTFIX_ACTION:postfix_proxy_result}: (%{POSTFIX_SMTP_STAGE:postfix_proxy_smtp_stage}): %{POSTFIX_PROXY_MESSAGE:postfix_proxy_message}; %{POSTFIX_KEYVALUE_DATA:postfix_keyvalue_data}
# cleanup patterns
POSTFIX_CLEANUP_MILTER %{POSTFIX_QUEUEID:postfix_queueid}: milter-%{POSTFIX_ACTION:postfix_milter_result}: %{GREEDYDATA:postfix_milter_message}; %{GREEDYDATA_NO_COLON:postfix_keyvalue_data}(: %{GREEDYDATA:postfix_milter_data})?
POSTFIX_CLEANUP_PREPEND_TYPE (header|body)
POSTFIX_CLEANUP_PREPEND %{POSTFIX_QUEUEID:postfix_queueid}: prepend: %{POSTFIX_CLEANUP_PREPEND_TYPE:postfix_prepend_type} %{GREEDYDATA:postfix_prepend_trigger} from %{POSTFIX_CLIENT_INFO}; %{GREEDYDATA_NO_COLON:postfix_keyvalue_data}: %{GREEDYDATA:postfix_prepend_value}
POSTFIX_CLEANUP_MESSAGEID %{POSTFIX_QUEUEID:postfix_queueid}: message-id=<?%{GREEDYDATA_NO_BRACKET:postfix_message-id}>?
# qmgr patterns
POSTFIX_QMGR_REMOVED %{POSTFIX_QUEUEID:postfix_queueid}: removed
@ -107,7 +113,7 @@ POSTFIX_SCACHE_TIMESTAMP statistics: start interval %{SYSLOGTIMESTAMP:postfix_sc
# aggregate all patterns
POSTFIX_SMTPD %{POSTFIX_SMTPD_CONNECT}|%{POSTFIX_SMTPD_DISCONNECT}|%{POSTFIX_SMTPD_LOSTCONN}|%{POSTFIX_SMTPD_NOQUEUE}|%{POSTFIX_SMTPD_PIPELINING}|%{POSTFIX_TLSCONN}|%{POSTFIX_WARNING}|%{POSTFIX_SMTPD_PROXY}|%{POSTFIX_KEYVALUE}
POSTFIX_CLEANUP %{POSTFIX_CLEANUP_MILTER}|%{POSTFIX_WARNING}|%{POSTFIX_KEYVALUE}
POSTFIX_CLEANUP %{POSTFIX_CLEANUP_MESSAGEID}|%{POSTFIX_CLEANUP_MILTER}|%{POSTFIX_CLEANUP_PREPEND}|%{POSTFIX_WARNING}|%{POSTFIX_KEYVALUE}
POSTFIX_QMGR %{POSTFIX_QMGR_REMOVED}|%{POSTFIX_QMGR_ACTIVE}|%{POSTFIX_QMGR_EXPIRED}|%{POSTFIX_WARNING}
POSTFIX_PIPE %{POSTFIX_PIPE_ANY}
POSTFIX_POSTSCREEN %{POSTFIX_PS_CONNECT}|%{POSTFIX_PS_ACCESS}|%{POSTFIX_PS_NOQUEUE}|%{POSTFIX_PS_TOOBUSY}|%{POSTFIX_PS_CACHE}|%{POSTFIX_PS_DNSBL}|%{POSTFIX_PS_VIOLATIONS}|%{POSTFIX_WARNING}