Compare commits


36 Commits

Author SHA1 Message Date
Zane C. B-H ed247fdbbd add repo to Makefile.PL 2021-11-08 20:32:55 -06:00
Zane C. B-H eb0d2f5e28 fix srcx and hostx for syslog and bump for release 2021-11-08 20:23:42 -06:00
Zane C. B-H feb144fdfe ready to release 0.4.3 2021-11-04 04:01:24 -05:00
Zane C. B-H 48e360c825 add back in postfix geoip processing 2021-11-04 03:57:32 -05:00
Zane C. B-H 5c596e54cc remove the geoip mutate from fail2ban... don't need mapping for geopoint now 2021-11-04 03:44:27 -05:00
Zane C. B-H cb1810e24f learned about make dist 2021-10-28 06:32:44 -05:00
Zane C. B-H 8cf6c6ee63 ready to release 0.4.2 2021-10-21 23:11:20 -05:00
Zane C. B-H 57f34f8b96 apparently github does not display/handle symbolic links in a handy manner for showing going to a file in a different dir 2021-10-21 22:57:45 -05:00
Zane C. B-H c0e408121b symlink changes 2021-10-21 22:55:07 -05:00
Zane C. B-H 373535dcf2 add a readme for the logstash examples 2021-10-21 22:51:33 -05:00
Zane C. B-H b328e1891a update to the newest postfix stuff, aggregate default to off as that appears to be buggy, resulting in lots of lines being ignored 2021-10-21 22:30:03 -05:00
Zane C. B-H 9936df5321 add config examples for injesting 2021-10-21 22:08:34 -05:00
Zane C. B-H bc6e2b2594 update postfix to reflect how it looks upon install 2021-10-21 22:03:08 -05:00
Zane C. B-H ff581a589b remove a extra ` 2021-10-21 11:55:35 -05:00
Zane C. B-H 7664e5f352 note the expected results if Elasticsearch has recache/index/etc. 2021-10-21 09:11:10 -05:00
Zane C. B-H ad756a4fe8 add a short description of what the nagius style check does 2021-10-20 21:41:44 -05:00
Zane C. B-H 4521e88071 add a how to for nagius style checks 2021-10-20 21:40:05 -05:00
Zane C. B-H bb4e1beb66 make README.md largely lint happy and expand on configuring it 2021-10-20 20:33:11 -05:00
Zane C. B-H 318c4ae3c1 note the changes 2019-12-08 04:11:03 -06:00
Zane C. B-H da2e4bd230 finish off the last bits 2019-12-08 03:32:02 -06:00
Zane C. B-H 756fa71841 meh 2019-12-06 12:15:13 -06:00
Zane C. B-H 267b38c41f add queue ID and a few other bits 2019-12-06 12:14:06 -06:00
Zane C. B-H 022632657f rework host and src and add aonHost 2019-11-13 17:30:54 -06:00
Zane C. B-H 875fc8478c template cleanup, add some highlighting, and bump version number... add build stuff to .gitignore 2019-11-12 04:40:39 -06:00
Zane C. B-H c3853f739c Update README.md 2019-08-05 23:39:43 -05:00
Zane C. B-H 9153e61b86 Merge branch 'master' of github.com:VVelox/Search-ESsearcher 2019-06-05 04:53:50 -05:00
Zane C. B-H 0974a661bc add missing options to postfix pod 2019-06-05 04:52:10 -05:00
VVelox d22658c383 begin updating the readme 2019-06-05 03:59:16 -05:00
VVelox 8a89781f08 Add files via upload 2019-06-05 01:51:36 -05:00
VVelox b9e267e55b Delete essearcher.png 2019-06-05 01:51:10 -05:00
VVelox 8ce08c1275 resize 2019-06-05 01:50:01 -05:00
VVelox 57283a66dd screen shot showing the httpAccess module 2019-06-05 01:43:56 -05:00
Zane C. B-H c59217039b ready for 0.3.0 release 2019-06-05 01:24:14 -05:00
Zane C. B-H 932fd35b5a add postfix support 2019-06-05 01:09:17 -05:00
Zane C. B-H f380df158c everything now done for 0.3.0 2019-06-03 04:31:36 -05:00
Zane C. B-H 63e8d3b3fe most of the work for 0.3.0 2019-06-03 03:34:21 -05:00
32 changed files with 2417 additions and 144 deletions

11
.gitignore vendored
View File

@ -7,6 +7,13 @@
*.o
*.pm.tdy
*.bs
Search-ESsearcher/MYMETA.json
Search-ESsearcher/MYMETA.yml
Search-ESsearcher/Makefile
Search-ESsearcher/bin/.exists
Search-ESsearcher/blib/
Search-ESsearcher/pm_to_blib
Search-ESsearcher/Makefile.old
# Devel::Cover
cover_db/
@ -35,3 +42,7 @@ inc/
/pm_to_blib
/*.zip
# emacs
*/\#*\#
Search-ESsearcher/bin/#essearcher#

46
Changes Normal file
View File

@ -0,0 +1,46 @@
Revision history for Search-ESsearcher
0.4.4 2021-11-04/20:30
-Fix srcx and hostx for syslog.
0.4.3 2021-11-04/04:00
-Remove mutate from geoip on fail2ban.
This removes the need for mappings in Elasticsearch.
-Add back in GeoIP for Postfix.
0.4.2 2021-10-21/23:15
- Include logstash examples.
- Update Postfix logstash bits.
0.4.1 2019-12-08/04:05
- Remove accidentally included emacs save.
- Correct datestamp on previous change log entry.
0.4.0 2019-12-08/04:00
- Make host searching work better. Thanks, Kevin Greene.
- Add the aonHost.
0.3.1 2019-06-05/05:00
- Add missing options to postfix pod.
0.3.0 2019-06-05/01:30
- Add postfix support.
- Add repo info.
0.2.0 2019-06-03/04:30
- The bf2b template now properly processes --ip
- Add the httpAccess template.
- Add a missing flag to the help for bf2b.
- Added the option for pretty printing -S via -p
0.1.0 2019-06-02/09:00
- Add bf2b, beats fail2ban support.
- Actually set the output template now.
- name validation no longer chokes on numbers.
- Now prints the proper help info instead of the
one for the default, syslog.
0.0.0 2019-06-02/04:40
- Initial release.

26
MANIFEST Normal file
View File

@ -0,0 +1,26 @@
Changes
lib/Search/ESsearcher.pm
lib/Search/ESsearcher/Templates/httpAccess.pm
lib/Search/ESsearcher/Templates/syslog.pm
lib/Search/ESsearcher/Templates/bf2b.pm
lib/Search/ESsearcher/Templates/postfix.pm
Makefile.PL
MANIFEST This list of files
README
bin/essearcher
t/00-load.t
t/01-load.t
t/02-load.t
t/03-load.t
t/04-load.t
t/manifest.t
t/pod-coverage.t
t/pod.t
bin/essearcher
logstash/patterns.d/postfix.grok
logstash/conf.d/50-filter-postfix.conf
logstash/conf.d/syslog.conf
logstash/conf.d/rsyslog.conf
logstash/conf.d/beats.conf
logstash/conf.d/51-filter-postfix-aggregate.conf.off
logstash/README.md

62
Makefile.PL Normal file
View File

@ -0,0 +1,62 @@
use 5.006;
use strict;
use warnings;
use ExtUtils::MakeMaker;
my %WriteMakefileArgs = (
NAME => 'Search::ESsearcher',
AUTHOR => q{Zane C. Bowers-Hadley <vvelox@vvelox.net>},
VERSION_FROM => 'lib/Search/ESsearcher.pm',
ABSTRACT_FROM => 'lib/Search/ESsearcher.pm',
LICENSE => 'artistic_2',
MIN_PERL_VERSION => '5.006',
INST_SCRIPT => 'bin',
CONFIGURE_REQUIRES => {
'ExtUtils::MakeMaker' => '0',
},
TEST_REQUIRES => {
'Test::More' => '0',
},
PREREQ_PM => {
'JSON' => '4.02',
'Error::Helper' => '1.0.0',
'Search::Elasticsearch' => '6.00',
'Template' => '2.29',
'Template::Plugin::JSON' => '0.08',
'Time::ParseDate' => '2015.103',
'Term::ANSIColor' => '4.06',
'Data::Dumper' => '2.173',
},
dist => { COMPRESS => 'gzip -9f', SUFFIX => 'gz', },
clean => { FILES => 'Search-ESsearcher-*' },
META_MERGE => {
"meta-spec" => { version => 2 },
resources => {
repository => {
type => 'git',
url => 'git@github.com:VVelox/Search-ESsearcher.git',
web => 'https://github.com/VVelox/Search-ESsearcher',
},
},
},
);
# Compatibility with old versions of ExtUtils::MakeMaker
unless ( eval { ExtUtils::MakeMaker->VERSION('6.64'); 1 } ) {
my $test_requires = delete $WriteMakefileArgs{TEST_REQUIRES} || {};
@{ $WriteMakefileArgs{PREREQ_PM} }{ keys %$test_requires } = values %$test_requires;
}
unless ( eval { ExtUtils::MakeMaker->VERSION('6.55_03'); 1 } ) {
my $build_requires = delete $WriteMakefileArgs{BUILD_REQUIRES} || {};
@{ $WriteMakefileArgs{PREREQ_PM} }{ keys %$build_requires } = values %$build_requires;
}
delete $WriteMakefileArgs{CONFIGURE_REQUIRES}
unless eval { ExtUtils::MakeMaker->VERSION('6.52'); 1 };
delete $WriteMakefileArgs{MIN_PERL_VERSION}
unless eval { ExtUtils::MakeMaker->VERSION('6.48'); 1 };
delete $WriteMakefileArgs{LICENSE}
unless eval { ExtUtils::MakeMaker->VERSION('6.31'); 1 };
WriteMakefile(%WriteMakefileArgs);

View File

@ -9,6 +9,10 @@ template.
Search::ESsearcher largely exists for the purpose of that script.
Example logstash configs to help get you started using this are
included under the directory logstash. Just set the host and port
variables as desired and you should be good to go.
INSTALLATION
To install this module, run the following commands:
@ -39,10 +43,12 @@ You can also look for information at:
Search CPAN
https://metacpan.org/release/Search-ESsearcher
Repository
https://github.com/VVelox/Search-ESsearcher
LICENSE AND COPYRIGHT
This software is Copyright (c) 2019 by Zane C. Bowers-Hadley.
This software is Copyright (c) 2021 by Zane C. Bowers-Hadley.
This is free software, licensed under:

114
README.md
View File

@ -1,16 +1,118 @@
# About
![essearcher](essearcher.png)
essearcher provides a dynamic system for searching logs stored in
Elasticsearch. Currently it has out-of-the-box support for the items below.
* [syslog](https://metacpan.org/pod/Search::ESsearcher::Templates::syslog)
* [postfix](https://metacpan.org/pod/Search::ESsearcher::Templates::postfix)
* [fail2ban via filebeat](https://metacpan.org/pod/Search::ESsearcher::Templates::bf2b)
* [HTTP access via filebeat](https://metacpan.org/pod/Search::ESsearcher::Templates::httpAccess)
# Configuring
If Elasticsearch is not running on the same machine, then you will
need to set up the elastic file. By default this is
~/.config/essearcher/elastic/default . If not configured, the default
is as below.
```
{ "nodes": [ "127.0.0.1:9200" ] }
```
So if you want to set it to use ES on the server foo.bar, it would be
as below.
```
{ "nodes": [ "foo.bar:9200" ] }
```
The elastic file is JSON that will be converted to a hash and passed to
[Search::Elasticsearch](https://metacpan.org/pod/Search::Elasticsearch)->new.
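Under the hood this amounts to decoding that file and handing the resulting
hash to the constructor. A minimal sketch of the idea (the file handling and
variable names here are illustrative, not the script's actual internals):
```
use JSON;
use Search::Elasticsearch;

# slurp and decode the elastic config file described above
my $path = "$ENV{HOME}/.config/essearcher/elastic/default";
my $raw  = do { local $/; open( my $fh, '<', $path ) or die $!; <$fh> };
my %conf = %{ decode_json($raw) };

# the decoded hash is what gets handed to Search::Elasticsearch->new
my $es = Search::Elasticsearch->new(%conf);
```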
# As A Nagios Style Check
This requires three options: -n, -w, and -c.
```
-n <check>
-w <warn>
-c <critical>
Check is the comparison operator to use when comparing the number of hits found
for the search.
gt >
gte >=
lt <
lte <=
Critical and warn are the thresholds to use.
```
So, for example, with httpAccess, if we want to alert on the number of times
robots.txt is requested, we would do it like below.
```
essearcher -m httpAccess --dgte -5m --req robots.txt -n gt -w 2 -c 5
```
This will search for requests containing 'robots.txt' within the last
5 minutes, warn if the number of hits is greater than 2, and go
critical if it is greater than 5.
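Conceptually the check just compares the hit count against the two thresholds
with the chosen operator and exits with the usual Nagios codes. A hedged
sketch of that logic (the variable names and sample values are illustrative,
not the script's literal code):
```
# sample values matching the example above: -n gt -w 2 -c 5
my ( $check, $warn, $critical ) = ( 'gt', 2, 5 );
my $hits = 7;    # number of hits the search returned

my %cmp = (
    gt  => sub { $_[0] >  $_[1] },
    gte => sub { $_[0] >= $_[1] },
    lt  => sub { $_[0] <  $_[1] },
    lte => sub { $_[0] <= $_[1] },
);
my $op = $cmp{$check};

if    ( $op->( $hits, $critical ) ) { print "CRITICAL: $hits hits\n"; exit 2; }
elsif ( $op->( $hits, $warn ) )     { print "WARNING: $hits hits\n";  exit 1; }
else                                { print "OK: $hits hits\n";       exit 0; }
```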
# Extending
It has 5 parts that are listed below.
* options : [Getopt::Long](https://perldoc.perl.org/Getopt/Long.html)
options that are parsed after the initial basic options. These are
stored and used with the search and output template.
* elastic : This is a JSON that contains the options that will be used
to initialize [Search::Elasticsearch](https://metacpan.org/pod/Search::Elasticsearch).
* search : This is a [Template](https://metacpan.org/pod/Template)
template that will be fed to [Search::Elasticsearch](https://metacpan.org/pod/Search::Elasticsearch)->search.
* output : This is a [Template](https://metacpan.org/pod/Template)
template that will be used on each found item.
It will search for those specified in the following order.
1. $ENV{'HOME'}.'/.config/essearcher/'.$part.'/'.$name
1. $base.'/etc/essearcher/'.$part.'/'.$name
1. Search::ESsearcher::Templates::$name->$part (except for elastic)
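A rough sketch of that lookup order (hedged; this is not the module's actual
code, and $base simply stands for wherever essearcher is installed):
```
# illustrative sketch of resolving a part (options, elastic, search, output)
sub find_part {
    my ( $base, $part, $name ) = @_;

    # user config first, then the system etc dir
    for my $file (
        "$ENV{HOME}/.config/essearcher/$part/$name",
        "$base/etc/essearcher/$part/$name",
    ) {
        if ( -f $file ) {
            open( my $fh, '<', $file ) or die $!;
            local $/;
            return <$fh>;
        }
    }

    # finally fall back to the bundled template module (not used for elastic)
    my $module = "Search::ESsearcher::Templates::$name";
    eval "require $module" or die $@;
    return $module->$part;
}
```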
# INSTALLING
# FreeBSD
## FreeBSD
```
pkg install perl5 p5-JSON p5-Error-Helper p5-Template p5-Template-Plugin-JSON p5-Time-ParseDate p5-Term-ANSIColor p5-Data-Dumper
cpanm Search::ESsearcher
```
## Linux
### CentOS
yum install cpan
cpan Search::ESsearcher
```
yum install cpanm
cpanm Search::ESsearcher
```
### Debian
apt install perl perl-base perl-modules make
cpan Search::ESsearcher
```
apt install perl perl-base perl-modules make cpanminus
cpanm Search::ESsearcher
```
# Caveat
Please be aware that if a similar search has not been ran for awhile,
Elasticsearch will likely return a buggy/empty result that can't be
used. The usual return when this happens is empty JSON just containing
the key 'hits', which can be viewed via switch -R. When this happens,
just wait a few minutes or so to try again and Elasticsearch should
have reindex/cached/etc.
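If you are scripting around Search::ESsearcher, the condition is easy to spot;
a hedged sketch (the sample response is only an illustration of the shape
described above):
```
# e.g. what a stale/empty response can look like: just a 'hits' key
my $results = { hits => {} };

my $found = $results->{hits}{hits};
if ( !defined($found) || !@{$found} ) {
    warn "Elasticsearch returned no usable hits yet; wait a bit and retry\n";
}
```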

View File

@ -1,12 +0,0 @@
Revision history for Search-ESsearcher
0.1.0 2019-06-02/09:00
- Add bf2b, beats fail2ban support.
- Actually set the output template now.
- name validation no longer chokes on numbers.
- Now prints the proper help info instead of the
one for the default, syslog.
0.0.0 2019-06-02/04:40
- Initial release.

View File

@ -1,15 +0,0 @@
Changes
lib/Search/ESsearcher.pm
lib/Search/ESsearcher/Templates/syslog.pm
lib/Search/ESsearcher/Templates/bf2b.pm
Makefile.PL
MANIFEST This list of files
README
bin/essearcher
t/00-load.t
t/01-load.t
t/02-load.t
t/manifest.t
t/pod-coverage.t
t/pod.t
bin/essearcher

View File

@ -1,52 +0,0 @@
use 5.006;
use strict;
use warnings;
use ExtUtils::MakeMaker;
my %WriteMakefileArgs = (
NAME => 'Search::ESsearcher',
AUTHOR => q{Zane C. Bowers-Hadley <vvelox@vvelox.net>},
VERSION_FROM => 'lib/Search/ESsearcher.pm',
ABSTRACT_FROM => 'lib/Search/ESsearcher.pm',
LICENSE => 'artistic_2',
MIN_PERL_VERSION => '5.006',
INST_SCRIPT => 'bin',
CONFIGURE_REQUIRES => {
'ExtUtils::MakeMaker' => '0',
},
TEST_REQUIRES => {
'Test::More' => '0',
},
PREREQ_PM => {
'JSON' => '4.02',
'Error::Helper' => '1.0.0',
'Search::Elasticsearch' => '6.00',
'Template' => '2.29',
'Template::Plugin::JSON' => '0.08',
'Time::ParseDate' => '2015.103',
'Term::ANSIColor' => '4.06',
'Data::Dumper' => '2.173',
},
dist => { COMPRESS => 'gzip -9f', SUFFIX => 'gz', },
clean => { FILES => 'Search-ESsearcher-*' },
);
# Compatibility with old versions of ExtUtils::MakeMaker
unless (eval { ExtUtils::MakeMaker->VERSION('6.64'); 1 }) {
my $test_requires = delete $WriteMakefileArgs{TEST_REQUIRES} || {};
@{$WriteMakefileArgs{PREREQ_PM}}{keys %$test_requires} = values %$test_requires;
}
unless (eval { ExtUtils::MakeMaker->VERSION('6.55_03'); 1 }) {
my $build_requires = delete $WriteMakefileArgs{BUILD_REQUIRES} || {};
@{$WriteMakefileArgs{PREREQ_PM}}{keys %$build_requires} = values %$build_requires;
}
delete $WriteMakefileArgs{CONFIGURE_REQUIRES}
unless eval { ExtUtils::MakeMaker->VERSION('6.52'); 1 };
delete $WriteMakefileArgs{MIN_PERL_VERSION}
unless eval { ExtUtils::MakeMaker->VERSION('6.48'); 1 };
delete $WriteMakefileArgs{LICENSE}
unless eval { ExtUtils::MakeMaker->VERSION('6.31'); 1 };
WriteMakefile(%WriteMakefileArgs);

View File

@ -1,18 +0,0 @@
Makefile
Makefile.old
Build
Build.bat
META.*
MYMETA.*
.build/
_build/
cover_db/
blib/
inc/
.lwpcookies
.last_cover_stats
nytprof.out
pod2htm*.tmp
pm_to_blib
Search-ESsearcher-*
Search-ESsearcher-*.tar.gz

View File

@ -12,9 +12,10 @@ use warnings;
use Search::ESsearcher;
use Getopt::Long qw(:config pass_through);
use Data::Dumper;
use JSON;
sub version{
print "essearch: 0.0.1\n";
print "essearch: 0.1.0\n";
};
# disable color if asked
@ -22,6 +23,8 @@ if ( defined( $ENV{NO_COLOR} ) ){
$ENV{ANSI_COLORS_DISABLED}=1;
}
# set all the templates the servers use to default
my $search;
my $options;
@ -35,6 +38,7 @@ my $help;
my $warn;
my $critical;
my $check;
my $pretty;
GetOptions(
's=s' => \$search,
'g=s' => \$options,
@ -49,6 +53,7 @@ GetOptions(
'n=s' => \$check,
'w=s' => \$warn,
'c=s' => \$critical,
'p' => \$pretty,
);
# if -n is set, make sure we have -w and -c
@ -68,7 +73,7 @@ if ( defined( $check ) &&
( $check ne 'gte' ) &&
( $check ne 'lt' ) &&
( $check ne 'lte' )
){
) {
warn('-n is set, but is not gt, gte, lt, or lte');
exit 255;
}
@ -100,6 +105,7 @@ if ( $help ){
Any of the above being set will override this.
-e <elastic> The elasticsearch config to use.
-S Print the search out after filling it in and exit.
-p Print the search JSON prettily.
-R Run the search and print it via Data::Dumper.
-i Invert the results.
-n <check> Operate as a nagios style check.
@ -138,6 +144,17 @@ $ess->search_set( $search );
$ess->load_search;
my $filled_in=$ess->search_fill_in;
if ( $print_search ){
# pretty print the filled-in search JSON if requested
if ( $pretty ){
# eval{
my $json=JSON->new;
$json->pretty(1);
$json->relaxed(1);
$json->canonical(1);
my $decoded=$json->decode( $filled_in );
$filled_in=$json->encode($decoded);
# }
}
print $filled_in;
exit 255;
}
@ -256,6 +273,14 @@ Any of the above being set will override this.
The elasticsearch config to use.
=head2 -p
If -S is given, then it will attempt to pretty print the JSON.
This will involve parsing it and then turning it back into JSON.
So it will spit out ugly JSON if malformed JSON is fed into it.
=head2 -S
Print the search out after filling it in and exit.
@ -307,4 +332,10 @@ Print the help.
If this is set, it disables color output.
=head1 NOTE
This script has dynamic command line options. The ones shown above are just the defaults.
For others see the help documentation via --help or check the pod of the module you are using.
=cut

essearcher.png Normal file (binary image, 449 KiB, not shown)

View File

@ -17,16 +17,15 @@ Search::ESsearcher - Provides a handy system for doing templated elasticsearch s
=head1 VERSION
Version 0.1.0
Version 0.4.4
=cut
our $VERSION = '0.1.0';
our $VERSION = '0.4.4';
=head1 SYNOPSIS
use Search::ESsearcher;
my $ess = Search::ESsearcher->new();
@ -830,6 +829,13 @@ sub search_fill_in{
$_[0]=~s/\!/\ NOT\ /;
return $_[0];
},
aonHost=>sub{
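# turn a host list such as "foo.,mail.bar." into query_string terms: wrap each
# name in /.../ with a trailing * and expand the "," "+" and "!" separators
# into OR, AND, and NOT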
$_[0]=~s/^([A-Za-z0-9\.]+)/\/$1*\//;
$_[0]=~s/\+([A-Za-z0-9\.]+)/\ AND\ \/$1*\//;
$_[0]=~s/\,([A-Za-z0-9\.]+)/\ OR\ \/$1*\//;
$_[0]=~s/\!([A-Za-z0-9\.]+)/\ NOT\ \/$1*\//;
return $_[0];
},
pd=>sub{
if( $_[0] =~ /^u\:/ ){
$_[0] =~ s/^u\://;
@ -897,6 +903,19 @@ sub search_run{
$results=$self->{es}->search( $self->{search_hash} );
};
# @timestamp can't be handled via the template, so copy it to timestamp for each hit
if (
( ref( $results ) eq 'HASH' ) &&
( defined( $results->{hits} ) ) &&
( defined( $results->{hits}{hits} ) )
){
foreach my $item ( @{ $results->{hits}{hits} } ){
if (!defined( $item->{'_source'}{'timestamp'}) ) {
$item->{'_source'}{'timestamp'}=$item->{'_source'}{'@timestamp'}
}
}
}
return $results;
}
@ -1045,13 +1064,43 @@ So the string "postfix,spamd" would become
Can be used like below.
[% USE JSON ( pretty => 1 ) %]
[% DEFAULT o.program = "*" %]
[% IF o.program %]
{"query_string": {
"default_field": "program",
"query": [% aon( o.program ).json %]
}
},
[% END %]
This function is only available for the search template.
=head2 aonHost
This is an AND, OR, or NOT sub that handles
the following in a string, transforming them
from the punctuation to the logic.
, OR
+ AND
! NOT
So the string "foo.,mail.bar." would become
"/foo./ OR /mail.bar./".
This is best used with $field.keyword.
Can be used like below.
[% USE JSON ( pretty => 1 ) %]
[% IF o.host %]
{"query_string": {
"default_field": "host.keyword",
"query": [% aonHost( o.host ).json %]
}
},
[% END %]
This function is only available for the search template.
@ -1174,6 +1223,10 @@ L<https://cpanratings.perl.org/d/Search-ESsearcher>
L<https://metacpan.org/release/Search-ESsearcher>
=item * Repository
L<https://github.com/VVelox/Search-ESsearcher>
=back

View File

@ -6,15 +6,15 @@ use warnings;
=head1 NAME
Search::ESsearcher::Templates::syslog - Provides support for fail2ban logs sucked down via beats.
Search::ESsearcher::Templates::sfail2ban - Provides support for fail2ban logs sucked down via beats.
=head1 VERSION
Version 0.0.0
Version 0.0.2
=cut
our $VERSION = '0.0.0';
our $VERSION = '0.0.2';
=head1 LOGSTASH
@ -38,9 +38,6 @@ This uses a logstash configuration like below.
geoip {
source => "clientip"
}
mutate {
convert => [ "[geoip][coordinates]", "float" ]
}
}
}
@ -225,7 +222,7 @@ return '
}
},
[% END %]
[% IF o.clientip %]
[% IF o.ip %]
{"query_string": {
"default_field": "clientip",
"query": [% aon( o.ip ).json %]
@ -323,22 +320,27 @@ sub help{
--status <status> The status value of the message.
--host <log host> The system beats in question is running on.
--country <country> The 2 letter country code.
--jail <jail> The fail2ban jail in question.
--ip <ip> The IP to search for.
--country <country> The 2 letter country code.
--region <state> The state/province/etc to search for.
--postal <zipcode> The postal code to search for.
--city <city> The city to search for.
--ip <ip> The IP to search for.
--dgt <date> Date greater than.
--dgte <date> Date greater than or equal to.
--dlt <date> Date less than.
--dlte <date> Date less than or equal to.
--msg <message> Messages to match.
--field <field> The term field to use for matching them all.
--field2 <field2> The term field to use for what beats is setting.
--fieldv <fieldv> The value of the term field to match them all.
--field2v <field2v> The value to look for in the field beats is setting.
--dgt <date> Date greater than.
--dgte <date> Date greater than or equal to.
--dlt <date> Date less than.
--dlte <date> Date less than or equal to.
--msg <message> Messages to match.
--field <field> The term field to use for matching them all.
--field2 <field2> The term field to use for what beats is setting.
--fieldv <fieldv> The value of the term field to match them all.
--field2v <field2v> The value to look for in the field beats is setting.
AND, OR, or NOT shortcut

View File

@ -0,0 +1,650 @@
package Search::ESsearcher::Templates::httpAccess;
use 5.006;
use strict;
use warnings;
=head1 NAME
Search::ESsearcher::Templates::httpAccess - Provides support for HTTP access logs sucked down via beats.
=head1 VERSION
Version 0.0.0
=cut
our $VERSION = '0.0.0';
=head1 LOGSTASH / FILEBEAT
This uses a logstash beats input akin to the one below.
The important bit below is setting the "type" to "beats" and "fields.log" to "apache-access".
If you are using something different than "type" and "beats" you can specify that via "--field" and
"--fieldv" respectively.
If you are using something different than "fields.log" and "apache-access" you can specify that via "--field2" and
"--field2v" respectively.
input {
beats {
host => "192.168.14.3"
port => 5044
type => "beats"
}
}
filter {
if [fields][log] == "apache-access" {
grok {
match => {
"message" => "%{HTTPD_COMBINEDLOG}+%{GREEDYDATA:extra_fields}"
}
overwrite => [ "message" ]
}
mutate {
convert => ["response", "integer"]
convert => ["bytes", "integer"]
convert => ["responsetime", "float"]
}
geoip {
source => "clientip"
target => "geoip"
add_tag => [ "apache-geoip" ]
}
date {
match => [ "timestamp" , "dd/MMM/YYYY:HH:mm:ss Z" ]
remove_field => [ "timestamp" ]
}
useragent {
source => "agent"
}
}
}
output {
if [type] == "beats" {
elasticsearch {
hosts => [ "127.0.0.1:9200" ]
}
}
}
Then for filebeat, something akin to below. The really important bits here are the various
values for "fields".
For "fields.vhost" and "fields.vhost_port", if you are using something different, you can
specify that via "--field3" and "--field4" respectively.
- type: log
enabled: true
paths:
- /var/log/apache/foo.bar:80-access.log
fields:
log: apache-access
vhost: foo.bar
vhost_port: 80
=head1 Options
=head2 --host <host>
The machine beats is running on, feeding info to logstash/ES.
=head2 --response <code>
The response code from the HTTP server.
=head2 --verb <verb>
The verb used with the request.
=head2 --vhost <vhost>
The domain served up.
=head2 --port <port>
The port for the vhost.
=head2 --ip <ip>
The client IP that made the request.
=head2 --os <os>
The supplied OS value that made the request.
=head2 --showos
Shows the OS value.
=head2 --req <req>
The HTTP request.
=head2 --ref <ref>
The supplied referrer for the request.
=head2 --agent <agent>
The supplied agent value that made the request.
=head2 --noagent
Do not show the agent field.
=head2 --auth <auth>
The authed user for the request.
=head2 --bgt <bytes>
Response bytes greater than.
=head2 --bgte <bytes>
Response bytes greater than or equal to.
=head2 --blt <bytes>
Response bytes less than.
=head2 --blte <bytes>
Response bytes less than or equal to.
=head2 --geoip
Require GEO IP to have worked.
=head2 --country <country>
The 2 letter country code.
=head2 --showcountry
Show country code.
=head2 --region <state>
The state/province/etc to search for.
=head2 --showregion
Show region code.
=head2 --postal <zipcode>
The postal code to search for.
=head2 --showpostal
Show postal code.
=head2 --city <city>
The city to search for.
=head2 --showcity
Show city name.
=head2 --size <count>
The number of items to return.
=head2 --dgt <date>
Date greater than.
=head2 --dgte <date>
Date greater than or equal to.
=head2 --dlt <date>
Date less than.
=head2 --dlte <date>
Date less than or equal to.
=head2 --msg <message>
Messages to match.
=head1 AND, OR, or NOT shortcut
, OR
+ AND
! NOT
A list separated by any of those will be transformed.
These may be used with program, facility, pid, or host.
example: --program postfix,spamd
results: postfix OR spamd
=head1 date
date
/^-/ appends "now" to it. So "-5m" becomes "now-5m".
/^u\:/ takes what is after ":" and uses Time::ParseDate to convert it to a
unix time value.
Anything not matching any of the above will just be passed on.
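As a rough illustration of how those date values get rewritten before being
dropped into the range queries (a sketch based on the description above, with
an illustrative variable, not the module's literal code):

    use Time::ParseDate;

    my $date = '-5m';    # e.g. the value handed to --dgte

    # values starting with "-" get "now" prepended, giving "now-5m"
    $date = 'now' . $date if $date =~ /^-/;

    # "u:<time>" values are parsed into a unix timestamp via Time::ParseDate
    if ( $date =~ /^u\:/ ) {
        $date =~ s/^u\://;
        $date = Time::ParseDate::parsedate($date);
    }

    # anything else is passed through untouched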
=cut
sub search{
return '
[% USE JSON ( pretty => 1 ) %]
[% DEFAULT o.size = "50" %]
[% DEFAULT o.field = "type" %]
[% DEFAULT o.fieldv = "beats" %]
[% DEFAULT o.field2 = "fields.log" %]
[% DEFAULT o.field2v = "apache-access" %]
[% DEFAULT o.field3 = "fields.vhost" %]
[% DEFAULT o.field4 = "fields.vhost_port" %]
{
"index": "logstash-*",
"body": {
"size": [% o.size.json %],
"query": {
"bool": {
"must": [
{
"term": { [% o.field.json %]: [% o.fieldv.json %] }
},
{"query_string": {
"default_field": [% o.field2.json %],
"query": [% o.field2v.json %]
}
},
[% IF o.country %]
{"query_string": {
"default_field": "geoip.country_code2",
"query": [% aon( o.country ).json %]
}
},
[% END %]
[% IF o.region %]
{"query_string": {
"default_field": "geoip.region_code",
"query": [% aon( o.region ).json %]
}
},
[% END %]
[% IF o.city %]
{"query_string": {
"default_field": "geoip.city_name",
"query": [% aon( o.city ).json %]
}
},
[% END %]
[% IF o.postal %]
{"query_string": {
"default_field": "geoip.postal_code",
"query": [% aon( o.postal ).json %]
}
},
[% END %]
[% IF o.host %]
{"query_string": {
"default_field": "host",
"query": [% aon( o.host ).json %]
}
},
[% END %]
[% IF o.msg %]
{"query_string": {
"default_field": "message",
"query": [% o.msg.json %]
}
},
[% END %]
[% IF o.response %]
{"query_string": {
"default_field": "response",
"query": [% aon( o.response ).json %]
}
},
[% END %]
[% IF o.geoip %]
{"query_string": {
"default_field": "geoip.country_code2",
"query": "*"
}
},
[% END %]
[% IF o.verb %]
{"query_string": {
"default_field": "verb",
"query": [% aon( o.verb ).json %]
}
},
[% END %]
[% IF o.vhost %]
{"query_string": {
"default_field": "fields.vhost",
"query": [% aon( o.vhost ).json %]
}
},
[% END %]
[% IF o.port %]
{"query_string": {
"default_field": "fields.vhost_port",
"query": [% aon( o.port ).json %]
}
},
[% END %]
[% IF o.os %]
{"query_string": {
"default_field": "os",
"query": [% aon( o.os ).json %]
}
},
[% END %]
[% IF o.agent %]
{"query_string": {
"default_field": "agent",
"query": [% aon( o.agent ).json %]
}
},
[% END %]
[% IF o.ip %]
{"query_string": {
"default_field": "clientip",
"query": [% aon( o.ip ).json %]
}
},
[% END %]
[% IF o.auth %]
{"query_string": {
"default_field": "auth",
"query": [% aon( o.auth ).json %]
}
},
[% END %]
[% IF o.req %]
{"query_string": {
"default_field": "request",
"query": [% aon( o.req ).json %]
}
},
[% END %]
[% IF o.ref %]
{"query_string": {
"default_field": "referrer",
"query": [% aon( o.ref ).json %]
}
},
[% END %]
[% IF o.bgt %]
{"range": {
"bytes": {
"gt": [% pd( o.bgt ).json %]
}
}
},
[% END %]
[% IF o.bgte %]
{"range": {
"bytes": {
"gte": [% pd( o.bgte ).json %]
}
}
},
[% END %]
[% IF o.blt %]
{"range": {
"bytes": {
"lt": [% pd( o.blt ).json %]
}
}
},
[% END %]
[% IF o.blte %]
{"range": {
"bytes": {
"lte": [% pd( o.blte ).json %]
}
}
},
[% END %]
[% IF o.dgt %]
{"range": {
"@timestamp": {
"gt": [% pd( o.dgt ).json %]
}
}
},
[% END %]
[% IF o.dgte %]
{"range": {
"@timestamp": {
"gte": [% pd( o.dgte ).json %]
}
}
},
[% END %]
[% IF o.dlt %]
{"range": {
"@timestamp": {
"lt": [% pd( o.dlt ).json %]
}
}
},
[% END %]
[% IF o.dlte %]
{"range": {
"@timestamp": {
"lte": [% pd( o.dlte ).json %]
}
}
},
[% END %]
]
}
},
"sort": [
{
"@timestamp": {"order" : "desc"}}
]
}
}
';
}
sub options{
return '
host=s
response=s
verb=s
vhost=s
port=s
ip=s
os=s
agent=s
auth=s
req=s
showos
geoip
country=s
ref=s
bgt=s
bgte=s
blt=s
blte=s
showcountry
showregion
showpostal
showcity
region=s
postal=s
city=s
msg=s
size=s
field=s
fieldv=s
field2=s
field2v=s
field3=s
field4=s
noagent
dgt=s
dgte=s
dlt=s
dlte=s
';
}
sub output{
return '[% c("cyan") %][% f.timestamp %] '.
'[% c("bright_blue") %][% f.fields.vhost %][% c("bright_yellow") %]:[% c("bright_magenta") %][% f.fields.vhost_port %] '.
'[% c("bright_cyan") %][% f.clientip %]'.
'[% IF o.showcountry %]'.
'[% IF f.geoip.country_code2 %]'.
'[% c("yellow") %]('.
'[% c("bright_green") %][% f.geoip.country_code2 %]'.
'[% c("yellow") %])'.
'[% END %]'.
'[% END %]'.
'[% IF o.showregion %]'.
'[% IF f.geoip.region_code %]'.
'[% c("yellow") %]('.
'[% c("bright_green") %][% f.geoip.region_code %]'.
'[% c("yellow") %])'.
'[% END %]'.
'[% END %]'.
'[% IF o.showcity %]'.
'[% IF f.geoip.city_name %]'.
'[% c("yellow") %]('.
'[% c("bright_green") %][% f.geoip.city_name %]'.
'[% c("yellow") %])'.
'[% END %]'.
'[% END %]'.
'[% IF o.showpostal %]'.
'[% IF f.geoip.postal_code %]'.
'[% c("yellow") %]('.
'[% c("bright_green") %][% f.geoip.postal_code %]'.
'[% c("yellow") %])'.
'[% END %]'.
'[% END %]'.
' [% c("bright_red") %][% f.auth %] '.
'[% c("bright_yellow") %][% f.verb %] '.
'[% c("bright_magenta") %][% f.request %] '.
'[% c("bright_blue") %][% f.response %] '.
'[% c("bright_green") %][% f.bytes %] '.
'[% c("cyan") %][% f.referrer %] '.
'[% IF o.showos %]'.
'[% c("green") %][% f.os %] '.
'[% END %]'.
'[% IF ! o.noagent %]'.
'[% c("magenta") %][% f.agent %]'.
'[% END %]'.
''
;
}
sub help{
return '
--host <log host> The system beats in question is running on.
--response <code> The response code from the HTTP server.
--verb <verb> The verb used with the request.
--vhost <vhost> The domain served up.
--port <port> The port for the vhost.
--ip <ip> The client IP that made the request.
--os <os> The supplied OS value that made the request.
--showos Shows the OS value.
--req <req> The HTTP request.
--ref <ref> The supplied referrer for the request.
--agent <agent> The supplied agent value that made the request.
--noagent Do not show the agent field.
--auth <auth> The authed user for the request.
--bgt <bytes> Response bytes greater than.
--bgte <bytes> Response bytes greater than or equal to.
--blt <bytes> Response bytes less than.
--blte <bytes> Response bytes less than or equal to.
--geoip Require GEO IP to have worked.
--country <country> The 2 letter country code.
--showcountry Show country code.
--region <state> The state/province/etc to search for.
--showregion Show region code.
--postal <zipcode> The postal code to search for.
--showpostal Show postal code.
--city <city> The city to search for.
--showcity Show city name.
--dgt <date> Date greater than.
--dgte <date> Date greater than or equal to.
--dlt <date> Date less than.
--dlte <date> Date less than or equal to.
--msg <message> Messages to match.
--size <size> The max number of matches to return.
--field <field> The term field to use for matching them all.
--field2 <field2> The term field to use for what beats is setting.
--fieldv <fieldv> The value of the term field to match them all.
--field2v <field2v> The value to look for in the field beats is setting.
AND, OR, or NOT shortcut
, OR
+ AND
! NOT
A list separated by any of those will be transformed.
These may be used with host, country, jail, region, postal, city, and ip.
example: --country CN,RU
field and fieldv
The search template is written with the expectation that logstash is setting
"type" with a value of "syslog". If you are using like "tag" instead of "type"
or the like, this allows you to change the field and value.
date
/^-/ appends "now" to it. So "-5m" becomes "now-5m".
/^u\:/ takes what is after ":" and uses Time::ParseDate to convert it to a
unix time value.
Anything not matching any of the above will just be passed on.
';
}

View File

@ -0,0 +1,675 @@
package Search::ESsearcher::Templates::postfix;
use 5.006;
use strict;
use warnings;
=head1 NAME
Search::ESsearcher::Templates::postfix - Provides postfix support for essearcher.
=head1 VERSION
Version 0.1.1
=cut
our $VERSION = '0.1.1';
=head1 LOGSTASH
This uses a logstash configuration like the one below.
input {
syslog {
host => "10.10.10.10"
port => 11514
type => "syslog"
}
}
filter { }
output {
if [type] == "syslog" {
elasticsearch {
hosts => [ "127.0.0.1:9200" ]
}
}
}
The important bit is "type" being set to "syslog". If that is not used,
use the command line options field and fieldv.
Install L<https://github.com/whyscream/postfix-grok-patterns> for pulling apart
the postfix messages. These files are included with this as well. You will likely
not want to use 51-filter-postfix-aggregate.conf as that is a bit buggy.
=head1 Options
=head2 --host <log host>
The syslog server.
The search is done with .keyword appended to the field name.
=head2 --hostx <log host>
The syslog server.
Does not run it through aonHost.
The search is done with .keyword appended to the field name.
=head2 --src <src server>
The source server sending to the syslog server.
The search is done with .keyword appended to the field name.
=head2 --srcx <src server>
The source server sending to the syslog server.
Does not run it through aonHost.
The search is done with .keyword appended to the field name.
=head2 --size <count>
The number of items to return.
=head2 --pid <pid>
The PID that sent the message.
=head2 --dgt <date>
Date greater than.
=head2 --dgte <date>
Date greater than or equal to.
=head2 --dlt <date>
Date less than.
=head2 --dlte <date>
Date less than or equal to.
=head2 --msg <message>
Messages to match.
=head2 --field <field>
The term field to use for matching them all.
=head2 --fieldv <fieldv>
The value of the term field to match them all.
=head2 --mid <msg id>
Search based on the message ID.
=head2 --from <address>
The from address to search for.
=head2 --to <address>
The to address to search for.
=head2 --oto <address>
The original to address to search for.
=head2 --noq
Search for rejected messages, NOQUEUE.
=head2 --ip <ip>
The client IP to search for.
=head2 --chost <host>
The client hostname to search for.
=head2 --status <status>
Search using SMTP status codes.
=head2 --nocountry
Do not display the country code for the client IP.
=head2 --noregion
Do not display the region code for the client IP.
=head2 --nocity
Do not display the city name for the client IP.
=head2 --nopostal
Do not display the postal code for the client IP.
=head2 --aliaswarn
Show alias warnings.
=head2 --showkeys
Show the parsed out /postfix\_.*/ keys.
=head2 --nomsg
Do not show the message.
=head2 --showprogram
Show the syslog program name as well.
=head2 --showpid
Show the syslog PID as well.
=head1 AND, OR, or NOT shortcut
, OR
+ AND
! NOT
A list separated by any of those will be transformed.
These may be used with program, facility, pid, or host.
example: --program postfix,spamd
results: postfix OR spamd
=head1 HOST AND, OR, or NOT shortcut
, OR
+ AND
! NOT
A list of hosts separated by any of those will be transformed.
A host name should always end in a period unless it is a FQDN.
These may be used with host and src.
example: --src foo.,mail.bar.
results: /foo./ OR /mail.bar./
=head1 date
date
/^-/ appends "now" to it. So "-5m" becomes "now-5m".
/^u\:/ takes what is after ":" and uses Time::ParseDate to convert it to a
unix time value.
Anything not matching any of the above will just be passed on.
=cut
sub search{
return '
[% USE JSON ( pretty => 1 ) %]
[% DEFAULT o.program = "postfix" %]
[% DEFAULT o.facility = "mail" %]
[% DEFAULT o.size = "50" %]
[% DEFAULT o.field = "type" %]
[% DEFAULT o.fieldv = "syslog" %]
{
"index": "logstash-*",
"body": {
"size": [% o.size.json %],
"query": {
"bool": {
"must": [
{
"term": { [% o.field.json %]: [% o.fieldv.json %] }
},
[% IF o.host %]
{"query_string": {
"default_field": "host.keyword",
"query": [% aonHost( o.host ).json %]
}
},
[% END %]
[% IF o.hostx %]
{"query_string": {
"default_field": "host.keyword",
"query": [% o.hostx.json %]
}
},
[% END %]
[% IF o.src %]
{"query_string": {
"default_field": "logsource.keyword",
"query": [% aonHost( o.src ).json %]
}
},
[% END %]
[% IF o.srcx %]
{"query_string": {
"default_field": "logsource.keyword",
"query": [% o.srcx.json %]
}
},
[% END %]
{"query_string": {
"default_field": "program",
"query": [% aon( o.program ).json %]
}
},
{"query_string": {
"default_field": "facility_label",
"query": [% aon( o.facility ).json %]
}
},
[% IF o.pid %]
{"query_string": {
"default_field": "pid",
"query": [% aon( o.pid ).json %]
}
},
[% END %]
[% IF o.msg %]
{"query_string": {
"default_field": "message",
"query": [% o.msg.json %]
}
},
[% END %]
[% IF o.from %]
{"query_string": {
"default_field": "postfix_from",
"query": [% aon( o.from ).json %]
}
},
[% END %]
[% IF o.to %]
{"query_string": {
"default_field": "postfix_to",
"query": [% aon( o.to ).json %]
}
},
[% END %]
[% IF o.oto %]
{"query_string": {
"default_field": "postfix_orig_to",
"query": [% aon( o.oto ).json %]
}
},
[% END %]
[% IF o.mid %]
{"query_string": {
"default_field": "postfix_message-id",
"query": [% aon( o.mid ).json %]
}
},
[% END %]
[% IF o.qid %]
{"query_string": {
"default_field": "postfix_queueid",
"query": [% aon( o.qid ).json %]
}
},
[% END %]
[% IF o.ip %]
{"query_string": {
"default_field": "postfix_client_ip",
"query": [% aon( o.ip ).json %]
}
},
[% END %]
[% IF o.chost %]
{"query_string": {
"default_field": "postfix_client_hostname",
"query": [% aon( o.chost ).json %]
}
},
[% END %]
[% IF o.status %]
{"query_string": {
"default_field": "postfix_status_code",
"query": [% aon( o.status ).json %]
}
},
[% END %]
[% IF ! o.aliaswarn %]
{"query_string": {
"default_field": "message",
"query": "NOT \"is older than source file\""
}
},
[% END %]
[% IF o.noq %]
{"query_string": {
"default_field": "message",
"query": "NOQUEUE"
}
},
[% END %]
[% IF o.dgt %]
{"range": {
"@timestamp": {
"gt": [% pd( o.dgt ).json %]
}
}
},
[% END %]
[% IF o.dgte %]
{"range": {
"@timestamp": {
"gte": [% pd( o.dgte ).json %]
}
}
},
[% END %]
[% IF o.dlt %]
{"range": {
"@timestamp": {
"lt": [% pd( o.dlt ).json %]
}
}
},
[% END %]
[% IF o.dlte %]
{"range": {
"@timestamp": {
"lte": [% pd( o.dlte ).json %]
}
}
},
[% END %]
]
}
},
"sort": [
{
"@timestamp": {"order" : "desc"}}
]
}
}
';
}
sub options{
return '
host=s
src=s
hostx=s
srcx=s
size=s
showpid
mid=s
showprogram
showpid
nocountry
noregion
nocity
nopostal
aliaswarn
from=s
to=s
oto=s
ip=s
status=s
chost=s
pid=s
dgt=s
dgte=s
dlt=s
dlte=s
msg=s
pid=s
field=s
fieldv=s
showkeys
nomsg
noq
qid=s
';
}
sub output{
return '[% c("cyan") %][% f.timestamp %] [% c("bright_blue") %][% f.logsource %]'.
'[% IF o.showprogram %]'.
' [% c("bright_green") %][% f.program %]'.
'[% END %]'.
'[% IF o.showpid %]'.
' [% c("bright_magenta") %][[% c("bright_yellow") %][% f.pid %][% c("bright_magenta") %]]'.
'[% END %]'.
'[% IF ! o.nocountry %]'.
'[% IF f.geoip.country_code2 %]'.
' [% c("yellow") %]('.
'[% c("bright_green") %][% f.geoip.country_code2 %]'.
'[% c("yellow") %])'.
'[% END %]'.
'[% END %]'.
'[% IF ! o.noregion %]'.
'[% IF f.geoip.region_code %]'.
' [% c("yellow") %]('.
'[% c("bright_green") %][% f.geoip.region_code %]'.
'[% c("yellow") %])'.
'[% END %]'.
'[% END %]'.
'[% IF ! o.nocity %]'.
'[% IF f.geoip.city_name %]'.
' [% c("yellow") %]('.
'[% c("bright_green") %][% f.geoip.city_name %]'.
'[% c("yellow") %])'.
'[% END %]'.
'[% END %]'.
'[% IF ! o.nopostal %]'.
'[% IF f.geoip.postal_code %]'.
' [% c("yellow") %]('.
'[% c("bright_green") %][% f.geoip.postal_code %]'.
'[% c("yellow") %])'.
'[% END %]'.
'[% END %]'.
' '.
'[% PERL %]'.
'use Term::ANSIColor;'.
'my $f=$stash->get("f");'.
'if (defined( $f->{postfix_queueid} ) ){'.
' print color("bright_magenta").$f->{postfix_queueid};'.
' my $qid=$f->{postfix_queueid};'.
' delete($f->{postfix_queueid});'.
' $f->{message}=~s/^$qid\://;'.
' $stash->set("f", $f);'.
'}'.
'[% END %]'.
'[% IF o.showkeys %]'.
'[% PERL %]'.
'use Term::ANSIColor;'.
'my $f=$stash->get("f");'.
'my @pkeys=grep(/^postfix/, keys( %{$f} ) );'.
'if (defined( $f->{postfix_queueid} ) ){'.
' delete($f->{postfix_queueid})'.
'}'.
'foreach my $pkey (@pkeys){'.
' my $name=$pkey;'.
' $name=~s/^postfix\_//;'.
' if (defined( $f->{$pkey} ) ){'.
' print " ".color("bright_cyan").$name.color("bright_yellow")."=".color("bright_green").$f->{$pkey};'.
' }'.
'}'.
'print " "'.
'[% END %]'.
'[% END %]'.
'[% IF ! o.nomsg %]'.
'[% PERL %]'.
'use Term::ANSIColor;'.
'my $f=$stash->get("f");'.
'my $msg=color("white").$f->{message};'.
'my $replace=color("cyan")."<".color("bright_magenta");'.
'$msg=~s/\</$replace/g;'.
'$replace=color("cyan").">".color("white");'.
'$msg=~s/\>/$replace/g;'.
'$replace=color("bright_green")."(".color("cyan");'.
'$msg=~s/\(/$replace/g;'.
'$replace=color("bright_green").")".color("white");'.
'$msg=~s/\)/$replace/g;'.
'my $green=color("bright_green");'.
'my $white=color("white");'.
'my $yellow=color("bright_yellow");'.
'$msg=~s/([A-Za-z\_\-]+)\=/$green$1$yellow=$white/g;'.
'my $blue=color("bright_blue");'.
'$msg=~s/([0-9]+\.[0-9]+\.[0-9]+\.[0-9]+)/$blue$1$white/g;'.
'$replace=color("bright_red")."NOQUEUE".color("white");'.
'$msg=~s/NOQUEUE/$replace/g;'.
'$replace=color("bright_red")."failed".color("white");'.
'$msg=~s/failed/$replace/g;'.
'$replace=color("bright_red")."warning".color("white");'.
'$msg=~s/warning/$replace/g;'.
'$replace=color("bright_red")."disconnect from".color("white");'.
'$msg=~s/disconnect\ from/$replace/g;'.
'$replace=color("bright_red")."connect from".color("white");'.
'$msg=~s/connect\ from/$replace/g;'.
'$replace=color("bright_red")."SASL LOGIN".color("white");'.
'$msg=~s/SASL LOGIN/$replace/g;'.
'$replace=color("bright_red")."authentication".color("white");'.
'$msg=~s/authentication/$replace/g;'.
'$replace=color("bright_red")."blocked using".color("white");'.
'$msg=~s/blocked using/$replace/g;'.
'$replace=color("bright_red")."Service unavailable".color("white");'.
'$msg=~s/Service unavailable/$replace/g;'.
'print $msg;'.
'[% END %]'.
'[% END %]'
;
}
sub help{
return '
--host <log host> The syslog server.
--hostx <log host> The syslog server. This is passed raw.
--src <src server> The source server sending to the syslog server.
--srcx <src server> The source server sending to the syslog server. This is passed raw.
--size <count> The number of items to return.
--pid <pid> The PID that sent the message.
--mid <msg id> Search based on the message ID.
--qid <queue id> Search based on the queue ID.
--from <address> The from address to search for.
--to <address> The to address to search for.
--oto <address> The original to address to search for.
--noq Search for rejected messages, NOQUEUE.
--ip <ip> The client IP to search for.
--chost <host> The client hostname to search for.
--status <status> Search using SMTP status codes.
--nocountry Do not display the country code for the client IP.
--noregion Do not display the region code for the client IP.
--nocity Do not display the city name for the client IP.
--nopostal Do not display the postal code for the client IP.
--aliaswarn Show alias warnings.
--showkeys Show the parsed out /postfix\_.*/ keys.
--nomsg Do not show the message.
--showprogram Show the syslog program name as well.
--showpid Show the syslog PID as well.
--dgt <date> Date greater than.
--dgte <date> Date greater than or equal to.
--dlt <date> Date less than.
--dlte <date> Date less than or equal to.
--msg <message> Messages to match.
--field <field> The term field to use for matching them all.
--fieldv <fieldv> The value of the term field to match them all.
AND, OR, or NOT shortcut
, OR
+ AND
! NOT
A list separated by any of those will be transformed.
These may be used with program, facility, pid, or host.
example: --program postfix,spamd
HOST AND, OR, or NOT shortcut
, OR
+ AND
! NOT
A list of hosts separated by any of those will be transformed.
A host name should always end in a period unless it is a FQDN.
These may be used with host and src.
example: --src foo.,mail.bar.
results: /foo./ OR /mail.bar./
field and fieldv
The search template is written with the expectation that logstash is setting
"type" with a value of "syslog". If you are using like "tag" instead of "type"
or the like, this allows you to change the field and value.
date
/^-/ appends "now" to it. So "-5m" becomes "now-5m".
/^u\:/ takes what is after ":" and uses Time::ParseDate to convert it to a
unix time value.
Anything not matching any of the above will just be passed on.
';
}

View File

@ -10,11 +10,11 @@ Search::ESsearcher::Templates::syslog - Provides syslog support for essearcher.
=head1 VERSION
Version 0.0.0
Version 1.1.1
=cut
our $VERSION = '0.0.0';
our $VERSION = '1.1.1';
=head1 LOGSTASH
@ -47,10 +47,30 @@ use the command line options field and fieldv.
The syslog server.
The search is done with .keyword appended to the field name.
=head2 --hostx <log host>
The syslog server.
Does not run it through aonHost.
The search is done with .keyword appended to the field name.
=head2 --src <src server>
The source server sending to the syslog server.
The search is done with .keyword appended to the field name.
=head2 --srcx <src server>
The source server sending to the syslog server.
Does not run it through aonHost.
The search is done with .keyword appended to the field name.
=head2 --program <program>
The name of the daemon/program in question.
@ -113,6 +133,22 @@ These may be used with program, facility, pid, or host.
results: postfix OR spamd
=head1 HOST AND, OR, or NOT shortcut
, OR
+ AND
! NOT
A list of hosts separated by any of those will be transformed.
A host name should always end in a period unless it is a FQDN.
These may be used with host and src.
example: --src foo.,mail.bar.
results: /foo./ OR /mail.bar./
=head1 date
date
@ -130,13 +166,6 @@ Any thing not matching maching any of the above will just be passed on.
sub search{
return '
[% USE JSON ( pretty => 1 ) %]
[% DEFAULT o.host = "*" %]
[% DEFAULT o.src = "*" %]
[% DEFAULT o.program = "*" %]
[% DEFAULT o.facility = "*" %]
[% DEFAULT o.severity = "*" %]
[% DEFAULT o.pid = "*" %]
[% DEFAULT o.msg = "*" %]
[% DEFAULT o.size = "50" %]
[% DEFAULT o.field = "type" %]
[% DEFAULT o.fieldv = "syslog" %]
@ -150,41 +179,69 @@ return '
{
"term": { [% o.field.json %]: [% o.fieldv.json %] }
},
[% IF o.host %]
{"query_string": {
"default_field": "host",
"query": [% aon( o.host ).json %]
"default_field": "host.keyword",
"query": [% aonHost( o.host ).json %]
}
},
[% END %]
[% IF o.hostx %]
{"query_string": {
"default_field": "logsource",
"query": [% o.src.json %]
"default_field": "host.keyword",
"query": [% o.hostx.json %]
}
},
[% END %]
[% IF o.srcx %]
{"query_string": {
"default_field": "logsource.keyword",
"query": [% o.srcx.json %]
}
},
[% END %]
[% IF o.src %]
{"query_string": {
"default_field": "logsource.keyword",
"query": [% aonHost( o.src ).json %]
}
},
[% END %]
[% IF o.program %]
{"query_string": {
"default_field": "program",
"query": [% aon( o.program ).json %]
}
},
[% END %]
[% IF o.facility %]
{"query_string": {
"default_field": "facility_label",
"query": [% aon( o.facility ).json %]
}
},
[% END %]
[% IF o.severity %]
{"query_string": {
"default_field": "severity_label",
"query": [% aon( o.severity ).json %]
}
},
[% END %]
[% IF o.pid %]
{"query_string": {
"default_field": "pid",
"query": [% aon( o.pid ).json %]
}
},
[% END %]
[% IF o.msg %]
{"query_string": {
"default_field": "message",
"query": [% o.msg.json %]
}
},
[% END %]
[% IF o.dgt %]
{"range": {
"@timestamp": {
@ -232,6 +289,7 @@ return '
sub options{
return '
host=s
hostx=s
src=s
program=s
size=s
@ -245,30 +303,69 @@ dlte=s
msg=s
field=s
fieldv=s
srcx=s
';
}
sub output{
return '[% c("cyan") %][% f.timestamp %] [% c("bright_blue") %][% f.logsource %] '.
'[% c("bright_green") %][% f.program %][% c("bright_magenta") %][[% c("bright_yellow") %]'.
'[% f.pid %][% c("bright_magenta") %]] [% c("white") %][% f.message %]';
'[% f.pid %][% c("bright_magenta") %]] [% c("white") %]'.
'[% PERL %]'.
'use Term::ANSIColor;'.
'my $f=$stash->get("f");'.
'my $msg=color("white").$f->{message};'.
'my $replace=color("cyan")."<".color("bright_magenta");'.
'$msg=~s/\</$replace/g;'.
'$replace=color("cyan").">".color("white");'.
'$msg=~s/\>/$replace/g;'.
'$replace=color("bright_green")."(".color("cyan");'.
'$msg=~s/\(/$replace/g;'.
'$replace=color("bright_green").")".color("white");'.
'$msg=~s/\)/$replace/g;'.
'my $green=color("bright_green");'.
'my $white=color("white");'.
'my $yellow=color("bright_yellow");'.
'my $blue=color("bright_blue");'.
'$replace=color("bright_yellow")."\'".color("cyan");'.
'$msg=~s/\\\'([A-Za-z0-9\\.\\#\\:\\-\\/]*)\\\'/$replace$1$yellow\'$white/g;'.
'$msg=~s/([A-Za-z\_\-]+)\=/$green$1$yellow=$white/g;'.
'$msg=~s/([0-9]+\.[0-9]+\.[0-9]+\.[0-9]+)/$blue$1$white/g;'.
'$msg=~s/(([A-f0-9:]+:+)+[A-f0-9]+)/$blue$1$white/g;'.
'print $msg;'.
'[% END %]';
;
}
sub help{
return '
--host <log host> The syslog server.
--hostx <log host> The syslog server, raw.
--src <src server> The source server sending to the syslog server.
--srcx <src server> The source server sending to the syslog server, raw.
--program <program> The name of the daemon/program in question.
--size <count> The number of items to return.
--facility <facility> The syslog facility.
--severity <severity> The severity level of the message.
--pid <pid> The PID that sent the message.
--dgt <date> Date greater than.
--dgte <date> Date greater than or equal to.
--dlt <date> Date less than.
--dlte <date> Date less than or equal to.
--msg <message> Messages to match.
--field <field> The term field to use for matching them all.
--fieldv <fieldv> The value of the term field to match them all.
@ -279,14 +376,30 @@ AND, OR, or NOT shortcut
+ AND
! NOT
A list separated by any of those will be transformed
A list separated by any of those will be transformed.
These may be used with program, facility, pid, or host.
These may be used with program, facility, and pid.
example: --program postfix,spamd
HOST AND, OR, or NOT shortcut
, OR
+ AND
! NOT
A list of hosts separated by any of those will be transformed.
A host name should always end in a period unless it is a FQDN.
These may be used with host and src.
example: --src foo.,mail.bar.
results: /foo./ OR /mail.bar./
field and fieldv
The search template is written with the expectation that logstash is setting

17
logstash/README.md Normal file
View File

@ -0,0 +1,17 @@
# Installing
Just copy the files into your logstash dir, update the host setting
for the IP to listen on, and set the ports as desired.
# Notes
## Postfix
These come from
[whyscream/postfix-grok-patterns](https://github.com/whyscream/postfix-grok-patterns).
51-filter-postfix-aggregate.conf is set to off by default, as in
testing I found it to be buggy. It will often result in lines
being skipped.
This one does have GeoIP processing though.

View File

@ -0,0 +1,273 @@
filter {
# grok log lines by program name (listed alphabetically)
if [program] =~ /^postfix.*\/anvil$/ {
grok {
patterns_dir => "/etc/logstash/patterns.d"
match => [ "message", "^%{POSTFIX_ANVIL}$" ]
tag_on_failure => [ "_grok_postfix_anvil_nomatch" ]
add_tag => [ "_grok_postfix_success" ]
}
} else if [program] =~ /^postfix.*\/bounce$/ {
grok {
patterns_dir => "/etc/logstash/patterns.d"
match => [ "message", "^%{POSTFIX_BOUNCE}$" ]
tag_on_failure => [ "_grok_postfix_bounce_nomatch" ]
add_tag => [ "_grok_postfix_success" ]
}
} else if [program] =~ /^postfix.*\/cleanup$/ {
grok {
patterns_dir => "/etc/logstash/patterns.d"
match => [ "message", "^%{POSTFIX_CLEANUP}$" ]
tag_on_failure => [ "_grok_postfix_cleanup_nomatch" ]
add_tag => [ "_grok_postfix_success" ]
}
} else if [program] =~ /^postfix.*\/dnsblog$/ {
grok {
patterns_dir => "/etc/logstash/patterns.d"
match => [ "message", "^%{POSTFIX_DNSBLOG}$" ]
tag_on_failure => [ "_grok_postfix_dnsblog_nomatch" ]
add_tag => [ "_grok_postfix_success" ]
}
} else if [program] =~ /^postfix.*\/error$/ {
grok {
patterns_dir => "/etc/logstash/patterns.d"
match => [ "message", "^%{POSTFIX_ERROR}$" ]
tag_on_failure => [ "_grok_postfix_error_nomatch" ]
add_tag => [ "_grok_postfix_success" ]
}
} else if [program] =~ /^postfix.*\/local$/ {
grok {
patterns_dir => "/etc/logstash/patterns.d"
match => [ "message", "^%{POSTFIX_LOCAL}$" ]
tag_on_failure => [ "_grok_postfix_local_nomatch" ]
add_tag => [ "_grok_postfix_success" ]
}
} else if [program] =~ /^postfix.*\/master$/ {
grok {
patterns_dir => "/etc/logstash/patterns.d"
match => [ "message", "^%{POSTFIX_MASTER}$" ]
tag_on_failure => [ "_grok_postfix_master_nomatch" ]
add_tag => [ "_grok_postfix_success" ]
}
} else if [program] =~ /^postfix.*\/pickup$/ {
grok {
patterns_dir => "/etc/logstash/patterns.d"
match => [ "message", "^%{POSTFIX_PICKUP}$" ]
tag_on_failure => [ "_grok_postfix_pickup_nomatch" ]
add_tag => [ "_grok_postfix_success" ]
}
} else if [program] =~ /^postfix.*\/pipe$/ {
grok {
patterns_dir => "/etc/logstash/patterns.d"
match => [ "message", "^%{POSTFIX_PIPE}$" ]
tag_on_failure => [ "_grok_postfix_pipe_nomatch" ]
add_tag => [ "_grok_postfix_success" ]
}
} else if [program] =~ /^postfix.*\/postdrop$/ {
grok {
patterns_dir => "/etc/logstash/patterns.d"
match => [ "message", "^%{POSTFIX_POSTDROP}$" ]
tag_on_failure => [ "_grok_postfix_postdrop_nomatch" ]
add_tag => [ "_grok_postfix_success" ]
}
} else if [program] =~ /^postfix.*\/postscreen$/ {
grok {
patterns_dir => "/etc/logstash/patterns.d"
match => [ "message", "^%{POSTFIX_POSTSCREEN}$" ]
tag_on_failure => [ "_grok_postfix_postscreen_nomatch" ]
add_tag => [ "_grok_postfix_success" ]
}
} else if [program] =~ /^postfix.*\/qmgr$/ {
grok {
patterns_dir => "/etc/logstash/patterns.d"
match => [ "message", "^%{POSTFIX_QMGR}$" ]
tag_on_failure => [ "_grok_postfix_qmgr_nomatch" ]
add_tag => [ "_grok_postfix_success" ]
}
} else if [program] =~ /^postfix.*\/scache$/ {
grok {
patterns_dir => "/etc/logstash/patterns.d"
match => [ "message", "^%{POSTFIX_SCACHE}$" ]
tag_on_failure => [ "_grok_postfix_scache_nomatch" ]
add_tag => [ "_grok_postfix_success" ]
}
} else if [program] =~ /^postfix.*\/sendmail$/ {
grok {
patterns_dir => "/etc/logstash/patterns.d"
match => [ "message", "^%{POSTFIX_SENDMAIL}$" ]
tag_on_failure => [ "_grok_postfix_sendmail_nomatch" ]
add_tag => [ "_grok_postfix_success" ]
}
} else if [program] =~ /^postfix.*\/smtp$/ {
grok {
patterns_dir => "/etc/logstash/patterns.d"
match => [ "message", "^%{POSTFIX_SMTP}$" ]
tag_on_failure => [ "_grok_postfix_smtp_nomatch" ]
add_tag => [ "_grok_postfix_success" ]
}
} else if [program] =~ /^postfix.*\/lmtp$/ {
grok {
patterns_dir => "/etc/logstash/patterns.d"
match => [ "message", "^%{POSTFIX_LMTP}$" ]
tag_on_failure => [ "_grok_postfix_lmtp_nomatch" ]
add_tag => [ "_grok_postfix_success" ]
}
} else if [program] =~ /^postfix.*\/smtpd$/ {
grok {
patterns_dir => "/etc/logstash/patterns.d"
match => [ "message", "^%{POSTFIX_SMTPD}$" ]
tag_on_failure => [ "_grok_postfix_smtpd_nomatch" ]
add_tag => [ "_grok_postfix_success" ]
}
} else if [program] =~ /^postfix.*\/postsuper$/ {
grok {
patterns_dir => "/etc/logstash/patterns.d"
match => [ "message", "^%{POSTFIX_POSTSUPER}$" ]
tag_on_failure => [ "_grok_postfix_postsuper_nomatch" ]
add_tag => [ "_grok_postfix_success" ]
}
} else if [program] =~ /^postfix.*\/tlsmgr$/ {
grok {
patterns_dir => "/etc/logstash/patterns.d"
match => [ "message", "^%{POSTFIX_TLSMGR}$" ]
tag_on_failure => [ "_grok_postfix_tlsmgr_nomatch" ]
add_tag => [ "_grok_postfix_success" ]
}
} else if [program] =~ /^postfix.*\/tlsproxy$/ {
grok {
patterns_dir => "/etc/logstash/patterns.d"
match => [ "message", "^%{POSTFIX_TLSPROXY}$" ]
tag_on_failure => [ "_grok_postfix_tlsproxy_nomatch" ]
add_tag => [ "_grok_postfix_success" ]
}
} else if [program] =~ /^postfix.*\/trivial-rewrite$/ {
grok {
patterns_dir => "/etc/logstash/patterns.d"
match => [ "message", "^%{POSTFIX_TRIVIAL_REWRITE}$" ]
tag_on_failure => [ "_grok_postfix_trivial_rewrite_nomatch" ]
add_tag => [ "_grok_postfix_success" ]
}
} else if [program] =~ /^postfix.*\/discard$/ {
grok {
patterns_dir => "/etc/logstash/patterns.d"
match => [ "message", "^%{POSTFIX_DISCARD}$" ]
tag_on_failure => [ "_grok_postfix_discard_nomatch" ]
add_tag => [ "_grok_postfix_success" ]
}
} else if [program] =~ /^postfix.*\/virtual$/ {
grok {
patterns_dir => "/etc/logstash/patterns.d"
match => [ "message", "^%{POSTFIX_VIRTUAL}$" ]
tag_on_failure => [ "_grok_postfix_virtual_nomatch" ]
add_tag => [ "_grok_postfix_success" ]
}
} else if [program] =~ /^postfix.*/ {
mutate {
add_tag => [ "_grok_postfix_program_nomatch" ]
}
}
# process key-value data if it exists
if [postfix_keyvalue_data] {
kv {
source => "postfix_keyvalue_data"
trim_value => "<>,"
prefix => "postfix_"
remove_field => [ "postfix_keyvalue_data" ]
}
# some post processing of key-value data
if [postfix_client] {
grok {
patterns_dir => "/etc/logstash/patterns.d"
match => ["postfix_client", "^%{POSTFIX_CLIENT_INFO}$"]
tag_on_failure => [ "_grok_kv_postfix_client_nomatch" ]
remove_field => [ "postfix_client" ]
}
}
if [postfix_relay] {
grok {
patterns_dir => "/etc/logstash/patterns.d"
match => ["postfix_relay", "^%{POSTFIX_RELAY_INFO}$"]
tag_on_failure => [ "_grok_kv_postfix_relay_nomatch" ]
remove_field => [ "postfix_relay" ]
}
}
if [postfix_delays] {
grok {
patterns_dir => "/etc/logstash/patterns.d"
match => ["postfix_delays", "^%{POSTFIX_DELAYS}$"]
tag_on_failure => [ "_grok_kv_postfix_delays_nomatch" ]
remove_field => [ "postfix_delays" ]
}
}
}
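# Illustrative note (added; not part of the original config): for a delivery
# event whose key-value data looks roughly like
#   to=<user@example.com>, relay=mail.example.com[203.0.113.5]:25, delay=1.3, delays=0.9/0.01/0.2/0.2, dsn=2.0.0
# the kv filter above would yield fields such as postfix_to, postfix_relay,
# postfix_delay, postfix_delays and postfix_dsn, and the grok filters here then
# break postfix_relay and postfix_delays out into hostname/IP/port and
# per-stage delay fields. The address and host shown are hypothetical.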
# process command counter data if it exists
if [postfix_command_counter_data] {
grok {
patterns_dir => "/etc/logstash/patterns.d"
match => ["postfix_command_counter_data", "^%{POSTFIX_COMMAND_COUNTER_DATA}$"]
tag_on_failure => ["_grok_postfix_command_counter_data_nomatch"]
remove_field => ["postfix_command_counter_data"]
}
}
# Do some data type conversions
mutate {
convert => [
# list of integer fields
"postfix_anvil_cache_size", "integer",
"postfix_anvil_conn_count", "integer",
"postfix_anvil_conn_rate", "integer",
"postfix_client_port", "integer",
"postfix_cmd_auth", "integer",
"postfix_cmd_auth_accepted", "integer",
"postfix_cmd_count", "integer",
"postfix_cmd_count_accepted", "integer",
"postfix_cmd_data", "integer",
"postfix_cmd_data_accepted", "integer",
"postfix_cmd_ehlo", "integer",
"postfix_cmd_ehlo_accepted", "integer",
"postfix_cmd_helo", "integer",
"postfix_cmd_helo_accepted", "integer",
"postfix_cmd_mail", "integer",
"postfix_cmd_mail_accepted", "integer",
"postfix_cmd_quit", "integer",
"postfix_cmd_quit_accepted", "integer",
"postfix_cmd_rcpt", "integer",
"postfix_cmd_rcpt_accepted", "integer",
"postfix_cmd_rset", "integer",
"postfix_cmd_rset_accepted", "integer",
"postfix_cmd_starttls", "integer",
"postfix_cmd_starttls_accepted", "integer",
"postfix_cmd_unknown", "integer",
"postfix_cmd_unknown_accepted", "integer",
"postfix_nrcpt", "integer",
"postfix_postscreen_cache_dropped", "integer",
"postfix_postscreen_cache_retained", "integer",
"postfix_postscreen_dnsbl_rank", "integer",
"postfix_relay_port", "integer",
"postfix_server_port", "integer",
"postfix_size", "integer",
"postfix_status_code", "integer",
"postfix_termination_signal", "integer",
# list of float fields
"postfix_delay", "float",
"postfix_delay_before_qmgr", "float",
"postfix_delay_conn_setup", "float",
"postfix_delay_in_qmgr", "float",
"postfix_delay_transmission", "float",
"postfix_postscreen_violation_time", "float"
]
}
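# Added note: without these conversions Elasticsearch would typically map the
# fields above as strings, which breaks numeric range queries and aggregations
# on them.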
# add geoip for postfix
if [program] =~ /.*postfix.*/ {
geoip {
source => "postfix_client_ip"
}
}
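# Added note: only events that actually carry postfix_client_ip get geoip
# data; for other postfix events the lookup fails and the filter's default
# _geoip_lookup_failure tag is added instead.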
}

View File

@ -0,0 +1,38 @@
filter {
if ![postfix_queueid] {
drop {}
} else if [program] == "postfix/qmgr" and [postfix_from] {
aggregate {
task_id => "%{postfix_queueid}"
code => "
map['postfix_from'] = event.get('postfix_from')
map['postfix_size'] = event.get('postfix_size')
map['postfix_nrcpt'] = event.get('postfix_nrcpt')
"
}
} else if [program] == "postfix/smtpd" {
aggregate {
task_id => "%{postfix_queueid}"
code => "
map['postfix_client_hostname'] = event.get('postfix_client_hostname')
map['postfix_client_ip'] = event.get('postfix_client_ip')
"
}
} else if [program] == "postfix/cleanup" {
aggregate {
task_id => "%{postfix_queueid}"
code => "
map['postfix_message-id'] = event.get('postfix_message-id')
"
}
} else if [program] == "postfix/smtp" {
aggregate {
task_id => "%{postfix_queueid}"
code => "
map.each do |key, value|
event.set(key, value)
end
"
}
}
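# Added note (not part of the original file): the qmgr, smtpd and cleanup
# branches stash fields in a map shared per postfix_queueid, and the final
# postfix/smtp delivery event copies everything from that map onto itself.
# If stale queue IDs are a concern, the aggregate filter also supports options
# such as timeout and push_map_as_event_on_timeout; e.g. a hypothetical
#   timeout => 300
# would expire maps after five minutes.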
}

View File

@ -0,0 +1,67 @@
input {
beats {
host => "192.168.14.3"
port => 5044
type => "beats"
}
}
filter {
if [type] == "beats" {
mutate {
remove_field => [ "[host]" ]
}
mutate {
add_field => {
"host" => "%{[beat][hostname]}"
}
}
}
if [fields][log] == "fail2ban" {
grok {
match => {
"message" => "%{TIMESTAMP_ISO8601:timestamp} %{WORD:log_src}.%{WORD:src_action} *\[%{INT:fail2ban_digit}\]: %{LOGLEVEL:loglevel} *\[%{NOTSPACE:service}\] %{WORD:ban_status} %{IP:clientip}"
}
}
geoip {
source => "clientip"
}
}
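# Illustrative example (added; the timestamp and IP are made up): a fail2ban
# line such as
#   2021-10-21 12:00:01,123 fail2ban.actions [1234]: NOTICE [sshd] Ban 203.0.113.7
# should match the grok above, giving service => "sshd", ban_status => "Ban"
# and clientip => "203.0.113.7", which the geoip filter above then enriches.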
if [fields][log] == "apache-access" {
grok {
match => {
"message" => "%{HTTPD_COMBINEDLOG}+%{GREEDYDATA:extra_fields}"
}
overwrite => [ "message" ]
}
mutate {
convert => ["response", "integer"]
convert => ["bytes", "integer"]
convert => ["responsetime", "float"]
}
geoip {
source => "clientip"
target => "geoip"
add_tag => [ "apache-geoip" ]
}
date {
match => [ "timestamp" , "dd/MMM/YYYY:HH:mm:ss Z" ]
remove_field => [ "timestamp" ]
}
useragent {
source => "agent"
}
}
}
output {
if [type] == "beats" {
elasticsearch {
hosts => [ "127.0.0.1:9200" ]
}
}
}

View File

@ -0,0 +1,18 @@
input {
udp {
host => "192.168.14.3"
port => 10514
codec => "json"
type => "rsyslog"
}
}
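# Added note (not in the original file): this listener expects rsyslog to ship
# JSON-encoded messages over UDP to 192.168.14.3:10514; the matching rsyslog
# template and forwarding action are not shown here and vary by rsyslog
# version.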
filter { }
output {
if [type] == "rsyslog" {
elasticsearch {
hosts => [ "127.0.0.1:9200" ]
}
}
}

View File

@ -0,0 +1,17 @@
input {
syslog {
host => "192.168.14.3"
port => 11514
type => "syslog"
}
}
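# Added note (not in the original file): the syslog input listens for both TCP
# and UDP on port 11514 and parses standard RFC3164-style syslog lines.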
filter { }
output {
if [type] == "syslog" {
elasticsearch {
hosts => [ "127.0.0.1:9200" ]
}
}
}

View File

@ -0,0 +1,137 @@
# Version: 1.0.0
# common postfix patterns
POSTFIX_QUEUEID ([0-9A-F]{6,}|[0-9a-zA-Z]{12,}|NOQUEUE)
POSTFIX_CLIENT_INFO %{HOSTNAME:postfix_client_hostname}?\[%{IP:postfix_client_ip}\](:%{INT:postfix_client_port})?
POSTFIX_RELAY_INFO %{HOSTNAME:postfix_relay_hostname}?\[(%{IP:postfix_relay_ip}|%{DATA:postfix_relay_service})\](:%{INT:postfix_relay_port})?|%{WORD:postfix_relay_service}
POSTFIX_SMTP_STAGE (CONNECT|HELO|EHLO|STARTTLS|AUTH|MAIL( FROM)?|RCPT( TO)?|(end of )?DATA|RSET|UNKNOWN|END-OF-MESSAGE|VRFY|\.)
POSTFIX_ACTION (accept|defer|discard|filter|header-redirect|reject)
POSTFIX_STATUS_CODE \d{3}
POSTFIX_STATUS_CODE_ENHANCED \d\.\d\.\d
POSTFIX_DNSBL_MESSAGE Service unavailable; .* \[%{GREEDYDATA:postfix_status_data}\] %{GREEDYDATA:postfix_status_message};
POSTFIX_PS_ACCESS_ACTION (DISCONNECT|BLACKLISTED|WHITELISTED|WHITELIST VETO|PASS NEW|PASS OLD)
POSTFIX_PS_VIOLATION (BARE NEWLINE|COMMAND (TIME|COUNT|LENGTH) LIMIT|COMMAND PIPELINING|DNSBL|HANGUP|NON-SMTP COMMAND|PREGREET)
POSTFIX_TIME_UNIT %{NUMBER}[smhd]
POSTFIX_KEYVALUE_DATA [\w-]+=[^;]*
POSTFIX_KEYVALUE %{POSTFIX_QUEUEID:postfix_queueid}: %{POSTFIX_KEYVALUE_DATA:postfix_keyvalue_data}
POSTFIX_WARNING_LEVEL (warning|fatal|info)
POSTFIX_TLSCONN (Anonymous|Trusted|Untrusted|Verified) TLS connection established (to %{POSTFIX_RELAY_INFO}|from %{POSTFIX_CLIENT_INFO}): %{DATA:postfix_tls_version} with cipher %{DATA:postfix_tls_cipher} \(%{DATA:postfix_tls_cipher_size} bits\)
POSTFIX_TLSVERIFICATION certificate verification failed for %{POSTFIX_RELAY_INFO}: %{GREEDYDATA:postfix_tls_error}
POSTFIX_DELAYS %{NUMBER:postfix_delay_before_qmgr}/%{NUMBER:postfix_delay_in_qmgr}/%{NUMBER:postfix_delay_conn_setup}/%{NUMBER:postfix_delay_transmission}
POSTFIX_LOSTCONN (Connection timed out|No route to host|Connection refused|Network is unreachable|lost connection|timeout|SSL_accept error|-1)
POSTFIX_LOSTCONN_REASONS (receiving the initial server greeting|sending message body|sending end of data -- message may be sent more than once)
POSTFIX_PROXY_MESSAGE (%{POSTFIX_STATUS_CODE:postfix_proxy_status_code} )?(%{POSTFIX_STATUS_CODE_ENHANCED:postfix_proxy_status_code_enhanced})?.*
POSTFIX_COMMAND_COUNTER_DATA (helo=(%{INT:postfix_cmd_helo_accepted}/)?%{INT:postfix_cmd_helo} )?(ehlo=(%{INT:postfix_cmd_ehlo_accepted}/)?%{INT:postfix_cmd_ehlo} )?(starttls=(%{INT:postfix_cmd_starttls_accepted}/)?%{INT:postfix_cmd_starttls} )?(auth=(%{INT:postfix_cmd_auth_accepted}/)?%{INT:postfix_cmd_auth} )?(mail=(%{INT:postfix_cmd_mail_accepted}/)?%{INT:postfix_cmd_mail} )?(rcpt=(%{INT:postfix_cmd_rcpt_accepted}/)?%{INT:postfix_cmd_rcpt} )?(data=(%{INT:postfix_cmd_data_accepted}/)?%{INT:postfix_cmd_data} )?(rset=(%{INT:postfix_cmd_rset_accepted}/)?%{INT:postfix_cmd_rset} )?(quit=(%{INT:postfix_cmd_quit_accepted}/)?%{INT:postfix_cmd_quit} )?(unknown=(%{INT:postfix_cmd_unknown_accepted}/)?%{INT:postfix_cmd_unknown} )?commands=(%{INT:postfix_cmd_count_accepted}/)?%{INT:postfix_cmd_count}
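# Illustrative (added) example for POSTFIX_COMMAND_COUNTER_DATA: a disconnect
# line ending in something like "ehlo=1 mail=1 rcpt=0/1 data=0/1 quit=1 commands=3/5"
# yields accepted/total pairs such as postfix_cmd_rcpt_accepted=0,
# postfix_cmd_rcpt=1, postfix_cmd_count_accepted=3 and postfix_cmd_count=5;
# the counts shown are hypothetical.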
# helper patterns
GREEDYDATA_NO_COLON [^:]*
GREEDYDATA_NO_SEMICOLON [^;]*
GREEDYDATA_NO_BRACKET [^<>]*
STATUS_WORD [\w-]*
# warning patterns
POSTFIX_WARNING_WITH_KV (%{POSTFIX_QUEUEID:postfix_queueid}: )?%{POSTFIX_WARNING_LEVEL:postfix_message_level}: (%{POSTFIX_CLIENT_INFO}: )?%{GREEDYDATA:postfix_message}; %{POSTFIX_KEYVALUE_DATA:postfix_keyvalue_data}
POSTFIX_WARNING_WITHOUT_KV (%{POSTFIX_QUEUEID:postfix_queueid}: )?%{POSTFIX_WARNING_LEVEL:postfix_message_level}: (%{POSTFIX_CLIENT_INFO}: )?%{GREEDYDATA:postfix_message}
POSTFIX_WARNING %{POSTFIX_WARNING_WITH_KV}|%{POSTFIX_WARNING_WITHOUT_KV}
# smtpd patterns
POSTFIX_SMTPD_CONNECT connect from %{POSTFIX_CLIENT_INFO}
POSTFIX_SMTPD_DISCONNECT disconnect from %{POSTFIX_CLIENT_INFO}( %{GREEDYDATA:postfix_command_counter_data})?
POSTFIX_SMTPD_LOSTCONN %{POSTFIX_LOSTCONN:postfix_smtpd_lostconn_data}( after %{POSTFIX_SMTP_STAGE:postfix_smtp_stage}( \(%{INT} bytes\))?)? from %{POSTFIX_CLIENT_INFO}(: %{GREEDYDATA:postfix_smtpd_lostconn_reason})?
POSTFIX_SMTPD_NOQUEUE %{POSTFIX_QUEUEID:postfix_queueid}: %{POSTFIX_ACTION:postfix_action}: %{POSTFIX_SMTP_STAGE:postfix_smtp_stage} from %{POSTFIX_CLIENT_INFO}:( %{POSTFIX_STATUS_CODE:postfix_status_code} %{POSTFIX_STATUS_CODE_ENHANCED:postfix_status_code_enhanced})?( <%{DATA:postfix_status_data}>:)? (%{POSTFIX_DNSBL_MESSAGE}|%{GREEDYDATA:postfix_status_message};) %{POSTFIX_KEYVALUE_DATA:postfix_keyvalue_data}
POSTFIX_SMTPD_PIPELINING improper command pipelining after %{POSTFIX_SMTP_STAGE:postfix_smtp_stage} from %{POSTFIX_CLIENT_INFO}: %{GREEDYDATA:postfix_improper_pipelining_data}
POSTFIX_SMTPD_PROXY proxy-%{POSTFIX_ACTION:postfix_proxy_result}: (%{POSTFIX_SMTP_STAGE:postfix_proxy_smtp_stage}): %{POSTFIX_PROXY_MESSAGE:postfix_proxy_message}; %{POSTFIX_KEYVALUE_DATA:postfix_keyvalue_data}
# cleanup patterns
POSTFIX_CLEANUP_MILTER %{POSTFIX_QUEUEID:postfix_queueid}: milter-%{POSTFIX_ACTION:postfix_milter_result}: %{GREEDYDATA:postfix_milter_message}; %{GREEDYDATA_NO_COLON:postfix_keyvalue_data}(: %{GREEDYDATA:postfix_milter_data})?
POSTFIX_CLEANUP_PREPEND_TYPE (header|body)
POSTFIX_CLEANUP_PREPEND %{POSTFIX_QUEUEID:postfix_queueid}: prepend: %{POSTFIX_CLEANUP_PREPEND_TYPE:postfix_prepend_type} %{GREEDYDATA:postfix_prepend_trigger} from %{POSTFIX_CLIENT_INFO}; %{GREEDYDATA_NO_COLON:postfix_keyvalue_data}: %{GREEDYDATA:postfix_prepend_value}
POSTFIX_CLEANUP_MESSAGEID %{POSTFIX_QUEUEID:postfix_queueid}: message-id=<?%{GREEDYDATA_NO_BRACKET:postfix_message-id}>?
# qmgr patterns
POSTFIX_QMGR_REMOVED %{POSTFIX_QUEUEID:postfix_queueid}: removed
POSTFIX_QMGR_ACTIVE %{POSTFIX_QUEUEID:postfix_queueid}: %{POSTFIX_KEYVALUE_DATA:postfix_keyvalue_data} \(queue active\)
POSTFIX_QMGR_EXPIRED %{POSTFIX_QUEUEID:postfix_queueid}: from=<%{DATA:postfix_from}>, status=%{STATUS_WORD:postfix_status}, returned to sender
# pipe patterns
POSTFIX_PIPE_ANY %{POSTFIX_QUEUEID:postfix_queueid}: %{POSTFIX_KEYVALUE_DATA:postfix_keyvalue_data}, status=%{STATUS_WORD:postfix_status} \(%{GREEDYDATA:postfix_pipe_response}\)
# error patterns
POSTFIX_ERROR_ANY %{POSTFIX_QUEUEID:postfix_queueid}: %{POSTFIX_KEYVALUE_DATA:postfix_keyvalue_data}, status=%{STATUS_WORD:postfix_status} \(%{GREEDYDATA:postfix_error_response}\)
# discard patterns
POSTFIX_DISCARD_ANY %{POSTFIX_QUEUEID:postfix_queueid}: %{POSTFIX_KEYVALUE_DATA:postfix_keyvalue_data} status=%{STATUS_WORD:postfix_status} %{GREEDYDATA}
# postsuper patterns
POSTFIX_POSTSUPER_ACTIONS (removed|requeued|placed on hold|released from hold)
POSTFIX_POSTSUPER_ACTION %{POSTFIX_QUEUEID:postfix_queueid}: %{POSTFIX_POSTSUPER_ACTIONS:postfix_postsuper_action}
POSTFIX_POSTSUPER_SUMMARY_ACTIONS (Deleted|Requeued|Placed on hold|Released from hold)
POSTFIX_POSTSUPER_SUMMARY %{POSTFIX_POSTSUPER_SUMMARY_ACTIONS:postfix_postsuper_summary_action}: %{NUMBER:postfix_postsuper_summary_count} messages?
# postscreen patterns
POSTFIX_PS_CONNECT CONNECT from %{POSTFIX_CLIENT_INFO} to \[%{IP:postfix_server_ip}\]:%{INT:postfix_server_port}
POSTFIX_PS_ACCESS %{POSTFIX_PS_ACCESS_ACTION:postfix_postscreen_access} %{POSTFIX_CLIENT_INFO}
POSTFIX_PS_NOQUEUE %{POSTFIX_SMTPD_NOQUEUE}
POSTFIX_PS_TOOBUSY NOQUEUE: reject: CONNECT from %{POSTFIX_CLIENT_INFO}: %{GREEDYDATA:postfix_postscreen_toobusy_data}
POSTFIX_PS_DNSBL %{POSTFIX_PS_VIOLATION:postfix_postscreen_violation} rank %{INT:postfix_postscreen_dnsbl_rank} for %{POSTFIX_CLIENT_INFO}
POSTFIX_PS_CACHE cache %{DATA} full cleanup: retained=%{NUMBER:postfix_postscreen_cache_retained} dropped=%{NUMBER:postfix_postscreen_cache_dropped} entries
POSTFIX_PS_VIOLATIONS %{POSTFIX_PS_VIOLATION:postfix_postscreen_violation}( %{INT})?( after %{NUMBER:postfix_postscreen_violation_time})? from %{POSTFIX_CLIENT_INFO}(( after %{POSTFIX_SMTP_STAGE:postfix_smtp_stage})?(: %{GREEDYDATA:postfix_postscreen_data})?| in tests (after|before) SMTP handshake)
# dnsblog patterns
POSTFIX_DNSBLOG_LISTING addr %{IP:postfix_client_ip} listed by domain %{HOSTNAME:postfix_dnsbl_domain} as %{IP:postfix_dnsbl_result}
# tlsproxy patterns
POSTFIX_TLSPROXY_CONN (DIS)?CONNECT( from)? %{POSTFIX_CLIENT_INFO}
# anvil patterns
POSTFIX_ANVIL_CONN_RATE statistics: max connection rate %{NUMBER:postfix_anvil_conn_rate}/%{POSTFIX_TIME_UNIT:postfix_anvil_conn_period} for \(%{DATA:postfix_service}:%{IP:postfix_client_ip}\) at %{SYSLOGTIMESTAMP:postfix_anvil_timestamp}
POSTFIX_ANVIL_CONN_CACHE statistics: max cache size %{NUMBER:postfix_anvil_cache_size} at %{SYSLOGTIMESTAMP:postfix_anvil_timestamp}
POSTFIX_ANVIL_CONN_COUNT statistics: max connection count %{NUMBER:postfix_anvil_conn_count} for \(%{DATA:postfix_service}:%{IP:postfix_client_ip}\) at %{SYSLOGTIMESTAMP:postfix_anvil_timestamp}
# smtp patterns
POSTFIX_SMTP_DELIVERY %{POSTFIX_KEYVALUE} status=%{STATUS_WORD:postfix_status}( \(%{GREEDYDATA:postfix_smtp_response}\))?
POSTFIX_SMTP_CONNERR connect to %{POSTFIX_RELAY_INFO}: %{POSTFIX_LOSTCONN:postfix_smtp_lostconn_data}
POSTFIX_SMTP_SSLCONNERR SSL_connect error to %{POSTFIX_RELAY_INFO}: %{POSTFIX_LOSTCONN:postfix_smtp_lostconn_data}
POSTFIX_SMTP_LOSTCONN %{POSTFIX_QUEUEID:postfix_queueid}: %{POSTFIX_LOSTCONN:postfix_smtp_lostconn_data} with %{POSTFIX_RELAY_INFO}( while %{POSTFIX_LOSTCONN_REASONS:postfix_smtp_lostconn_reason})?
POSTFIX_SMTP_TIMEOUT %{POSTFIX_QUEUEID:postfix_queueid}: conversation with %{POSTFIX_RELAY_INFO} timed out( while %{POSTFIX_LOSTCONN_REASONS:postfix_smtp_lostconn_reason})?
POSTFIX_SMTP_RELAYERR %{POSTFIX_QUEUEID:postfix_queueid}: host %{POSTFIX_RELAY_INFO} said: %{GREEDYDATA:postfix_smtp_response} \(in reply to %{POSTFIX_SMTP_STAGE:postfix_smtp_stage} command\)
POSTFIX_SMTP_UTF8 host %{POSTFIX_RELAY_INFO} offers SMTPUTF8 support, but not 8BITMIME
# master patterns
POSTFIX_MASTER_START (daemon started|reload) -- version %{DATA:postfix_version}, configuration %{PATH:postfix_config_path}
POSTFIX_MASTER_EXIT terminating on signal %{INT:postfix_termination_signal}
# bounce patterns
POSTFIX_BOUNCE_NOTIFICATION %{POSTFIX_QUEUEID:postfix_queueid}: sender (non-delivery|delivery status|delay) notification: %{POSTFIX_QUEUEID:postfix_bounce_queueid}
# scache patterns
POSTFIX_SCACHE_LOOKUPS statistics: (address|domain) lookup hits=%{INT:postfix_scache_hits} miss=%{INT:postfix_scache_miss} success=%{INT:postfix_scache_success}%
POSTFIX_SCACHE_SIMULTANEOUS statistics: max simultaneous domains=%{INT:postfix_scache_domains} addresses=%{INT:postfix_scache_addresses} connection=%{INT:postfix_scache_connection}
POSTFIX_SCACHE_TIMESTAMP statistics: start interval %{SYSLOGTIMESTAMP:postfix_scache_timestamp}
# aggregate all patterns
POSTFIX_SMTPD %{POSTFIX_SMTPD_CONNECT}|%{POSTFIX_SMTPD_DISCONNECT}|%{POSTFIX_SMTPD_LOSTCONN}|%{POSTFIX_SMTPD_NOQUEUE}|%{POSTFIX_SMTPD_PIPELINING}|%{POSTFIX_TLSCONN}|%{POSTFIX_WARNING}|%{POSTFIX_SMTPD_PROXY}|%{POSTFIX_KEYVALUE}
POSTFIX_CLEANUP %{POSTFIX_CLEANUP_MESSAGEID}|%{POSTFIX_CLEANUP_MILTER}|%{POSTFIX_CLEANUP_PREPEND}|%{POSTFIX_WARNING}|%{POSTFIX_KEYVALUE}
POSTFIX_QMGR %{POSTFIX_QMGR_REMOVED}|%{POSTFIX_QMGR_ACTIVE}|%{POSTFIX_QMGR_EXPIRED}|%{POSTFIX_WARNING}
POSTFIX_PIPE %{POSTFIX_PIPE_ANY}
POSTFIX_POSTSCREEN %{POSTFIX_PS_CONNECT}|%{POSTFIX_PS_ACCESS}|%{POSTFIX_PS_NOQUEUE}|%{POSTFIX_PS_TOOBUSY}|%{POSTFIX_PS_CACHE}|%{POSTFIX_PS_DNSBL}|%{POSTFIX_PS_VIOLATIONS}|%{POSTFIX_WARNING}
POSTFIX_DNSBLOG %{POSTFIX_DNSBLOG_LISTING}|%{POSTFIX_WARNING}
POSTFIX_ANVIL %{POSTFIX_ANVIL_CONN_RATE}|%{POSTFIX_ANVIL_CONN_CACHE}|%{POSTFIX_ANVIL_CONN_COUNT}
POSTFIX_SMTP %{POSTFIX_SMTP_DELIVERY}|%{POSTFIX_SMTP_CONNERR}|%{POSTFIX_SMTP_SSLCONNERR}|%{POSTFIX_SMTP_LOSTCONN}|%{POSTFIX_SMTP_TIMEOUT}|%{POSTFIX_SMTP_RELAYERR}|%{POSTFIX_TLSCONN}|%{POSTFIX_WARNING}|%{POSTFIX_SMTP_UTF8}|%{POSTFIX_TLSVERIFICATION}
POSTFIX_DISCARD %{POSTFIX_DISCARD_ANY}|%{POSTFIX_WARNING}
POSTFIX_LMTP %{POSTFIX_SMTP}
POSTFIX_PICKUP %{POSTFIX_KEYVALUE}
POSTFIX_TLSPROXY %{POSTFIX_TLSPROXY_CONN}|%{POSTFIX_WARNING}
POSTFIX_MASTER %{POSTFIX_MASTER_START}|%{POSTFIX_MASTER_EXIT}|%{POSTFIX_WARNING}
POSTFIX_BOUNCE %{POSTFIX_BOUNCE_NOTIFICATION}
POSTFIX_SENDMAIL %{POSTFIX_WARNING}
POSTFIX_POSTDROP %{POSTFIX_WARNING}
POSTFIX_SCACHE %{POSTFIX_SCACHE_LOOKUPS}|%{POSTFIX_SCACHE_SIMULTANEOUS}|%{POSTFIX_SCACHE_TIMESTAMP}
POSTFIX_TRIVIAL_REWRITE %{POSTFIX_WARNING}
POSTFIX_TLSMGR %{POSTFIX_WARNING}
POSTFIX_LOCAL %{POSTFIX_KEYVALUE}|%{POSTFIX_WARNING}
POSTFIX_VIRTUAL %{POSTFIX_SMTP_DELIVERY}
POSTFIX_ERROR %{POSTFIX_ERROR_ANY}
POSTFIX_POSTSUPER %{POSTFIX_POSTSUPER_ACTION}|%{POSTFIX_POSTSUPER_SUMMARY}

13
t/03-load.t Normal file
View File

@ -0,0 +1,13 @@
#!perl -T
use 5.006;
use strict;
use warnings;
use Test::More;
plan tests => 1;
BEGIN {
use_ok( 'Search::ESsearcher::Templates::httpAccess' ) || print "Bail out!\n";
}
diag( "Testing Search::ESsearcher::Templates::httpAccess $Search::ESsearcher::Templates::httpAccess::VERSION, Perl $], $^X" );

13
t/04-load.t Normal file
View File

@ -0,0 +1,13 @@
#!perl -T
use 5.006;
use strict;
use warnings;
use Test::More;
plan tests => 1;
BEGIN {
use_ok( 'Search::ESsearcher::Templates::postfix' ) || print "Bail out!\n";
}
diag( "Testing Search::ESsearcher::Templates::postfix $Search::ESsearcher::Templates::postfix::VERSION, Perl $], $^X" );