selectively NOT archiving
Miguel Fernandes
maf at eurotux.com
Thu Sep 13 11:20:45 CEST 2012
Hi!
I've been trying to get this to work too.
I don't want clean emails above a certain size to be archived.
Michael Scheidell's code runs in 'before_send', but at that point the
disk archive has already been written...
The only way I got this sort of working was to create a new custom hook
that is called just before the archive is written to disk, in method
do_notify_and_quarantine (before line 15737):
(...)
  my($hdr_edits) = prepare_header_edits_for_quarantine($msginfo);
  #patch2 begin
  # give the custom plugin a chance to modify @q_tuples before the
  # quarantine/archive is actually written out
  my($conn) = $msginfo->conn_obj;
  my($custom_object) = Amavis::Custom->new($conn,$msginfo);
  if (ref $custom_object) {
    my($which_section) = "custom-before_clean_quarantine";
    eval {
      @q_tuples =
        $custom_object->before_clean_quarantine($conn,$msginfo,\@q_tuples);
      do_log(0, "[SELECTIVE ARCHIVE]: '" . Dumper(@q_tuples) . "'");
      update_current_log_level(); 1;
    } or do {
      my $eval_stat = $@ ne '' ? $@ : "errno=$!"; chomp $eval_stat;
      do_log(-1, "custom before_clean_quarantine error: %s", $eval_stat);
    };
    section_time($which_section);
  }
  #patch2 end
  if (@q_tuples) {
(...)
What I want to change is @q_tuples, so that no reference to the disk
quarantine is left in it.
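If I only wanted to drop the disk copy while keeping, say, an SQL
quarantine entry, something like the grep below might be enough. This is
only a sketch: it assumes each element of @q_tuples is an array ref whose
first field names the quarantine method (e.g. 'local:...'), which I have
not verified against the amavisd source, so treat that field index as
hypothetical.

  # hypothetical sketch: keep every quarantine destination except ones
  # that look like the local (on-disk) method; treating $_->[0] as the
  # method name is an assumption about the tuple layout
  @$q_tuples = grep {
    !(ref $_ eq 'ARRAY' && defined $_->[0] && $_->[0] =~ /^local:/)
  } @$q_tuples;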
The plugin:
package Amavis::Custom;
# loaded via amavisd-custom.conf
use strict;
use Data::Dumper;
use DBI qw(:sql_types);
use DBD::mysql;

my $__archive_quarantine_in;

BEGIN {
  import Amavis::Conf qw(:platform :confvars c cr ca $myhostname
                         @lookup_sql_dsn $sa_mail_body_size_limit);
  import Amavis::Util qw(do_log untaint safe_encode safe_decode);
}

sub new {
  my($class,$conn,$msginfo) = @_;
  my($self) = bless {}, $class;
  my($conn_h) = Amavis::Out::SQL::Connection->new(@lookup_sql_dsn);
  $self->{'conn_h'} = $conn_h;
  $self;  # returning an object activates further callbacks
}

sub before_clean_quarantine {
  my($self,$conn,$msginfo,$q_tuples) = @_;
  my($ll) = 0;  # log level (0 is the most important level, 1, 2, ... 5 less so)
  my($too_large) = $msginfo->msg_size > $sa_mail_body_size_limit;
  my($is_clean)  = $msginfo->is_in_contents_category(CC_CLEAN);
  # $msginfo->is_in_contents_category(
  #   {CC_SPAMMY,1, CC_SPAM,1, CC_BANNED,1, CC_VIRUS,1} );

  do_log($ll, "[SELECTIVE ARCHIVE]: '" . $msginfo->mail_id . "'");
  my($filename) = "/var/virusmails/" .
                  substr($msginfo->mail_id,0,1) . "/" . $msginfo->mail_id . ".gz";
  if (-e $filename) {
    do_log($ll, "[SELECTIVE ARCHIVE]: file: '" . $filename . "' exists");
  } else {
    do_log($ll, "[SELECTIVE ARCHIVE]: file: '" . $filename . "' does not exist");
  }

  # message test starts here
  if ($is_clean) {
    if ($too_large) {
      do_log($ll, "[SELECTIVE ARCHIVE]: UNWANTED Clean message too big (" .
        int($msginfo->msg_size/1024) . "k > " .
        ($sa_mail_body_size_limit/1024) . "k)");
      @$q_tuples = [];  # see the note below on whether this is the best way
    } else {
      do_log($ll, "[SELECTIVE ARCHIVE]: Small Clean message (" .
        int($msginfo->msg_size/1024) . "k <= " .
        ($sa_mail_body_size_limit/1024) . "k) OK");
      do_log($ll, "[SELECTIVE ARCHIVE]: " . Dumper($msginfo));
    }
  } else {
    do_log($ll, "[SELECTIVE ARCHIVE]: Message is not clean");
  }
  return @$q_tuples;
}

1;  # a custom config file must end with a true value
I don't know if "@$q_tuples=[]" is the best way of doing this, but it works.
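As for "@$q_tuples=[]": a plain "@$q_tuples = ();" might be the cleaner
idiom. It empties the array the reference points to, so the
"if (@q_tuples)" test back in do_notify_and_quarantine sees a genuinely
empty list; assigning [] instead leaves a single element (an empty array
ref) in there. A minimal sketch, untested in my setup:

  # empty the caller's array in place, so "if (@q_tuples)" in
  # do_notify_and_quarantine evaluates to false
  @$q_tuples = ();
  return @$q_tuples;  # empty list back to the caller's assignment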
There is one problem: "my($is_clean) =
$msginfo->is_in_contents_category( CC_CLEAN );" does not work; this test
is always true, even for spam messages...
I suppose that at this point in the code, that information is not
available yet?
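One workaround that might be worth trying (untested) is to invert
Michael's check from his before_send below, i.e. treat the message as
clean only when it is not in any of the "bad" categories, instead of
asking for CC_CLEAN directly:

  # untested sketch: negate the "bad categories" test rather than
  # testing CC_CLEAN itself
  my($not_clean) = $msginfo->is_in_contents_category(
                     {CC_SPAMMY,1, CC_SPAM,1, CC_BANNED,1, CC_VIRUS,1} );
  my($is_clean)  = !$not_clean;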
What is the best way of doing this?
Thank you!
On 07/20/2011 09:46 PM, Michael Scheidell wrote:
> I have a need to selectively NOT archive clean emails under certain
> circumstances.
> We archive clean email on some servers, NOT because we want the
> emails, but because we want to feed VIRGIN emails back to SA for learning
> (Exchange mashes the emails and headers; IMAP wasn't so bad, but EWS
> really mucks them up).
> HOWEVER, I do NOT want to archive CLEAN emails > 400K.
> (I still want to archive large viruses, attachments, and spam)
>
> I have (almost) got this down, but just need the last step. SQL queries
> work, I can calculate size and read values; I just want to DISABLE
> archiving for LARGE clean emails (note: maybe I am doing it in the
> wrong place, maybe I need a per-user loop, since one user's clean is
> another user's spam, but then again, maybe the flags are set on
> is_in_contents_category just fine).
>
>
> using amavisd-custom.conf
>
> use strict;
> use DBI qw(:sql_types);
> use DBD::mysql;
> my $__archive_quarantine_in;
> BEGIN {
>   import Amavis::Conf qw(:platform :confvars c cr ca $myhostname
>                          $clean_quarantine_method @lookup_sql_dsn
>                          $sa_mail_body_size_limit);
>   import Amavis::Util qw(do_log untaint safe_encode safe_decode);
> }
>
> sub new {
>   my($class,$conn,$msginfo) = @_;
>   my($self) = bless {}, $class;
>   my($conn_h) = Amavis::Out::SQL::Connection->new(@lookup_sql_dsn);
>   $self->{'conn_h'} = $conn_h;
>   $self;  # returning an object activates further callbacks
> }
>
> sub before_send {
>   my($self,$conn,$msginfo) = @_;
>   my($ll) = 3;  # log level (0 is the most important level, 1, 2, ... 5 less so)
>   my($too_large) = $msginfo->msg_size > $sa_mail_body_size_limit;
>   my($already_quarantined) = $msginfo->is_in_contents_category(
>     {CC_SPAMMY,1, CC_SPAM,1, CC_BANNED,1, CC_VIRUS,1} );
>
>   if ($too_large) {
>     if (!$already_quarantined && $clean_quarantine_method =~ /sql:/) {
>       do_log(0, "CUSTOM: UNWANTED = " . int($msginfo->msg_size/1024) .
>         "k > " . ($sa_mail_body_size_limit/1024) . "k");
>       # I want to NOT archive if it hits here.
>     }
>   }
> }
>
> --
> Michael Scheidell, CTO
> o: 561-999-5000
> d: 561-948-2259
> SECNAP Network Security Corporation