- Ignore unknown forensic report fields when generating CSVs (Closes issue #148)
- Fix crash on IMAP timeout (PR #164 - closes issue #163)
- Use SMTP port from the config file when sending emails (PR #151)
- Add support for Elasticsearch 7.0 (PR #161 - closes issue #149)
- Remove temporary workaround for DMARC aggregate report records missing an SPF domain field
- Use system nameservers instead of Cloudflare by default
- Parse aggregate report records with missing SPF domains
- Require `mailsuite>=1.5.4`
- Use `match_phrase` instead of `match` when looking for existing strings in Elasticsearch (see the sketch below)
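A minimal sketch of the difference using the `elasticsearch-dsl` query API; the index name and `report_id` value below are illustrative assumptions, not parsedmarc's actual query:

```python
from elasticsearch_dsl import Search

# "match" analyzes the query text, so any single token ("abc" or "123") can
# match, which may return unrelated documents when checking for an existing report.
loose = Search(index="dmarc_aggregate*").query("match", report_id="abc-123")

# "match_phrase" requires the analyzed tokens to appear together and in order,
# which behaves much closer to an exact-string lookup for existing values.
exact = Search(index="dmarc_aggregate*").query("match_phrase", report_id="abc-123")
```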
- Display warning when `GeoLite2-Country.mmdb` is missing, instead of trying to download it
- Add documentation for MaxMind `geoipupdate` changes on January 30th, 2019 (closes issues #137 and #139)
- Require `mail-parser>=3.11.0`
- Update dependencies
- Make `dkim_aligned` and `spf_aligned` case insensitive (PR #132)
- Fix SPF results field in CSV output (closes issue #128)
- Parse forensic email samples with non-standard date headers
- Graceful handling of a failure to download the GeoIP database (issue #123)
- Fix typos (PR #119)
- Make CSV output match JSON output (issue #22)
- Graceful processing of invalid aggregate DMARC reports (PR #122)
- Remove Python 3.4 support
- Close files after reading them
- Set a configurable default IMAP timeout of 30 seconds
- Set a configurable maximum of 4 IMAP timeout retry attempts
- Add support for reading `MBOX` files
- Set a configurable Elasticsearch timeout of 60 seconds
- Set a minimum `publicsuffix2` version
- Bump required `mailsuite` version to `1.2.1`
- Fix typos in the CLI documentation
- Bump required `mailsuite` version to `1.1.1`
- Merge PR #100 from michaeldavie
- Correct a bug introduced in 6.5.1 that caused only the last record's data to be used for each row in an aggregate report's CSV version.
- Use `mailsuite` 1.1.0 to fix issues with some IMAP servers (closes issue #103)
  - Always use `/` as the folder hierarchy separator, and convert to the server's hierarchy separator in the background
  - Always remove folder name characters that conflict with the server's hierarchy separators
  - Prepend the namespace to the folder path when required
- Merge PR #98 from michaeldavie
- Add functions:
  - `parsed_aggregate_reports_to_csv_row(reports)`
  - `parsed_forensic_reports_to_csv_row(reports)`
- Require `dnspython>=1.16.0`
- Move mail processing functions to the `mailsuite` package
- Add offline option (closes issue #90)
- Use UDP instead of TCP, and properly set the timeout when querying DNS (closes issues #79 and #92)
- Log the current file path being processed when `--debug` is used (closes issue #95)
- Do not attempt to convert `org_name` to a base domain if `org_name` contains a space (closes issue #94)
- Always lowercase the `header_from`
- Provide a more helpful warning message when `GeoLite2-Country.mmdb` is missing
- Raise `utils.DownloadError` exception when a GeoIP database or Public Suffix List (PSL) download fails (closes issue #73)
- Add `number_of_shards` and `number_of_replicas` as possible options in the `elasticsearch` configuration file section (closes issue #78; example below)
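A hedged example of what that configuration section might look like; the `hosts` value and exact option spellings here are assumptions for illustration, not authoritative defaults:

```ini
[elasticsearch]
hosts = 127.0.0.1:9200
number_of_shards = 1
number_of_replicas = 0
```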
- Work around some unexpected IMAP responses reported in issue #75
- Work around some unexpected IMAP responses reported in issue #70
- Show correct destination folder in debug logs when moving aggregate reports
- Normalize `Delivery-Result` value in forensic/failure reports (issue #76). Thanks to Freddie Leeman of URIports for the troubleshooting assistance.
- Fix Elasticsearch index creation (closes issue #74)
- Set `number_of_shards` and `number_of_replicas` to `1` when creating indexes
- Fix dependency conflict
- Fix the `monthly_indexes` option in the `elasticsearch` configuration section
- Fix `strip_attachment_payloads` option
- Fix IMAP IDLE response processing for some mail servers (#67)
- Exit with a critical error when required settings are missing (#68)
- XML parsing fixes (#69)
- Add IMAP responses to debug logging
- Add `smtp` option `skip_certificate_verification`
- Add `kafka` option `skip_certificate_verification` (see the example below)
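A hedged configuration sketch for these two options; the section names and boolean formatting are assumptions based on the option names above:

```ini
[smtp]
skip_certificate_verification = True

[kafka]
skip_certificate_verification = True
```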
- Suppress `mailparser` logging output
- Suppress `msgconvert` warnings
- Fix crash when trying to save forensic reports with missing fields to Elasticsearch
- Add missing `tqdm` dependency to `setup.py`
- Add support for multi-process parallelized processing via CLI (Thanks zscholl - PR #62)
- Save sha256 hashes of attachments in forensic samples to Elasticsearch
- Actually fix GeoIP lookups
- Fix GeoIP lookups
- Better GeoIP error handling
- Always use Cloudflare's nameservers by default instead of Google's
- Avoid re-downloading the GeoLite2 database (and tripping their DDoS protection)
- Add `geoipupdate` to install instructions
- Actually package requirements
- Fix package requirements
- Use local Public Suffix List file instead of downloading it
- Fix argument name for `send_email()` (closes issue #60)
- Fix aggregate report processing
- Check for the existence of a configuration file if a path is supplied
- Replace `publicsuffix` with `publicsuffix2`
- Add minimum versions to requirements
- Fix aggregate report email parsing regression introduced in 6.0.3 (closes issue #57)
- Fix Davmail support (closes issue #56)
- Don't assume the report is the last part of the email message (issue #55)
- IMAP connectivity improvements (issue #53)
- Use a temp directory for temp files (issue #54)
- Fix Elasticsearch output (PR #50 - andrewmcgilvray)
- Move options from CLI to a config file (see updated installation documentation)
- Refactoring to make argument names consistent
- Fix crash on invalid forensic report sample (Issue #47)
- Fix DavMail support (Issue #45)
- Remove unnecessary debugging code
- Add filename and line number to logging output
- Improved IMAP error handling
- Add CLI options:
  - `--elasticsearch-use-ssl`: Use SSL when connecting to Elasticsearch
  - `--elasticsearch-ssl-cert-path ELASTICSEARCH_SSL_CERT_PATH`: Path to the Elasticsearch SSL certificate
  - `--elasticsearch-monthly-indexes`: Use monthly Elasticsearch indexes instead of daily indexes
  - `--log-file LOG_FILE`: Output logging to a file
- Remove `urllib3` version upper limit
- Work around unexpected Office365/Exchange IMAP responses
- Bugfix: Crash when parsing invalid forensic report samples (#38)
- Bugfix: Crash when IMAP connection is lost
- Increase default Splunk HEC response timeout to 60 seconds
- Bugfix: Submit aggregate dates to Elasticsearch as lists, not tuples
- Support `elasticsearch-dsl<=6.3.0`
- Add support for TLS/SSL and username/password auth to Kafka
- Revert to using `publicsuffix` instead of `publicsuffix2`
- Use `publicsuffix2` (closes issue #4)
- Add Elasticsearch to automated testing
- Lock `elasticsearch-dsl` required version to `6.2.1` (closes issue #25)
- Note: Re-importing `kibana_saved_objects.json` in Kibana is required when upgrading to this version!
- Bugfix: Reindex the aggregate report index field `published_policy.fo` as `text` instead of `long` (closes issue #31)
- Bugfix: IDLE email processing in Gmail/G-Suite accounts (closes issue #33)
- Bugfix: Fix inaccurate DNS timeout in CLI documentation (closes issue #34)
- Bugfix: Forensic report processing via CLI
- Bugfix: Duplicate aggregate report Elasticsearch query broken
- Bugfix: Crash when the `Arrival-Date` header is missing in a forensic/failure/ruf report
- IMAP reliability improvements
- Save data in separate indexes each day to make managing data retention easier
- Cache DNS queries in memory
- Don't crash if Elasticsearch returns an unexpected result (workaround for issue #31)
- Packaging fixes
- Kafka output improvements
  - Moved some key values (`report_id`, `org_email`, `org_name`) higher in the JSON structure
  - Recreated the `date_range` values from the ES client for easier parsing
  - Started sending individual record slices: the Kafka default message size is 1 MB and some aggregate reports were exceeding it, so metadata is now appended and records are sent one by one (sketch below)
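A rough sketch of the record-slicing idea described above, not parsedmarc's actual Kafka code; the topic name, field names, and use of the `kafka-python` client are illustrative assumptions:

```python
import json
from copy import deepcopy

from kafka import KafkaProducer  # kafka-python client (assumed for illustration)

producer = KafkaProducer(
    bootstrap_servers=["localhost:9092"],
    value_serializer=lambda value: json.dumps(value).encode("utf-8"),
)


def send_aggregate_report(report, topic="dmarc_aggregate"):
    """Send one message per record so no single message nears the ~1 MB default limit."""
    metadata = {key: value for key, value in report.items() if key != "records"}
    for record in report.get("records", []):
        message = deepcopy(metadata)   # attach the report metadata to every slice
        message["record"] = record
        producer.send(topic, message)
    producer.flush()
```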
- Fix decoding of attachments inside forensic samples
- Add CLI option `--imap-skip-certificate-verification`
- Add optional `ssl_context` argument for `get_dmarc_reports_from_inbox()` and `watch_inbox()`
- Debug logging improvements
- When checking an inbox, always recheck for messages when processing is complete
- Be more forgiving for forensic reports with missing fields
- Fix base64 attachment decoding (#26)
- Fix crash on empty aggregate report comments (brakhane - #25)
- Add SHA256 hashes of attachments to output
- Add `strip_attachment_payloads` option to functions and `--strip-attachment-payloads` option to the CLI (#23)
- Set `urllib3` version requirements to match `requests`
- Fix forensic report email processing
- Fix normalization of the forensic sample from address
- Fix parsing of some emails
- Fix duplicate forensic report search for Elasticsearch
- Fix bug where `parsedmarc` would always try to save to Elasticsearch, even if only `--hec` was used
- Add options to save reports as a Kafka topic (mikesiegel - #21)
- Major refactoring of functions
- Support parsing forensic reports generated by Brightmail
- Make `sample_headers_only` flag more reliable
- Functions that might be useful to other projects are now stored in `parsedmarc.utils` (usage sketch below):
  - `get_base_domain(domain)`
  - `get_filename_safe_string(string)`
  - `get_ip_address_country(ip_address)`
  - `get_ip_address_info(ip_address, nameservers=None, timeout=2.0)`
  - `get_reverse_dns(ip_address, nameservers=None, timeout=2.0)`
  - `human_timestamp_to_datetime(human_timestamp)`
  - `human_timestamp_to_timestamp(human_timestamp)`
  - `parse_email(data)`
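A short usage sketch based on the signatures listed above; the return values shown in the comments are illustrative, not guaranteed:

```python
from parsedmarc.utils import get_base_domain, get_reverse_dns

base_domain = get_base_domain("mail.example.co.uk")    # e.g. "example.co.uk"
reverse_dns = get_reverse_dns("8.8.8.8", timeout=2.0)  # e.g. "dns.google"
```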
- Save each aggregate report record as a separate Splunk event
- Fix IMAP delete action (#20)
- Suppress Splunk SSL validation warnings
- Change default logging level to `WARNING`
- Workaround for forensic/ruf reports that are missing `Arrival-Date` and/or `Reported-Domain`
- Be more forgiving of weird XML
- Remove any invalid XML schema tags before parsing the XML (#18)
- Fix typo in CLI parser
- Only move or delete IMAP emails after they all have been parsed
- Move/delete messages one at a time - do not exit on error
- Reconnect to IMAP if connection is broken during `get_dmarc_reports_from_inbox()`
- Add `--imap-port` and `--imap-no-ssl` CLI options
- Change default logging level to `ERROR`
- Fix crash introduced in 4.1.0 when creating Elasticsearch indexes (Issue #15)
- Fix packaging bug
- Add splunk instructions
- Reconnect reset IMAP connections when watching a folder
- Add options for Elasticsearch prefixes and suffixes
- If an aggregate report has the invalid `disposition` value `pass`, change it to `none`
- Use report timestamps for Splunk timestamps
- When saving aggregate reports in Elasticsearch, store `domain` in `published_policy`
- Rename `policy_published` to `published_policy` when saving aggregate reports to Splunk
- Add support for sending DMARC reports to a Splunk HTTP Events Collector (HEC)
- Use a browser-like `User-Agent` when downloading the Public Suffix List and GeoIP DB to avoid being blocked by security proxies
- Reduce default DNS timeout to 2.0 seconds
- Add alignment booleans to JSON output
- Fix `.msg` parsing CLI exception when `msgconvert` is not found in the system path
- Add `--outgoing-port` and `--outgoing-ssl` options
- Fall back to plain text SMTP if `--outgoing-ssl` is not used and `STARTTLS` is not supported by the server (see the sketch below)
- Always use `\n` as the newline when generating CSVs
- Workaround for random Exchange/Office365 `Server Unavailable` IMAP errors
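An illustrative sketch of that STARTTLS fallback logic using the standard-library `smtplib`; this is not parsedmarc's actual sending code:

```python
import smtplib


def connect_smtp(host: str, port: int = 25) -> smtplib.SMTP:
    """Try STARTTLS, and fall back to plain text if the server does not offer it."""
    server = smtplib.SMTP(host, port)
    server.ehlo()
    if server.has_extn("starttls"):
        server.starttls()
        server.ehlo()  # re-identify over the now-encrypted connection
    return server
```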
- Completely reset IMAP connection when a broken pipe is encountered
- Finish incomplete broken pipe fix
- Refactor to use a shared IMAP connection for inbox watching and message downloads
- Gracefully recover from broken pipes in IMAP
- Fix moving/deleting emails
- Fix crash when forensic reports are missing `Arrival-Date`
- Fix PEP 8 spacing
- Update build script to fail when CI tests fail
- Use `COPY` and delete if an IMAP server does not support `MOVE` (closes issue #9)
- Reduce IMAP `IDLE` refresh rate to 5 minutes to avoid session timeouts in Gmail
- Fix parsing of some forensic/failure/ruf reports
- Include email subject in all warning messages
- Fix example NGINX configuration in the installation documentation (closes issue #6)
- Fix `nameservers` option (mikesiegel)
- Move or delete invalid report emails in an IMAP inbox (closes issue #7)
- Better handling of `.msg` files when `msgconvert` is not installed
- Use `.` instead of `/` as the IMAP folder hierarchy separator when `/` does not work - fixes Dovecot support (#5)
- Fix parsing of base64-encoded forensic report data
- Fix saving attachment from forensic sample to Elasticsearch
- Change uses of the `DocType` class to `Document`, to properly support `elasticsearch-dsl` `6.2.0` (this also fixes use in pypy)
- Add documentation for installation under pypy
- Require `elasticsearch>=6.2.1,<7.0.0` and `elasticsearch-dsl>=6.2.1,<7.0.0`
- Update for class changes in `elasticsearch-dsl` `6.2.0`
- Fix bug where PSL would be called before it was downloaded if the PSL was older than 24 hours
- Parse aggregate reports with missing SPF domain
- Much more robust error handling
- Fix dashboard message counts for source IP addresses visualizations
- Improve dashboard loading times
- Improve dashboard layout
- Add country rankings to the dashboards
- Fix crash when parsing a report with an empty `<auth_results></auth_results>`
- Use Cloudflare's public DNS resolvers by default instead of Google's
- Fix installation from virtualenv
- Fix documentation typos
- Documentation fixes
- Fix console output
- Maintain IMAP IDLE state when watching the inbox
- The `-i`/`--idle` CLI option is now `-w`/`--watch`
- Improved Exception handling and documentation
- Fix errors when saving to Elasticsearch
- Fix existing aggregate report error message
- Fix existing aggregate report query
- Add option to select the IMAP folder where reports are stored
- Add options to send data to Elasticsearch
- Use Google's public nameservers (`8.8.8.8` and `4.4.4.4`) by default
- Detect aggregate report email attachments by file content rather than file extension (see the sketch below)
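A minimal sketch of content-based detection using leading magic bytes; the function name and return values are illustrative assumptions, not parsedmarc's actual implementation:

```python
def detect_report_payload_type(payload: bytes) -> str:
    """Guess an aggregate report attachment type from its content, not its filename."""
    stripped = payload.lstrip()
    if payload.startswith(b"PK\x03\x04"):  # ZIP archive magic bytes
        return "zip"
    if payload.startswith(b"\x1f\x8b"):    # gzip magic bytes
        return "gzip"
    if stripped.startswith(b"<?xml") or stripped.startswith(b"<feedback"):
        return "xml"                       # bare XML aggregate report
    return "unknown"
```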
- If an aggregate report's `org_name` is an FQDN, the base domain is used
- Normalize aggregate report IDs
- Rename `parsed_dmarc_forensic_reports_to_csv()` to `parsed_forensic_reports_to_csv()` to match other functions
- Rename `parsed_aggregate_report_to_csv()` to `parsed_aggregate_reports_to_csv()` to match other functions
- Use local time when generating the default email subject
- Documentation fixes
- Add `get_report_zip()` and `email_results()`
- Add support for sending report emails via the command line
- Fix documentation
- Remove Python 2 code
- Parse forensic reports
- Parse reports from IMAP inbox
- Drop support for Python 2
- Command line output is always a JSON object containing the lists `aggregate_reports` and `forensic_reports`
- `-o`/`--output` option is now a path to an output directory, instead of an output file
- Add `extract_xml()` and `human_timestamp_to_datetime` methods
- Prefix public suffix and GeoIP2 database filenames with `.`
- Properly format errors list in CSV output
- Fix documentation formatting
- Fix more packaging flaws
- Fix packaging flaw
- Initial release