This document outlines the implementation plan for integrating the existing Fluentd Aggregator with the Kafka CDI bus to forward security and audit logs from the A4 EMS cluster.
The plan covers the configuration of the Fluentd Aggregator, the setup of Kafka topics, and the establishment of a secure connection between the A4 EMS cluster and the Kafka CDI bus.
As per the PSA process, all network elements and servers in the DT network must have logging of security-relevant data enabled and sent to the Security Operations Center Technology (SOCT). This implementation aims to fulfill this requirement efficiently using existing infrastructure.
[Include a diagram of the proposed architecture here]
Key components:
- A4 EMS Cluster
- Fluentd Aggregator
- Kafka CDI Bus
- Network Elements (OLT, Leaf & Spines, POD_SERVERs, LI-box, BOR)
Configure Fluentd to collect logs from various sources within the A4 EMS cluster:
<source>
  @type tail
  path /var/log/soct/*.log
  pos_file /var/log/td-agent/soct.log.pos
  tag soct.*
  <parse>
    @type json
  </parse>
</source>
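The tail source expects one JSON object per line. Before relying on Fluentd pickup, a sample event can be sanity-checked against that shape; the event below is hypothetical and its field names are illustrative, not a required schema:

```shell
# Hypothetical security event in the one-JSON-object-per-line shape
# that the @type json parser expects; field names are illustrative.
SAMPLE='{"timestamp":"2024-01-01T00:00:00Z","severity":"warning","host":"ems-node1","message":"failed login"}'
# Confirm it parses as JSON, which Fluentd's json parser will require
echo "$SAMPLE" | python3 -m json.tool >/dev/null && echo "valid JSON"
```

Appending such a line to a file under /var/log/soct/ should produce an event tagged with the expanded file path (from `tag soct.*`).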
Add filters to process and standardize log formats:
<filter soct.**>
  @type record_transformer
  <record>
    hostname ${hostname}
    environment ${tag_parts[1]}
  </record>
</filter>
Configure Fluentd to send logs to Kafka:
<match soct.**>
  @type kafka2
  brokers kafka-broker1:9092,kafka-broker2:9092,kafka-broker3:9092
  default_topic soct_logs
  <format>
    @type json
  </format>
  compression_codec gzip
  required_acks 1
  <buffer>
    @type file
    path /var/log/td-agent/buffer/kafka
    flush_interval 5s
  </buffer>
</match>
Create the necessary Kafka topic:
kafka-topics.sh --create --topic soct_logs --bootstrap-server kafka-broker1:9092 --partitions 3 --replication-factor 2
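Once created, the topic's partition count and replica assignment can be verified with the same tool (broker address as above):

```shell
kafka-topics.sh --describe --topic soct_logs --bootstrap-server kafka-broker1:9092
```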
Enable SSL/TLS for Kafka:
- Generate SSL certificates
- Configure Kafka server.properties
- Update Fluentd Kafka output plugin with SSL settings
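As a sketch of the third step, the kafka2 output above can be extended with the plugin's SSL options once certificates are in place; the certificate file paths below are placeholders, and the broker port changes to the SSL listener (9093):

```
<match soct.**>
  @type kafka2
  brokers kafka-broker1:9093,kafka-broker2:9093,kafka-broker3:9093
  ssl_ca_cert /etc/td-agent/certs/ca.pem
  ssl_client_cert /etc/td-agent/certs/client.pem
  ssl_client_cert_key /etc/td-agent/certs/client-key.pem
  ...
</match>
```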
Open necessary ports between A4 EMS Cluster and Kafka CDI:
- Kafka Broker Port: 9092 (or 9093 for SSL)
- Zookeeper Port: 2181 (if required)
Ensure proper DNS resolution for Kafka brokers from the A4 EMS Cluster.
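Name resolution and port reachability can be spot-checked from an A4 EMS node with standard tooling (broker names as configured above):

```shell
# Confirm DNS resolution for each broker
getent hosts kafka-broker1 kafka-broker2 kafka-broker3
# Confirm the broker port is reachable (use 9093 for the SSL listener)
nc -zv kafka-broker1 9092
```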
Test individual components:
- Fluentd log collection
- Fluentd to Kafka communication
- Kafka topic accessibility
Test the entire flow from log generation to Kafka consumption.
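End-to-end flow can be verified by consuming a few records from the topic, run from any host with the Kafka CLI tools and network access to the brokers:

```shell
kafka-console-consumer.sh --topic soct_logs \
  --bootstrap-server kafka-broker1:9092 \
  --from-beginning --max-messages 5
```

Records should appear as the JSON documents emitted by the Fluentd match block above.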
Conduct load tests to ensure the system can handle expected log volumes.
Monitor the following metrics:
- Fluentd buffer queue length
- Kafka producer success rate
- Kafka consumer lag
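Consumer lag can be checked per consumer group with the standard Kafka tooling; the group name below is hypothetical and should match whatever the SOCT consumers register as:

```shell
kafka-consumer-groups.sh --describe --group soct-consumers \
  --bootstrap-server kafka-broker1:9092
```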
Set up alerts for:
- Fluentd errors
- Kafka connectivity issues
- Unusual spikes in log volume
Pre-deployment checklist:
- All configurations tested in a staging environment
- Rollback plan prepared
- All team members briefed on deployment steps
Deployment steps:
- Update Fluentd configuration
- Create Kafka topics
- Configure network settings
- Start log forwarding
- Validate data flow
Post-deployment validation:
- Confirm logs are flowing to Kafka
- Verify log integrity and format
- Check for any errors or warnings
Routine maintenance tasks:
- Regular log rotation
- Kafka topic compaction
- SSL certificate renewal
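Log rotation for the collected files could be handled with a standard logrotate policy; a minimal sketch, assuming the /var/log/soct path from the Fluentd source (retention values illustrative):

```
/var/log/soct/*.log {
    daily
    rotate 14
    compress
    missingok
    notifempty
    copytruncate
}
```

`copytruncate` keeps the original file in place rather than moving it, so the Fluentd tail plugin's position tracking is not disrupted by rotation.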
Include common issues and their resolutions.
Ensure all data in transit is encrypted using SSL/TLS.
Implement proper authentication and authorization for Kafka access.
Maintain up-to-date documentation on:
- System architecture
- Configuration details
- Operational procedures
Outline training requirements for:
- Operations team
- Support team
- Security team
Potential future enhancements:
- Kafka cluster expansion
- Fluentd performance optimizations
Plan for incorporating logs from future network elements or services.
A. Configuration Files
B. Command Reference
C. Troubleshooting Flowcharts