CheckMates : Products : Quantum : Management : Fastest way to export to CSV
Fastest way to export to CSV
Hi,
I need to export 2 months of logging into CSV format. Following https://support.checkpoint.com/results/sk/sk118521 I added the -n option, but an export of a 2 GB log file takes about 3 to 4 hours, and we generate about 5 GB of logging per day.
Are there any other options I should add to speed up the process?
My current syntax is:
fw log -n ${DOMAINLOGDIR}/${LOGFILE} > ${EXPORTDIR}/${LOGFILE}.txt
Have you considered using the SmartView Web Application? It allows exporting up to one million logs.
Perhaps you can filter out certain data to reduce the size and the time it takes.
(It exists in prior versions as well.)
That is true, but it's worth mentioning that this does not work on a Smart-1 Cloud (S1C) instance, where it's limited to 1,000 logs.
Andy
Ransomware investigation, which means they want ALL of it over the 2 months. And it's about 5 GB per day, so any filtering is out of the question.
Tal's answer is your best path forward. I worked with a customer to do this in bulk once, and because Check Point stores the logs in an efficient binary format, decompressing them to a CSV file loses that efficiency; I was seeing up to a 5x expansion. It also takes a lot of time. We had to export 100 GB of logs and ran a script to let it run; it took a while.
-p should speed it up; -n is the big one, and you already have that.
From the docs:
-p
Specifies not to perform resolution of the port numbers in the log file (this is the default behavior).
This significantly speeds up the log processing.
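Putting -n and -p together with the syntax from the original question gives the sketch below. Everything outside the fw log line is scaffolding and assumption: since fw only exists on a Check Point management or log server, a stub fw function is defined so the sketch can be tried anywhere, and the paths and file name are placeholders.

```shell
# "fw" exists only on a Check Point management or log server, so this
# sketch defines a fake stand-in that just prints two sample lines --
# purely so the pipeline runs anywhere. Drop the stub on a real box.
if ! command -v fw >/dev/null 2>&1; then
    fw() { printf 'rec1\nrec2\n'; }
fi

DOMAINLOGDIR=/tmp                 # assumption: your domain log directory
EXPORTDIR=/tmp                    # assumption: your export directory
LOGFILE=2024-05-01_000000.log     # assumption: an example log file name

# -n skips DNS resolution of IPs, -p skips port-name resolution
fw log -n -p "${DOMAINLOGDIR}/${LOGFILE}" > "${EXPORTDIR}/${LOGFILE}.txt"
wc -l < "${EXPORTDIR}/${LOGFILE}.txt"
```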
Did you find a way to speed up the process?
I am in the same situation. We have other vendors, and with them it is extremely simple and fast to extract the raw logs to CSV.
I am doing the same process with Check Point, but it is far too slow because the file it generates is huge.
You can use "fwm logexport -i ${log_file} -p -n -z -s", optionally running the output through awk for some filtering if you want. Instead of redirecting the output directly to a file, pipe it through an inline "gzip -9c":

for log_file in ${FWDIR}/log/*.log; do
    export_file=$(basename ${log_file}).exported.gz
    fwm logexport -i ${log_file} -n -p -z -s |\
    awk -F '\xFF' '/some_filter/ { print $0 }' |\
    gzip -9c > /var/log/SOME_DIR/${export_file}
done
Keep in mind, though, that in each file:
1) fields are separated by 0xFF (ASCII 255) because of the "-s" option. Remove it to make the output semicolon-separated, depending on your needs, but change your awk field separator to match (-F ";")
2) the column headers are included in the first line
3) NOT ALL FILES will have the same number of fields, nor will they always have the same fields
The importer will need to take this into account, but it has all the data because the column headers are included.
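To make point 3 concrete, here is a small self-contained sketch of the parsing problem (the sample data and field names are invented for illustration): the first line carries the headers, and a later record may be missing trailing fields, so the importer should pair values with header names instead of assuming a fixed column count.

```shell
# Build a tiny sample export: a header line plus two records, the second
# of which is missing its last field (sample data is invented).
SEP=$(printf '\377')   # 0xFF, the separator produced by "fwm logexport -s"
printf 'time%sproduct%saction\n10:00%sfw%saccept\n10:01%svpn\n' \
    "$SEP" "$SEP" "$SEP" "$SEP" "$SEP" > /tmp/sample.export

# Pair each value with its header name; missing trailing fields print as "-".
result=$(LC_ALL=C awk -F "$SEP" '
NR == 1 { for (i = 1; i <= NF; i++) hdr[i] = $i; ncols = NF; next }
{
    for (i = 1; i <= ncols; i++)
        printf "%s=%s%s", hdr[i], (i <= NF ? $i : "-"), (i < ncols ? " " : "\n")
}' /tmp/sample.export)
printf '%s\n' "$result"
```

This prints one "name=value" pair per field, with "action=-" on the second record where the field is absent.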
Then SCP/SFTP the files off somewhere, or mount a CIFS volume directly (be sure to specify the domain name together with the username: mount //server/share /mnt/cifs -t cifs -o username=AD_DOMAIN/AD_User).
We had a requirement to export logs, and the daily export took more than 2 hours on average. We added -s & -e, specifying the previous full 1-hour period, and executed the export hourly. Depending on the time of day and the amount of traffic, hourly exports took from seconds to 5 minutes, which I could live with.
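A sketch of that hourly-window approach is below. The exact time-range options and the accepted timestamp format vary by version (the poster used -s/-e; the -b "between" form shown here is an assumption), so check the fw log help on your release. The sketch only computes the previous full hour and prints the command it would run, as a dry run.

```shell
# Compute the previous full hour as a start/end window (GNU date).
START=$(date -d '1 hour ago' '+%Y-%m-%d %H:00:00')
END=$(date '+%Y-%m-%d %H:00:00')

# Dry run: print the export command instead of executing it. The -b
# "between" form and the time format are assumptions -- verify against
# the fw log help on your version before scheduling this via cron.
CMD="fw log -n -p -b \"$START\" \"$END\" \$FWDIR/log/fw.log"
echo "$CMD"
```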
