Hugo_vd_Kooij
Advisor

Fastest way to export to CSV

Hi,

I need to export 2 months of logging into CSV format. Following https://support.checkpoint.com/results/sk/sk118521 I added the -n option, but an export of a 2 GB log file still takes about 3 to 4 hours, and we generate about 5 GB of logging per day.

Are there any other options I can add to speed up the process?

My current syntax is:

fw log -n ${DOMAINLOGDIR}/${LOGFILE} > ${EXPORTDIR}/${LOGFILE}.txt

 

<< We make miracles happen while you wait. The impossible jobs take just a wee bit longer. >>
8 Replies
Tal_Paz-Fridman
Employee

Have you considered using the SmartView Web Application? It allows exporting up to one million logs.

Perhaps you can filter out certain data and reduce the size and time it takes.

https://sc1.checkpoint.com/documents/R81.20/WebAdminGuides/EN/CP_R81.20_LoggingAndMonitoring_AdminGu...

(This exists in prior versions as well.)

the_rock
Legend

That is true, BUT it is worth mentioning that this does not work if you are using an S1C instance; there it is limited to 1k logs.

Andy

Hugo_vd_Kooij
Advisor

Ransomware investigation, which means they want ALL of it over the 2 months. And it's about 5 GB per day, so any filtering is out of the question.

<< We make miracles happen while you wait. The impossible jobs take just a wee bit longer. >>
Joseph_Audet
Ambassador

Tal's answer is your best path forward. I worked with a customer to do this in bulk once, and because CP stores the logs in an efficient binary format, when you decompress them to a CSV file they lose that efficiency; I was seeing up to a 5x expansion. It also takes a lot of time. We had to export 100 GB of logs and ran a script to let it run; it took a while.

Lloyd_Braun
Collaborator

-p should speed it up; -n is the big one, and you already have that.

 

From the docs:

-p

Specifies not to perform resolution of the port numbers in the log file (this is the default behavior).

This significantly speeds up the log processing.
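
Combining that with the command from the opening post, a minimal sketch (variable names taken from the original post) would be:

# Sketch: add -p (skip port-number resolution) to the existing export command.
# DOMAINLOGDIR, LOGFILE and EXPORTDIR are the variables from the original post.
fw log -n -p ${DOMAINLOGDIR}/${LOGFILE} > ${EXPORTDIR}/${LOGFILE}.txt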

mrflow1
Explorer

Did you find a way to speed up the process?

I am in the same situation. With our other vendors it is extremely simple and fast to extract the raw logs to CSV.

Here with Check Point the process is very slow because the file it generates is very large.

Duane_Toler
Advisor

You can use "fwm logexport -i ${log_file} -p -n -z -s", and optionally run the output through awk for some filtering if you want. Instead of redirecting the output directly to a file, pipe it into an inline "gzip -9c > /var/log/${export_file}.gz":

for log_file in ${FWDIR}/log/*.log; do
  # Strip the directory so the export lands under /var/log/SOME_DIR,
  # not under a nested copy of the ${FWDIR}/log path
  export_file=$(basename ${log_file}).exported.gz
  # -s makes the output 0xFF-separated; awk's field separator must match
  fwm logexport -i ${log_file} -n -p -z -s |\
  awk -F '\xFF' '/some_filter/ { print $0 }' |\
  gzip -9c > /var/log/SOME_DIR/${export_file}
done

 

Keep in mind, though, that in each file:

1) Fields are separated by 0xFF (ASCII 255) because of the "-s" option. Remove it to make the output semicolon-separated, depending on your need, but change your awk field separator to match (-F ";").

2) The column headers are included in the first line.

3) NOT ALL FILES will have the same number of fields, nor will they always have the same fields.

The importer will need to take this into account, but they have all the data because the column headers are included; a quick sanity check is sketched below.
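
A minimal sketch (assuming the gzipped export files produced by the loop above) that shows how many field layouts an importer will have to handle:

# Sketch: print the header column count, then the distribution of field
# counts per record, using the 0xFF separator produced by the -s option.
zcat /var/log/SOME_DIR/${export_file} |\
awk -F '\xFF' 'NR==1 { print "header columns:", NF; next } { seen[NF]++ }
END { for (n in seen) print seen[n], "records with", n, "fields" }'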

Then SCP/SFTP the files off to somewhere, or mount a CIFS volume directly (be sure you specify the domain name in UNC format with the username:   mount //server/share /mnt/cifs -t cifs -o username=AD_DOMAIN/AD_User).
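
For the SCP route, a minimal sketch (the hostname and target path are placeholders, not from the original post):

# Sketch: copy the compressed exports to a collection host over SCP.
# "collector.example.com" and "/data/exports/" are hypothetical.
scp /var/log/SOME_DIR/*.exported.gz admin@collector.example.com:/data/exports/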

 

oa_munich
Contributor

We had a requirement to export logs, and a daily export took more than 2 hours on average. We added -s and -e, specifying the previous full one-hour period, and ran the export hourly. Depending on the time of day and the amount of traffic, hourly exports took anywhere from seconds to 5 minutes, which I could live with.
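
A minimal sketch of such an hourly run (the date arithmetic and the accepted time format are assumptions; verify what fw log accepts on your version):

# Sketch: export the previous full hour from the log file used above.
# Assumes GNU date (present on Gaia); midnight wrap-around is not handled.
PREV_HOUR=$(date -d '1 hour ago' +%H)
THIS_HOUR=$(date +%H)
fw log -n -p -s "${PREV_HOUR}:00:00" -e "${THIS_HOUR}:00:00" \
  ${DOMAINLOGDIR}/${LOGFILE} > ${EXPORTDIR}/${LOGFILE}_${PREV_HOUR}.txt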

