Ruan_Kotze
Advisor

Running reports against older log files

Hi All,

My research on this topic has left me conflicted on exactly how to run reports against older log files.  Here's my scenario: an SMS with all roles on one appliance (SmartEvent, Logging, etc.).  The appliance was configured to delete indexes after 14 days.  There was an audit request to provide security reports going back 100 days, and running reports against that timeframe came up blank (as expected).

Based on my research, here is what I did (a rough sketch of the first checks follows the list):

  • Verified that there are logs for the time period in $FWDIR/log/
  • Verified whether there are indexes for the time period in $RTDIR/log_indexes/ (there weren't; as expected, they only went back 14 days)
  • Followed sk111766 to ensure that days_to_index was set to go back 100 days
  • Configured the retention settings in SmartConsole to only delete indexes older than 100 days
  • Backed up and deleted $INDEXERDIR/data/FetchedFiles to force re-indexing
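
For illustration, here is a minimal Python sketch of the first two checks above. It assumes the FWDIR and RTDIR environment variables are available in the expert-mode shell and hard-codes the 100-day window from the scenario; it is not taken from the SKs.

```python
#!/usr/bin/env python
# Rough check: do the raw logs and the indexes cover the requested window?
# Assumes FWDIR and RTDIR are set in the environment (expert-mode shell).
import os
import time

DAYS_REQUIRED = 100
cutoff = time.time() - DAYS_REQUIRED * 86400


def oldest_mtime(path):
    """Return the oldest modification time of any entry under 'path', or None."""
    mtimes = [os.path.getmtime(os.path.join(path, name)) for name in os.listdir(path)]
    return min(mtimes) if mtimes else None


log_dir = os.path.join(os.environ["FWDIR"], "log")
index_dir = os.path.join(os.environ["RTDIR"], "log_indexes")

for label, path in (("raw logs", log_dir), ("indexes", index_dir)):
    oldest = oldest_mtime(path)
    if oldest is None:
        print("%s: nothing found in %s" % (label, path))
    elif oldest <= cutoff:
        print("%s: coverage reaches back at least %d days" % (label, DAYS_REQUIRED))
    else:
        days = int((time.time() - oldest) / 86400)
        print("%s: only about %d days of coverage in %s" % (label, days, path))
```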

I did this a couple of days ago, and I can see that the Java and log_indexer processes are still taking up significant CPU, so I am assuming indexing is still in progress (according to the documentation it can take several days; in my case it's a 525 appliance with spinning disks, so not the fastest storage).
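
One rough way to gauge re-indexing progress, besides watching CPU, is to check how far back the dated entries under $RTDIR/log_indexes/ reach. The sketch below assumes (my assumption, not from the SKs) that each indexed day appears as an entry whose name contains a YYYY-MM-DD date; verify the naming on your own appliance first.

```python
#!/usr/bin/env python
# Report the earliest date appearing in entry names under $RTDIR/log_indexes/.
# The YYYY-MM-DD naming convention is an assumption; confirm it on the appliance.
import os
import re

index_dir = os.path.join(os.environ["RTDIR"], "log_indexes")
date_re = re.compile(r"\d{4}-\d{2}-\d{2}")

dates = [m.group(0) for m in (date_re.search(n) for n in os.listdir(index_dir)) if m]

if dates:
    print("Earliest indexed date so far: %s (%d dated entries)" % (min(dates), len(dates)))
else:
    print("No dated entries found under %s; check the naming convention first." % index_dir)
```

Re-running this every few hours gives a sense of whether the earliest indexed date is still moving backwards towards the 100-day target.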

Is indexing as per the above sufficient to populate and run reports against older logs (within the 100-day timeframe, of course), or are additional steps necessary?  Specifically, does one also need to follow the steps in sk98894 (How to run SmartEvent Offline Jobs for multiple log files)?

Thanks,
Ruan

5 Replies
PhoneBoy
Admin

The indexes are required for reporting, so yes.

MartinZ
Contributor

I have this same question. @Ruan_Kotze, did you ever validate that the indexes alone are enough?

"Is indexing as per above sufficient to populate and run reports against older logs (of course with the 100 day timeframe) or are there additional steps necessary? Specifically - does one need to follow the steps in sk98894"

While we agree that indexes are required for reports, will indexes alone allow reporting against offline logs, or do the logs need to be copied off and processed per sk98894, or imported as offline log files per the admin guide: https://sc1.checkpoint.com/documents/R81/WebAdminGuides/EN/CP_R81_LoggingAndMonitoring_AdminGuide/To...

Or is there a better way we don't know about (other than 3rd-party software)?

 

Ruan_Kotze
Advisor

Hi Martin, the steps I listed will allow you to report against older log files.  There is an additional step required if, for some reason, you need to recreate the log pointers (fw repairlog), but that should not be necessary if you just want to re-index older logs.
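
For anyone who does end up needing to rebuild the pointers, a hypothetical way to batch fw repairlog over the older log files is sketched below. fw repairlog itself is the standard command for rebuilding a log file's pointer files, but the loop around it is only an illustration; run something like this with TAC guidance and against backed-up logs.

```python
#!/usr/bin/env python
# Illustration only: run 'fw repairlog' against each .log file in $FWDIR/log.
# The batch loop is a sketch, not a documented Check Point procedure.
import glob
import os
import subprocess

log_dir = os.path.join(os.environ["FWDIR"], "log")

for log_file in sorted(glob.glob(os.path.join(log_dir, "*.log"))):
    print("Rebuilding pointers for %s" % log_file)
    subprocess.call(["fw", "repairlog", log_file])
```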

CheckPointerXL
Advisor

Hi all,

Can anyone explain the contents of the /var/log/CPrt-R81/.... folder?

I have 100 GB inside that folder and I'm not sure whether I can remove it safely. The extensions of those files are very strange, and I cannot work out in which case they would be useful.

 

I'm now on R81.10.

[Attached screenshot: Cattura.JPG]

Thank you.

PhoneBoy
Admin

I would work with the TAC before removing any files from here, as it's not unusual for the log indexes to be large (depending on the size of the logs you have).
