Hello community,
I would like to know whether it's possible to control what gets sent to a SIEM server, similar to what can be done with Log Exporter, but using Smart-1 Cloud.
As I recall, you have to modify some .xml and/or config files.
That would have to be handled by TAC.
You can set it up from the S1C portal. I'll verify tomorrow and update you.
Hi the_rock,
Thank you for your support. I wanted to see if this is possible because I want to better control the volume of events the SIEM is receiving. I've only seen this on an on-premises management server, and I've never done it in Smart-1 Cloud.
Sorry mate, forgot to take a screenshot, give me 10-15 mins, will check it and update you.
the_rock, I really appreciate your help.
But what I really need is to configure what should be sent.
For example:
The SIEM is receiving these columns: origin, destination, port, blade, action, date and time.
I want to edit and send only the columns: origin, port, date and time.
In Log Exporter I know we can edit an .xml file; it has some limitations, but I know it's possible, and I also wanted to know whether the same is possible in Smart-1 Cloud.
I apologize if I wasn't clear before.
I'll leave the link to the configuration I want to make, comparing it with LogExporter:
https://sc1.checkpoint.com/documents/Log_Exporter/EN/Content/Topics/Filter-Configuration.htm
K, got it! Then @PhoneBoy was correct. Once you open the case, ask TAC to check the file below on the backend and modify what's needed:
[Expert@CPHQVMFWMGT01:0]#
[Expert@CPHQVMFWMGT01:0]# cd /opt/CPrt-R81.20/log_exporter/targets/
[Expert@CPHQVMFWMGT01:0]# ls
CheckPointLogs
[Expert@CPHQVMFWMGT01:0]# cd CheckPointLogs/
[Expert@CPHQVMFWMGT01:0]# vi targetConfiguration.xml
Example from my lab:
[Expert@CP-MANAGEMENT:0]# find / -name targetConfiguration.xml
/opt/CPrt-R82/log_exporter/targets/test-log/targetConfiguration.xml
/opt/CPrt-R82/log_exporter/targets/SentinelOne-XDR/targetConfiguration.xml
[Expert@CP-MANAGEMENT:0]#
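For reference, on an on-prem management those target directories are normally created with the cp_log_export CLI rather than by hand, and after manually editing targetConfiguration.xml the exporter generally needs a restart to pick up the change. A rough sketch, with placeholder name/IP/port values:
# On an on-prem Security Management server, in Expert mode (placeholder values).
cp_log_export add name test-log target-server 192.0.2.10 target-port 514 protocol udp format cef
# List configured exporters and check that the export process is running.
cp_log_export show
cp_log_export status
# After hand-editing targetConfiguration.xml, restart the exporter so it re-reads the file.
cp_log_export restart name test-log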
Hey mate,
Let me know what fields you need changed and I can verify in my lab for you.
Hi the_rock,
Well, I'm still coordinating with the SIEM team on what they need from their side. But this is the information they've given me so far.
Header{
CEF:0| - log type - necessary
Check Point| - Vendor - necessary
VPN-1 & FireWall-1| - Product - necessary
Check Point| - manufacturer - necessary
Accept| - action - necessary
https| - protocol - necessary
Unknown| - accountname - not mentioned in the use cases, but it may be linked to cases of account manipulation and/or modification of rules or policies.
}
Body{
act=Accept - action - necessary
deviceDirection=0 - unknown - not mentioned in the use cases
rt=1762518075000 - timestamp - necessary
spt=30071 - necessary
dpt=443 - necessary
cs2Label=Rule Name - necessary
cs2=Implied Rule - necessary
layer_name=Policy_XX Network - necessary
layer_uuid=32e912aa-dd67-4ba5-ad4e-2f122c69d987 - necessary
match_id=0 - unknown - not mentioned in the use cases
parent_rule=0 - unknown - not mentioned in the use cases
rule_action=Accept - necessary
rule_uid=0E3B6801-8AB0-4b1e-A317-8BE33055FB43 - necessary
ifname=DMZ - necessary
logid=0 - unknown - not mentioned in the use cases
loguid={0x462342d,0x62ac842e,0xbc016d08,0xb5efa7c3}
origin=x.x.x.x - necessary
originsicname=CN\=FW-XX-1600,O\=Management_Service..cnupe4 - hostname and host domain - necessary
sequencenum=98 - unknown - not mentioned in the use cases
version=5 - unknown - not mentioned in the use cases
dst=x.x.x.x - necessary
inzone=External - necessary
outzone=Local - necessary
product=VPN-1 & FireWall-1 - necessary
proto=6 - unknown - not mentioned in the use cases
service_id=https - necessary
src=x.x.x.x - necessary
}
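As a quick sanity check while I coordinate with the SIEM team, the raw stream can be captured on the collector side to confirm which of these keys actually arrive. A rough sketch, assuming the collector listens on UDP 514 on eth0 (both placeholders):
# Dump the syslog/CEF payloads arriving at the collector (interface and port are placeholders).
tcpdump -A -n -i eth0 udp port 514
# Or pick out just the CEF keys of interest from the captured lines.
tcpdump -A -n -i eth0 udp port 514 | grep -oE '(act|rt|spt|dpt|src|dst|rule_uid|ifname|origin)='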
Here is what it looks like in my lab:
[Expert@CP-MANAGEMENT:0]# more /opt/CPrt-R82/log_exporter/targets/SentinelOne-XDR/targetConfiguration.xml
<?xml version="1.0" encoding="utf-8"?>
<export id="targetObjectUID"><!--object uuid!-->
<version>9</version> <!-- Version of this file-->
<is_enabled>true</is_enabled><!--Is the process allowed to run, and start on cpstart-->
<!-- Destination section defines the properties of the export target -->
<destination type="syslog"> <!-- Target output type -->
<ip>172.16.10.108</ip><!--the ip of the syslog server-->
<port>8002</port><!--the port on which the syslog is listening to-->
<protocol>udp</protocol><!--udp/tcp-->
<local_addr_ip></local_addr_ip><!--local address ip-->
<!--the configuration of tls-->
<transport>
<security></security><!--clear/tls-->
<!-- the following section is relevant only if <security> is tls -->
<pem_ca_file></pem_ca_file>
<p12_certificate_file></p12_certificate_file>
<client_certificate_challenge_phrase></client_certificate_challenge_phrase>
</transport>
<reconnect_interval></reconnect_interval><!-- Shedule reconnection to the destination server (empty to disable [default] | number of minutes) -->
</destination>
<!-- Enrichment configuration, exporting domain server name, orig_log_server uuid and orig_log_server ip -->
<data_enrichment>
<export_domain>false</export_domain>
<export_orig_log_server>false</export_orig_log_server>
</data_enrichment>
<!-- Filter Configuration -->
<dynamicFilter>conf/FilterConfiguration.xml</dynamicFilter>
<!-- Source section defines the properties of the input stream that will be exported -->
<source>
<log_files>1</log_files><!-- <Number> - read logs on-line | read logs from [number] days back (default 1) | specific file name -->
<log_types></log_types><!--all[default]|log|audit/-->
<folder></folder><!--$FWDIR/log[default]|specific path-->
<read_mode>semi-unified</read_mode><!--raw|semi-unified[default]/-->
</source>
<export_log_position>false</export_log_position> <!-- True | False /-->
<export_log_link>false</export_log_link> <!-- True | False /-->
<export_attachment_link>false</export_attachment_link> <!-- True | False /-->
<export_link_ip></export_link_ip> <!-- empty [defaut] | external IP /-->
<export_attachment_ids>false</export_attachment_ids> <!-- True | False /-->
<!-- Format section determines the form (headers and mappings) of the exported logs -->
<format type="syslog"> <!--syslog | cef | rsa | leef | generic | splunk | this parameter may differ from the type of destination, for example, destination type = files/format type = CEF -->
<resolver>
<mappingConfiguration></mappingConfiguration><!--if empty the fields are sent as is without renaming-->
<exportAllFields>true</exportAllFields> <!--in case exportAllFields=true - exported element in fieldsMapping.xml is ignored and fields not from fieldsMapping.xml are exported as notMappedField field-->
</resolver>
<!-- Format header configuration (actual to CEF see ./conf directory) -->
<formatHeaderFile></formatHeaderFile>
</format>
<!-- Time In Milli Seconds -->
<time_in_milli>false</time_in_milli>
<!-- Skip logs incase of failure in sending-->
<skip_failed_logs>false</skip_failed_logs>
<!-- The following section is for future use of log filtering, please do not modify these values -->
<filter filter_out_by_connection="false">
<field name="product">
<value>VPN-1 & FireWall-1</value>
<value>HTTPS Inspection</value>
<value>VPN-1</value>
<value>Security Gateway/Management</value>
<value>Firewall</value>
<value>FG</value>
</field>
<field name="fw_subproduct">
<value>VPN-1 & FireWall-1</value>
<value>HTTPS Inspection</value>
<value>VPN-1</value>
<value>Security Gateway/Management</value>
<value>Firewall</value>
<value>FG</value>
</field>
</filter>
</export>
[Expert@CP-MANAGEMENT:0]#
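Judging by the comments in the <resolver> section above, field-level selection is driven by <exportAllFields> together with the mapping file referenced in <mappingConfiguration>: with exportAllFields=true everything is exported, and with it set to false presumably only the fields marked as exported in the mapping file are sent. A purely illustrative sketch of how that section might look; the mapping file name is a placeholder and the exact schema should be confirmed with TAC or the Log Exporter guide:
<resolver>
<mappingConfiguration>conf/MyFieldsMapping.xml</mappingConfiguration><!--placeholder file name; the mapping file lists which fields are exported/renamed-->
<exportAllFields>false</exportAllFields><!--illustrative assumption: rely on the mapping file instead of exporting every field-->
</resolver>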
Is this the default result, or did you apply any filters to what should be sent? If it's the default, can we modify this file without causing any errors in sending to the SIEM?
I remember reading that Log Exporter has some limitations for certain filters. I wanted to know whether you applied any, and if so, whether you followed any documentation.
It's something my colleague configured in the lab, but it wasn't changed afterwards. This is what it looks like from SmartConsole.
Interesting.
I didn't know it was possible to access Smart-1 Cloud through SmartConsole. This is my first time experiencing this, and it has raised some questions.
If this is possible, I believe the logic for using Log Exporter is the same for cloud management. But I confess that without documentation or a direct confirmation from Check Point, I'm hesitant to perform these configurations on a Smart-1.
No worries, my friend. I was new to S1C back in 2019; now I feel like I'm an expert, lol. Here is the guide:
Please refer to the link below, mate. It's pretty straightforward and easy to set up... well, as they say, everything in life is easy when you know how : - )
Anyway, message me directly if you need help.
Thank you very much for your help, the_rock. I've been working with Check Point for 4 years, but I'd never accessed a Smart-1 before.
Since you posted both sides of the issue, it raised another question for me.
If I configure something through the Smart-1 Cloud via the web using the forward to SIEM option and apply it, will that same configuration be visible through the Smart Console?
Yeah, 100%. Just remember, the ONLY thing you can't access when it comes to S1C is SSH, which is only available to TAC, but let's be honest, you literally never need that anyway. For what it's worth, I always bring up the same argument to people thinking about getting Smart-1 Cloud: say someone inadvertently makes a change on an on-prem management they're not supposed to. If the device is a few hours away and no one is on site, it won't be a fun day or night for anyone. With the cloud, that's never a concern, as you can log in from any computer in the world with Internet access and revert the changes. Plus, CP maintains the software updates, and backups are also always there, so it truly gives you peace of mind.
So all the tests you were running weren't directly from an SSH connection to the Smart-1?
Because I was hoping I could access the expert mode of that Smart-1 and edit the file to perform the filtering of the columns that my SIEM team wants to receive.
That's right, the examples I gave you were from my lab management. The last screenshots were from the S1C portal, though... if you need that targetConfiguration.xml file modified, just open a TAC case, explain the situation, and they will do it for you. I would also reference this post if I were you; I always do that in my cases and they appreciate it, as it makes their job easier as well.
Message me offline if you need any more testing done.
That's what I needed to understand. So, in fact, we can't make changes in Expert mode on Smart-1 Cloud ourselves.
Your help was invaluable, the_rock. I will definitely mention that post.
Thank you very much for your support.
Of course man, never an issue, we are always here to help others.
This is the closest to ssh you can get.
We do not permit SSH access to Smart-1 Cloud instances.
Any changes to configuration files normally editable in Expert Mode must be done by the TAC.
I could have sworn that one day back in late 2019 I was able to SSH into it, but then the next day I was not : - )
Same as with having two remote access communities; it maybe worked for a day or a few hours.