
Adjust Threat-Protection Action

I am trying to use the "mgmt" commands to adjust IPS protections. For example, I want to change the action of the protection "FTP Commands" from "inactive" to "detect" for the Threat Prevention profile "DMZ_Protection". How can I do this? Reading this: https://sc1.checkpoint.com/documents/R80/APIs/index.html#gui-cli/set-threat-protection I got an idea. However, the part I don't understand is how to correctly use the profile name in the command so that I adjust the protection's action only on a specific Threat profile. The example from the doc shows "overrides.1.profile", but I don't really understand the meaning of the "1" here. Thanks in advance for any explanation of how to deal with those "List: Object" parameters.
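The ".1" is simply the index of the first element in the "overrides" list: mgmt_cli flattens list parameters into numbered keys, so a second override for another profile would use overrides.2.*. A minimal sketch of the call, assuming a session previously saved to id.txt with mgmt_cli login:

# set "FTP Commands" to detect, but only in the DMZ_Protection profile
mgmt_cli -s id.txt set threat-protection name "FTP Commands" \
  overrides.1.profile "DMZ_Protection" overrides.1.action "detect" --format json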

How to run the fw sam command through the management web service API?

Hi, is it possible to run the fw sam command through the management web service API? Thanks
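One possible approach, hedged: the management API's generic run-script command executes a shell command on a managed gateway, so a SAM rule could be pushed that way. The gateway name and fw sam arguments below are placeholders:

# block a source IP for 600 seconds on gateway "gw-dmz" via run-script
mgmt_cli -s id.txt run-script script-name "sam block" \
  script "fw sam -t 600 -J src 203.0.113.10" targets.1 "gw-dmz" --format json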

Basic problem with adding a host in an MDS via mgmt_cli

Hi, I'm trying to do some batch imports from one domain to another, but I'm having a strange issue right out of the gate. When I do a simple "mgmt_cli add", it adds an object to the "MDS" domain instead of the "Core" domain I'm trying to target.

[Expert@mds-1:0]# mgmt_cli login user [me] domain Core
Password:
uid: "b1f49fd8-daa1-4d0c-ad3e-38f57e75b1b1"
sid: "zQtNUq5L8HG_QI5Zk5v4Vl684kcPezUlG9S2Ef37IaE"
url: "https://127.0.0.1:443/web_api"
session-timeout: 600
last-login-was-at:
  posix: 1568061885668
  iso-8601: "2019-09-09T13:44-0700"
api-server-version: "1.1"

[Expert@mds-1:0]# mgmt_cli add host name blah ipv4-address 1.2.3.4
Username: [me]
Password:
---------------------------------------------
Time: [13:55:31] 19/9/2019
---------------------------------------------
"Publish operation" succeeded (100%)

[Expert@mds-1:0]# mgmt_cli publish
Username: bcotter
Password:
---------------------------------------------
Time: [13:57:04] 19/9/2019
---------------------------------------------
"Publish operation" succeeded (100%)
tasks:
- task-id: "01234567-89ab-cdef-a3c4-f6cd65a7062f"
  task-name: "Publish operation"
  status: "succeeded"
  progress-percentage: 100
  suppressed: false
  task-details:
  - publishResponse:
      numberOfPublishedChanges: 0
      revision: "90f810f4-7130-4122-a704-f40810b20fb8"

The object doesn't show up in the Core domain, but rather ends up getting created in "MDS," which I didn't even know had an object database. I'm attaching the audit log from SmartConsole. What did I do wrong?
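A hedged observation on what may have gone wrong: mgmt_cli keeps no state between invocations unless the session is saved, and here the login's sid was printed to the terminal and discarded, so the subsequent add and publish each prompted for credentials and ran in fresh sessions without the Core domain context. The documented pattern is to capture the session to a file and pass it with -s:

# log in once, keeping the Core domain session in id.txt
mgmt_cli login user [me] domain Core > id.txt
mgmt_cli -s id.txt add host name blah ipv4-address 1.2.3.4
mgmt_cli -s id.txt publish
mgmt_cli -s id.txt logout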

IPS Update Monitoring

Hi,
I wrote a small script, using the Check Point SDK (GitHub - CheckPointSW/cp_mgmt_api_python_sdk: Check Point API Python Development Kit), for checking IPS updates with my monitoring server (Centreon, based on Nagios, more or less). The SDK is used for the login (I changed one option in the login part of mgmt_api.py: unsafe_auto_accept --> true; it should work with the default, false, too, but this was easier for me). After a successful login, the script parses the API output from show-ips-status and compares it with, e.g., the current date or "update available". After some calculating and comparing, the script produces output understandable to Nagios-based systems: UNKNOWN = -1, OK = 0, WARNING = 1, CRITICAL = 2. [Screenshots of "good" and "bad" results omitted.] There is a WARNING state for a 1-3 day delta since the last IPS update; the thresholds are freely configurable (on a daily basis). What would be good is a way to get the current IPS database version from Check Point itself, so one could check the version against Check Point rather than against what the management server found. I started working on this with the question from Sven Glock (IPS Monitoring) in mind - maybe that kind of helps... and for my own use, of course. To use it on a Nagios server you need: python installed (the script worked with 2.7 and 3.7); in the plugin folder I created a "checkpoint" folder containing the SDK and my script. Feel free to have a look - I'm sure there is room for improvement.
Regards,
Daniel
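For anyone who wants a similar check without the SDK, the same data should be reachable from bash; a hedged sketch (the jq field name is an assumption - inspect the raw JSON against your API version first):

# run on the management server itself; -r true logs in as root
mgmt_cli -r true show ips-status --format json
# then extract the update fields, e.g. (field name assumed):
# mgmt_cli -r true show ips-status --format json | jq '."update-available"'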

R80.20 Mgmt API issue

We need help with an issue happening in R80.20: we see the API service stopping randomly, and at that point we need to restart the service manually. Of course we are trying to find out what is causing it to stop running randomly; this is very annoying. Below is the output of the API status when the issue happens:

[Expert@cglscc4a:0]# api status

API Settings:
---------------------
Accessibility:        Require all granted
Automatic Start:      Enabled

Processes:
Name     State    PID    More Information
-------------------------------------------------
API      Stopped
CPM      Started  4929   Check Point Security Management Server is running and ready
FWM      Started  2192
APACHE   Started  4180

Port Details:
-------------------
JETTY Internal Port:  50276
APACHE Gaia Port:     4434 (a non-default port)
When running mgmt_cli commands add '--port 4434'
When using web-services, add port 4434 to the URL

--------------------------------------------
Overall API Status: The API Server Is Not Running!
--------------------------------------------

Notes:
------------
To collect troubleshooting data, please run 'api status -s <comment>'
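Until the root cause is found, a crude stopgap, hedged: a small watchdog run from cron that checks api status and starts the API server again when it is down (api start is the standard Gaia command; adjust the match string and schedule to taste):

#!/bin/bash
# hypothetical watchdog, e.g. run from cron every few minutes:
# start the API server again if it has stopped
if api status | grep -q "The API Server Is Not Running"; then
    api start
fi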

R80 Clear Old Sessions Script

This was made for MDS, but it should work on a regular management server too. Our MDS on R80.10 doesn't clear out old sessions when an admin uses SmartConsole. New sessions are created when admins log in, and the old sessions just stick around, continually building up. That makes the sessions tab ugly to look at and makes it hard to find the session with locks that is giving you trouble and that you want to discard. This script removes all sessions that are logged out and have no changes (so a lock with no changes will get cleared too). I would have gone by locks instead of changes, but when sessions are queried, locks is always at least 1; that might be a bug. The usage is <scriptname> <days to save> <Y/N prompt for user & pw if needed>. For the option to prompt for user & pw: in my experience, sometimes root login to a domain fails but a user login works. So if a Y is specified, the script will try root login first, and if one fails, the script will prompt for a username & password to use instead (leaving the option blank or using anything that isn't a Y will result in root login attempts only). In my environment, I used crontab to schedule the script as follows, and it works great: <script> 0 N. Use this script at your own risk. It works great for me, but I cannot guarantee it is safe to use in your environment.
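For readers who want the gist without downloading anything, a stripped-down sketch of the same idea, hedged: the "changes" field name is an assumption about the show sessions response, and the logged-out filter the script applies is left out here, so test on a lab server first:

#!/bin/bash
# list sessions, pick ones with zero changes, and discard them
# (the original script also filters on logged-out state; check the JSON
#  for the exact field name in your version before adding that filter)
mgmt_cli -r true show sessions limit 500 details-level full --format json |
  jq -r '.objects[] | select(.changes == 0) | .uid' |
  while read -r uid; do
      mgmt_cli -r true discard uid "$uid"   # discard removes that session's pending state
  done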

Install Database - MDS

In the R80.20 API (v1.3) I can't seem to find "install database". We have close to 90 domains across multiple MDS / Provider-1 environments. With that said, when I need to make a change that requires an "install database", I need to be able to do it via the API. To me it is crazy that Check Point has left this out - or should I say, I can't seem to find it. Take a tool like Firemon that may require us to make a change and do an "install database". Please tell me there is an easy way to do an "install database" across 90+ domains without having to log into each one. Thank you,
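For reference, later Management API releases document an install-database command (it is absent from v1.3, which matches what you're seeing). On a version that has it, a loop over a domains list could cover all 90+ domains - a hedged, untested sketch; domains_list.txt (one domain per line) and the target object name are placeholders:

#!/bin/bash
# run install-database in every domain; the target is the name of the
# management/log server object inside that domain (placeholder below)
while read -r domain; do
    mgmt_cli -r true -d "$domain" install-database targets.1 "mgmt-server-object" --format json
done < domains_list.txt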

"put-file" api call not supported on GAIA Embedded?

I cannot find a definitive answer in the documentation: is the "put-file" function in the API supported on Gaia Embedded devices (e.g. 1200Rs)? I'm developing an Ansible playbook to set SNMP strings on our gateways using "put-file" and "run-script" API calls on the management server to create a CLISH config file on the gateway and then execute it. It all works great on full Gaia devices, but I get the following error back from the API when I attempt to run the script against a 1200R:

message: Running a script from SmartConsole is only supported on Gaia R77 and above gateways and clusters

The 1200R I'm testing against is R77.20.81 - Build 631. Thanks! - G
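For context, the working flow on full Gaia looks roughly like this - a hedged sketch using the documented put-file and run-script parameters; the gateway name, file path, and community string are placeholders:

# stage a CLISH snippet on the gateway...
mgmt_cli -s id.txt put-file targets.1 "gw-full-gaia" file-name "snmp.clish" \
  file-path "/home/admin/" file-content "set snmp community mycommunity read-only"
# ...then execute it on the same gateway
mgmt_cli -s id.txt run-script script-name "apply snmp" \
  script "clish -f /home/admin/snmp.clish" targets.1 "gw-full-gaia"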

Adding Rules to a Rule Section Using Ansible

Dear all, we are using the Check Point Ansible module https://github.com/CheckPointSW/cpAnsible. Do you know how to group access rules below a section? The API tells me: https://sc1.checkpoint.com/documents/latest/APIs/#cli/add-access-rule~v1.5%20 But how do I convert that to an Ansible playbook? Only the "position: "top"" statement seems to be working; I commented the other attempts out. Any ideas?

- name: "Section"
  check_point_mgmt:
    command: add-access-section
    parameters:
      layer: "Network"
      name: "section 1"
      position: "top"
    session-data: "{{login_response}}"

- name: "Rule"
  check_point_mgmt:
    command: add-access-rule
    parameters:
      layer: "Network"
      # position.below: "section 1"
      # position: "section 1"
      position: "top"
      name: "Access rule"
      source: "1.1.1.1"
      destination: "2.2.2.2"
      service: "ssh"
      action: "allow"
    session-data: "{{login_response}}"

Thank you & kind regards
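For comparison, the underlying mgmt_cli call expresses the relative position as a nested key, which is the structure a playbook has to reproduce. A hedged sketch of the documented add-access-rule syntax (whether "below" a section lands the rule inside it or after it is worth verifying on a test layer):

mgmt_cli -s id.txt add access-rule layer "Network" position.below "section 1" \
  name "Access rule" source "1.1.1.1" destination "2.2.2.2" service "ssh" action "allow"

In YAML, that flattened position.below key would correspond to a nested mapping, roughly position: {below: "section 1"} - again an assumption, since the cpAnsible module passes its parameters through to the API as-is.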

R80.x Policy to CSV file?

Is there a management CLI tool or script to export the policy and objects to a CSV file? I have only found something for objects here in the forum, but nothing for policies.
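Absent a ready-made tool, one hedged do-it-yourself sketch: dump the rulebase as JSON and flatten a few fields with jq. The field names and the flat-rulebase assumption are mine (rules grouped under sections nest one level deeper), so adapt after inspecting the JSON:

# inline objects instead of a dictionary, then emit rule number, name, action, state
mgmt_cli -r true show access-rulebase name "Network" use-object-dictionary false \
  limit 500 --format json |
jq -r '.rulebase[] | select(.type == "access-rule") |
       [ .["rule-number"], .name, .action.name, .enabled ] | @csv'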

When will VSX-objects be able to be scripted through the API?

Are there plans to bring this into the API, or will creating VSX objects only be possible through the provisioning tool?

Ansible for Gaia gateways

Ansible role to manage Check Point Gaia gateways. The following Ansible role will enable you to execute commands on multiple nodes in parallel. Using this role you will be able to do the following:
- Take clish config backups (show configuration)
- Run essential show commands on all your gateways from a single playbook, e.g. show route, show asset all, show interfaces, show ospf, etc.
- Issue configure commands to all your gateways from a single playbook.

In my case, I can back up 100-odd gateways in under 5 minutes. So, here it is ...

REQUIREMENTS

Ansible server:
- A Linux server. The role has been tested with RHEL 7+ and Ubuntu, but it should work with other flavours as well.
- Ansible 2.5 and above running on the Linux server.
- The server should have a direct connection to the Check Point firewalls (at the moment the role does not support a jump box).

Connection to the firewalls:
- Route to the firewall - you should be able to ssh to all your gateways from the Linux host.
- SSH keys - it is also important that the public keys of the gateways are already stored in your 'known_hosts' file. This is done automatically when you ssh for the first time from the server to the gateways (see the ssh-keyscan sketch at the end of this post for a shortcut).
- Username - the default username for login is 'admin', but this can be changed per firewall in the inventory file.
- Passwords - the role assumes that you have the same password for all the gateways. If you have different logins on gateways, you can set up key-based (passwordless) logins.
- Login shell - the role assumes that the login shell for the user is bash, i.e. when the user logs on, he is logged in as expert. If you have clish set as your login shell, most of the playbooks will run, but the bash playbooks will fail, as the playbook cannot change to expert mode.

The playbooks have been tested with Ansible 2.7 running on Ubuntu 18.04 LTS and on RHEL 7.0. I am pretty sure they will run on other flavors of Linux as long as Ansible 2.7 is being used.

SET UP

Follow the procedure below once you have installed Ansible on your server. Log in to the Linux server as root and follow the steps.

1. Ansible config file - edit /etc/ansible/ansible.cfg and make sure the following configuration directives are set to the values shown:

forks = 15
gather_timeout = 60
roles_path = /etc/ansible/roles
timeout = 90
connect_timeout = 90
command_timeout = 80

2. Install the role with the following command:

ansible-galaxy install ashwin_sid.gaia_fw1 --force

3. Prepare the inventory file - /etc/ansible/hosts. Make an entry for each of your gateways in the following format:

[ckp]
GW-perimeter01 ansible_host=192.168.50.2
GW-Perimeter04-A ansible_host=192.168.89.2
GW-Perimeter04-B ansible_host=192.168.89.3 ansible_port=44 ansible_user=ladmin

The fields are explained below:
- '[ckp]' - this is the hostgroup. This is the name used in the sample playbooks provided. You can create your own hostgroup, but make sure to use that in the playbooks.
- 'GW-perimeter01' (2nd line) - this is the name of the gateway as referenced in Ansible. This doesn't have to be the actual hostname of the firewall.
- 'ansible_host=192.168.50.2' - this is the IP that will be used for making ssh connections. This is the IP of the individual gateway, not the cluster IPs or VIPs. Cluster IPs don't need to go in here.
- 'ansible_port=44' (4th line) - this is the ssh port, if different from the default, 22.
- 'ansible_user=ladmin' (4th line) - set this if the user used to log in to the firewall is different from the default 'admin'.

4. Prepare the playbooks - there are some sample playbooks included with the role. You can find them in the folder /etc/ansible/roles/ashwin_sid.gaia_fw1/Sample-Playbooks. Do not work in the '/etc/ansible' directory; create your own directory structure to store and run the playbooks. I have created the /opt/fw-ansible/playbooks folder on my Linux server to run the playbooks:

#mkdir /opt/fw-ansible
#mkdir /opt/fw-ansible/playbooks
#cp /etc/ansible/roles/ashwin_sid.gaia_fw1/Sample-Playbooks/* /opt/fw-ansible/playbooks/

RUNNING PLAYBOOKS

The following playbooks are included:
- backup.yml - this playbook will back up your Gaia config, i.e. the output of 'show configuration'.
- show.yml - will run a diagnostic clish command on your gateways and store the output on the Ansible server. You can use this playbook as a template and create custom playbooks that are more relevant to your environment; e.g. you could copy the file to 'Show-route.yml' and change the 'cmd' string to 'show route'.
- Show_HFA.yml - this playbook will get the installed HFA info from the gateways.
- show-clish.yml - this is similar to the above playbook, but can run more than one command. The commands (one command per line) are stored in the file 'show-clish.cmd' in the same directory.
- show-bash.yml - this file will run bash commands on the gateway and store the output on the Ansible server.
- configure-clish.yml - this playbook will issue clish commands (stored in the file configure-clish.cmd) on the gateways.
- configure-bash.yml - this playbook will issue bash commands (stored in the file configure-bash.cmd) to the gateways.

The playbooks can be run by an ordinary user; you DO NOT have to be root to run them.

CUSTOMIZATION

The following can be customized for all the playbooks:

- hosts: ckp
  serial: 10

'hosts: ckp' - change this value to whatever hostgroup you have set in the inventory file.
'serial: 10' - this is the batch size; a value of 10 means that the playbook will run all the commands for 10 firewalls at a time and then the next 10, until all gateways are done. If your Ansible server is beefy enough, you can set this to 20 or higher, which means that the playbook will execute the commands simultaneously for 20 hosts.

backup.yml

This will back up your Gaia config, i.e. the output of 'show configuration'. You do not need to touch any other line (apart from the customization above). Now run the playbook:

#cd /opt/fw-ansible/playbooks/
#ansible-playbook -k backup.yml

Next it will ask you to enter your password. The backup is stored in the folder /opt/fw-ansible/BACKUP/<GW-Name>/; the file is named with the timestamp of when the backup was run.

show.yml

This playbook will run a single command on your gateway and store the output in a text file on the server. The only part that needs to be adjusted in the playbook is the line:

cmd: show asset all

'cmd: show asset all' - here you put your custom (clish) show command. You do not need to touch any other line. Now run the playbook:

#ansible-playbook -k show.yml

Next it will ask you to enter your password. This will run your command and store the output in the folder /opt/fw-ansible/SHOW/ as <TIMESTAMP><GW-Name>.txt; the file is named with the gateway name.

configure-clish.yml

This playbook will run and save clish configuration commands. The commands are stored in a text file named 'configure-clish.cmd'. Please edit the file and add the commands that you want to run. The sample file has a '#' in front of every line (so you don't run its commands by error); DO NOT put a '#' in front of commands that you want to run on your gateways. So, to add 3 static routes to all the gateways, the configure-clish.cmd file will look like this:

lock database override
set static-route 1.1.1.1/32 nexthop gateway address 192.168.1.2 on
set static-route 1.1.1.2/32 nexthop gateway address 192.168.1.2 on
set static-route 1.1.1.3/32 nexthop gateway address 192.168.1.2 on
save config

Once you have written the commands, run the playbook:

#ansible-playbook -k configure-clish.yml

Next it will ask you to enter your password. It might be useful to run the backup playbook before you run the configure playbook.

You can similarly run the other playbooks. You should start with the backup playbook (and a test firewall), and once that is running smoothly, you can experiment with other playbooks and create your own.

Ansible Role Page
Blog explaining how to use the role.
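One hedged convenience for the known_hosts prerequisite above: instead of ssh-ing to every gateway once, pre-collect the host keys with ssh-keyscan (the IPs are the sample inventory's, purely illustrative):

# populate known_hosts for all gateways in one go
ssh-keyscan 192.168.50.2 192.168.89.2 >> ~/.ssh/known_hosts
ssh-keyscan -p 44 192.168.89.3 >> ~/.ssh/known_hosts   # gateway on a non-default ssh port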

CP python SDK - api_query function

As I understand it, in the Python SDK api_query should return all objects and handle all the required API paging. This seems to work for show-hosts / show-objects:

"to": 45766,
"total": 45766
},
"status_code": 200

However, for show-access-rulebase:

client.api_query(command="show-access-rulebase", payload={"name": "MyPolicy Security"})

this does not seem to work - it only returns objects up to the default page limit for that API call:

"to": 50,
"total": 52,

Do I misunderstand something here? Also, include-container-key doesn't seem to have any effect - whether True or False, it always returns a dict, never a list. Not a problem for me, but the documentation is misleading?

"""
The APIs that return a list of objects are limited by the number of objects that they return.
To get the full list of objects, there's a need to make repeated API calls, each time using a different offset,
until all the objects are returned. This API makes such repeated API calls and returns the full list of objects.
note: this function calls gen_api_query and iterates over the generator until it gets all the objects, then returns.

:param command: name of API command. This command should be an API that returns an array of
    objects (for example: show-hosts, show-networks, ...)
:param details_level: query APIs always take a details-level argument.
    possible values are "standard", "full", "uid"
:param container_key: name of the key that holds the objects in the JSON response (usually "objects").
:param include_container_key: If set to False, the 'data' field of the APIResponse object
    will be a list of the wanted objects. Otherwise, the data field of the APIResponse will be a dictionary
    in the following format: { container_key: [ list of the wanted objects ], "total": size of the list }
:param payload: a JSON object (or a string representing a JSON object) with the command arguments
:return: if include-container-key is False:
    an APIResponse object whose .data member contains a list of the objects requested: [ , , , ...]
    if include-container-key is True:
    an APIResponse object whose .data member contains a dict: { container_key: [...], "total": n }
"""
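As an aside, the paging that the docstring describes can be reproduced by hand with mgmt_cli. A hedged sketch - note that show-access-rulebase returns its rules under a "rulebase" container key rather than "objects", which is also a plausible reason api_query stops after one page unless container_key is set accordingly:

# page through the rulebase manually; requires jq
offset=0; limit=50; total=1
while [ "$offset" -lt "$total" ]; do
    page=$(mgmt_cli -r true show access-rulebase name "MyPolicy Security" \
           offset "$offset" limit "$limit" --format json)
    total=$(echo "$page" | jq '.total')
    echo "$page" | jq '.rulebase[]'   # container key is "rulebase", not "objects"
    offset=$((offset + limit))
done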

CLI API Example for exporting, importing, and deleting different objects using CSV files (v 00.33.00 and later)

Overview

The export, import, and delete using CSV files scripts in this post, currently version 00.33.00 and later, dated 2019-01-19 and later, are intended to allow operations on an existing R80, R80.10, R80.20[.Mx], or R80.30[.Mx] Check Point management server (SMS or MDM) from bash, either on the management server itself or on a management server able to authenticate against and reach the target management server.

These scripts show examples of:
- an export of objects and services with full and standard JSON output
- an export of hosts, networks, groups, groups-with-exclusion, address-ranges, dns-domains, host interfaces, and group members to CSV output [future services and other object dumps to CSV are pending further research]
- an import of hosts, networks, groups, groups-with-exclusion, address-ranges, dns-domains, host interfaces, and group members from CSV output generated by the export-to-CSV operations above, or from custom CSV files with valid objects
- a set operation (change existing objects) for hosts, networks, groups, groups-with-exclusion, address-ranges, dns-domains, host interfaces, and group members from CSV output generated by the export-to-CSV operations above, or from custom CSV files with valid objects (as of version 00.24.00). A script was added to specifically set values using a CSV file for all objects for which a CSV file is found: "cli_api_set-update_objects_from_csv.sh"
- a script to delete groups-with-exclusion, groups, address-ranges, dns-domains, networks, and hosts, using CSV files created by an object-name export to CSV for the respective items deleted. NOTE: DANGER! DANGER! DANGER! Use at your own risk, with extreme care!
- scripts that just count the objects the script could find in the environment (as of version 00.23.00): "cli_api_get_object_totals.sh" and "cli_api_get_object_totals_w_group_member_count.sh"
- script operations for export of more than 500 objects of a specific type
- export of complex object elements, like group members and host interfaces, to CSV files for import via the --batch option
- export of full JSON and CSV for only objects that are not created by "System", i.e. exporting only custom-added or administrator-modified objects
- an MDM tool to export domains to a domains_list.txt file for reference in other script calls

NOTE: As of version 00.29.00, all cli_api*.sh scripts require the "common" script folder to handle command line parameters! Don't forget to copy this folder also.

EXPORT SCRIPTS NOW HANDLE > 500 OBJECTS FOR ALL CSV AND JSON EXPORTS (version 00.24.00).

EXPORT SCRIPTS NOW DEFAULT TO EXPORTING ALL OBJECTS INSTEAD OF NON-SYSTEM OBJECTS FOR FULL JSON AND CSV OPERATION. CLI COMMANDS ENABLE A FULL DUMP OF ALL OBJECTS INSTEAD OF THOSE CREATED BY "System": --SO and --NSO, for Show System Objects and No Show System Objects (version 00.29.02).

AS OF VERSION 00.33.00, THE MANAGEMENT WEB SSL PORT IS IDENTIFIED AUTOMATICALLY IF EXECUTING AGAINST THE LOCAL MANAGEMENT HOST.

Where to Find, Contact, Support, and Terms of Service

Scripts are provided as-is, without express or implied warranty, guarantee, assumption of liability, or SLA for resolution, as they are examples of how to use the API, may not always apply to the situation they are used in, and are subject to limitations of the API engine or the utilized Check Point management version. At no time is this post an implied assumption of a duty to change the example scripts in accordance with requests.

If you need an API script developed, contact Check Point Professional Services or a qualified Check Point Channel Partner with DevOps capabilities. The CLI API script approach may change at any time, which may require accessing older GitHub branches or releases. CLI API scripts are provided with the stipulation that the implementer is capable of using the CLI and understands API operations.

Check out GitHub for the latest versions: GitHub - mybasementcloud/R8x-export-import-api-scripts: Check Point R8x Export, Import, [and more] API scripts for bash … (version 00.33.00 as of 2019-01-19)

For direct questions, you can hit me up at ericb@checkpoint.com. Response time may vary based on schedule, availability, and the issue presented. For questions about issues with scripts, please provide the information identified in the FEEDBACK section at the bottom of this article.

Description

This post includes a set of scripts in two (2) packages: a Development branch, which may be an advanced version still under construction, and an Operations branch that should work as expected. All script files end with .sh for shell and are intended for Check Point bash implementation on R80, R80.10, R80.20[.Mx], and R80.30[.Mx] or later. Scripts in the packages have specific purposes, and scripts call sub-scripts for extensive repeated operations and basic actions (e.g. handling CLI parameters). The packages also include specific expected default directory folders that are not created by the script action.

General Information:

<yyyy-mm-dd-hhmm-tz> is a date time group (DTG) generated at time of execution and used for the full operation of the respective script, providing consistent information for a specific script run. Example: 2017-01-05-1346CST for January 5, 2017, at 13:46 hrs CST.

Output Generated by Scripts:

Output from the scripts is directed to a sub-folder (the default is a dump folder with a DTG sub-folder, e.g. ./dump/<yyyy-mm-dd-hhmm-tz>) and further placed in sub-folders based on the script: csv, full, standard, import, delete.

Multi-Domain Management operations scripts that handle multiple domains in their operation will create a sub-folder for the domain and then create the specific output based on that domain in the domain-specific folder. Note that the scripts automatically handle the default "System Data" and "Global" domains. When executing the scripts with a specific domain selected, that domain is the folder name between the DTG folder and the output folders.

NOTE: Current CSV output includes additional files used in the process: raw data in a WIP folder, sorted raw data, a CSV header, and the original data with header.
NOTE: When operating on MDM, the domain stated in the -d parameter is used as a subfolder to collect the specific data for that dumped domain.

Scripts

Scripts are provided in an Operations and a Development branch folder structure. The Operations branch implies that these are stable operational scripts for their stated purpose, expected to work as such under the tested versions where applicable.
The Development branch may include advanced, early-availability versions of the scripts where development is not yet complete for migration to the Operations branch. The main script types provided are MDM scripts, Session Cleanup scripts, common scripts used by other scripts, Export/Import scripts, and script templates.

_templates - templates for developing scripts with the basic capabilities of the provided version level of the template:
- api_mgmt_cli_shell_template_with_cmd_line_parameters.template.<version>.sh - template for scripts using the provided approach that includes built-in command line parameter handler operations. Output: text log file; further output dependent on the chosen template implementation.
- api_mgmt_cli_shell_template_with_cmd_line_parameters_script.template.<version>.sh - template for scripts using the common-scripts approach for handling command line parameters. Output: text log file; further output dependent on the chosen template implementation.
- api_mgmt_cli_shell_template_action_handler.template.<version>.sh - template for an action sub-script called by another script, with basic handling for version mismatch. Logs to the originating script based on the verbose level setting.
- test._templates.<version>.sh - rough testing script for validating that the templates function. Output: text log file; further output dependent on the chosen template implementation.

common - common scripts called by the scripts and utilized by the templates in the _templates folder. This folder and its expected scripts are replicated to the utilizing script folder for direct access, so they will be found under _templates, export_import, and Session_Cleanup:
- cmd_line_parameters_handler.action.common.<level>.<version>.sh - action sub-script called to execute operations to handle command line parameters standard to all scripts in that version. Logs to the originating script based on the verbose level setting.
- identify_gaia_and_installation.action.common.<level>.<version>.sh - action sub-script called to execute operations to identify the version of Gaia on the host and the Check Point installation type. Logs to the originating script based on the verbose level setting.

export_import - scripts for export, import, set, and delete operations:
- cli_api_export_objects.sh - export all supported objects to JSON (full and standard) and CSV. Output: JSON full, JSON standard, CSV, text log file.
- cli_api_export_objects_to_json_full.sh - export all supported objects to JSON full. Output: JSON full, JSON standard, CSV, text log file.
- cli_api_export_objects_to_json_standard.sh - export all supported objects to JSON standard. Output: JSON full, JSON standard, CSV, text log file.
- cli_api_export_objects_actions.sh - action sub-script to execute export of objects to JSON (full or standard) for the calling script. Logs to the originating script based on the verbose level setting.
- cli_api_export_objects_actions_to_csv.sh - action sub-script to execute export of objects to CSV (full or standard) for the calling script. Logs to the originating script based on the verbose level setting.
- cli_api_export_objects_to_csv.sh - export all supported objects to CSV. Output: CSV, text log file.
- cli_api_export_all_domains_objects.sh - export all supported objects in all domains on an MDS to JSON (full and standard) and CSV. Output: JSON full, JSON standard, CSV, text log file.
- cli_api_export_all_domains_objects_to_csv.sh - export all supported objects in all domains on an MDS to CSV. Output: CSV, text log file.
- cli_api_import_objects_from_csv.sh - import all supported objects from supplied CSV files (only supplied files are processed). Output: JSON results, text log file.
- cli_api_set-update_objects_from_csv.sh - set (update) all supported objects from supplied CSV files (only supplied files are processed). Output: JSON results, text log file.
- cli_api_delete_objects_using_csv.sh - delete all supported objects from supplied CSV files by name (only supplied files are processed). Output: JSON results, text log file.
- cli_api_get_object_totals.sh - dump the number of each object from all supported objects. Output: text log file.
- cli_api_get_object_totals_w_group_member_count.sh - dump the number of each object from all supported objects, including a count of group members per group. Output: text log file.
- api_add_csv_error_handling_to_csv_file.sh - for versions (e.g. R80.10) where CSV import and set operations require that ignore-error, ignore-warning, and set-if-exists are in the actual CSV file; adds the necessary columns to the front of existing CSV file rows with headers. Output: CSV, text log file.
- api_subpend_csv_error_handling_to_csv_files.sh - for versions (e.g. R80.10) where CSV import and set operations require that ignore-error, ignore-warning, and set-if-exists are in the actual CSV file; adds the necessary columns to the end (back) of existing CSV file rows with headers. Output: CSV, text log file.

MDM - Multi-Domain Management general scripts:
- MDM_Get_Domains_List_<version>.sh - generates a list of the current domains on an MDS for manual re-use later. Output: text.

Session_Cleanup - session cleanup scripts that first list the current sessions open on a management server and also provide options to clean up (delete) sessions that are not locked or pending publish operations (zero locks):
- remove_zerolocks_sessions.<version>.sh - identify and delete zero-lock sessions. Output: text result of operation.
- remove_zerolocks_web_api_sessions.<version>.sh - identify and delete zero-lock sessions with user web_api. Output: text result of operation.
- show_zerolocks_sessions.<version>.sh - identify zero-lock sessions. Output: text result of operation.
- show_zerolocks_web_api_sessions.<version>.sh - identify zero-lock sessions with user web_api. Output: text result of operation.

Instructions

To utilize the scripts, download them from this repository post, extract the script files and directory folders [import and delete actions], then upload those files and directory folders to a working target folder location (e.g. /var/tmp/api-scripts) on the target management server where the scripts will execute. Once uploaded to a working folder, the relevant scripts are executed like any other bash script. If executing directly from the folder where the script is located, use "./<script>.sh" for execution. If script modifications are made outside of Check Point Linux, it is recommended to first run "dos2unix <script>.sh" to ensure compatibility with the bash shell.

Each script accepts command line parameters to control important inputs that have some defined defaults. If the "-p <password>" parameter is not used, the user is prompted for the console user/administrator's password. If the "-r" or "--root" parameter is used, that prompt is skipped, as in standard mgmt_cli execution.

Command Line Parameters

All scripts (except action sub-scripts) can take command line parameters (CLI parameters). To get a dump of the active CLI parameters for a specific script, run it with "--help" or "-?".
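Before the full parameter reference below, a typical hedged invocation built from the documented flags (the domain and paths are placeholders):

# export all objects of the "fooville" domain to CSV, authenticating as root on the MDS itself
cd /var/tmp/api-scripts/export_import
./cli_api_export_objects_to_csv.sh -r -d fooville -x /var/tmp/script_dump/export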
Example (version 00.23.00): command line parameters support multiple input formats, as displayed, and can be mixed and matched as needed. This is the standard help output for the cli_api_export_objects.sh script, which is the standard baseline for all scripts in this package release:

[Expert@X:0]# ./cli_api_get_object_totals.sh --help

Script:  cli_api_get_object_totals  Script Version: v00x23x00
API version = 1.1

./cli_api_get_object_totals.sh [-?][-v]|[-r]|[-u <admin_name>] [-p <password>]]|[-P <web ssl port>] [-m <server_IP>] [-d <domain>] [-s <session_file_filepath>]|[-x <export_path>] [-i <import_path>] [-k <delete_path>] [-l <log_path>]

Script Version:  00.23.00  Date:  2017-07-22

Standard Command Line Parameters:
  Show Help                  -? | --help
  Verbose mode               -v | --verbose
  Authenticate as root       -r | --root
  Set Console User Name      -u <admin_name> | --user <admin_name> |
                             -u=<admin_name> | --user=<admin_name>
  Set Console User password  -p <password> | --password <password> |
                             -p=<password> | --password=<password>
  Set [web ssl] Port         -P <web-ssl-port> | --port <web-ssl-port> |
                             -P=<web-ssl-port> | --port=<web-ssl-port>
  Set Management Server IP   -m <server_IP> | --management <server_IP> |
                             -m=<server_IP> | --management=<server_IP>
  Set Management Domain      -d <domain> | --domain <domain> |
                             -d=<domain> | --domain=<domain>
  Set session file path      -s <session_file_filepath> |
                             --session-file <session_file_filepath> |
                             -s=<session_file_filepath> |
                             --session-file=<session_file_filepath>
  Set log file path          -l <log_path> | --log-path <log_path> |
                             -l=<log_path> | --log-path=<log_path>
  Set export file path       -x <export_path> | --export <export_path> |
                             -x=<export_path> | --export=<export_path>

  session_file_filepath = fully qualified file path for session file
  log_path = fully qualified folder path for log files
  export_path = fully qualified folder path for export file

NOTE:  Only use Management Server IP (-m) parameter if operating from a
       different host than the management host itself.

Example: General :
]# cli_api_get_object_totals -u fooAdmin -p voodoo -P 4434 -m 192.168.1.1 -d fooville -s "/var/tmp/id.txt" -l "/var/tmp/script_dump/"

Example: Export:
]# cli_api_get_object_totals -u fooAdmin -p voodoo -P 4434 -m 192.168.1.1 -d fooville -s "/var/tmp/id.txt" -l "/var/tmp/script_dump/" -x "/var/tmp/script_dump/export/"

Standard Command Line Parameters:
- Show Help: -? | --help. Show help for the script.
- Verbose Mode: -v | --verbose. Default: not set. Show details of operations and values during execution. The bash environment variable APISCRIPTVERBOSE can be set to TRUE to run in verbose mode from the start without the command line parameter. Example: export APISCRIPTVERBOSE=TRUE
- Authenticate as root: -r | --root. Instead of using an administrator user name and password, operate as root.
- Set Console User Name: -u <admin_name> | --user <admin_name> | -u=<admin_name> | --user=<admin_name>. Default: administrator. Set the username of the console user/administrator executing the script, e.g. admin.
- Set Console User password: -p <password> | --password <password> | -p=<password> | --password=<password>. Set the password to be used for console user/administrator authentication. If not used, the default operation prompts for the password. NOTE: the entry is visible when used.
- Set Management Server IP: -m <server_IP> | --management <server_IP> | -m=<server_IP> | --management=<server_IP>. Default: localhost. Set the IP address of the management server to use for this operation, e.g. 10.10.100.66. NOTE: DO NOT USE THIS PARAMETER IF OPERATING THE SCRIPT FROM THE HOSTING MDS OR SMS, SINCE AUTHENTICATION WILL FAIL.
- Set Management Domain: -d <domain> | --domain <domain> | -d=<domain> | --domain=<domain>. Default: not set. Set the management domain to use for this operation on a Multi-Domain Management server, e.g. fooville.
- Set session file path: -s <session_file_filepath> | --session-file <session_file_filepath> | -s=<session_file_filepath> | --session-file=<session_file_filepath>. Default: ./id.txt. Set the full path and file name of the session ID file, e.g. /var/tmp/id.txt.
- Web SSL Port (NEW): -P <web_ssl_port> | --port <web_ssl_port> | -P=<web_ssl_port> | --port=<web_ssl_port>. Default: 443. The web SSL port of the management server; can be set explicitly to address changes to multi-portal, and thus changes to the API web port.
- Session Timeout: --session-timeout <session_time_out>. Range 10-3600 seconds, default 600. Configure the session timeout value for the login operation executed.
- System Object Export: --NSO | --no-system-objects, or --SO | --system-objects. Default: --NSO. Configure export of system objects created by "System". By default, objects created by "System" are ignored during the export of full JSON or CSV information. Standard JSON export always includes all objects found, since the search for "System" objects is not possible with the supplied JSON stream. To enable export of "System"-created objects, use the --SO or --system-objects parameter. For JSON output, --NSO will generate zero-length files as the dump.
- Log file path: -l <log_path> | --log-path <log_path> | -l=<log_path> | --log-path=<log_path>. Set the path for log files generated by the script; a folder path with no trailing "/", e.g. "./var/tmp/script/logs".
- Output file path: -o <output_path> | --output <output_path> | -o=<output_path> | --output=<output_path>. Set the path for output files generated by the script; a folder path with no trailing "/", e.g. "./var/tmp/script".
- Set export file path (CHANGED): -x <export_path> | --export <export_path> | -x=<export_path> | --export=<export_path>. Default: ./dump/<yyyy-mm-dd-hhmm-tz>. Set the path for export files generated by the script; a folder path with no trailing "/", e.g. "./var/tmp/script".
- Set import file path: -i <import_path> | --import-path <import_path> | -i=<import_path> | --import-path=<import_path>. Default: ./import.csv. Set the path for input files required by the script; a path with no trailing "/", e.g. "./var/tmp/script/input".
- Set delete file path: -k <delete_path> | --delete-path <delete_path> | -k=<delete_path> | --delete-path=<delete_path>. Default: ./delete.csv. Set the path for input files required by the script to identify what to delete; a path with no trailing "/", e.g. "./var/tmp/script/input".
- --NOWAIT: skip waiting for key input on some operations or when running in verbose mode.
- --CLEANUPWIP: remove the WIP folder created under some operational output operations (e.g. CSV exports) - PENDING IMPLEMENTATION.
- --NODOMAINFOLDERS: don't generate the domain-specific folders; files are domain-specific, so all are collected - PENDING IMPLEMENTATION.
- --CSVEXPORTADDIGNOREERR: automatically modify the CSV file to include the presumed ignore-error, ignore-warning, or set-if-exists values - PENDING IMPLEMENTATION.

Configuration Parameters in the Script [this section needs more work]

NOTE: This presumes some scripting ability and the capability to use a text editor. I recommend using the dos2unix command on any updated scripts once uploaded to the target management server host, to ensure compatibility.

These script examples attempt to provide some detailed tailoring and configuration via variables set for the specific script. Some of these configuration values are influenced by the command line parameters that can be passed to the script. This version does not make the approach overly generic (e.g. the name of the exported CSV file is hard-coded in the import), and future versions of this example set may more clearly abstract and configure command line input variables.

Key values to configure:

Export, Import, and Delete scripts:
- APICLIadmin: SmartConsole administrator name to use for operations.
- APICLIsessionfile: filename and path of the mgmt_cli session ID file generated by login and used for all subsequent mgmt_cli operations.

Export scripts:
- APICLIpathroot: root of the path for output files.
- APICLIpathbase: base path for output files; generally uses $APICLIpathroot and, for operations time delineation, can utilize the $DATE variable.
- APICLIfileoutputpre: general prefix for the output file; prefixes the filename in the full output file path.
- APICLIfileoutputext: file extension for the operational output file; default is .txt.
- APICLIfileoutputsufix: file suffix for the operational output file; default is $DATE.$APICLIfileoutputext, so generally <date_time_group>.txt.
- APICLIJSONfileoutputext: file extension for the mgmt_cli JSON output file; default is .json. NOTE: not used in this example.
- APICLIJSONfileoutputsufix: file suffix for the mgmt_cli JSON output file; default is $DATE.$APICLIJSONfileoutputext, so generally <date_time_group>.json. NOTE: not used in this example.
- APICLICSVfileoutputext: file extension for the generated CSV file; default is .csv.
- APICLICSVfileoutputsufix: file suffix for the operational output file; default is $DATE.$APICLICSVfileoutputext, so generally <date_time_group>.csv. NOTE: this was purposely done for the work utilizing this example, which stipulates a defined state of CSV output to export based on the time of execution. For those wanting a generic approach, the value can be set to be more static and not include the $DATE value element.
- APICLIObjectLimit (DO NOT MODIFY THIS VALUE): the maximum number of groups to export, providing the limit value for the mgmt_cli show groups command that populates the array of groups to export members from.
  (The API supports a "limit" value of 0 to 500; the default is set to 500 to ensure the maximum number of objects is collected.)
- APICLIoutput: full file path to the operational output file for later review of actions.

Import scripts:
- APICLIfileoutputpre: general prefix for the output file; prefixes the filename in the full output file path.
- APICLIfileoutputext: file extension for the mgmt_cli JSON output file; default is .json.
- APICLIfileoutputsufix: file suffix for the mgmt_cli JSON output file; default is $DATE.$APICLIfileoutputext, so generally <date_time_group>.json.
- OutputPathRoot: root of the path for output files.
- OutputPathBase: base path for output files; generally uses $OutputPathRoot and, for operations time delineation, can utilize the $DATE variable.
- CSVImportType: mgmt_cli type for the import operation; in this example it is group.
- CSVImportPathRoot: the path root for the location of the CSV file to import; in the example it is a sub-directory relative to the location of the script.
- CSVImportPathFile: the file name of the CSV file to import; in this case hard-coded based on the CSV output generated by the export operation. NOTE: this was purposely done for the work utilizing this example, which stipulates a defined state of CSV output to import. For those wanting a generic approach, the value can be set to be more static and not include the $DATE value element.
- CSVImportPath: the path to the CSV file to import, based on the $CSVImportPathRoot and $CSVImportPathFile variables.
- OutputPath: full file path to the operational output file for later review of actions.

Delete scripts: the same variables as the import scripts above apply (APICLIfileoutputpre, APICLIfileoutputext, APICLIfileoutputsufix, OutputPathRoot, OutputPathBase, CSVImportType, CSVImportPathRoot, CSVImportPathFile, CSVImportPath, OutputPath).

Modification of the script sections to suit personal preference and requirements is strongly encouraged via copy-paste. I may update these later, with some harmonization of common variables required and some abstraction options via command line parameters.

Why and What for...

These scripts were developed to address a pressing need in my own basement cloud laboratory, after some issues cropped up with my migrated management server, which has an original database starting from R70 that was migrated, upgraded, imported to Multi-Domain Management, and now exported from Multi-Domain Management - which has left the system a bit wonky and questionable. By creating scripts to handle the output of objects from my existing management server, I can use the CSV data to import into a clean, new installation, where I can start fresh with all my objects, but probably none of the baggage or garbage from almost 9 years of lab/home use. It is an excellent learning opportunity, and mentors like Uri Bialik help with this very much.

However, these scripts can also help with some other operations that may be necessary - probably requiring some tweaks, but the examples can help a bunch when starting out - operations like:
- duplicating group members after group import for laboratory environments, when building a from-scratch test environment but wanting to use familiar objects
- duplicating group members after group import to a different domain in Multi-Domain Management, when not wanting to use global objects. This will require some adjustments to handle the domain specification variables, but should not be rocket magic to make happen.
- operations in Professional Services, for either recovering from an exported baseline or assisting with pre-migration testing and rebuild operations
- having a backup of objects on hand for a bare-metal rebuild where an import or restore is not plausible or advisable

Future Improvements and Extensions:
- PARTIALLY IMPLEMENTED - implementing and improving error handling, currently omitted due to time constraints; for actual mgmt_cli operations, a concept for the approach exists.
- JSON IMPLEMENTED - export, import, and [yes, also] delete of services, very much focused on those created by the user, not the native services delivered with the installation.
- Configuration of exported fields for CSV, for more control at import.
- Interactive selection of which items to export, import, or delete, depending on the script.
- Port to Windows PowerShell scripting, once I figure out how to handle date-time-group values and jq in Windows PowerShell.
- Configuration of system object creators to exclude, and parameter control to select specific creator objects.
- PARTIALLY IMPLEMENTED - a common bash script element framework to simplify script development, which may mean more detailed script installation requirements. Later also for the PowerShell port.
- Addition of a global [or any use] naming prefix to each object exported to CSV, to allow alternative import, e.g. to move objects from a local domain to the global domain on a Multi-Domain Management implementation.
- Addition of specific information fields to the CSV export to help identify objects for other operations (e.g. not direct import), like uid, creator, state, etc. These dumps may require more work to prepare if import is desired.

Now on GITHUB

GitHub - mybasementcloud/R8x-export-import-api-scripts: Check Point R8x Export, Import, [and more] API scripts for bash …

FEEDBACK

If you need help with a problem using the script, please provide the following in any communication:
- the R80.x version being used and the management type (SMS or MDM)
- a CLI-level text capture (copy/paste of the terminal window for the execution) with the script run in "-v" (verbose mode), to ensure we see all comments and information from the execution. Do not use the "--password" option, to ensure that your password is not provided - I don't want to know it.
- confirmation that all command line parameters passed are correct
- any CSV or JSON output generated for the problem object(s); this can be compressed to save e-mail space
- if importing, setting, or deleting values: the utilized CSV file(s)
- a short explanation of what you are trying to accomplish and what is not being met, since we may be operating on different assumptions

Code Version

Code version 0.29.02 and later.

Tested on versions:
- R80, API version 1.0
- R80.10 EA, API version 1.? [2016-12 EA package]
- R80.10 GA, API version 1.1
- R80.10 New Kernel EA, API version 1.1
- R80.10 GA for Smart-1 525, 5050, 5150, API version 1.1
- R80.20 EA T354, EA T395 MDM and SMS; R80.20 GoGo EA 3.10 kernel gateways
- R80.20 GA T101
- R80.30 EA (Public)

Known Limitations:
- All versions up to current (RESOLVED with the v00.29.02 release): using the -r or --root command line operation fails to execute authentication, due to a limitation in the approach with parameterized command line parameters for the mgmt_cli command. This issue is being escalated for technical clarification, since the same values entered directly (instead of by evaluation parameter) work.
- All versions: using the -m or --management command line parameter is only supported when the script is not executed on the actual management server host that is the target. When executing the script on the actual management server host, DO NOT use the -m or --management command line parameter with the management IP address.

Change Log:
- 0.17.25: Updated scripts to include comprehensive command line parameter handling (CLI parameters). Added specific scripts for explicit object export to CSV. Added a delete-objects package for clean-up operations. Refined operation of scripts to leverage subroutines for repeated operations, with extensive parameterization to simplify adding more objects and services. Solved the way to pass a variable to the jq element in the export operation. Think I've solved the MDS jq location problem with older scripts. Providing packaged sets for export, export of specific objects, import, delete, and template shell version 0.5.0.
- 0.21.00: Updated to correct issues with assumptions about how the command line parameters for management, domain, and port work. Now able to handle domain and management server.
- 0.22.00: Corrected an issue with changes to file naming that caused action scripts to fail.
- 0.23.00: New scripts provided to just count the objects the script could find in the environment: "cli_api_get_object_totals.sh" and "cli_api_get_object_totals_w_group_member_count.sh". Useful to determine whether what the API sees - given the input parameters and user rights - is what the admin expects or should see, specifically to help with Multi-Domain operations, which may still need some more tweaking. The -P | --port <web ssl port> parameter was added to support ports other than 443 for management hosts. The -o | --output CLI parameter was renamed to -x | --export to conform with its actual purpose. A check for zero-size groups was added to export operations for group members. Export scripts now handle > 500 objects (the current maximum API limit value) and will iterate in 500-object chunks to process object sets larger than 500 objects; JSON files are broken out into 500-object sets, with the name adding an increment value in sets of 500; CSV files are still single, since they are built differently. For import, added API version 1.1 support for the "set-if-exists" option, with API version checking. Added more dependency checking before start and overhauled some operations to enable easier changes.
- 0.24.00: New script added to set values of existing objects: "cli_api_set-update_objects_from_csv.sh" in the import objects set. It will process all expected input files and skip operations where it does not find a file. Added handling for export, import, and set for hosts' interface objects. NOTE: the .interfaces[].subnet-mask value is not utilized; instead, the explicit .interfaces[].mask-length4 and .interfaces[].mask-length6 are utilized. Export scripts now handle > 500 objects on all exported object types and formats (both CSV and JSON); this now includes the handling of group members and host interfaces, which required more programming thought to identify the approach. Reduced the default wait time in "read -t <waittime>" from 600 to 15 and added the $WAITTIME value, which can be set higher if desired. Future addition of --waittime <wait-time> and --nowait options is under consideration.
- 0.25.00: Corrected a semantic issue with detecting empty object types (i.e. no objects of that type); now the CSV export will not exit the whole script if no objects are found. Extracted the command line parameter handler into a dedicated sub-script to simplify editing the other scripts. Addressed a semantic issue with the web ssl-port not being available before identifying the API version, and with the jq location not being available on MDM versus SMS installations. Prepared some main export files for future object handling expansion. NOTE: this is the last version that will have dedicated specific object handling scripts; the next script set will have an option to configure which objects to handle interactively.
- 0.25.01: Corrected spelling mistakes in output [an ongoing process - always].
- 0.27.05: Major overhaul of the script operational approach in some functions, with logical fixes to the semantic approach. Updated the command line parameter handler to variant 003. As of version 00.27.05, all cli_api*.sh scripts require the "cmd_line_parameters_handler.action.common.003.sh" script to handle command line parameters! Don't forget to copy this file also. Added the ability to configure export of objects created by "System", to remove the potential for import of system objects and to reduce the size of output. By default this value is set to --NSO or --no-system-objects, and objects created by "System" are ignored during the export of full JSON or CSV information. Standard JSON export always includes all objects found, since the search for "System" objects is not possible with the supplied JSON stream. To enable export of "System"-created objects, use the --SO or --system-objects parameter. For JSON output, --NSO will generate zero-length files as the dump. Templates were also updated to reflect changes in the command line handler and approach.
- 0.29.02: Resolved issues blocking use of the -r parameter for local administrator authentication; only usable on the actual host and will not work with the -m option (it will indicate if identified). Corrected the approach to selection of Check Point data objects, since the System creator is set by upgrade for all objects when upgrading from R77.30 and prior versions of management. Implemented the "common" folder for common operations, where future common code elements will live their own version lives, like the CLI parameter handler in the current release. Restructured much of the internal code to fix issues and simplify future updates. The release folder structure and the pressed files shared now reflect the actual approach used in operation and provide the total package, not just specifics.
- 0.29.05: Moved to providing Operations and Development branches for scripts in GitHub. General corrections and reworking of operations. Last release to include development of individual scripts for import and export of explicit objects.
- 0.31.00: Removed explicit-object import and export scripts from development operations and updates; legacy (version 00.29.05) variants of those scripts are now provided in the Development branch under ../export_import.wip/_Deprecated_Scripts. Continued corrections and refactoring. First release of the cli_api_export_all_domains_objects.sh and cli_api_export_all_domains_objects_to_csv_files.sh scripts, to handle all objects in all domains on an MDS the user has rights to.
- 0.33.00: Production release with updates and working identification of the local management host web SSL port for API login configuration. Operational improvements and template corrections in preparation for adding more objects supported for CSV export and future changes to allow selection of operations.

New native Ansible module in 2.10 devel branch - and how to get it working

Hello,

Since a few days ago there are several new modules in the development branch of Ansible, extending the very basic modules available since Ansible version 2.8 (see: https://docs.ansible.com/ansible/devel/modules/list_of_network_modules.html#checkpoint). Has anyone implemented a working playbook with these modules? I'm not sure whether I use them correctly with the httpapi plugin, but if I do so, I get the following error:

ansible.module_utils.connection.ConnectionError: 'Connection' object has no attribute '_session_uid'

The full traceback is:

Traceback (most recent call last):
  File "/home/pi/.ansible/tmp/ansible-local-21374ItToPW/ansible-tmp-1567247472.1-86947877065269/AnsiballZ_cp_mgmt_host.py", line 102, in <module>
    _ansiballz_main()
  File "/home/pi/.ansible/tmp/ansible-local-21374ItToPW/ansible-tmp-1567247472.1-86947877065269/AnsiballZ_cp_mgmt_host.py", line 94, in _ansiballz_main
    invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)
  File "/home/pi/.ansible/tmp/ansible-local-21374ItToPW/ansible-tmp-1567247472.1-86947877065269/AnsiballZ_cp_mgmt_host.py", line 40, in invoke_module
    runpy.run_module(mod_name='ansible.modules.network.checkpoint.cp_mgmt_host', init_globals=None, run_name='__main__', alter_sys=False)
  File "/usr/lib/python2.7/runpy.py", line 192, in run_module
    fname, loader, pkg_name)
  File "/usr/lib/python2.7/runpy.py", line 72, in _run_code
    exec code in run_globals
  File "/tmp/ansible_cp_mgmt_host_payload_N_OLDa/ansible_cp_mgmt_host_payload.zip/ansible/modules/network/checkpoint/cp_mgmt_host.py", line 333, in <module>
  File "/tmp/ansible_cp_mgmt_host_payload_N_OLDa/ansible_cp_mgmt_host_payload.zip/ansible/modules/network/checkpoint/cp_mgmt_host.py", line 328, in main
  File "/tmp/ansible_cp_mgmt_host_payload_N_OLDa/ansible_cp_mgmt_host_payload.zip/ansible/module_utils/network/checkpoint/checkpoint.py", line 189, in api_call
  File "/tmp/ansible_cp_mgmt_host_payload_N_OLDa/ansible_cp_mgmt_host_payload.zip/ansible/module_utils/connection.py", line 185, in __rpc__
ansible.module_utils.connection.ConnectionError: 'Connection' object has no attribute '_session_uid'

using this basic playbook:

- name: My First Playbook
  hosts: checkpoint
  connection: httpapi
  gather_facts: no
  tasks:
    - name: add-host
      cp_mgmt_host:
        ip_address: 192.0.2.1
        name: New Host 1
        state: present

In my inventory file I defined the host and the plugin to use:

[checkpoint]
192.168.100.5

[checkpoint:vars]
ansible_network_os=checkpoint
ansible_user=admin
ansible_password=adminpw

Is this a bug in the new series of modules, or am I using them in the wrong way? Can anyone post an example, including the necessary variable definitions, to make the plugin work? Thanks in advance and have a nice weekend,
Markus