A combination of Python code and a GitHub Actions CI/CD pipeline (workflow) that creates and updates the cdns.json file only when needed, for use as a Generic Data Center object (R81) or a Network Feed object (R81.20).
Additionally, the project creates a Dynamic Objects file that can be executed on the gateways in question, i.e., those that use a Dynamic Object of the same name (Cloudflare & Akamai) and have the object in use in a policy. Its distribution could be automated on a schedule with your automation tool of choice, such as Ansible. Dynamic Objects are useful for Check Point R77.x or R80 code.
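As a rough illustration, the Dynamic Objects file can be thought of as a script of `dynamic_objects` CLI calls, one per address range. The sketch below is an assumption about that shape; the object names are from this project, but the exact CLI flags and output format should be verified against the repository and the gateway's `dynamic_objects` documentation:

```python
# Hypothetical sketch: render a dynamic_objects update script from IP ranges.
# The CLI flags (-do, -n, -o/-r/-a) are assumptions based on the common
# dynamic_objects syntax, not this project's verified output.

def render_dynamic_objects(obj_name, ranges):
    """Build shell lines that refresh one Dynamic Object on a gateway."""
    lines = [
        f"dynamic_objects -do {obj_name}",  # delete the old object (assumed flag)
        f"dynamic_objects -n {obj_name}",   # create it anew (assumed flag)
    ]
    for first, last in ranges:
        # -o <obj> -r <from> <to> -a appends one address range (assumed syntax)
        lines.append(f"dynamic_objects -o {obj_name} -r {first} {last} -a")
    return "\n".join(lines)

script = render_dynamic_objects("Cloudflare", [("1.1.1.0", "1.1.1.255")])
print(script)
```

A file built this way can simply be executed on each gateway in expert mode, which is why a scheduler plus a copy/run step in Ansible is enough to automate it.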
The interesting part of the project is the use of a scheduled GitHub Actions CI/CD pipeline; that way, the compute and networking are offloaded to GitHub.
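A scheduled workflow of this shape could look like the following minimal sketch; the workflow name, script path, cron interval, and commit step are illustrative assumptions, not the repository's actual workflow:

```yaml
# Illustrative sketch only; verify names and paths against the repository.
name: update-cdn-feed
on:
  schedule:
    - cron: "0 */6 * * *"   # run every 6 hours (assumed interval)
  workflow_dispatch: {}      # also allow manual runs
jobs:
  update:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.12"
      - run: python main.py   # script name is an assumption
      - name: Commit only when the generated files differ
        run: |
          git config user.name "github-actions"
          git config user.email "actions@github.com"
          git add cdns.json dynamic_objects.txt
          if ! git diff --cached --quiet; then
            git commit -m "Update CDN feed"
            git push
          fi
        # The git diff guard is what keeps commits (and downstream feed
        # fetches) from happening when the upstream data has not changed.
```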
Both files, cdns.json and dynamic_objects.txt, are updated only when the upstream input data has changed.
A UUID is autogenerated when the cdns.json file does not exist, i.e., on the first run.
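The update-only-on-change behavior together with first-run UUID generation can be sketched as follows; the function name and JSON layout are illustrative assumptions, not the project's actual cdns.json schema:

```python
# Sketch: rewrite the feed file only when its payload changed, and mint a
# UUID only on the very first run. Field names here are assumptions.
import json
import uuid
from pathlib import Path

def update_feed(path: Path, ranges: list) -> bool:
    """Return True when the file was (re)written, False when unchanged."""
    if path.exists():
        current = json.loads(path.read_text())
        feed_uuid = current["uuid"]        # keep the existing, stable UUID
        if current["ranges"] == ranges:    # upstream data unchanged: no write
            return False
    else:
        feed_uuid = str(uuid.uuid4())      # first run: autogenerate the UUID
    path.write_text(json.dumps({"uuid": feed_uuid, "ranges": ranges}, indent=2))
    return True
```

Keeping the UUID stable across rewrites matters because the management server identifies the Generic Data Center object by it; only the address ranges should change between runs.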
Separating the input data from the processing code makes the code easier to read and understand. I call the input data language CDNQL (Content Delivery Network Query Language).
There is additional integration with Check Point Sourceguard for a code security scan on each code change, as well as with SpectralOps (Check Point's latest acquisition) to scan for any potential secrets leak, such as passwords or API keys. All of it runs as GitHub Actions workflows.
Anybody can fork the project and run it on their own systems, the same way on GitHub, or on any other DevOps platform such as GitLab, Azure DevOps, etc.
All information on integration as GDC, Network Feed, or Dynamic Objects, including object creation over the Check Point Management API, is on the GitHub repository.
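For the Management API route, object creation generally follows a login, add, publish sequence over the Web API. The sketch below assumes the `add-generic-data-center-server` command and its parameter names, as well as placeholder host and credentials; verify all of them against the repository and the official Management API reference for your version:

```python
# Illustrative login -> add -> publish flow against the Check Point
# Management Web API. The add-generic-data-center-server payload fields
# ("data-center-type", "url") are assumptions to be checked per API version.
import json
import urllib.request

def build_add_payload(name: str, feed_url: str) -> dict:
    """Payload for creating a Generic Data Center object (assumed fields)."""
    return {"name": name, "data-center-type": "json", "url": feed_url}

def _post(url: str, body: dict, sid: str = None) -> dict:
    headers = {"Content-Type": "application/json"}
    if sid:
        headers["X-chkp-sid"] = sid   # session id returned by /login
    req = urllib.request.Request(url, data=json.dumps(body).encode(),
                                 headers=headers)
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

def create_gdc_object(mgmt: str, user: str, password: str,
                      name: str, feed_url: str) -> None:
    base = f"https://{mgmt}/web_api"
    sid = _post(f"{base}/login", {"user": user, "password": password})["sid"]
    _post(f"{base}/add-generic-data-center-server",
          build_add_payload(name, feed_url), sid)
    _post(f"{base}/publish", {}, sid)   # changes take effect only on publish
```

The publish step is what makes the new object visible to policies, so it belongs in the same automated sequence as the add call.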
The code is under MIT License:
https://github.com/Senas23/cp_gdc_cdns