URL Categorization for HTTPS Sites - without HTTPS Inspection active
I believe this feature is intended for blocking purposes. I have a use case where I would like to do the reverse and allow traffic only to a list of 200 URLs, most of them HTTPS. If the list were static we could simply add a firewall rule allowing the web servers' IPs, but the match has to be done based on URL. HTTPS Categorization works based on the certificate CN, and I can confirm that the CN is correct (sometimes we also have wildcard certificates). The issue manifests when those sites reference other sites (Google Analytics, ads, and other third-party sites serving dynamic content), where the browser can be seen trying to establish TLS connections.
After each of those times out, the originally allowed site is finally loaded, but this may take up to "number of external references" multiplied by "TLS timeout". We could optimize some of the load times by adding Google Analytics to the group, but for the example above we cannot account for all the third-party tracking sites out there.
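To put rough numbers on that formula (both values below are assumptions for illustration, not measurements from our environment), the worst case adds up quickly:

# Back-of-the-envelope estimate of the extra page load time caused by
# silently dropped third-party TLS handshakes. Both inputs are assumed
# values for illustration only.
num_external_refs = 10       # assumed: third-party references on the page
tls_timeout_seconds = 20     # assumed: client-side TLS connect timeout

worst_case_delay = num_external_refs * tls_timeout_seconds
print(f"Worst-case extra load time: {worst_case_delay} seconds")  # 200 seconds

Even with only ten external references and a 20-second timeout, the page can stall for several minutes before it finally renders.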
Example rule:
"SiteuriPermise" is the Allowed Sites Group. The second rule is useless and there for troubleshoot to make sure no packets pass the negated Drop.
1. Does anyone have any good tips on how to optimize or approach this issue? We are preparing to start doing HTTPS Inspection, but this rule is meant for some sources that are not managed and not even Windows, so from the start they would NOT trust the MITM certificate issued by our organization; even then it would only partially solve the issue, and only for internal hosts. I understand that once R80.20 is allowed on the gateway (we are using VSEC, so this is R80.10 as per the HCL at the time of writing) we would be able to use Categorization on traffic bypassed by HTTPS Inspection, but we would still run into the same speed issue.
At this moment I am unsure whether this is a bug or expected behavior. Other URL filtering solutions used before (Squid?!) would allow the main site to load alongside the third-party scripts.
2. Would running the Gateway in Explicit Proxy mode help in any way with this behaviour?
Topics relevant to this thread that I've found:
https://community.checkpoint.com/message/29790-url-filtering-categorization
https://community.checkpoint.com/thread/6984-url-filtering-without-https-inspection
https://community.checkpoint.com/message/21870-https-categorizaion-a-drama
A similar thread (I assume with the same problem):
https://community.checkpoint.com/message/29970-re-unable-to-access-a-banks-site-even-though-its-allo...
Hi Cezar,
The problem you mention seems to be caused by the Drop action at the Application Layer on R80.10 and up:
"After each of those times out, the originally allowed site is finally loaded, but this may take up to "number of external references" multiplied by "TLS timeout". We could optimize some of the load times by adding Google Analytics to the group, but for the example above we cannot account for all the third-party tracking sites out there."
Remember that on R77.30 the action was Block, which no longer exists for the Application Layer.
One workaround I found for this load time was to change the action from Drop to Reject for HTTPS; this way the firewall tells the PC that the site is not reachable, so there is no retry for the connection.
You can see my discussion about this here: Web Advertisements drop in R80.10
Regards
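To illustrate why Reject helps (a generic sketch of TCP behaviour, not Check Point-specific internals): with Drop the firewall silently discards the handshake packets, so the client waits out its full connect timeout, while Reject answers with a TCP reset and the client gives up immediately. A minimal Python sketch of how a client experiences the two actions; the host names are placeholders:

import socket
import time

def try_connect(host: str, port: int = 443, timeout: float = 5.0) -> None:
    # Attempt a TCP connection and report which failure mode the client sees.
    start = time.monotonic()
    try:
        with socket.create_connection((host, port), timeout=timeout):
            print(f"{host}: connected in {time.monotonic() - start:.1f}s")
    except socket.timeout:
        # Drop: packets are silently discarded, so we burn the full timeout.
        print(f"{host}: timed out after {time.monotonic() - start:.1f}s (Drop-like)")
    except ConnectionRefusedError:
        # Reject: the firewall answers with a TCP RST, so we fail almost instantly.
        print(f"{host}: refused after {time.monotonic() - start:.1f}s (Reject-like)")
    except OSError as exc:
        # Anything else (e.g. DNS failure for the placeholder names below).
        print(f"{host}: failed ({exc}) after {time.monotonic() - start:.1f}s")

# Placeholder hosts -- point these at a site behind each firewall action to compare.
try_connect("dropped.example.com")
try_connect("rejected.example.com")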
I have to let you know that this works. For sites that have "correct" or "compliant" certificates it works just fine!