
Allowing traffic based on URL Categorization for HTTPS Sites

Question asked by cezar varlan on Dec 11, 2018
Latest reply on Dec 11, 2018 by cezar varlan

URL Categorization for HTTPS sites, without HTTPS Inspection active, is, I believe, intended for blocking purposes. I have a use case where I would like to do the reverse and allow traffic only to a list of roughly 200 URLs, most of them HTTPS. Of course, if the list were static we could create a firewall rule allowing just the web servers' IPs, but the decision has to be made based on URL. HTTPS Categorization works based on the certificate CN, and I can confirm the CN is correct (sometimes we also have wildcard certificates). The issue manifests when those sites reference other sites (Google Analytics, ads, and other third-party sites serving dynamic content), where the browser can be seen trying, and failing, to establish TLS connections.
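For reference, matching a hostname against a certificate CN (including wildcard certificates) roughly follows the common RFC 6125 interpretation, where `*` matches exactly one DNS label. A minimal sketch in Python (hostnames are illustrative, and this is not Check Point's actual implementation):

```python
def cn_matches(hostname: str, cn: str) -> bool:
    """Check whether a hostname matches a certificate CN.

    A '*' label matches exactly one DNS label, per the common
    RFC 6125 interpretation of wildcard certificates.
    """
    host_labels = hostname.lower().split(".")
    cn_labels = cn.lower().split(".")
    if len(host_labels) != len(cn_labels):
        return False
    for h, c in zip(host_labels, cn_labels):
        if c == "*":
            continue  # wildcard label matches any single label
        if h != c:
            return False
    return True

# Illustrative checks:
print(cn_matches("www.example.com", "*.example.com"))   # True
print(cn_matches("a.b.example.com", "*.example.com"))   # False: wildcard covers one label only
print(cn_matches("example.com", "*.example.com"))       # False: label counts differ
```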



After those connection attempts time out, the originally allowed site finally loads, but this can take up to the number of external references multiplied by the TLS timeout. Some load times we could optimize by adding Google Analytics to the allowed group, but we cannot account for every third-party tracking site out there.
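The worst-case extra load time described above is just the number of blocked third-party references multiplied by the TLS handshake timeout, assuming the browser attempts them sequentially and each waits out the full timeout. A back-of-the-envelope sketch (the timeout value is an assumption, not a measured Check Point or browser default):

```python
def worst_case_delay(blocked_refs: int, tls_timeout_s: float) -> float:
    """Upper bound on added page-load time when every blocked
    handshake is attempted sequentially and waits for the full
    TLS timeout before the browser gives up."""
    return blocked_refs * tls_timeout_s

# e.g. 10 blocked trackers with an assumed 30 s handshake timeout
print(worst_case_delay(10, 30.0))  # 300.0 seconds
```

In practice browsers parallelize connections, so the real delay is usually lower, but it still scales with the number of blocked references.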


Example rule: 

"SiteuriPermise" is the Allowed Sites Group. The second rule is useless and there for troubleshoot to make sure no packets pass the negated Drop. 



1. Does anyone have good tips on how to optimize or approach this issue? We are preparing to start doing HTTPS Inspection, but this rule is meant for some sources that are not managed and not even Windows, so from the start they would NOT trust the MitM certificate issued by our organization. Even so, this would only partially solve the issue, and only for internal hosts. I understand that once R80.20 is allowed on the gateway (we are using vSEC, so this is R80.10 per the HCL at the time of writing) we would be able to use Categorization on traffic bypassed from HTTPS Inspection, but we would still run into the same speed issue.


At this moment I am unsure whether this is a bug or expected behavior. Other URL filtering solutions we used before (Squid?!) would load the main site alongside the third-party scripts.


2. Would running the gateway in Explicit Proxy mode help in any way with this behavior?
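One reason an explicit proxy might behave differently: the client sends the target hostname in an HTTP CONNECT request before any TLS handshake, so the proxy can refuse a disallowed site immediately (e.g. with 403) instead of silently dropping packets until the browser's TLS timeout fires. A minimal sketch of that allow-list decision (site names are illustrative, and this is not a description of Check Point's proxy behavior):

```python
# Hypothetical allow-list, in the spirit of the "SiteuriPermise" group
ALLOWED_SITES = {"intranet.example.com", "partner.example.org"}

def handle_connect(request_line: str) -> str:
    """Decide on an HTTP CONNECT request line,
    e.g. 'CONNECT host:443 HTTP/1.1'."""
    method, target, _version = request_line.split()
    if method != "CONNECT":
        return "HTTP/1.1 400 Bad Request"
    host = target.rsplit(":", 1)[0]
    if host in ALLOWED_SITES:
        return "HTTP/1.1 200 Connection Established"  # tunnel would be opened here
    return "HTTP/1.1 403 Forbidden"  # immediate refusal, no TLS timeout

print(handle_connect("CONNECT intranet.example.com:443 HTTP/1.1"))
print(handle_connect("CONNECT tracker.example.net:443 HTTP/1.1"))
```

Because the refusal arrives as an HTTP response rather than dropped packets, the browser can give up on the blocked third-party reference right away.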


Topics relevant to this thread that I've found:


A similar thread (I assume with the same problem):