
Performance Test Documentation: HTTPS Inspection and Content Awareness

The video documents performance test results in which a 16000 appliance achieves 5 Gbps while sustaining 3000 connections per second, performing HTTPS inspection, Content Awareness, and attack prevention. The test is based on the IXIA Enterprise Traffic Mix and uses the R80.30 software release.


Hello -- thanks for the great post. I came to CheckMates to post a new topic, but decided to reply here instead.

The ever-increasing percentage of HTTPS traffic in the enterprise is affecting our ability to properly size appliances for customer needs.

At some point, all Check Point sizing criteria must assume HTTPS_Decrypt will be enabled (vs. the opposite today). It would be a forward-thinking leadership approach for CP to publicly state HTTPS_Decrypt sizing numbers so customers (and resellers) can make more intelligent sizing decisions.

Yes, I understand this is a complicated topic with many permutations, but I suggest that doing something (while clearly detailing how you achieved the results) would be better than operating in a vacuum. Again: clearly spell out the blend of traffic, including a mix of typical enterprise cloud-based sites, with total HTTPS traffic at 75-80% of the total. Put this in a packet capture, along with some clearly defined attachments to inspect (for the NGTP features).

I envision R&D having all this scripted for easy reproduction of HTTPS_decrypt results for (a) the various CP model lines -- 3k, 5k, 6k, etc. (pick the highest -- or lowest -- model in each line), and (b) new GAIA releases.

Furthermore, the packet capture and associated blend could have a published "release version", as the blend may need to be adjusted based on customer feedback or industry developments. CP could take this one step further with different blends for different industries (example: Energy, Financial, etc.). For example: you would HTTPS-decrypt load test Energy_traffic_blend_v1.1 on any given appliance.

If you pick the highest or lowest model in each category, you could relegate specific-model testing to the Solution Center, based on field requests from sales.

My biggest concerns are how HTTPS decryption is affected by (a) CPU differences across the product line (number of cores), and (b) the version of GAIA and the associated Jumbo Hotfix.

For example: I assume the number of cores has a large impact on HTTPS_decrypt throughput for the smaller appliance models (which have a limited number of cores). In addition, it would be great to understand how much adding cores affects HTTPS_decrypt performance (example: if you double the cores at the same CPU speed, do you get 2x throughput, or more?).
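To illustrate why doubling cores rarely doubles throughput, here is a minimal sketch using Amdahl's law. The 90% parallel fraction is a purely hypothetical assumption for illustration, not a measured Check Point figure; real HTTPS_decrypt scaling would depend on how the handshake and inspection work is distributed across worker instances.

```python
def amdahl_speedup(cores: int, parallel_fraction: float) -> float:
    """Amdahl's law: overall speedup when only `parallel_fraction`
    of the work can be spread across `cores` CPUs."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# Hypothetical: if 90% of the HTTPS-decrypt work parallelizes,
# going from 4 to 8 cores at the same clock speed yields only
# about a 1.5x gain, not 2x.
gain = amdahl_speedup(8, 0.90) / amdahl_speedup(4, 0.90)
print(round(gain, 2))  # → 1.53
```

The point of the sketch is that the answer to "do you get 2x?" hinges on the serial fraction of the inspection path, which is exactly the kind of number vendor testing could surface.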

Just an idea for @Dorit_Dor  and team. 



Thank you, Garrett, for your feedback. I will talk to my peers in R&D and share it.

Unfortunately, understanding the performance impact of HTTPS inspection, prevention of known and unknown attacks, and access control requires real dedication to the topic. I am happy to see you taking on this challenge and going the extra mile to understand it. We need more engineers with your dedication!

Whenever I work with partners and customers who are open to a dialog of at least 20 minutes on this topic, I observe an increase in understanding. Unfortunately, in our work we are often forced to provide short answers to complex questions (such as "what is the performance impact of xyz?"). But when it comes to performance, short answers often don't help customers in the long run.

In the documentation I have provided here, I changed the point of view: we may want to look at throughput while the gateway simultaneously sustains a certain number of connections per second, given that the gateway is loaded to a certain percentage while securing a certain traffic mix. Ask competitors the same question: they all give you results for either throughput or the connections-per-second rate. I have never seen any market player document both figures for the same test. Since the two dimensions compete for the same resources, you can't have the maximum value of both at the same time. Unfortunately, documenting "not the max" is not popular.
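As an illustration of why the two figures only make sense together, here is a back-of-envelope check on the numbers from the test above (5 Gbps sustained at 3000 connections per second). The steady-state assumption -- connections complete at the same rate they open -- is mine, for illustration only.

```python
# Back-of-envelope: what average flow size do the headline numbers imply?
# Assumption (hypothetical steady state): connections finish at the same
# rate they are opened, so throughput ≈ CPS * avg_bits_per_connection.
throughput_bps = 5e9   # 5 Gbps sustained (from the test above)
cps = 3000             # connections per second, sustained concurrently

avg_bits = throughput_bps / cps
avg_kib = avg_bits / 8 / 1024
print(round(avg_kib))  # → 203 (≈ 203 KiB per connection)
```

A quoted "max throughput" measured with a handful of long-lived flows, or a "max CPS" measured with tiny transactions, would describe a very different traffic profile than this combined figure does.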

If we - as a community - could agree that vendors should start documenting performance the way it is documented in this test, we might be one step closer to a better understanding of performance in our market. As long as the market focuses on documenting "max numbers measured in dedicated tests", we still won't meet customers' need to understand what these numbers mean for their environments.

Let me know what you think of this approach.

best regards

peter (pelmer)