whitelist AWS S3 buckets using complex URI / URL patterns?

We're working with a customer who wishes to make a whitelist entry for a range of AWS S3 bucket addresses in their firewall. The names would be in the form:

abc-*-xyz.s3-us-east-2.amazonaws.com

OR

abc-*-xyz.s3.us-west-1.amazonaws.com

Where the "*" would be a randomly generated string that maps to an ephemeral name for a particular S3 bucket.

They claim this is not possible because the host in the URI has more than three parts. So they say that if it were "abc-*-xyz.amazonaws.com" it could work, but the additional labels in the host make it an invalid authority for a whitelist entry.

Is that true? Might it be a limitation of some very old version? I would welcome any pointers to appropriate documentation about this as well as answers.

Thanks!

7 Replies
Admin

Re: whitelist AWS S3 buckets using complex URI / URL patterns?

That doesn't sound right.

What steps are they following to try and do this on which version?

Re: whitelist AWS S3 buckets using complex URI / URL patterns?

These were, of course, my first questions as well. I am awaiting replies on both. The other detail that I do know is that this is related to deep packet inspection with SSL. The reason for the whitelisting is to have the data streams inbound from S3 on those address patterns exempt from the inspection.

In general, do you believe that a host in a URI can only have 3 parts? Or can it be arbitrarily long so long as it is a valid host name?
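For what it's worth, a hostname can have many dot-separated labels (RFC 1035 limits each label to 63 octets and the whole name to 253, but imposes no "three parts" cap). A quick illustration with Python's standard library, using a made-up bucket name:

```python
from urllib.parse import urlparse

# A host with five labels parses as a perfectly valid URI authority.
url = "https://abc-123-xyz.s3.us-west-1.amazonaws.com/some/object"
parsed = urlparse(url)
print(parsed.hostname)  # abc-123-xyz.s3.us-west-1.amazonaws.com

# No label-count limit applies; this name has five labels.
labels = parsed.hostname.split(".")
print(len(labels))  # 5
```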

Admin
Admin

Re: whitelist AWS S3 buckets using complex URI / URL patterns?

I've never heard of such a limitation myself (related to the number of labels in a URI host).

Re: whitelist AWS S3 buckets using complex URI / URL patterns?

Spoke to an engineer today who pointed out that if you create the URL as a regex instead of a plain URL, you can essentially use any form you need to. So that may be a solution for us in this case, and certainly may help others facing similar challenges. This would then be connected to the https inspection policy as an exception to avoid the deep packet inspection.
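A sketch of what such a regex might look like. The exact syntax the firewall accepts may differ; this uses Python's `re` module purely to illustrate the pattern, and the `[a-z0-9-]+` character class for the random bucket segment is an assumption:

```python
import re

# Hypothetical pattern covering both hostname forms from the question:
#   abc-<random>-xyz.s3-us-east-2.amazonaws.com   (dash form)
#   abc-<random>-xyz.s3.us-west-1.amazonaws.com   (dot form)
# [.-] accepts either the dot or the dash between "s3" and the region.
pattern = re.compile(
    r"^abc-[a-z0-9-]+-xyz\.s3[.-]us-(east-2|west-1)\.amazonaws\.com$"
)

print(bool(pattern.match("abc-4f9qz-xyz.s3-us-east-2.amazonaws.com")))  # True
print(bool(pattern.match("abc-4f9qz-xyz.s3.us-west-1.amazonaws.com")))  # True
print(bool(pattern.match("evil.example.com")))                          # False
```

Note the escaped dots: in a plain-URL field a dot is literal, but in a regex an unescaped `.` matches any character, which would make the pattern looser than intended.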

[Screenshot: application site creation dialog in the policy tool, with a regular expression entered for the URL]

Admin

Re: whitelist AWS S3 buckets using complex URI / URL patterns?

Regex is definitely the way to go with this.

As I recall, you can't use a custom application in the HTTPS Inspection rulebase, but you should be able to use the "category".

Is that working for you?


Re: whitelist AWS S3 buckets using complex URI / URL patterns?

I'm doing all of this through proxies (people), but the way this engineer configured it was to add an entry to the HTTPS inspection policy to bypass anything that matched those regexes for the URLs. That did seem to do the trick, as far as we could tell. This was done on 80.10, but it's still not clear exactly which version the client has.

[Screenshot: HTTPS Inspection policy rulebase, showing a rule with a Bypass action for these regex URL patterns]


Re: whitelist AWS S3 buckets using complex URI / URL patterns?

Looks like this was all the result of confusion. With some help from Brian Butts, what we discovered is that if you request something like bucketname.s3.us-west-2.amazonaws.com from AWS, the response comes from s3.us-west-2.amazonaws.com, and the certificate presented shows *.s3-us-west-2.amazonaws.com. These translations, which cut off the bucket name, likely account for the belief that the rules would not work with longer addresses: the longer addresses were never involved in the responses, so a bypass matching them would never fire. So it's a different issue altogether.
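The dash/dot difference also matters for certificate matching: a wildcard such as *.s3-us-west-2.amazonaws.com covers exactly one leftmost label, so the dot-form request hostname can never match it. A minimal illustration using a simplified single-label wildcard matcher in the spirit of RFC 6125 (not Check Point's or AWS's actual matching code):

```python
def wildcard_cert_matches(cert_name: str, hostname: str) -> bool:
    """Simplified RFC 6125-style match: '*' covers exactly one leftmost label."""
    cert_labels = cert_name.lower().split(".")
    host_labels = hostname.lower().split(".")
    if len(cert_labels) != len(host_labels):
        return False
    if cert_labels[0] == "*":
        cert_labels = cert_labels[1:]
        host_labels = host_labels[1:]
    return cert_labels == host_labels

cert = "*.s3-us-west-2.amazonaws.com"  # what the served certificate shows

# The dot-form request hostname does NOT match the dash-form wildcard:
print(wildcard_cert_matches(cert, "bucketname.s3.us-west-2.amazonaws.com"))  # False

# The dash-form hostname does:
print(wildcard_cert_matches(cert, "bucketname.s3-us-west-2.amazonaws.com"))  # True
```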

I've opened a new thread to discuss this: https://community.checkpoint.com/message/28606-base-a-bypass-rule-on-the-request-address-not-the-res... 
