We have a custom site that we've created an access rule for, so all users can access it. However, the page only partially loads. Looking into the logs shows that the images used by the website fail to load: they are being blocked because they are hosted on an external site (*.cloudfront.net) that isn't explicitly allowed.
I'd like to be able to allow the site to load these resources for our users without whitelisting all of cloudfront.net.
I feel like this is doable, but I'm missing something.
I am sure that the images are not literally hosted on *.cloudfront.net.
I understand that CloudFront is just the AWS CDN; however, the images are referenced via CloudFront URLs.
When inspecting, I see that the site is trying to load the images from cloudfront.net and is being blocked.
I enjoy your sarcasm, but I'm hoping for some constructive help.
Hi David,
From both a security and a site design/development perspective, I'd suggest you download the resources and serve them in the HTML from your own hosting service (page and resources from the same domain).
Hi Ron,
I have asked the team that handles our web development to do just this; however, they hope that I can resolve this without their needing to change anything. I'll keep pushing for them to follow best practices.
Thank you
Ever heard of the program called Fiddler? It is a freeware tool, like Wireshark, that sits on your PC between the browser and the internet like a proxy; that way it can intercept all traffic and all the URLs being called.
When you start Fiddler and then load this page in a clean browser, it will show every URL the page opens. It's a very valuable tool in these types of cases.
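For the static references at least, a quick complement to Fiddler is to parse the page's HTML and list every resource URL it embeds. A minimal sketch in plain Python (stdlib only, hypothetical sample markup); note this only catches resources referenced directly in the markup, not ones loaded by JavaScript, which Fiddler would also capture:

```python
# List the resource URLs a page references, to see which CDN hosts a
# whitelist rule would have to cover. Static HTML only; JS-loaded
# resources won't appear here.
from html.parser import HTMLParser


class ResourceLister(HTMLParser):
    """Collects src/href URLs from tags that typically pull in resources."""

    RESOURCE_TAGS = {"img": "src", "script": "src", "link": "href"}

    def __init__(self):
        super().__init__()
        self.urls = []

    def handle_starttag(self, tag, attrs):
        attr_name = self.RESOURCE_TAGS.get(tag)
        if attr_name:
            for name, value in attrs:
                if name == attr_name and value:
                    self.urls.append(value)


def list_resources(html_text):
    parser = ResourceLister()
    parser.feed(html_text)
    return parser.urls


# Hypothetical page fragment referencing a CDN-hosted image:
sample = (
    '<html><body>'
    '<img src="https://d111111abcdef8.cloudfront.net/logo.png">'
    '<script src="/local/app.js"></script>'
    '</body></html>'
)
print(list_resources(sample))
# → ['https://d111111abcdef8.cloudfront.net/logo.png', '/local/app.js']
```

Every cloudfront.net entry in that list is a reference the gateway would have to allow for the page to render fully.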
I had not heard of Fiddler, but this seems like an excellent tool to add to my kit. I'll take a look into it!
I suspect that this is not possible, unless you can create a custom application that will be able to identify traffic regardless of its provenance.
This is a major headache, however.
Take a look at:
Signature Tool for custom Application Control and URL Filtering applications
and read the "Example for sk103051 - Detecting SSL traffic by DN":
I haven't tried running the custom application tool yet; just from reading the articles, it looks like it would be a headache to assemble. I'll take a further look and tinker with this idea.
Maybe with the custom signature you can make a signature based on the HTTP headers:
Host: *.cloudfront.net
Referer: <your site's domain name>
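This is not Check Point signature syntax, but the matching logic the suggestion describes can be sketched in plain Python: allow a request only when the Host header is a cloudfront.net name and the Referer points back to your own site. "example.com" below is a hypothetical stand-in for your site's domain:

```python
# Sketch of the Host + Referer matching idea: a *.cloudfront.net request
# is allowed only if it was referred by our own site.
from fnmatch import fnmatch
from urllib.parse import urlparse

OUR_SITE = "example.com"  # assumption: replace with your site's domain


def should_allow(headers):
    """Return True if a *.cloudfront.net request was referred by our site."""
    host = headers.get("Host", "")
    referer = headers.get("Referer", "")
    referred_by_us = urlparse(referer).hostname == OUR_SITE
    return fnmatch(host, "*.cloudfront.net") and referred_by_us


print(should_allow({"Host": "d111111abcdef8.cloudfront.net",
                    "Referer": "https://example.com/page"}))   # → True
print(should_allow({"Host": "d111111abcdef8.cloudfront.net",
                    "Referer": "https://evil.test/page"}))     # → False
```

Keep in mind the Referer header is client-supplied and easily spoofed, so this narrows the exposure rather than guaranteeing the source.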
It is just not so easy to define a custom site that can differentiate between the CloudFront URLs that serve your pictures and resources and all the other CloudFront URLs. The only way I can think of is using RegEx together with folder names, e.g. /173x512/, to discriminate between random CloudFront URLs and your image URLs.
Hi Gunther, how would you implement this approach? I'm unsure how we would be able to use the folder names to allow the referenced resources within. Is this something that would be added somewhere within the signature tool?
Thank you
You can do that with the signature tool using RegEx (see Custom Application Control (Check Point Signature Tool)), but I would just use the R80.20 Dashboard and create a Custom Application with URLs defined as RegEx (see URL Filtering Policy). The great but most difficult part is creating the correct regular expression; using the Fiddler list from above and online RegEx testing tools, it shouldn't be too much work 😉
Hi David,
I agree with Vladimir: if not all content is hosted on the same site, it really is hard with a white-list approach. I actually had the same challenge; here is my post: https://community.checkpoint.com/thread/9701-school-exam
Hi Enis,
What solution did you end up using, whitelisting the resource's IP address when needed? I don't think that would work well in my scenario.
I am seeing the same issue. Has anybody come up with a solution?
We're moving over to a blacklist approach to solve this issue.
With today's website structure, where resources are hosted on other sites/platforms/CDNs, a blacklist approach just makes sense.
The initial overhead is much larger: determining what you need to block to maintain a level of security similar to a whitelist is the big task. Once implemented, it's much easier to maintain; you'll just need to play whack-a-mole blocking any sites your users shouldn't be reaching that aren't yet in the blacklist, so setting up some daily reports for bandwidth or application usage is helpful.
A blacklist is intrinsically less secure than a whitelist, but a well-maintained implementation with appropriate security applications will help mitigate that risk.