Allowing a custom site with externally hosted images
We have a custom site and have created an access rule so all users can reach it. However, the page only partially loads. The logs show that the images used by the website fail to load: they are being blocked because they are hosted on an external site (*.cloudfront.net) that isn't explicitly allowed.
I'd like the site to load these pages fully for our users without whitelisting cloudfront.net.
I feel like this is doable, but I'm missing something.
I am sure that the images are not hosted on *.cloudfront.net.
I understand that CloudFront is just the AWS CDN; however, the images are referenced via CloudFront URLs. I understand that they are not literally hosted on cloudfront.net.
When inspecting, I see that the site is trying to load the images from cloudfront.net and is being blocked.
I enjoy your sarcasm, but I'm hoping for some constructive help.
Hi David,
From both a security and a site design/development perspective, I suggest you download the resources and serve them in the HTML from your own hosting service, so that the page and its resources come from the same domain.
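If your web team wants a concrete starting point, here is a rough Python sketch of what "serve the resources from your own domain" looks like in practice: it downloads the externally referenced images and rewrites the HTML to point at local copies. The page URL, output directory, and CDN pattern are placeholders, not details from this thread.

```python
# Rough sketch: mirror externally hosted images locally and rewrite the HTML
# so that the page and its resources are served from the same domain.
# PAGE_URL, LOCAL_DIR and CDN_PATTERN are placeholders, not real values.
import os
import re
import urllib.request
from urllib.parse import urlparse

PAGE_URL = "https://intranet.example.com/custom-site/"   # hypothetical page
LOCAL_DIR = "static/img"                                  # where local copies go
CDN_PATTERN = re.compile(
    r"https://[a-z0-9]+\.cloudfront\.net/\S+?\.(?:png|jpe?g|gif|svg)"
)

html = urllib.request.urlopen(PAGE_URL).read().decode("utf-8", errors="replace")
os.makedirs(LOCAL_DIR, exist_ok=True)

for cdn_url in sorted(set(CDN_PATTERN.findall(html))):
    filename = os.path.basename(urlparse(cdn_url).path)
    local_path = os.path.join(LOCAL_DIR, filename)
    urllib.request.urlretrieve(cdn_url, local_path)       # download a local copy
    html = html.replace(cdn_url, "/" + local_path)        # reference the copy instead

with open("index.rewritten.html", "w", encoding="utf-8") as f:
    f.write(html)
```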
Hi Ron,
I have asked the team that handles our web development to do just this; however, they are hoping I can resolve this without their needing to change anything. I'll keep pushing for them to follow best practices.
Thank you
Have you ever heard of the program called Fiddler? It is a freeware tool, like Wireshark, that sits on your PC between the browser and the internet like a proxy; that way it can intercept all traffic and every URL that gets called.
When you start Fiddler, open a clean browser, and load the page, it will show every URL that gets opened, which is very valuable in these types of cases.
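For anyone who cannot install Fiddler, a much more limited check can be scripted: parse the page's HTML and list the external hosts referenced in src/href attributes. This is only a sketch under assumptions (the page URL is a placeholder), and unlike Fiddler it will not see requests made later by JavaScript.

```python
# Minimal sketch: list the external hosts a page references statically.
# Unlike Fiddler, this only sees URLs present in the HTML itself, not
# requests triggered later by JavaScript. PAGE_URL is a placeholder.
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urlparse

PAGE_URL = "https://intranet.example.com/custom-site/"    # hypothetical page

class ResourceCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.hosts = set()

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("src", "href") and value and value.startswith("http"):
                self.hosts.add(urlparse(value).netloc)

html = urllib.request.urlopen(PAGE_URL).read().decode("utf-8", errors="replace")
collector = ResourceCollector()
collector.feed(html)

for host in sorted(collector.hosts):
    print(host)    # e.g. dxxxxxxxxxxxxx.cloudfront.net
```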
I had not heard of Fiddler, but it seems like an excellent tool to add to my kit. I'll look into it!
I suspect that this is not possible unless you create a custom application that can identify the traffic regardless of its provenance.
This is a major headache, however.
Take a look at:
Signature Tool for custom Application Control and URL Filtering applications
and read the "Example for sk103051 - Detecting SSL traffic by DN".
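The linked sk example is not reproduced here, but the idea it demonstrates is matching the server certificate's Distinguished Name against a pattern. A rough Python sketch of that matching logic only (the hostname is a placeholder, and this is not the Check Point signature format):

```python
# Sketch of the idea behind "detecting SSL traffic by DN": read the server
# certificate's subject CN and test it against a pattern. This only
# illustrates the matching logic; it is not the signature tool syntax.
import re
import socket
import ssl

HOST = "d12345abcdef.cloudfront.net"          # placeholder CDN hostname
DN_PATTERN = re.compile(r".*\.cloudfront\.net$")

context = ssl.create_default_context()
with socket.create_connection((HOST, 443), timeout=5) as sock:
    with context.wrap_socket(sock, server_hostname=HOST) as tls:
        cert = tls.getpeercert()

# getpeercert() returns the subject as nested tuples, e.g.
# ((('commonName', '*.cloudfront.net'),),)
common_names = [value
                for rdn in cert.get("subject", ())
                for (key, value) in rdn
                if key == "commonName"]

for cn in common_names:
    print(cn, "matches" if DN_PATTERN.match(cn) else "does not match")
```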
I haven't tried running the custom application tool yet; just from reading the articles, it looks like it would be a headache to assemble. I'll take a further look and tinker with this idea.
Maybe with the custom signature you can make a signature based on the HTTP headers:
Host: *.cloudfront.net
Referer: <your site's domain name>
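For clarity, the condition such a header-based signature would need to express is roughly "Host matches *.cloudfront.net AND the Referer belongs to your own site". A hedged Python sketch of that check, with placeholder values (this illustrates the logic only, not the signature syntax):

```python
# Sketch of the match condition a header-based custom signature would encode:
# allow the CDN request only when the Host is a cloudfront.net name AND the
# Referer points at your own site. Placeholder values; not signature syntax.
import re

HOST_PATTERN = re.compile(r"^[a-z0-9-]+\.cloudfront\.net$", re.IGNORECASE)
REFERER_PATTERN = re.compile(r"^https://(www\.)?yoursite\.example\.com/", re.IGNORECASE)

def should_allow(headers: dict) -> bool:
    host = headers.get("Host", "")
    referer = headers.get("Referer", "")
    return bool(HOST_PATTERN.match(host) and REFERER_PATTERN.match(referer))

# Hypothetical requests:
print(should_allow({"Host": "d12345abcdef.cloudfront.net",
                    "Referer": "https://www.yoursite.example.com/page"}))   # True
print(should_allow({"Host": "d12345abcdef.cloudfront.net",
                    "Referer": "https://some-other-site.example.net/"}))    # False
```

Keep in mind that the gateway can only evaluate these headers inside HTTPS traffic when HTTPS inspection is enabled, and that Referer is easy to strip or spoof, so on its own this is a fairly weak control.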
It is just not so easy to define a custom site that can differentiate between the cloudfront URLs that serve your pictures and other resources. The only way I can think of is using a RegEx together with folder names, e.g. /173x512/, to discriminate between random cloudfront URLs and the image URLs.
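As a purely hypothetical illustration of the folder-name idea (the sample URLs are made up, and /173x512/ is just the example folder from the post above):

```python
# Hypothetical illustration of the folder-name approach: only cloudfront URLs
# whose path contains the image-size folder (/173x512/ here) are treated as
# the site's own resources. The sample URLs are invented.
import re

IMAGE_PATTERN = re.compile(
    r"^https?://[a-z0-9]+\.cloudfront\.net/173x512/[\w/-]+\.(?:png|jpe?g|gif)$",
    re.IGNORECASE,
)

samples = [
    "https://d12345abcdef.cloudfront.net/173x512/logo.png",      # should match
    "https://d12345abcdef.cloudfront.net/other/payload.exe",     # should not match
]
for url in samples:
    print(url, "->", bool(IMAGE_PATTERN.match(url)))
```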
Hi Gunther, how would you implement this approach? I'm unsure how we would be able to use the folder names to allow the referenced resources within. Is this something that would be added somewhere within the signature tool?
Thank you
No, you can do that with the signature tool using RegEx (see Custom Application Control (Check Point Signature Tool)), but I would just use the R80.20 Dashboard and create a Custom Application with the URLs defined as RegEx (see URL Filtering Policy). The tricky and most difficult part is creating the correct regular expression, but using the Fiddler list from above and the online RegEx testing tools, it should not be too much work 😉
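Before pasting a regular expression into a Custom Application, it can be worth running the candidate over the URL list captured with Fiddler. A small hedged harness for that (the file name and the regex are placeholders; adjust both to your own capture):

```python
# Small harness: run a candidate URL-filtering regex over the URLs captured
# with Fiddler (one URL per line) and report which ones it would allow.
# The file name and the regex are placeholders for your own values.
import re
import sys

CANDIDATE = re.compile(r"^https?://[a-z0-9]+\.cloudfront\.net/173x512/", re.IGNORECASE)

path = sys.argv[1] if len(sys.argv) > 1 else "fiddler_urls.txt"
with open(path, encoding="utf-8") as f:
    urls = [line.strip() for line in f if line.strip()]

allowed = [u for u in urls if CANDIDATE.match(u)]
blocked = [u for u in urls if not CANDIDATE.match(u)]

print(f"{len(allowed)} allowed, {len(blocked)} blocked")
for u in blocked:
    print("BLOCKED:", u)
```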
Hi David,
I agree with Vladimir: if not all content is hosted on the same site, it really is hard with a whitelist approach. I actually had the same challenge; here is my post: https://community.checkpoint.com/thread/9701-school-exam
Hi Enis,
What solution did you end up using? Whitelisting the resources' IP addresses when needed? I don't think that will work well in my scenario.
I have a similar case. What solution did you end up with?
BR
Kostas
I am seeing the same issue. Has anybody come up with a solution?
We're moving over to a blacklist approach to solve this issue.
With the way websites are structured today, with resources hosted on other sites/platforms/CDNs, it just makes sense to take a blacklist approach.
The initial overhead is much larger: determining what you need to block to maintain a level of security similar to a whitelist is the big task. Once implemented, it's much easier to maintain; you'll just need to play whack-a-mole to block any sites your users shouldn't be reaching that aren't yet in the blacklist, so setting up some daily reports for bandwidth or application usage is helpful.
A blacklist is intrinsically less secure than a whitelist, but a well-maintained implementation with the appropriate security applications will help mitigate that risk.
