I'm surprised that, with our current multi-layer rule base, there isn't enough logical separation to avoid this. I'm not seeing anything that would be a "potential match" until you get down into sub-layers whose parent layer should already have filtered out this traffic.
My rule base is essentially:
1 Source: Internet IPs (New Layer)
- 1.1 Allow rules
2 Source: Internal IPs (New Layer)
- 2.1 Allow rules
So what you're saying is that because I have "potential matches" on the 2.1 rules for something coming from the internet (because the source may be blank/Any on those, or the destination is an application like Facebook that can't be identified at connection setup), it's going to allow traffic from the internet as if it matched in the 1.1 layer? That seems pretty wild to me.
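To sanity-check my understanding, here's a rough sketch (my own illustration, not Check Point's actual engine, and the field names and values are made up) of why a 2.1-style rule can be a "potential match" for internet traffic: if a column is Any, it matches anything, and if a column like Application can't be evaluated yet at connection setup, the rule can't be ruled out, so the connection has to be let through for further inspection.

```python
# Hypothetical sketch of column-based rule matching with "potential matches".
# This is NOT Check Point's implementation; rule layout and values are
# illustrative assumptions only.

ANY = "Any"
UNKNOWN = "Unknown"  # e.g. an application not yet identified on the first packet

def field_match(rule_value, packet_value):
    """Return 'match', 'no', or 'potential' for a single column."""
    if rule_value == ANY:
        return "match"
    if packet_value == UNKNOWN:
        # Can't rule this column out yet, so the rule stays a potential match.
        return "potential"
    return "match" if rule_value == packet_value else "no"

def rule_status(rule, packet):
    """Combine per-column results: any 'no' kills the rule, any 'potential'
    keeps it alive, otherwise it's a definite match."""
    statuses = [field_match(rule[k], packet.get(k, UNKNOWN))
                for k in ("src", "dst", "app")]
    if "no" in statuses:
        return "no"
    return "potential" if "potential" in statuses else "match"

# A 2.1-style allow rule: blank (Any) source, destination is an application.
rule_2_1 = {"src": ANY, "dst": ANY, "app": "Facebook"}

# A brand-new connection from the internet; the app isn't identified yet.
internet_conn = {"src": "203.0.113.5", "dst": "198.51.100.7", "app": UNKNOWN}

print(rule_status(rule_2_1, internet_conn))  # → potential
```

If that's roughly how the matching works, it would explain the behavior: the engine can't drop on the SYN because rule 2.1 might still match once the application is identified.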
What if I took all the inbound internet rules and put them in their own ordered Access Control layer, like what was done for Geo IP? Would that be enough logical separation for the engine not to evaluate the other rules in the main layer?