This document describes best practices for identifying potential web skimming vulnerabilities in your third-party scripts and deploying rules that will prevent them from compromising your users' data. We have found the following guidelines to be helpful in creating and managing Web Skimming Protection rules.
Document your rules
Every Web Skimming Protection rule has a Description field. Write a clear sentence or two describing what the rule is intended to do. At a minimum, the following data should be tracked:
- The rule’s purpose
- The affected page(s) or application(s)
- The date when the rule was added
- The name of the person who added the rule
- The rule type (form or cookie protection)
- Exceptions to the rule, if any
There is also a Labels field. It's helpful to define a consistent set of labels and use them to tag your rules.
Here's an example rule description: "Homepage login form field protection for the username and password fields"
Establish a formal change procedure
You should establish a formal process to track rules you create and track any modifications you might later make. A typical change procedure might involve the following steps:
- A change request process that business users can use to request changes to the form field and cookie protection configuration.
- An assessment process for the security team to analyze risk and determine the best course of action to balance the business users' needs with security needs.
- A testing process that ensures that any changes to rules actually have the desired effect.
- A deployment process for moving a new or modified rule into production after it has been tested.
- A validation process to ensure that the new or modified rule is indeed operating as intended.
- A documentation process to track the changes that have been made.
If you have a small security team, it might be tempting to implement changes less formally. But following the process strictly can help avoid lapses in security caused by poor configuration.
Keep rules simple
Keep rules simple and limit the number of sub-rules used within each rule. This keeps the overall rule structure straightforward. Current Web Skimming Protection analytics report events only at the rule level, not per sub-rule, so simple rules are much easier to debug: you can see which rules are being hit, but not which sub-rules triggered.
Always set rule actions to monitor before blocking
All rules should be added with a monitor-before-block model. Monitoring lets you evaluate a rule's effectiveness and see exactly which scenarios it affects, and it eliminates the risk of accidentally blocking a script that actually should have access.
Keep a rule in monitoring mode for at least a week to validate that it is indeed triggering on the correct pages/paths for the correct form fields and/or cookies, before moving to blocking mode.
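The monitor-before-block flow can be sketched as follows. This is a simplified illustration only: the `Rule` shape and the `"monitor"`/`"block"` action names are hypothetical, not Instart's actual configuration schema.

```python
from dataclasses import dataclass

@dataclass
class Rule:
    name: str
    action: str  # "monitor" or "block" (hypothetical action names)

def apply_rule(rule, event_log):
    """Handle a matched rule: monitor mode only records the event,
    while block mode records it and denies the script access.
    Returns True if access is allowed, False if blocked."""
    event_log.append((rule.name, rule.action))
    return rule.action != "block"

log = []
# While validating, the rule only observes traffic.
monitoring = Rule("login-form-protection", "monitor")
assert apply_rule(monitoring, log) is True

# After a week or more of clean monitoring data, switch to blocking.
blocking = Rule("login-form-protection", "block")
assert apply_rule(blocking, log) is False
```

Both matches are recorded in the event log either way; only the blocking step changes once you are confident the rule triggers on the correct fields and paths.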
Generic-to-specific rule model
Instart's rule evaluation model reads and evaluates all rules from top to bottom in the order provided. If there are multiple rules with the same set of criteria, the last rule that triggers will always be the one that takes effect. Here is a model that works well when creating rules:
- Have all generic rules at the very top:
  - Rules that apply across the entire website
  - Block-all types of rules
- Add more specific exceptions after the generic rules:
  - Include any exceptions for specific paths/pages
- Keep form field and cookie protection rules in separate groups
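The top-to-bottom, last-match-wins evaluation described above can be sketched like this. The matching criteria and rule fields are hypothetical, assumed for illustration; real rules match on more than a path prefix.

```python
def effective_rule(rules, page_path):
    """Evaluate rules top to bottom; among rules whose criteria
    match, the last matching rule is the one that takes effect."""
    winner = None
    for rule in rules:
        if page_path.startswith(rule["path"]):
            winner = rule
    return winner

rules = [
    {"path": "/", "action": "block"},            # generic block-all rule at the top
    {"path": "/checkout/", "action": "monitor"}, # specific exception placed after it
]

# Only the generic rule matches the homepage, so it takes effect.
assert effective_rule(rules, "/home")["action"] == "block"
# Both rules match the checkout page; the later, more specific one wins.
assert effective_rule(rules, "/checkout/payment")["action"] == "monitor"
```

This is why the generic rules belong at the top: a specific exception placed after them can override the generic behavior for its pages, while everything else still falls through to the generic rule.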
Use rule sets
Rule sets allow you to separate rules into different buckets. They should be used for the following cases:
- For individual environments (EIT, SIT, Automation, Staging, Production)
- When there is a clear delineation of rules based on a specific characteristic; for example, network speeds – 3G (384 kbps), 4G (100 Mbps), 5G (10 Gbps)
- When rules are to be added based on very specific criteria; for example, all pages that belong under a path like /product-description-page/
Limit the nesting within rule sets to one or two levels at most. Deeply nested rule sets reduce the overall manageability of your rules.
Review rules frequently
Websites and web applications change frequently: new pages are added, form fields or cookies are introduced, functionality gets refactored, and so on. It's very easy for a rule to stop working because of this, so it is important to set up a regular maintenance schedule for reviewing and updating your rules.
Also, as old pages are deprecated, retire the rules that relate to them. As the number of rules in the system grows, carrying many stale rules that are no longer needed forces the system to do extra work and adds unnecessary execution delay.
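A periodic review could flag stale rules automatically, for example by last-hit date. The rule fields and the 90-day threshold below are illustrative assumptions, not a real Instart export format or a recommended cutoff.

```python
from datetime import date, timedelta

def stale_rules(rules, today, max_idle_days=90):
    """Return the names of rules that haven't triggered within
    max_idle_days, as candidates for deprecation during review."""
    cutoff = today - timedelta(days=max_idle_days)
    return [r["name"] for r in rules if r["last_hit"] < cutoff]

rules = [
    {"name": "old-landing-page", "last_hit": date(2019, 1, 5)},
    {"name": "checkout-cookie",  "last_hit": date(2019, 9, 1)},
]

# Only the long-idle rule is flagged for review.
assert stale_rules(rules, today=date(2019, 9, 30)) == ["old-landing-page"]
```

A flagged rule shouldn't be deleted blindly; confirm with the rule's documented purpose (see the Description field guidance above) that the page or flow it protects really is gone.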