# Validation Rules
Set quality standards for extracted data so you catch issues before results are delivered.
Validation rules are your quality control layer. After Doculent extracts data from a document, validation checks whether that data meets your standards before it gets delivered.
## What Do Validation Rules Check?
You can set rules for:
| Check type | Example |
|---|---|
| Required fields | "Insured Name must be present" |
| Format | "Effective Date must be a valid date" |
| Value ranges | "Premium must be between $100 and $10,000,000" |
| Confidence threshold | "All key fields must have at least 85% confidence" |
## Why Use Validation?
Without validation, every extraction result goes straight to delivery — even if a field is missing or looks wrong. Validation adds a safety net:
- Results that pass all rules are delivered automatically
- Results that fail any rule are flagged as Needs Review for your team to check
This means your team only needs to look at the exceptions, not every single submission.
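The delivery decision above reduces to one rule: deliver if every check passes, otherwise route to review. A small sketch of that gate (the rule names and result shape are invented for illustration):

```python
def route_submission(rule_results: dict[str, bool]) -> str:
    """Return 'deliver' if every validation rule passed, else 'needs_review'.

    rule_results maps a rule name to whether it passed; the names used
    here are illustrative, not Doculent's schema.
    """
    return "deliver" if all(rule_results.values()) else "needs_review"

clean = {"insured_name_required": True, "effective_date_format": True}
flawed = {"insured_name_required": True, "effective_date_format": False}
print(route_submission(clean))   # → deliver
print(route_submission(flawed))  # → needs_review
```

A single failed rule is enough to send a submission to the review queue, which is why the team only ever sees the exceptions.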
## Setting Up Validation Rules
1. Go to Channels > select your channel > Settings > Validation
2. Click Add Rule
3. Choose the field this rule applies to
4. Set the condition:
   - Required (field must exist)
   - Format (must match a pattern)
   - Range (must be within min/max values)
   - Confidence (AI confidence must exceed threshold)
5. Choose what happens on failure: Flag for Review or Reject
6. Click Save
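Another way to picture the steps above: each saved rule is a field, a condition, and an on-failure action. This dict-based representation is purely a mental model, not how Doculent stores rules:

```python
# Hypothetical rule definitions mirroring the setup steps above.
# Keys and values are illustrative, not Doculent's real schema.
rules = [
    {"field": "Insured Name", "condition": {"type": "required"},
     "on_failure": "flag_for_review"},
    {"field": "Effective Date", "condition": {"type": "format", "pattern": r"\d{4}-\d{2}-\d{2}"},
     "on_failure": "flag_for_review"},
    {"field": "Premium Amount", "condition": {"type": "range", "min": 100, "max": 10_000_000},
     "on_failure": "flag_for_review"},
    {"field": "*", "condition": {"type": "confidence", "min": 0.80},
     "on_failure": "flag_for_review"},
]

def valid_rule(rule: dict) -> bool:
    """Sanity check: a rule names a field, one of the four condition types,
    and one of the two failure actions."""
    return (
        bool(rule.get("field"))
        and rule.get("condition", {}).get("type")
            in {"required", "format", "range", "confidence"}
        and rule.get("on_failure") in {"flag_for_review", "reject"}
    )

assert all(valid_rule(r) for r in rules)
```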
## Example Rules
| Field | Rule | On failure |
|---|---|---|
| Insured Name | Required | Flag for review |
| Effective Date | Must be a valid date | Flag for review |
| Premium Amount | Must be > $0 | Flag for review |
| All fields | Confidence > 80% | Flag for review |
## The Confidence Threshold
This is one of the most useful validation settings. Every field Doculent extracts comes with a confidence score — how sure the AI is about the value it read.
- 90%+ — Very confident. The document was clear and the field was easy to read.
- 70-89% — Moderately confident. Usually correct, but worth a glance on important fields.
- Below 70% — Less certain. Could be a tricky handwritten field or a low-quality scan.
Setting a confidence threshold per channel means you can say: "If the AI is less than 85% sure about any key field, send it to a human for review."
Start with a confidence threshold of 80% and adjust based on your experience. If too many submissions are flagged, lower it. If bad data is getting through, raise it.
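In practice, the per-channel threshold means comparing each key field's confidence score against it and flagging the submission when any field falls short. A hedged sketch (field names and scores are made up):

```python
def low_confidence_fields(confidences: dict[str, float],
                          threshold: float = 0.80) -> list[str]:
    """Return the fields whose confidence falls below the channel threshold."""
    return [name for name, score in confidences.items() if score < threshold]

# Hypothetical scores for one submission.
scores = {"Insured Name": 0.97, "Effective Date": 0.91, "Premium Amount": 0.76}

print(low_confidence_fields(scores, threshold=0.80))  # → ['Premium Amount']
print(low_confidence_fields(scores, threshold=0.95))  # stricter: flags two fields
```

Raising the threshold sends more submissions to review (stricter quality bar); lowering it lets more through automatically, which is the tuning trade-off described above.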
## Auto-Approve vs. Review
Submissions that pass all validation rules are automatically approved and delivered — no human intervention needed. Only the ones that fail a rule end up in your review queue.
The goal is to tune your rules so that the vast majority of submissions sail through automatically, and only the genuinely ambiguous ones need your attention.
## Tips
- Start with required-field checks — these catch the most common issues
- Add confidence thresholds for your most important fields — not everything needs to be perfect, but key fields should be reliable
- Review your rules monthly — as document quality and volume change, your rules may need adjusting
- Check the Analytics dashboard to see what percentage of submissions are passing vs. failing validation