Learn how DSALTA uses automated and manual tests to continuously verify control effectiveness and maintain compliance.
What Are Compliance Tests?
A compliance test is a check that verifies a specific security control is implemented and functioning correctly. Tests answer the question: "Is this control actually working?"
Example Tests:
"Multi-factor authentication is enabled for all users"
"Database encryption is configured using AES-256"
"Access reviews were completed in the last 90 days"
"Security patches are applied within 30 days"
Tests provide objective evidence that controls are effective.
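At its core, each test reduces to a boolean check over system state. As a minimal, hypothetical Python sketch, the access-review example above might look like this (the function name and parameters are illustrative, not DSALTA's actual code):

```python
# Minimal, hypothetical sketch: the "access reviews completed in the
# last 90 days" test expressed as a boolean check.
from datetime import date, timedelta

def access_review_current(last_review: date, max_age_days: int = 90) -> bool:
    """Pass if the most recent access review is within the allowed window."""
    return date.today() - last_review <= timedelta(days=max_age_days)
```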
Automated vs. Manual Tests
Automated Tests
Automated tests run through your connected integrations, with no human intervention:
How They Work: DSALTA connects to your systems and automatically checks configurations, settings, logs, and data (see the sketch after the examples below)
Frequency: Run continuously, hourly, daily, or weekly, depending on the control
Benefits:
Always current
No manual effort required
Catch issues immediately
Provide historical trend data
Examples:
Checking MFA enrollment status in Google Workspace
Verifying encryption settings in AWS
Monitoring log retention in cloud infrastructure
Tracking code review completion in GitHub
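As a concrete illustration of the first example above, here is a minimal sketch of an MFA (2-Step Verification) enrollment check against the Google Workspace Admin SDK's Directory API. Credential setup is omitted, and this is a simplified illustration rather than DSALTA's actual implementation:

```python
# Sketch: list Google Workspace users not enrolled in 2-Step Verification.
# Assumes credentials with Directory API read access; setup omitted.
from googleapiclient.discovery import build

def unenrolled_mfa_users(creds) -> list[str]:
    """Return primary emails of users not enrolled in 2-Step Verification."""
    service = build("admin", "directory_v1", credentials=creds)
    unenrolled, page_token = [], None
    while True:
        resp = service.users().list(
            customer="my_customer", maxResults=500, pageToken=page_token
        ).execute()
        for user in resp.get("users", []):
            if not user.get("isEnrolledIn2Sv"):
                unenrolled.append(user["primaryEmail"])
        page_token = resp.get("nextPageToken")
        if not page_token:
            break
    return unenrolled  # an empty list means the test passes
```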
Manual Tests
Manual tests require human verification:
How They Work: Team members periodically verify controls through observation, documentation review, or attestation
Frequency: Quarterly, annually, or as needed
Benefits:
Cover requirements that can't be automated
Provide context and judgment
Address process and policy compliance
Verify physical or organizational controls
Examples:
Confirming background checks were completed
Verifying physical security measures
Reviewing vendor contracts for security clauses
Attesting to policy training completion
Test Results and Status
Tests produce three possible results:
Passing (Green): Control is working as expected
Requirements are met
No action needed
Contributes to compliance score
Failing (Red): Control is not working correctly
Requirements not met
Immediate remediation needed
Blocks compliance progress
Needs Attention (Yellow): Partial compliance or unclear status
Some requirements met, others missing
Manual review required
Clarification or additional evidence needed
[Screenshot needed: Test results showing different statuses]
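In code terms, status assignment can be thought of as a simple classification over a test's success criteria. The rule below is illustrative only; DSALTA's actual logic may differ:

```python
# Illustrative only: status as a classification over success criteria.
from enum import Enum

class TestStatus(Enum):
    PASSING = "green"
    FAILING = "red"
    NEEDS_ATTENTION = "yellow"

def classify(criteria_met: int, criteria_total: int) -> TestStatus:
    if criteria_met == criteria_total:
        return TestStatus.PASSING      # all requirements met
    if criteria_met == 0:
        return TestStatus.FAILING      # requirements not met
    return TestStatus.NEEDS_ATTENTION  # partial compliance; review needed
```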
Test Components
Each test includes:
Test Name: Clear description of what's being verified
Source: Integration or manual verification method
Frequency: How often the test runs
Risk Level: Critical, High, Medium, or Low
Mapped Controls: Which controls this test verifies
Success Criteria: What constitutes passing
Remediation Guidance: How to fix failures
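Conceptually, these components map to a simple record. The field names below are hypothetical, not DSALTA's actual schema:

```python
# Illustrative record shape for a test's components.
from dataclasses import dataclass, field

@dataclass
class ComplianceTest:
    name: str                       # what is being verified
    source: str                     # integration name, or "manual"
    frequency: str                  # e.g. "hourly", "daily", "quarterly"
    risk_level: str                 # "critical" | "high" | "medium" | "low"
    mapped_controls: list[str] = field(default_factory=list)
    success_criteria: str = ""      # what constitutes passing
    remediation_guidance: str = ""  # how to fix failures
```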
Automated Test Examples by Integration
Identity Provider Tests (Google Workspace, Microsoft 365)
MFA enforcement verification
Password policy compliance
Inactive user detection
Admin account monitoring
Group membership reviews
Cloud Infrastructure Tests (AWS, GCP, Azure)
Encryption at rest verification
Security group configuration
IAM policy compliance
Logging and monitoring setup
Public access prevention
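For example, an encryption-at-rest check for S3 could be sketched with the AWS SDK for Python (boto3). This is a simplified illustration, not DSALTA's implementation, with error handling reduced to the minimum:

```python
# Sketch: find S3 buckets with no default server-side encryption.
import boto3
from botocore.exceptions import ClientError

def buckets_without_default_encryption() -> list[str]:
    """Return names of S3 buckets missing a default encryption configuration."""
    s3 = boto3.client("s3")
    missing = []
    for bucket in s3.list_buckets()["Buckets"]:
        try:
            s3.get_bucket_encryption(Bucket=bucket["Name"])
        except ClientError as err:
            code = err.response["Error"]["Code"]
            if code == "ServerSideEncryptionConfigurationNotFoundError":
                missing.append(bucket["Name"])
            else:
                raise  # surface permission or connectivity errors
    return missing  # an empty list means the test passes
```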
Code Repository Tests (GitHub, GitLab)
Branch protection rules
Code review requirements
Security scanning results
Commit signing verification
Dependency vulnerability checks
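Similarly, a branch protection check can query GitHub's REST API: a 200 response means protection rules exist, while 404 means the branch is unprotected. A minimal sketch, where the owner, repository, and token are placeholders:

```python
# Sketch: check whether a branch has protection rules via GitHub's REST API.
import requests

def branch_protected(owner: str, repo: str, branch: str, token: str) -> bool:
    resp = requests.get(
        f"https://api.github.com/repos/{owner}/{repo}/branches/{branch}/protection",
        headers={
            "Authorization": f"Bearer {token}",
            "Accept": "application/vnd.github+json",
        },
        timeout=10,
    )
    return resp.status_code == 200  # 404 means no protection rules
```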
Communication Tools Tests (Slack, Teams)
Guest access controls
External sharing restrictions
Data retention compliance
Test Frequency Explained
Different controls require different monitoring cadences (a scheduling sketch follows this list):
Continuous/Hourly: Critical security controls
Encryption status
Authentication requirements
Production access controls
Daily: Important operational controls
Log retention
Backup completion
Security monitoring
Weekly: Regular compliance checks
Access permissions reviews
Configuration compliance
Security tool health
Monthly/Quarterly: Periodic reviews
Comprehensive access audits
Vendor assessments
Policy compliance reviews
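In scheduling terms, each cadence maps to an interval from which the next due run is computed. A minimal sketch; the interval values here are assumptions, not DSALTA's configuration:

```python
# Illustrative cadence table mapping frequency names to run intervals.
from datetime import datetime, timedelta

CADENCE = {
    "continuous": timedelta(hours=1),
    "daily": timedelta(days=1),
    "weekly": timedelta(weeks=1),
    "monthly": timedelta(days=30),
    "quarterly": timedelta(days=90),
}

def next_run(last_run: datetime, frequency: str) -> datetime:
    """When a test is next due, given its last run and configured cadence."""
    return last_run + CADENCE[frequency]
```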
Test Pass Rates and Compliance Score
Your overall compliance score reflects test performance:
90-100% Pass Rate: Excellent compliance posture
70-89% Pass Rate: Good, but improvement areas exist
50-69% Pass Rate: Significant gaps requiring attention
Below 50% Pass Rate: Major compliance issues
Framework completion percentage incorporates test pass rates, making them critical to certification readiness.
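The pass-rate arithmetic is straightforward: divide passing tests by total tests. A short sketch using the bands above:

```python
# Pass-rate arithmetic and the posture bands from the table above.
def pass_rate(passing: int, total: int) -> float:
    """Percentage of tests currently passing."""
    return 100.0 * passing / total if total else 0.0

def posture(rate: float) -> str:
    if rate >= 90:
        return "Excellent compliance posture"
    if rate >= 70:
        return "Good, but improvement areas exist"
    if rate >= 50:
        return "Significant gaps requiring attention"
    return "Major compliance issues"
```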
Integration-Dependent Tests
Automated tests depend on integration health:
Integration Connected: Tests run automatically
Integration Disconnected: Tests cannot run and show an "Unknown" status
Integration Issues: Tests may fail due to connectivity problems rather than actual control failures
Monitor integration status to ensure tests continue running.
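The dependency can be expressed as a simple precedence rule: integration health is evaluated before the test result itself. An illustrative sketch, not DSALTA's actual logic:

```python
# Illustrative precedence: a disconnected integration yields "unknown"
# rather than a misleading pass or fail.
def effective_status(integration_connected: bool, last_result: str | None) -> str:
    if not integration_connected:
        return "unknown"
    return last_result or "needs_attention"
```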
Manual Test Execution
For manual tests:
Navigate to the test detail page
Review the requirements and success criteria
Perform the verification activity
Document findings
Mark the test as passing or failing
Upload supporting evidence
Set next test date
Manual test results are recorded with timestamps and can include notes explaining outcomes.
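A recorded manual result therefore bundles the outcome with its timestamp, notes, evidence, and next due date. The shape below is illustrative, not DSALTA's schema:

```python
# Illustrative shape of a recorded manual test result.
from dataclasses import dataclass
from datetime import date, datetime

@dataclass
class ManualTestResult:
    test_id: str
    result: str                # "passing" or "failing"
    recorded_at: datetime      # timestamp of the verification
    notes: str                 # context explaining the outcome
    evidence_files: list[str]  # uploaded supporting evidence
    next_test_date: date       # when the test is next due
```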
Test History and Trends
DSALTA tracks test performance over time:
Historical Results: View past test outcomes
Trend Analysis: Identify improving or degrading controls
Pattern Recognition: Spot recurring issues
Audit Trail: Demonstrate continuous monitoring
Historical data proves sustained compliance, not just point-in-time success.
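Trend analysis can be as simple as comparing a control's recent pass rate against the preceding window of results. A minimal, hypothetical sketch:

```python
# Hypothetical trend check: compare recent vs. earlier pass rates.
def control_trend(history: list[bool], window: int = 10) -> str:
    """Label a control improving, degrading, or stable from test outcomes."""
    if len(history) < 2 * window:
        return "insufficient data"
    earlier = sum(history[-2 * window:-window]) / window
    recent = sum(history[-window:]) / window
    if recent > earlier:
        return "improving"
    if recent < earlier:
        return "degrading"
    return "stable"
```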
