Automated Testing
Set up continuous testing pipelines to validate your AI deployments automatically.
Overview
Automated testing in Centrify provides continuous testing pipelines that validate your AI deployments without manual intervention. This helps your models maintain quality and performance over time, even as you ship updates and configuration changes.
Key Features
- CI/CD Integration: Connect your testing pipelines to your continuous integration and deployment workflows.
- Scheduled Testing: Set up regular test runs to continuously monitor your deployed models.
- Event-Triggered Testing: Automatically run tests when specific events occur, such as model updates or configuration changes.
- Parallel Test Execution: Run multiple tests simultaneously to reduce testing time.
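To illustrate the parallel-execution idea, here is a minimal, self-contained Python sketch. The test functions and their checks are hypothetical placeholders (in a real pipeline each would call your deployed model's endpoint); only the concurrency pattern is the point.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical test cases: each returns True on pass. In a real pipeline
# these would exercise your deployed model, not return constants.
def check_latency():
    return True  # e.g. assert p95 latency is under a threshold

def check_output_schema():
    return True  # e.g. assert responses match the expected schema

def check_safety_filters():
    return True  # e.g. assert unsafe prompts are refused

def run_tests_in_parallel(tests):
    """Run independent test cases concurrently and collect pass/fail results."""
    with ThreadPoolExecutor(max_workers=len(tests)) as pool:
        futures = {test.__name__: pool.submit(test) for test in tests}
        # .result() re-raises any exception a test raised, so failures surface.
        return {name: future.result() for name, future in futures.items()}

results = run_tests_in_parallel(
    [check_latency, check_output_schema, check_safety_filters]
)
```

Because the test cases are independent, running them concurrently cuts total wall-clock time to roughly that of the slowest single test.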
Testing Workflows
- Pre-Deployment Testing: Validate models before they are deployed to production.
- Post-Deployment Verification: Confirm that deployed models are functioning correctly in the production environment.
- Continuous Monitoring: Regularly test deployed models to detect any degradation in performance or behavior.
- Regression Testing: Automatically compare new model versions against previous versions to identify regressions.
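The regression-testing workflow above can be sketched as a metric comparison between two model versions. The metric names, sample values, and tolerance below are illustrative assumptions, not Centrify defaults; in practice the numbers would come from evaluation runs against a shared benchmark set.

```python
# Hypothetical metric reports for two model versions.
baseline = {"accuracy": 0.91, "f1": 0.88, "p95_latency_ms": 240.0}
candidate = {"accuracy": 0.90, "f1": 0.89, "p95_latency_ms": 260.0}

def find_regressions(baseline, candidate, tolerance=0.02,
                     lower_is_better=("p95_latency_ms",)):
    """Flag metrics that moved in the wrong direction, relative to the
    baseline, by more than the given relative tolerance."""
    regressions = []
    for metric, old in baseline.items():
        new = candidate[metric]
        if metric in lower_is_better:
            degraded = (new - old) / old > tolerance  # got slower/worse
        else:
            degraded = (old - new) / old > tolerance  # got less accurate
        if degraded:
            regressions.append(metric)
    return regressions

flagged = find_regressions(baseline, candidate)
```

Here the small accuracy dip stays within the 2% tolerance, but the latency increase (240 ms to 260 ms, about 8%) exceeds it, so only `p95_latency_ms` is flagged.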
Getting Started
To set up automated testing in Centrify, navigate to the Automated Testing section in your project dashboard. From there, you can configure testing pipelines, schedule test runs, and integrate with your CI/CD workflows.
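As a rough sketch of how event-triggered testing can be wired up, the dispatcher below maps deployment events to the test suites that should run. The event names and suite names are invented for illustration and are not part of any documented Centrify API.

```python
# Hypothetical mapping from deployment events to test suites.
EVENT_TEST_MAP = {
    "model_updated": ["regression_suite", "smoke_suite"],
    "config_changed": ["smoke_suite"],
}

triggered_runs = []  # log of (event, suite) pairs that were queued

def handle_event(event_name):
    """Queue every test suite registered for a deployment event and
    return the suites queued for that event."""
    for suite in EVENT_TEST_MAP.get(event_name, []):
        triggered_runs.append((event_name, suite))
    return [suite for event, suite in triggered_runs if event == event_name]

queued = handle_event("model_updated")
```

Keeping the event-to-suite mapping in one table makes it easy to see, at a glance, which changes trigger which checks, and to add a new trigger without touching the runner logic.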