AI Agents Overview
Autonify provides a comprehensive suite of AI-powered agents that automatically analyze, document, and optimize your data infrastructure. Each agent is designed for specific tasks and can work independently or together to provide complete data intelligence.
Available Agents
Data Discovery & Documentation
Data Documenter
- Purpose: Automatically generates comprehensive documentation for your tables and columns
- How it works: Analyzes data patterns, column names, and relationships to create meaningful descriptions
- Output: Natural language descriptions for each table and column in your database
- Use when: You need to understand unfamiliar databases or create documentation for teams
Data Profiler
- Purpose: Generates detailed statistical profiles and pattern analysis for your datasets
- How it works: Samples data to calculate distributions, identify patterns, and detect outliers
- Output: Statistical summaries, value distributions, and data quality metrics
- Use when: You need to understand data characteristics before analysis or migration
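To illustrate the kind of sampling-based statistics the profiler produces, here is a minimal Python sketch. The `profile_column` helper and its 2-standard-deviation outlier rule are illustrative assumptions, not Autonify's actual implementation.

```python
import statistics

def profile_column(values):
    """Profile one sampled column: counts, range, mean, null rate,
    and simple z-score outliers. Illustrative sketch only."""
    non_null = [v for v in values if v is not None]
    mean = statistics.mean(non_null)
    stdev = statistics.pstdev(non_null)
    # Flag values more than 2 population standard deviations from the
    # mean; a crude heuristic that works for this small sample.
    outliers = [v for v in non_null if stdev and abs(v - mean) / stdev > 2]
    return {
        "count": len(values),
        "null_rate": 1 - len(non_null) / len(values),
        "min": min(non_null),
        "max": max(non_null),
        "mean": round(mean, 2),
        "outliers": outliers,
    }

sample = [10, 12, 11, 9, None, 10, 500]  # 500 is an injected outlier
profile = profile_column(sample)
```

A real profiler would also compute value frequencies and type-specific metrics, but the shape of the output is the same: one summary dict per column.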
Data Growth
- Purpose: Tracks and monitors data volume changes over time
- How it works: Records row counts and table sizes during each scan
- Output: Growth trends, volume metrics, and storage utilization reports
- Use when: Planning capacity, monitoring data accumulation, or identifying rapid growth areas
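The row-count tracking described above reduces to comparing consecutive scan snapshots. A minimal sketch, assuming snapshots are `(scan_date, row_count)` pairs sorted by date (the `growth_report` helper is hypothetical):

```python
from datetime import date

def growth_report(snapshots):
    """Compute per-period growth deltas from sorted
    (scan_date, row_count) snapshot pairs."""
    report = []
    for (d0, n0), (d1, n1) in zip(snapshots, snapshots[1:]):
        days = (d1 - d0).days
        report.append({
            "period": f"{d0} -> {d1}",
            "rows_added": n1 - n0,
            "rows_per_day": (n1 - n0) / days if days else 0.0,
        })
    return report

scans = [
    (date(2024, 1, 1), 1000),
    (date(2024, 1, 8), 1700),
    (date(2024, 1, 15), 3100),
]
report = growth_report(scans)
```

A doubling of `rows_per_day` between periods, as in this sample, is exactly the kind of rapid-growth signal the agent is meant to surface.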
Semantic Grouping
- Purpose: Automatically organizes related tables and columns into logical groups
- How it works: Uses AI to identify semantic relationships and business domains
- Output: Grouped tables by business function (e.g., customer data, financial data, inventory)
- Use when: Organizing large databases or understanding data domains
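The actual agent uses AI to infer semantic relationships; as a rough intuition for the output shape, here is a toy keyword-matching stand-in (the `group_by_token` helper and the domain keyword table are invented for illustration):

```python
from collections import defaultdict

def group_by_token(table_names, domains):
    """Assign tables to business domains by matching name tokens
    against per-domain keyword sets. A toy stand-in for the
    AI-driven grouping described above."""
    groups = defaultdict(list)
    for name in table_names:
        tokens = set(name.lower().split("_"))
        # First domain whose keywords intersect the table's name tokens wins.
        domain = next(
            (d for d, keywords in domains.items() if tokens & keywords),
            "ungrouped",
        )
        groups[domain].append(name)
    return dict(groups)

domains = {
    "customer": {"customer", "account", "contact"},
    "finance": {"invoice", "payment", "ledger"},
}
groups = group_by_token(
    ["customer_orders", "invoice_lines", "payment_history", "warehouse_bins"],
    domains,
)
```

Semantic grouping goes well beyond name matching (it considers relationships and sampled content), but the result is the same structure: tables bucketed by business function.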
Security & Compliance
Data Sensitivity Detection
- Purpose: Identifies and classifies sensitive information like PII, PHI, and financial data
- How it works: Scans column names and data samples to detect sensitive patterns
- Output: Classification of each column as High, Medium, or Low sensitivity
- Categories detected:
- Personally Identifiable Information (PII)
- Protected Health Information (PHI)
- Financial account numbers
- Government identifiers
- Contact information
- Use when: Ensuring compliance with GDPR, HIPAA, or other data privacy regulations
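The name-and-sample scanning described above can be sketched with a small pattern table. The `PATTERNS` dict, regexes, and sensitivity levels below are illustrative assumptions; a production detector uses far richer pattern libraries and validation logic.

```python
import re

# Hypothetical pattern table: category -> (value regex, sensitivity level).
PATTERNS = {
    "email": (re.compile(r"[^@\s]+@[^@\s]+\.[a-z]{2,}", re.I), "High"),
    "us_ssn": (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "High"),
    "phone": (re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"), "Medium"),
}

def classify_column(name, samples):
    """Return (category, sensitivity) for the first pattern that matches
    either the column name or any sampled value; default to Low."""
    lowered = name.lower()
    for category, (pattern, level) in PATTERNS.items():
        if category in lowered or any(pattern.search(str(s)) for s in samples):
            return category, level
    return None, "Low"
```

For example, `classify_column("contact", ["alice@example.com"])` classifies the column as High sensitivity even though its name alone gives nothing away, which is why scanning data samples matters.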
Data Quality
Data Quality Monitor
- Purpose: Identifies and reports data quality issues across your database
- How it works: Runs configurable rules to detect missing values, inconsistencies, and anomalies
- Output: Quality score, issue reports, and recommendations for improvement
- Issues detected:
- Missing or null values
- Duplicate records
- Data type mismatches
- Referential integrity violations
- Statistical anomalies
- Use when: Maintaining data quality standards or preparing for analytics
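Two of the issue types above (missing values and duplicate records) can be shown with a minimal rule runner. The `run_quality_rules` helper and its naive scoring formula are assumptions for illustration, not the product's rule engine.

```python
def run_quality_rules(rows, key):
    """Apply two simple rules (missing values, duplicate keys) to a
    list of dict rows and derive a naive quality score."""
    issues = []
    seen = set()
    for i, row in enumerate(rows):
        for col, val in row.items():
            if val is None or val == "":
                issues.append((i, col, "missing value"))
        k = row.get(key)
        if k in seen:
            issues.append((i, key, "duplicate key"))
        seen.add(k)
    # Naive score: one point deducted per issue, normalized by row count.
    score = 1 - len(issues) / max(len(rows), 1)
    return issues, round(score, 2)

rows = [
    {"id": 1, "email": "a@x.com"},
    {"id": 2, "email": None},
    {"id": 2, "email": "c@x.com"},  # duplicate id
]
issues, score = run_quality_rules(rows, key="id")
```

Real monitors add type checks, referential-integrity lookups, and statistical anomaly detection, but each rule follows the same pattern: scan rows, emit issue records, fold them into a score.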
Data Structure & Relationships
Foreign Key Discovery
- Purpose: Discovers hidden relationships between tables across your data sources
- How it works: Analyzes column names, data types, and value overlaps to identify potential foreign keys
- Output: Recommended foreign key relationships with confidence scores
- Use when: Understanding database relationships or documenting legacy systems
- Note: Currently marked as "Coming Soon"
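The value-overlap signal mentioned above is commonly scored as a containment ratio: what fraction of the candidate child column's distinct values exist in the candidate parent key. A minimal sketch (the `fk_confidence` helper is hypothetical):

```python
def fk_confidence(child_values, parent_values):
    """Score a candidate foreign key by the fraction of distinct child
    values that appear in the parent key column. Real discovery also
    weighs column names and data types."""
    child, parent = set(child_values), set(parent_values)
    return len(child & parent) / len(child) if child else 0.0

orders_customer_id = [1, 2, 2, 3, 99]  # 99 has no matching customer
customers_id = [1, 2, 3, 4, 5]
confidence = fk_confidence(orders_customer_id, customers_id)  # 3 of 4 distinct values match
```

A score below 1.0, as here, can indicate either a wrong candidate or orphaned rows, which is why the output is a recommendation with a confidence score rather than an enforced constraint.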
Business Intelligence
Data Agents (BI Agent)
- Purpose: Performs deep business intelligence investigations on your data
- How it works: Executes multi-phase analysis using customizable workflows
- Output: Executive reports with findings, monetary impact, and recommendations
- Special features:
- Industry-specific agent packs
- Customizable investigation workflows
- Financial impact quantification
- Use when: You need comprehensive business insights or executive reporting
- Access: Through the BI Packs menu rather than the regular agents list
API & Integration
Mesh Metadata (GraphQL)
- Purpose: Tracks and manages tables exposed through GraphQL API
- How it works: Synchronizes database schema with GraphQL metadata
- Output: API metadata configuration and tracking status
- Use when: Managing GraphQL API endpoints
Utility Agents
Database Scan
- Purpose: Core agent that discovers and catalogs database structure
- How it works: Connects to database and extracts schema information
- Output: Complete catalog of tables, columns, and basic metadata
- Note: Runs automatically, not visible in regular agent list
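Schema extraction boils down to querying the database engine's catalog. Here is a self-contained sketch against SQLite (production scanners use each engine's own catalog views, such as `information_schema`; the `scan_schema` helper is illustrative):

```python
import sqlite3

def scan_schema(conn):
    """Catalog tables and their (column, declared type) pairs from a
    SQLite connection, using sqlite_master and PRAGMA table_info."""
    catalog = {}
    tables = conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'"
    ).fetchall()
    for (table,) in tables:
        # PRAGMA table_info rows: (cid, name, type, notnull, dflt_value, pk)
        cols = conn.execute(f"PRAGMA table_info({table})").fetchall()
        catalog[table] = [(c[1], c[2]) for c in cols]
    return catalog

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, email TEXT)")
catalog = scan_schema(conn)
```

This catalog of tables and columns is the raw material every other agent builds on, which is why the scan runs first and automatically.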
Agents Coming Soon
Several advanced agents are planned but not yet available:
- AI Readiness: Assess datasets for machine learning readiness
- Data Lineage: Track data flow from source to destination
- Test Data Management: Comprehensive test data masking and subsetting
- Data Migration: End-to-end migration between systems
- Data Warehouse: Warehouse optimization and cost reduction
- Data Integration: Build ELT/ETL pipelines
- Data Modeling: Create optimized data models
Agent Categories
Agents are grouped by their primary function:
- Security: Data protection and compliance
- Documentation: Understanding and documenting data
- Data Quality: Ensuring data integrity
- Data Structure: Relationships and organization
- Analytics: Business intelligence and insights
- Development: Testing and development support
- API: External system integration
- Data Operations: Migration and transformation
How Agents Work Together
Agents can be run in sequence for comprehensive analysis:
- Initial Discovery: Database Scan → Data Documenter → Data Profiler
- Security Assessment: Data Sensitivity Detection → Data Quality Monitor
- Business Analysis: Semantic Grouping → BI Agent → Executive Reports
- Development Preparation: Data Profiler → Data Quality Monitor
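Chaining agents like this amounts to threading a shared context through each step. A minimal sketch with stand-in agent functions (`database_scan` and `data_documenter` here are toy placeholders, not the product's interfaces):

```python
def run_pipeline(agents, context=None):
    """Run agents in sequence; each receives the shared context dict
    and returns it enriched for the next agent."""
    context = context or {}
    for agent in agents:
        context = agent(context)
    return context

def database_scan(ctx):
    # Toy stand-in: a real scan would query the connected database.
    ctx["tables"] = ["customers", "orders"]
    return ctx

def data_documenter(ctx):
    # Toy stand-in: describes each table the scan discovered.
    ctx["docs"] = {t: f"Auto-generated description of {t}" for t in ctx["tables"]}
    return ctx

result = run_pipeline([database_scan, data_documenter])
```

The ordering matters because downstream agents consume upstream output: the documenter here can only describe tables the scan has already cataloged.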
Getting Started
- Connect your data source first
- Start with Data Sensitivity Detection for compliance
- Run Data Documenter to understand your data
- Use Data Profiler for detailed analysis
- Configure specialized agents based on your needs