TL;DR
- Use the job details view in Commvault Command Center to break average throughput into read, write, DDB, and network categories and pinpoint where bottlenecks originate in your backup environment.
- High read percentages in job metrics indicate client-side disk issues, while high write percentages point to problems on the Media Agent or its attached storage.
- Run CVDiskPerf on the client system for read performance issues or on the Media Agent for write performance issues to validate disk infrastructure with Commvault-optimized workloads.
This tutorial provides a practical guide to diagnosing and resolving disk performance bottlenecks that affect Commvault backup and recovery operations. Environmental factors such as slow storage infrastructure can cause data protection jobs to run sub-optimally, and Commvault includes built-in tools to help administrators identify and address these issues efficiently.

The video begins by explaining how to use the Command Center to view job details for both running and historical jobs, where administrators can analyze average throughput performance broken down into four categories: read, write, DDB, and network. This categorization is essential for pinpointing where bottlenecks originate: high read percentages typically indicate client-side disk infrastructure problems, while high write percentages suggest issues with the Media Agent or library storage systems.

The core of the tutorial focuses on CVDiskPerf, a command-line utility included with every Commvault installation that validates disk performance using workloads optimized for Commvault processes. The tool should be run on the client system when investigating read performance issues, and on the Media Agent when troubleshooting write performance problems. The video covers the tool's location in the Commvault base installation directory and demonstrates both default and customized test configurations. Key customization options include:

- Specifying a local or network path to test
- Adjusting the block size from the default of 64 KB
- Modifying the block and file counts
- Changing the thread count
- Switching between random and sequential I/O operations
- Using credential impersonation for network share access

Test results provide detailed metrics, including data processed, time elapsed, and calculated throughput in GB per hour, giving administrators actionable data for storage optimization decisions. A sample invocation is sketched below.
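As a rough illustration, here is what a default run might look like from the Base directory on a Media Agent. The install path, test path, and output file below are hypothetical, and switch names can vary between Commvault versions, so run CVDiskPerf with no arguments on your own system to print the authoritative usage text before relying on any of these options.

```bat
REM Minimal sketch: paths and switches are illustrative assumptions,
REM not verbatim from the video; confirm syntax via the tool's usage output.

REM CVDiskPerf ships in the Base folder of the Commvault installation:
cd /d "C:\Program Files\Commvault\ContentStore\Base"

REM Default test against a path on the disk under investigation, with
REM results written to a text file. The defaults (64 KB blocks plus
REM Commvault-tuned block, file, and thread counts) give the most
REM representative numbers for Commvault workloads.
CVDiskPerf.exe -path E:\PerfTest -outfile C:\Temp\cvdiskperf_results.txt
```

Interpreting the results is straightforward arithmetic: throughput in GB per hour is data processed divided by elapsed time, so 128 GB processed in 20 minutes works out to roughly 384 GB/hr. Comparing that figure between the client and the Media Agent indicates which side of the data path deserves attention.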
Chapters
0:00 - Understanding Performance Bottlenecks
0:19 - Analyzing Job Throughput Metrics
1:02 - Introduction to CVDiskPerf
1:27 - Customizing Test Parameters
2:51 - Running Tests and Interpreting Results
Key Quotes
0:11 "Commvault has made it easy to identify these bottlenecks, test performance and identify steps that can be taken to optimise your environment."
1:46 "The default values are optimised for Commvault processes and will provide the most relevant tests."
3:06 "With CV Disk Perf in your toolbox, you can help ensure your environment is running optimally."