Last updated: Nov 21, 2024
Use performance monitoring to measure how quickly your deployment processes data records. You enable performance monitoring when you select the deployment to be tracked and monitored.
Performance metrics are calculated based on the following information:
- scoring payload data
For accurate monitoring, log every scoring request as well. Payload data logging is automated for IBM watsonx.ai Runtime engines. For other machine learning engines, you can provide the payload data by using the Python client or the REST API, as shown in the sketch that follows. Performance monitoring does not create any additional scoring requests on the monitored deployment.
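The following is a minimal sketch of payload logging with the ibm-watson-openscale Python client for an engine where logging is not automated. The API key, the payload data set ID, and the request and response fields are placeholders; the exact fields depend on your deployment's scoring schema.

```python
# Minimal payload-logging sketch for a non-watsonx.ai Runtime engine.
# Placeholders: the API key and PAYLOAD_DATA_SET_ID must come from your own
# account and monitored deployment.
from ibm_cloud_sdk_core.authenticators import IAMAuthenticator
from ibm_watson_openscale import APIClient
from ibm_watson_openscale.supporting_classes.payload_record import PayloadRecord

authenticator = IAMAuthenticator(apikey="<YOUR_IBM_CLOUD_API_KEY>")
wos_client = APIClient(authenticator=authenticator)

# The scoring request and response, in the same format your deployment uses.
scoring_request = {"fields": ["age", "income"], "values": [[42, 52000]]}
scoring_response = {"fields": ["prediction", "probability"],
                    "values": [["approved", [0.9, 0.1]]]}

# Store one payload record; the response time (in milliseconds) contributes
# to the performance metrics.
wos_client.data_sets.store_records(
    data_set_id="<PAYLOAD_DATA_SET_ID>",
    request_body=[
        PayloadRecord(
            request=scoring_request,
            response=scoring_response,
            response_time=120,
        )
    ],
)
```

Each logged record counts toward the throughput that performance monitoring reports, so log one payload record per scoring request that the deployment serves.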
You can review the performance metric values over time on the Insights dashboard.
Supported performance metrics
The following performance metrics are supported:
Next steps
Parent topic: Configuring model evaluations