TensorFlow Model Inference
A workflow capability for running ML model inference with TensorFlow Serving. It combines model management and inference operations to support MLOps workflows: model health monitoring, metadata inspection, and classification, regression, and prediction tasks in production environments.
What You Can Do
MCP Tools
get-model-status
Check the health and availability status of a TensorFlow model
get-model-version-status
Check the status of a specific version of a TensorFlow model
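Both status tools wrap TensorFlow Serving's REST model-status endpoint. A minimal sketch of the URLs involved, assuming an illustrative host (`localhost:8501`, the default REST port) and model name:

```python
from typing import Optional

def model_status_url(host: str, model: str, version: Optional[int] = None) -> str:
    """Build the GET URL for TF Serving's model-status endpoint.

    Without a version, status is reported for all loaded versions;
    with one, only that version is checked.
    """
    url = f"http://{host}/v1/models/{model}"
    if version is not None:
        url += f"/versions/{version}"
    return url

# Illustrative model name; a healthy response looks like:
# {"model_version_status": [{"version": "1", "state": "AVAILABLE",
#   "status": {"error_code": "OK", "error_message": ""}}]}
print(model_status_url("localhost:8501", "my_model"))
print(model_status_url("localhost:8501", "my_model", version=2))
```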
get-model-metadata
Retrieve signature definitions and schema metadata for a TensorFlow model
get-model-version-metadata
Retrieve metadata for a specific version of a TensorFlow model
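The metadata tools correspond to TF Serving's `/metadata` endpoint, whose response includes the model's `signature_def` map (input/output tensor names, dtypes, and shapes). A sketch of the URL construction, with host and model names as illustrative assumptions:

```python
from typing import Optional

def model_metadata_url(host: str, model: str, version: Optional[int] = None) -> str:
    """Build the GET URL for TF Serving's model-metadata endpoint."""
    url = f"http://{host}/v1/models/{model}"
    if version is not None:
        url += f"/versions/{version}"
    return url + "/metadata"

# The JSON response nests signatures under:
#   {"metadata": {"signature_def": {"signature_def": {...}}}}
print(model_metadata_url("localhost:8501", "my_model"))
print(model_metadata_url("localhost:8501", "my_model", version=3))
```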
classify-with-model
Run classification inference on a TensorFlow model with input examples
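Classification requests go to the `:classify` REST endpoint as a JSON body with a signature name and a list of examples. A sketch of the request payload; the signature name and feature keys are illustrative assumptions:

```python
import json

def classify_request(signature_name: str, examples: list) -> str:
    """Build the JSON body for POST /v1/models/<model>:classify.

    Each example is a dict of feature name -> value; the response
    carries per-example (label, score) pairs under "results".
    """
    return json.dumps({"signature_name": signature_name, "examples": examples})

# Hypothetical feature names for illustration only.
body = classify_request("serving_default", [{"sepal_length": 5.1, "sepal_width": 3.5}])
print(body)
```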
regress-with-model
Run regression inference on a TensorFlow model with input examples
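The `:regress` endpoint accepts the same `{"signature_name", "examples"}` body shape as classification, but returns one scalar per input example under `"results"`. A sketch of parsing such a response; the values shown are made up for illustration:

```python
import json

# Hypothetical response body from POST /v1/models/<model>:regress;
# "results" is aligned index-for-index with the submitted examples.
response_body = '{"results": [2.5, 3.0, 0.1]}'
results = json.loads(response_body)["results"]

for i, value in enumerate(results):
    print(f"example {i}: predicted value {value}")
```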
predict-with-model
Run prediction inference on a TensorFlow model using row or column format inputs
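The row and column formats mentioned above map to the two body shapes accepted by TF Serving's `:predict` endpoint: row format lists one instance per element under `"instances"`, while column format keys whole input tensors under `"inputs"`. A minimal sketch of both, with tensor names and values as illustrative assumptions:

```python
import json

def predict_body_row(instances: list) -> str:
    """Row format: {"instances": [<instance>, ...]}, one entry per example."""
    return json.dumps({"instances": instances})

def predict_body_column(inputs: dict) -> str:
    """Column format: {"inputs": {<tensor name>: <full batch>, ...}}."""
    return json.dumps({"inputs": inputs})

# Hypothetical input tensor named "x"; both bodies describe the same batch.
print(predict_body_row([[1.0, 2.0], [3.0, 4.0]]))
print(predict_body_column({"x": [[1.0, 2.0], [3.0, 4.0]]}))
```

Row format is convenient for heterogeneous named inputs per example; column format avoids repeating tensor names across a large batch.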