# TorchServe metrics with Grafana
To run Grafana, Prometheus, and TorchServe:
docker-compose up
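Once the stack is up, a quick way to check the wiring is to hit each service directly, including the metrics endpoint that Prometheus scrapes from TorchServe. This is a minimal sketch assuming the compose file maps TorchServe's default ports (8080 inference, 8081 management, 8082 metrics):

```bash
# Grafana and Prometheus UIs
curl -I http://localhost:3000
curl -I http://localhost:9090

# TorchServe health check (inference API)
curl http://localhost:8080/ping

# Prometheus-format metrics exposed by TorchServe
# (port 8082 is the TorchServe default; the mapping in docker-compose.yml is assumed)
curl http://localhost:8082/metrics
```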
- Grafana: http://localhost:3000
- Prometheus: http://localhost:9090
- To register a model with TorchServe (resnet-18 example):
curl --write-out %{http_code} --silent --output /dev/null --retry 5 -X POST "http://localhost:8081/models?url=https://torchserve.pytorch.org/mar_files/resnet-18.mar&initial_workers=1&synchronous=true"
- To list registered models: curl "http://localhost:8081/models" (per-model status and worker scaling are sketched after this list)
- Prediction example (the resulting request metrics can then be checked in Prometheus; see the last sketch after this list):
curl -O https://s3.amazonaws.com/model-server/inputs/kitten.jpg
curl -X POST http://127.0.0.1:8080/predictions/resnet-18 -T kitten.jpg
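Once the model is registered, the management API can also report per-model status and adjust the number of workers. A short sketch, assuming the model registered from resnet-18.mar is named resnet-18:

```bash
# Describe the registered model (workers, their status, memory usage, ...)
curl "http://localhost:8081/models/resnet-18"

# Scale the model to at least two workers and wait for them to come up
curl -X PUT "http://localhost:8081/models/resnet-18?min_worker=2&synchronous=true"
```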
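After a few prediction requests, TorchServe's request counters should be visible in Prometheus and, from there, in Grafana. A sketch using the Prometheus HTTP query API; the ts_inference_requests_total metric name and its model_name label are assumptions based on what recent TorchServe versions export, so check http://localhost:8082/metrics for the exact names your version emits:

```bash
# Ask Prometheus for TorchServe's inference request counter (metric name assumed; see note above)
curl -G "http://localhost:9090/api/v1/query" \
  --data-urlencode 'query=ts_inference_requests_total{model_name="resnet-18"}'
```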