Description
What did you find confusing? Please describe.
This is the Dockerfile link:
https://github.com/aws/sagemaker-pytorch-inference-toolkit/blob/master/docker/1.5.0/py3/Dockerfile.cpu
This link https://docs.aws.amazon.com/sagemaker/latest/dg/ei-endpoints.html#ei-endpoints-pytorch
states:
You can download the Elastic Inference enabled binary for PyTorch from the public Amazon S3 bucket at console.aws.amazon.com/s3/buckets/amazonei-pytorch. For information about building a container that uses the Elastic Inference enabled version of PyTorch, see Building your image.
I am confused: if I use the Dockerfile above, do I still need to download and install the binary from https://console.aws.amazon.com/s3/buckets/amazonei-pytorch to build the Docker container image?
If I want to use a custom Docker image for SageMaker Elastic Inference, do I need to convert my PyTorch code into TorchScript?
This part is not covered in the documentation.
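For context, here is a minimal sketch of what the TorchScript conversion in question would look like, assuming `torch.jit.trace` is the intended path (the model, layer sizes, and file name are illustrative, not from the docs):

```python
# Hedged sketch: tracing a PyTorch model into TorchScript, the format the
# Elastic Inference docs suggest serving. TinyNet and the input shape are
# made-up placeholders for illustration only.
import torch
import torch.nn as nn

class TinyNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(4, 2)

    def forward(self, x):
        return self.fc(x)

model = TinyNet().eval()
example_input = torch.randn(1, 4)

# torch.jit.trace records the ops executed on the example input;
# torch.jit.save writes the serialized TorchScript module to disk.
traced = torch.jit.trace(model, example_input)
torch.jit.save(traced, "model.pt")
```

Whether a step like this is required for a custom serving image, or whether the toolkit handles eager-mode models too, is exactly what the documentation should clarify.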
Can I use it with Python >= 3.7 and PyTorch >= 1.12?
Describe how documentation can be improved
Additional context