Description
Hi, I am using a custom Docker image with CUDA and cuDNN installed, and I have verified locally that the GPU is being utilized. But when I upload the image to ECR and create an endpoint, endpoint creation fails with an error telling me to make sure the docker serve command is valid. From debugging I found that the SageMaker Inference Toolkit needs to be installed inside the image so the container can detect whether a SageMaker GPU is available, but there is no sample Dockerfile from which I can understand how to set this up. Could you please tell me:
1) How do I enable CUDA support in a custom-built Docker image for SageMaker?
2) Will using a prebuilt image, e.g. accountnum.aws.amazon.com/pytorch:1.10-cuda113-py3, directly use the CUDA/GPU of the SageMaker instance?
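For context, a minimal GPU-capable serving image of the kind described above might look like the following sketch. This is an assumption-laden illustration, not a verified Dockerfile: the base image tag, the installed packages, and the `dockerd-entrypoint.py` script name are all placeholders. The one requirement it illustrates is real: SageMaker hosting launches the container as `docker run <image> serve`, so the entrypoint must handle the `serve` argument (which is what the SageMaker Inference Toolkit provides).

```dockerfile
# Sketch only: base image tag, package versions, and entrypoint script are assumptions.
FROM nvidia/cuda:11.3.1-cudnn8-runtime-ubuntu20.04

# Python plus Java (the toolkit's multi-model-server is Java-based)
RUN apt-get update && \
    apt-get install -y --no-install-recommends \
        python3 python3-pip openjdk-8-jdk-headless && \
    rm -rf /var/lib/apt/lists/*

# The SageMaker Inference Toolkit supplies the "serve" handling the
# endpoint error message refers to.
RUN pip3 install --no-cache-dir torch sagemaker-inference multi-model-server

# Hypothetical entrypoint: it should start the model server when
# SageMaker passes the "serve" argument at container startup.
COPY dockerd-entrypoint.py /usr/local/bin/dockerd-entrypoint.py

ENTRYPOINT ["python3", "/usr/local/bin/dockerd-entrypoint.py"]
CMD ["serve"]
```

With an image structured like this, GPU access at runtime comes from the CUDA libraries in the base image plus SageMaker attaching the instance's GPUs to the container on GPU instance types; the toolkit handles the `serve` contract and the `/ping` and `/invocations` endpoints.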