I did get the sense from the documentation that a model can be loaded into memory and a serve deployment can be created for that case, but I just wanted to check: is it possible to serve a custom containerized model?