PicklingError with structured logger

I’m using a minimal structured logger with ray/serve, and it raises a PicklingError as soon as Python runs the module:

import time
import ray
from ray import serve
from fastapi import FastAPI
# https://github.com/vapor-ware/containerlog
import containerlog
from containerlog.proxy.std import patch

logger = containerlog.get_logger(__name__)
containerlog.set_level(containerlog.TRACE)
# patch all standard loggers to use containerlog
patch()

app = FastAPI()

http_options = {
    "host": "127.0.0.1",
    "port": 8787,
    "location": "HeadOnly",
    "num_cpus": 2,
}

ray.init(address="127.0.0.1:8787", namespace="serve")
serve.start(http_options=http_options)

@serve.deployment(route_prefix="/")
@serve.ingress(app)
class Deployment:
    @app.post("/test")
    def this_one(self):
        try:
            print(1/0)
        except Exception as e:
            # test messages
            logger.trace("Valentine's day today")
            logger.debug("Debug the insects")
            logger.info("Just an fyi")
            logger.warn("Don't Look Up")
            logger.error("Did you try to divide by zero? Yes, I did")
            logger.critical("The meteorite hit Earth at ...")

Deployment.deploy()

while True:
    time.sleep(2)

> python badlog.py
Traceback (most recent call last):
  File "C:\...\badlog.py", line 27, in <module>
    class Deployment:
  File "C:\Users\User\AppData\Local\Programs\Python\Python39\lib\site-packages\ray\serve\api.py", line 589, in decorator
    frozen_app = cloudpickle.loads(cloudpickle.dumps(app))
  File "C:\Users\User\AppData\Local\Programs\Python\Python39\lib\site-packages\ray\cloudpickle\cloudpickle_fast.py", line 73, in dumps
    cp.dump(obj)
  File "C:\Users\User\AppData\Local\Programs\Python\Python39\lib\site-packages\ray\cloudpickle\cloudpickle_fast.py", line 620, in dump
    return Pickler.dump(self, obj)
  File "C:\Users\User\AppData\Local\Programs\Python\Python39\lib\site-packages\ray\cloudpickle\cloudpickle_fast.py", line 316, in _file_reduce
    raise pickle.PicklingError(
_pickle.PicklingError: Cannot pickle files that map to tty objects

Hi @Henry_Thornton, can you try instantiating the logger inside the deployment class rather than capturing the global logger?

The way it’s written, Serve will try to pickle the logger object and save it as part of the class definition. That doesn’t work in this case due to the logger having some un-pickleable state. If you do something like self._logger = configure_logger(*options) in the class constructor, it should resolve the problem.
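
A minimal sketch of that fix, assuming the same containerlog setup as the snippet above (get_logger, set_level, and the TRACE constant all come from the question); the logger is built inside __init__, so each replica creates it after unpickling and the tty-backed stream never has to go through cloudpickle:

import containerlog
from fastapi import FastAPI
from ray import serve

app = FastAPI()

# ray.init(...) and serve.start(...) as in the original script.
@serve.deployment(route_prefix="/")
@serve.ingress(app)
class Deployment:
    def __init__(self):
        # Built per replica after the class is unpickled, so no logger
        # state ends up inside cloudpickle.dumps(app).
        containerlog.set_level(containerlog.TRACE)
        self._logger = containerlog.get_logger(__name__)

    @app.post("/test")
    def this_one(self):
        try:
            1 / 0
        except ZeroDivisionError:
            self._logger.error("Did you try to divide by zero? Yes, I did")

Deployment.deploy()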

Ah got it. Thanks

Is it possible to write the ray/serve Deployment class as functions, and if so, how?

@serve.deployment(route_prefix="/")
@serve.ingress(app)
class Deployment:
    def __init__(self):
        ...
        ...
    @app.post("/letter")
    def this_one(self):
        ...
        ...
    @app.get("/ttysberg")
    def that_one(self):
        ...
        ...

Deployment.deploy()

while True:
    time.sleep(2)

Hi @Henry_Thornton, could you explain a bit more what you mean by writing the Deployment class as a function?

Any deployment can be either a class or a function, as in this example from the documentation:

@serve.deployment
def hello(request):
    name = request.query_params["name"]
    return f"Hello {name}!"

Personally, I don’t write classes, only functions; it also makes debugging faster (for me). So what would the Deployment class look like as one or more functions?

This is an expanded example from the End-to-End Tutorial in the Ray v1.10.0 docs. How would the Deployment class be rewritten as one or more functions so this module can still be run as the entry script?

import time
import ray
from ray import serve
from fastapi import FastAPI

app = FastAPI()

http_options = {
    "host": "127.0.0.1",
    "port": 8787,
    "location": "HeadOnly",
    "num_cpus": 2,
}

ray.init(address="127.0.0.1:8787", namespace="serve")
serve.start(http_options=http_options)

@serve.deployment(route_prefix="/api")
@serve.ingress(app)
class Deployment:

    def __init__(self):
        self.count = 0

    @app.get("/")
    def get(self):
        return {"count": self.count}

    @app.get("/incr")
    def incr(self):
        self.count += 1
        return {"count": self.count}

    @app.get("/decr")
    def decr(self):
        self.count -= 1
        return {"count": self.count}

Deployment.deploy()

# Serve will be shut down once the script exits, so keep it alive manually.
while True:
    time.sleep(2)

Sorry, I think for this example you have to use a class: the counter lives in self.count on the deployment replica, and a function deployment has no instance state to hold it between requests.
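
For what it’s worth, the closest function version I can sketch (the Request handling and path matching here are my assumptions, not code from the tutorial) mostly shows why: without an instance, the count has nowhere to live between requests.

from ray import serve
from starlette.requests import Request

# Assumes ray.init() and serve.start() as in the script above.
@serve.deployment(route_prefix="/api")
def counter(request: Request):
    count = 0  # reset on every call; the class version keeps this in self.count
    if request.url.path.endswith("/incr"):
        count += 1
    elif request.url.path.endswith("/decr"):
        count -= 1
    return {"count": count}

counter.deploy()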