Feature importance

Hi,
How can I compute feature importance after I train a DQN agent in RLlib?
Thank you.

Hey @sherko_salehpour,

You can add FeatureImportance, like any other off_policy_estimation_method, to your config and run the algorithm with that config. Something like the following:

from ray.rllib.algorithms.dqn import DQNConfig
from ray.rllib.offline.feature_importance import FeatureImportance


config = (
    DQNConfig()
    .framework("torch")
    .evaluation(
        evaluation_interval=1,
        off_policy_estimation_methods={
            "feature_importance": {
                "type": FeatureImportance,
                "limit_fraction": 1e-3,
            },
        },
    )
)

Hello,
Thanks for your response.
After training the model, how can I plot the feature importance?

After each round of training you get back a result dict, which will contain the feature importance computed for each dimension of the observation. You can plot the scores at any iteration you want.
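As a minimal sketch of what that plotting step could look like: the snippet below assumes you have already pulled the per-dimension scores out of the result dict returned by `algo.train()` into a plain dict mapping dimension names to scores (the exact key layout inside the result dict depends on your RLlib version, so adapt the lookup accordingly). It ranks the scores and renders a simple text bar chart; you could swap the printing for a matplotlib bar plot if you prefer a figure.

```python
def rank_feature_importance(scores):
    """Return (name, score) pairs sorted by descending importance."""
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)


def print_importance_bars(scores, width=40):
    """Render a simple text bar chart of the importance scores."""
    ranked = rank_feature_importance(scores)
    top = max(v for _, v in ranked) or 1.0  # avoid division by zero
    for name, value in ranked:
        bar = "#" * int(width * value / top)
        print(f"{name:>8} {value:8.4f} {bar}")
    return ranked


# Hypothetical scores for a 3-dimensional observation space,
# e.g. extracted from the evaluation section of the result dict:
scores = {"obs_0": 0.12, "obs_1": 0.47, "obs_2": 0.03}
ranked = print_importance_bars(scores)
```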

Could you please show me how to access the feature_importance scores
(with a script)?
Thank you