XGBoostTrainer -- Distributed Weights Not Working?

When trying to use sample weights with XGBoostTrainer, like:

train_weights_ds = train_set.select_columns(['weight'])

trainer = XGBoostTrainer(
    scaling_config=ScalingConfig(
        num_workers=16,
        use_gpu=True,
    ),
    early_stopping_rounds=10,
    dmatrix_params={"train": {'weight': train_weights_ds}, },
    ...
)

I am met with data size mismatches, suggesting that the weights are not being sharded in line with the data shard sent to each worker. Is it possible to attach weights to each worker's shard?

Check failed: weights_.Size() == num_row_ (92711999 vs. 3862999) : Size of weights must equal to number of rows.

Can you use the weight column name instead?

-    dmatrix_params={"train": {'weight': train_weights_ds}, },
+    dmatrix_params={"train": {'weight': 'weight'}, },
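For anyone hitting the same error, here is a minimal sketch of the corrected setup. It assumes the weight column stays inside the training Ray Dataset (so it gets sharded together with the rows it belongs to) and that you are on a Ray version whose XGBoostTrainer accepts dmatrix_params; the dataset path, label column, and params below are placeholders, and the exact import paths may differ by Ray version.

    import ray
    from ray.train import ScalingConfig
    from ray.train.xgboost import XGBoostTrainer

    # Hypothetical dataset: keep the "weight" column alongside the
    # features and label so it is partitioned with the same rows.
    train_set = ray.data.read_parquet("s3://my-bucket/train/")  # placeholder path

    trainer = XGBoostTrainer(
        scaling_config=ScalingConfig(
            num_workers=16,
            use_gpu=True,
        ),
        label_column="label",                     # placeholder label column
        params={"objective": "binary:logistic"},  # placeholder XGBoost params
        datasets={"train": train_set},
        # Pass the *name* of the weight column rather than a separate
        # Dataset; each worker then reads the weights from its own shard,
        # so the weight size always matches the shard's row count.
        dmatrix_params={"train": {"weight": "weight"}},
    )
    result = trainer.fit()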

Works great, thanks!