Relationship of epochs and training iterations

Hi all,

I am new to Ray and I am trying to use the API for tuning.

I do not understand the difference between epochs and iterations.

The step() function of my trainable looks like this:

    def step(self) -> dict:
        """
        Train the model for all epochs.
        For each batch in the training data, compute the loss and update the model parameters.
        The loss calculation is delegated to the model's step function.
        At the end, return the objective metric(s) for the tuning process.
        """
        for epoch in range(self.epochs):
            self.model.train()
            print(epoch)
            loss = 0.0  # reset the running loss at the start of each epoch
            for x, y, meta in self.training:
                self.optimizer.zero_grad()
                current_loss = self.model.step(x, y, self.loss_dict)
                loss += current_loss.item()
                current_loss.backward()
                self.optimizer.step()
            loss /= len(self.training)  # average loss over this epoch's batches
        return self.objective()

Is it supposed to be like this, with the explicit loop over the epochs inside step(), or is Ray Tune's number of training iterations supposed to take care of this?
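To make the alternative I have in mind concrete, here is a minimal runnable sketch (plain Python stand-ins instead of my actual model/optimizer, so no Ray or PyTorch needed): each step() call runs exactly one epoch, and the driver loop plays the role of Tune calling step() once per training iteration, e.g. with stop={"training_iteration": 5}.

```python
class MyTrainable:
    """Sketch: one epoch per step(); the training-iteration count equals epochs.

    The model and optimizer are toy stand-ins so this runs standalone; in my
    real code these would be self.model, self.optimizer, and self.training.
    """

    def setup(self):
        self.training = [(1.0, 2.0), (3.0, 6.0)]  # dummy (x, y) batches
        self.weight = 0.0                          # stand-in for model parameters
        self.epochs_run = 0

    def step(self) -> dict:
        # One epoch per call: if Tune calls step() once per training iteration,
        # stop={"training_iteration": n} trains for n epochs.
        loss = 0.0
        for x, y in self.training:
            pred = self.weight * x
            grad = 2 * (pred - y) * x       # d/dw of the squared error (pred - y)^2
            self.weight -= 0.01 * grad      # stand-in optimizer update
            loss += (pred - y) ** 2
        loss /= len(self.training)          # average loss over this epoch's batches
        self.epochs_run += 1
        return {"loss": loss}


# Simulating what the Tune driver would do with stop={"training_iteration": 5}:
trainable = MyTrainable()
trainable.setup()
for _ in range(5):
    result = trainable.step()
print(trainable.epochs_run)  # prints 5
```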

Thanks a lot!
Luisa