You are right that `enumerate` doesn't really affect training: it just gives you the number of the current iteration, which you can use or ignore. You will do perfectly fine if you omit the call to `enumerate` entirely.
From my experience, people usually use this number for two reasons:
1. If there are many batches (say 10k), you may want to display some sort of output every X iterations (say every 1000th). Then you can use `if i % 1000 == 0:` logic to display what you want.
2. Some algorithms rely on a learning-rate warm-up technique, which involves gradually increasing the learning rate depending on which iteration you are in. Having `i` allows you to write code like `if e == 0 and i < 50: ...change learning rate...`
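To make both uses concrete, here is a minimal sketch of a training loop. It is torch-free on purpose: `batches` stands in for a `DataLoader`, `train_step` is a placeholder for the forward/backward/optimizer step, and the warm-up length of 5 iterations is an arbitrary choice for the sketch.

```python
# Stand-in for iterating over a DataLoader (10 "batches" per epoch)
batches = [([0.0], [0.0]) for _ in range(10)]

base_lr = 0.1
warmup_iters = 5  # arbitrary warm-up length for this sketch
lr = base_lr

def train_step(batch, lr):
    """Placeholder for forward pass, backward pass, and optimizer step."""
    return lr  # pretend this is the loss

lr_history = []
for e in range(2):                       # epochs
    for i, batch in enumerate(batches):  # i = iteration within the epoch
        # (2) linear warm-up during the first iterations of the first epoch
        if e == 0 and i < warmup_iters:
            lr = base_lr * (i + 1) / warmup_iters
        loss = train_step(batch, lr)
        lr_history.append(lr)
        # (1) periodic logging, here every 5th iteration instead of every 1000th
        if i % 5 == 0:
            print(f"epoch {e} iter {i} lr {lr:.3f} loss {loss:.3f}")
```

In real PyTorch code you would update the learning rate by writing to `optimizer.param_groups[...]["lr"]` (or by using a scheduler), but the role of `i` is the same.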
These are the two most common uses I have seen, but I am sure there are more cases where knowing the iteration number inside an epoch is useful. I have also seen people use a global iteration number that doesn't depend on the epoch; using it in the second case arguably makes the code cleaner.
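A sketch of the global-counter variant, again with hypothetical names (`batches` stands in for a `DataLoader`). With a `global_step` that keeps counting across epochs, the warm-up condition no longer needs the epoch index and can span epoch boundaries:

```python
batches = [None] * 10  # stand-in for a DataLoader with 10 batches per epoch

base_lr = 0.1
warmup_steps = 15  # warm-up can now be longer than one epoch
global_step = 0

lrs = []
for e in range(2):
    for i, batch in enumerate(batches):
        # no "e == 0 and i < 50" needed; one counter covers the whole run
        if global_step < warmup_steps:
            lr = base_lr * (global_step + 1) / warmup_steps
        else:
            lr = base_lr
        lrs.append(lr)
        global_step += 1
```

Here the warm-up ends 5 iterations into the second epoch, which the per-epoch `i` alone could not express as cleanly.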