I'm not sure my answer fits exactly what you're looking for. In principle, to get varying batch sizes during training you can run fit in a loop: in each iteration you call fit with a different batch size, and training simply continues from where the previous call left off, since the model keeps its weights between fit calls (so the save/load and recompile steps are optional). It might also be possible to implement this internally with a custom callback class instead of an explicit loop.
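Here is a minimal sketch of that loop, assuming Keras (via TensorFlow); the data shapes, the batch-size range, and the number of passes are just illustrative placeholders:

```python
import numpy as np
from tensorflow import keras

# Toy data and model as placeholders; substitute your own.
x_train = np.random.rand(1000, 20)
y_train = np.random.randint(0, 2, size=(1000, 1))

model = keras.Sequential([
    keras.layers.Input(shape=(20,)),
    keras.layers.Dense(32, activation="relu"),
    keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy")

# Run fit in a loop, drawing a new random batch size each pass.
# The model keeps its weights between fit calls, so each call
# continues training where the previous one stopped.
rng = np.random.default_rng(seed=0)
for step in range(5):
    batch_size = int(rng.integers(16, 129))  # random size in [16, 128]
    print(f"Pass {step}: batch_size={batch_size}")
    model.fit(x_train, y_train, epochs=1, batch_size=batch_size, verbose=0)
```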
Though I think this is not exactly what you want, since the batch size only changes once the network has gone over the full sample. But, as the previous answer says, so far there is no built-in option to do that per epoch (in principle it could be implemented).