Embeddings regularization
Hi, first of all, thanks for your contribution to the recsys world :)
I have one general question regarding the regularization of user/item embeddings:
Your regularization term consists of the weights of ALL user embeddings and ALL item embeddings in every training batch (https://gitlab.com/recpack-maintainers/recpack/-/blob/master/recpack/algorithms/bprmf.py#L211). That way, ALL embeddings are regularized even if some users/items are not present in the current batch. As far as I know this is common practice in many repositories, but I have some doubts about it.
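For concreteness, here is a minimal PyTorch sketch of that pattern (not the actual RecPack code; `num_users`, `num_items`, `embedding_dim`, `lambda_reg` and the `bpr_loss` placeholder are my own assumptions for illustration):

```python
import torch
import torch.nn as nn

num_users, num_items, embedding_dim = 1000, 500, 64  # placeholder sizes
lambda_reg = 1e-4                                     # placeholder reg weight

user_emb = nn.Embedding(num_users, embedding_dim)
item_emb = nn.Embedding(num_items, embedding_dim)

bpr_loss = torch.tensor(0.0)  # stand-in for the BPR loss of the current batch

# "Full" regularization: the penalty covers ALL user and item embeddings,
# so every row of both tables receives a shrinkage gradient each step,
# whether or not it appears in the current batch.
reg = lambda_reg * (user_emb.weight.pow(2).sum() + item_emb.weight.pow(2).sum())
loss = bpr_loss + reg
```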
What about including in the regularization term only the embedding weights of users/items that are present in the current batch (roughly like the sketch below)? Do you have an opinion on that?
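A sketch of that batch-only variant, reusing the placeholder names from the snippet above (`users`, `pos_items` and `neg_items` are hypothetical index tensors for the current batch):

```python
# Index tensors for the current batch (placeholder values)
users = torch.tensor([0, 3, 7])
pos_items = torch.tensor([10, 2, 5])
neg_items = torch.tensor([4, 9, 1])

u = user_emb(users)      # (batch, dim) rows actually used in this batch
i = item_emb(pos_items)
j = item_emb(neg_items)

# Batch-only regularization: only the looked-up rows enter the penalty,
# so embeddings of absent users/items receive no shrinkage this step.
reg = lambda_reg * (u.pow(2).sum() + i.pow(2).sum() + j.pow(2).sum())
loss = bpr_loss + reg
```

The practical difference, as I understand it, is that with the full penalty even embeddings that never appear in a batch keep shrinking toward zero, whereas the batch-only penalty updates an embedding only when its user/item is actually sampled.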
Best regards