URL details: yash0307.github.io/RecallatK_surrogate

URL title: Recall@k Surrogate Loss with Large Batches and Similarity Mixup
URL paragraphs: Yash Patel, Giorgos Tolias, Jiří Matas. Direct optimization of an evaluation metric by gradient descent is not possible when the metric is non-differentiable, which is the case for recall in retrieval. In this work, a differentiable surrogate loss …
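The idea behind such a surrogate can be sketched as follows. Recall@k is non-differentiable because a retrieved item's rank is a sum of Heaviside step functions of similarity differences; replacing each step with a sigmoid yields a smooth approximation that admits gradients. This is a minimal illustrative sketch of that general relaxation, not the authors' exact formulation: the function names, the temperature parameter, and the use of `max` over positives are assumptions made here for illustration.

```python
import numpy as np

def sigmoid(x, temp=1.0):
    # smooth stand-in for the Heaviside step: temp -> inf recovers the hard step
    return 1.0 / (1.0 + np.exp(-temp * x))

def smooth_recall_at_k(sim_pos, sim_neg, k, temp=10.0):
    """Differentiable relaxation of Recall@k for a single query (illustrative).

    sim_pos: similarities of the query to its positives, shape (P,)
    sim_neg: similarities of the query to its negatives, shape (N,)
    The hard rank of a positive is 1 + #{negatives scored higher}; the
    counting step is replaced by a sigmoid so the expression is smooth.
    """
    # smoothed rank of each positive among the negatives
    ranks = 1.0 + sigmoid(sim_neg[None, :] - sim_pos[:, None], temp).sum(axis=1)
    # smoothed indicator that a positive lands within the top k
    hits = sigmoid(k - ranks + 0.5, temp)
    # relaxed Recall@k in [0, 1]; a loss would be 1 - this value
    return hits.max()

# the best positive outranks every negative, so relaxed Recall@1 is close to 1
val = smooth_recall_at_k(np.array([0.9, 0.3]), np.array([0.5, 0.2, 0.1]), k=1)
print(float(val))
```

Because the output is smooth in the similarities, `1 - smooth_recall_at_k(...)` can be minimized by gradient descent, which is impossible with the hard, step-based recall.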
URL keywords: Learning Surrogates, Recall@k, Recall@k Surrogate Loss, Mixup, Similarity Mixup, Non-Differentiable Losses
URL last crawled: 2023-01-07
