mygrad.nnet.losses.margin_ranking_loss

mygrad.nnet.losses.margin_ranking_loss(x1: ArrayLike, x2: ArrayLike, y: ArrayLike, margin: float, *, constant: Optional[bool] = None) -> Tensor

Computes the mean margin ranking loss. Equivalent to:

>>> import mygrad as mg
>>> mg.mean(mg.maximum(0, margin - y * (x1 - x2)))

Parameters
x1 : ArrayLike, shape=(N,) or (N, D)

A batch of scores or descriptors to compare against those in x2

x2 : ArrayLike, shape=(N,) or (N, D)

A batch of scores or descriptors to compare against those in x1

y : Union[int, ArrayLike], scalar or shape=(N,)

1 or -1. Specifies, for each of the N comparisons, whether the margin is compared against (x1 - x2) or (x2 - x1); y=1 encourages x1 to exceed x2 by at least the margin, and y=-1 the reverse.

margin : float

A non-negative value to be used as the margin for the loss.

constant : bool, optional (default=False)

If True, the returned tensor is a constant (it does not back-propagate a gradient).

Returns
mygrad.Tensor, shape=()

The mean margin ranking loss.
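
The formula above can be sketched in plain NumPy for illustration; this is a hypothetical stand-in that reproduces the forward computation only (mygrad's version additionally tracks gradients for backpropagation):

```python
import numpy as np

def margin_ranking_loss_np(x1, x2, y, margin):
    """NumPy sketch of mean(max(0, margin - y * (x1 - x2)))."""
    x1 = np.asarray(x1, dtype=float)
    x2 = np.asarray(x2, dtype=float)
    y = np.asarray(y, dtype=float)  # scalar or shape-(N,) array of +/-1
    # Hinge on each comparison, then average over the batch
    return np.mean(np.maximum(0.0, margin - y * (x1 - x2)))

# y = 1 asks each x1 entry to exceed its x2 entry by at least `margin`.
# First pair (2.0 vs 1.0) satisfies the margin of 0.5 -> contributes 0;
# second pair (0.5 vs 1.0) violates it -> contributes 1.0; mean = 0.5.
loss = margin_ranking_loss_np([2.0, 0.5], [1.0, 1.0], y=1, margin=0.5)
```
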