lossFunctions package

A collection of loss function callbacks. If you're planning to write your own loss function classes, you have to set the Learner's (l's) loss and lossG fields. lossG is the original loss, still attached to the computation graph (hence "G"), while loss is simply lossG.detach().item(). This way, other utilities can share a single detached loss value, which is better for performance.
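For example, a custom loss callback might look like the minimal sketch below. The inLoss hook name and the self.l.y / self.l.yb attribute names are assumptions for illustration; check the Callback documentation for the actual checkpoint and Learner field names:

    import torch.nn.functional as F
    from k1lib.callbacks.callbacks import Callback

    class MyMseLoss(Callback):
        def inLoss(self):  # hook name is an assumption, not the confirmed API
            # lossG: loss still attached to the graph, used for backprop
            self.l.lossG = F.mse_loss(self.l.y, self.l.yb)
            # loss: shared detached scalar, used by other utilities
            self.l.loss = self.l.lossG.detach().item()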

shorts module

Callbacks for loss functions that aren't very complicated.

class k1lib.callbacks.lossFunctions.shorts.LossLambda(lossF: Callable[[Tuple[torch.Tensor, torch.Tensor]], float])

Bases: k1lib.callbacks.callbacks.Callback

__init__(lossF: Callable[[Tuple[torch.Tensor, torch.Tensor]], float])

Creates a generic loss function callback that takes in the predicted output y and the correct labels yb, and returns a single loss float (still attached to the graph).
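A usage sketch follows. The signature's type hint suggests a single (y, yb) tuple, but the description above reads as two separate tensor arguments, which is what this example assumes:

    import torch.nn.functional as F
    from k1lib.callbacks.lossFunctions.shorts import LossLambda

    # Wraps a plain PyTorch functional loss; assumes lossF is
    # called as lossF(y, yb) with model output and correct labels
    lossCb = LossLambda(lambda y, yb: F.cross_entropy(y, yb))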

class k1lib.callbacks.lossFunctions.shorts.LossNLLCross(nll: bool, integrations: bool)

Bases: k1lib.callbacks.callbacks.Callback

__init__(nll: bool, integrations: bool)
Parameters
- nll – if True, use negative log likelihood loss (NLLLoss); if False, use cross entropy loss (CrossEntropyLoss)
- integrations – whether to integrate with other callbacks
detach()

Detaches from the parent Callbacks
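A usage sketch, with parameter semantics inferred from the names (nll presumably switches between NLLLoss and CrossEntropyLoss; treat this as an assumption):

    from k1lib.callbacks.lossFunctions.shorts import LossNLLCross

    # Assumed semantics: nll=True -> torch.nn.NLLLoss,
    # nll=False -> torch.nn.CrossEntropyLoss
    lossCb = LossNLLCross(nll=False, integrations=True)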