k1lib.data module

class k1lib.data.DataLoader(dataset, batchSize: int = 32, transform: Callable = None, random=True)[source]

Bases: object

__init__(dataset, batchSize: int = 32, transform: Callable = None, random=True)[source]

Creates a random sampler.

Given a dataset of length n and a batch size, this splits the dataset into roughly n/batchSize batches. Indexing the loader with an integer then returns the corresponding range (batch) of the dataset.

Parameters

  • dataset – any object that implements __getitem__() and __len__()

  • batchSize – batch size (integer)

Deprecated since version 0.1.3.
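A minimal usage sketch, assuming indexing behaves as described above (the exact batch format is not specified here, and since the class is deprecated, PyTorch's own DataLoader is usually the better choice):

    import k1lib.data

    # any object with __getitem__() and __len__() works as the dataset
    dataset = [(x, 2 * x) for x in range(100)]

    dl = k1lib.data.DataLoader(dataset, batchSize=32, random=True)
    batch = dl[0]   # indexing by an integer returns one range (batch) of the dataset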

copy()[source]
class k1lib.data.Data(train: k1lib.data.DataLoader, valid: k1lib.data.DataLoader)[source]

Bases: object

__init__(train: k1lib.data.DataLoader, valid: k1lib.data.DataLoader)[source]

Just a thin shell around these two variables, really. You can also use PyTorch’s torch.utils.data.DataLoader objects here just fine.

static fromDataset(dataset, batchSize: int = 32, trainSplit=0.8, *args, **kwargs)[source]
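A hedged sketch of building a Data object both ways, based only on the signatures above (exactly how fromDataset() performs the split beyond the trainSplit ratio is an assumption):

    import torch.utils.data
    import k1lib.data

    dataset = [(x, x ** 2) for x in range(1000)]

    # wrap two existing loaders; per the note above, PyTorch's DataLoader works too
    train = torch.utils.data.DataLoader(dataset[:800], batch_size=32, shuffle=True)
    valid = torch.utils.data.DataLoader(dataset[800:], batch_size=32)
    data = k1lib.data.Data(train, valid)

    # or build train/valid loaders straight from a dataset (80/20 split by default)
    data = k1lib.data.Data.fromDataset(dataset, batchSize=32, trainSplit=0.8)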
class k1lib.data.FunctionDataset(function: callable, _range=[-5, 5], samples: int = 300)[source]

Bases: Generic[torch.utils.data.dataset.T_co]

A dataset tailored for 1->1 functions. Has several prebuilt datasets:

  • .exp: e^x

  • .log: ln(x)

  • .inverse: 1/x

  • .linear: 2x + 8

  • .sin: sin(x)

__init__(function: callable, _range=[-5, 5], samples: int = 300)[source]

Creates a new dataset from a specific function (see the sketch after the parameter list).

Parameters
  • function – a 1->1 function that takes in an x value

  • _range – the range of x values to sample from

  • samples – how many x values to sample in the specified range
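A small sketch of building a custom FunctionDataset, assuming the constructor shown above; whether function is called on scalar x values or on a whole tensor of them isn't stated here, so the lambdas below work either way:

    from k1lib.data import FunctionDataset

    # y = x^2 over the default range [-5, 5], 300 samples
    squared = FunctionDataset(lambda x: x ** 2, _range=[-5, 5], samples=300)

    # a coarser, shifted range
    line = FunctionDataset(lambda x: 2 * x + 8, _range=[0, 10], samples=100)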

split(fraction)[source]
property xs
property ys
dl(shuffle=True, batch_size=32, **kwargs)[source]
exp = FunctionDataset for e^x
inverse = FunctionDataset for 1/x
linear = FunctionDataset for 2x + 8
log = FunctionDataset for ln(x)
sin = FunctionDataset for sin(x)

Each of these is a simple 1->1 function dataset. Given such a dataset a, you can:

  • a.dl(): get PyTorch's DataLoader object

  • a.xs: get a tensor of all x values

  • a.ys: get a tensor of all y values

  • len(a): get the length of the dataset

  • a[i]: get a specific (x, y) element

  • a[i:j]: get another FunctionDataset with the new range [i, j] (same density)

  • next(iter(a)): iterate over all elements
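A usage sketch for the prebuilt datasets, exercising only the operations listed above (the return value of split() is not documented here, so the (train, valid) pair below is an assumption):

    from k1lib.data import FunctionDataset

    a = FunctionDataset.sin                  # prebuilt sin(x) dataset
    n = len(a)                               # length of the dataset
    x0, y0 = a[0]                            # a specific (x, y) element
    xs, ys = a.xs, a.ys                      # tensors of all x and y values
    b = a[0:3]                               # new FunctionDataset over the range [0, 3], same density
    dl = a.dl(shuffle=True, batch_size=32)   # PyTorch DataLoader over this dataset
    train, valid = a.split(0.8)              # assumption: split(fraction) returns a (train, valid) pair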