torch_bsf package
Submodules
torch_bsf.bezier_simplex module
- class torch_bsf.bezier_simplex.BezierSimplex(n_params: int, n_values: int, degree: int)[source]
Bases: LightningModule
A Bezier simplex model.
- Parameters:
n_params – The number of parameters.
n_values – The number of values.
degree – The degree of the Bezier simplex.
Examples
>>> import torch
>>> import pytorch_lightning as pl
>>> from pytorch_lightning.callbacks import EarlyStopping
>>> from torch.utils.data import DataLoader, TensorDataset
>>> from torch_bsf import BezierSimplex
>>> ts = torch.tensor(  # parameters on a simplex
...     [
...         [3/3, 0/3, 0/3],
...         [2/3, 1/3, 0/3],
...         [2/3, 0/3, 1/3],
...         [1/3, 2/3, 0/3],
...         [1/3, 1/3, 1/3],
...         [1/3, 0/3, 2/3],
...         [0/3, 3/3, 0/3],
...         [0/3, 2/3, 1/3],
...         [0/3, 1/3, 2/3],
...         [0/3, 0/3, 3/3],
...     ]
... )
>>> xs = 1 - ts * ts  # values corresponding to the parameters
>>> dl = DataLoader(TensorDataset(ts, xs))
>>> bs = BezierSimplex(
...     n_params=int(ts.shape[1]),
...     n_values=int(xs.shape[1]),
...     degree=3,
... )
>>> trainer = pl.Trainer(
...     callbacks=[EarlyStopping(monitor="val_mse")],
... )
>>> trainer.fit(bs, dl)
>>> ts, xs = bs.meshgrid()
- forward(t: Tensor) → Tensor [source]
Performs a forward pass, mapping a minibatch of parameter vectors to value vectors.
- Parameters:
t – A minibatch of parameter vectors \(\mathbf t\).
- Return type:
A minibatch of value vectors.
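A minimal sketch of evaluating the model by calling it directly, which dispatches to forward(). The use of an untrained model and the expected output shape (batch size, n_values) noted in the comments are assumptions for illustration.
>>> import torch
>>> from torch_bsf import BezierSimplex
>>> bs = BezierSimplex(n_params=3, n_values=2, degree=3)  # untrained model with assumed sizes
>>> t = torch.tensor([[0.2, 0.3, 0.5]])  # a minibatch with one parameter vector
>>> x = bs(t)                            # equivalent to bs.forward(t)
>>> x.shape                              # expected: torch.Size([1, 2])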
- class torch_bsf.bezier_simplex.BezierSimplexDataModule(data: str, label: str, header: int = 0, delimiter: str | None = None, batch_size: int | None = None, split_ratio: float = 0.5, normalize: str = 'none')[source]
Bases: LightningDataModule
A data module for training a Bezier simplex.
- Parameters:
data – The path to a data file.
label – The path to a label file.
header – The number of header lines in the data files.
delimiter – The delimiter of the data files.
batch_size – The size of a minibatch.
split_ratio – The ratio of the train/validation split.
normalize – The data normalization method. Either "max", "std", "quantile", or "none".
- test_dataloader() → DataLoader [source]
- train_dataloader() → DataLoader [source]
- val_dataloader() → DataLoader [source]
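A hedged sketch of training with the data module; the file paths are hypothetical placeholders, the model sizes are assumed to match the columns of those files, and the pytorch_lightning import name is assumed.
>>> import pytorch_lightning as pl
>>> from torch_bsf import BezierSimplex, BezierSimplexDataModule
>>> dm = BezierSimplexDataModule(
...     data="params.csv",   # hypothetical file of parameter vectors
...     label="values.csv",  # hypothetical file of value vectors
...     delimiter=",",
...     split_ratio=0.8,
...     normalize="max",
... )
>>> bs = BezierSimplex(n_params=3, n_values=3, degree=3)  # sizes assumed to match the files
>>> trainer = pl.Trainer(max_epochs=100)
>>> trainer.fit(bs, datamodule=dm)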
- torch_bsf.bezier_simplex.fit(params: Tensor, values: Tensor, degree: int, batch_size: int | None = None, max_epochs: int | None = None, accelerator: str | Accelerator = 'auto', strategy: str | Strategy = 'auto', devices: List[int] | str | int = 'auto', num_nodes: int | None = None) → BezierSimplex [source]
Fits a Bezier simplex.
- Parameters:
params – The parameter data, i.e., points on a simplex.
values – The value data corresponding to the parameters.
degree – The degree of the Bezier simplex.
batch_size – The size of a minibatch.
max_epochs – The maximum number of epochs to train.
accelerator – The type of accelerators to use.
strategy – Distributed computing strategy.
devices – The number of accelerator devices to use.
num_nodes – The number of compute nodes to use.
- Return type:
A trained Bezier simplex.
Examples
>>> import torch
>>> import torch_bsf
Prepare training data
>>> ts = torch.tensor(  # parameters on a simplex
...     [
...         [3/3, 0/3, 0/3],
...         [2/3, 1/3, 0/3],
...         [2/3, 0/3, 1/3],
...         [1/3, 2/3, 0/3],
...         [1/3, 1/3, 1/3],
...         [1/3, 0/3, 2/3],
...         [0/3, 3/3, 0/3],
...         [0/3, 2/3, 1/3],
...         [0/3, 1/3, 2/3],
...         [0/3, 0/3, 3/3],
...     ]
... )
>>> xs = 1 - ts * ts  # values corresponding to the parameters
Train a model
>>> bs = torch_bsf.fit(params=ts, values=xs, degree=3)
Predict with the trained model
>>> t = [[0.2, 0.3, 0.5]]
>>> x = bs(t)
>>> print(f"{t} -> {x}")
- torch_bsf.bezier_simplex.indices(dim: int, deg: int) → Iterable[Tuple[int, ...]] [source]
Iterates over the indices of the control points of the Bezier simplex.
- Parameters:
dim – The length of each index tuple.
deg – The degree of the Bezier simplex.
- Return type:
The indices.
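For example, a degree-2 Bezier simplex over 3 parameters has one control point per multi-index of length 3 whose entries sum to 2. The iteration order shown below is an assumption, not a guarantee of the implementation.
>>> from torch_bsf.bezier_simplex import indices
>>> list(indices(dim=3, deg=2))  # each tuple has length dim and sums to deg
[(2, 0, 0), (1, 1, 0), (1, 0, 1), (0, 2, 0), (0, 1, 1), (0, 0, 2)]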
- torch_bsf.bezier_simplex.monomial(var: Iterable[float], deg: Iterable[int]) → Tensor [source]
Computes a monomial \(\mathbf t^{\mathbf d} = t_1^{d_1} t_2^{d_2} \cdots t_M^{d_M}\).
- Parameters:
var – The bases \(\mathbf t\).
deg – The powers \(\mathbf d\).
- Return type:
The monomial \(\mathbf t^{\mathbf d}\).
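A small sketch of the arithmetic: with \(\mathbf t = (0.2, 0.3, 0.5)\) and \(\mathbf d = (1, 0, 2)\), the monomial is \(0.2^1 \cdot 0.3^0 \cdot 0.5^2 = 0.05\). The exact tensor representation of the result is not shown, as it depends on the implementation.
>>> from torch_bsf.bezier_simplex import monomial
>>> monomial(var=[0.2, 0.3, 0.5], deg=[1, 0, 2])  # 0.2**1 * 0.3**0 * 0.5**2 == 0.05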
- torch_bsf.bezier_simplex.polynom(degree: int, index: Iterable[int]) → float [source]
Computes a polynomial coefficient \(\binom{D}{\mathbf d} = \frac{D!}{d_1!d_2!\cdots d_M!}\).
- Parameters:
degree – The degree \(D\).
index – The index \(\mathbf d\).
- Return type:
The polynomial coefficient \(\binom{D}{\mathbf d}\).
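For example, with \(D = 3\) and \(\mathbf d = (1, 1, 1)\) the coefficient is \(3!/(1!\,1!\,1!) = 6\); it is shown below as a float, following the annotated return type.
>>> from torch_bsf.bezier_simplex import polynom
>>> polynom(degree=3, index=[1, 1, 1])  # 3! / (1! * 1! * 1!)
6.0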
Module contents
torch_bsf: PyTorch implementation of Bezier simplex fitting.
- class torch_bsf.BezierSimplex(n_params: int, n_values: int, degree: int)[source]
Bases: LightningModule
A Bezier simplex model.
- Parameters:
n_params – The number of parameters.
n_values – The number of values.
degree – The degree of the Bezier simplex.
Examples
>>> import torch
>>> import pytorch_lightning as pl
>>> from pytorch_lightning.callbacks import EarlyStopping
>>> from torch.utils.data import DataLoader, TensorDataset
>>> from torch_bsf import BezierSimplex
>>> ts = torch.tensor(  # parameters on a simplex
...     [
...         [3/3, 0/3, 0/3],
...         [2/3, 1/3, 0/3],
...         [2/3, 0/3, 1/3],
...         [1/3, 2/3, 0/3],
...         [1/3, 1/3, 1/3],
...         [1/3, 0/3, 2/3],
...         [0/3, 3/3, 0/3],
...         [0/3, 2/3, 1/3],
...         [0/3, 1/3, 2/3],
...         [0/3, 0/3, 3/3],
...     ]
... )
>>> xs = 1 - ts * ts  # values corresponding to the parameters
>>> dl = DataLoader(TensorDataset(ts, xs))
>>> bs = BezierSimplex(
...     n_params=int(ts.shape[1]),
...     n_values=int(xs.shape[1]),
...     degree=3,
... )
>>> trainer = pl.Trainer(
...     callbacks=[EarlyStopping(monitor="val_mse")],
... )
>>> trainer.fit(bs, dl)
>>> ts, xs = bs.meshgrid()
- forward(t: Tensor) → Tensor [source]
Performs a forward pass, mapping a minibatch of parameter vectors to value vectors.
- Parameters:
t – A minibatch of parameter vectors \(\mathbf t\).
- Return type:
A minibatch of value vectors.
- class torch_bsf.BezierSimplexDataModule(data: str, label: str, header: int = 0, delimiter: str | None = None, batch_size: int | None = None, split_ratio: float = 0.5, normalize: str = 'none')[source]
Bases: LightningDataModule
A data module for training a Bezier simplex.
- Parameters:
data – The path to a data file.
label – The path to a label file.
header – The number of header lines in the data files.
delimiter – The delimiter of the data files.
batch_size – The size of a minibatch.
split_ratio – The ratio of the train/validation split.
normalize – The data normalization method. Either "max", "std", "quantile", or "none".
- test_dataloader() → DataLoader [source]
- train_dataloader() → DataLoader [source]
- val_dataloader() → DataLoader [source]
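A hedged sketch of using the data module outside a Trainer to inspect a batch; the file paths are hypothetical, and calling the standard LightningDataModule setup() hook manually before requesting dataloaders is an assumption about this module's behavior.
>>> from torch_bsf import BezierSimplexDataModule
>>> dm = BezierSimplexDataModule(
...     data="params.tsv",   # hypothetical parameter file
...     label="values.tsv",  # hypothetical label file
...     batch_size=32,
... )
>>> dm.setup(stage="fit")  # standard LightningDataModule hook, assumed to load and split the files
>>> ts_batch, xs_batch = next(iter(dm.train_dataloader()))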
- torch_bsf.fit(params: Tensor, values: Tensor, degree: int, batch_size: int | None = None, max_epochs: int | None = None, accelerator: str | Accelerator = 'auto', strategy: str | Strategy = 'auto', devices: List[int] | str | int = 'auto', num_nodes: int | None = None) → BezierSimplex [source]
Fits a Bezier simplex.
- Parameters:
params – The parameter data, i.e., points on a simplex.
values – The value data corresponding to the parameters.
degree – The degree of the Bezier simplex.
batch_size – The size of a minibatch.
max_epochs – The maximum number of epochs to train.
accelerator – The type of accelerators to use.
strategy – Distributed computing strategy.
devices – The number of accelerator devices to use.
num_nodes – The number of compute nodes to use.
- Return type:
A trained Bezier simplex.
Examples
>>> import torch
>>> import torch_bsf
Prepare training data
>>> ts = torch.tensor(  # parameters on a simplex
...     [
...         [3/3, 0/3, 0/3],
...         [2/3, 1/3, 0/3],
...         [2/3, 0/3, 1/3],
...         [1/3, 2/3, 0/3],
...         [1/3, 1/3, 1/3],
...         [1/3, 0/3, 2/3],
...         [0/3, 3/3, 0/3],
...         [0/3, 2/3, 1/3],
...         [0/3, 1/3, 2/3],
...         [0/3, 0/3, 3/3],
...     ]
... )
>>> xs = 1 - ts * ts  # values corresponding to the parameters
Train a model
>>> bs = torch_bsf.fit(params=ts, values=xs, degree=3)
Predict with the trained model
>>> t = [[0.2, 0.3, 0.5]]
>>> x = bs(t)
>>> print(f"{t} -> {x}")