
liblaf.peach.optim.base ¤

Classes:

  • Callback
  • Objective
  • Optimizer
  • Result
  • Solution
  • State
  • Stats
  • SupportsFun
  • SupportsGrad
  • SupportsHessDiag
  • SupportsHessProd
  • SupportsHessQuad
  • SupportsValueAndGrad

Callback ¤

Bases: Protocol



Methods:

  • __call__

__call__ ¤

__call__(
    objective: Objective[X],
    model_state: X,
    opt_state: S,
    opt_stats: T,
) -> None
Source code in src/liblaf/peach/optim/base/_types.py
def __call__(
    self, objective: Objective[X], model_state: X, opt_state: S, opt_stats: T, /
) -> None: ...
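Because Callback is a structural Protocol, any callable with this positional-only signature satisfies it. A minimal sketch, with plain dicts standing in for the generic X / S / T state and stats types:

```python
# A minimal Callback-compatible hook. The dict arguments below are
# stand-ins for the library's generic model state (X), optimizer
# state (S), and stats (T) types.
history: list = []

def record_state(objective, model_state, opt_state, opt_stats, /) -> None:
    """Record each optimizer state for later inspection."""
    history.append(opt_state)

# Any objects can be passed for the generic parameters.
record_state(None, None, {"step": 1}, {"time": 0.0})
record_state(None, None, {"step": 2}, {"time": 0.1})
```

Judging from the `minimize` signature below, such a hook is passed as `callback=` and invoked inside the optimization loop.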

Objective ¤

Bases: Protocol



Methods:

  • update

update ¤

update(state: X, params: Vector) -> X
Source code in src/liblaf/peach/optim/base/_objective.py
def update(self, state: X, params: Vector, /) -> X: ...
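The update contract takes the current model state and new parameters and returns the updated state. A duck-typed sketch, with a plain dict standing in for the state type X and a list for Vector:

```python
# A toy Objective: illustrates only the update(state, params) -> X
# contract; the class name and state layout are hypothetical.
class ParamHolder:
    def update(self, state, params, /):
        # Return a fresh state rather than mutating the old one.
        return {**state, "params": params}

state = ParamHolder().update({}, [1.0, 2.0])
```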

Optimizer ¤

Parameters:

  • jit ¤

    (bool, default: True) – When true, minimize runs the main loop through the JIT-compiled path (_while_loop_jit); otherwise a plain Python loop is used.

Methods:

  • init
  • minimize
  • postprocess
  • step
  • terminate
  • update_stats

Attributes:

jit class-attribute instance-attribute ¤

jit: bool = static(default=True, kw_only=True)

init ¤

init[X](
    objective: P, model_state: X, params: Vector
) -> tuple[S, T]
Source code in src/liblaf/peach/optim/base/_optimizer.py
def init[X](
    self,
    objective: P,
    model_state: X,  # pyright: ignore[reportInvalidTypeVarUse]
    params: Vector,
) -> tuple[S, T]:
    raise NotImplementedError

minimize ¤

minimize[X](
    objective: P,
    model_state: X,
    params: Vector,
    callback: Callback[X, S, T] | None = None,
) -> tuple[Solution[S, T], X]
Source code in src/liblaf/peach/optim/base/_optimizer.py
def minimize[X](
    self,
    objective: P,
    model_state: X,
    params: Vector,
    callback: Callback[X, S, T] | None = None,
) -> tuple[Solution[S, T], X]:
    opt_state: S
    opt_stats: T
    opt_state, opt_stats = self.init(objective, model_state, params)
    if self.jit:
        model_state, opt_state, opt_stats = self._while_loop_jit(
            objective, model_state, opt_state, opt_stats, callback
        )
    else:
        model_state, opt_state, opt_stats = self._while_loop(
            objective, model_state, opt_state, opt_stats, callback
        )
    solution: Solution[S, T] = self.postprocess(
        objective, model_state, opt_state, opt_stats
    )
    return solution, model_state
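minimize is a template method: init sets up the optimizer state and stats, step and update_stats run inside a loop guarded by terminate, and postprocess builds the Solution. The same control flow can be mirrored in plain Python with a toy gradient-descent loop (the function below is illustrative only, not part of the library):

```python
# A plain-Python mirror of the minimize() control flow:
# init -> [terminate? / step / update_stats] loop -> postprocess.
def toy_minimize(grad, x0, lr=0.1, tol=1e-6, max_steps=1000):
    x = x0                       # init: set up state
    stats = {"n_steps": 0}       # init: set up stats
    while True:
        g = grad(x)
        if abs(g) < tol or stats["n_steps"] >= max_steps:  # terminate
            break
        x = x - lr * g           # step
        stats["n_steps"] += 1    # update_stats
    return x, stats              # postprocess

# Minimize f(x) = (x - 3)^2, whose gradient is 2 * (x - 3).
x, stats = toy_minimize(lambda x: 2 * (x - 3.0), 0.0)
```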

postprocess ¤

postprocess[X](
    objective: P,
    model_state: X,
    opt_state: S,
    opt_stats: T,
) -> Solution[S, T]
Source code in src/liblaf/peach/optim/base/_optimizer.py
def postprocess[X](
    self,
    objective: P,  # noqa: ARG002
    model_state: X,  # pyright: ignore[reportInvalidTypeVarUse]  # noqa: ARG002
    opt_state: S,
    opt_stats: T,
) -> Solution[S, T]:
    opt_stats._end_time = time.perf_counter()  # noqa: SLF001
    return Optimizer.Solution(
        result=Optimizer.Result.SUCCESS, state=opt_state, stats=opt_stats
    )

step ¤

step[X](
    objective: P, model_state: X, opt_state: S
) -> tuple[X, S]
Source code in src/liblaf/peach/optim/base/_optimizer.py
def step[X](
    self,
    objective: P,
    model_state: X,
    opt_state: S,
) -> tuple[X, S]:
    raise NotImplementedError

terminate ¤

terminate[X](
    objective: P,
    model_state: X,
    opt_state: S,
    opt_stats: T,
) -> BooleanNumeric
Source code in src/liblaf/peach/optim/base/_optimizer.py
def terminate[X](
    self,
    objective: P,
    model_state: X,  # pyright: ignore[reportInvalidTypeVarUse]
    opt_state: S,
    opt_stats: T,
) -> BooleanNumeric:
    raise NotImplementedError

update_stats ¤

update_stats[X](
    objective: P,
    model_state: X,
    opt_state: S,
    opt_stats: T,
) -> T
Source code in src/liblaf/peach/optim/base/_optimizer.py
def update_stats[X](
    self,
    objective: P,  # noqa: ARG002
    model_state: X,  # pyright: ignore[reportInvalidTypeVarUse]  # noqa: ARG002
    opt_state: S,  # noqa: ARG002
    opt_stats: T,
) -> T:
    return opt_stats

Result ¤

Bases: StrEnum



Methods:

  • __bool__

Attributes:

  • MAX_STEPS_REACHED
  • NAN
  • PRIMARY_SUCCESS
  • SECONDARY_SUCCESS
  • STAGNATION
  • SUCCESS
  • UNKNOWN_ERROR

MAX_STEPS_REACHED class-attribute instance-attribute ¤

MAX_STEPS_REACHED = auto()

NAN class-attribute instance-attribute ¤

NAN = auto()

PRIMARY_SUCCESS class-attribute instance-attribute ¤

PRIMARY_SUCCESS = auto()

SECONDARY_SUCCESS class-attribute instance-attribute ¤

SECONDARY_SUCCESS = auto()

STAGNATION class-attribute instance-attribute ¤

STAGNATION = auto()

SUCCESS class-attribute instance-attribute ¤

SUCCESS = auto()

UNKNOWN_ERROR class-attribute instance-attribute ¤

UNKNOWN_ERROR = auto()

__bool__ ¤

__bool__() -> bool
Source code in src/liblaf/peach/optim/base/_types.py
def __bool__(self) -> bool:
    return self in {
        Result.SUCCESS,
        Result.PRIMARY_SUCCESS,
        Result.SECONDARY_SUCCESS,
    }

Solution ¤

Parameters:

  • result ¤

    (Result) –
  • state ¤

    (S) –
  • stats ¤

    (T) –

Attributes:

  • params (Vector) –
  • result (Result) –
  • state (S) –
  • stats (T) –
  • success (bool) –

params property ¤

params: Vector

result class-attribute instance-attribute ¤

result: Result = static()

state instance-attribute ¤

state: S

stats instance-attribute ¤

stats: T

success property ¤

success: bool

State ¤

Parameters:

  • params ¤

    (Vector, default: None ) –

Attributes:

  • params (Vector) –

params class-attribute instance-attribute ¤

params: Vector = array(default=None, kw_only=True)

Stats ¤

Methods:

  • __pdoc__
  • __rich_repr__

Attributes:

time property ¤

time: float

__pdoc__ ¤

__pdoc__(**kwargs) -> AbstractDoc | None
Source code in src/liblaf/peach/optim/base/_types.py
def __pdoc__(self, **kwargs) -> wl.AbstractDoc | None:
    return pdoc_rich_repr(self, **kwargs)

__rich_repr__ ¤

__rich_repr__() -> RichReprResult
Source code in src/liblaf/peach/optim/base/_types.py
def __rich_repr__(self) -> RichReprResult:
    yield from rich_repr_fieldz(self)
    yield "time", self.time

SupportsFun ¤

Bases: Protocol



Methods:

  • fun

fun ¤

fun(state: X) -> Scalar
Source code in src/liblaf/peach/optim/base/_objective.py
def fun(self, state: X, /) -> Scalar: ...

SupportsGrad ¤

Bases: Protocol



Methods:

  • grad

grad ¤

grad(state: X) -> Vector
Source code in src/liblaf/peach/optim/base/_objective.py
def grad(self, state: X, /) -> Vector: ...

SupportsHessDiag ¤

Bases: Protocol



Methods:

  • hess_diag

hess_diag ¤

hess_diag(state: X) -> Vector
Source code in src/liblaf/peach/optim/base/_objective.py
def hess_diag(self, state: X, /) -> Vector: ...

SupportsHessProd ¤

Bases: Protocol



Methods:

  • hess_prod

hess_prod ¤

hess_prod(state: X, p: Vector) -> Vector
Source code in src/liblaf/peach/optim/base/_objective.py
def hess_prod(self, state: X, p: Vector, /) -> Vector: ...

SupportsHessQuad ¤

Bases: Protocol



Methods:

  • hess_quad

hess_quad ¤

hess_quad(state: X, p: Vector) -> Scalar
Source code in src/liblaf/peach/optim/base/_objective.py
def hess_quad(self, state: X, p: Vector, /) -> Scalar: ...

SupportsValueAndGrad ¤

Bases: Protocol



Methods:

  • value_and_grad

value_and_grad ¤

value_and_grad(state: X) -> tuple[Scalar, Vector]
Source code in src/liblaf/peach/optim/base/_objective.py
def value_and_grad(self, state: X, /) -> tuple[Scalar, Vector]: ...
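The Supports* protocols are satisfied structurally, and one object can implement several at once. A toy diagonal-quadratic objective for f(x) = 0.5 * Σ hᵢ xᵢ², with plain lists standing in for Vector (the class is illustrative, not part of the library):

```python
# A toy objective implementing all six Supports* method signatures
# for f(x) = 0.5 * sum(h_i * x_i**2), whose Hessian is the constant
# diagonal matrix diag(h).
class DiagQuadratic:
    def __init__(self, h):
        self.h = h  # diagonal of the Hessian

    def fun(self, state, /):
        return 0.5 * sum(h * x * x for h, x in zip(self.h, state))

    def grad(self, state, /):
        return [h * x for h, x in zip(self.h, state)]

    def value_and_grad(self, state, /):
        return self.fun(state), self.grad(state)

    def hess_prod(self, state, p, /):
        return [h * pi for h, pi in zip(self.h, p)]

    def hess_diag(self, state, /):
        return list(self.h)

    def hess_quad(self, state, p, /):
        # p^T H p for the diagonal Hessian
        return sum(h * pi * pi for h, pi in zip(self.h, p))

obj = DiagQuadratic([2.0, 4.0])
val, g = obj.value_and_grad([1.0, 1.0])
```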