liblaf.flame_pytorch.config

Classes:

- FlameConfig

Functions:

- get_config

FlameConfig

pydantic-model

Bases: BaseModel
Parameters:

- flame_model_path (Path, default: PosixPath('model/generic_model.pkl')) – FLAME model path
- static_landmark_embedding_path (Path, default: PosixPath('/home/runner/.cache/liblaf/flame-pytorch/flame_static_embedding.pkl')) – Static landmark embeddings path for FLAME
- dynamic_landmark_embedding_path (Path, default: PosixPath('/home/runner/.cache/liblaf/flame-pytorch/flame_dynamic_embedding.npy')) – Dynamic contour embedding path for FLAME
- shape_params (int, default: 100) – Number of shape parameters
- expression_params (int, default: 50) – Number of expression parameters
- pose_params (int, default: 6) – Number of pose parameters
- use_face_contour (bool, default: True) – If true, also apply the landmark loss on the face contour.
- use_3d_translation (bool, default: True) – If true, use 3D translation.
- optimize_eyeballpose (bool, default: True) – If true, optimize for the eyeball pose.
- optimize_neckpose (bool, default: True) – If true, optimize for the neck pose.
- num_worker (int, default: 4) – Number of PyTorch workers.
- batch_size (int, default: 1) – Training batch size.
- ring_margin (float, default: 0.5) – Ring loss margin.
- ring_loss_weight (float, default: 1.0) – Weight on the ring loss.
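Because all parameters carry defaults, the model can be constructed with no arguments and any field can be overridden by keyword. A minimal sketch, assuming pydantic v2 and re-declaring a subset of the fields above for illustration (the real class lives in liblaf.flame_pytorch.config):

```python
from pathlib import Path

from pydantic import BaseModel


class FlameConfig(BaseModel):
    # Subset of the documented fields, defaults mirrored from the list above.
    flame_model_path: Path = Path("model/generic_model.pkl")
    shape_params: int = 100
    expression_params: int = 50
    pose_params: int = 6
    batch_size: int = 1
    ring_loss_weight: float = 1.0


# Defaults apply unless overridden at construction time.
config = FlameConfig(shape_params=300, batch_size=8)
# config.shape_params is 300; config.expression_params keeps its default of 50.
```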
JSON schema:
{
"properties": {
"flame_model_path": {
"format": "path",
"title": "Flame Model Path",
"type": "string"
},
"static_landmark_embedding_path": {
"format": "path",
"title": "Static Landmark Embedding Path",
"type": "string"
},
"dynamic_landmark_embedding_path": {
"format": "path",
"title": "Dynamic Landmark Embedding Path",
"type": "string"
},
"shape_params": {
"default": 100,
"title": "Shape Params",
"type": "integer"
},
"expression_params": {
"default": 50,
"title": "Expression Params",
"type": "integer"
},
"pose_params": {
"default": 6,
"title": "Pose Params",
"type": "integer"
},
"use_face_contour": {
"default": true,
"title": "Use Face Contour",
"type": "boolean"
},
"use_3d_translation": {
"default": true,
"title": "Use 3D Translation",
"type": "boolean"
},
"optimize_eyeballpose": {
"default": true,
"title": "Optimize Eyeballpose",
"type": "boolean"
},
"optimize_neckpose": {
"default": true,
"title": "Optimize Neckpose",
"type": "boolean"
},
"num_worker": {
"default": 4,
"title": "Num Worker",
"type": "integer"
},
"batch_size": {
"default": 1,
"title": "Batch Size",
"type": "integer"
},
"ring_margin": {
"default": 0.5,
"title": "Ring Margin",
"type": "number"
},
"ring_loss_weight": {
"default": 1.0,
"title": "Ring Loss Weight",
"type": "number"
}
},
"title": "FlameConfig",
"type": "object"
}
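In the schema, the three path fields are strings with "format": "path"; pydantic coerces such strings into pathlib.Path objects during validation. A hedged sketch of that behavior, assuming pydantic v2 and re-declaring a minimal stand-in for the model:

```python
from pathlib import Path

from pydantic import BaseModel


class FlameConfig(BaseModel):
    # Minimal stand-in with two of the documented fields.
    flame_model_path: Path = Path("model/generic_model.pkl")
    use_face_contour: bool = True


# Validate plain JSON-style data; the hypothetical path below is for
# illustration only.
config = FlameConfig.model_validate(
    {"flame_model_path": "assets/my_model.pkl", "use_face_contour": False}
)
# config.flame_model_path is now a Path, not a str.
```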
Fields:

- flame_model_path (Path)
- static_landmark_embedding_path (Path)
- dynamic_landmark_embedding_path (Path)
- shape_params (int)
- expression_params (int)
- pose_params (int)
- use_face_contour (bool)
- use_3d_translation (bool)
- optimize_eyeballpose (bool)
- optimize_neckpose (bool)
- num_worker (int)
- batch_size (int)
- ring_margin (float)
- ring_loss_weight (float)
dynamic_landmark_embedding_path
pydantic-field
dynamic_landmark_embedding_path: Path
Dynamic contour embedding path for FLAME
optimize_eyeballpose
pydantic-field
optimize_eyeballpose: bool = True
If true optimize for the eyeball pose.
optimize_neckpose
pydantic-field
optimize_neckpose: bool = True
If true optimize for the neck pose.
static_landmark_embedding_path
pydantic-field
static_landmark_embedding_path: Path
Static landmark embeddings path for FLAME
get_config
get_config() -> FlameConfig
Source code in src/liblaf/flame_pytorch/config.py
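The body of get_config is not shown here. A plausible minimal implementation, assuming it simply returns a FlameConfig populated with its defaults (a hypothetical sketch, not the actual source; the stand-in model below carries only two of the documented fields):

```python
from pydantic import BaseModel


class FlameConfig(BaseModel):
    # Minimal stand-in for the documented model.
    shape_params: int = 100
    num_worker: int = 4


def get_config() -> FlameConfig:
    # Hypothetical: instantiate the config with all defaults.
    return FlameConfig()


config = get_config()
```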