Validation

Schema for the Validation pipeline element: run a trained parameter set against held-out objectives.
- class ionworks_schema.validation.Validation(objectives: dict[str, FittingObjective | DesignObjective | dict], summary_stats: list[ObjectiveFunction | dict] | None = None)
Bases: BaseSchema

Check a fitted model against held-out experimental data.

A Validation step takes the parameters produced earlier in the pipeline, simulates the experiments listed in objectives, and compares those simulations to the measured data. The result tells you how well the model generalises beyond the data you fit on.

Each objective describes one comparison (e.g. "current vs. time for this discharge"). The summary_stats list controls which scalar error metrics (RMSE, MAE, max error, ...) are reported alongside the full time-series comparison.

Parameters
- objectives : dict[str, FittingObjective | DesignObjective | dict]
  One entry per experiment you want to compare against. The key is a human-readable label (used in the report); the value is the objective describing what to simulate and what to compare.
- summary_stats : list[ObjectiveFunction | dict], optional
  Which scalar error metrics to report (e.g. RMSE(), MAE(), Max()). If left unset, sensible defaults are filled in for fitting-style objectives so the report carries the same physical units as the measurements.
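As a rough illustration of what the summary statistics above compute (plain-Python definitions, not the library's implementation; the function names here are hypothetical), each one reduces the residual series between measured and simulated data to a single scalar in the units of the measurement:

```python
import math

def rmse(measured, simulated):
    """Root-mean-square error between two equal-length series."""
    residuals = [m - s for m, s in zip(measured, simulated)]
    return math.sqrt(sum(r * r for r in residuals) / len(residuals))

def mae(measured, simulated):
    """Mean absolute error."""
    return sum(abs(m - s) for m, s in zip(measured, simulated)) / len(measured)

def max_error(measured, simulated):
    """Largest absolute deviation."""
    return max(abs(m - s) for m, s in zip(measured, simulated))

# Toy voltage traces; all three metrics come out in volts.
measured = [3.7, 3.6, 3.5, 3.3]
simulated = [3.7, 3.5, 3.5, 3.4]
print(rmse(measured, simulated), mae(measured, simulated), max_error(measured, simulated))
```

Because each metric stays in the measurement's physical units, a reported MAE of 0.05 on a voltage trace reads directly as "50 mV of average error".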
Examples
>>> obj1 = iws.objectives.CurrentDriven(data_input="path/to/cycle_1C.csv")
>>> obj2 = iws.objectives.CurrentDriven(data_input="path/to/cycle_C2.csv")
>>> val = iws.Validation(
...     objectives={"1C": obj1, "C/2": obj2},
...     summary_stats=[iws.costs.RMSE(), iws.costs.MAE()],
... )
>>> config = iws.Pipeline({"validate": val}).to_config()
>>> # then submit `config` via ionworks-api
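Conceptually, the validation step iterates over the labelled objectives, simulates each one with the fitted parameters, and scores the simulation against the measured data. The sketch below shows that shape in plain Python; `simulate`, `load_measured`, and the toy data are hypothetical stand-ins for the real runtime, not the library's API:

```python
def validate(objectives, params, summary_stats, simulate, load_measured):
    """Simulate each labelled objective with the fitted `params` and
    score the simulation against the measured data."""
    report = {}
    for label, objective in objectives.items():
        simulated = simulate(objective, params)
        measured = load_measured(objective)
        # One scalar per requested summary statistic, per objective.
        report[label] = {name: stat(measured, simulated)
                         for name, stat in summary_stats.items()}
    return report

# Toy stand-ins for the real simulator and data loader.
params = {"gain": 2.0}
simulate = lambda obj, p: [p["gain"] * t for t in obj["times"]]
load_measured = lambda obj: obj["data"]

objectives = {"1C": {"times": [0.0, 1.0, 2.0], "data": [0.0, 2.1, 3.9]}}
stats = {"MAE": lambda m, s: sum(abs(a - b) for a, b in zip(m, s)) / len(m)}
report = validate(objectives, params, stats, simulate, load_measured)
```

The report keyed by the human-readable objective labels ("1C", "C/2", ...) is what makes the per-experiment comparison in the final output readable.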
Extends: ionworks_schema.base.BaseSchema

See also: ionworkspipeline.Validation (runtime implementation).

- objectives: dict[str, FittingObjective | DesignObjective | dict]
- summary_stats: list[ObjectiveFunction | dict] | None
- model_config = {'arbitrary_types_allowed': True, 'extra': 'forbid', 'populate_by_name': True, 'validate_assignment': True, 'validate_by_alias': True, 'validate_by_name': True}

Configuration for the model: a dictionary conforming to pydantic's ConfigDict.
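Two of these flags are worth unpacking: `extra='forbid'` means unknown field names are rejected at construction (catching typos in pipeline configs early), and `validate_assignment` means values are re-checked when you assign to an attribute after construction. The class below is a minimal plain-Python analogue of that behaviour, not the schema class itself:

```python
class ForbidExtra:
    """Toy analogue of pydantic's extra='forbid' + validate_assignment."""

    # Expected field names and their accepted types.
    _fields = {"objectives": dict, "summary_stats": (list, type(None))}

    def __init__(self, **kwargs):
        unknown = set(kwargs) - set(self._fields)
        if unknown:  # extra='forbid': unknown keys are an error
            raise TypeError(f"unexpected fields: {sorted(unknown)}")
        for name, value in kwargs.items():
            setattr(self, name, value)  # routes through __setattr__

    def __setattr__(self, name, value):
        expected = self._fields.get(name)
        if expected is None:
            raise TypeError(f"unknown field: {name}")
        if not isinstance(value, expected):  # validate_assignment
            raise TypeError(f"{name} must be {expected}, got {type(value)}")
        object.__setattr__(self, name, value)
```

With this analogue, `ForbidExtra(objectives={}, sumary_stats=[])` raises immediately on the misspelled field, and `instance.objectives = "oops"` raises on assignment rather than silently storing a bad value.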