Parameter estimators¶
Optimizer and sampler schemas used inside DataFit. Mirrors ionworkspipeline.optimizers (also available as the ionworks_schema.optimizers alias).
Schemas for parameter_estimators.
- class ionworks_schema.parameter_estimators.AskTellOptimizer(method='CMAES', log_to_screen=False, sigma0=None, max_iterations=300, max_unchanged_iterations=75, max_unchanged_iterations_threshold=None, min_iterations=1, max_evaluations=1000000, population_size=None, threshold=None, absolute_tolerance=1e-05, relative_tolerance=0.01, xtol=1e-06, population_convergence_tol=0.005, flat_fitness_tol=None, convergence_patience=3, algorithm_options=None, async_mode=False)¶
Bases:
BaseSchema
Ask/tell optimizer for population-based and simplex optimization.
Supports CMAES, PSO, DifferentialEvolution, XNES, and Nelder-Mead.
Parameters¶
- method : str, optional
Optimization method. Default is “CMAES”. Must be one of: “CMAES”, “Nelder-Mead”, “PSO”, “DifferentialEvolution”, or “XNES”.
- log_to_screen : bool, optional
Whether to print optimization progress. Default is False.
- sigma0 : float | np.ndarray, optional
Initial step size for population-based methods. Default is None.
- max_iterations : int, optional
Maximum number of iterations. Default is 300.
- max_unchanged_iterations : int, optional
Stop after this many iterations without improvement. Default is 75.
- max_unchanged_iterations_threshold : float, optional
Legacy alias for absolute_tolerance. Prefer absolute_tolerance in new code.
- min_iterations : int, optional
Minimum iterations before checking stopping criteria. Default is 1.
- max_evaluations : int, optional
Maximum number of function evaluations. Default is 1e6.
- population_size : int, optional
Population size for population-based methods. Default is method-specific.
- threshold : float, optional
Target objective value to stop optimization. Default is None.
- absolute_tolerance : float, optional
Absolute tolerance for unchanged iterations. Default is 1e-5.
- relative_tolerance : float, optional
Relative tolerance for unchanged iterations. Default is 1e-2.
- xtol : float, optional
Parameter change tolerance (L-inf norm). Default is 1e-6.
- population_convergence_tol : float, optional
Population convergence tolerance. The exact convergence semantics are algorithm-dependent (e.g. DE also checks position-space diversity). Default is 5e-3.
- flat_fitness_tol : float, optional
Flat fitness tolerance. Default is None (disabled).
- convergence_patience : int, optional
Number of consecutive generations the population convergence check must pass before stopping. Default is 3.
- algorithm_options : dict, optional
Algorithm-specific configuration options. For PSO, supported keys are:
  - inertia_weight: tuple[float, float] - (w_start, w_end) for linear decay. Enables inertia mode and disables constriction.
  - constriction: bool - Use constriction coefficient (default: True)
  - c1: float - Cognitive coefficient (default: 2.05)
  - c2: float - Social coefficient (default: 2.05)
  - boundary_handling: str | BoundaryHandling - “absorb”, “reflect”, “random”, “ignore”
  - velocity_clamping: str | VelocityClamping - “none”, “fraction”, “adaptive”
  - v_max_fraction: float - Fraction of search space for max velocity (default: 0.2)
  - max_iterations: int - For adaptive parameter scheduling (default: 1000)
- **kwargs
Additional arguments for the AskTellOptimizer constructor.
Notes¶
Parallelism is controlled via DataFit’s parallel and num_workers parameters, not through the optimizer directly. DataFit will automatically configure the optimizer’s parallelism settings based on its own configuration.
Examples¶
>>> opt = iws.parameter_estimators.AskTellOptimizer(
...     method="CMAES", max_iterations=200,
... )
>>> fit = iws.DataFit(
...     objectives={"ocp": iws.objectives.OCPHalfCell(
...         electrode="positive", data_input="path/to/ocp.csv",
...     )},
...     parameters={"x": iws.Parameter("x", initial_value=1.0, bounds=(0.0, 2.0))},
...     optimizer=opt,
... )
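Algorithm-specific settings can also be supplied through algorithm_options. A minimal sketch using the PSO keys documented above; the values are illustrative, not tuned recommendations:
>>> opt = iws.parameter_estimators.AskTellOptimizer(
...     method="PSO",
...     max_iterations=150,
...     algorithm_options={
...         "c1": 2.05,                      # cognitive coefficient
...         "c2": 2.05,                      # social coefficient
...         "boundary_handling": "reflect",  # reflect particles back into bounds
...     },
... )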
Extends:
ionworks_schema.base.BaseSchema
- algorithm_options: CMAESOptions | PSOOptions | DEOptions | XNESOptions | dict | None¶
- model_config = {'arbitrary_types_allowed': True, 'extra': 'forbid', 'populate_by_name': True, 'validate_assignment': True, 'validate_by_alias': True, 'validate_by_name': True}¶
Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
- class ionworks_schema.parameter_estimators.CMAES(**extra_data: Any)¶
Bases:
_PassthroughOptimizer
Covariance Matrix Adaptation Evolution Strategy (CMA-ES).
A strong default for ill-conditioned, noisy, non-linear fits. Convenience wrapper around
AskTellOptimizer(method="CMAES"); see AskTellOptimizer for the full list of accepted kwargs.
Examples¶
>>> opt = iws.parameter_estimators.CMAES()
>>> fit = iws.DataFit(
...     objectives={"ocp": iws.objectives.OCPHalfCell(
...         electrode="positive", data_input="path/to/ocp.csv",
...     )},
...     parameters={"x": iws.Parameter("x", initial_value=1.0, bounds=(0.0, 2.0))},
...     optimizer=opt,
... )
Extends:
ionworks_schema.parameter_estimators.parameter_estimators._PassthroughOptimizer
- model_config = {'arbitrary_types_allowed': True, 'extra': 'allow', 'populate_by_name': True, 'validate_assignment': True, 'validate_by_alias': True, 'validate_by_name': True}¶
Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
- class ionworks_schema.parameter_estimators.CMAESOptions(**extra_data: Any)¶
Bases:
_AlgorithmOptions
Options for AskTellOptimizer(method="CMAES"), passed through to pycma.
Accepts any key valid for cma.CMAOptions() — see pycma documentation for the full list. Common keys: CMA_diagonal, ftarget, CMA_stds, scaling_of_variables.
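Examples¶
A minimal sketch passing a CMAESOptions instance through AskTellOptimizer's algorithm_options field; the ftarget value is illustrative, not a recommendation:
>>> cma_opts = iws.parameter_estimators.CMAESOptions(ftarget=1e-8)  # stop once the objective reaches this value
>>> opt = iws.parameter_estimators.AskTellOptimizer(
...     method="CMAES", algorithm_options=cma_opts,
... )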
Extends:
ionworks_schema.parameter_estimators.parameter_estimators._AlgorithmOptions
- model_config = {'arbitrary_types_allowed': True, 'extra': 'allow', 'populate_by_name': True, 'validate_assignment': True, 'validate_by_alias': True, 'validate_by_name': True}¶
Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
- class ionworks_schema.parameter_estimators.DEOptions(*, mutation_strategy: str | None = None, crossover_method: str | None = None, F: float | None = None, CR: float | None = None, dither: bool | tuple[float, float] | list[float] | None = None, p_best_rate: float | None = None, archive_size_ratio: float | None = None, memory_size: int | None = None, boundary_handling: str | None = None)¶
Bases:
_AlgorithmOptions
Options for AskTellOptimizer(method="DifferentialEvolution").
Any field left as None falls back to the Differential Evolution defaults set by the Ionworks pipeline.
Parameters¶
- mutation_strategy : str, optional
One of "rand_1", "best_1", "current_to_best_1", "rand_2", "best_2", "current_to_pbest_1". Default "current_to_pbest_1" (SHADE adaptive).
- crossover_method : str, optional
One of "binomial", "exponential". Default "binomial".
- F : float, optional
Mutation scale factor; initial mean for SHADE adaptive sampling. Default 0.5.
- CR : float, optional
Crossover probability; initial mean for SHADE adaptive sampling. Default 0.7.
- dither : bool or tuple of (float, float), optional
Randomise F per generation when using classic strategies. True uses the default range [0.5, 1.0]. Default True.
- p_best_rate : float, optional
Fraction of top individuals for pbest selection. Default 0.1.
- archive_size_ratio : float, optional
Archive size as a multiple of population. Default 1.0.
- memory_size : int, optional
Number of history entries for F/CR adaptation. Default equal to population size.
- boundary_handling : str, optional
Boundary handling strategy. Default "reflect".
Extends:
ionworks_schema.parameter_estimators.parameter_estimators._AlgorithmOptions
- model_config = {'arbitrary_types_allowed': True, 'extra': 'forbid', 'populate_by_name': True, 'validate_assignment': True, 'validate_by_alias': True, 'validate_by_name': True}¶
Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
- class ionworks_schema.parameter_estimators.DifferentialEvolution(**extra_data: Any)¶
Bases:
_PassthroughOptimizer
Differential Evolution (DE) with SHADE adaptive control.
Uses success-history based adaptive F/CR with current-to-pbest/1 mutation by default. Convenience wrapper around AskTellOptimizer(method="DifferentialEvolution"). See AskTellOptimizer for full parameter documentation.
Examples¶
>>> opt = iws.parameter_estimators.DifferentialEvolution()
>>> fit = iws.DataFit(
...     objectives={"ocp": iws.objectives.OCPHalfCell(
...         electrode="positive", data_input="path/to/ocp.csv",
...     )},
...     parameters={"x": iws.Parameter("x", initial_value=1.0, bounds=(0.0, 2.0))},
...     optimizer=opt,
... )
Extends:
ionworks_schema.parameter_estimators.parameter_estimators._PassthroughOptimizer
- model_config = {'arbitrary_types_allowed': True, 'extra': 'allow', 'populate_by_name': True, 'validate_assignment': True, 'validate_by_alias': True, 'validate_by_name': True}¶
Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
- class ionworks_schema.parameter_estimators.DummyOptimizer(**extra_data: Any)¶
Bases:
_PassthroughOptimizer
Alias for PointEstimateOptimizer.
Returns the supplied initial guess without running any optimisation — useful as a no-op in tests and pipeline scaffolding.
Extends:
ionworks_schema.parameter_estimators.parameter_estimators._PassthroughOptimizer
- model_config = {'arbitrary_types_allowed': True, 'extra': 'allow', 'populate_by_name': True, 'validate_assignment': True, 'validate_by_alias': True, 'validate_by_name': True}¶
Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
- class ionworks_schema.parameter_estimators.DummySampler(**extra_data: Any)¶
Bases:
_PassthroughOptimizer
Alias for PointEstimateSampler.
Returns a single sample equal to the initial guess — useful as a no-op in sampling pipelines.
Extends:
ionworks_schema.parameter_estimators.parameter_estimators._PassthroughOptimizer
- model_config = {'arbitrary_types_allowed': True, 'extra': 'allow', 'populate_by_name': True, 'validate_assignment': True, 'validate_by_alias': True, 'validate_by_name': True}¶
Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
- class ionworks_schema.parameter_estimators.GridSearch(npts=10)¶
Bases:
BaseSchema
Grid search sampler.
Evaluates the objective at all combinations of grid points across dimensions — a deterministic, exhaustive sweep. Cost grows as npts ** n_parameters; keep the parameter count small.
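Examples¶
A minimal construction sketch; with npts=5 and two fitted parameters the sweep evaluates 5 ** 2 = 25 grid points:
>>> sampler = iws.parameter_estimators.GridSearch(npts=5)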
Extends:
ionworks_schema.base.BaseSchema
- model_config = {'arbitrary_types_allowed': True, 'extra': 'forbid', 'populate_by_name': True, 'validate_assignment': True, 'validate_by_alias': True, 'validate_by_name': True}¶
Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
- class ionworks_schema.parameter_estimators.Optimizer¶
Bases:
BaseSchema
Base class for all optimizers.
Optimisers seek a single optimal point in parameter space that minimises the objective. Not used directly — pick a concrete subclass such as AskTellOptimizer or one of the Scipy* wrappers.
Parameters¶
- **kwargs
Arguments passed to the underlying optimiser algorithm.
Extends:
ionworks_schema.base.BaseSchema
- model_config = {'arbitrary_types_allowed': True, 'extra': 'forbid', 'populate_by_name': True, 'validate_assignment': True, 'validate_by_alias': True, 'validate_by_name': True}¶
Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
- class ionworks_schema.parameter_estimators.PSO(**extra_data: Any)¶
Bases:
_PassthroughOptimizer
Particle Swarm Optimization (PSO). Population-based optimiser inspired by bird-flocking behaviour.
Convenience wrapper around AskTellOptimizer(method="PSO"); see AskTellOptimizer for the full list of accepted kwargs.
Examples¶
>>> opt = iws.parameter_estimators.PSO()
>>> fit = iws.DataFit(
...     objectives={"ocp": iws.objectives.OCPHalfCell(
...         electrode="positive", data_input="path/to/ocp.csv",
...     )},
...     parameters={"x": iws.Parameter("x", initial_value=1.0, bounds=(0.0, 2.0))},
...     optimizer=opt,
... )
Extends:
ionworks_schema.parameter_estimators.parameter_estimators._PassthroughOptimizer
- model_config = {'arbitrary_types_allowed': True, 'extra': 'allow', 'populate_by_name': True, 'validate_assignment': True, 'validate_by_alias': True, 'validate_by_name': True}¶
Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
- class ionworks_schema.parameter_estimators.PSOOptions(*, inertia_weight: tuple[float, float] | list[float] | None = None, constriction: bool | None = None, c1: float | None = None, c2: float | None = None, boundary_handling: str | None = None, velocity_clamping: str | None = None, v_max_fraction: float | None = None, max_iterations: int | None = None)¶
Bases:
_AlgorithmOptions
Options for AskTellOptimizer(method="PSO").
Any field left as None falls back to the PSO defaults set by the Ionworks pipeline.
Parameters¶
- inertia_weight : tuple of (float, float), optional
(w_start, w_end) for linear decay. When set, enables inertia mode and disables constriction.
- constriction : bool, optional
Use constriction coefficient. Default True (when inertia_weight is not set).
- c1 : float, optional
Cognitive coefficient (attraction to personal best). Default 2.05.
- c2 : float, optional
Social coefficient (attraction to global best). Default 2.05.
- boundary_handling : str, optional
One of "absorb", "reflect", "random", "ignore". Default "reflect".
- velocity_clamping : str, optional
One of "none", "fraction", "adaptive". Default "fraction".
- v_max_fraction : float, optional
Fraction of search space for max velocity. Default 0.2.
- max_iterations : int, optional
Horizon used for adaptive parameter scheduling. Default 1000.
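Examples¶
A minimal sketch passing a PSOOptions instance through AskTellOptimizer's algorithm_options field; the inertia schedule shown is illustrative:
>>> pso_opts = iws.parameter_estimators.PSOOptions(
...     inertia_weight=(0.9, 0.4),    # linear decay; enables inertia mode, disables constriction
...     boundary_handling="reflect",
... )
>>> opt = iws.parameter_estimators.AskTellOptimizer(
...     method="PSO", algorithm_options=pso_opts,
... )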
Extends:
ionworks_schema.parameter_estimators.parameter_estimators._AlgorithmOptions
- model_config = {'arbitrary_types_allowed': True, 'extra': 'forbid', 'populate_by_name': True, 'validate_assignment': True, 'validate_by_alias': True, 'validate_by_name': True}¶
Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
- class ionworks_schema.parameter_estimators.ParameterEstimator¶
Bases:
BaseSchema
Base class for all parameter estimators.
Parameter estimators find an optimal (or posterior) parameter set by minimising an objective. Not used directly — pick a concrete Optimizer or Sampler subclass.
Parameters¶
- **kwargs
Arguments passed to the underlying estimation algorithm.
Extends:
ionworks_schema.base.BaseSchema
See also: ionworkspipeline.ParameterEstimator (runtime implementation).
- model_config = {'arbitrary_types_allowed': True, 'extra': 'forbid', 'populate_by_name': True, 'validate_assignment': True, 'validate_by_alias': True, 'validate_by_name': True}¶
Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
- ionworks_schema.parameter_estimators.PintsOptimizer¶
alias of
AskTellOptimizer
- class ionworks_schema.parameter_estimators.PintsSampler(method='DramACMC', log_to_screen=False, max_iterations=1000, burnin_iterations=None, initial_phase_iterations=None)¶
Bases:
BaseSchema
Sampler using the Pints library (Probabilistic Inference on Noisy Time Series).
Wraps Pints’ MCMCController class for MCMC sampling.
Parameters¶
- method : str, optional
Sampling method. Default is “DramACMC”. Must be one of: “DramACMC”, “HamiltonianMCMC”, “HaarioBardenetACMC”, “MALAMCMC”, “MetropolisRandomWalkMCMC”, “MonomialGammaHamiltonianMCMC”, “NoUTurnMCMC”, “PopulationMCMC”, “RelativisticMCMC”, “SliceDoublingMCMC”, “SliceRankShrinkingMCMC”, or “SliceStepoutMCMC”.
- log_to_screen : bool, optional
Whether to print information at runtime. Default is False.
- max_iterations : int, optional
Maximum number of iterations. Default is 1000.
- burnin_iterations : int, optional
Number of initial iterations to discard. Default is 10% of max_iterations.
- initial_phase_iterations : int, optional
Number of iterations in initial phase. Only used for methods that need an initial phase. Default is equal to burnin_iterations.
- **kwargs
Additional parameters passed to the pints MCMC sampler. See pints documentation for details.
Notes¶
The log PDF values are cached internally to work around a pints limitation.
For methods that need an initial phase, initial_phase_iterations must not exceed burnin_iterations, which in turn must not exceed max_iterations.
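Examples¶
A minimal construction sketch; the iteration counts are illustrative and respect the ordering constraint from the Notes above (initial_phase_iterations <= burnin_iterations <= max_iterations):
>>> sampler = iws.parameter_estimators.PintsSampler(
...     method="HaarioBardenetACMC",
...     max_iterations=2000,
...     burnin_iterations=500,
...     initial_phase_iterations=250,
... )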
Extends:
ionworks_schema.base.BaseSchema
- model_config = {'arbitrary_types_allowed': True, 'extra': 'forbid', 'populate_by_name': True, 'validate_assignment': True, 'validate_by_alias': True, 'validate_by_name': True}¶
Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
- class ionworks_schema.parameter_estimators.PointEstimateOptimizer(**extra_data: Any)¶
Bases:
_PassthroughOptimizer
No-op optimiser — returns the initial guess unchanged.
Useful for evaluating a single parameter set or initialising pipelines without paying the cost of a real optimiser run.
Examples¶
>>> opt = iws.parameter_estimators.PointEstimateOptimizer()
>>> fit = iws.DataFit(
...     objectives={"ocp": iws.objectives.OCPHalfCell(
...         electrode="positive", data_input="path/to/ocp.csv",
...     )},
...     parameters={"x": iws.Parameter("x", initial_value=1.0, bounds=(0.0, 2.0))},
...     optimizer=opt,
... )
Extends:
ionworks_schema.parameter_estimators.parameter_estimators._PassthroughOptimizer
- model_config = {'arbitrary_types_allowed': True, 'extra': 'allow', 'populate_by_name': True, 'validate_assignment': True, 'validate_by_alias': True, 'validate_by_name': True}¶
Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
- class ionworks_schema.parameter_estimators.PointEstimateSampler(**extra_data: Any)¶
Bases:
_PassthroughOptimizer
No-op sampler — returns the initial guess as a single sample.
Useful for evaluating a single parameter set within a sampling-style pipeline without paying the cost of real MCMC.
Extends:
ionworks_schema.parameter_estimators.parameter_estimators._PassthroughOptimizer
- model_config = {'arbitrary_types_allowed': True, 'extra': 'allow', 'populate_by_name': True, 'validate_assignment': True, 'validate_by_alias': True, 'validate_by_name': True}¶
Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
- class ionworks_schema.parameter_estimators.Sampler¶
Bases:
BaseSchema
Base class for all Monte Carlo-style samplers.
Samplers explore parameter space to characterise the posterior, in contrast to optimisers which return a single best point. Not used directly — pick a concrete subclass such as PintsSampler or GridSearch.
Parameters¶
- **kwargs
Arguments passed to the underlying sampling algorithm.
Extends:
ionworks_schema.base.BaseSchema
- model_config = {'arbitrary_types_allowed': True, 'extra': 'forbid', 'populate_by_name': True, 'validate_assignment': True, 'validate_by_alias': True, 'validate_by_name': True}¶
Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
- class ionworks_schema.parameter_estimators.ScipyBasinhopping(**extra_data: Any)¶
Bases:
_PassthroughOptimizer
Global optimizer using basin-hopping with local minimization.
Basin-hopping is a two-phase stochastic algorithm that combines random perturbations with local minimization to escape local minima. It repeatedly applies random perturbations to the current minimum, accepts or rejects based on the Metropolis criterion, and performs local minimization from the perturbed position. This approach efficiently explores the energy landscape while refining solutions locally.
Notes¶
Accepts initial guess x0 as the starting point for optimization
More efficient than pure global search for moderately complex landscapes
Local minimizer can be customized via minimizer_kwargs parameter
Stochastic algorithm (use seed parameter for reproducibility)
Supports constraints through the local minimizer (via minimizer_kwargs)
Temperature parameter controls acceptance of uphill moves (higher = more exploration)
Parameters¶
- niter : int, default=100
Number of basin-hopping iterations.
- T : float, default=1.0
Temperature parameter for the Metropolis acceptance criterion. Higher values increase the probability of accepting uphill moves.
- stepsize : float, default=0.5
Initial step size for random displacement of coordinates.
- minimizer_kwargs : dict, optional
Extra keyword arguments passed to the local minimizer (scipy.optimize.minimize). Can specify the local method, bounds, constraints, and convergence criteria. Default uses L-BFGS-B with provided bounds.
- take_step : callable, optional
Custom step-taking function. If None, uses random displacement.
- accept_test : callable, optional
Custom acceptance test function. If None, uses Metropolis criterion.
- interval : int, default=50
Interval for updating stepsize (adaptive step size adjustment).
- seed : int, optional
Random seed for reproducible results.
- **kwargs
Additional arguments passed to scipy.optimize.basinhopping. See scipy documentation for complete options.
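Examples¶
A minimal sketch mirroring the other DataFit examples on this page; the niter, stepsize, and seed values are illustrative:
>>> opt = iws.parameter_estimators.ScipyBasinhopping(
...     niter=50, stepsize=0.25, seed=42,
... )
>>> fit = iws.DataFit(
...     objectives={"ocp": iws.objectives.OCPHalfCell(
...         electrode="positive", data_input="path/to/ocp.csv",
...     )},
...     parameters={"x": iws.Parameter("x", initial_value=1.0, bounds=(0.0, 2.0))},
...     optimizer=opt,
... )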
Extends:
ionworks_schema.parameter_estimators.parameter_estimators._PassthroughOptimizer
- model_config = {'arbitrary_types_allowed': True, 'extra': 'allow', 'populate_by_name': True, 'validate_assignment': True, 'validate_by_alias': True, 'validate_by_name': True}¶
Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
- class ionworks_schema.parameter_estimators.ScipyDifferentialEvolution(**extra_data: Any)¶
Bases:
_PassthroughOptimizer
Global stochastic optimizer using differential evolution with parallel evaluation.
Differential evolution is a robust global optimization algorithm that evolves a population of candidate solutions across generations. It excels at handling multi-modal, non-convex objective landscapes and requires no gradient information.
Notes¶
Does not support custom equality or inequality constraints
Parallel workers significantly speed up optimization (use DataFit's num_workers argument)
Initial guess x0 is ignored; initial population is generated from bounds
Polish option is disabled by default, as it typically decreases performance significantly
Callback logs only best solution per generation (not individual evaluations)
Parameters¶
- workers : int, default=1
Number of parallel workers for function evaluations. Use -1 for all CPU cores.
- maxiter : int, default=1000
Maximum number of generations.
- popsize : int, default=15
Population size multiplier (total population = popsize * dimensionality).
- strategy : str, default='best1bin'
Differential evolution strategy. Options include 'best1bin', 'rand1bin', 'best2bin', 'rand2bin', 'currenttobest1bin'.
- mutation : float or tuple, default=(0.5, 1)
Mutation constant. Can be float or (min, max) tuple for adaptive mutation.
- recombination : float, default=0.7
Crossover probability for parameter mixing.
- seed : int, optional
Random seed for reproducible results.
- atol, tol : float, optional
Absolute and relative tolerance for convergence.
- **kwargs
Additional arguments passed to scipy.optimize.differential_evolution. See scipy documentation for complete options.
Examples¶
>>> opt = iws.parameter_estimators.ScipyDifferentialEvolution()
>>> fit = iws.DataFit(
...     objectives={"ocp": iws.objectives.OCPHalfCell(
...         electrode="positive", data_input="path/to/ocp.csv",
...     )},
...     parameters={"x": iws.Parameter("x", initial_value=1.0, bounds=(0.0, 2.0))},
...     optimizer=opt,
... )
Extends:
ionworks_schema.parameter_estimators.parameter_estimators._PassthroughOptimizer
- model_config = {'arbitrary_types_allowed': True, 'extra': 'allow', 'populate_by_name': True, 'validate_assignment': True, 'validate_by_alias': True, 'validate_by_name': True}¶
Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
- class ionworks_schema.parameter_estimators.ScipyDualAnnealing(**extra_data: Any)¶
Bases:
_PassthroughOptimizer
Global stochastic optimizer using dual annealing.
Dual annealing combines generalized simulated annealing with fast local search. It’s designed for global optimization with a good balance between exploration and exploitation, particularly effective for rugged objective landscapes.
Notes¶
Does not support custom equality or inequality constraints
Accepts optional initial guess x0 to seed the search
Stochastic algorithm (use seed parameter for reproducibility)
Generally faster convergence than pure simulated annealing
Good choice when gradient information is unavailable
Parameters¶
- maxiter : int, default=1000
Maximum number of global search iterations.
- initial_temp : float, default=5230
Initial temperature for the annealing schedule.
- restart_temp_ratio : float, default=2e-5
Temperature ratio for restart condition during local search.
- visit : float, default=2.62
Parameter for the visiting distribution (higher = more exploration).
- accept : float, default=-5.0
Parameter for the acceptance distribution (lower = more exploitation).
- seed : int, optional
Random seed for reproducible results.
- no_local_search : bool, default=False
If True, skip local minimization (pure generalized simulated annealing).
- **kwargs
Additional arguments passed to scipy.optimize.dual_annealing. See scipy documentation for complete options.
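Examples¶
A minimal construction sketch (the maxiter and seed values are illustrative); the resulting optimizer is passed to DataFit via the optimizer argument, as in the other examples on this page:
>>> opt = iws.parameter_estimators.ScipyDualAnnealing(maxiter=500, seed=42)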
Extends:
ionworks_schema.parameter_estimators.parameter_estimators._PassthroughOptimizer
- model_config = {'arbitrary_types_allowed': True, 'extra': 'allow', 'populate_by_name': True, 'validate_assignment': True, 'validate_by_alias': True, 'validate_by_name': True}¶
Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
- class ionworks_schema.parameter_estimators.ScipyLeastSquares(**extra_data: Any)¶
Bases:
_PassthroughOptimizer
Nonlinear least squares optimizer using scipy's Trust Region Reflective algorithm.
This optimizer is designed for problems where the objective returns a residual vector rather than a scalar cost. It minimizes the sum of squares of the residuals. Best suited for well-behaved, smooth problems with a clear residual structure.
Notes¶
Requires objective functions that return an array (residual vector)
Automatically handles linear algebra errors by returning NaN values
More efficient than general minimization for least-squares structure
Supports bound constraints but not general equality/inequality constraints
Parameters¶
- method : str, optional
Algorithm to use. Options: 'trf' (default), 'dogbox', 'lm'.
- ftol, xtol, gtol : float, optional
Tolerance parameters for convergence criteria.
- max_nfev : int, optional
Maximum number of function evaluations.
- **kwargs
Additional arguments passed to scipy.optimize.least_squares. See scipy documentation for complete options.
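Examples¶
A minimal construction sketch (the method and max_nfev values are illustrative); note that the objective must return a residual vector, as described in the Notes above:
>>> opt = iws.parameter_estimators.ScipyLeastSquares(method="trf", max_nfev=500)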
Extends:
ionworks_schema.parameter_estimators.parameter_estimators._PassthroughOptimizer
- model_config = {'arbitrary_types_allowed': True, 'extra': 'allow', 'populate_by_name': True, 'validate_assignment': True, 'validate_by_alias': True, 'validate_by_name': True}¶
Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
- class ionworks_schema.parameter_estimators.ScipyMinimize(**extra_data: Any)¶
Bases:
_PassthroughOptimizer
General-purpose scalar minimization with support for constraints.
Wraps scipy’s minimize function, providing access to multiple local optimization algorithms (e.g., L-BFGS-B, SLSQP, trust-constr, COBYQA). Suitable for smooth, scalar-valued objectives with optional equality and inequality constraints.
Notes¶
Requires objective functions that return a scalar value
Supports bound constraints and custom equality/inequality constraints
Choice of method depends on problem structure and constraint types
Some methods (e.g., ‘L-BFGS-B’) support bounds only, not general constraints
Parameters¶
- method : str, optional
Optimization algorithm. Common choices:
  - 'L-BFGS-B': Bound-constrained, gradient-based (default for bounded problems)
  - 'SLSQP': Sequential Least Squares, supports all constraint types
  - 'trust-constr': Modern trust-region method, supports all constraints
  - 'COBYQA': Derivative-free, supports nonlinear constraints
- maxiter : int, optional
Maximum number of iterations.
- tol : float, optional
Tolerance for termination.
- **kwargs
Additional arguments passed to scipy.optimize.minimize. See scipy documentation for complete options.
Examples¶
>>> opt = iws.parameter_estimators.ScipyMinimize(method="L-BFGS-B")
>>> fit = iws.DataFit(
...     objectives={"ocp": iws.objectives.OCPHalfCell(
...         electrode="positive", data_input="path/to/ocp.csv",
...     )},
...     parameters={"x": iws.Parameter("x", initial_value=1.0, bounds=(0.0, 2.0))},
...     optimizer=opt,
... )
Extends:
ionworks_schema.parameter_estimators.parameter_estimators._PassthroughOptimizer
- model_config = {'arbitrary_types_allowed': True, 'extra': 'allow', 'populate_by_name': True, 'validate_assignment': True, 'validate_by_alias': True, 'validate_by_name': True}¶
Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
- class ionworks_schema.parameter_estimators.ScipyShgo(**extra_data: Any)¶
Bases:
_PassthroughOptimizer
Global optimizer using simplicial homology techniques.
SHGO (Simplicial Homology Global Optimization) uses topological techniques to identify and sample from all local minima basins. It’s particularly effective for problems with many local minima and supports general nonlinear constraints.
Notes¶
Deterministic algorithm (reproducible results without random seed)
Efficiently handles problems with many local optima
Supports bound, equality, and inequality constraints
May be slower than stochastic methods for high-dimensional problems
Initial guess x0 is ignored; sampling points determined by algorithm
Parameters¶
- n : int, default=100
Number of sampling points used in the algorithm.
- iters : int, default=1
Number of iterations for algorithm convergence.
- sampling_method : str, default='simplicial'
Sampling strategy: 'simplicial' (default) or 'sobol'.
- minimizer_kwargs : dict, optional
Additional arguments passed to the local minimizer.
- **kwargs
Additional arguments passed to scipy.optimize.shgo. See scipy documentation for complete options.
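Examples¶
A minimal construction sketch (the n and sampling_method values are illustrative):
>>> opt = iws.parameter_estimators.ScipyShgo(n=64, sampling_method="sobol")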
Extends:
ionworks_schema.parameter_estimators.parameter_estimators._PassthroughOptimizer
- model_config = {'arbitrary_types_allowed': True, 'extra': 'allow', 'populate_by_name': True, 'validate_assignment': True, 'validate_by_alias': True, 'validate_by_name': True}¶
Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
- class ionworks_schema.parameter_estimators.XNES(**extra_data: Any)¶
Bases:
_PassthroughOptimizer
Exponential Natural Evolution Strategy (xNES). Adapts its covariance matrix to the local fitness landscape.
Convenience wrapper around
AskTellOptimizer(method="XNES"); see AskTellOptimizer for the full list of accepted kwargs.
Examples¶
>>> opt = iws.parameter_estimators.XNES()
>>> fit = iws.DataFit(
...     objectives={"ocp": iws.objectives.OCPHalfCell(
...         electrode="positive", data_input="path/to/ocp.csv",
...     )},
...     parameters={"x": iws.Parameter("x", initial_value=1.0, bounds=(0.0, 2.0))},
...     optimizer=opt,
... )
Extends:
ionworks_schema.parameter_estimators.parameter_estimators._PassthroughOptimizer
- model_config = {'arbitrary_types_allowed': True, 'extra': 'allow', 'populate_by_name': True, 'validate_assignment': True, 'validate_by_alias': True, 'validate_by_name': True}¶
Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].