Internal API#
SCons Extensions#
Provide common SCons extensions with utility functions, builders, and construction environment methods.
- class waves.scons_extensions.AbaqusPseudoBuilder(builder: Builder, override_cpus: int | None = None)[source]
Bases: object
Abaqus Pseudo-Builder class which allows users to customize the Abaqus Pseudo-Builder.
- Parameters:
builder – A Builder generated by waves.scons_extensions.abaqus_solver_builder_factory()
override_cpus – Override the task-specific default number of CPUs. This kwarg value is most useful if propagated from a user-specified option at execution time. If None, Abaqus Pseudo-Builder tasks will use the task-specific default.
Warning
You must use an AbaqusSolver Builder generated from waves.scons_extensions.abaqus_solver_builder_factory(). Using the non-builder-factory waves.scons_extensions.abaqus_solver() (i.e. a Builder that does not use the program_options kwarg) is not supported.
- __call__(
- env: SConsEnvironment,
- job: str,
- inp: str | None = None,
- user: str | None = None,
- cpus: int = 1,
- processes: int | None = None,
- oldjob: list[str] | str | None = None,
- oldjob_restart_file_count: list[int] | int = 1,
- write_restart: bool = False,
- double: str = 'both',
- extra_sources: list[str] | None = None,
- extra_targets: list[str] | None = None,
- extra_options: str = '',
- **kwargs,
SCons Pseudo-Builder for running Abaqus jobs.
This SCons Pseudo-Builder wraps the WAVES Abaqus builders to automatically adjust the Abaqus command, sources list, and target list when specifying restart jobs and user subroutines.
Note
Restart files that are only used by Abaqus/Explicit (i.e. .abq, .pac, and .sel) are not currently added to the source and target lists when specifying oldjob or write_restart. Use extra_sources and extra_targets to manually add them when needed.
- Parameters:
job – Abaqus job name without file extension.
inp – Abaqus input file name with file extension. Defaults to job.inp.
user – User subroutine.
cpus – CPUs to use for the simulation. Superseded by override_cpus if provided during object instantiation. The CPUs option is escaped in the action string, i.e. changing the number of CPUs will not trigger a rebuild.
processes – Number of MPI processes. This option is escaped in the action string, i.e. changing the number of MPI processes will not necessarily trigger a rebuild. For some Abaqus/Standard analyses that write restart files, if the number of MPI processes changes then the number of restart files will change and that will trigger a rebuild.
oldjob – Name of job(s) to restart/import. If a single old job is specified, "-oldjob" will be added to the Abaqus command. If multiple old jobs are specified, "-oldjob" will not be added to the Abaqus command (users should use the *IMPORT, LIBRARY= syntax in the Abaqus input file).
oldjob_restart_file_count – Number of MPI process-specific restart files generated by oldjob. For Abaqus/Standard jobs, the number of restart files (specifically .mdl and .stt) will be determined by the number of MPI processes the job used. For example, an Abaqus/Standard job on 2 MPI processes will not output a .mdl file, but will produce .mdl.0 and .mdl.1 files. When restarting/importing that Abaqus/Standard job, specify oldjob_restart_file_count=2. Abaqus/Explicit restart files are not affected by the number of MPI processes, so oldjob_restart_file_count=1 should be specified when restarting/importing from an Abaqus/Explicit job. If oldjob is a single Abaqus job, specify a single integer. If oldjob is a list of Abaqus jobs, specify a list of integers for job-specific restart file counts or a single integer to be applied to all old jobs.
write_restart – If True, add restart files to the target list. This is required if you want to use these restart files for a restart job.
double – Passthrough option for Abaqus' -double ${double}.
extra_sources – Additional sources to supply to the builder.
extra_targets – Additional targets to supply to the builder.
extra_options – Additional Abaqus options to supply to the builder. Should not include any Abaqus options available as kwargs, e.g. cpus, oldjob, user, input, job.
kwargs – Any additional kwargs are passed through to the builder.
- Returns:
All targets associated with the Abaqus simulation.
SConstruct#
import waves

# Allow user to override simulation-specific default number of CPUs
AddOption('--solve-cpus', type='int')

env = Environment()
env.AddMethod(waves.scons_extensions.add_program, "AddProgram")
env["ABAQUS_PROGRAM"] = env.AddProgram(["abaqus"])
env.Append(BUILDERS={
    "AbaqusSolver": waves.scons_extensions.abaqus_solver_builder_factory(
        program=env["ABAQUS_PROGRAM"]
    )
})
env.AddMethod(
    waves.scons_extensions.AbaqusPseudoBuilder(
        builder=env.AbaqusSolver,
        override_cpus=env.GetOption("solve_cpus"),
    ),
    "Abaqus",
)
To define a simple Abaqus simulation:
env.Abaqus(job='simulation_1')
The job name can differ from the input file name:
env.Abaqus(job='assembly_simulation_1', inp='simulation_1.inp')
Specifying a user subroutine automatically adds the user subroutine to the source list:
env.Abaqus(job='simulation_1', user='user.f')
If you write restart files, you can add the restart files to the target list with:
env.Abaqus(job='simulation_1', write_restart=True)
When running Abaqus Standard jobs using MPI parallelization, explicitly control the number of MPI processes as this affects the number of restart files:
env.Abaqus(job='simulation_1', write_restart=True, processes=4)
Accurately capturing the number of restart files is important when restarting/importing an analysis, as SCons will know to check that the required restart files exist and are up-to-date:
env.Abaqus(job='simulation_2', oldjob='simulation_1')
Multiple previous jobs can be specified if the Abaqus simulation imports from multiple ODBs:
env.Abaqus(job='simulation_2', oldjob=['simulation_1', 'simulation_1a'])
The number of MPI process-specific restart files for old jobs can be specified for one or multiple old jobs:
env.Abaqus(job='simulation_2', oldjob='simulation_1', oldjob_restart_file_count=2)
env.Abaqus(job='simulation_2', oldjob=['simulation_1', 'simulation_1a'], oldjob_restart_file_count=[2, 4])
If your Abaqus job depends on files which aren’t detected by an implicit dependency scanner, you can add them to the source list directly:
env.Abaqus(job='simulation_1', user='user.f', extra_sources=['user_subroutine_input.csv'])
You can specify the default number of CPUs for the simulation:
env.Abaqus(job='simulation_1', cpus=4)
- class waves.scons_extensions.QOIPseudoBuilder(
- collection_dir: Path,
- build_dir: Path,
- update_expected: bool = False,
- _program: str = 'waves',
Bases: object
SCons Pseudo-Builder class which allows users to customize the QOI Pseudo-Builder.
Warning
This pseudo-builder is considered experimental pending early adopter end user trial and feedback.
The QOI xarray data array and dataset handling for expected/calculated comparisons should be stable, but the output plotting and reporting formatting is subject to change.
- Parameters:
collection_dir – Root directory of QOI archive artifacts.
build_dir – Root directory of SCons project build artifacts.
update_expected – Update the expected QOI CSV source files to match the calculated QOI values instead of comparing the calculated and expected values.
_program – The WAVES command line program call. Intended for internal use by developers to perform in-repository system testing. End users should not change the default value of this argument.
- __call__(
- env: SConsEnvironment,
- calculated: Path,
- expected: Path | None = None,
- archive: bool = False,
SCons Pseudo-Builder for regression testing and archiving quantities of interest (QOIs).
This SCons Pseudo-Builder provides a convenient method for archiving and regression testing QOIs (such as critical simulation outputs). When requested, it aggregates the calculated values in a directory for easy archival. If expected values are specified, it compares them to the calculated values and reports any differences to a CSV file. If there are differences which exceed the user-specified tolerances, an error is raised. If self.update_expected is True, the expected CSV files (in the source tree) will be updated to match the calculated QOI values, and no comparison between the two will be performed.
- Parameters:
calculated – Path to CSV file containing calculated QOIs. See qoi.read_qoi_set() for the CSV format.
expected – Path to CSV file containing expected QOI values and tolerances. See qoi.read_qoi_set() for the CSV format. See qoi.create_qoi() for the types of tolerances allowed. See qoi for how tolerances are checked. Each of the tolerances is checked independently; if any fail, an error is raised. If expected is not specified, then QOIs are archived but not compared to expected values. Either expected or archive=True must be specified. An expected QOI file without tolerances is meaningless; the regression test will always pass.
archive – If True, add the calculated QOIs to self.collection_dir alongside other archived QOIs. To complete the archive, the QOI files collected in self.collection_dir should be copied to a read-only central location using waves qoi archive.
- Returns:
List of SCons targets associated with regression testing and archiving the QOIs. Building these targets will regression test the QOIs and output a CSV file which contains the exact differences between calculated and expected values. If archive is True, these targets will also include moving the calculated QOIs CSV file to the collection directory.
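The documentation above lacks a usage example. A minimal SConstruct sketch, assuming the constructor and __call__ signatures documented above; the directory and CSV file names are hypothetical:

```python
# SConstruct sketch for QOIPseudoBuilder (experimental API); the directory
# and CSV file names below are hypothetical placeholders
from pathlib import Path

import waves

env = Environment()
env.AddMethod(
    waves.scons_extensions.QOIPseudoBuilder(
        collection_dir=Path("qoi_archive"),  # root for archived QOI artifacts
        build_dir=Path("build"),             # root of SCons build artifacts
        update_expected=False,               # compare rather than overwrite expected values
    ),
    "QOI",
)
# Regression test calculated QOIs against expected values and archive them.
# SCons supplies the env argument automatically; do not pass it explicitly.
env.QOI(
    calculated=Path("build/simulation_1_qoi.csv"),
    expected=Path("simulation_1_expected_qoi.csv"),
    archive=True,
)
```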
- class waves.scons_extensions.WAVESEnvironment(
- *args,
- ABAQUS_PROGRAM: str = 'abaqus',
- ANSYS_PROGRAM: str = 'ansys',
- CCX_PROGRAM: str = 'ccx',
- CHARMRUN_PROGRAM: str = 'charmrun',
- FIERRO_EXPLICIT_PROGRAM: str = 'fierro-parallel-explicit',
- FIERRO_IMPLICIT_PROGRAM: str = 'fierro-parallel-implicit',
- INCITER_PROGRAM: str = 'inciter',
- MPIRUN_PROGRAM: str = 'mpirun',
- PYTHON_PROGRAM: str = 'python',
- SIERRA_PROGRAM: str = 'sierra',
- SPHINX_BUILD_PROGRAM: str = 'sphinx-build',
- **kwargs,
Bases: SConsEnvironment
Overload SConsEnvironment with WAVES construction environment methods and builders.
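As a sketch of intended use, the subclass can replace a plain SCons Environment so the WAVES builders and methods listed below are available without manual registration; the Abaqus program path and file names are hypothetical:

```python
# SConstruct sketch: WAVESEnvironment pre-registers the WAVES builders and
# construction environment methods; the program path is a hypothetical example
import waves

env = waves.scons_extensions.WAVESEnvironment(ABAQUS_PROGRAM="/apps/abaqus")
env.AbaqusJournal(target=["model.cae"], source=["model.py"])
env.AbaqusSolver(target=["job.odb"], source=["job.inp"], job="job")
```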
- AbaqusDatacheck(target: list, source: list, *args, **kwargs) NodeList[source]
Define tasks with the builder returned by waves.scons_extensions.abaqus_solver_builder_factory(). Uses the waves.scons_extensions.abaqus_datacheck_emitter().
- Variables:
program – ${ABAQUS_PROGRAM}
- Parameters:
target – The task target list
source – The task source list
args – All positional arguments are passed through to the builder (not to the builder factory)
kwargs – All keyword arguments are passed through to the builder (not to the builder factory)
- AbaqusExplicit(target: list, source: list, *args, **kwargs) NodeList[source]
Define tasks with the builder returned by waves.scons_extensions.abaqus_solver_builder_factory(). Uses the waves.scons_extensions.abaqus_explicit_emitter().
- Variables:
program – ${ABAQUS_PROGRAM}
- Parameters:
target – The task target list
source – The task source list
args – All positional arguments are passed through to the builder (not to the builder factory)
kwargs – All keyword arguments are passed through to the builder (not to the builder factory)
- AbaqusJournal(target: list, source: list, *args, **kwargs) NodeList[source]
Define tasks with the builder returned by waves.scons_extensions.abaqus_journal_builder_factory().
- Variables:
program – ${ABAQUS_PROGRAM}
- Parameters:
target – The task target list
source – The task source list
args – All positional arguments are passed through to the builder (not to the builder factory)
kwargs – All keyword arguments are passed through to the builder (not to the builder factory)
- AbaqusPseudoBuilder(
- job: str,
- *args,
- override_cpus: int | None = None,
- **kwargs,
Define tasks with waves.scons_extensions.AbaqusPseudoBuilder.
When using this environment pseudo-builder, do not provide the first env argument.
- Parameters:
job – Abaqus job name.
override_cpus – Override the task-specific default number of CPUs. This kwarg value is most useful if propagated from a user-specified option at execution time. If None, Abaqus Pseudo-Builder tasks will use the task-specific default.
args – All other positional arguments are passed through to waves.scons_extensions.AbaqusPseudoBuilder.__call__()
kwargs – All other keyword arguments are passed through to waves.scons_extensions.AbaqusPseudoBuilder.__call__()
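A minimal task definition sketch, assuming the pseudo-builder has been attached to the environment under the name AbaqusPseudoBuilder; the job name is hypothetical:

```python
# Sketch: SCons supplies the env argument automatically when the pseudo-builder
# is attached with env.AddMethod, so only task-specific kwargs are passed
targets = env.AbaqusPseudoBuilder(job="simulation_1", cpus=2)
```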
- AbaqusSolver(target: list, source: list, *args, **kwargs) NodeList[source]
Define tasks with the builder returned by waves.scons_extensions.abaqus_solver_builder_factory().
- Variables:
program – ${ABAQUS_PROGRAM}
- Parameters:
target – The task target list
source – The task source list
args – All positional arguments are passed through to the builder (not to the builder factory)
kwargs – All keyword arguments are passed through to the builder (not to the builder factory)
- AbaqusStandard(target: list, source: list, *args, **kwargs) NodeList[source]
Define tasks with the builder returned by waves.scons_extensions.abaqus_solver_builder_factory(). Uses the waves.scons_extensions.abaqus_standard_emitter().
- Variables:
program – ${ABAQUS_PROGRAM}
- Parameters:
target – The task target list
source – The task source list
args – All positional arguments are passed through to the builder (not to the builder factory)
kwargs – All keyword arguments are passed through to the builder (not to the builder factory)
- AddCubit(*args, **kwargs) str | None[source]
Call waves.scons_extensions.add_cubit() as a construction environment method.
When using this environment method, do not provide the first env argument.
- AddCubitPython(*args, **kwargs) str | None[source]
Call waves.scons_extensions.add_cubit_python() as a construction environment method.
When using this environment method, do not provide the first env argument.
- AddProgram(*args, **kwargs) str | None[source]
Call waves.scons_extensions.add_program() as a construction environment method.
When using this environment method, do not provide the first env argument.
- AnsysAPDL(target: list, source: list, *args, **kwargs) NodeList[source]
Define tasks with the builder returned by waves.scons_extensions.ansys_apdl_builder_factory().
- Variables:
program – ${ANSYS_PROGRAM}
- Parameters:
target – The task target list
source – The task source list
args – All positional arguments are passed through to the builder (not to the builder factory)
kwargs – All keyword arguments are passed through to the builder (not to the builder factory)
- CalculiX(target: list, source: list, *args, **kwargs) NodeList[source]
Define tasks with the builder returned by waves.scons_extensions.calculix_builder_factory().
- Variables:
program – ${CCX_PROGRAM}
- Parameters:
target – The task target list
source – The task source list
args – All positional arguments are passed through to the builder (not to the builder factory)
kwargs – All keyword arguments are passed through to the builder (not to the builder factory)
- CheckProgram(*args, **kwargs) str | None[source]
Call waves.scons_extensions.check_program() as a construction environment method.
When using this environment method, do not provide the first env argument.
- CopySubstfile(*args, **kwargs) NodeList[source]
Call waves.scons_extensions.copy_substfile() as a construction environment method.
When using this environment method, do not provide the first env argument.
- FierroExplicit(target: list, source: list, *args, **kwargs) NodeList[source]
Define tasks with the builder returned by waves.scons_extensions.fierro_explicit_builder_factory().
- Variables:
program – ${MPIRUN_PROGRAM}
subcommand – ${FIERRO_EXPLICIT_PROGRAM}
- Parameters:
target – The task target list
source – The task source list
args – All positional arguments are passed through to the builder (not to the builder factory)
kwargs – All keyword arguments are passed through to the builder (not to the builder factory)
- FierroImplicit(target: list, source: list, *args, **kwargs) NodeList[source]
Define tasks with the builder returned by waves.scons_extensions.fierro_implicit_builder_factory().
- Variables:
program – ${MPIRUN_PROGRAM}
subcommand – ${FIERRO_IMPLICIT_PROGRAM}
- Parameters:
target – The task target list
source – The task source list
args – All positional arguments are passed through to the builder (not to the builder factory)
kwargs – All keyword arguments are passed through to the builder (not to the builder factory)
- FindProgram(*args, **kwargs) str | None[source]
Call waves.scons_extensions.find_program() as a construction environment method.
When using this environment method, do not provide the first env argument.
- FirstTargetBuilder(target: list, source: list, *args, **kwargs) NodeList[source]
Define tasks with the builder returned by waves.scons_extensions.first_target_builder_factory().
- Parameters:
target – The task target list
source – The task source list
args – All positional arguments are passed through to the builder (not to the builder factory)
kwargs – All keyword arguments are passed through to the builder (not to the builder factory)
- ParameterStudySConscript(*args, **kwargs) Any | tuple[Any] | None[source]
Call waves.scons_extensions.parameter_study_sconscript() as a construction environment method.
When using this environment method, do not provide the first env argument.
- ParameterStudyTask(*args, **kwargs) NodeList[source]
Call waves.scons_extensions.parameter_study_task() as a construction environment method.
When using this environment pseudo-builder, do not provide the first env argument.
- ParameterStudyWrite(*args, **kwargs) NodeList[source]
Define tasks with waves.scons_extensions.parameter_study_write().
When using this environment method, do not provide the first env argument.
- PrintBuildFailures(*args, **kwargs) None[source]
Call waves.scons_extensions.print_build_failures() as a construction environment method.
When using this environment method, do not provide the first env argument.
- ProjectAlias(*args, **kwargs) dict[str, str][source]
Call waves.scons_extensions.project_alias() as a construction environment method.
When using this environment method, do not provide the first env argument.
- ProjectHelp(*args, **kwargs) None[source]
Call waves.scons_extensions.project_help() as a construction environment method.
When using this environment method, do not provide the first env argument.
- PythonScript(target: list, source: list, *args, **kwargs) NodeList[source]
Define tasks with the builder returned by waves.scons_extensions.python_builder_factory().
- Variables:
program – ${PYTHON_PROGRAM}
- Parameters:
target – The task target list
source – The task source list
args – All positional arguments are passed through to the builder (not to the builder factory)
kwargs – All keyword arguments are passed through to the builder (not to the builder factory)
- QuinoaSolver(target: list, source: list, *args, **kwargs) NodeList[source]
Define tasks with the builder returned by waves.scons_extensions.quinoa_builder_factory().
- Variables:
program – ${CHARMRUN_PROGRAM}
subcommand – ${INCITER_PROGRAM}
- Parameters:
target – The task target list
source – The task source list
args – All positional arguments are passed through to the builder (not to the builder factory)
kwargs – All keyword arguments are passed through to the builder (not to the builder factory)
- Sierra(target: list, source: list, *args, **kwargs) NodeList[source]
Define tasks with the builder returned by waves.scons_extensions.sierra_builder_factory().
- Variables:
program – ${SIERRA_PROGRAM}
- Parameters:
target – The task target list
source – The task source list
args – All positional arguments are passed through to the builder (not to the builder factory)
kwargs – All keyword arguments are passed through to the builder (not to the builder factory)
- SphinxBuild(target: list, source: list, *args, **kwargs) NodeList[source]
Define tasks with the builder returned by waves.scons_extensions.sphinx_build().
- Variables:
program – ${SPHINX_BUILD_PROGRAM}
- Parameters:
target – The task target list
source – The task source list
args – All positional arguments are passed through to the builder (not to the builder factory)
kwargs – All keyword arguments are passed through to the builder (not to the builder factory)
- SphinxPDF(target: list, source: list, *args, **kwargs) NodeList[source]
Define tasks with the builder returned by waves.scons_extensions.sphinx_latexpdf().
- Variables:
program – ${SPHINX_BUILD_PROGRAM}
- Parameters:
target – The task target list
source – The task source list
args – All positional arguments are passed through to the builder (not to the builder factory)
kwargs – All keyword arguments are passed through to the builder (not to the builder factory)
- SubstitutionSyntax(*args, **kwargs) dict[str, Any][source]
Call waves.scons_extensions.substitution_syntax() as a construction environment method.
When using this environment method, do not provide the first env argument.
- Truchas(target: list, source: list, *args, **kwargs) NodeList[source]
Define tasks with the builder returned by waves.scons_extensions.truchas_builder_factory().
- Variables:
program – ${MPIRUN_PROGRAM}
subcommand – ${TRUCHAS_PROGRAM}
- Parameters:
target – The task target list
source – The task source list
args – All positional arguments are passed through to the builder (not to the builder factory)
kwargs – All keyword arguments are passed through to the builder (not to the builder factory)
- waves.scons_extensions.abaqus_datacheck_emitter(
- target: list,
- source: list,
- env: SConsEnvironment,
- suffixes: Iterable[str] = ('.odb', '.dat', '.msg', '.com', '.prt', '.023', '.mdl', '.sim', '.stt'),
- appending_suffixes: Iterable[str] | None = None,
- stdout_extension: str = '.stdout',
Abaqus solver emitter for datacheck targets.
SCons emitter for waves.scons_extensions.abaqus_solver_builder_factory() based builders. Built on waves.scons_extensions.abaqus_solver_emitter_factory().
Appends the target list with job task keyword argument named targets before passing through the waves.scons_extensions.first_target_emitter() emitter.
Searches for the job task keyword argument and appends the target list with f"{job}{suffix}" targets using the suffixes list.
Searches for a file ending in the stdout extension. If none is found, creates a target by appending the stdout extension to the first target in the target list. The associated Builder requires at least one target for this reason. The stdout file is always placed at the end of the returned target list.
This is an SCons emitter function and not an emitter factory. The suffix arguments, suffixes and appending_suffixes, are only relevant for developers writing new emitters which call this function as a base. The suffixes list emits targets where the suffix replaces the first target's suffix, e.g. for target.ext emit a new target target.suffix. The appending suffixes list emits targets where the suffix appends the first target's suffix, e.g. for target.ext emit a new target target.ext.appending_suffix.
The emitter will assume all emitted targets build in the current build directory. If the target(s) must be built in a build subdirectory, e.g. in a parameterized target build, then the first target must be provided with the build subdirectory, e.g. parameter_set1/target.ext. When in doubt, provide a STDOUT redirect file with the .stdout extension as a target, e.g. target.stdout or parameter_set1/target.stdout.
SConstruct#
import waves

env = Environment()
env.AddMethod(waves.scons_extensions.add_program, "AddProgram")
env["ABAQUS_PROGRAM"] = env.AddProgram(["abaqus"])
env.Append(BUILDERS={
    "AbaqusDatacheck": waves.scons_extensions.abaqus_solver_builder_factory(
        program=env["ABAQUS_PROGRAM"],
        emitter=waves.scons_extensions.abaqus_datacheck_emitter,
    )
})
env.AbaqusDatacheck(target=["job.odb"], source=["input.inp"], job="job")
Note
The job keyword argument must be provided in the task definition.
- Parameters:
target – The target file list of strings
source – The source file list of SCons.Node.FS.File objects
env – The builder’s SCons construction environment object
suffixes – Suffixes which should replace the first target’s extension
appending_suffixes – Suffixes which should append the first target’s extension
stdout_extension – The extension used by the STDOUT/STDERR redirect file
- Returns:
target, source
- waves.scons_extensions.abaqus_explicit_emitter(
- target: list,
- source: list,
- env: SConsEnvironment,
- suffixes: Iterable[str] = ('.odb', '.dat', '.msg', '.com', '.prt', '.sta'),
- appending_suffixes: Iterable[str] | None = None,
- stdout_extension: str = '.stdout',
Abaqus solver emitter for Explicit targets.
SCons emitter for waves.scons_extensions.abaqus_solver_builder_factory() based builders. Built on waves.scons_extensions.abaqus_solver_emitter_factory().
Appends the target list with job task keyword argument named targets before passing through the waves.scons_extensions.first_target_emitter() emitter.
Searches for the job task keyword argument and appends the target list with f"{job}{suffix}" targets using the suffixes list.
Searches for a file ending in the stdout extension. If none is found, creates a target by appending the stdout extension to the first target in the target list. The associated Builder requires at least one target for this reason. The stdout file is always placed at the end of the returned target list.
This is an SCons emitter function and not an emitter factory. The suffix arguments, suffixes and appending_suffixes, are only relevant for developers writing new emitters which call this function as a base. The suffixes list emits targets where the suffix replaces the first target's suffix, e.g. for target.ext emit a new target target.suffix. The appending suffixes list emits targets where the suffix appends the first target's suffix, e.g. for target.ext emit a new target target.ext.appending_suffix.
The emitter will assume all emitted targets build in the current build directory. If the target(s) must be built in a build subdirectory, e.g. in a parameterized target build, then the first target must be provided with the build subdirectory, e.g. parameter_set1/target.ext. When in doubt, provide a STDOUT redirect file with the .stdout extension as a target, e.g. target.stdout or parameter_set1/target.stdout.
SConstruct#
import waves

env = Environment()
env.AddMethod(waves.scons_extensions.add_program, "AddProgram")
env["ABAQUS_PROGRAM"] = env.AddProgram(["abaqus"])
env.Append(BUILDERS={
    "AbaqusExplicit": waves.scons_extensions.abaqus_solver_builder_factory(
        program=env["ABAQUS_PROGRAM"],
        emitter=waves.scons_extensions.abaqus_explicit_emitter,
    )
})
env.AbaqusExplicit(target=["job.odb"], source=["input.inp"], job="job")
Note
The job keyword argument must be provided in the task definition.
- Parameters:
target – The target file list of strings
source – The source file list of SCons.Node.FS.File objects
env – The builder’s SCons construction environment object
suffixes – Suffixes which should replace the first target’s extension
appending_suffixes – Suffixes which should append the first target’s extension
stdout_extension – The extension used by the STDOUT/STDERR redirect file
- Returns:
target, source
- waves.scons_extensions.abaqus_extract(program: str = 'abaqus') Builder[source]
Abaqus ODB file extraction Builder.
This builder executes the odb_extract command line utility against an ODB file in the source list. The ODB file must be the first file in the source list. If there is more than one ODB file in the source list, all but the first file are ignored by odb_extract.
This builder is unique in that no targets are required. The Builder emitter will append the builder managed targets and odb_extract target name constructions automatically. The first target determines the working directory for the emitter targets. If the target(s) must be built in a build subdirectory, e.g. in a parameterized target build, then at least one target must be provided with the build subdirectory, e.g. parameter_set1/target.h5. When in doubt, provide the expected H5 file as a target, e.g. source[0].h5.
The target list may specify an output H5 file name that differs from the ODB file base name as new_name.h5. If the first file in the target list does not contain the *.h5 extension, or if there is no file in the target list, the target list will be prepended with a name matching the ODB file base name and the *.h5 extension.
The builder emitter appends the CSV file created by the abaqus odbreport command as executed by odb_extract unless delete_report_file is set to True.
This builder supports the keyword arguments output_type, odb_report_args, and delete_report_file with behavior as described in the ODB Extract command line interface.
Format of HDF5 file#
/                           # Top level group required in all hdf5 files
/<instance name>/           # Groups containing data of each instance found in an odb
    FieldOutputs/           # Group with multiple xarray datasets for each field output
        <field name>/       # Group with datasets containing field output data for a specified set or surface
                            # If no set or surface is specified, the <field name> will be
                            # 'ALL_NODES' or 'ALL_ELEMENTS'
    HistoryOutputs/         # Group with multiple xarray datasets for each history output
        <region name>/      # Group with datasets containing history output data for specified history region name
                            # If no history region name is specified, the <region name> will be 'ALL NODES'
    Mesh/                   # Group written from an xarray dataset with all mesh information for this instance
/<instance name>_Assembly/  # Group containing data of assembly instance found in an odb
    Mesh/                   # Group written from an xarray dataset with all mesh information for this instance
/odb/                       # Catch all group for data found in the odbreport file not already organized by instance
    info/                   # Group with datasets that mostly give odb meta-data like name, path, etc.
    jobData/                # Group with datasets that contain additional odb meta-data
    rootAssembly/           # Group with datasets that match odb file organization per Abaqus documentation
    sectionCategories/      # Group with datasets that match odb file organization per Abaqus documentation
/xarray/                    # Group with a dataset that lists the location of all data written from xarray datasets
SConstruct#
import waves

env = Environment()
env.AddMethod(waves.scons_extensions.add_program, "AddProgram")
env["ABAQUS_PROGRAM"] = env.AddProgram(["abaqus"])
env.Append(BUILDERS={"AbaqusExtract": waves.scons_extensions.abaqus_extract()})
env.AbaqusExtract(target=["my_job.h5", "my_job.csv"], source=["my_job.odb"])
- Parameters:
program – An absolute path or basename string for the abaqus program
- Returns:
Abaqus extract builder
- waves.scons_extensions.abaqus_input_scanner() Scanner[source]
Abaqus input file dependency scanner.
Custom SCons scanner that searches for the INPUT= parameter and associated file dependencies inside Abaqus *.inp files.
- Returns:
Abaqus input file dependency Scanner
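A sketch of attaching the returned scanner to a construction environment so SCons rebuilds solver tasks when files referenced by INPUT= parameters change:

```python
# SConstruct sketch: register the Abaqus input file scanner so tasks with
# *.inp sources track their INPUT= file dependencies
import waves

env = Environment()
env.Append(SCANNERS=waves.scons_extensions.abaqus_input_scanner())
```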
- waves.scons_extensions.abaqus_journal(
- program: str = 'abaqus',
- required: str = 'cae -noGUI ${SOURCE.abspath}',
- action_prefix: str = 'cd ${TARGET.dir.abspath} &&',
- action_suffix: str = '> ${TARGETS[-1].abspath} 2>&1',
- environment_suffix: str = '> ${TARGETS[-2].abspath} 2>&1',
Construct and return an Abaqus journal file SCons builder.
This builder requires that the journal file to execute is the first source in the list. The builder returned by this function accepts all SCons Builder arguments. The arguments of this function are also available as keyword arguments of the builder. When provided during task definition, the keyword arguments override those of the builder returned by this function.
Builder/Task keyword arguments
program: The Abaqus command line executable absolute or relative path
required: A space delimited string of Abaqus required arguments
abaqus_options: The Abaqus command line options provided as a string
journal_options: The journal file command line options provided as a string
action_prefix: Advanced behavior. Most users should accept the defaults.
action_suffix: Advanced behavior. Most users should accept the defaults.
environment_suffix: Advanced behavior. Most users should accept the defaults.
At least one target must be specified. The first target determines the working directory for the builder’s action, as shown in the action code snippet below. The action changes the working directory to the first target’s parent directory prior to executing the journal file.
The Builder emitter will append the builder managed targets automatically. Appends target[0].abaqus_v6.env and target[0].stdout to the target list.

The emitter will assume all emitted targets build in the current build directory. If the target(s) must be built in a build subdirectory, e.g. in a parameterized target build, then the first target must be provided with the build subdirectory, e.g. parameter_set1/my_target.ext. When in doubt, provide a STDOUT redirect file as a target, e.g. target.stdout.

Abaqus journal builder action keywords#

${action_prefix} ${program} -information environment ${environment_suffix}
${action_prefix} ${program} ${required} ${abaqus_options} -- ${journal_options} ${action_suffix}

With the default argument values, this expands to

Abaqus journal builder action default expansion#

cd ${TARGET.dir.abspath} && abaqus -information environment > ${TARGETS[-2].abspath} 2>&1
cd ${TARGET.dir.abspath} && abaqus cae -noGUI ${SOURCE.abspath} ${abaqus_options} -- ${journal_options} > ${TARGETS[-1].abspath} 2>&1

SConstruct#

import waves
env = Environment()
env.AddMethod(waves.scons_extensions.add_program, "AddProgram")
env["ABAQUS_PROGRAM"] = env.AddProgram(["abaqus"])
env.Append(BUILDERS={"AbaqusJournal": waves.scons_extensions.abaqus_journal()})
env.AbaqusJournal(target=["my_journal.cae"], source=["my_journal.py"], journal_options="")
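The emitter behavior described above can be modeled in plain Python. This is a simplified sketch of the target-appending rule, not the SCons emitter itself:

```python
def journal_emitter_sketch(targets: list) -> list:
    """Simplified model of the journal builder emitter: append the builder
    managed environment and STDOUT targets named after the first target.
    Illustration only, not the WAVES implementation."""
    first = targets[0]
    return targets + [first + ".abaqus_v6.env", first + ".stdout"]

print(journal_emitter_sketch(["parameter_set1/my_target.ext"]))
# ['parameter_set1/my_target.ext', 'parameter_set1/my_target.ext.abaqus_v6.env', 'parameter_set1/my_target.ext.stdout']
```

Because the emitted names derive from the first target, placing the first target in a build subdirectory places the managed targets there too.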
- Parameters:
program – The Abaqus command line executable absolute or relative path
required – A space delimited string of Abaqus required arguments
action_prefix – Advanced behavior. Most users should accept the defaults.
action_suffix – Advanced behavior. Most users should accept the defaults.
environment_suffix – Advanced behavior. Most users should accept the defaults.
- Returns:
Abaqus journal builder
- waves.scons_extensions.abaqus_journal_builder_factory(environment: str = '', action_prefix: str = 'cd ${TARGET.dir.abspath} &&', program: str = 'abaqus', program_required: str = 'cae -noGUI ${SOURCES[0].abspath}', program_options: str = '', subcommand: str = '--', subcommand_required: str = '', subcommand_options: str = '', action_suffix: str = '> ${TARGETS[-1].abspath} 2>&1', emitter: ~collections.abc.Callable[[list, list, ~SCons.Script.SConscript.SConsEnvironment], tuple[list, list]] = <function first_target_emitter>, **kwargs) Builder[source]
Abaqus journal builder factory.
This builder factory extends waves.scons_extensions.first_target_builder_factory(). This builder factory uses the waves.scons_extensions.first_target_emitter(). At least one task target must be specified in the task definition and the last target will always be the expected STDOUT and STDERR redirection output file, TARGETS[-1] ending in *.stdout.

Warning

Users overriding the emitter keyword argument are responsible for providing an emitter with equivalent STDOUT file handling behavior as waves.scons_extensions.first_target_emitter() or updating the action_suffix option to match their emitter's behavior.

With the default options this builder requires the following source files provided in order:

- Abaqus journal file: *.py

action string construction#

${environment} ${action_prefix} ${program} ${program_required} ${program_options} ${subcommand} ${subcommand_required} ${subcommand_options} ${action_suffix}

action string default expansion#

${environment} cd ${TARGET.dir.abspath} && abaqus cae -noGUI ${SOURCES[0].abspath} ${program_options} -- ${subcommand_required} ${subcommand_options} > ${TARGETS[-1].abspath} 2>&1

SConstruct#

import waves
env = Environment()
env.AddMethod(waves.scons_extensions.add_program, "AddProgram")
env["ABAQUS_PROGRAM"] = env.AddProgram(["abaqus"])
env.Append(BUILDERS={
    "AbaqusJournal": waves.scons_extensions.abaqus_journal_builder_factory(
        program=env["ABAQUS_PROGRAM"]
    )
})
env.AbaqusJournal(target=["my_journal.cae"], source=["my_journal.py"])
The builder returned by this factory accepts all SCons Builder arguments. The arguments of this function are also available as keyword arguments of the builder. When provided during task definition, the task keyword arguments override the builder keyword arguments.
- Parameters:
environment – This variable is intended primarily for use with builders and tasks that can not execute from an SCons construction environment. For instance, when tasks execute on a remote server with SSH wrapped actions using waves.scons_extensions.ssh_builder_actions() and therefore must initialize the remote environment as part of the builder action.

action_prefix – This variable is intended to perform directory change operations prior to program execution
program – The Abaqus absolute or relative path
program_required – Space delimited string of required Abaqus options and arguments that are crucial to builder behavior and should not be modified except by advanced users
program_options – Space delimited string of optional Abaqus options and arguments that can be freely modified by the user
subcommand – The shell separator for positional arguments used to separate Abaqus program from Abaqus journal file arguments and options
subcommand_required – Space delimited string of required Abaqus journal file options and arguments that are crucial to builder behavior and should not be modified except by advanced users.
subcommand_options – Space delimited string of optional Abaqus journal file options and arguments that can be freely modified by the user
action_suffix – This variable is intended to perform program STDOUT and STDERR redirection operations.
emitter – An SCons emitter function. This is not a keyword argument in the action string.
kwargs – Any additional keyword arguments are passed directly to the SCons builder object.
- Returns:
Abaqus journal builder
- waves.scons_extensions.abaqus_solver(
- program: str = 'abaqus',
- required: str = '-interactive -ask_delete no -job ${job_name} -input ${SOURCE.filebase}',
- action_prefix: str = 'cd ${TARGET.dir.abspath} &&',
- action_suffix: str = '> ${TARGETS[-1].abspath} 2>&1',
- environment_suffix: str = '> ${TARGETS[-2].abspath} 2>&1',
- emitter: Literal['standard', 'explicit', 'datacheck'] | None = None,
Construct and return an Abaqus solver SCons builder.
This builder requires that the root input file is the first source in the list. The builder returned by this function accepts all SCons Builder arguments. The arguments of this function are also available as keyword arguments of the builder. When provided during task definition, the task keyword arguments override the builder keyword arguments.
Builder/Task keyword arguments
- program: The Abaqus command line executable absolute or relative path
- required: A space delimited string of Abaqus required arguments
- job_name: The job name string. If not specified, job_name defaults to the root input file stem. The Builder emitter will append common Abaqus output files as targets automatically from the job_name, e.g. job_name.odb.
- abaqus_options: The Abaqus command line options provided as a string.
- suffixes: Override the emitter targets with a new list of extensions, e.g. AbaqusSolver(target=[], source=["input.inp"], suffixes=[".odb"]) will emit only one file named job_name.odb.
- action_prefix: Advanced behavior. Most users should accept the defaults.
- action_suffix: Advanced behavior. Most users should accept the defaults.
- environment_suffix: Advanced behavior. Most users should accept the defaults.
The first target determines the working directory for the builder’s action, as shown in the action code snippet below. The action changes the working directory to the first target’s parent directory prior to executing the solver.
This builder is unique in that no targets are required. The Builder emitter will append the builder managed targets automatically. The target list only appends those extensions which are common to Abaqus analysis operations. Some extensions may need to be added explicitly according to the Abaqus simulation solver, type, or options. If you find that SCons isn’t automatically cleaning some Abaqus output files, they are not in the automatically appended target list.
The emitter will assume all emitted targets build in the current build directory. If the target(s) must be built in a build subdirectory, e.g. in a parameterized target build, then the first target must be provided with the build subdirectory, e.g. parameter_set1/job_name.odb. When in doubt, provide a STDOUT redirect file as a target, e.g. target.stdout.

The -interactive option is always appended to the builder action to avoid exiting the Abaqus task before the simulation is complete. The -ask_delete no option is always appended to the builder action to overwrite existing files in programmatic execution, where it is assumed that the Abaqus solver target(s) should be re-built when their source files change.

SConstruct#

import waves
env = Environment()
env.AddMethod(waves.scons_extensions.add_program, "AddProgram")
env["ABAQUS_PROGRAM"] = env.AddProgram(["abaqus"])
env.Append(BUILDERS={
    "AbaqusSolver": waves.scons_extensions.abaqus_solver(),
    "AbaqusStandard": waves.scons_extensions.abaqus_solver(emitter='standard'),
    "AbaqusOld": waves.scons_extensions.abaqus_solver(program="abq2019")
})
env.AbaqusSolver(target=[], source=["input.inp"], job_name="my_job", abaqus_options="-cpus 4")
env.AbaqusSolver(target=[], source=["input.inp"], job_name="my_job", suffixes=[".odb"])

Abaqus solver builder action keywords#

${action_prefix} ${program} -information environment ${environment_suffix}
${action_prefix} ${program} ${required} ${abaqus_options} ${action_suffix}

Abaqus solver builder action default expansion#

cd ${TARGET.dir.abspath} && abaqus -information environment > ${TARGETS[-2].abspath} 2>&1
cd ${TARGET.dir.abspath} && abaqus -interactive -ask_delete no -job ${job_name} -input ${SOURCE.filebase} ${abaqus_options} > ${TARGETS[-1].abspath} 2>&1

- Parameters:
program – An absolute path or basename string for the abaqus program
required – A space delimited string of Abaqus required arguments
action_prefix – Advanced behavior. Most users should accept the defaults.
action_suffix – Advanced behavior. Most users should accept the defaults.
environment_suffix – Advanced behavior. Most users should accept the defaults.
emitter – Emit file extensions based on the value of this variable. Overridden by the suffixes keyword argument that may be provided in the Task definition.

- "standard": [".odb", ".dat", ".msg", ".com", ".prt", ".sta"]
- "explicit": [".odb", ".dat", ".msg", ".com", ".prt", ".sta"]
- "datacheck": [".odb", ".dat", ".msg", ".com", ".prt", ".023", ".mdl", ".sim", ".stt"]
- default value: [".odb", ".dat", ".msg", ".com", ".prt"]
- Returns:
Abaqus solver builder
- waves.scons_extensions.abaqus_solver_builder_factory(environment: str = '', action_prefix: str = 'cd ${TARGET.dir.abspath} &&', program: str = 'abaqus', program_required: str = '-interactive -ask_delete no -job ${job} -input ${SOURCE.filebase}', program_options: str = '', subcommand: str = '', subcommand_required: str = '', subcommand_options: str = '', action_suffix: str = '> ${TARGETS[-1].abspath} 2>&1', emitter: ~collections.abc.Callable[[list, list, ~SCons.Script.SConscript.SConsEnvironment], tuple[list, list]] = <function first_target_emitter>, **kwargs) Builder[source]
Abaqus solver builder factory.
This builder factory extends waves.scons_extensions.first_target_builder_factory(). This builder factory uses the waves.scons_extensions.first_target_emitter(). At least one task target must be specified in the task definition and the last target will always be the expected STDOUT and STDERR redirection output file, TARGETS[-1] ending in *.stdout.

Warning

Users overriding the emitter keyword argument are responsible for providing an emitter with equivalent STDOUT file handling behavior as waves.scons_extensions.first_target_emitter() or updating the action_suffix option to match their emitter's behavior.

With the default options this builder requires the following source files provided in order:

- Abaqus solver file: *.inp

action string construction#

${environment} ${action_prefix} ${program} ${program_required} ${program_options} ${subcommand} ${subcommand_required} ${subcommand_options} ${action_suffix}

action string default expansion#

${environment} cd ${TARGET.dir.abspath} && abaqus -interactive -ask_delete no -job ${job} -input ${SOURCE.filebase} ${program_options} ${subcommand} ${subcommand_required} ${subcommand_options} > ${TARGETS[-1].abspath} 2>&1

SConstruct#

import waves
env = Environment()
env.AddMethod(waves.scons_extensions.add_program, "AddProgram")
env["ABAQUS_PROGRAM"] = env.AddProgram(["abaqus"])
env.Append(BUILDERS={
    "AbaqusSolver": waves.scons_extensions.abaqus_solver_builder_factory(
        program=env["ABAQUS_PROGRAM"]
    )
})
env.AbaqusSolver(target=["job.odb"], source=["input.inp"], job="job")
Note
The job keyword argument must be provided in the task definition.

The builder returned by this factory accepts all SCons Builder arguments. The arguments of this function are also available as keyword arguments of the builder. When provided during task definition, the task keyword arguments override the builder keyword arguments.
- Parameters:
environment – This variable is intended primarily for use with builders and tasks that can not execute from an SCons construction environment. For instance, when tasks execute on a remote server with SSH wrapped actions using waves.scons_extensions.ssh_builder_actions() and therefore must initialize the remote environment as part of the builder action.

action_prefix – This variable is intended to perform directory change operations prior to program execution
program – The Abaqus absolute or relative path
program_required – Space delimited string of required Abaqus options and arguments that are crucial to builder behavior and should not be modified except by advanced users
program_options – Space delimited string of optional Abaqus options and arguments that can be freely modified by the user
subcommand – The subcommand absolute or relative path
subcommand_required – Space delimited string of required subcommand options and arguments that are crucial to builder behavior and should not be modified except by advanced users.
subcommand_options – Space delimited string of optional subcommand options and arguments that can be freely modified by the user
action_suffix – This variable is intended to perform program STDOUT and STDERR redirection operations.
emitter – An SCons emitter function. This is not a keyword argument in the action string.
kwargs – Any additional keyword arguments are passed directly to the SCons builder object.
- Returns:
Abaqus solver builder
- waves.scons_extensions.abaqus_solver_emitter_factory(
- suffixes: Iterable[str] = ('.odb', '.dat', '.msg', '.com', '.prt'),
- appending_suffixes: Iterable[str] | None = None,
- stdout_extension: str = '.stdout',
Abaqus solver emitter factory.
SCons emitter factory that returns emitters for waves.scons_extensions.abaqus_solver_builder_factory() based builders.

Emitters returned by this factory append the target list with job task keyword argument named targets before passing through the waves.scons_extensions.first_target_emitter() emitter.

- Searches for the job task keyword argument and appends the target list with f"{job}{suffix}" targets using the suffixes list.
- Searches for a file ending in the stdout extension. If none is found, creates a target by appending the stdout extension to the first target in the target list. The associated Builder requires at least one target for this reason. The stdout file is always placed at the end of the returned target list.

The suffixes list emits targets where the suffix replaces the first target’s suffix, e.g. for target.ext emit a new target target.suffix. The appending suffixes list emits targets where the suffix appends the first target’s suffix, e.g. for target.ext emit a new target target.ext.appending_suffix.

The emitter will assume all emitted targets build in the current build directory. If the target(s) must be built in a build subdirectory, e.g. in a parameterized target build, then the first target must be provided with the build subdirectory, e.g. parameter_set1/target.ext. When in doubt, provide a STDOUT redirect file with the .stdout extension as a target, e.g. target.stdout or parameter_set1/target.stdout.

SConstruct#

import waves
env = Environment()
env.AddMethod(waves.scons_extensions.add_program, "AddProgram")
env["ABAQUS_PROGRAM"] = env.AddProgram(["abaqus"])
env.Append(BUILDERS={
    "AbaqusStandard": waves.scons_extensions.abaqus_solver_builder_factory(
        program=env["ABAQUS_PROGRAM"],
        emitter=waves.scons_extensions.abaqus_solver_emitter_factory(
            suffixes=[".odb", ".dat", ".msg", ".com", ".prt", ".sta"],
        )
    )
})
env.AbaqusStandard(target=["job.odb"], source=["input.inp"], job="job")
Note
The job keyword argument must be provided in the task definition.
- Parameters:
suffixes – Suffixes which should replace the first target’s extension
appending_suffixes – Suffixes which should append the first target’s extension
stdout_extension – The extension used by the STDOUT/STDERR redirect file
- Returns:
emitter function
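The replace-versus-append suffix rules described above can be illustrated with a standalone sketch. The helper name is hypothetical and this models only the suffix handling, not the full WAVES emitter:

```python
from pathlib import Path

def emit_targets(first_target: str, suffixes=(), appending_suffixes=()) -> list:
    """Sketch of the emitter suffix rules: entries in 'suffixes' replace the
    first target's extension, entries in 'appending_suffixes' append to it."""
    first = Path(first_target)
    replaced = [str(first.with_suffix(suffix)) for suffix in suffixes]
    appended = [str(first) + suffix for suffix in appending_suffixes]
    return replaced + appended

print(emit_targets(
    "parameter_set1/target.ext",
    suffixes=[".odb", ".dat"],
    appending_suffixes=[".backup"],
))
```

Note how the build subdirectory prefix on the first target carries through to every emitted name, which is why the first target must include the subdirectory in parameterized builds.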
- waves.scons_extensions.abaqus_standard_emitter(
- target: list,
- source: list,
- env: SConsEnvironment,
- suffixes: Iterable[str] = ('.odb', '.dat', '.msg', '.com', '.prt', '.sta'),
- appending_suffixes: Iterable[str] | None = None,
- stdout_extension: str = '.stdout',
Abaqus solver emitter for Standard targets.
SCons emitter for waves.scons_extensions.abaqus_solver_builder_factory() based builders. Built on waves.scons_extensions.abaqus_solver_emitter_factory().

Appends the target list with job task keyword argument named targets before passing through the waves.scons_extensions.first_target_emitter() emitter.

- Searches for the job task keyword argument and appends the target list with f"{job}{suffix}" targets using the suffixes list.
- Searches for a file ending in the stdout extension. If none is found, creates a target by appending the stdout extension to the first target in the target list. The associated Builder requires at least one target for this reason. The stdout file is always placed at the end of the returned target list.

This is an SCons emitter function and not an emitter factory. The suffix arguments, suffixes and appending_suffixes, are only relevant for developers writing new emitters which call this function as a base. The suffixes list emits targets where the suffix replaces the first target’s suffix, e.g. for target.ext emit a new target target.suffix. The appending suffixes list emits targets where the suffix appends the first target’s suffix, e.g. for target.ext emit a new target target.ext.appending_suffix.

The emitter will assume all emitted targets build in the current build directory. If the target(s) must be built in a build subdirectory, e.g. in a parameterized target build, then the first target must be provided with the build subdirectory, e.g. parameter_set1/target.ext. When in doubt, provide a STDOUT redirect file with the .stdout extension as a target, e.g. target.stdout or parameter_set1/target.stdout.

SConstruct#

import waves
env = Environment()
env.AddMethod(waves.scons_extensions.add_program, "AddProgram")
env["ABAQUS_PROGRAM"] = env.AddProgram(["abaqus"])
env.Append(BUILDERS={
    "AbaqusStandard": waves.scons_extensions.abaqus_solver_builder_factory(
        program=env["ABAQUS_PROGRAM"],
        emitter=waves.scons_extensions.abaqus_standard_emitter,
    )
})
env.AbaqusStandard(target=["job.odb"], source=["input.inp"], job="job")
Note
The job keyword argument must be provided in the task definition.
- Parameters:
target – The target file list of strings
source – The source file list of SCons.Node.FS.File objects
env – The builder’s SCons construction environment object
suffixes – Suffixes which should replace the first target’s extension
appending_suffixes – Suffixes which should append the first target’s extension
stdout_extension – The extension used by the STDOUT/STDERR redirect file
- Returns:
target, source
- waves.scons_extensions.action_list_scons(actions: Iterable[str]) ListAction[source]
Convert a list of action strings to an SCons.Action.ListAction object.
- Parameters:
actions – List of action strings
- Returns:
SCons.Action.ListAction object of SCons.Action.CommandAction
- waves.scons_extensions.action_list_strings(builder: Builder) list[str][source]
Return a builder’s action list as a list of str.
- Parameters:
builder – The builder to extract the action list from
- Returns:
list of builder actions
- waves.scons_extensions.add_cubit(env: SConsEnvironment, names: Sequence[str]) str | None[source]
Modify environment variables with the paths required to import cubit in a Python3 environment.

Returns the absolute path of the first program name found. Appends PATH with the first program’s parent directory if a program is found and the directory is not already on PATH. Prepends PYTHONPATH with parent/bin. Prepends LD_LIBRARY_PATH with parent/bin/python3.

Returns None if no program name is found.

Example Cubit environment modification#

import waves
env = Environment()
env.AddMethod(waves.scons_extensions.add_cubit, "AddCubit")
env["CUBIT_PROGRAM"] = env.AddCubit(["cubit"])
- Parameters:
env – The SCons construction environment object to modify
names – list of string program names for the main Cubit executable. May include an absolute path.
- Returns:
Absolute path of the Cubit executable. None if none of the names are found.
- waves.scons_extensions.add_cubit_python(env: SConsEnvironment, names: Sequence[str]) str | None[source]
Modify environment variables with the paths required to import cubit with the Cubit Python interpreter.

Returns the absolute path of the first Cubit Python interpreter found. Appends PATH with the Cubit Python parent directory if a program is found and the directory is not already on PATH. Prepends PYTHONPATH with parent/bin.

Returns None if no Cubit Python interpreter is found.

Example Cubit environment modification#

import waves
env = Environment()
env.AddMethod(waves.scons_extensions.add_cubit_python, "AddCubitPython")
env["CUBIT_PROGRAM"] = env.AddCubitPython(["cubit"])
- Parameters:
env – The SCons construction environment object to modify
names – list of string program names for the main Cubit executable. May include an absolute path.
- Returns:
Absolute path of the Cubit Python interpreter. None if none of the names are found.
- waves.scons_extensions.add_program(env: SConsEnvironment, names: Sequence[str]) str | None[source]
Search for a program from a list of possible program names. Add first found to system PATH.

Returns the absolute path of the first program name found. Appends PATH with the first program’s parent directory if a program is found and the directory is not already on PATH. Returns None if no program name is found.

Example search for an executable named “program”#

import waves
env = Environment()
env.AddMethod(waves.scons_extensions.add_program, "AddProgram")
env["PROGRAM"] = env.AddProgram(["program"])
- Parameters:
env – The SCons construction environment object to modify
names – list of string program names. May include an absolute path.
- Returns:
Absolute path of the found program. None if none of the names are found.
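The first-found search behavior resembles the following standalone sketch, using shutil.which as a stand-in for the SCons program search. The helper name is hypothetical, and the real method additionally modifies the construction environment PATH:

```python
import shutil

def find_first_program(names) -> "str | None":
    """Return the absolute path of the first candidate name found on the
    system PATH, or None if no candidate is found."""
    for name in names:
        found = shutil.which(name)
        if found is not None:
            return found
    return None

# Candidates are searched in order; the search stops at the first hit,
# so prefer listing version-specific names before generic fallbacks.
program = find_first_program(["abq2024", "abaqus"])
```

Ordering the names list from most to least preferred mirrors how one would list version-pinned executables before a generic name.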
- waves.scons_extensions.ansys_apdl_builder_factory(environment: str = '', action_prefix: str = 'cd ${TARGET.dir.abspath} &&', program: str = 'ansys', program_required: str = '-i ${SOURCES[0].abspath} -o ${TARGETS[-1].abspath}', program_options: str = '', subcommand: str = '', subcommand_required: str = '', subcommand_options: str = '', action_suffix: str = '', emitter: ~collections.abc.Callable[[list, list, ~SCons.Script.SConscript.SConsEnvironment], tuple[list, list]] = <function first_target_emitter>, **kwargs) Builder[source]
Ansys APDL builder factory.
Warning
This is an experimental builder. It is subject to change without warning.
Warning
This builder does not have a tutorial and is not included in the regression test suite yet. Contact the development team if you encounter problems or have recommendations for improved design behavior.
This builder factory extends waves.scons_extensions.first_target_builder_factory(). This builder factory uses the waves.scons_extensions.first_target_emitter(). At least one task target must be specified in the task definition and the last target will always be the expected STDOUT and STDERR redirection output file, TARGETS[-1] ending in *.stdout.

Warning

Users overriding the emitter keyword argument are responsible for providing an emitter with equivalent STDOUT file handling behavior as waves.scons_extensions.first_target_emitter() or updating the program_required option to match their emitter's behavior.

With the default options this builder requires the following source files provided in order:

- Ansys APDL file: *.dat

action string construction#

${environment} ${action_prefix} ${program} ${program_required} ${program_options} ${subcommand} ${subcommand_required} ${subcommand_options} ${action_suffix}

action string default expansion#

${environment} cd ${TARGET.dir.abspath} && ansys -i ${SOURCES[0].abspath} -o ${TARGETS[-1].abspath} ${program_options} ${subcommand} ${subcommand_required} ${subcommand_options} ${action_suffix}

SConstruct#

import waves
env = Environment()
env.AddMethod(waves.scons_extensions.add_program, "AddProgram")
env["ANSYS_PROGRAM"] = env.AddProgram(["ansys232"])
env.Append(BUILDERS={
    "AnsysAPDL": waves.scons_extensions.ansys_apdl_builder_factory(
        program=env["ANSYS_PROGRAM"]
    )
})
env.AnsysAPDL(
    target=["job.rst"],
    source=["source.dat"],
    program_options="-j job"
)
The builder returned by this factory accepts all SCons Builder arguments. The arguments of this function are also available as keyword arguments of the builder. When provided during task definition, the task keyword arguments override the builder keyword arguments.
- Parameters:
environment – This variable is intended primarily for use with builders and tasks that can not execute from an SCons construction environment. For instance, when tasks execute on a remote server with SSH wrapped actions using waves.scons_extensions.ssh_builder_actions() and therefore must initialize the remote environment as part of the builder action.

action_prefix – This variable is intended to perform directory change operations prior to program execution
program – The Ansys absolute or relative path
program_required – Space delimited string of required Ansys options and arguments that are crucial to builder behavior and should not be modified except by advanced users
program_options – Space delimited string of optional Ansys options and arguments that can be freely modified by the user
subcommand – A subcommand absolute or relative path
subcommand_required – Space delimited string of required subcommand options and arguments that are crucial to builder behavior and should not be modified except by advanced users.
subcommand_options – Space delimited string of optional subcommand options and arguments that can be freely modified by the user
action_suffix – This variable is intended to perform program STDOUT and STDERR redirection operations.
emitter – An SCons emitter function. This is not a keyword argument in the action string.
kwargs – Any additional keyword arguments are passed directly to the SCons builder object.
- Returns:
Ansys builder
- waves.scons_extensions.append_env_path(env: SConsEnvironment, program: str | Path) None[source]
Append SCons construction environment PATH with the program’s parent directory.

Uses the SCons AppendENVPath method. If the program parent directory is already on PATH, the PATH directory order is preserved.

Example environment modification#

import waves
env = Environment()
env["PROGRAM"] = waves.scons_extensions.find_program(env, ["program"])
if env["PROGRAM"]:
    waves.scons_extensions.append_env_path(env, env["PROGRAM"])
- Parameters:
env – The SCons construction environment object to modify
program – An absolute path for the program to add to SCons construction environment PATH
- Raises:
FileNotFoundError – if the program absolute path does not exist.
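The order-preserving append described above can be modeled with a plain string manipulation, standing in for SCons AppendENVPath. The helper is a hypothetical sketch, not the WAVES function:

```python
import os

def append_path_preserve_order(path_value: str, directory: str) -> str:
    """Append directory to a PATH-style string unless it is already present,
    preserving the existing directory order (models SCons AppendENVPath with
    its default delete_existing behavior)."""
    entries = path_value.split(os.pathsep) if path_value else []
    if directory not in entries:
        entries.append(directory)
    return os.pathsep.join(entries)

path = append_path_preserve_order("/usr/bin", "/opt/abaqus/bin")
path = append_path_preserve_order(path, "/usr/bin")  # already present: unchanged
print(path)  # /usr/bin:/opt/abaqus/bin on POSIX systems
```

Preserving order matters because PATH lookup is positional: re-appending an existing directory must not demote it below later entries.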
- waves.scons_extensions.builder_factory(
- environment: str = '',
- action_prefix: str = '',
- program: str = '',
- program_required: str = '',
- program_options: str = '',
- subcommand: str = '',
- subcommand_required: str = '',
- subcommand_options: str = '',
- action_suffix: str = '',
- emitter: Callable[[list, list, SConsEnvironment], tuple[list, list]] | None = None,
- **kwargs,
Template builder factory returning a builder with no emitter.
This builder provides a template action string with placeholder keyword arguments in the action string. The default behavior will not do anything unless the program or subcommand argument is updated to include an executable program. Because this builder has no emitter, all task targets must be fully specified in the task definition. See waves.scons_extensions.first_target_builder_factory() for an example of the default options used by most WAVES builders.

action string construction#

${environment} ${action_prefix} ${program} ${program_required} ${program_options} ${subcommand} ${subcommand_required} ${subcommand_options} ${action_suffix}

action string default expansion#

${environment} ${action_prefix} ${program} ${program_required} ${program_options} ${subcommand} ${subcommand_required} ${subcommand_options} ${action_suffix}

- Parameters:
environment – This variable is intended primarily for use with builders and tasks that can not execute from an SCons construction environment. For instance, when tasks execute on a remote server with SSH wrapped actions using waves.scons_extensions.ssh_builder_actions() and therefore must initialize the remote environment as part of the builder action.

action_prefix – This variable is intended to perform directory change operations prior to program execution. By default, SCons performs actions in the parent directory of the SConstruct file. However, many computational science and engineering programs leave output files in the current working directory, so it is convenient and sometimes necessary to change to the target’s parent directory prior to execution.
program – This variable is intended to contain the primary command line executable absolute or relative path
program_required – This variable is intended to contain a space delimited string of required program options and arguments that are crucial to builder behavior and should not be modified except by advanced users.
program_options – This variable is intended to contain a space delimited string of optional program options and arguments that can be freely modified by the user.
subcommand – This variable is intended to contain the program’s subcommand. If the program variable is set to a launch controlling program, e.g. mpirun or charmrun, then the subcommand may need to contain the full target executable program and any subcommands.

subcommand_required – This variable is intended to contain a space delimited string of required subcommand options and arguments that are crucial to builder behavior and should not be modified except by advanced users.
subcommand_options – This variable is intended to contain a space delimited string of optional subcommand options and arguments that can be freely modified by the user.
action_suffix – This variable is intended to perform program STDOUT and STDERR redirection operations. By default, SCons streams all STDOUT and STDERR to the terminal. However, in long or parallel workflows this may clutter the terminal and make it difficult to isolate critical debugging information, so it is convenient to redirect each program’s output to a task specific log file for later inspection and troubleshooting.
emitter – An SCons emitter function. This is not a keyword argument in the action string.
kwargs – Any additional keyword arguments are passed directly to the SCons builder object.
- Returns:
SCons template builder
- waves.scons_extensions.calculix_builder_factory(environment: str = '', action_prefix: str = 'cd ${TARGET.dir.abspath} &&', program: str = 'ccx', program_required: str = '-i ${SOURCE.filebase}', program_options: str = '', subcommand: str = '', subcommand_required: str = '', subcommand_options: str = '', action_suffix: str = '> ${TARGETS[-1].abspath} 2>&1', emitter: ~collections.abc.Callable[[list, list, ~SCons.Script.SConscript.SConsEnvironment], tuple[list, list]] = <function first_target_emitter>, **kwargs) Builder[source]
CalculiX builder factory.
Warning
This is an experimental builder. It is subject to change without warning.
This builder factory extends
waves.scons_extensions.first_target_builder_factory(). This builder factory uses thewaves.scons_extensions.first_target_emitter(). At least one task target must be specified in the task definition and the last target will always be the expected STDOUT and STDERR redirection output file,TARGETS[-1]ending in*.stdout.Warning
Users overriding the
emitterkeyword argument are responsible for providing an emitter with equivalent STDOUT file handling behavior aswaves.scons_extensions.first_target_emitter()or updating theaction_suffixoption to match their emitter’s behavior.Warning
CalculiX always appends the
.inpextension to the input file argument. Stripping the extension in the builder requires a file basename without preceding relative or absolute path. This builder is fragile to current working directory. Most users should not modify theaction_prefix.With the default options this builder requires the following source files provided in the order:
CalculiX input file:
*.inp
action string construction#${environment} ${action_prefix} ${program} ${program_required} ${program_options} ${subcommand} ${subcommand_required} ${subcommand_options} ${action_suffix}action string default expansion#${environment} cd ${TARGET.dir.abspath} && ccx -i ${SOURCE.filebase} ${program_options} ${subcommand} ${subcommand_required} ${subcommand_options} > ${TARGETS[-1].abspath} 2>&1SConstruct#import waves env = Environment() env.AddMethod(waves.scons_extensions.add_program, "AddProgram") env["CCX_PROGRAM"] = env.AddProgram(["ccx"]) env.Append(BUILDERS={ "CalculiX": waves.scons_extensions.calculix_builder_factory( program=env["CCX_PROGRAM"] ) }) env.CalculiX( target=["target.stdout"], source=["source.inp"], )
The builder returned by this factory accepts all SCons Builder arguments. The arguments of this function are also available as keyword arguments of the builder. When provided during task definition, the task keyword arguments override the builder keyword arguments.
- Parameters:
environment – This variable is intended primarily for use with builders and tasks that can not execute from an SCons construction environment. For instance, when tasks execute on a remote server with SSH wrapped actions using
waves.scons_extensions.ssh_builder_actions()and therefore must initialize the remote environment as part of the builder action.action_prefix – This variable is intended to perform directory change operations prior to program execution
program – The CalculiX
ccxabsolute or relative pathprogram_required – Space delimited string of required CalculiX options and arguments that are crucial to builder behavior and should not be modified except by advanced users
program_options – Space delimited string of optional CalculiX options and arguments that can be freely modified by the user
subcommand – A subcommand absolute or relative path
subcommand_required – Space delimited string of required subcommand options and arguments that are crucial to builder behavior and should not be modified except by advanced users.
subcommand_options – Space delimited string of optional subcommand options and arguments that can be freely modified by the user
action_suffix – This variable is intended to perform program STDOUT and STDERR redirection operations.
emitter – An SCons emitter function. This is not a keyword argument in the action string.
kwargs – Any additional keyword arguments are passed directly to the SCons builder object.
- Returns:
CalculiX builder
- waves.scons_extensions.catenate_actions(**outer_kwargs) Callable[source]
Apply the
catenate_builder_actions()action modifications to a function that returns an SCons Builder.Accepts the same keyword arguments as the
waves.scons_extensions.catenate_builder_actions()function.import SCons.Builder import waves @waves.scons_extensions.catenate_actions def my_builder(): return SCons.Builder.Builder(action=["echo $SOURCE > $TARGET", "echo $SOURCE >> $TARGET"])
- waves.scons_extensions.catenate_builder_actions(
- builder: Builder,
- program: str = '',
- options: str = '',
Catenate a builder’s actions and prepend the program and options.
${program} ${options} "action one && action two"- Parameters:
builder – The SCons builder to modify
program – wrapping executable
options – options for the wrapping executable
- Returns:
modified builder
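The catenation pattern above can be illustrated with a small standalone sketch. This is not the WAVES implementation; it only mimics the documented result string, and the `sbatch` wrapper in the usage example is a hypothetical choice of wrapping program:

```python
def catenate_actions_sketch(actions, program="", options=""):
    """Return a single action string: program options "action one && action two"."""
    catenated = " && ".join(actions)
    # Drop empty pieces so the result has no leading whitespace
    prefix = " ".join(part for part in (program, options) if part)
    return f'{prefix} "{catenated}"' if prefix else f'"{catenated}"'

# Hypothetical usage: wrap a two-action builder in a single batch submission
action = catenate_actions_sketch(
    ["echo $SOURCE > $TARGET", "echo $SOURCE >> $TARGET"],
    program="sbatch",
    options="--wait --output=${TARGET}.slurm.out --wrap",
)
```

The quoting matters: the catenated actions must reach the wrapping program as a single shell argument so the `&&` chain executes remotely (or inside the batch job) rather than in the local shell.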
- waves.scons_extensions.check_program(env: SConsEnvironment, prog_name: str) str | None[source]
Return the absolute path of the requested program or
None.Replacement for SCons CheckProg-like behavior without an SCons configure object.
Example search for an executable named “program”#import waves env = Environment() env.AddMethod(waves.scons_extensions.check_program, "CheckProgram") env["PROGRAM"] = env.CheckProgram(["program"])
- Parameters:
env – The SCons construction environment object to modify
prog_name – string program name to search in the construction environment path
- waves.scons_extensions.conda_environment(
- program: str = 'conda',
- subcommand: str = 'env export',
- required: str = '--file ${TARGET.abspath}',
- options: str = '',
- action_prefix: str = 'cd ${TARGET.dir.abspath} &&',
Create a Conda environment file with
conda env export.This builder is intended to help WAVES workflows document the Conda environment used in the current build. The arguments of this function are also available as keyword arguments of the builder. When provided during task definition, the task keyword arguments override the builder keyword arguments.
Builder/Task keyword arguments
program: The Conda command line executable absolute or relative pathsubcommand: The Conda environment export subcommandrequired: A space delimited string of subcommand required argumentsoptions: A space delimited string of subcommand optional argumentsaction_prefix: Advanced behavior. Most users should accept the defaults
At least one target must be specified. The first target determines the working directory for the builder’s action, as shown in the action code snippet below. The action changes the working directory to the first target’s parent directory prior to creating the Conda environment file.
Conda environment builder action keywords#${action_prefix} ${program} ${subcommand} ${required} ${options}Conda environment builder action default expansion#cd ${TARGET.dir.abspath} && conda env export --file ${TARGET.abspath} ${options}The modsim owner may choose to re-use this builder throughout their project configuration to provide various levels of granularity in the recorded Conda environment state. It’s recommended to include this builder at least once for any workflows that also use the
waves.scons_extensions.python_builder_factory(). The builder may be re-used once per build sub-directory to provide more granular build environment reproducibility in the event that sub-builds are run at different times with variations in the active Conda environment. For per-Python script task environment reproducibility, the builder source list can be linked to the output of awaves.scons_extensions.python_builder_factory()task with a target environment file name to match.The first recommendation, always building the project wide Conda environment file, is demonstrated in the example usage below.
SConstruct#import waves env = Environment() env.Append(BUILDERS={"CondaEnvironment": waves.scons_extensions.conda_environment()}) environment_target = env.CondaEnvironment(target=["environment.yaml"]) env.AlwaysBuild(environment_target)
- Parameters:
program – The Conda command line executable absolute or relative path
subcommand – The Conda environment export subcommand
required – A space delimited string of subcommand required arguments
options – A space delimited string of subcommand optional arguments
action_prefix – Advanced behavior. Most users should accept the defaults
- Returns:
Conda environment builder
- waves.scons_extensions.construct_action_list(
- actions: Sequence[str],
- prefix: str = '${action_prefix}',
- suffix: str = '',
Return an action list with a common pre/post-fix.
Returns the constructed action list with pre/post fix strings as
f"{prefix} {new_action} {suffix}"
where SCons action objects are converted to their string representation. If a string is passed instead of a list, it is first converted to a list. If an empty list is passed, an empty list is returned.
- Parameters:
actions – List of action strings
prefix – Common prefix to prepend to each action
suffix – Common suffix to append to each action
- Returns:
action list
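A minimal pure-Python sketch of the documented prefix/suffix behavior follows. The real function also accepts SCons action objects and converts them to strings before joining; `construct_action_list_sketch` is a hypothetical name for illustration:

```python
def construct_action_list_sketch(actions, prefix="${action_prefix}", suffix=""):
    """Prefix and suffix each action string; promote a bare string to a list."""
    if isinstance(actions, str):
        actions = [actions]
    # strip() removes the dangling space left by an empty prefix or suffix
    return [f"{prefix} {action} {suffix}".strip() for action in actions]
```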
- waves.scons_extensions.copy_substfile(
- env: SConsEnvironment,
- source_list: list,
- substitution_dictionary: dict | None = None,
- build_subdirectory: str | Path = '.',
- symlink: bool = False,
Pseudo-builder to copy source list to build directory and perform template substitutions on
*.infilenames.SCons Pseudo-Builder to chain two builders: a builder with the SCons Copy action and the SCons Substfile builders. Files are first copied to the build (variant) directory and then template substitution is performed on template files (any file ending with
*.insuffix) to create a file without the template suffix.When pseudo-builders are added to the environment with the SCons AddMethod function they can be accessed with the same syntax as a normal builder. When called from the construction environment, the
envargument is omitted. See the example below.To avoid dependency cycles, the source file(s) should be passed by absolute path.
SConstruct#import pathlib import waves current_directory = pathlib.Path(Dir(".").abspath) env = Environment() env.AddMethod(waves.scons_extensions.copy_substfile, "CopySubstfile") source_list = [ "#/subdir3/file_three.ext", # File found with respect to project root directory using SCons notation current_directory / "file_one.ext", # File found in current SConscript directory current_directory / "subdir2/file_two", # File found below current SConscript directory current_directory / "file_four.ext.in" # File with substitutions matching substitution dictionary keys ] substitution_dictionary = { "@variable_one@": "value_one" } env.CopySubstfile(source_list, substitution_dictionary=substitution_dictionary)
- Parameters:
env – An SCons construction environment to use when defining the targets.
source_list – List of pathlike objects or strings. Will be converted to list of pathlib.Path objects.
substitution_dictionary – key: value pairs for template substitution. The keys must contain the optional template characters if present, e.g.
@variable@. The template character, e.g.@, can be anything that works in the SCons Substfile builder.build_subdirectory – build subdirectory relative path prepended to target files
symlink – Whether symbolic links are created as new symbolic links. If true, symbolic links are shallow copies as a new symbolic link. If false, symbolic links are copied as a new file (dereferenced).
- Returns:
SCons NodeList of Copy and Substfile target nodes
- waves.scons_extensions.fierro_explicit_builder_factory(environment: str = '', action_prefix: str = 'cd ${TARGET.dir.abspath} &&', program: str = 'mpirun', program_required: str = '', program_options: str = '-np 1', subcommand: str = 'fierro-parallel-explicit', subcommand_required: str = '${SOURCE.abspath}', subcommand_options: str = '', action_suffix: str = '> ${TARGETS[-1].abspath} 2>&1', emitter: ~collections.abc.Callable[[list, list, ~SCons.Script.SConscript.SConsEnvironment], tuple[list, list]] = <function first_target_emitter>, **kwargs) Builder[source]
Fierro explicit builder factory.
Warning
This is an experimental builder. It is subject to change without warning.
This builder factory extends
waves.scons_extensions.first_target_builder_factory(). This builder factory uses thewaves.scons_extensions.first_target_emitter(). At least one task target must be specified in the task definition and the last target will always be the expected STDOUT and STDERR redirection output file,TARGETS[-1]ending in*.stdout.Warning
Users overriding the
emitterkeyword argument are responsible for providing an emitter with equivalent STDOUT file handling behavior aswaves.scons_extensions.first_target_emitter()or updating theaction_suffixoption to match their emitter’s behavior.With the default options this builder requires the following source files provided in the order:
Fierro input file:
*.yaml
action string construction#${environment} ${action_prefix} ${program} ${program_required} ${program_options} ${subcommand} ${subcommand_required} ${subcommand_options} ${action_suffix}action string default expansion#${environment} cd ${TARGET.dir.abspath} && mpirun ${program_required} -np 1 fierro-parallel-explicit ${SOURCE.abspath} ${subcommand_options} > ${TARGETS[-1].abspath} 2>&1SConstruct#import waves env = Environment() env.AddMethod(waves.scons_extensions.add_program, "AddProgram") env["FIERRO_EXPLICIT_PROGRAM"] = env.AddProgram(["fierro-parallel-explicit"]) env.Append(BUILDERS={ "FierroExplicit": waves.scons_extensions.fierro_explicit_builder_factory( subcommand=env["FIERRO_EXPLICIT_PROGRAM"] ) }) env.FierroExplicit( target=["target.stdout"], source=["source.yaml"], )
The builder returned by this factory accepts all SCons Builder arguments. The arguments of this function are also available as keyword arguments of the builder. When provided during task definition, the task keyword arguments override the builder keyword arguments.
- Parameters:
environment – This variable is intended primarily for use with builders and tasks that can not execute from an SCons construction environment. For instance, when tasks execute on a remote server with SSH wrapped actions using
waves.scons_extensions.ssh_builder_actions()and therefore must initialize the remote environment as part of the builder action.action_prefix – This variable is intended to perform directory change operations prior to program execution
program – The mpirun absolute or relative path
program_required – Space delimited string of required mpirun options and arguments that are crucial to builder behavior and should not be modified except by advanced users
program_options – Space delimited string of optional mpirun options and arguments that can be freely modified by the user
subcommand – The Fierro absolute or relative path
subcommand_required – Space delimited string of required Fierro options and arguments that are crucial to builder behavior and should not be modified except by advanced users.
subcommand_options – Space delimited string of optional Fierro options and arguments that can be freely modified by the user
action_suffix – This variable is intended to perform program STDOUT and STDERR redirection operations.
emitter – An SCons emitter function. This is not a keyword argument in the action string.
kwargs – Any additional keyword arguments are passed directly to the SCons builder object.
- Returns:
Fierro explicit builder
- waves.scons_extensions.fierro_implicit_builder_factory(environment: str = '', action_prefix: str = 'cd ${TARGET.dir.abspath} &&', program: str = 'mpirun', program_required: str = '', program_options: str = '-np 1', subcommand: str = 'fierro-parallel-implicit', subcommand_required: str = '${SOURCE.abspath}', subcommand_options: str = '', action_suffix: str = '> ${TARGETS[-1].abspath} 2>&1', emitter: ~collections.abc.Callable[[list, list, ~SCons.Script.SConscript.SConsEnvironment], tuple[list, list]] = <function first_target_emitter>, **kwargs) Builder[source]
Fierro implicit builder factory.
Warning
This is an experimental builder. It is subject to change without warning.
This builder factory extends
waves.scons_extensions.first_target_builder_factory(). This builder factory uses thewaves.scons_extensions.first_target_emitter(). At least one task target must be specified in the task definition and the last target will always be the expected STDOUT and STDERR redirection output file,TARGETS[-1]ending in*.stdout.With the default options this builder requires the following source files provided in the order:
Fierro input file:
*.yaml
action string construction#${environment} ${action_prefix} ${program} ${program_required} ${program_options} ${subcommand} ${subcommand_required} ${subcommand_options} ${action_suffix}action string default expansion#${environment} cd ${TARGET.dir.abspath} && mpirun ${program_required} -np 1 fierro-parallel-implicit ${SOURCE.abspath} ${subcommand_options} > ${TARGETS[-1].abspath} 2>&1SConstruct#import waves env = Environment() env.AddMethod(waves.scons_extensions.add_program, "AddProgram") env["FIERRO_IMPLICIT_PROGRAM"] = env.AddProgram(["fierro-parallel-implicit"]) env.Append(BUILDERS={ "FierroImplicit": waves.scons_extensions.fierro_implicit_builder_factory( subcommand=env["FIERRO_IMPLICIT_PROGRAM"] ) }) env.FierroImplicit( target=["target.stdout"], source=["source.yaml"], )
The builder returned by this factory accepts all SCons Builder arguments. The arguments of this function are also available as keyword arguments of the builder. When provided during task definition, the task keyword arguments override the builder keyword arguments.
- Parameters:
environment – This variable is intended primarily for use with builders and tasks that can not execute from an SCons construction environment. For instance, when tasks execute on a remote server with SSH wrapped actions using
waves.scons_extensions.ssh_builder_actions()and therefore must initialize the remote environment as part of the builder action.action_prefix – This variable is intended to perform directory change operations prior to program execution
program – The mpirun absolute or relative path
program_required – Space delimited string of required mpirun options and arguments that are crucial to builder behavior and should not be modified except by advanced users
program_options – Space delimited string of optional mpirun options and arguments that can be freely modified by the user
subcommand – The Fierro absolute or relative path
subcommand_required – Space delimited string of required Fierro options and arguments that are crucial to builder behavior and should not be modified except by advanced users.
subcommand_options – Space delimited string of optional Fierro options and arguments that can be freely modified by the user
action_suffix – This variable is intended to perform program STDOUT and STDERR redirection operations.
emitter – An SCons emitter function. This is not a keyword argument in the action string.
kwargs – Any additional keyword arguments are passed directly to the SCons builder object.
- Returns:
Fierro implicit builder
- waves.scons_extensions.find_program(env: SConsEnvironment, names: Sequence[str]) str | None[source]
Search for a program from a list of possible program names.
Returns the absolute path of the first program name found. If path parts contain spaces, the part will be wrapped in double quotes.
Example search for an executable named “program”#import waves env = Environment() env.AddMethod(waves.scons_extensions.find_program, "FindProgram") env["PROGRAM"] = env.FindProgram(["program"])
- Parameters:
env – The SCons construction environment object to modify
names – list of string program names. May include an absolute path.
- Returns:
Absolute path of the found program. None if none of the names are found.
- waves.scons_extensions.first_target_builder_factory(environment: str = '', action_prefix: str = 'cd ${TARGET.dir.abspath} &&', program: str = '', program_required: str = '', program_options: str = '', subcommand: str = '', subcommand_required: str = '', subcommand_options: str = '', action_suffix: str = '> ${TARGETS[-1].abspath} 2>&1', emitter: ~collections.abc.Callable[[list, list, ~SCons.Script.SConscript.SConsEnvironment], tuple[list, list]] = <function first_target_emitter>, **kwargs) Builder[source]
Template builder factory with WAVES default action behaviors and a task STDOUT file emitter.
This builder factory extends
waves.scons_extensions.builder_factory()to provide a template action string with placeholder keyword arguments and WAVES builder default behavior. The default behavior will not do anything unless theprogramorsubcommandargument is updated to include an executable program. This builder factory uses thewaves.scons_extensions.first_target_emitter(). At least one task target must be specified in the task definition and the last target will always be the expected STDOUT and STDERR redirection output file,TARGETS[-1]ending in*.stdout.Warning
Users overriding the
emitterkeyword argument are responsible for providing an emitter with equivalent STDOUT file handling behavior aswaves.scons_extensions.first_target_emitter()or updating theaction_suffixoption to match their emitter’s behavior.action string construction#${environment} ${action_prefix} ${program} ${program_required} ${program_options} ${subcommand} ${subcommand_required} ${subcommand_options} ${action_suffix}action string default expansion#${environment} cd ${TARGET.dir.abspath} && ${program} ${program_required} ${program_options} ${subcommand} ${subcommand_required} ${subcommand_options} > ${TARGETS[-1].abspath} 2>&1- Parameters:
environment – This variable is intended primarily for use with builders and tasks that can not execute from an SCons construction environment. For instance, when tasks execute on a remote server with SSH wrapped actions using
waves.scons_extensions.ssh_builder_actions()and therefore must initialize the remote environment as part of the builder action.action_prefix – This variable is intended to perform directory change operations prior to program execution. By default, SCons performs actions in the parent directory of the SConstruct file. However, many computational science and engineering programs leave output files in the current working directory, so it is convenient and sometimes necessary to change to the target’s parent directory prior to execution.
program – This variable is intended to contain the primary command line executable absolute or relative path
program_required – This variable is intended to contain a space delimited string of required program options and arguments that are crucial to builder behavior and should not be modified except by advanced users.
program_options – This variable is intended to contain a space delimited string of optional program options and arguments that can be freely modified by the user.
subcommand – This variable is intended to contain the program’s subcommand. If the program variable is set to a launch controlling program, e.g.
mpirun or charmrun, then the subcommand may need to contain the full target executable program and any subcommands.subcommand_required – This variable is intended to contain a space delimited string of required subcommand options and arguments that are crucial to builder behavior and should not be modified except by advanced users.
subcommand_options – This variable is intended to contain a space delimited string of optional subcommand options and arguments that can be freely modified by the user.
action_suffix – This variable is intended to perform program STDOUT and STDERR redirection operations. By default, SCons streams all STDOUT and STDERR to the terminal. However, in long or parallel workflows this may clutter the terminal and make it difficult to isolate critical debugging information, so it is convenient to redirect each program’s output to a task specific log file for later inspection and troubleshooting.
emitter – An SCons emitter function. This is not a keyword argument in the action string.
kwargs – Any additional keyword arguments are passed directly to the SCons builder object.
- Returns:
SCons template builder
- waves.scons_extensions.first_target_emitter(
- target: list,
- source: list,
- env: SConsEnvironment,
- suffixes: Iterable[str] | None = None,
- appending_suffixes: Iterable[str] | None = None,
- stdout_extension: str = '.stdout',
SCons emitter function that emits new targets based on the first target.
Searches for a file ending in the stdout extension. If none is found, creates a target by appending the stdout extension to the first target in the
targetlist. The associated Builder requires at least one target for this reason. The stdout file is always placed at the end of the returned target list.This is an SCons emitter function and not an emitter factory. The suffix arguments:
suffixes and appending_suffixes are only relevant for developers writing new emitters which call this function as a base. The suffixes list emits targets where the suffix replaces the first target’s suffix, e.g. for target.ext emit a new target target.suffix. The appending suffixes list emits targets where the suffix appends the first target’s suffix, e.g. for target.ext emit a new target target.ext.appending_suffix. The emitter will assume all emitted targets build in the current build directory. If the target(s) must be built in a build subdirectory, e.g. in a parameterized target build, then the first target must be provided with the build subdirectory, e.g.
parameter_set1/target.ext. When in doubt, provide a STDOUT redirect file with the.stdoutextension as a target, e.g.target.stdoutorparameter_set1/target.stdout.- Parameters:
target – The target file list of strings
source – The source file list of SCons.Node.FS.File objects
env – The builder’s SCons construction environment object
suffixes – Suffixes which should replace the first target’s extension
appending_suffixes – Suffixes which should append the first target’s extension
stdout_extension – The extension used by the STDOUT/STDERR redirect file
- Returns:
target, source
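The documented emitter behavior can be sketched with plain strings. The real emitter receives and returns SCons node objects, and `first_target_emitter_sketch` is a hypothetical name for illustration:

```python
def first_target_emitter_sketch(target, source, stdout_extension=".stdout"):
    """Return (target, source) with a STDOUT redirect file as the last target."""
    stdout_targets = [t for t in target if t.endswith(stdout_extension)]
    if not stdout_targets:
        # No stdout target found: append the extension to the first target
        stdout_targets = [target[0] + stdout_extension]
    other_targets = [t for t in target if not t.endswith(stdout_extension)]
    # The stdout file is always placed at the end of the returned target list
    return other_targets + stdout_targets, source
```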
- waves.scons_extensions.matlab_script(
- program: str = 'matlab',
- action_prefix: str = 'cd ${TARGET.dir.abspath} &&',
- action_suffix: str = '> ${TARGETS[-1].abspath} 2>&1',
- environment_suffix: str = '> ${TARGETS[-2].abspath} 2>&1',
Matlab script SCons builder.
Warning
Experimental implementation is subject to change
This builder requires that the Matlab script to execute is the first source in the list. The builder returned by this function accepts all SCons Builder arguments. The arguments of this function are also available as keyword arguments of the builder. When provided during task definition, the task keyword arguments override the builder keyword arguments.
Builder/Task keyword arguments
program: The Matlab command line executable absolute or relative pathmatlab_options: The Matlab command line options provided as a string.script_options: The Matlab function interface options in Matlab syntax and provided as a string.action_prefix: Advanced behavior. Most users should accept the defaultsaction_suffix: Advanced behavior. Most users should accept the defaults.environment_suffix: Advanced behavior. Most users should accept the defaults.
The parent directory absolute path is added to the Matlab
pathvariable prior to execution. All required Matlab files should be co-located in the same source directory.At least one target must be specified. The first target determines the working directory for the builder’s action, as shown in the action code snippet below. The action changes the working directory to the first target’s parent directory prior to executing the Matlab script.
The Builder emitter will append the builder managed targets automatically. Appends
target[0].matlab.env and target[0].stdout to the target list.The emitter will assume all emitted targets build in the current build directory. If the target(s) must be built in a build subdirectory, e.g. in a parameterized target build, then the first target must be provided with the build subdirectory, e.g.
parameter_set1/my_target.ext. When in doubt, provide a STDOUT redirect file as a target, e.g.target.stdout.Matlab script builder action keywords#${action_prefix} ${program} ${matlab_options} -batch "path(path, '${SOURCE.dir.abspath}'); [fileList, productList] = matlab.codetools.requiredFilesAndProducts('${SOURCE.file}'); disp(cell2table(fileList)); disp(struct2table(productList, 'AsArray', true)); exit;" ${environment_suffix} ${action_prefix} ${program} ${matlab_options} -batch "path(path, '${SOURCE.dir.abspath}'); ${SOURCE.filebase}(${script_options})" ${action_suffix}Matlab script builder action default expansion#cd ${TARGET.dir.abspath} && matlab ${matlab_options} -batch "path(path, '${SOURCE.dir.abspath}'); [fileList, productList] = matlab.codetools.requiredFilesAndProducts('${SOURCE.file}'); disp(cell2table(fileList)); disp(struct2table(productList, 'AsArray', true)); exit;" > ${TARGETS[-2].abspath} 2>&1 cd ${TARGET.dir.abspath} && matlab ${matlab_options} -batch "path(path, '${SOURCE.dir.abspath}'); ${SOURCE.filebase}(${script_options})" > ${TARGETS[-1].abspath} 2>&1- Parameters:
program – An absolute path or basename string for the Matlab program.
action_prefix – Advanced behavior. Most users should accept the defaults.
action_suffix – Advanced behavior. Most users should accept the defaults.
environment_suffix – Advanced behavior. Most users should accept the defaults.
- Returns:
Matlab script builder
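Following the pattern of the other builder examples in this module, a usage sketch might look like the following. The script name solve.m, the target name, and the script_options string are hypothetical placeholders, and the builder is experimental as noted above:

```python
import waves

env = Environment()
env.Append(BUILDERS={"MatlabScript": waves.scons_extensions.matlab_script()})
# The first source must be the Matlab script; script_options uses Matlab
# function-call syntax because the script is invoked as a function
env.MatlabScript(
    target=["output.mat"],
    source=["solve.m"],
    script_options="'input.mat', 'output.mat'",
)
```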
- waves.scons_extensions.parameter_study_sconscript(
- env: SConsEnvironment,
- *args,
- variant_dir: str | Path | None = None,
- exports: dict | None = None,
- study: dict | ParameterGenerator | None = None,
- set_name: str = '',
- subdirectories: bool = False,
- **kwargs,
Wrap the SCons SConscript call to unpack parameter generators.
Always overrides the
exportsdictionary withset_nameandparameterskeys. Whenstudyis a dictionary or parameter generator, theparametersare overridden. Whenstudyis a parameter generator, theset_nameis overridden.If the study is a WAVES parameter generator object, call SConscript once per
set_nameandparametersin the generator’s parameter study dictionary.If the study is a
dict, call SConscript with the study asparametersand use theset_namefrom the method API.In all other cases, the SConscript call is given the
set_namefrom the method API and an emptyparametersdictionary.
SConstruct#import pathlib import waves env = Environment() env.Append(BUILDERS={ "AbaqusJournal": waves.scons_extensions.abaqus_journal(), "AbaqusSolver": waves.scons_extensions.abaqus_solver() }) env.AddMethod(waves.scons_extensions.parameter_study_sconscript, "ParameterStudySConscript") parameter_study_file = pathlib.Path("parameter_study.h5") parameter_generator = waves.parameter_generators.CartesianProduct( {"parameter_one": [1, 2, 3]}, output_file=parameter_study_file, previous_parameter_study=parameter_study_file ) studies = ( ("SConscript", parameter_generator), ("SConscript", {"parameter_one": 1}) ) for workflow, study in studies: env.ParameterStudySConscript(workflow, variant_dir="build", study=study, subdirectories=True)
SConscript#Import("env", "set_name", "parameters") env.AbaqusJournal( target=["job.inp"], source=["journal.py"], journal_options="--input=${SOURCE.abspath} --output=${TARGET.abspath} --option ${parameter_one}", **parameters ) env.AbaqusSolver( target=["job.odb"], source=["job.inp"], job="job", **parameters )
- Parameters:
env – SCons construction environment. Do not provide when using this function as a construction environment method, e.g. env.ParameterStudySConscript.

args – All positional arguments are passed through to the SConscript call directly

variant_dir – The SConscript API variant directory argument

exports – Dictionary of {key: value} pairs for the exports variables. Must use the dictionary style because the calling script's namespace is not available to the function namespace.

study – Parameter generator or dictionary of simulation parameters

set_name – Set name to use when not provided a study. Overridden by the study set names when study is a parameter generator.

kwargs – All other keyword arguments are passed through to the SConscript call directly

subdirectories – Switch to use parameter generator study set names as subdirectories. Ignored when study is not a parameter generator.
- Returns:
SConscript Export() variables. When called with a parameter generator study, the Export() variables are returned as a list with one entry per parameter set.

- Raises:

TypeError – if exports is not a dictionary
- waves.scons_extensions.parameter_study_task(
- env: SConsEnvironment,
- builder: Builder,
- *args,
- study: dict | ParameterGenerator | None = None,
- subdirectories: bool = False,
- **kwargs,
Parameter study pseudo-builder.
This SCons Pseudo-Builder aids in task construction for WAVES parameter studies with any SCons builder. It works with WAVES parameter generators or parameter dictionaries to reduce parameter study task definition boilerplate and make nominal workflow definitions directly reusable in parameter studies.
If the study is a WAVES parameter generator object, loop over the parameter sets and replace @{set_name} in any *args and **kwargs that are strings, paths, or lists of strings and paths.

If the study is a dict(), unpack the dictionary as keyword arguments directly into the builder.

In all other cases, the task is passed through unchanged to the builder and the study variable is ignored.

When chaining parameter study tasks, arguments belonging to the parameter study can be prefixed by the template @{set_name}source.ext. If the task uses a parameter study, the set name prefix will be replaced to match the task path modifications, e.g. parameter_set0_source.ext or parameter_set0/source.ext depending on the subdirectories boolean. If the task is not part of a parameter study, the set name will be removed from the source, e.g. source.ext. The @ symbol is used as the delimiter to reduce clashes with shell variable syntax and SCons substitution syntax.

When pseudo-builders are added to the environment with the SCons AddMethod function they can be accessed with the same syntax as a normal builder. When called from the construction environment, the env argument is omitted.

This pseudo-builder is most powerful when used in an SConscript call to a separate workflow configuration file. The SConscript file can then be called with a nominal parameter dictionary or with a parameter generator object. See the example below.
SConstruct#

import pathlib
import waves

env = Environment()
env.Append(BUILDERS={
    "AbaqusJournal": waves.scons_extensions.abaqus_journal(),
    "AbaqusSolver": waves.scons_extensions.abaqus_solver()
})
env.AddMethod(waves.scons_extensions.parameter_study_task, "ParameterStudyTask")

parameter_study_file = pathlib.Path("parameter_study.h5")
parameter_generator = waves.parameter_generators.CartesianProduct(
    {"parameter_one": [1, 2, 3]},
    output_file=parameter_study_file,
    previous_parameter_study=parameter_study_file
)

studies = (
    ("SConscript", parameter_generator),
    ("SConscript", {"parameter_one": 1})
)

for workflow, study in studies:
    SConscript(workflow, exports={"env": env, "study": study})
SConscript#

Import("env", "study")

env.ParameterStudyTask(
    env.AbaqusJournal,
    target=["@{set_name}job.inp"],
    source=["journal.py"],
    journal_options="--input=${SOURCE.abspath} --output=${TARGET.abspath} --option ${parameter_one}",
    study=study,
    subdirectories=True,
)
env.ParameterStudyTask(
    env.AbaqusSolver,
    target=["@{set_name}job.odb"],
    source=["@{set_name}job.inp"],
    job="job",
    study=study,
    subdirectories=True,
)
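The @{set_name} template substitution described above can be illustrated with a plain Python sketch. This is not the WAVES implementation; it only mimics the documented rules for the set name prefix and the subdirectories switch:

```python
# Illustrative sketch of the documented "@{set_name}" substitution behavior.
# NOT the WAVES implementation; it only mimics the described rules.

def substitute_set_name(argument: str, set_name: str = "", subdirectories: bool = False) -> str:
    """Replace the '@{set_name}' template per the documented rules.

    - Parameter study with subdirectories=True:  'parameter_set0/source.ext'
    - Parameter study with subdirectories=False: 'parameter_set0_source.ext'
    - No parameter study (empty set name):       'source.ext'
    """
    template = "@{set_name}"
    if not set_name:
        # Not part of a parameter study: remove the template entirely
        return argument.replace(template, "")
    separator = "/" if subdirectories else "_"
    return argument.replace(template, f"{set_name}{separator}")

print(substitute_set_name("@{set_name}job.inp", "parameter_set0", subdirectories=True))
print(substitute_set_name("@{set_name}job.inp", "parameter_set0", subdirectories=False))
print(substitute_set_name("@{set_name}job.inp"))
```

The same source string therefore works unchanged in nominal workflows and in parameter studies, which is what makes the template useful when chaining tasks.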
- Parameters:
env – An SCons construction environment to use when defining the targets.
builder – The builder to parameterize
args – All other positional arguments are passed through to the builder after @{set_name} string substitutions

study – Parameter generator or dictionary parameter set to provide to the builder. Parameter generators are unpacked with set name directory prefixes. Dictionaries are unpacked as keyword arguments.

subdirectories – Switch to use parameter generator study set names as subdirectories. Ignored when study is not a parameter generator.

kwargs – All other keyword arguments are passed through to the builder after @{set_name} string substitutions
- Returns:
SCons NodeList of target nodes
- waves.scons_extensions.parameter_study_write(
- env: SConsEnvironment,
- parameter_generator: ParameterGenerator,
- **kwargs,
Pseudo-builder to write a parameter generator’s parameter study file.
SConstruct#

import pathlib
import waves

env = Environment()
env.AddMethod(waves.scons_extensions.parameter_study_write, "ParameterStudyWrite")

parameter_study_file = pathlib.Path("parameter_study.h5")
parameter_generator = waves.parameter_generators.CartesianProduct(
    {"parameter_one": [1, 2, 3]},
    output_file=parameter_study_file,
    previous_parameter_study=parameter_study_file
)

env.ParameterStudyWrite(parameter_generator)
- Parameters:
parameter_generator – WAVES ParameterGenerator class
kwargs – All other keyword arguments are passed directly to the waves.parameter_generators.ParameterGenerator.write() method.
- Returns:
SCons NodeList of target nodes
- waves.scons_extensions.print_action_signature_string(
- s: str,
- target: list,
- source: list,
- env: SConsEnvironment,
Print the action string used to calculate the action signature.
Designed to behave similarly to the SCons --debug=presub option using the PRINT_CMD_LINE_FUNC feature: https://scons.org/doc/production/HTML/scons-man.html#cv-PRINT_CMD_LINE_FUNC

SConstruct#

import waves

env = Environment(PRINT_CMD_LINE_FUNC=waves.scons_extensions.print_action_signature_string)
env.Command(
    target=["target.txt"],
    source=["SConstruct"],
    action=["echo 'Hello World!' > ${TARGET.relpath}"]
)

shell#

$ scons target.txt
scons: Reading SConscript files ...
scons: done reading SConscript files.
scons: Building targets ...
Building target.txt with action signature string:
echo 'Hello World!' > target.txt_relpath
echo 'Hello World!' > target.txt
scons: done building targets.
- waves.scons_extensions.print_build_failures(env: SConsEnvironment | None = None, print_stdout: bool = True) None[source]
On exit, query the SCons reported build failures and print the associated node’s STDOUT file, if it exists.
SConstruct#

AddOption(
    "--print-build-failures",
    dest="print_build_failures",
    default=False,
    action="store_true",
    help="Print task *.stdout target file(s) on build failures. (default: '%default')"
)
env = Environment(
    print_build_failures=GetOption("print_build_failures")
)
env.AddMethod(waves.scons_extensions.print_build_failures, "PrintBuildFailures")
env.PrintBuildFailures(print_stdout=env["print_build_failures"])
- Parameters:
env – SCons construction environment
print_stdout – Boolean to set the exit behavior. If False, don’t modify the exit behavior.
- waves.scons_extensions.project_alias(
- env: SConsEnvironment = None,
- *args,
- description: str = '',
- target_descriptions: dict = {},
- **kwargs,
Compile and return a dictionary of {alias: description} pairs.

Wrapper around the SCons Alias method. Appends to and returns the target descriptions dictionary.
- Parameters:
env – The SCons construction environment object to modify.
args – All other positional arguments are passed to the SCons Alias method.
description – String representing metadata of the alias.
target_descriptions – Mutable dictionary used to keep track of all alias’s metadata. If the function is called with a user-supplied dictionary, the accumulated target descriptions are reset to match the provided dictionary and all previously accumulated descriptions are discarded. If an existing alias is called it will overwrite the previous description.
kwargs – All other keyword arguments are passed to the SCons Alias method.
- Returns:
target descriptions dictionary
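The accumulation behavior can be sketched in plain Python. This is a simplified illustration, not the WAVES implementation: the real method also forwards its arguments to the SCons Alias call, and only the description bookkeeping is shown here (the function name is hypothetical):

```python
# Simplified sketch of the {alias: description} accumulation used by
# project_alias. NOT the WAVES implementation: the real method also forwards
# its arguments to the SCons Alias method; only the bookkeeping is shown.

_descriptions = {}  # module-level accumulator, mirroring the shared mutable default

def project_alias_sketch(*aliases, description="", target_descriptions=None):
    """Record a description for each alias; re-calling an alias overwrites it."""
    if target_descriptions is None:
        target_descriptions = _descriptions
    for alias in aliases:
        target_descriptions[alias] = description
    return target_descriptions

project_alias_sketch("build", description="Build all simulation targets")
project_alias_sketch("docs", description="Build the HTML documentation")
result = project_alias_sketch("build", description="Rebuilt description")
print(result["build"])  # the second "build" description overwrites the first
```

Passing a user-supplied target_descriptions dictionary replaces the accumulator, which matches the documented reset behavior.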
- waves.scons_extensions.project_help(
- env: SConsEnvironment | None = None,
- append: bool = True,
- local_only: bool = True,
- target_descriptions: dict | None = None,
Add default targets and alias lists to project help message.
See the SCons Help documentation for appending behavior. Thin wrapper around waves.scons_extensions.project_help_default_targets() and waves.scons_extensions.project_help_aliases().
- Parameters:
env – The SCons construction environment object to modify
append – append to the env.Help message (default). When False, the env.Help message will be overwritten if env.Help has not been previously called.

local_only – Limit help message to the project specific content when True. Only applies to SCons >=4.6.0
target_descriptions – dictionary containing target metadata.
- waves.scons_extensions.project_help_aliases(
- env: SConsEnvironment | None = None,
- append: bool = True,
- local_only: bool = True,
- target_descriptions: dict | None = None,
Add the alias list to the project’s help message.
See the SCons Help documentation for appending behavior. Adds text to the project help message formatted as
Target Aliases:
    Alias_1
    Alias_2

where the aliases are recovered from SCons.Node.Alias.default_ans.

- Parameters:
env – The SCons construction environment object to modify
append – append to the env.Help message (default). When False, the env.Help message will be overwritten if env.Help has not been previously called.

local_only – Limit help message to the project specific content when True. Only applies to SCons >=4.6.0
target_descriptions – dictionary containing target metadata.
- waves.scons_extensions.project_help_default_targets(
- env: SConsEnvironment | None = None,
- append: bool = True,
- local_only: bool = True,
- target_descriptions: dict | None = None,
Add a default targets list to the project’s help message.
See the SCons Help documentation for appending behavior. Adds text to the project help message formatted as
Default Targets:
    Default_Target_1
    Default_Target_2

where the targets are recovered from SCons.Script.DEFAULT_TARGETS.

- Parameters:
env – The SCons construction environment object to modify
append – append to the env.Help message (default). When False, the env.Help message will be overwritten if env.Help has not been previously called.

local_only – Limit help message to the project specific content when True. Only applies to SCons >=4.6.0
target_descriptions – dictionary containing target metadata.
- waves.scons_extensions.python_builder_factory(environment: str = '', action_prefix: str = 'cd ${TARGET.dir.abspath} &&', program: str = 'python', program_required: str = '', program_options: str = '', subcommand: str = '${SOURCE.abspath}', subcommand_required: str = '', subcommand_options: str = '', action_suffix: str = '> ${TARGETS[-1].abspath} 2>&1', emitter: ~collections.abc.Callable[[list, list, ~SCons.Script.SConscript.SConsEnvironment], tuple[list, list]] = <function first_target_emitter>, **kwargs) Builder[source]
Python builder factory.
This builder factory extends
waves.scons_extensions.first_target_builder_factory(). This builder factory uses the waves.scons_extensions.first_target_emitter(). At least one task target must be specified in the task definition and the last target will always be the expected STDOUT and STDERR redirection output file, TARGETS[-1], ending in *.stdout.

Warning
Users overriding the
emitter keyword argument are responsible for providing an emitter with equivalent STDOUT file handling behavior as waves.scons_extensions.first_target_emitter() or updating the action_suffix option to match their emitter's behavior.

With the default options this builder requires the following source files provided in order:
Python script:
*.py
action string construction#

${environment} ${action_prefix} ${program} ${program_required} ${program_options} ${subcommand} ${subcommand_required} ${subcommand_options} ${action_suffix}

action string default expansion#

${environment} cd ${TARGET.dir.abspath} && python ${program_required} ${program_options} ${SOURCE.abspath} ${subcommand_required} ${subcommand_options} > ${TARGETS[-1].abspath} 2>&1

SConstruct#

import waves

env = Environment()
env.Append(BUILDERS={"PythonScript": waves.scons_extensions.python_builder_factory()})
env.PythonScript(target=["my_output.stdout"], source=["my_script.py"])
The builder returned by this factory accepts all SCons Builder arguments. The arguments of this function are also available as keyword arguments of the builder. When provided during task definition, the task keyword arguments override the builder keyword arguments.
- Parameters:
environment – This variable is intended primarily for use with builders and tasks that can not execute from an SCons construction environment. For instance, when tasks execute on a remote server with SSH wrapped actions using
waves.scons_extensions.ssh_builder_actions() and therefore must initialize the remote environment as part of the builder action.

action_prefix – This variable is intended to perform directory change operations prior to program execution
program – The Python interpreter absolute or relative path
program_required – Space delimited string of required Python interpreter options and arguments that are crucial to builder behavior and should not be modified except by advanced users
program_options – Space delimited string of optional Python interpreter options and arguments that can be freely modified by the user
subcommand – The Python script absolute or relative path
subcommand_required – Space delimited string of required Python script options and arguments that are crucial to builder behavior and should not be modified except by advanced users.
subcommand_options – Space delimited string of optional Python script options and arguments that can be freely modified by the user
action_suffix – This variable is intended to perform program STDOUT and STDERR redirection operations.
emitter – An SCons emitter function. This is not a keyword argument in the action string.
kwargs – Any additional keyword arguments are passed directly to the SCons builder object.
- Returns:
Python builder
- waves.scons_extensions.quinoa_builder_factory(environment: str = '', action_prefix: str = 'cd ${TARGET.dir.abspath} &&', program: str = 'charmrun', program_required: str = '', program_options: str = '+p1', subcommand: str = 'inciter', subcommand_required: str = '--control ${SOURCES[0].abspath} --input ${SOURCES[1].abspath}', subcommand_options: str = '', action_suffix: str = '> ${TARGETS[-1].abspath} 2>&1', emitter: ~collections.abc.Callable[[list, list, ~SCons.Script.SConscript.SConsEnvironment], tuple[list, list]] = <function first_target_emitter>, **kwargs) Builder[source]
Quinoa builder factory.
This builder factory extends
waves.scons_extensions.first_target_builder_factory(). This builder factory uses the waves.scons_extensions.first_target_emitter(). At least one task target must be specified in the task definition and the last target will always be the expected STDOUT and STDERR redirection output file, TARGETS[-1], ending in *.stdout.

Warning
Users overriding the
emitter keyword argument are responsible for providing an emitter with equivalent STDOUT file handling behavior as waves.scons_extensions.first_target_emitter() or updating the action_suffix option to match their emitter's behavior.

With the default options this builder requires the following source files provided in order:
Quinoa control file:
*.q

Exodus mesh file:
*.exo
action string construction#

${environment} ${action_prefix} ${program} ${program_required} ${program_options} ${subcommand} ${subcommand_required} ${subcommand_options} ${action_suffix}

action string default expansion#

${environment} cd ${TARGET.dir.abspath} && charmrun ${program_required} +p1 inciter --control ${SOURCES[0].abspath} --input ${SOURCES[1].abspath} ${subcommand_options} > ${TARGETS[-1].abspath} 2>&1

SConstruct#

import waves

env = waves.scons_extensions.shell_environment("module load quinoa")
env.Append(BUILDERS={
    "QuinoaSolver": waves.scons_extensions.quinoa_builder_factory(),
})
# Serial execution with "+p1"
env.QuinoaSolver(target=["flow.stdout"], source=["flow.lua", "box.exo"])
# Parallel execution with "+p4"
env.QuinoaSolver(target=["flow.stdout"], source=["flow.lua", "box.exo"], program_options="+p4")
The builder returned by this factory accepts all SCons Builder arguments. The arguments of this function are also available as keyword arguments of the builder. When provided during task definition, the task keyword arguments override the builder keyword arguments.
- Parameters:
environment – This variable is intended primarily for use with builders and tasks that can not execute from an SCons construction environment. For instance, when tasks execute on a remote server with SSH wrapped actions using
waves.scons_extensions.ssh_builder_actions() and therefore must initialize the remote environment as part of the builder action.

action_prefix – This variable is intended to perform directory change operations prior to program execution
program – The charmrun absolute or relative path
program_required – Space delimited string of required charmrun options and arguments that are crucial to builder behavior and should not be modified except by advanced users
program_options – Space delimited string of optional charmrun options and arguments that can be freely modified by the user
subcommand – The inciter (quinoa executable) absolute or relative path
subcommand_required – Space delimited string of required inciter (quinoa executable) options and arguments that are crucial to builder behavior and should not be modified except by advanced users.
subcommand_options – Space delimited string of optional inciter (quinoa executable) options and arguments that can be freely modified by the user
action_suffix – This variable is intended to perform program STDOUT and STDERR redirection operations.
emitter – An SCons emitter function. This is not a keyword argument in the action string.
kwargs – Any additional keyword arguments are passed directly to the SCons builder object.
- Returns:
Quinoa builder
- waves.scons_extensions.sbatch(
- program: str = 'sbatch',
- required: str = '--wait --output=${TARGETS[-1].abspath}',
- action_prefix: str = 'cd ${TARGET.dir.abspath} &&',
-
The builder returned by this function accepts all SCons Builder arguments. The arguments of this function are also available as keyword arguments of the builder. When provided during task definition, the task keyword arguments override the builder keyword arguments.
Builder/Task keyword arguments
program: The sbatch command line executable absolute or relative path

required: A space delimited string of sbatch required arguments

slurm_job: The command to submit with sbatch

sbatch_options: Optional sbatch options

action_prefix: Advanced behavior. Most users should accept the defaults
The builder does not use a SLURM batch script. Instead, it requires the
slurm_job variable to be defined with the command string to execute.

At least one target must be specified. The first target determines the working directory for the builder's action, as shown in the action code snippet below. The action changes the working directory to the first target's parent directory prior to executing the command.
The Builder emitter will append the builder managed targets automatically. Appends
target[0].stdout to the target list.

SLURM sbatch builder action keywords#

${action_prefix} ${program} ${required} ${sbatch_options} --wrap "${slurm_job}"

SLURM sbatch builder action default expansion#

cd ${TARGET.dir.abspath} && sbatch --wait --output=${TARGETS[-1].abspath} ${sbatch_options} --wrap "${slurm_job}"

SConstruct#

import waves

env = Environment()
env.Append(BUILDERS={"SlurmSbatch": waves.scons_extensions.sbatch()})
env.SlurmSbatch(target=["my_output.stdout"], source=["my_source.input"], slurm_job="cat $SOURCE > $TARGET")
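The action expansion above is plain string composition: the keyword arguments are assembled into a single sbatch command with the job wrapped by --wrap. A short sketch (the function name is hypothetical, not a WAVES internal):

```python
# Illustrative sketch of how the sbatch builder's keyword arguments assemble
# into the submitted command string. Hypothetical helper, not WAVES internals.

def sbatch_wrap_command(slurm_job: str, sbatch_options: str = "", output: str = "job.stdout") -> str:
    """Compose an sbatch submission matching the documented default expansion."""
    options = f" {sbatch_options}" if sbatch_options else ""
    return f'sbatch --wait --output={output}{options} --wrap "{slurm_job}"'

print(sbatch_wrap_command("cat $SOURCE > $TARGET", sbatch_options="--partition=debug"))
```

Because the job is passed as a --wrap string rather than a batch script, any shell command chain can be submitted without writing a script file.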
- Parameters:
program – An absolute path or basename string for the sbatch program.
required – A space delimited string of sbatch required arguments
action_prefix – Advanced behavior. Most users should accept the defaults.
- Returns:
SLURM sbatch builder
- waves.scons_extensions.sbatch_abaqus_journal(*args, **kwargs) Builder[source]
Thin pass through wrapper of
waves.scons_extensions.abaqus_journal(). Catenate the actions and submit with SLURM sbatch. Accepts the sbatch_options builder keyword argument to modify sbatch behavior.

Sbatch Abaqus journal builder action keywords#

sbatch --wait --output=${TARGET.base}.slurm.out ${sbatch_options} --wrap "${action_prefix} ${program} -information environment ${environment_suffix} && ${action_prefix} ${program} ${required} ${abaqus_options} -- ${journal_options} ${action_suffix}"

Sbatch Abaqus journal builder action default expansion#

sbatch --wait --output=${TARGET.base}.slurm.out ${sbatch_options} --wrap "cd ${TARGET.dir.abspath} && abaqus cae -noGui ${SOURCE.abspath} ${abaqus_options} -- ${journal_options} > ${TARGETS[-1].abspath} 2>&1"
- waves.scons_extensions.sbatch_abaqus_journal_builder_factory(*args, **kwargs) Builder[source]
Thin pass through wrapper of
waves.scons_extensions.abaqus_journal_builder_factory(). Catenate the actions and submit with SLURM sbatch. Accepts the sbatch_options builder keyword argument to modify sbatch behavior.

action string construction#

sbatch --wait --output=${TARGET.base}.slurm.out ${sbatch_options} --wrap "${environment} ${action_prefix} ${program} ${program_required} ${program_options} ${subcommand} ${subcommand_required} ${subcommand_options} ${action_suffix}"
- waves.scons_extensions.sbatch_abaqus_solver(*args, **kwargs) Builder[source]
Thin pass through wrapper of
waves.scons_extensions.abaqus_solver(). Catenate the actions and submit with SLURM sbatch. Accepts the sbatch_options builder keyword argument to modify sbatch behavior.

Sbatch Abaqus solver builder action keywords#

sbatch --wait --output=${TARGET.base}.slurm.out ${sbatch_options} --wrap "${action_prefix} ${program} -job ${job_name} -input ${SOURCE.filebase} ${abaqus_options} ${required} ${action_suffix}"

Sbatch Abaqus solver builder action default expansion#

sbatch --wait --output=${TARGET.base}.slurm.out ${sbatch_options} --wrap "cd ${TARGET.dir.abspath} && ${program} -job ${job_name} -input ${SOURCE.filebase} ${abaqus_options} -interactive -ask_delete no > ${TARGETS[-1].abspath} 2>&1"
- waves.scons_extensions.sbatch_abaqus_solver_builder_factory(*args, **kwargs) Builder[source]
Thin pass through wrapper of
waves.scons_extensions.abaqus_solver_builder_factory(). Catenate the actions and submit with SLURM sbatch. Accepts the sbatch_options builder keyword argument to modify sbatch behavior.

action string construction#

sbatch --wait --output=${TARGET.base}.slurm.out ${sbatch_options} --wrap "${environment} ${action_prefix} ${program} ${program_required} ${program_options} ${subcommand} ${subcommand_required} ${subcommand_options} ${action_suffix}"
- waves.scons_extensions.sbatch_python_builder_factory(*args, **kwargs) Builder[source]
Thin pass through wrapper of
waves.scons_extensions.python_builder_factory(). Catenate the actions and submit with SLURM sbatch. Accepts the sbatch_options builder keyword argument to modify sbatch behavior.

action string construction#

sbatch --wait --output=${TARGET.base}.slurm.out ${sbatch_options} --wrap "${environment} ${action_prefix} ${program} ${program_required} ${program_options} ${subcommand} ${subcommand_required} ${subcommand_options} ${action_suffix}"
- waves.scons_extensions.sbatch_quinoa_builder_factory(*args, **kwargs) Builder[source]
Thin pass through wrapper of
waves.scons_extensions.quinoa_builder_factory(). Catenate the actions and submit with SLURM sbatch. Accepts the sbatch_options builder keyword argument to modify sbatch behavior.

action string construction#

sbatch --wait --output=${TARGET.base}.slurm.out ${sbatch_options} --wrap "${environment} ${action_prefix} ${program} ${program_required} ${program_options} ${subcommand} ${subcommand_required} ${subcommand_options} ${action_suffix}"
- waves.scons_extensions.sbatch_sierra_builder_factory(*args, **kwargs) Builder[source]
Thin pass through wrapper of
waves.scons_extensions.sierra_builder_factory(). Catenate the actions and submit with SLURM sbatch. Accepts the sbatch_options builder keyword argument to modify sbatch behavior.

action string construction#

sbatch --wait --output=${TARGET.base}.slurm.out ${sbatch_options} --wrap "${environment} ${action_prefix} ${program} ${program_required} ${program_options} ${subcommand} ${subcommand_required} ${subcommand_options} ${action_suffix}"
- waves.scons_extensions.shell_environment(
- command: str,
- shell: str = 'bash',
- cache: str | None = None,
- overwrite_cache: bool = False,
Return an SCons shell environment from a cached file or by running a shell command.
If the environment is created successfully and a cache file is requested, the cache file is _always_ written. The
overwrite_cache behavior forces the shell command execution, even when the cache file is present. If the command fails (raising a subprocess.CalledProcessError) the captured output is printed to STDERR before re-raising the exception.
Currently assumes a *nix-flavored shell: sh, bash, zsh, csh, tcsh. May work with any shell supporting command construction as below.
{shell} -c "{command} && env -0"
The method may fail if the command produces stdout that does not terminate in a newline. Redirect command output away from stdout if this causes problems, e.g.
command = 'command > /dev/null && command two > /dev/null' in most shells.

SConstruct#

import waves

env = waves.scons_extensions.shell_environment("source my_script.sh")
- Parameters:
command – the shell command to execute
shell – the shell to use when executing command by absolute or relative path
cache – absolute or relative path to read/write a shell environment dictionary. Will be written as YAML formatted file regardless of extension.
overwrite_cache – Ignore previously cached files if they exist.
- Returns:
SCons shell environment
- Raises:
subprocess.CalledProcessError – Print the captured output and re-raise exception when the shell command returns a non-zero exit status.
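The env -0 capture described above can be illustrated with a short sketch. This is an assumption about the parsing approach, not the WAVES implementation: the command is run in the shell, the null-delimited env -0 output is split into KEY=VALUE pairs, and a plain dictionary is returned:

```python
# Illustrative sketch: run a command in a shell and parse null-delimited
# "env -0" output into a dictionary. Mimics the documented command
# construction; NOT the WAVES implementation (no caching, no SCons environment).
import subprocess

def capture_shell_environment(command: str, shell: str = "bash") -> dict:
    """Run a command and return the resulting shell environment variables."""
    output = subprocess.run(
        [shell, "-c", f"{command} && env -0"],
        capture_output=True, check=True, text=True
    ).stdout
    environment = {}
    # Null delimiters make values containing newlines safe to split
    for entry in output.split("\0"):
        if "=" in entry:
            key, _, value = entry.partition("=")
            environment[key] = value
    return environment

variables = capture_shell_environment("export MY_VARIABLE=my_value")
print(variables["MY_VARIABLE"])
```

Splitting on the null byte is why env -0 is used instead of env: a variable whose value contains a newline would otherwise corrupt the parse.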
- waves.scons_extensions.sierra_builder_factory(environment: str = '', action_prefix: str = 'cd ${TARGET.dir.abspath} &&', program: str = 'sierra', program_required: str = '', program_options: str = '', subcommand: str = 'adagio', subcommand_required: str = '-i ${SOURCE.abspath}', subcommand_options: str = '', action_suffix: str = '> ${TARGETS[-1].abspath} 2>&1', emitter: ~collections.abc.Callable[[list, list, ~SCons.Script.SConscript.SConsEnvironment], tuple[list, list]] = <function first_target_emitter>, **kwargs) Builder[source]
Sierra builder factory.
This builder factory extends
waves.scons_extensions.first_target_builder_factory(). This builder factory uses the waves.scons_extensions.first_target_emitter(). At least one task target must be specified in the task definition and the last target will always be the expected STDOUT and STDERR redirection output file, TARGETS[-1], ending in *.stdout.

Warning
Users overriding the
emitter keyword argument are responsible for providing an emitter with equivalent STDOUT file handling behavior as waves.scons_extensions.first_target_emitter() or updating the action_suffix option to match their emitter's behavior.

With the default options this builder requires the following source files provided in order:
Sierra input file:
*.i
action string construction#

${environment} ${action_prefix} ${program} ${program_required} ${program_options} ${subcommand} ${subcommand_required} ${subcommand_options} ${action_suffix}

action string default expansion#

${environment} cd ${TARGET.dir.abspath} && sierra ${program_required} ${program_options} adagio -i ${SOURCE.abspath} ${subcommand_options} > ${TARGETS[-1].abspath} 2>&1

The builder returned by this factory accepts all SCons Builder arguments. The arguments of this function are also available as keyword arguments of the builder. When provided during task definition, the task keyword arguments override the builder keyword arguments.
- Parameters:
environment – This variable is intended primarily for use with builders and tasks that can not execute from an SCons construction environment. For instance, when tasks execute on a remote server with SSH wrapped actions using
waves.scons_extensions.ssh_builder_actions() and therefore must initialize the remote environment as part of the builder action.

action_prefix – This variable is intended to perform directory change operations prior to program execution
program – The Sierra absolute or relative path
program_required – Space delimited string of required Sierra options and arguments that are crucial to builder behavior and should not be modified except by advanced users
program_options – Space delimited string of optional Sierra options and arguments that can be freely modified by the user
subcommand – The Sierra application absolute or relative path
subcommand_required – Space delimited string of required Sierra application options and arguments that are crucial to builder behavior and should not be modified except by advanced users.
subcommand_options – Space delimited string of optional Sierra application options and arguments that can be freely modified by the user
action_suffix – This variable is intended to perform program STDOUT and STDERR redirection operations.
emitter – An SCons emitter function. This is not a keyword argument in the action string.
kwargs – Any additional keyword arguments are passed directly to the SCons builder object.
- Returns:
Sierra builder
- waves.scons_extensions.sphinx_build(
- program: str = 'sphinx-build',
- options: str = '',
- builder: str = 'html',
- tags: str = '',
Sphinx builder using the
-b specifier.

This builder does not have an emitter. It requires at least one target.
action#

${program} ${options} -b ${builder} ${TARGET.dir.dir.abspath} ${TARGET.dir.abspath} ${tags}

SConstruct#

import waves

env = Environment()
env.Append(BUILDERS={
    "SphinxBuild": waves.scons_extensions.sphinx_build(options="-W"),
})
sources = ["conf.py", "index.rst"]
targets = ["html/index.html"]
html = env.SphinxBuild(
    target=targets,
    source=sources,
)
env.Clean(html, [Dir("html")] + sources)
env.Alias("html", html)
- Parameters:
program – sphinx-build executable
options – sphinx-build options
builder – builder name. See the Sphinx documentation for options
tags – sphinx-build tags
- Returns:
Sphinx builder
- waves.scons_extensions.sphinx_latexpdf(
- program: str = 'sphinx-build',
- options: str = '',
- builder: str = 'latexpdf',
- tags: str = '',
Sphinx builder using the
-M specifier. Intended for latexpdf builds.

This builder does not have an emitter. It requires at least one target.
action#

${program} -M ${builder} ${TARGET.dir.dir.abspath} ${TARGET.dir.dir.abspath} ${tags} ${options}

SConstruct#

import waves

env = Environment()
env.Append(BUILDERS={
    "SphinxPDF": waves.scons_extensions.sphinx_latexpdf(options="-W"),
})
sources = ["conf.py", "index.rst"]
targets = ["latex/project.pdf"]
latexpdf = env.SphinxPDF(
    target=targets,
    source=sources,
)
env.Clean(latexpdf, [Dir("latex")] + sources)
env.Alias("latexpdf", latexpdf)
- Parameters:
program (str) – sphinx-build executable
options (str) – sphinx-build options
builder (str) – builder name. See the Sphinx documentation for options
tags (str) – sphinx-build tags
- Returns:
Sphinx latexpdf builder
- waves.scons_extensions.sphinx_scanner() Scanner[source]
SCons scanner that searches for directives.
.. include::
.. literalinclude::
.. image::
.. figure::
.. bibliography::
inside
.rst and .txt files

- Returns:
Sphinx source file dependency Scanner
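A dependency scan like the one above can be approximated with a regular expression over reStructuredText source. This is an illustrative sketch only; the actual scanner is built with the SCons Scanner API and may match differently:

```python
# Illustrative sketch: find the file arguments of Sphinx directives that
# introduce dependencies in reStructuredText source. Approximates the
# documented scanner behavior; the real implementation uses SCons Scanner.
import re

DIRECTIVES = ("include", "literalinclude", "image", "figure", "bibliography")
PATTERN = re.compile(
    r"^\s*\.\.\s+(?:" + "|".join(DIRECTIVES) + r")::\s+(\S+)", re.MULTILINE
)

def scan_rst_dependencies(text: str) -> list:
    """Return the file arguments of dependency-introducing directives."""
    return PATTERN.findall(text)

rst = """
.. include:: common.txt

Some text.

.. figure:: images/plot.png
   :alt: plot
"""
print(scan_rst_dependencies(rst))
```

Directives without a file argument (e.g. a bare .. bibliography::) are intentionally skipped by this sketch since they add no file dependency.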
- waves.scons_extensions.ssh_builder_actions(
- builder: Builder,
- remote_server: str = '',
- remote_directory: str = '',
- rsync_push_options: str = '-rlptv',
- rsync_pull_options: str = '-rlptv',
- ssh_options: str = '',
Wrap and modify a builder’s action list with remote copy operations and SSH commands.
Warning
This builder does not provide asynchronous server-client behavior. The local/client machine must maintain the SSH connection continuously throughout the duration of the task. If the SSH connection is interrupted, the task will fail. This makes SSH wrapped builders fragile with respect to network connectivity. Users are strongly encouraged to seek solutions that allow full software installation and full workflow execution on the target compute server. If mixed server execution is required, a build directory on a shared network drive and interrupted workflow execution should be preferred over SSH wrapped builders.
If the remote server and remote directory strings are not specified at builder instantiation, then the task definitions must specify these keyword arguments. If a portion of the remote server and/or remote directory are known to be constant across all possible tasks, users may define their own substitution keyword arguments. For example, the following remote directory uses common leading path elements and introduces a new keyword variable task_directory to allow per-task changes to the remote directory: remote_directory="/path/to/base/build/${task_directory}".
Warning
The waves.scons_extensions.ssh_builder_actions() function is a work-in-progress solution with some assumptions specific to the action construction used by WAVES. It should work for most basic builders, but adapting this function to users’ custom builders will probably require some advanced SCons knowledge and inspection of the waves.scons_extensions.ssh_builder_actions() implementation.
Builder/Task keyword arguments
remote_server: remote server where the original builder’s actions should be executed
remote_directory: absolute or relative path where the original builder’s actions should be executed
rsync_push_options: rsync options when pushing sources to the remote server
rsync_pull_options: rsync options when pulling the remote directory from the remote server
ssh_options: SSH options when running the original builder’s actions on the remote server
Design assumptions
Creates the remote_directory with mkdir -p. mkdir must exist on the remote_server.
Copies all source files to a flat remote_directory with rsync. rsync must exist on the local system.
Replaces instances of cd ${TARGET.dir.abspath} && with cd ${remote_directory} && in the original builder actions and keyword arguments.
Replaces instances of SOURCE.abspath or SOURCES.abspath with SOURCE[S].file in the original builder actions and keyword arguments.
Replaces instances of SOURCES[0-9].abspath or TARGETS[0-9].abspath with SOURCES[0-9].file or TARGETS[0-9].file in the original builder actions and keyword arguments.
Prefixes all original builder actions with cd ${remote_directory} &&.
All original builder actions are wrapped in single quotes as '{original action}' to preserve the && as part of the remote_server command. Shell variables, e.g. $USER, will not be expanded on the remote_server. If quotes are included in the original builder actions, they should be double quotes.
Returns the entire remote_directory to the original builder ${TARGET.dir.abspath} with rsync. rsync must exist on the local system.
SConstruct#
import getpass
import waves
user = getpass.getuser()
env = Environment()
env.Append(BUILDERS={
    "SSHAbaqusSolver": waves.scons_extensions.ssh_builder_actions(
        waves.scons_extensions.abaqus_solver(
            program="/remote/server/installation/path/of/abaqus"
        ),
        remote_server="myserver.mydomain.com",
        remote_directory="/scratch/${user}/myproject/myworkflow/${task_directory}"
    )
})
env.SSHAbaqusSolver(
    target=["myjob.sta"],
    source=["input.inp"],
    job_name="myjob",
    abaqus_options="-cpus 4",
    task_directory="myjob",
    user=user
)
my_package.py#
import SCons.Builder
import waves

def print_builder_actions(builder):
    for action in builder.action.list:
        print(action.cmd_list)

def cat():
    builder = SCons.Builder.Builder(
        action=[
            "cat ${SOURCES.abspath} | tee ${TARGETS[0].abspath}",
            "echo \"Hello World!\""
        ]
    )
    return builder

build_cat = cat()
ssh_build_cat = waves.scons_extensions.ssh_builder_actions(
    cat(),
    remote_server="myserver.mydomain.com",
    remote_directory="/scratch/roppenheimer/ssh_wrapper"
)

>>> import my_package
>>> my_package.print_builder_actions(my_package.build_cat)
cat ${SOURCES.abspath} | tee ${TARGETS[0].abspath}
echo "Hello World!"
>>> my_package.print_builder_actions(my_package.ssh_build_cat)
ssh ${ssh_options} ${remote_server} "mkdir -p /scratch/roppenheimer/ssh_wrapper"
rsync ${rsync_push_options} ${SOURCES.abspath} ${remote_server}:${remote_directory}
ssh ${ssh_options} ${remote_server} 'cd ${remote_directory} && cat ${SOURCES.file} | tee ${TARGETS[0].file}'
ssh ${ssh_options} ${remote_server} 'cd ${remote_directory} && echo "Hello World!"'
rsync ${rsync_pull_options} ${remote_server}:${remote_directory} ${TARGET.dir.abspath}
- Parameters:
builder – The SCons builder to modify
remote_server – remote server where the original builder’s actions should be executed
remote_directory – absolute or relative path where the original builder’s actions should be executed.
rsync_push_options – rsync options when pushing sources to the remote server
rsync_pull_options – rsync options when pulling remote directory from the remote server
ssh_options – SSH options when running the original builder’s actions on the remote server
- Returns:
modified builder
- waves.scons_extensions.substitution_syntax(
- env: SConsEnvironment,
- substitution_dictionary: dict,
- prefix: str = '@',
- suffix: str = '@',
Return a dictionary copy with the pre/suffix added to the key strings.
Assumes a flat dictionary with keys of type str. Keys that aren’t strings will be converted to their string representation. Nested dictionaries can be supplied, but only the first layer keys will be modified. Dictionary values are unchanged.
SConstruct#
import waves
env = Environment()
env.AddMethod(waves.scons_extensions.substitution_syntax, "SubstitutionSyntax")
original_dictionary = {"key": "value"}
substitution_dictionary = env.SubstitutionSyntax(original_dictionary)
- Parameters:
substitution_dictionary (dict) – Original dictionary to copy
prefix (string) – String to prepend to all dictionary keys
suffix (string) – String to append to all dictionary keys
- Returns:
Copy of the dictionary with key strings modified by the pre/suffix
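The key decoration can be sketched as a dictionary comprehension. This is a minimal sketch that omits the SCons construction environment argument and the string conversion of non-string keys handled by the real method.

```python
# Minimal sketch of substitution_syntax() key decoration, assuming a flat
# dictionary with string keys; the SCons environment argument is omitted.
def substitution_syntax(substitution_dictionary, prefix="@", suffix="@"):
    return {
        f"{prefix}{key}{suffix}": value
        for key, value in substitution_dictionary.items()
    }

substitution_dictionary = substitution_syntax({"key": "value", "width": 1.0})
```

The decorated keys match the default @key@ placeholder style expected by template substitution builders.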
- waves.scons_extensions.truchas_builder_factory(environment: str = '', action_prefix: str = 'cd ${TARGET.dir.dir.abspath} &&', program: str = 'mpirun', program_required: str = '', program_options: str = '-np 1', subcommand: str = 'truchas', subcommand_required: str = '-f -o:${TARGET.dir.filebase} ${SOURCE.abspath}', subcommand_options: str = '', action_suffix: str = '> ${TARGETS[-1].abspath} 2>&1', emitter: ~collections.abc.Callable[[list, list, ~SCons.Script.SConscript.SConsEnvironment], tuple[list, list]] = <function first_target_emitter>, **kwargs) Builder[source]
Truchas builder factory.
Warning
This is an experimental builder. It is subject to change without warning.
Warning
This builder is not included in the regression test suite yet. Contact the development team if you encounter problems or have recommendations for improved design behavior.
This builder factory extends
waves.scons_extensions.first_target_builder_factory(). This builder factory uses the waves.scons_extensions.first_target_emitter(). At least one task target must be specified in the task definition and the last target will always be the expected STDOUT and STDERR redirection output file, TARGETS[-1] ending in *.stdout.
Warning
Note that this builder’s action prefix is different from other builders. Truchas output control produces a build subdirectory, so the action prefix moves up two directories above the expected output instead of one. All Truchas output targets must include the requested output directory and the output directory name must match the target file basename, e.g.
target/target.log and parameter_set1/target/target.log.
With the default options this builder requires the following source file:
Truchas input file:
*.inp
With the default options this builder requires the following target file:
Truchas output log with desired output directory:
target/target.log
action string construction#
${environment} ${action_prefix} ${program} ${program_required} ${program_options} ${subcommand} ${subcommand_required} ${subcommand_options} ${action_suffix}

action string default expansion#
${environment} cd ${TARGET.dir.dir.abspath} && mpirun ${program_required} -np 1 truchas -f -o:${TARGET.dir.filebase} ${SOURCE.abspath} ${subcommand_options} > ${TARGETS[-1].abspath} 2>&1

SConstruct#
import waves
env = Environment()
env.AddMethod(waves.scons_extensions.add_program, "AddProgram")
env["TRUCHAS_PROGRAM"] = env.AddProgram(["truchas"])
env.Append(BUILDERS={
    "Truchas": waves.scons_extensions.truchas_builder_factory(
        subcommand=env["TRUCHAS_PROGRAM"]
    )
})
env.Truchas(
    target=[
        "target/target.log",
        "target/target.h5",
    ],
    source=["source.inp"],
)
The builder returned by this factory accepts all SCons Builder arguments. The arguments of this function are also available as keyword arguments of the builder. When provided during task definition, the task keyword arguments override the builder keyword arguments.
- Parameters:
environment – This variable is intended primarily for use with builders and tasks that cannot execute from an SCons construction environment. For instance, when tasks execute on a remote server with SSH wrapped actions using waves.scons_extensions.ssh_builder_actions() and therefore must initialize the remote environment as part of the builder action.
action_prefix – This variable is intended to perform directory change operations prior to program execution
program – The mpirun absolute or relative path
program_required – Space delimited string of required mpirun options and arguments that are crucial to builder behavior and should not be modified except by advanced users
program_options – Space delimited string of optional mpirun options and arguments that can be freely modified by the user
subcommand – The Truchas absolute or relative path
subcommand_required – Space delimited string of required Truchas options and arguments that are crucial to builder behavior and should not be modified except by advanced users.
subcommand_options – Space delimited string of optional Truchas options and arguments that can be freely modified by the user
action_suffix – This variable is intended to perform program STDOUT and STDERR redirection operations.
emitter – An SCons emitter function. This is not a keyword argument in the action string.
kwargs – Any additional keyword arguments are passed directly to the SCons builder object.
- Returns:
Truchas builder
Parameter Generators#
Provide build system compatible parameter generators for use as an external API module.
Will raise RuntimeError or a derived class of waves.exceptions.WAVESError to allow the CLI implementation
to convert stack traces and exceptions into STDERR messages and non-zero exit codes.
- class waves.parameter_generators._ScipyGenerator(*args, **kwargs)[source]
Bases:
ParameterGenerator, ABC
- _create_parameter_names() None[source]
Construct the parameter names from a distribution parameter schema.
- _generate(**kwargs) None[source]
Generate the parameter study definition.
All implemented class methods should accept kwargs as _generate(self, **kwargs). The ABC class accepts, but does not use, any kwargs.
Must set the class attributes:
self._samples: The parameter study samples. A 2D numpy array in the shape (number of parameter sets, number of parameters). If it is possible that the samples may be of mixed type, numpy.array(..., dtype=object) should be used to preserve the original Python types.
self._set_hashes: List of parameter set content hashes created by calling self._create_set_hashes after populating the self._samples parameter study values.
self._set_names: Dictionary mapping parameter set hash to parameter set name strings created by calling self._create_set_names after populating self._set_hashes.
self.parameter_study: The Xarray Dataset parameter study object, created by calling self._create_parameter_study() after defining self._samples.
Minimum necessary work example:
# Work unique to the parameter generator schema and set generation
set_count = 5  # Normally set according to the parameter schema
parameter_count = len(self._parameter_names)
self._samples = numpy.zeros((set_count, parameter_count))

# Work performed by common ABC methods
super()._generate()
- _generate_distribution_samples(
- sampler: Halton | LatinHypercube | PoissonDisk | Sobol,
- set_count: int,
- parameter_count: int,
Create parameter distribution samples.
Requires attributes:
self.parameter_distributions: dictionary containing the {parameter name: scipy.stats distribution} defined by the parameter schema. Set by waves.parameter_generators._ScipyGenerator._generate_parameter_distributions().
Sets attribute(s):
self._samples: The parameter study samples. A 2D numpy array in the shape (number of parameter sets, number of parameters).
- _generate_parameter_distributions() dict[source]
Return dictionary containing the {parameter name: scipy.stats distribution} defined by the parameter schema.
- Returns:
parameter_distributions
- _validate() None[source]
Validate the parameter distribution schema. Executed by class instantiation.
parameter_schema = {
    'num_simulations': 4,  # Required key. Value must be an integer.
    'parameter_1': {
        'distribution': 'norm',  # Required key. Value must be a valid scipy.stats distribution name.
        'loc': 50,
        'scale': 1
    },
    'parameter_2': {
        'distribution': 'skewnorm',
        'a': 4,
        'loc': 30,
        'scale': 2
    }
}
- Raises:
waves.exceptions.SchemaValidationError –
Parameter schema is not a dictionary
Parameter schema num_simulations key is not an integer
Parameter definition distribution value is not a valid Python identifier
Parameter definition key(s) is not a valid Python identifier
Parameter schema does not have a num_simulations key
Parameter definition does not contain a distribution key
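The checks implied by the exceptions above can be sketched as plain conditionals. This is a hedged sketch: the real _validate() raises waves.exceptions.SchemaValidationError rather than the builtin exceptions used here, and its exact check order is an assumption.

```python
# Hedged sketch of the schema checks implied by the exceptions listed above.
# The real _validate() raises waves.exceptions.SchemaValidationError; builtin
# exceptions stand in for it here.
def validate_schema(parameter_schema):
    if not isinstance(parameter_schema, dict):
        raise TypeError("Parameter schema is not a dictionary")
    if "num_simulations" not in parameter_schema:
        raise KeyError("Parameter schema does not have a num_simulations key")
    if not isinstance(parameter_schema["num_simulations"], int):
        raise TypeError("num_simulations value is not an integer")
    for name, definition in parameter_schema.items():
        if name == "num_simulations":
            continue
        if not name.isidentifier():
            raise ValueError(f"Parameter key {name} is not a valid Python identifier")
        if "distribution" not in definition:
            raise KeyError(f"Parameter {name} does not contain a distribution key")
        if not definition["distribution"].isidentifier():
            raise ValueError("Distribution value is not a valid Python identifier")

validate_schema({"num_simulations": 4, "parameter_1": {"distribution": "norm", "loc": 50, "scale": 1}})
```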
- class waves.parameter_generators.CartesianProduct(
- parameter_schema: dict,
- output_file_template: str | None = None,
- output_file: str | Path | None = None,
- output_file_type: Literal['h5', 'yaml'] = 'h5',
- set_name_template: str = 'parameter_set@number',
- previous_parameter_study: str | Path | None = None,
- require_previous_parameter_study: bool = False,
- overwrite: bool = False,
- write_meta: bool = False,
- **kwargs,
Bases:
ParameterGenerator
Builds a cartesian product parameter study.
Parameters must be scalar valued integers, floats, strings, or booleans
- Parameters:
parameter_schema – The YAML loaded parameter study schema dictionary - {parameter_name: schema value}. CartesianProduct expects “schema value” to be an iterable. For example, when read from a YAML file “schema value” will be a Python list. Each parameter’s values must have a consistent data type, but data type may vary between parameters.
output_file_template – Output file name template for multiple file output of the parameter study. Required if parameter sets will be written to files instead of printed to STDOUT. May contain pathseps for an absolute or relative path template. May contain the @number set number placeholder in the file basename but not in the path. If the placeholder is not found it will be appended to the template string. Output files are overwritten if the content of the file has changed or if overwrite is True. output_file_template and output_file are mutually exclusive.
output_file – Output file name for single file output of the parameter study. Required if parameter sets will be written to a file instead of printed to STDOUT. May contain pathseps for an absolute or relative path. Output file is overwritten if the content of the file has changed or if overwrite is True. output_file and output_file_template are mutually exclusive.
output_file_type – Output file syntax or type. Options are: ‘yaml’, ‘h5’.
set_name_template – Parameter set name template. Overridden by output_file_template, if provided.
previous_parameter_study – A relative or absolute file path to a previously created parameter study Xarray Dataset. If a previous parameter study exists, it is merged into the current study upon generation. Set name to content associations of the previous study are preserved when the parameter spaces between the previous and current study are identical. If the parameter spaces are unique, the current study will propagate the parameter spaces to resolve them. This will break set name to content associations of the previous study.
require_previous_parameter_study – Raise a RuntimeError if the previous parameter study file is missing.
overwrite – Overwrite existing output files
write_meta – Write a meta file named “parameter_study_meta.txt” containing the parameter set file names. Useful for command line execution with build systems that require an explicit file list for target creation.
- Variables:
self.parameter_study – The final parameter study XArray Dataset object
- Raises:
waves.exceptions.MutuallyExclusiveError – If the mutually exclusive output file template and output file options are both specified
waves.exceptions.APIError – If an unknown output file type is requested
RuntimeError – If a previous parameter study file is specified and missing, and require_previous_parameter_study is True
waves.exceptions.SchemaValidationError –
Parameter schema is not a dictionary
Parameter key is not a supported iterable: set, tuple, list
Example:
>>> import waves
>>> parameter_schema = {
...     'parameter_1': [1, 2],
...     'parameter_2': ['a', 'b']
... }
>>> parameter_generator = waves.parameter_generators.CartesianProduct(parameter_schema)
>>> print(parameter_generator.parameter_study)
<xarray.Dataset>
Dimensions:      (set_hash: 4)
Coordinates:
    set_hash     (set_hash) <U32 'de3cb3eaecb767ff63973820b2...
  * set_name     (set_hash) <U14 'parameter_set0' ... 'param...
Data variables:
    parameter_1  (set_hash) object 1 1 2 2
    parameter_2  (set_hash) object 'a' 'b' 'a' 'b'
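The set expansion behind the example above can be sketched with itertools.product; the real generator stores the result as a 2D samples array (dtype=object for mixed types), which this sketch omits.

```python
import itertools

# Sketch of the cartesian product expansion underlying CartesianProduct.
# Each tuple corresponds to one parameter set row of the samples array.
parameter_schema = {
    "parameter_1": [1, 2],
    "parameter_2": ["a", "b"],
}
samples = list(itertools.product(*parameter_schema.values()))
```

Two parameters with two values each yield the four parameter sets shown in the Dataset example.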
- _conditionally_write_dataset(
- existing_parameter_study: Path,
- parameter_study: Dataset,
Write NetCDF file over previous study if the datasets have changed or self.overwrite is True.
- Parameters:
existing_parameter_study – A relative or absolute file path to a previously created parameter study Xarray Dataset
parameter_study – Parameter study xarray dataset
- _conditionally_write_yaml(output_file: str | Path, parameter_dictionary: dict) None
Write YAML file over previous study if the datasets have changed or self.overwrite is True.
- Parameters:
output_file – A relative or absolute file path to the output YAML file
parameter_dictionary – dictionary containing parameter set data
- _create_parameter_study() None
Create the standard structure for the parameter study dataset.
requires:
self._set_hashes: parameter set content hashes identifying rows of parameter study
self._parameter_names: parameter names used as columns of parameter study
self._samples: The parameter study samples. Rows are sets. Columns are parameters.
creates attribute:
self.parameter_study
- _create_set_hashes() None
Construct unique, repeatable parameter set content hashes from self._samples.
Creates an md5 hash from the concatenated string representation of parameter name:value associations.
requires:
self._samples: The parameter study samples. Rows are sets. Columns are parameters.
self._parameter_names: parameter names used as columns of parameter study
creates attribute:
self._set_hashes: parameter set content hashes identifying rows of parameter study
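The hash construction can be sketched as below. The exact name:value concatenation format is an assumption, but the properties shown (repeatable and content-addressed) match the description above.

```python
import hashlib

# Hypothetical sketch of a repeatable parameter set content hash; the exact
# name:value concatenation used by _create_set_hashes() is an assumption.
def set_hash(parameter_names, sample_row):
    contents = ", ".join(
        f"{name}: {value}" for name, value in zip(parameter_names, sample_row)
    )
    return hashlib.md5(contents.encode()).hexdigest()

first_hash = set_hash(["parameter_1", "parameter_2"], [1, "a"])
```

Because the hash depends only on set contents, identical sets in a merged study map to the same hash regardless of generation order.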
- _create_set_names() None
Construct parameter set names from the set name template and number of parameter sets in self._samples.
Creates the class attribute self._set_names required to populate the _generate() method’s parameter study Xarray dataset object.
requires:
self._set_hashes: parameter set content hashes identifying rows of parameter study
self.set_name_template: Parameter set name template. Overridden by output_file_template, if provided
creates attribute:
self._set_names: Dictionary mapping parameter set hash to parameter set name
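The default template substitution can be sketched as below; the hash strings are hypothetical placeholders for real md5 content hashes.

```python
# Sketch of default set name construction from the set name template.
# The hash strings below are hypothetical placeholders for md5 content hashes.
set_name_template = "parameter_set@number"
set_hashes = ["hash_a", "hash_b", "hash_c"]
set_names = {
    content_hash: set_name_template.replace("@number", str(number))
    for number, content_hash in enumerate(set_hashes)
}
```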
- _create_set_names_array() DataArray
Create an Xarray DataArray with the parameter set names using parameter set hashes as the coordinate.
- Returns:
set_names_array
- _generate(**kwargs) None[source]
Generate the Cartesian Product parameter sets.
- _merge_parameter_studies() None
Merge the current parameter study into a previous parameter study.
When merging across identical parameter spaces, preserves the previous parameter study set name to set contents associations by dropping the new studies’ set names during merge. If the parameter spaces are unique across studies, this method will use _propagate_parameter_space() to resolve the parameter spaces and break the set name to set contents associations of the previous study.
Resets attributes:
self.parameter_study
self._samples
self._set_hashes
self._set_names
- Raises:
RuntimeError – If the self.previous_parameter_study attribute is None
- _merge_set_names_array() None
Merge the parameter set names array into the parameter study dataset as a non-index coordinate.
- _parameter_study_to_numpy() ndarray
Return the parameter study data as a 2D numpy array.
- Returns:
data
- _scons_write(target: list, source: list, env: Base) None
SCons Python build function wrapper for the parameter generator’s write() function.
Reference: https://scons.org/doc/production/HTML/scons-user/ch17s04.html
Searches for the following keyword arguments in the task construction environment and passes them to the write function:
output_file_type
- Parameters:
target – The target file list of strings
source – The source file list of SCons.Node.FS.File objects
env (SCons.Script.SConscript.SConsEnvironment) – The builder’s SCons construction environment object
- _validate() None[source]
Validate the Cartesian Product parameter schema. Executed by class instantiation.
- _write(
- parameter_study_object: dict | Dataset,
- parameter_study_iterator: ItemsView | DatasetGroupBy,
- conditional_write_function: Callable[[Path, dict], None] | Callable[[Path, Dataset], None],
- dry_run: bool = False,
Write parameter study formatted output to STDOUT, separate set files, or a single file.
Behavior as specified in
waves.parameter_generators.ParameterGenerator.write()
- _write_meta() None
Write the parameter study meta data file.
The parameter study meta file is always overwritten. It should NOT be used to determine if the parameter study target or dependee is out-of-date. Parameter study file paths are written as absolute paths.
- parameter_study_to_dict() dict[str, dict[str, Any]]
Return parameter study as a dictionary.
Used for iterating on parameter sets in an SCons workflow with parameter substitution dictionaries, e.g.
>>> for set_name, parameters in parameter_generator.parameter_study_to_dict().items():
...     print(f"{set_name}: {parameters}")
...
parameter_set0: {'parameter_1': 1, 'parameter_2': 'a'}
parameter_set1: {'parameter_1': 1, 'parameter_2': 'b'}
parameter_set2: {'parameter_1': 2, 'parameter_2': 'a'}
parameter_set3: {'parameter_1': 2, 'parameter_2': 'b'}
- Returns:
parameter study sets and samples as a dictionary: {set_name: {parameter: value}, …}
- write(output_file_type: Literal['h5', 'yaml'] | None = None, dry_run: bool = False) None
Write the parameter study to STDOUT or an output file.
Writes to STDOUT by default. Requires non-default output_file_template or output_file specification to write to files.
If printing to STDOUT, print all parameter sets together. If printing to files, overwrite when contents of existing files have changed. If overwrite is specified, overwrite all parameter set files. If a dry run is requested, print file-content associations for files that would have been written.
Writes parameter set files in YAML syntax by default. Output formatting is controlled by output_file_type.
parameter_1: 1
parameter_2: a
- Parameters:
output_file_type – Output file syntax or type. Options are: ‘yaml’, ‘h5’.
dry_run – Print contents of new parameter study output files to STDOUT and exit
- Raises:
waves.exceptions.ChoicesError – If an unsupported output file type is requested
- class waves.parameter_generators.CustomStudy(
- parameter_schema: dict,
- output_file_template: str | None = None,
- output_file: str | Path | None = None,
- output_file_type: Literal['h5', 'yaml'] = 'h5',
- set_name_template: str = 'parameter_set@number',
- previous_parameter_study: str | Path | None = None,
- require_previous_parameter_study: bool = False,
- overwrite: bool = False,
- write_meta: bool = False,
- **kwargs,
Bases:
ParameterGenerator
Builds a custom parameter study from user-specified values.
Parameters must be scalar valued integers, floats, strings, or booleans
- Parameters:
parameter_schema – Dictionary with two keys: parameter_samples and parameter_names. Parameter samples in the form of a 2D array with shape M x N, where M is the number of parameter sets and N is the number of parameters. Parameter names in the form of a 1D array with length N. When creating a parameter_samples array with mixed types (e.g. strings and floats) use dtype=object to preserve the mixed types and avoid casting all values to a common type (e.g. all your floats will become strings). Each parameter’s values must have a consistent data type, but data type may vary between parameters.
output_file_template – Output file name template for multiple file output of the parameter study. Required if parameter sets will be written to files instead of printed to STDOUT. May contain pathseps for an absolute or relative path template. May contain the @number set number placeholder in the file basename but not in the path. If the placeholder is not found it will be appended to the template string. Output files are overwritten if the content of the file has changed or if overwrite is True. output_file_template and output_file are mutually exclusive.
output_file – Output file name for single file output of the parameter study. Required if parameter sets will be written to a file instead of printed to STDOUT. May contain pathseps for an absolute or relative path. Output file is overwritten if the content of the file has changed or if overwrite is True. output_file and output_file_template are mutually exclusive.
output_file_type – Output file syntax or type. Options are: ‘yaml’, ‘h5’.
set_name_template – Parameter set name template. Overridden by output_file_template, if provided.
previous_parameter_study – A relative or absolute file path to a previously created parameter study Xarray Dataset. If a previous parameter study exists, it is merged into the current study upon generation. Set name to content associations of the previous study are preserved when the parameter spaces between the previous and current study are identical. If the parameter spaces are unique, the current study will propagate the parameter spaces to resolve them. This will break set name to content associations of the previous study.
require_previous_parameter_study – Raise a RuntimeError if the previous parameter study file is missing.
overwrite – Overwrite existing output files
write_meta – Write a meta file named “parameter_study_meta.txt” containing the parameter set file names. Useful for command line execution with build systems that require an explicit file list for target creation.
- Variables:
self.parameter_study – The final parameter study XArray Dataset object
- Raises:
waves.exceptions.MutuallyExclusiveError – If the mutually exclusive output file template and output file options are both specified
waves.exceptions.APIError – If an unknown output file type is requested
RuntimeError – If a previous parameter study file is specified and missing, and require_previous_parameter_study is True
waves.exceptions.SchemaValidationError –
Parameter schema is not a dictionary
Parameter schema does not contain the parameter_names key
Parameter schema does not contain the parameter_samples key
The parameter_samples value is an improperly shaped array
Example:
>>> import waves
>>> import numpy
>>> parameter_schema = dict(
...     parameter_samples=numpy.array([[1.0, 'a', 5], [2.0, 'b', 6]], dtype=object),
...     parameter_names=numpy.array(['height', 'prefix', 'index']))
>>> parameter_generator = waves.parameter_generators.CustomStudy(parameter_schema)
>>> print(parameter_generator.parameter_study)
<xarray.Dataset>
Dimensions:   (set_hash: 2)
Coordinates:
    set_hash  (set_hash) <U32 '50ba1a2716e42f8c4fcc34a90a...
  * set_name  (set_hash) <U14 'parameter_set0' 'parameter...
Data variables:
    height    (set_hash) object 1.0 2.0
    prefix    (set_hash) object 'a' 'b'
    index     (set_hash) object 5 6
- _conditionally_write_dataset(
- existing_parameter_study: Path,
- parameter_study: Dataset,
Write NetCDF file over previous study if the datasets have changed or self.overwrite is True.
- Parameters:
existing_parameter_study – A relative or absolute file path to a previously created parameter study Xarray Dataset
parameter_study – Parameter study xarray dataset
- _conditionally_write_yaml(output_file: str | Path, parameter_dictionary: dict) None
Write YAML file over previous study if the datasets have changed or self.overwrite is True.
- Parameters:
output_file – A relative or absolute file path to the output YAML file
parameter_dictionary – dictionary containing parameter set data
- _create_parameter_study() None
Create the standard structure for the parameter study dataset.
requires:
self._set_hashes: parameter set content hashes identifying rows of parameter study
self._parameter_names: parameter names used as columns of parameter study
self._samples: The parameter study samples. Rows are sets. Columns are parameters.
creates attribute:
self.parameter_study
- _create_set_hashes() None
Construct unique, repeatable parameter set content hashes from self._samples.
Creates an md5 hash from the concatenated string representation of parameter name:value associations.
requires:
self._samples: The parameter study samples. Rows are sets. Columns are parameters.
self._parameter_names: parameter names used as columns of parameter study
creates attribute:
self._set_hashes: parameter set content hashes identifying rows of parameter study
- _create_set_names() None
Construct parameter set names from the set name template and number of parameter sets in self._samples.
Creates the class attribute self._set_names required to populate the _generate() method’s parameter study Xarray dataset object.
requires:
self._set_hashes: parameter set content hashes identifying rows of parameter study
self.set_name_template: Parameter set name template. Overridden by output_file_template, if provided
creates attribute:
self._set_names: Dictionary mapping parameter set hash to parameter set name
- _create_set_names_array() DataArray
Create an Xarray DataArray with the parameter set names using parameter set hashes as the coordinate.
- Returns:
set_names_array
- _generate(**kwargs) None[source]
Generate the parameter study dataset from the user provided parameter array.
- _merge_parameter_studies() None
Merge the current parameter study into a previous parameter study.
When merging across identical parameter spaces, preserves the previous parameter study set name to set contents associations by dropping the new studies’ set names during merge. If the parameter spaces are unique across studies, this method will use _propagate_parameter_space() to resolve the parameter spaces and break the set name to set contents associations of the previous study.
Resets attributes:
self.parameter_study
self._samples
self._set_hashes
self._set_names
- Raises:
RuntimeError – If the self.previous_parameter_study attribute is None
- _merge_set_names_array() None
Merge the parameter set names array into the parameter study dataset as a non-index coordinate.
- _parameter_study_to_numpy() ndarray
Return the parameter study data as a 2D numpy array.
- Returns:
data
- _scons_write(target: list, source: list, env: Base) None
SCons Python build function wrapper for the parameter generator’s write() function.
Reference: https://scons.org/doc/production/HTML/scons-user/ch17s04.html
Searches for the following keyword arguments in the task construction environment and passes them to the write function:
output_file_type
- Parameters:
target – The target file list of strings
source – The source file list of SCons.Node.FS.File objects
env (SCons.Script.SConscript.SConsEnvironment) – The builder’s SCons construction environment object
- _validate() None[source]
Validate the Custom Study parameter samples and names. Executed by class instantiation.
- _write(
- parameter_study_object: dict | Dataset,
- parameter_study_iterator: ItemsView | DatasetGroupBy,
- conditional_write_function: Callable[[Path, dict], None] | Callable[[Path, Dataset], None],
- dry_run: bool = False,
Write parameter study formatted output to STDOUT, separate set files, or a single file.
Behavior as specified in
waves.parameter_generators.ParameterGenerator.write()
- _write_meta() None
Write the parameter study meta data file.
The parameter study meta file is always overwritten. It should NOT be used to determine if the parameter study target or dependee is out-of-date. Parameter study file paths are written as absolute paths.
- parameter_study_to_dict() dict[str, dict[str, Any]]
Return parameter study as a dictionary.
Used for iterating on parameter sets in an SCons workflow with parameter substitution dictionaries, e.g.
>>> for set_name, parameters in parameter_generator.parameter_study_to_dict().items():
...     print(f"{set_name}: {parameters}")
...
parameter_set0: {'parameter_1': 1, 'parameter_2': 'a'}
parameter_set1: {'parameter_1': 1, 'parameter_2': 'b'}
parameter_set2: {'parameter_1': 2, 'parameter_2': 'a'}
parameter_set3: {'parameter_1': 2, 'parameter_2': 'b'}
- Returns:
parameter study sets and samples as a dictionary: {set_name: {parameter: value}, …}
- write(output_file_type: Literal['h5', 'yaml'] | None = None, dry_run: bool = False) None
Write the parameter study to STDOUT or an output file.
Writes to STDOUT by default. Requires a non-default output_file_template or output_file specification to write to files. If printing to STDOUT, print all parameter sets together. If printing to files, overwrite when the contents of existing files have changed. If overwrite is specified, overwrite all parameter set files. If a dry run is requested, print file-content associations for files that would have been written.
Writes parameter set files in YAML syntax by default. Output formatting is controlled by output_file_type.

parameter_1: 1
parameter_2: a
- Parameters:
output_file_type – Output file syntax or type. Options are: ‘yaml’, ‘h5’.
dry_run – Print contents of new parameter study output files to STDOUT and exit
- Raises:
waves.exceptions.ChoicesError – If an unsupported output file type is requested
- waves.parameter_generators.HASH_COORDINATE_KEY: Final[str] = 'set_hash'
The set hash coordinate used in WAVES parameter study Xarray Datasets
- class waves.parameter_generators.LatinHypercube(*args, **kwargs)[source]
Bases:
_ScipyGenerator
Builds a Latin-Hypercube parameter study from the scipy Latin Hypercube class.
Warning
The merged parameter study feature does not check for consistent parameter distributions. Changing the parameter definitions and merging with a previous parameter study will result in incorrect relationships between parameter schema and the parameter study samples.
- Parameters:
parameter_schema – The YAML loaded parameter study schema dictionary - {parameter_name: schema value}. LatinHypercube expects "schema value" to be a dictionary with a strict structure and several required keys. Validated on class instantiation.
output_file_template – Output file name template for multiple file output of the parameter study. Required if parameter sets will be written to files instead of printed to STDOUT. May contain pathseps for an absolute or relative path template. May contain the @number set number placeholder in the file basename but not in the path. If the placeholder is not found, it will be appended to the template string. Output files are overwritten if the content of the file has changed or if overwrite is True. output_file_template and output_file are mutually exclusive.
output_file – Output file name for single file output of the parameter study. Required if parameter sets will be written to a file instead of printed to STDOUT. May contain pathseps for an absolute or relative path. The output file is overwritten if the content of the file has changed or if overwrite is True. output_file and output_file_template are mutually exclusive.
output_file_type – Output file syntax or type. Options are: 'yaml', 'h5'.
set_name_template – Parameter set name template. Overridden by output_file_template, if provided.
previous_parameter_study – A relative or absolute file path to a previously created parameter study Xarray Dataset. If a previous parameter study exists, it is merged into the current study upon generation. Set name to content associations of the previous study are preserved when the parameter spaces between the previous and current study are identical. If the parameter spaces are unique, the current study will propagate the parameter spaces to resolve them. This will break set name to content associations of the previous study.
require_previous_parameter_study – Raise a RuntimeError if the previous parameter study file is missing.
overwrite – Overwrite existing output files
write_meta – Write a meta file named "parameter_study_meta.txt" containing the parameter set file names. Useful for command line execution with build systems that require an explicit file list for target creation.
kwargs – Any additional keyword arguments are passed through to the sampler method
- Variables:
self.parameter_distributions – A dictionary mapping parameter names to the scipy.stats distribution
self.parameter_study – The final parameter study XArray Dataset object
- Raises:
waves.exceptions.MutuallyExclusiveError – If the mutually exclusive output file template and output file options are both specified
waves.exceptions.APIError – If an unknown output file type is requested
RuntimeError – If a previous parameter study file is specified and missing, and require_previous_parameter_study is True
To produce consistent Latin Hypercubes on repeat instantiations, the **kwargs must include {'seed': <int>}. See the scipy Latin Hypercube scipy.stats.qmc.LatinHypercube class documentation for details. The d keyword argument is internally managed and will be overwritten to match the number of parameters defined in the parameter schema.
Example:
>>> import waves
>>> parameter_schema = {
...     'num_simulations': 4,  # Required key. Value must be an integer.
...     'parameter_1': {
...         'distribution': 'norm',  # Required key. Value must be a valid scipy.stats
...         'loc': 50,               # distribution name.
...         'scale': 1
...     },
...     'parameter_2': {
...         'distribution': 'skewnorm',
...         'a': 4,
...         'loc': 30,
...         'scale': 2
...     }
... }
>>> parameter_generator = waves.parameter_generators.LatinHypercube(parameter_schema)
>>> print(parameter_generator.parameter_study)
<xarray.Dataset>
Dimensions:      (set_hash: 4)
Coordinates:
    set_hash     (set_hash) <U32 '1e8219dae27faa5388328e225a...
  * set_name     (set_hash) <U14 'parameter_set0' ... 'param...
Data variables:
    parameter_1  (set_hash) float64 0.125 ... 51.15
    parameter_2  (set_hash) float64 0.625 ... 30.97
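The seed requirement above can be illustrated directly with the underlying scipy sampler. This is a minimal sketch using scipy.stats.qmc, not the WAVES API: two samplers constructed with the same seed draw identical hypercubes, which is what makes repeat study instantiations reproducible.

```python
from scipy.stats import qmc

# Identically seeded Latin Hypercube samplers draw identical samples
sampler_a = qmc.LatinHypercube(d=2, seed=42)
sampler_b = qmc.LatinHypercube(d=2, seed=42)
samples_a = sampler_a.random(4)
samples_b = sampler_b.random(4)
assert (samples_a == samples_b).all()
```

Without a seed, each instantiation draws a different hypercube and the merged parameter study would grow with every run.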
- _conditionally_write_dataset(
- existing_parameter_study: Path,
- parameter_study: Dataset,
Write NetCDF file over previous study if the datasets have changed or self.overwrite is True.
- Parameters:
existing_parameter_study – A relative or absolute file path to a previously created parameter study Xarray Dataset
parameter_study – Parameter study xarray dataset
- _conditionally_write_yaml(output_file: str | Path, parameter_dictionary: dict) None
Write YAML file over previous study if the datasets have changed or self.overwrite is True.
- Parameters:
output_file – A relative or absolute file path to the output YAML file
parameter_dictionary – dictionary containing parameter set data
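The conditional-write behavior shared by these helpers can be sketched as follows. This is a simplified stand-in, not the WAVES implementation; conditionally_write and its signature are illustrative:

```python
from pathlib import Path

def conditionally_write(output_file: Path, text: str, overwrite: bool = False) -> bool:
    # Rewrite the file only when its contents changed or overwrite was
    # requested. Unchanged files keep their timestamps, so build systems
    # do not consider downstream targets out-of-date.
    if overwrite or not output_file.exists() or output_file.read_text() != text:
        output_file.write_text(text)
        return True
    return False
```

The same pattern applies to both the YAML and dataset writers: compare new contents against the existing file before touching it.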
- _create_parameter_names() None
Construct the parameter names from a distribution parameter schema.
- _create_parameter_study() None
Create the standard structure for the parameter study dataset.
requires:
self._set_hashes: parameter set content hashes identifying rows of parameter study
self._parameter_names: parameter names used as columns of parameter study
self._samples: The parameter study samples. Rows are sets. Columns are parameters.
creates attribute:
self.parameter_study
- _create_set_hashes() None
Construct unique, repeatable parameter set content hashes from self._samples.
Creates an md5 hash from the concatenated string representation of parameter name:value associations.
requires:
self._samples: The parameter study samples. Rows are sets. Columns are parameters.
self._parameter_names: parameter names used as columns of parameter study
creates attribute:
self._set_hashes: parameter set content hashes identifying rows of parameter study
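A minimal sketch of the hashing scheme described here. The exact concatenation and value representation used by WAVES may differ; set_content_hash is a hypothetical name:

```python
import hashlib

def set_content_hash(parameter_names, sample_row):
    # Concatenate "name:value" associations and md5 them, so identical
    # parameter sets always map to the same repeatable identifier.
    concatenated = ", ".join(
        f"{name}:{repr(value)}" for name, value in zip(parameter_names, sample_row)
    )
    return hashlib.md5(concatenated.encode()).hexdigest()
```

Because the hash depends only on set contents, it survives set re-ordering and is a stable merge key across studies.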
- _create_set_names() None
Construct parameter set names from the set name template and number of parameter sets in self._samples.
Creates the class attribute self._set_names required to populate the _generate() method's parameter study Xarray dataset object.
requires:
self._set_hashes: parameter set content hashes identifying rows of parameter study
self.set_name_template: Parameter set name template. Overridden by output_file_template, if provided
creates attribute:
self._set_names: Dictionary mapping parameter set hash to parameter set name
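The @number placeholder substitution can be sketched as follows (hypothetical helper, assuming the default template):

```python
def create_set_names(set_hashes, set_name_template="parameter_set@number"):
    # Map each set hash to a name generated by substituting the set index
    # for the @number placeholder in the template.
    return {
        set_hash: set_name_template.replace("@number", str(index))
        for index, set_hash in enumerate(set_hashes)
    }
```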
- _create_set_names_array() DataArray
Create an Xarray DataArray with the parameter set names using parameter set hashes as the coordinate.
- Returns:
set_names_array
- _generate(**kwargs) None[source]
Generate the Latin Hypercube parameter sets.
- _generate_distribution_samples(
- sampler: Halton | LatinHypercube | PoissonDisk | Sobol,
- set_count: int,
- parameter_count: int,
Create parameter distribution samples.
Requires attributes:
self.parameter_distributions: dictionary containing the {parameter name: scipy.stats distribution} defined by the parameter schema. Set by waves.parameter_generators._ScipyGenerator._generate_parameter_distributions().
Sets attribute(s):
self._samples: The parameter study samples. A 2D numpy array in the shape (number of parameter sets, number of parameters).
- _generate_parameter_distributions() dict
Return dictionary containing the {parameter name: scipy.stats distribution} defined by the parameter schema.
- Returns:
parameter_distributions
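A sketch of how such a mapping could be built from the distribution schema shown under _validate(). This is illustrative only; the WAVES implementation may differ:

```python
import scipy.stats

def generate_parameter_distributions(parameter_schema):
    # Freeze one scipy.stats distribution per parameter definition, passing
    # the remaining schema keys (loc, scale, a, ...) as distribution kwargs.
    distributions = {}
    for name, definition in parameter_schema.items():
        if name == "num_simulations":
            continue
        kwargs = {key: value for key, value in definition.items() if key != "distribution"}
        distributions[name] = getattr(scipy.stats, definition["distribution"])(**kwargs)
    return distributions
```

The frozen distributions are then handed their quantile-mapped hypercube samples during generation.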
- _merge_parameter_studies() None
Merge the current parameter study into a previous parameter study.
When merging across identical parameter spaces, this method preserves the previous parameter study's set name to set contents associations by dropping the new study's set names during the merge. If the parameter spaces are unique across studies, this method will use _propagate_parameter_space() to resolve the parameter spaces and break the set name to set contents associations of the previous study.
Resets attributes:
self.parameter_study
self._samples
self._set_hashes
self._set_names
- Raises:
RuntimeError – If the
self.previous_parameter_studyattribute is None
- _merge_set_names_array() None
Merge the parameter set names array into the parameter study dataset as a non-index coordinate.
- _parameter_study_to_numpy() ndarray
Return the parameter study data as a 2D numpy array.
- Returns:
data
- _scons_write(target: list, source: list, env: Base) None
SCons Python build function wrapper for the parameter generator’s write() function.
Reference: https://scons.org/doc/production/HTML/scons-user/ch17s04.html
Searches for the following keyword arguments in the task construction environment and passes them to the write function:
output_file_type
- Parameters:
target – The target file list of strings
source – The source file list of SCons.Node.FS.File objects
env (SCons.Script.SConscript.SConsEnvironment) – The builder’s SCons construction environment object
- _validate() None
Validate the parameter distribution schema. Executed by class initiation.
parameter_schema = {
    'num_simulations': 4,  # Required key. Value must be an integer.
    'parameter_1': {
        'distribution': 'norm',  # Required key. Value must be a valid scipy.stats
        'loc': 50,               # distribution name.
        'scale': 1
    },
    'parameter_2': {
        'distribution': 'skewnorm',
        'a': 4,
        'loc': 30,
        'scale': 2
    }
}
- Raises:
waves.exceptions.SchemaValidationError –
Parameter schema is not a dictionary
Parameter schema num_simulations key is not an integer
Parameter definition distribution value is not a valid Python identifier
Parameter definition key(s) is not a valid Python identifier
Parameter schema does not have a num_simulations key
Parameter definition does not contain a distribution key
- _write(
- parameter_study_object: dict | Dataset,
- parameter_study_iterator: ItemsView | DatasetGroupBy,
- conditional_write_function: Callable[[Path, dict], None] | Callable[[Path, Dataset], None],
- dry_run: bool = False,
Write parameter study formatted output to STDOUT, separate set files, or a single file.
Behavior as specified in
waves.parameter_generators.ParameterGenerator.write()
- _write_meta() None
Write the parameter study meta data file.
The parameter study meta file is always overwritten. It should NOT be used to determine if the parameter study target or dependee is out-of-date. Parameter study file paths are written as absolute paths.
- parameter_study_to_dict() dict[str, dict[str, Any]]
Return parameter study as a dictionary.
Used for iterating on parameter sets in an SCons workflow with parameter substitution dictionaries, e.g.
>>> for set_name, parameters in parameter_generator.parameter_study_to_dict().items():
...     print(f"{set_name}: {parameters}")
...
parameter_set0: {'parameter_1': 1, 'parameter_2': 'a'}
parameter_set1: {'parameter_1': 1, 'parameter_2': 'b'}
parameter_set2: {'parameter_1': 2, 'parameter_2': 'a'}
parameter_set3: {'parameter_1': 2, 'parameter_2': 'b'}
- Returns:
parameter study sets and samples as a dictionary: {set_name: {parameter: value}, …}
- write(output_file_type: Literal['h5', 'yaml'] | None = None, dry_run: bool = False) None
Write the parameter study to STDOUT or an output file.
Writes to STDOUT by default. Requires a non-default output_file_template or output_file specification to write to files. If printing to STDOUT, print all parameter sets together. If printing to files, overwrite when the contents of existing files have changed. If overwrite is specified, overwrite all parameter set files. If a dry run is requested, print file-content associations for files that would have been written.
Writes parameter set files in YAML syntax by default. Output formatting is controlled by output_file_type.

parameter_1: 1
parameter_2: a
- Parameters:
output_file_type – Output file syntax or type. Options are: ‘yaml’, ‘h5’.
dry_run – Print contents of new parameter study output files to STDOUT and exit
- Raises:
waves.exceptions.ChoicesError – If an unsupported output file type is requested
- class waves.parameter_generators.OneAtATime(
- parameter_schema: dict,
- output_file_template: str | None = None,
- output_file: str | Path | None = None,
- output_file_type: Literal['h5', 'yaml'] = 'h5',
- set_name_template: str = 'parameter_set@number',
- previous_parameter_study: str | Path | None = None,
- require_previous_parameter_study: bool = False,
- overwrite: bool = False,
- write_meta: bool = False,
- **kwargs,
Bases:
ParameterGenerator
Build a parameter study with single-value changes from a nominal parameter set.
The nominal parameter set is created from the first value of every parameter iterable.
Parameters must be scalar valued integers, floats, strings, or booleans
The nominal parameter set will always be the first parameter set, e.g. parameter_set0 for the default set_name_template.
- Parameters:
parameter_schema – The YAML loaded parameter study schema dictionary - {parameter_name: schema value}. OneAtATime expects "schema value" to be an ordered iterable. For example, when read from a YAML file "schema value" will be a Python list. Each parameter's values must have a consistent data type, but data type may vary between parameters.
output_file_template – Output file name template for multiple file output of the parameter study. Required if parameter sets will be written to files instead of printed to STDOUT. May contain pathseps for an absolute or relative path template. May contain the @number set number placeholder in the file basename but not in the path. If the placeholder is not found, it will be appended to the template string. Output files are overwritten if the content of the file has changed or if overwrite is True. output_file_template and output_file are mutually exclusive.
output_file – Output file name for single file output of the parameter study. Required if parameter sets will be written to a file instead of printed to STDOUT. May contain pathseps for an absolute or relative path. The output file is overwritten if the content of the file has changed or if overwrite is True. output_file and output_file_template are mutually exclusive.
output_file_type – Output file syntax or type. Options are: 'yaml', 'h5'.
set_name_template – Parameter set name template. Overridden by output_file_template, if provided.
previous_parameter_study – A relative or absolute file path to a previously created parameter study Xarray Dataset. If a previous parameter study exists, it is merged into the current study upon generation. Set name to content associations of the previous study are preserved when the parameter spaces between the previous and current study are identical. If the parameter spaces are unique, the current study will propagate the parameter spaces to resolve them. This will break set name to content associations of the previous study.
require_previous_parameter_study – Raise a RuntimeError if the previous parameter study file is missing.
overwrite – Overwrite existing output files
write_meta – Write a meta file named "parameter_study_meta.txt" containing the parameter set file names. Useful for command line execution with build systems that require an explicit file list for target creation.
- Variables:
self.parameter_study – The final parameter study XArray Dataset object
- Raises:
waves.exceptions.MutuallyExclusiveError – If the mutually exclusive output file template and output file options are both specified
waves.exceptions.APIError – If an unknown output file type is requested
RuntimeError – If a previous parameter study file is specified and missing, and require_previous_parameter_study is True
waves.exceptions.SchemaValidationError –
Parameter schema is not a dictionary
Parameter key is not a supported iterable: tuple, list
Parameter key is empty
Example:
>>> import waves
>>> parameter_schema = {
...     'parameter_1': [1.0],
...     'parameter_2': ['a', 'b'],
...     'parameter_3': [5, 3, 7]
... }
>>> parameter_generator = waves.parameter_generators.OneAtATime(parameter_schema)
>>> print(parameter_generator.parameter_study)
<xarray.Dataset>
Dimensions:         (set_name: 4)
Coordinates:
    set_hash        (set_name) <U32 '375a9b0b7c00d01bced92d9c5a6d302c' ....
  * set_name        (set_name) <U14 'parameter_set0' ... 'parameter_set3'
    parameter_sets  (set_name) <U14 'parameter_set0' ... 'parameter_set3'
Data variables:
    parameter_1     (set_name) float64 32B 1.0 1.0 1.0 1.0
    parameter_2     (set_name) <U1 16B 'a' 'b' 'a' 'a'
    parameter_3     (set_name) int64 32B 5 5 3 7
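The set construction shown above (nominal set first, then one varied value per additional set) can be sketched with plain Python. This is a hypothetical helper, not the WAVES implementation:

```python
def one_at_a_time_sets(parameter_schema):
    # The nominal set uses the first value of every parameter; each remaining
    # value produces one additional set varying only that parameter.
    names = list(parameter_schema.keys())
    nominal = [values[0] for values in parameter_schema.values()]
    sets = [tuple(nominal)]
    for index, values in enumerate(parameter_schema.values()):
        for value in values[1:]:
            varied = list(nominal)
            varied[index] = value
            sets.append(tuple(varied))
    return names, sets
```

For the schema in the example above this yields four sets, matching the dataset printed by the doctest.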
- _conditionally_write_dataset(
- existing_parameter_study: Path,
- parameter_study: Dataset,
Write NetCDF file over previous study if the datasets have changed or self.overwrite is True.
- Parameters:
existing_parameter_study – A relative or absolute file path to a previously created parameter study Xarray Dataset
parameter_study – Parameter study xarray dataset
- _conditionally_write_yaml(output_file: str | Path, parameter_dictionary: dict) None
Write YAML file over previous study if the datasets have changed or self.overwrite is True.
- Parameters:
output_file – A relative or absolute file path to the output YAML file
parameter_dictionary – dictionary containing parameter set data
- _create_parameter_study() None
Create the standard structure for the parameter study dataset.
requires:
self._set_hashes: parameter set content hashes identifying rows of parameter study
self._parameter_names: parameter names used as columns of parameter study
self._samples: The parameter study samples. Rows are sets. Columns are parameters.
creates attribute:
self.parameter_study
- _create_set_hashes() None
Construct unique, repeatable parameter set content hashes from self._samples.
Creates an md5 hash from the concatenated string representation of parameter name:value associations.
requires:
self._samples: The parameter study samples. Rows are sets. Columns are parameters.
self._parameter_names: parameter names used as columns of parameter study
creates attribute:
self._set_hashes: parameter set content hashes identifying rows of parameter study
- _create_set_names() None
Construct parameter set names from the set name template and number of parameter sets in self._samples.
Creates the class attribute self._set_names required to populate the _generate() method's parameter study Xarray dataset object.
requires:
self._set_hashes: parameter set content hashes identifying rows of parameter study
self.set_name_template: Parameter set name template. Overridden by output_file_template, if provided
creates attribute:
self._set_names: Dictionary mapping parameter set hash to parameter set name
- _create_set_names_array() DataArray
Create an Xarray DataArray with the parameter set names using parameter set hashes as the coordinate.
- Returns:
set_names_array
- _generate(**kwargs) None[source]
Generate the parameter sets from the user provided parameter values.
- _merge_parameter_studies() None
Merge the current parameter study into a previous parameter study.
When merging across identical parameter spaces, this method preserves the previous parameter study's set name to set contents associations by dropping the new study's set names during the merge. If the parameter spaces are unique across studies, this method will use _propagate_parameter_space() to resolve the parameter spaces and break the set name to set contents associations of the previous study.
Resets attributes:
self.parameter_study
self._samples
self._set_hashes
self._set_names
- Raises:
RuntimeError – If the
self.previous_parameter_studyattribute is None
- _merge_set_names_array() None
Merge the parameter set names array into the parameter study dataset as a non-index coordinate.
- _parameter_study_to_numpy() ndarray
Return the parameter study data as a 2D numpy array.
- Returns:
data
- _scons_write(target: list, source: list, env: Base) None
SCons Python build function wrapper for the parameter generator’s write() function.
Reference: https://scons.org/doc/production/HTML/scons-user/ch17s04.html
Searches for the following keyword arguments in the task construction environment and passes them to the write function:
output_file_type
- Parameters:
target – The target file list of strings
source – The source file list of SCons.Node.FS.File objects
env (SCons.Script.SConscript.SConsEnvironment) – The builder’s SCons construction environment object
- _validate() None[source]
Validate the One-at-a-Time parameter schema. Executed by class initiation.
- _write(
- parameter_study_object: dict | Dataset,
- parameter_study_iterator: ItemsView | DatasetGroupBy,
- conditional_write_function: Callable[[Path, dict], None] | Callable[[Path, Dataset], None],
- dry_run: bool = False,
Write parameter study formatted output to STDOUT, separate set files, or a single file.
Behavior as specified in
waves.parameter_generators.ParameterGenerator.write()
- _write_meta() None
Write the parameter study meta data file.
The parameter study meta file is always overwritten. It should NOT be used to determine if the parameter study target or dependee is out-of-date. Parameter study file paths are written as absolute paths.
- parameter_study_to_dict() dict[str, dict[str, Any]]
Return parameter study as a dictionary.
Used for iterating on parameter sets in an SCons workflow with parameter substitution dictionaries, e.g.
>>> for set_name, parameters in parameter_generator.parameter_study_to_dict().items():
...     print(f"{set_name}: {parameters}")
...
parameter_set0: {'parameter_1': 1, 'parameter_2': 'a'}
parameter_set1: {'parameter_1': 1, 'parameter_2': 'b'}
parameter_set2: {'parameter_1': 2, 'parameter_2': 'a'}
parameter_set3: {'parameter_1': 2, 'parameter_2': 'b'}
- Returns:
parameter study sets and samples as a dictionary: {set_name: {parameter: value}, …}
- write(output_file_type: Literal['h5', 'yaml'] | None = None, dry_run: bool = False) None
Write the parameter study to STDOUT or an output file.
Writes to STDOUT by default. Requires a non-default output_file_template or output_file specification to write to files. If printing to STDOUT, print all parameter sets together. If printing to files, overwrite when the contents of existing files have changed. If overwrite is specified, overwrite all parameter set files. If a dry run is requested, print file-content associations for files that would have been written.
Writes parameter set files in YAML syntax by default. Output formatting is controlled by output_file_type.

parameter_1: 1
parameter_2: a
- Parameters:
output_file_type – Output file syntax or type. Options are: ‘yaml’, ‘h5’.
dry_run – Print contents of new parameter study output files to STDOUT and exit
- Raises:
waves.exceptions.ChoicesError – If an unsupported output file type is requested
- class waves.parameter_generators.ParameterGenerator(
- parameter_schema: dict,
- output_file_template: str | None = None,
- output_file: str | Path | None = None,
- output_file_type: Literal['h5', 'yaml'] = 'h5',
- set_name_template: str = 'parameter_set@number',
- previous_parameter_study: str | Path | None = None,
- require_previous_parameter_study: bool = False,
- overwrite: bool = False,
- write_meta: bool = False,
- **kwargs,
Bases:
ABC
Abstract base class for parameter study generators.
Parameters must be scalar valued integers, floats, strings, or booleans
- Parameters:
parameter_schema – The YAML loaded parameter study schema dictionary, e.g. {parameter_name: schema_value}. Validated on class instantiation.
output_file_template – Output file name template for multiple file output of the parameter study. Required if parameter sets will be written to files instead of printed to STDOUT. May contain pathseps for an absolute or relative path template. May contain the @number set number placeholder in the file basename but not in the path. If the placeholder is not found, it will be appended to the template string. Output files are overwritten if the content of the file has changed or if overwrite is True. output_file_template and output_file are mutually exclusive.
output_file – Output file name for single file output of the parameter study. Required if parameter sets will be written to a file instead of printed to STDOUT. May contain pathseps for an absolute or relative path. The output file is overwritten if the content of the file has changed or if overwrite is True. output_file and output_file_template are mutually exclusive.
output_file_type – Output file syntax or type. Options are: 'yaml', 'h5'.
set_name_template – Parameter set name template. Overridden by output_file_template, if provided.
previous_parameter_study – A relative or absolute file path to a previously created parameter study Xarray Dataset. If a previous parameter study exists, it is merged into the current study upon generation. Set name to content associations of the previous study are preserved when the parameter spaces between the previous and current study are identical. If the parameter spaces are unique, the current study will propagate the parameter spaces to resolve them. This will break set name to content associations of the previous study.
require_previous_parameter_study – Raise a RuntimeError if the previous parameter study file is missing.
overwrite – Overwrite existing output files
write_meta – Write a meta file named "parameter_study_meta.txt" containing the parameter set file names. Useful for command line execution with build systems that require an explicit file list for target creation.
- Variables:
self.parameter_study – The final parameter study XArray Dataset object
- Raises:
waves.exceptions.MutuallyExclusiveError – If the mutually exclusive output file template and output file options are both specified
waves.exceptions.APIError – If an unknown output file type is requested
RuntimeError – If a previous parameter study file is specified and missing, and require_previous_parameter_study is True
- _conditionally_write_dataset(
- existing_parameter_study: Path,
- parameter_study: Dataset,
Write NetCDF file over previous study if the datasets have changed or self.overwrite is True.
- Parameters:
existing_parameter_study – A relative or absolute file path to a previously created parameter study Xarray Dataset
parameter_study – Parameter study xarray dataset
- _conditionally_write_yaml(output_file: str | Path, parameter_dictionary: dict) None[source]
Write YAML file over previous study if the datasets have changed or self.overwrite is True.
- Parameters:
output_file – A relative or absolute file path to the output YAML file
parameter_dictionary – dictionary containing parameter set data
- _create_parameter_study() None[source]
Create the standard structure for the parameter study dataset.
requires:
self._set_hashes: parameter set content hashes identifying rows of parameter study
self._parameter_names: parameter names used as columns of parameter study
self._samples: The parameter study samples. Rows are sets. Columns are parameters.
creates attribute:
self.parameter_study
- _create_set_hashes() None[source]
Construct unique, repeatable parameter set content hashes from self._samples.
Creates an md5 hash from the concatenated string representation of parameter name:value associations.
requires:
self._samples: The parameter study samples. Rows are sets. Columns are parameters.
self._parameter_names: parameter names used as columns of parameter study
creates attribute:
self._set_hashes: parameter set content hashes identifying rows of parameter study
- _create_set_names() None[source]
Construct parameter set names from the set name template and number of parameter sets in self._samples.
Creates the class attribute self._set_names required to populate the _generate() method's parameter study Xarray dataset object.
requires:
self._set_hashes: parameter set content hashes identifying rows of parameter study
self.set_name_template: Parameter set name template. Overridden by output_file_template, if provided
creates attribute:
self._set_names: Dictionary mapping parameter set hash to parameter set name
- _create_set_names_array() DataArray[source]
Create an Xarray DataArray with the parameter set names using parameter set hashes as the coordinate.
- Returns:
set_names_array
- abstractmethod _generate(**kwargs) None[source]
Generate the parameter study definition.
All implemented class methods should accept kwargs as _generate(self, **kwargs). The ABC class accepts, but does not use, any kwargs.
Must set the class attributes:
self._samples: The parameter study samples. A 2D numpy array in the shape (number of parameter sets, number of parameters). If it's possible that the samples may be of mixed type, numpy.array(..., dtype=object) should be used to preserve the original Python types.
self._set_hashes: list of parameter set content hashes created by calling self._create_set_hashes after populating the self._samples parameter study values.
self._set_names: Dictionary mapping parameter set hash to parameter set name strings created by calling self._create_set_names after populating self._set_hashes.
self.parameter_study: The Xarray Dataset parameter study object, created by calling self._create_parameter_study() after defining self._samples.
Minimum necessary work example:
# Work unique to the parameter generator schema and set generation
set_count = 5  # Normally set according to the parameter schema
parameter_count = len(self._parameter_names)
self._samples = numpy.zeros((set_count, parameter_count))

# Work performed by common ABC methods
super()._generate()
- _merge_parameter_studies() None[source]
Merge the current parameter study into a previous parameter study.
When merging across identical parameter spaces, this method preserves the previous parameter study's set name to set contents associations by dropping the new study's set names during the merge. If the parameter spaces are unique across studies, this method will use _propagate_parameter_space() to resolve the parameter spaces and break the set name to set contents associations of the previous study.
Resets attributes:
self.parameter_study
self._samples
self._set_hashes
self._set_names
- Raises:
RuntimeError – If the self.previous_parameter_study attribute is None
- _merge_set_names_array() None[source]
Merge the parameter set names array into the parameter study dataset as a non-index coordinate.
- _parameter_study_to_numpy() ndarray[source]
Return the parameter study data as a 2D numpy array.
- Returns:
data
- _scons_write(target: list, source: list, env: Base) None[source]
SCons Python build function wrapper for the parameter generator’s write() function.
Reference: https://scons.org/doc/production/HTML/scons-user/ch17s04.html
Searches for the following keyword arguments in the task construction environment and passes them to the write function:
output_file_type
- Parameters:
target – The target file list of strings
source – The source file list of SCons.Node.FS.File objects
env (SCons.Script.SConscript.SConsEnvironment) – The builder’s SCons construction environment object
- abstractmethod _validate() None[source]
Process parameter study input to verify schema.
Must set the class attributes:
self._parameter_names: list of strings containing the parameter study’s parameter names
Minimum necessary work example:
# Work unique to the parameter generator schema. Example matches CartesianProduct schema.
self._parameter_names = list(self.parameter_schema.keys())
- _write(
- parameter_study_object: dict | Dataset,
- parameter_study_iterator: ItemsView | DatasetGroupBy,
- conditional_write_function: Callable[[Path, dict], None] | Callable[[Path, Dataset], None],
- dry_run: bool = False,
Write parameter study formatted output to STDOUT, separate set files, or a single file.
Behavior as specified in
waves.parameter_generators.ParameterGenerator.write()
- _write_meta() None[source]
Write the parameter study meta data file.
The parameter study meta file is always overwritten. It should NOT be used to determine if the parameter study target or dependee is out-of-date. Parameter study file paths are written as absolute paths.
- parameter_study_to_dict() dict[str, dict[str, Any]][source]
Return parameter study as a dictionary.
Used for iterating on parameter sets in an SCons workflow with parameter substitution dictionaries, e.g.
>>> for set_name, parameters in parameter_generator.parameter_study_to_dict().items():
...     print(f"{set_name}: {parameters}")
...
parameter_set0: {'parameter_1': 1, 'parameter_2': 'a'}
parameter_set1: {'parameter_1': 1, 'parameter_2': 'b'}
parameter_set2: {'parameter_1': 2, 'parameter_2': 'a'}
parameter_set3: {'parameter_1': 2, 'parameter_2': 'b'}
- Returns:
parameter study sets and samples as a dictionary: {set_name: {parameter: value}, …}
- write(output_file_type: Literal['h5', 'yaml'] | None = None, dry_run: bool = False) None[source]
Write the parameter study to STDOUT or an output file.
Writes to STDOUT by default. Requires non-default output_file_template or output_file specification to write to files.
If printing to STDOUT, print all parameter sets together. If printing to files, overwrite when contents of existing files have changed. If overwrite is specified, overwrite all parameter set files. If a dry run is requested, print file-content associations for files that would have been written.
Writes parameter set files in YAML syntax by default. Output formatting is controlled by output_file_type.
parameter_1: 1
parameter_2: a
- Parameters:
output_file_type – Output file syntax or type. Options are: ‘yaml’, ‘h5’.
dry_run – Print contents of new parameter study output files to STDOUT and exit
- Raises:
waves.exceptions.ChoicesError – If an unsupported output file type is requested
- class waves.parameter_generators.SALibSampler(sampler_class: str, *args, **kwargs)[source]
Bases:
ParameterGenerator, ABC
Builds a SALib sampler parameter study from a SALib.sample sampler_class.
Samplers must use the N sample count argument. Note that in SALib.sample, N is not always equivalent to the number of simulations. The following samplers are tested for parameter study shape and merge behavior:
fast_sampler
finite_diff
latin
sobol
morris
Warning
For small numbers of parameters, some SALib generators produce duplicate parameter sets. These duplicate sets are removed during parameter study generation. This may cause the SALib analyze method(s) to raise errors related to the expected parameter set count.
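The duplicate-set removal described in this warning can be sketched with numpy.unique over sample rows. This is an illustration of the behavior, not the WAVES implementation:

```python
import numpy

# Three candidate parameter sets, two of which are identical rows.
samples = numpy.array([[0.0, 1.0], [0.5, 0.5], [0.0, 1.0]])

# numpy.unique with axis=0 removes duplicate rows (and sorts them), so the
# resulting study can hold fewer sets than the sampler's N would suggest --
# which is why SALib analyze methods may complain about the set count.
unique_samples = numpy.unique(samples, axis=0)
print(unique_samples.shape)  # (2, 2)
```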
Warning
The merged parameter study feature does not check for consistent parameter distributions. Changing the parameter definitions and merging with a previous parameter study will result in incorrect relationships between parameter schema and the parameter study samples.
- Parameters:
sampler_class – The SALib.sample sampler class name. Case sensitive.
parameter_schema – The YAML loaded parameter study schema dictionary - {parameter_name: schema value}. SALibSampler expects “schema value” to be a dictionary with a strict structure and several required keys. Validated on class instantiation.
output_file_template – Output file name template for multiple file output of the parameter study. Required if parameter sets will be written to files instead of printed to STDOUT. May contain pathseps for an absolute or relative path template. May contain the @number set number placeholder in the file basename but not in the path. If the placeholder is not found it will be appended to the template string. Output files are overwritten if the content of the file has changed or if overwrite is True. output_file_template and output_file are mutually exclusive.
output_file – Output file name for single file output of the parameter study. Required if parameter sets will be written to a file instead of printed to STDOUT. May contain pathseps for an absolute or relative path. Output file is overwritten if the content of the file has changed or if overwrite is True. output_file and output_file_template are mutually exclusive.
output_file_type – Output file syntax or type. Options are: ‘yaml’, ‘h5’.
set_name_template – Parameter set name template. Overridden by output_file_template, if provided.
previous_parameter_study – A relative or absolute file path to a previously created parameter study Xarray Dataset. If a previous parameter study exists, it is merged into the current study upon generation. Set name to content associations of the previous study are preserved when the parameter spaces between the previous and current study are identical. If the parameter spaces are unique, the current study will propagate the parameter spaces to resolve them. This will break set name to content associations of the previous study.
overwrite – Overwrite existing output files
write_meta – Write a meta file named “parameter_study_meta.txt” containing the parameter set file names. Useful for command line execution with build systems that require an explicit file list for target creation.
kwargs – Any additional keyword arguments are passed through to the sampler method
- Variables:
self.parameter_study – The final parameter study Xarray Dataset object
- Raises:
waves.exceptions.MutuallyExclusiveError – If the mutually exclusive output file template and output file options are both specified
waves.exceptions.APIError – If an unknown output file type is requested
RuntimeError – If a previous parameter study file is specified and missing, and require_previous_parameter_study is True
waves.exceptions.SchemaValidationError –
If the SALib sobol or SALib morris sampler is specified and there are fewer than 2 parameters.
N is not a key of parameter_schema
problem is not a key of parameter_schema
names is not a key of parameter_schema['problem']
parameter_schema is not a dictionary
parameter_schema['N'] is not an integer
parameter_schema['problem'] is not a dictionary
parameter_schema['problem']['names'] is not a YAML compliant iterable (list, set, tuple)
Keyword arguments for the SALib.sample sampler_class sample method.
Example
>>> import waves
>>> parameter_schema = {
...     "N": 4,  # Required key. Value must be an integer.
...     "problem": {  # Required key. See the SALib sampler interface documentation
...         "num_vars": 3,
...         "names": ["parameter_1", "parameter_2", "parameter_3"],
...         "bounds": [[-1, 1], [-2, 2], [-3, 3]]
...     }
... }
>>> parameter_generator = waves.parameter_generators.SALibSampler("sobol", parameter_schema)
>>> print(parameter_generator.parameter_study)
<xarray.Dataset>
Dimensions:      (set_name: 32)
Coordinates:
    set_hash     (set_name) <U32 'e0cb1990f9d70070eaf5638101dcaf...
  * set_name     (set_name) <U15 'parameter_set0' ... 'parameter...
Data variables:
    parameter_1  (set_name) float64 -0.2029 ... 0.187
    parameter_2  (set_name) float64 -0.801 ... 0.6682
    parameter_3  (set_name) float64 0.4287 ... -2.871
- _conditionally_write_dataset(
- existing_parameter_study: Path,
- parameter_study: Dataset,
Write NetCDF file over previous study if the datasets have changed or self.overwrite is True.
- Parameters:
existing_parameter_study – A relative or absolute file path to a previously created parameter study Xarray Dataset
parameter_study – Parameter study xarray dataset
- _conditionally_write_yaml(output_file: str | Path, parameter_dictionary: dict) None
Write YAML file over previous study if the datasets have changed or self.overwrite is True.
- Parameters:
output_file – A relative or absolute file path to the output YAML file
parameter_dictionary – dictionary containing parameter set data
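The conditional overwrite behavior shared by the _conditionally_write_* methods can be sketched in a few lines. This is an illustrative stand-in, not the WAVES implementation; the function name and signature are hypothetical:

```python
import pathlib


def conditionally_write(output_file: pathlib.Path, text: str, overwrite: bool = False) -> bool:
    """Write ``text`` only when the file is missing, its contents differ, or overwrite is True."""
    if not overwrite and output_file.exists() and output_file.read_text() == text:
        # Unchanged contents: skip the write so the file timestamp stays stable,
        # which keeps build systems from re-triggering downstream tasks.
        return False
    output_file.write_text(text)
    return True
```

Skipping unchanged writes is what lets a build system such as SCons treat existing parameter set files as up-to-date targets.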
- _create_parameter_names() None[source]
Construct the parameter names from a distribution parameter schema.
- _create_parameter_study() None
Create the standard structure for the parameter study dataset.
requires:
self._set_hashes: parameter set content hashes identifying rows of parameter study
self._parameter_names: parameter names used as columns of parameter study
self._samples: The parameter study samples. Rows are sets. Columns are parameters.
creates attribute:
self.parameter_study
- _create_set_hashes() None
Construct unique, repeatable parameter set content hashes from self._samples.
Creates an md5 hash from the concatenated string representation of parameter name:value associations.
requires:
self._samples: The parameter study samples. Rows are sets. Columns are parameters.
self._parameter_names: parameter names used as columns of parameter study
creates attribute:
self._set_hashes: parameter set content hashes identifying rows of parameter study
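The hashing scheme described above, an md5 digest of concatenated name:value string representations, can be sketched as follows. The exact string formatting WAVES uses may differ; this only demonstrates the repeatable, content-based property:

```python
import hashlib


def set_hash(parameter_names, sample_row):
    # Concatenate the string representation of each name:value association,
    # then digest it so identical contents always map to the same hash.
    content = ", ".join(
        f"{name}: {value!r}" for name, value in zip(parameter_names, sample_row)
    )
    return hashlib.md5(content.encode()).hexdigest()


first = set_hash(["parameter_1", "parameter_2"], [1, "a"])
second = set_hash(["parameter_1", "parameter_2"], [1, "a"])
print(first == second, len(first))  # True 32
```

Because the hash depends only on set contents, re-generating a study with the same samples reproduces the same hashes, which is what makes merging with a previous study possible.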
- _create_set_names() None
Construct parameter set names from the set name template and number of parameter sets in self._samples.
Creates the class attribute self._set_names required to populate the _generate() method’s parameter study Xarray dataset object.
requires:
self._set_hashes: parameter set content hashes identifying rows of parameter study
self.set_name_template: Parameter set name template. Overridden by output_file_template, if provided
creates attribute:
self._set_names: Dictionary mapping parameter set hash to parameter set name
- _create_set_names_array() DataArray
Create an Xarray DataArray with the parameter set names using parameter set hashes as the coordinate.
- Returns:
set_names_array
- _generate(**kwargs) None[source]
Generate the SALib.sample sampler_class parameter sets.
- _merge_parameter_studies() None
Merge the current parameter study into a previous parameter study.
When merging across identical parameter spaces, preserves the previous parameter study set name to set contents associations by dropping the new studies’ set names during merge. If the parameter spaces are unique across studies, this method will use _propagate_parameter_space() to resolve the parameter spaces and break the set name to set contents associations of the previous study.
Resets attributes:
self.parameter_study
self._samples
self._set_hashes
self._set_names
- Raises:
RuntimeError – If the self.previous_parameter_study attribute is None
- _merge_set_names_array() None
Merge the parameter set names array into the parameter study dataset as a non-index coordinate.
- _parameter_study_to_numpy() ndarray
Return the parameter study data as a 2D numpy array.
- Returns:
data
- _sampler_overrides(override_kwargs: dict | None = None) dict[source]
Provide sampler specific kwarg override dictionaries.
sobol produces duplicate parameter sets for two parameters when calc_second_order is True. Override this kwarg to be False if there are only two parameters.
- Parameters:
override_kwargs – any common kwargs to include in the override dictionary
- Returns:
override kwarg dictionary
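The override logic described here can be sketched as a standalone function. The name and signature below are hypothetical illustrations, not the private WAVES API:

```python
def sampler_overrides(sampler_class, parameter_names, override_kwargs=None):
    # Start from any caller-provided common kwargs.
    overrides = dict(override_kwargs) if override_kwargs else {}
    # sobol duplicates parameter sets for exactly two parameters when
    # calc_second_order is True, so force it off in that case.
    if sampler_class == "sobol" and len(parameter_names) == 2:
        overrides["calc_second_order"] = False
    return overrides


print(sampler_overrides("sobol", ["a", "b"]))  # {'calc_second_order': False}
```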
- _sampler_validation() None[source]
Call sampler-specific schema validation check methods.
sobol requires at least two parameters
Requires attributes:
self._sampler_class set by class initiation
self._parameter_names set by self._create_parameter_names()
- Raises:
waves.exceptions.SchemaValidationError – A sobol or morris sampler contains fewer than two parameters
- _scons_write(target: list, source: list, env: Base) None
SCons Python build function wrapper for the parameter generator’s write() function.
Reference: https://scons.org/doc/production/HTML/scons-user/ch17s04.html
Searches for the following keyword arguments in the task construction environment and passes them to the write function:
output_file_type
- Parameters:
target – The target file list of strings
source – The source file list of SCons.Node.FS.File objects
env (SCons.Script.SConscript.SConsEnvironment) – The builder’s SCons construction environment object
- _validate() None[source]
Process parameter study input to verify schema.
Must set the class attributes:
self._parameter_names: list of strings containing the parameter study’s parameter names
Minimum necessary work example:
# Work unique to the parameter generator schema. Example matches CartesianProduct schema.
self._parameter_names = list(self.parameter_schema.keys())
- _write(
- parameter_study_object: dict | Dataset,
- parameter_study_iterator: ItemsView | DatasetGroupBy,
- conditional_write_function: Callable[[Path, dict], None] | Callable[[Path, Dataset], None],
- dry_run: bool = False,
Write parameter study formatted output to STDOUT, separate set files, or a single file.
Behavior as specified in
waves.parameter_generators.ParameterGenerator.write()
- _write_meta() None
Write the parameter study meta data file.
The parameter study meta file is always overwritten. It should NOT be used to determine if the parameter study target or dependee is out-of-date. Parameter study file paths are written as absolute paths.
- parameter_study_to_dict() dict[str, dict[str, Any]]
Return parameter study as a dictionary.
Used for iterating on parameter sets in an SCons workflow with parameter substitution dictionaries, e.g.
>>> for set_name, parameters in parameter_generator.parameter_study_to_dict().items():
...     print(f"{set_name}: {parameters}")
...
parameter_set0: {'parameter_1': 1, 'parameter_2': 'a'}
parameter_set1: {'parameter_1': 1, 'parameter_2': 'b'}
parameter_set2: {'parameter_1': 2, 'parameter_2': 'a'}
parameter_set3: {'parameter_1': 2, 'parameter_2': 'b'}
- Returns:
parameter study sets and samples as a dictionary: {set_name: {parameter: value}, …}
- write(output_file_type: Literal['h5', 'yaml'] | None = None, dry_run: bool = False) None
Write the parameter study to STDOUT or an output file.
Writes to STDOUT by default. Requires non-default output_file_template or output_file specification to write to files.
If printing to STDOUT, print all parameter sets together. If printing to files, overwrite when contents of existing files have changed. If overwrite is specified, overwrite all parameter set files. If a dry run is requested, print file-content associations for files that would have been written.
Writes parameter set files in YAML syntax by default. Output formatting is controlled by output_file_type.
parameter_1: 1
parameter_2: a
- Parameters:
output_file_type – Output file syntax or type. Options are: ‘yaml’, ‘h5’.
dry_run – Print contents of new parameter study output files to STDOUT and exit
- Raises:
waves.exceptions.ChoicesError – If an unsupported output file type is requested
- waves.parameter_generators.SET_COORDINATE_KEY: Final[str] = 'set_name'
The set name coordinate used in WAVES parameter study Xarray Datasets
- class waves.parameter_generators.ScipySampler(sampler_class: str, *args, **kwargs)[source]
Bases:
_ScipyGenerator
Builds a scipy sampler parameter study from a scipy.stats.qmc sampler_class.
Samplers must use the d parameter space dimension keyword argument. The following samplers are tested for parameter study shape and merge behavior:
Halton
LatinHypercube
PoissonDisk
Sobol
Warning
The merged parameter study feature does not check for consistent parameter distributions. Changing the parameter definitions and merging with a previous parameter study will result in incorrect relationships between parameter schema and the parameter study samples.
- Parameters:
sampler_class – The scipy.stats.qmc sampler class name. Case sensitive.
parameter_schema – The YAML loaded parameter study schema dictionary - {parameter_name: schema value}. ScipySampler expects “schema value” to be a dictionary with a strict structure and several required keys. Validated on class instantiation.
output_file_template – Output file name template for multiple file output of the parameter study. Required if parameter sets will be written to files instead of printed to STDOUT. May contain pathseps for an absolute or relative path template. May contain the @number set number placeholder in the file basename but not in the path. If the placeholder is not found it will be appended to the template string. Output files are overwritten if the content of the file has changed or if overwrite is True. output_file_template and output_file are mutually exclusive.
output_file – Output file name for single file output of the parameter study. Required if parameter sets will be written to a file instead of printed to STDOUT. May contain pathseps for an absolute or relative path. Output file is overwritten if the content of the file has changed or if overwrite is True. output_file and output_file_template are mutually exclusive.
output_file_type – Output file syntax or type. Options are: ‘yaml’, ‘h5’.
set_name_template – Parameter set name template. Overridden by output_file_template, if provided.
previous_parameter_study – A relative or absolute file path to a previously created parameter study Xarray Dataset. If a previous parameter study exists, it is merged into the current study upon generation. Set name to content associations of the previous study are preserved when the parameter spaces between the previous and current study are identical. If the parameter spaces are unique, the current study will propagate the parameter spaces to resolve them. This will break set name to content associations of the previous study.
overwrite – Overwrite existing output files
write_meta – Write a meta file named “parameter_study_meta.txt” containing the parameter set file names. Useful for command line execution with build systems that require an explicit file list for target creation.
kwargs – Any additional keyword arguments are passed through to the sampler method
- Variables:
self.parameter_distributions – A dictionary mapping parameter names to the scipy.stats distribution
self.parameter_study – The final parameter study Xarray Dataset object
- Raises:
waves.exceptions.MutuallyExclusiveError – If the mutually exclusive output file template and output file options are both specified
waves.exceptions.APIError – If an unknown output file type is requested
RuntimeError – If a previous parameter study file is specified and missing, and require_previous_parameter_study is True
Keyword arguments for the scipy.stats.qmc sampler_class. The d keyword argument is internally managed and will be overwritten to match the number of parameters defined in the parameter schema.
Example:
>>> import waves
>>> parameter_schema = {
...     'num_simulations': 4,  # Required key. Value must be an integer.
...     'parameter_1': {
...         'distribution': 'norm',  # Required key. Value must be a valid scipy.stats
...         'loc': 50,               # distribution name.
...         'scale': 1
...     },
...     'parameter_2': {
...         'distribution': 'skewnorm',
...         'a': 4,
...         'loc': 30,
...         'scale': 2
...     }
... }
>>> parameter_generator = waves.parameter_generators.ScipySampler("LatinHypercube", parameter_schema)
>>> print(parameter_generator.parameter_study)
<xarray.Dataset>
Dimensions:      (set_hash: 4)
Coordinates:
    set_hash     (set_hash) <U32 '1e8219dae27faa5388328e225a...
  * set_name     (set_hash) <U14 'parameter_set0' ... 'param...
Data variables:
    parameter_1  (set_hash) float64 0.125 ... 51.15
    parameter_2  (set_hash) float64 0.625 ... 30.97
- _conditionally_write_dataset(
- existing_parameter_study: Path,
- parameter_study: Dataset,
Write NetCDF file over previous study if the datasets have changed or self.overwrite is True.
- Parameters:
existing_parameter_study – A relative or absolute file path to a previously created parameter study Xarray Dataset
parameter_study – Parameter study xarray dataset
- _conditionally_write_yaml(output_file: str | Path, parameter_dictionary: dict) None
Write YAML file over previous study if the datasets have changed or self.overwrite is True.
- Parameters:
output_file – A relative or absolute file path to the output YAML file
parameter_dictionary – dictionary containing parameter set data
- _create_parameter_names() None
Construct the parameter names from a distribution parameter schema.
- _create_parameter_study() None
Create the standard structure for the parameter study dataset.
requires:
self._set_hashes: parameter set content hashes identifying rows of parameter study
self._parameter_names: parameter names used as columns of parameter study
self._samples: The parameter study samples. Rows are sets. Columns are parameters.
creates attribute:
self.parameter_study
- _create_set_hashes() None
Construct unique, repeatable parameter set content hashes from self._samples.
Creates an md5 hash from the concatenated string representation of parameter name:value associations.
requires:
self._samples: The parameter study samples. Rows are sets. Columns are parameters.
self._parameter_names: parameter names used as columns of parameter study
creates attribute:
self._set_hashes: parameter set content hashes identifying rows of parameter study
- _create_set_names() None
Construct parameter set names from the set name template and number of parameter sets in self._samples.
Creates the class attribute self._set_names required to populate the _generate() method’s parameter study Xarray dataset object.
requires:
self._set_hashes: parameter set content hashes identifying rows of parameter study
self.set_name_template: Parameter set name template. Overridden by output_file_template, if provided
creates attribute:
self._set_names: Dictionary mapping parameter set hash to parameter set name
- _create_set_names_array() DataArray
Create an Xarray DataArray with the parameter set names using parameter set hashes as the coordinate.
- Returns:
set_names_array
- _generate(**kwargs) None[source]
Generate the scipy.stats.qmc sampler_class parameter sets.
- _generate_distribution_samples(
- sampler: Halton | LatinHypercube | PoissonDisk | Sobol,
- set_count: int,
- parameter_count: int,
Create parameter distribution samples.
Requires attributes:
self.parameter_distributions: dictionary containing the {parameter name: scipy.stats distribution} defined by the parameter schema. Set by waves.parameter_generators._ScipyGenerator._generate_parameter_distributions().
Sets attribute(s):
self._samples: The parameter study samples. A 2D numpy array in the shape (number of parameter sets, number of parameters).
- _generate_parameter_distributions() dict
Return dictionary containing the {parameter name: scipy.stats distribution} defined by the parameter schema.
- Returns:
parameter_distributions
- _merge_parameter_studies() None
Merge the current parameter study into a previous parameter study.
When merging across identical parameter spaces, preserves the previous parameter study set name to set contents associations by dropping the new studies’ set names during merge. If the parameter spaces are unique across studies, this method will use _propagate_parameter_space() to resolve the parameter spaces and break the set name to set contents associations of the previous study.
Resets attributes:
self.parameter_study
self._samples
self._set_hashes
self._set_names
- Raises:
RuntimeError – If the self.previous_parameter_study attribute is None
- _merge_set_names_array() None
Merge the parameter set names array into the parameter study dataset as a non-index coordinate.
- _parameter_study_to_numpy() ndarray
Return the parameter study data as a 2D numpy array.
- Returns:
data
- _scons_write(target: list, source: list, env: Base) None
SCons Python build function wrapper for the parameter generator’s write() function.
Reference: https://scons.org/doc/production/HTML/scons-user/ch17s04.html
Searches for the following keyword arguments in the task construction environment and passes them to the write function:
output_file_type
- Parameters:
target – The target file list of strings
source – The source file list of SCons.Node.FS.File objects
env (SCons.Script.SConscript.SConsEnvironment) – The builder’s SCons construction environment object
- _validate() None
Validate the parameter distribution schema. Executed by class initiation.
parameter_schema = {
    'num_simulations': 4,  # Required key. Value must be an integer.
    'parameter_1': {
        'distribution': 'norm',  # Required key. Value must be a valid scipy.stats
        'loc': 50,               # distribution name.
        'scale': 1
    },
    'parameter_2': {
        'distribution': 'skewnorm',
        'a': 4,
        'loc': 30,
        'scale': 2
    }
}
- Raises:
waves.exceptions.SchemaValidationError –
Parameter schema is not a dictionary
Parameter schema num_simulations key is not an integer
Parameter definition distribution value is not a valid Python identifier
Parameter definition key(s) is not a valid Python identifier
Parameter schema does not have a num_simulations key
Parameter definition does not contain a distribution key
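The schema checks listed above can be sketched as a standalone validator. This is illustrative only: WAVES raises waves.exceptions.SchemaValidationError and its internal checks may differ; the sketch uses plain ValueError:

```python
def validate_schema(parameter_schema):
    # Mirror the documented checks: top-level structure first, then each
    # parameter definition.
    if not isinstance(parameter_schema, dict):
        raise ValueError("Parameter schema is not a dictionary")
    if "num_simulations" not in parameter_schema:
        raise ValueError("Parameter schema does not have a num_simulations key")
    if not isinstance(parameter_schema["num_simulations"], int):
        raise ValueError("num_simulations key is not an integer")
    for name, definition in parameter_schema.items():
        if name == "num_simulations":
            continue
        if not name.isidentifier():
            raise ValueError(f"Parameter key {name!r} is not a valid Python identifier")
        if "distribution" not in definition:
            raise ValueError(f"Parameter {name!r} does not contain a distribution key")
        if not definition["distribution"].isidentifier():
            raise ValueError(f"Parameter {name!r} distribution value is not a valid Python identifier")


validate_schema({"num_simulations": 4, "parameter_1": {"distribution": "norm", "loc": 50, "scale": 1}})
```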
- _write(
- parameter_study_object: dict | Dataset,
- parameter_study_iterator: ItemsView | DatasetGroupBy,
- conditional_write_function: Callable[[Path, dict], None] | Callable[[Path, Dataset], None],
- dry_run: bool = False,
Write parameter study formatted output to STDOUT, separate set files, or a single file.
Behavior as specified in
waves.parameter_generators.ParameterGenerator.write()
- _write_meta() None
Write the parameter study meta data file.
The parameter study meta file is always overwritten. It should NOT be used to determine if the parameter study target or dependee is out-of-date. Parameter study file paths are written as absolute paths.
- parameter_study_to_dict() dict[str, dict[str, Any]]
Return parameter study as a dictionary.
Used for iterating on parameter sets in an SCons workflow with parameter substitution dictionaries, e.g.
>>> for set_name, parameters in parameter_generator.parameter_study_to_dict().items():
...     print(f"{set_name}: {parameters}")
...
parameter_set0: {'parameter_1': 1, 'parameter_2': 'a'}
parameter_set1: {'parameter_1': 1, 'parameter_2': 'b'}
parameter_set2: {'parameter_1': 2, 'parameter_2': 'a'}
parameter_set3: {'parameter_1': 2, 'parameter_2': 'b'}
- Returns:
parameter study sets and samples as a dictionary: {set_name: {parameter: value}, …}
- write(output_file_type: Literal['h5', 'yaml'] | None = None, dry_run: bool = False) None
Write the parameter study to STDOUT or an output file.
Writes to STDOUT by default. Requires non-default output_file_template or output_file specification to write to files.
If printing to STDOUT, print all parameter sets together. If printing to files, overwrite when contents of existing files have changed. If overwrite is specified, overwrite all parameter set files. If a dry run is requested, print file-content associations for files that would have been written.
Writes parameter set files in YAML syntax by default. Output formatting is controlled by output_file_type.
parameter_1: 1
parameter_2: a
- Parameters:
output_file_type – Output file syntax or type. Options are: ‘yaml’, ‘h5’.
dry_run – Print contents of new parameter study output files to STDOUT and exit
- Raises:
waves.exceptions.ChoicesError – If an unsupported output file type is requested
- class waves.parameter_generators.SobolSequence(*args, **kwargs)[source]
Bases:
_ScipyGenerator
Builds a Sobol sequence parameter study from the scipy Sobol class random method.
Warning
The merged parameter study feature does not check for consistent parameter distributions. Changing the parameter definitions and merging with a previous parameter study will result in incorrect relationships between parameter schema and the parameter study samples.
- Parameters:
parameter_schema – The YAML loaded parameter study schema dictionary - {parameter_name: schema value}. SobolSequence expects “schema value” to be a dictionary with a strict structure and several required keys. Validated on class instantiation.
output_file_template – Output file name template for multiple file output of the parameter study. Required if parameter sets will be written to files instead of printed to STDOUT. May contain pathseps for an absolute or relative path template. May contain the @number set number placeholder in the file basename but not in the path. If the placeholder is not found it will be appended to the template string. Output files are overwritten if the content of the file has changed or if overwrite is True. output_file_template and output_file are mutually exclusive.
output_file – Output file name for single file output of the parameter study. Required if parameter sets will be written to a file instead of printed to STDOUT. May contain pathseps for an absolute or relative path. Output file is overwritten if the content of the file has changed or if overwrite is True. output_file and output_file_template are mutually exclusive.
output_file_type – Output file syntax or type. Options are: ‘yaml’, ‘h5’.
set_name_template – Parameter set name template. Overridden by output_file_template, if provided.
previous_parameter_study – A relative or absolute file path to a previously created parameter study Xarray Dataset. If a previous parameter study exists, it is merged into the current study upon generation. Set name to content associations of the previous study are preserved when the parameter spaces between the previous and current study are identical. If the parameter spaces are unique, the current study will propagate the parameter spaces to resolve them. This will break set name to content associations of the previous study.
overwrite – Overwrite existing output files
write_meta – Write a meta file named “parameter_study_meta.txt” containing the parameter set file names. Useful for command line execution with build systems that require an explicit file list for target creation.
kwargs – Any additional keyword arguments are passed through to the sampler method
- Variables:
self.parameter_distributions – A dictionary mapping parameter names to the scipy.stats distribution
self.parameter_study – The final parameter study Xarray Dataset object
- Raises:
waves.exceptions.MutuallyExclusiveError – If the mutually exclusive output file template and output file options are both specified
waves.exceptions.APIError – If an unknown output file type is requested
RuntimeError – If a previous parameter study file is specified and missing, and require_previous_parameter_study is True
To produce consistent Sobol sequences on repeat instantiations, the **kwargs must include either scramble=False or seed=<int>. See the scipy.stats.qmc.Sobol class documentation for details. The d keyword argument is internally managed and will be overwritten to match the number of parameters defined in the parameter schema.
Example:
>>> import waves
>>> parameter_schema = {
...     'num_simulations': 4,  # Required key. Value must be an integer.
...     'parameter_1': {
...         'distribution': 'uniform',  # Required key. Value must be a valid scipy.stats
...         'loc': 0,                   # distribution name.
...         'scale': 10
...     },
...     'parameter_2': {
...         'distribution': 'uniform',
...         'loc': 2,
...         'scale': 3
...     }
... }
>>> parameter_generator = waves.parameter_generators.SobolSequence(parameter_schema)
>>> print(parameter_generator.parameter_study)
<xarray.Dataset>
Dimensions:      (set_name: 4)
Coordinates:
    set_hash     (set_name) <U32 'c1fa74da12c0991379d1df6541c421...
  * set_name     (set_name) <U14 'parameter_set0' ... 'parameter...
Data variables:
    parameter_1  (set_name) float64 0.0 0.5 ... 7.5 2.5
    parameter_2  (set_name) float64 0.0 0.5 ... 4.25
- _conditionally_write_dataset(
- existing_parameter_study: Path,
- parameter_study: Dataset,
Write NetCDF file over previous study if the datasets have changed or self.overwrite is True.
- Parameters:
existing_parameter_study – A relative or absolute file path to a previously created parameter study Xarray Dataset
parameter_study – Parameter study xarray dataset
- _conditionally_write_yaml(output_file: str | Path, parameter_dictionary: dict) None
Write YAML file over previous study if the datasets have changed or self.overwrite is True.
- Parameters:
output_file – A relative or absolute file path to the output YAML file
parameter_dictionary – dictionary containing parameter set data
- _create_parameter_names() None
Construct the parameter names from a distribution parameter schema.
- _create_parameter_study() None
Create the standard structure for the parameter study dataset.
requires:
self._set_hashes: parameter set content hashes identifying rows of parameter study
self._parameter_names: parameter names used as columns of parameter study
self._samples: The parameter study samples. Rows are sets. Columns are parameters.
creates attribute:
self.parameter_study
- _create_set_hashes() None
Construct unique, repeatable parameter set content hashes from
self._samples.
Creates an md5 hash from the concatenated string representation of parameter name:value associations.
requires:
self._samples: The parameter study samples. Rows are sets. Columns are parameters.
self._parameter_names: parameter names used as columns of parameter study
creates attribute:
self._set_hashes: parameter set content hashes identifying rows of parameter study
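A minimal sketch of the content-hashing idea described above. The exact "name:value" join template is an assumption for illustration, not the WAVES internal format; the point is that identical set contents always yield identical, order-stable hashes.

```python
import hashlib


def set_content_hash(parameter_names, sample_row):
    # Concatenate "name:value" pairs (the join format here is an assumed
    # illustration) and md5 the result so that identical parameter set
    # contents always produce identical, repeatable hashes.
    contents = ", ".join(
        f"{name}:{value!r}" for name, value in zip(parameter_names, sample_row)
    )
    return hashlib.md5(contents.encode("utf-8")).hexdigest()
```

Because the hash depends only on set contents, re-running a study with unchanged samples reproduces the same hashes, which is what allows set name to content associations to survive merges.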
- _create_set_names() None
Construct parameter set names from the set name template and number of parameter sets in
self._samples.
Creates the class attribute self._set_names required to populate the _generate() method's parameter study Xarray dataset object.
requires:
self._set_hashes: parameter set content hashes identifying rows of parameter study
self.set_name_template: Parameter set name template. Overridden by output_file_template, if provided
creates attribute:
self._set_names: Dictionary mapping parameter set hash to parameter set name
- _create_set_names_array() DataArray
Create an Xarray DataArray with the parameter set names using parameter set hashes as the coordinate.
- Returns:
set_names_array
- _generate(**kwargs) None[source]
Generate the parameter study dataset from the user provided parameter array.
- _generate_distribution_samples(
- sampler: Halton | LatinHypercube | PoissonDisk | Sobol,
- set_count: int,
- parameter_count: int,
Create parameter distribution samples.
Requires attributes:
self.parameter_distributions: dictionary containing the {parameter name: scipy.stats distribution} defined by the parameter schema. Set by waves.parameter_generators._ScipyGenerator._generate_parameter_distributions().
Sets attribute(s):
self._samples: The parameter study samples. A 2D numpy array in the shape (number of parameter sets, number of parameters).
- _generate_parameter_distributions() dict
Return dictionary containing the {parameter name: scipy.stats distribution} defined by the parameter schema.
- Returns:
parameter_distributions
- _merge_parameter_studies() None
Merge the current parameter study into a previous parameter study.
When merging across identical parameter spaces, preserves the previous parameter study set name to set contents associations by dropping the new studies’ set names during merge. If the parameter spaces are unique across studies, this method will use
_propagate_parameter_space() to resolve the parameter spaces and break the set name to set contents associations of the previous study.
Resets attributes:
self.parameter_study
self._samples
self._set_hashes
self._set_names
- Raises:
RuntimeError – If the
self.previous_parameter_study attribute is None
- _merge_set_names_array() None
Merge the parameter set names array into the parameter study dataset as a non-index coordinate.
- _parameter_study_to_numpy() ndarray
Return the parameter study data as a 2D numpy array.
- Returns:
data
- _scons_write(target: list, source: list, env: Base) None
SCons Python build function wrapper for the parameter generator’s write() function.
Reference: https://scons.org/doc/production/HTML/scons-user/ch17s04.html
Searches for following keyword arguments in the task construction environment and passes to the write function:
output_file_type
- Parameters:
target – The target file list of strings
source – The source file list of SCons.Node.FS.File objects
env (SCons.Script.SConscript.SConsEnvironment) – The builder’s SCons construction environment object
- _validate() None
Validate the parameter distribution schema. Executed during class instantiation.
parameter_schema = {
    'num_simulations': 4,  # Required key. Value must be an integer.
    'parameter_1': {
        'distribution': 'norm',  # Required key. Value must be a valid scipy.stats distribution name.
        'loc': 50,
        'scale': 1
    },
    'parameter_2': {
        'distribution': 'skewnorm',
        'a': 4,
        'loc': 30,
        'scale': 2
    }
}
- Raises:
waves.exceptions.SchemaValidationError –
Parameter schema is not a dictionary
Parameter schema num_simulations key is not an integer
Parameter definition distribution value is not a valid Python identifier
Parameter definition key(s) is not a valid Python identifier
Parameter schema does not have a num_simulations key
Parameter definition does not contain a distribution key
- _write(
- parameter_study_object: dict | Dataset,
- parameter_study_iterator: ItemsView | DatasetGroupBy,
- conditional_write_function: Callable[[Path, dict], None] | Callable[[Path, Dataset], None],
- dry_run: bool = False,
Write parameter study formatted output to STDOUT, separate set files, or a single file.
Behavior as specified in
waves.parameter_generators.ParameterGenerator.write()
- _write_meta() None
Write the parameter study meta data file.
The parameter study meta file is always overwritten. It should NOT be used to determine if the parameter study target or dependee is out-of-date. Parameter study file paths are written as absolute paths.
- parameter_study_to_dict() dict[str, dict[str, Any]]
Return parameter study as a dictionary.
Used for iterating on parameter sets in an SCons workflow with parameter substitution dictionaries, e.g.
>>> for set_name, parameters in parameter_generator.parameter_study_to_dict().items(): ... print(f"{set_name}: {parameters}") ... parameter_set0: {'parameter_1': 1, 'parameter_2': 'a'} parameter_set1: {'parameter_1': 1, 'parameter_2': 'b'} parameter_set2: {'parameter_1': 2, 'parameter_2': 'a'} parameter_set3: {'parameter_1': 2, 'parameter_2': 'b'}
- Returns:
parameter study sets and samples as a dictionary: {set_name: {parameter: value}, …}
- write(output_file_type: Literal['h5', 'yaml'] | None = None, dry_run: bool = False) None
Write the parameter study to STDOUT or an output file.
Writes to STDOUT by default. Requires non-default
output_file_template or output_file specification to write to files.
If printing to STDOUT, print all parameter sets together. If printing to files, overwrite when contents of existing files have changed. If overwrite is specified, overwrite all parameter set files. If a dry run is requested, print file-content associations for files that would have been written.
Writes parameter set files in YAML syntax by default. Output formatting is controlled by
output_file_type.
parameter_1: 1
parameter_2: a
- Parameters:
output_file_type – Output file syntax or type. Options are: ‘yaml’, ‘h5’.
dry_run – Print contents of new parameter study output files to STDOUT and exit
- Raises:
waves.exceptions.ChoicesError – If an unsupported output file type is requested
Quantity of Interest (QOI)#
Quantity of Interest (QOI) tools.
Warning
This module is considered experimental pending early adopter end user trial and feedback.
The QOI xarray data array and dataset handling of create_qoi() and create_qoi_set() should be stable, but
the output plotting and reporting formatting is subject to change.
- waves.qoi.create_qoi(
- name: str,
- calculated: float = nan,
- expected: float = nan,
- lower_rtol: float = nan,
- upper_rtol: float = nan,
- lower_atol: float = nan,
- upper_atol: float = nan,
- lower_limit: float = nan,
- upper_limit: float = nan,
- **attrs,
Create a QOI DataArray.
If you create a QOI with calculated values, and a separate QOI with only expected values, you can combine them with
xarray.merge([calculated, expected]).
- Parameters:
name – QOI name
calculated, expected – Calculated and expected QOI values, respectively.
lower_rtol, upper_rtol, lower_atol, upper_atol, lower_limit, upper_limit – Tolerance values which set the acceptable range of calculated QOI values. Any or all of these tolerances may be specified. If lower_rtol or upper_rtol are specified, expected must also be specified. The calculated QOI value will be considered within tolerance if it is greater than or equal to max((lower_limit, expected + lower_atol, expected + abs(expected * lower_rtol))) and less than or equal to min((upper_limit, expected + upper_atol, expected + abs(expected * upper_rtol))). Unspecified tolerances are not considered in the tolerance check. If no tolerances are specified, the calculated QOI will always be considered within tolerance.
attrs – Attributes to associate with the QOI. Recommended attributes are: group, units, description, long_name, version. Together name and attrs['group'] should distinguish each QOI from every other QOI in the Mod/Sim repository. In other words, group should be as specific as possible, e.g., “Local Test XYZ Assembly Preload” instead of just “Preload”.
- Returns:
QOI
Example:
>>> load = waves.qoi.create_qoi( ... name="load", ... calculated=5.0, ... units="N", ... long_name="Axial Load", ... description="Axial load through component XYZ", ... group="Assembly ABC Preload", ... version="abcdef", ... date="2025-01-01", ... ) >>> load <xarray.DataArray 'load' (value_type: 4)> Size: 32B array([ 5., nan, nan, nan]) Coordinates: * value_type (value_type) <U11 176B 'calculated' 'expected' ... 'upper_limit' Attributes: units: N long_name: Axial Load description: Axial load through component XYZ group: Assembly ABC Preload version: abcdef date: 2025-01-01
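The tolerance logic in the parameter table above can be sketched as a standalone helper. `within_tolerance` is a hypothetical function, not part of the waves.qoi API; NaN marks an unspecified tolerance, matching the create_qoi defaults, and the bound expressions mirror the docstring.

```python
import math


def within_tolerance(calculated, expected=math.nan,
                     lower_rtol=math.nan, upper_rtol=math.nan,
                     lower_atol=math.nan, upper_atol=math.nan,
                     lower_limit=math.nan, upper_limit=math.nan):
    """Return True when ``calculated`` falls inside the documented bounds.

    Unspecified (NaN) tolerances drop out of the check, so a QOI with no
    tolerances is always within tolerance.
    """
    lower_candidates = [lower_limit, expected + lower_atol,
                        expected + abs(expected * lower_rtol)]
    upper_candidates = [upper_limit, expected + upper_atol,
                        expected + abs(expected * upper_rtol)]
    # NaN means "not specified": exclude it from the max/min reductions
    lower_candidates = [value for value in lower_candidates if not math.isnan(value)]
    upper_candidates = [value for value in upper_candidates if not math.isnan(value)]
    lower = max(lower_candidates) if lower_candidates else -math.inf
    upper = min(upper_candidates) if upper_candidates else math.inf
    return lower <= calculated <= upper
```

With the CSV example values further below (calculated 5.0, expected 4.5, limits 3.5 and 5.5), the load QOI would pass this check.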
- waves.qoi.create_qoi_set(qois: Iterable[DataArray]) Dataset[source]
Create a QOI dataset containing multiple QOIs from a single simulation.
This operation combines multiple QOIs (
xarray.DataArray objects) into a single “QOI Set” (xarray.Dataset) using xarray.merge(). This results in an xarray.Dataset with each QOI represented as a separate data variable. Each QOI must have a unique name. If multiple QOIs share a dimension name then that dimension will be merged across QOIs. This can lead to sparse Datasets if the dimension’s index values differ. To avoid this, use unique dimension names or avoid combining those QOIs into the same QOI set. No attributes will be set at the top-level Dataset, but QOI attributes will be preserved at the data variable level.
- Parameters:
qois – Sequence of QOIs.
- Returns:
QOI Set containing each QOI as a separate data variable.
Example:
>>> load = waves.qoi.create_qoi( ... name="load", ... calculated=5.0, ... units="N", ... long_name="Axial Load", ... description="Axial load through component XYZ", ... group="Assembly ABC Preload", ... version="abcdef", ... date="2025-01-01", ... ) ... gap = waves.qoi.create_qoi( ... name="gap", ... calculated=1.0, ... units="mm", ... long_name="Radial gap", ... description="Radial gap between components A and B", ... group="Assembly ABC Preload", ... version="abcdef", ... date="2025-01-01", ... ) ... ... # Combine QOIs into calculated QOIs set ... simulation_1_qois = waves.qoi.create_qoi_set((load, gap)) ... simulation_1_qois <xarray.Dataset> Size: 240B Dimensions: (value_type: 4) Coordinates: * value_type (value_type) <U11 176B 'calculated' 'expected' ... 'upper_limit' Data variables: load (value_type) float64 32B 5.0 nan nan nan gap (value_type) float64 32B 1.0 nan nan nan >>> simulation_1_qois["load"] <xarray.DataArray 'load' (value_type: 4)> Size: 32B array([ 5., nan, nan, nan]) Coordinates: * value_type (value_type) <U11 176B 'calculated' 'expected' ... 'upper_limit' Attributes: units: N long_name: Axial Load description: Axial load through component XYZ group: Assembly ABC Preload version: abcdef date: 2025-01-01
- waves.qoi.write_qoi_set_to_csv(qoi_set: Dataset, output: Path) None[source]
Write a QOI Dataset to a CSV file.
- Parameters:
qoi_set – QOI set.
output – Output CSV file.
Example:
>>> simulation_1_qois <xarray.Dataset> Size: 240B Dimensions: (value_type: 4) Coordinates: * value_type (value_type) <U11 176B 'calculated' 'expected' ... 'upper_limit' Data variables: load (value_type) float64 32B 5.0 4.5 3.5 5.5 gap (value_type) float64 32B 1.0 0.8 0.7 0.9 >>> waves.qoi.write_qoi_set_to_csv(simulation_1_qois, "simulation_1_qois.csv")
name,calculated,expected,lower_limit,upper_limit,units,long_name,description,group,version,date
load,5.0,4.5,3.5,5.5,N,Axial Load,Axial load through component XYZ,Assembly ABC Preload,abcdef,2025-01-01
gap,1.0,0.8,0.7000000000000001,0.9,mm,Radial gap,Radial gap between components A and B,Assembly ABC Preload,abcdef,2025-01-01
Exceptions#
Module of package specific exceptions.
The project design intent is to print error messages to STDERR and return non-zero exit codes when used as a command line utility (CLI), but raise exceptions when used as a package (API). The only time a stack trace should be printed when using the CLI is if the exception is unexpected and may represent an internal bug.
Most raised exceptions in the package API should be Python built-ins. However, to support the CLI error handling
described above some exceptions need to be uniquely identifiable as package exceptions and not third-party exceptions.
waves.exceptions.WAVESError() and RuntimeError exceptions will be caught by the command line utility and
converted to error messages and non-zero return codes. Third-party exceptions represent truly unexpected behavior that
may be an internal bug and print a stack trace.
This behavior could be supported by limiting package raised exceptions to RuntimeError exceptions; however, more specific exceptions are desirable when using the package API to allow end-users to handle different collections of API exceptions differently.
- exception waves.exceptions.APIError[source]
Bases:
WAVESError
Raised when an API validation fails, e.g. an argument value is outside the list of acceptable choices.
Intended to mirror an associated
argparse CLI option validation.
- exception waves.exceptions.ChoicesError[source]
Bases:
APIError
Raised during API validation that mirrors an argparse CLI argument with limited choices.
- exception waves.exceptions.MutuallyExclusiveError[source]
Bases:
APIError
Raised during API validation that mirrors an argparse CLI mutually exclusive group.
- exception waves.exceptions.SchemaValidationError[source]
Bases:
APIError
Raised when a WAVES parameter generator schema validation fails.
- exception waves.exceptions.WAVESError[source]
Bases:
Exception
Define the base class for WAVES exceptions.
All exceptions that must be caught by the CLI should derive from this class.
_main.py#
Internal module implementing the command line utility behavior.
Should raise RuntimeError or a derived class of waves.exceptions.WAVESError to allow
waves._main.main() to convert stack-trace/exceptions into STDERR message and non-zero exit codes.
- waves._main.get_parser() ArgumentParser[source]
Get parser object for command line options.
- Returns:
parser
- waves._main.main() None[source]
Run the WAVES command line interface.
_docs.py#
Internal API module implementing the docs subcommand behavior.
Should raise RuntimeError or a derived class of waves.exceptions.WAVESError to allow the CLI implementation
to convert stack-trace/exceptions into STDERR message and non-zero exit codes.
- waves._docs.get_parser() ArgumentParser[source]
Return a ‘no-help’ parser for the docs subcommand.
- Returns:
parser
- waves._docs.main(documentation_index: Path, print_local_path: bool = False) None[source]
Open the package HTML documentation in the system default web browser or print documentation index path.
- Parameters:
print_local_path – Flag to print the local path to terminal instead of calling the default web browser
- Raises:
RuntimeError – if the installed documentation path is not found
RuntimeError – if a web browser fails to open
_fetch.py#
Internal API module implementing the fetch subcommand behavior.
Should raise RuntimeError or a derived class of waves.exceptions.WAVESError to allow the CLI implementation
to convert stack-trace/exceptions into STDERR message and non-zero exit codes.
- waves._fetch.available_files(
- root_directory: str | Path,
- relative_paths: Iterable[str | Path],
Build a list of files at
relative_paths with respect to the root_directory directory.
Returns a list of absolute paths and a list of any relative paths that were not found. Falls back to a full recursive search of relative_paths with pathlib.Path.rglob to enable pathlib style pattern matching.
- Parameters:
root_directory – Relative or absolute root path to search. Relative paths are converted to absolute paths with respect to the current working directory before searching.
relative_paths – Relative paths to search for. Directories are searched recursively for files.
- Returns:
available_files, not_found
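The documented search behavior can be sketched with pathlib alone. This is a simplified re-implementation for illustration, not the packaged function: files are taken directly, directories are walked recursively, and anything else falls back to rglob pattern matching before being reported as not found.

```python
import pathlib


def available_files(root_directory, relative_paths):
    # Simplified sketch of the documented behavior (assumption: the real
    # implementation may differ in ordering and edge cases).
    root_directory = pathlib.Path(root_directory).resolve()
    available = []
    not_found = []
    for relative_path in relative_paths:
        absolute = root_directory / relative_path
        if absolute.is_file():
            available.append(absolute)
        elif absolute.is_dir():
            # Directories are searched recursively for files
            available.extend(path for path in absolute.rglob("*") if path.is_file())
        else:
            # Fall back to pathlib-style pattern matching against the root
            matches = [path for path in root_directory.rglob(str(relative_path)) if path.is_file()]
            if matches:
                available.extend(matches)
            else:
                not_found.append(relative_path)
    return available, not_found
```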
- waves._fetch.build_copy_tuples(
- destination: str | Path,
- requested_paths_resolved: list[Path],
- overwrite: bool = False,
Build a tuple of (requested, destination) copy pairs.
- Parameters:
destination – String or pathlike object for the destination directory
requested_paths_resolved – List of absolute requested files as path-objects
- Returns:
requested and destination file path pairs
- waves._fetch.build_destination_files(
- destination: str | Path,
- requested_paths: list[Path],
Build destination file paths from the requested paths, truncating the longest possible source prefix path.
- Parameters:
destination – String or pathlike object for the destination directory
requested_paths – List of requested files as path-objects
- Returns:
destination files, existing files
- waves._fetch.build_source_files(
- root_directory: str | Path,
- relative_paths: Iterable[str | Path],
- exclude_patterns: Iterable[str] = ['__pycache__', '.pyc', '.sconf_temp', '.sconsign.dblite', 'config.log'],
Wrap
available_files() and trim the list based on exclude patterns.
If no source files are found, an empty list is returned.
- Parameters:
root_directory (str) – Relative or absolute root path to search. Relative paths are converted to absolute paths with respect to the current working directory before searching.
relative_paths (list) – Relative paths to search for. Directories are searched recursively for files.
exclude_patterns (list) – list of strings to exclude from the root_directory directory tree if the path contains a matching string.
- Returns:
source_files, not_found
- Return type:
tuple of lists
- waves._fetch.conditional_copy(copy_tuples: list[tuple[Path, Path]]) None[source]
Copy when destination file doesn’t exist or doesn’t match source file content.
Uses Python
shutil.copyfile, so metadata isn’t preserved. Creates intermediate parent directories prior to copy, but doesn’t raise exceptions on existing parent directories.
- Parameters:
copy_tuples – Tuple of source, destination pathlib.Path pairs, e.g.
((source, destination), ...)
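The conditional-copy behavior described above can be sketched with filecmp and shutil. This is an illustrative sketch, not the packaged function; it copies only when the destination is missing or its content differs, and creates parent directories on demand.

```python
import filecmp
import pathlib
import shutil


def conditional_copy(copy_tuples):
    # Copy each (source, destination) pair only when the destination file
    # doesn't exist or its content differs from the source.
    for source, destination in copy_tuples:
        destination = pathlib.Path(destination)
        if not destination.exists() or not filecmp.cmp(source, destination, shallow=False):
            # mkdir with exist_ok mirrors "doesn't raise on existing parents"
            destination.parent.mkdir(parents=True, exist_ok=True)
            shutil.copyfile(source, destination)
```

Skipping unchanged files keeps destination timestamps stable, which matters for build systems that use timestamps to decide what is out of date.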
- waves._fetch.extend_requested_paths(
- requested_paths: list[Path],
- tutorial: Literal[0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12],
Extend the requested_paths list with the necessary tutorial files.
- Parameters:
requested_paths – list of relative path-like objects that subset the files found in the
root_directory relative_paths
tutorial – Integer to fetch all necessary files for the specified tutorial number
- Returns:
extended requested paths
- Raises:
ChoicesError – If the requested tutorial number doesn’t exist
- waves._fetch.get_parser() ArgumentParser[source]
Return a ‘no-help’ parser for the fetch subcommand.
- waves._fetch.longest_common_path_prefix(file_list: list[Path]) Path[source]
Return the longest common file path prefix.
The edge case of a single path is handled by returning the parent directory.
- Parameters:
file_list – List of path-like objects
- Returns:
longest common path prefix
- Raises:
RuntimeError – When file list is empty
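The stdlib already provides the core of this operation. A minimal sketch of the documented behavior, assuming os.path.commonpath semantics are acceptable for the multi-path case:

```python
import os
import pathlib


def longest_common_path_prefix(file_list):
    # Empty input is an error, per the documented Raises clause
    if not file_list:
        raise RuntimeError("file list is empty")
    # Single-path edge case: return the parent directory
    if len(file_list) == 1:
        return pathlib.Path(file_list[0]).parent
    return pathlib.Path(os.path.commonpath(file_list))
```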
- waves._fetch.main(
- subcommand: str,
- root_directory: str | Path,
- relative_paths: Iterable[str | Path],
- destination: str | Path,
- requested_paths: list[Path] | None = None,
- tutorial: Literal[0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12] | None = None,
- overwrite: bool = False,
- dry_run: bool = False,
- print_available: bool = False,
Wrap
waves.fetch.recursive_copy() to provide subcommand specific behavior and STDOUT/STDERR messages.
Recursively copy requested paths from root_directory/relative_paths directories into destination directory using the shortest possible shared source prefix.
- Parameters:
subcommand – name of the subcommand to report in STDOUT
root_directory – String or pathlike object for the root_directory directory
relative_paths – List of string or pathlike objects describing relative paths to search for in root_directory
destination – String or pathlike object for the destination directory
requested_paths – list of Path objects that subset the files found in the
root_directory relative_paths
tutorial – Integer to fetch all necessary files for the specified tutorial number
overwrite – Boolean to overwrite any existing files in destination directory
dry_run – Print the destination tree and exit. Short circuited by
print_available
print_available – Print the available source files and exit. Short circuits dry_run
- waves._fetch.print_list(
- things_to_print: list,
- prefix: str = '\t',
- stream: ~typing.IO = <_io.TextIOWrapper name='<stdout>' mode='w' encoding='utf-8'>,
Print a list to the specified stream, one line per item.
- Parameters:
things_to_print (list) – List of items to print
prefix (str) – prefix to print on each line before printing the item
stream (file-like) – output stream. Defaults to
sys.stdout.
- waves._fetch.recursive_copy(
- root_directory: str | Path,
- relative_paths: Iterable[str | Path],
- destination: str | Path,
- requested_paths: list[Path] | None = None,
- tutorial: Literal[0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12] | None = None,
- overwrite: bool = False,
- dry_run: bool = False,
- print_available: bool = False,
Recursively copy requested paths from root_directory/relative_paths directories into destination directory.
Destination subdirectories are created using the shortest possible shared source prefix.
If destination files exist, copy non-conflicting files unless overwrite is specified.
- Parameters:
root_directory – String or pathlike object for the root_directory directory
relative_paths – List of string or pathlike objects describing relative paths to search for in root_directory
destination – String or pathlike object for the destination directory
requested_paths – list of relative path-objects that subset the files found in the
root_directory relative_paths
tutorial – Integer to fetch all necessary files for the specified tutorial number
overwrite – Boolean to overwrite any existing files in destination directory
dry_run – Print the destination tree and exit. Short circuited by
print_available
print_available – Print the available source files and exit. Short circuits dry_run
- Raises:
RuntimeError – If no requested files exist in the longest common source path
_visualize.py#
Internal API module implementing the visualize subcommand behavior.
Should raise RuntimeError or a derived class of waves.exceptions.WAVESError to allow the CLI implementation
to convert stack-trace/exceptions into STDERR message and non-zero exit codes.
- waves._visualize.add_node_count(
- graph: DiGraph,
- text: str = 'Node count: ',
Add an orphan node with the total node count to a directed graph.
The graph nodes must contain a
layer attribute with integer values. The orphan node is assigned to the minimum layer.
- Parameters:
graph – original graph
text – Leading text for node name and label
- waves._visualize.ancestor_subgraph(graph: DiGraph, nodes: list[str]) DiGraph[source]
Return a new directed graph containing nodes and their ancestors.
- Parameters:
graph – original directed graph
nodes – iterable of nodes name strings
- Returns:
subgraph
- Raises:
RuntimeError – If one or more nodes are missing from the graph
- waves._visualize.check_regex_exclude(
- exclude_regex: str | None,
- node_name: str,
- current_indent: int,
- exclude_indent: int,
- exclude_node: bool = False,
Return boolean flag and updated exclusion indent level to exclude node names that match the regular expression.
- Parameters:
exclude_regex – Regular expression
node_name – Name of the node
current_indent – Current indent of the parsed output
exclude_indent – Set to current_indent if node is to be excluded
exclude_node – Indicates whether a node should be excluded
- Returns:
Tuple containing exclude_node and exclude_indent
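A minimal sketch of the indentation-scoped exclusion described by these parameters. The internal state handling here is an assumption: once a node matches, deeper (more indented) children stay excluded until the parsed output returns to the recorded indent level.

```python
import re


def check_regex_exclude(exclude_regex, node_name, current_indent, exclude_indent,
                        exclude_node=False):
    # A previously excluded node keeps excluding its more-indented children
    if exclude_node and current_indent > exclude_indent:
        return True, exclude_indent
    # A fresh match records the current indent as the exclusion scope
    if exclude_regex is not None and re.search(exclude_regex, node_name):
        return True, current_indent
    return False, exclude_indent
```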
- waves._visualize.get_parser() ArgumentParser[source]
Return a ‘no-help’ parser for the visualize subcommand.
- Returns:
parser
- waves._visualize.graph_to_graphml(graph: DiGraph) str[source]
Return the networkx graphml text.
- Parameters:
graph – networkx directed graph
- waves._visualize.main(
- targets: list[str],
- scons_args: list | None = None,
- sconstruct: Path = PosixPath('SConstruct'),
- output_file: Path | None = None,
- height: int = 12,
- width: int = 36,
- font_size: int = 10,
- node_color: str = '#5AC7CB',
- edge_color: str = '#B7DEBE',
- exclude_list: list[str] = ['/usr/bin'],
- exclude_regex: str | None = None,
- print_graphml: bool = False,
- print_tree: bool = False,
- vertical: bool = False,
- no_labels: bool = False,
- node_count: bool = False,
- transparent: bool = False,
- break_paths: bool = False,
- input_file: str | Path | None = None,
Visualize the directed acyclic graph created by a SCons build.
Uses matplotlib and networkx to build out an acyclic directed graph showing the relationships of the various dependencies using boxes and arrows. The visualization can be saved as an svg and graphml output can be printed as well.
- Parameters:
targets – Strings specifying SCons targets
scons_args – list of SCons arguments
sconstruct – Path to an SConstruct file or parent directory
output_file – File for saving the visualization
height – Height of visualization if being saved to a file
width – Width of visualization if being saved to a file
font_size – Font size of node labels
exclude_list – exclude nodes starting with strings in this list (e.g. /usr/bin)
exclude_regex – exclude nodes that match this regular expression
print_graphml – Whether to print the graph in graphml format
print_tree – Print the text output of the
scons --tree command to the screen
vertical – Specifies a vertical layout of graph instead of the default horizontal layout
no_labels – Don’t print labels on the nodes of the visualization
node_count – Add a node count orphan node
transparent – Use a transparent background
break_paths – Format paths by breaking at path separator with a newline
input_file – Path to text file storing output from SCons tree command
- waves._visualize.parse_output(
- tree_lines: list[str],
- exclude_list: list[str] = ['/usr/bin'],
- exclude_regex: str | None = None,
- no_labels: bool = False,
- break_paths: bool = False,
Parse the string that has the tree output and return as a networkx directed graph.
- Parameters:
tree_lines – output of the scons tree command pre-split on newlines to a list of strings
exclude_list – exclude nodes starting with strings in this list (e.g. /usr/bin)
exclude_regex – exclude nodes that match this regular expression
no_labels – Don’t print labels on the nodes of the visualization
break_paths – Format paths by breaking at path separator with a newline
- Returns:
networkx directed graph
- Raises:
RuntimeError – If the parsed input doesn’t contain recognizable SCons nodes
- waves._visualize.plot(figure: Figure, output_file: Path | None = None, transparent: bool = False) None[source]
Open a matplotlib plot or save to file.
- Parameters:
figure – The matplotlib figure
output_file – File for saving the visualization
transparent – Use a transparent background
- waves._visualize.visualize(
- graph: DiGraph,
- height: int = 12,
- width: int = 36,
- font_size: int = 10,
- node_color: str = '#5AC7CB',
- edge_color: str = '#B7DEBE',
- vertical: bool = False,
Create a visualization showing the tree.
Nodes in graph require the
layer and label attributes.
- Parameters:
graph – networkx directed graph
height – Height of visualization if being saved to a file
width – Width of visualization if being saved to a file
font_size – Font size of file names in points
vertical – Specifies a vertical layout of graph instead of the default horizontal layout
_build.py#
Internal API module implementing the build subcommand behavior.
Should raise RuntimeError or a derived class of waves.exceptions.WAVESError to allow the CLI implementation
to convert stack-trace/exceptions into STDERR message and non-zero exit codes.
- waves._build.get_parser() ArgumentParser[source]
Return a ‘no-help’ parser for the build subcommand.
- Returns:
parser
- waves._build.main(
- targets: list,
- scons_args: list | None = None,
- max_iterations: int = 5,
- working_directory: str | Path | None = None,
- git_clone_directory: str | Path | None = None,
Submit an iterative SCons command.
SCons command is re-submitted until SCons reports that the target ‘is up to date.’ or the iteration count is reached.
- Parameters:
targets – list of SCons targets (positional arguments)
scons_args – list of SCons arguments
max_iterations – Maximum number of iterations before the iterative loop is terminated
working_directory – Change the SCons command working directory
git_clone_directory – Destination directory for a Git clone operation
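The iterative submission loop described above can be sketched with subprocess. This is an illustrative sketch, not the packaged implementation: the command construction and the injectable `runner` parameter (which returns STDOUT and defaults to a subprocess call) are assumptions added for testability, and the returned iteration count is for illustration only.

```python
import subprocess


def iterative_build(targets, scons_args=None, max_iterations=5, runner=None):
    """Re-submit the SCons command until it reports 'is up to date.'"""
    if runner is None:
        runner = lambda command: subprocess.run(
            command, capture_output=True, text=True, check=False
        ).stdout
    command = ["scons"] + list(scons_args or []) + list(targets)
    for iteration in range(1, max_iterations + 1):
        stdout = runner(command)
        # SCons prints "`<target>' is up to date." when nothing remains
        if "is up to date." in stdout:
            return iteration
    return max_iterations
```

Re-submission helps workflows whose dependency graph grows during the build, e.g. targets generated by earlier tasks that SCons did not know about on the first pass.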
_parameter_study.py#
Internal API module implementing the parameter study subcommand(s) behavior.
Thin CLI wrapper around the waves.parameter_generators classes.
Should raise RuntimeError or a derived class of waves.exceptions.WAVESError to allow the CLI implementation
to convert stack-trace/exceptions into STDERR message and non-zero exit codes.
- waves._parameter_study.get_parser() ArgumentParser[source]
Return a ‘no-help’ parser for the parameter study subcommand(s).
- Returns:
parser
- waves._parameter_study.main(
- subcommand: str,
- input_file: str | Path | TextIOWrapper | None,
- output_file_template: str | None = None,
- output_file: str | None = None,
- output_file_type: Literal['h5', 'yaml'] = 'yaml',
- set_name_template: str = 'parameter_set@number',
- previous_parameter_study: str | None = None,
- require_previous_parameter_study: bool = False,
- overwrite: bool = False,
- dry_run: bool = False,
- write_meta: bool = False,
Build parameter studies.
- Parameters:
subcommand (str) – parameter study type to build
input_file (str) – path to YAML formatted parameter study schema file
output_file_template (str) – output file template name
output_file (str) – relative or absolute output file path
output_file_type (str) – yaml or h5
set_name_template (str) – parameter set name string template. May contain ‘@number’ for the set number.
previous_parameter_study (str) – relative or absolute path to previous parameter study file
overwrite (bool) – overwrite all existing parameter set file(s)
dry_run (bool) – print what files would have been written, but do no work
write_meta (bool) – write a meta file named ‘parameter_study_meta.txt’ containing the parameter set file path(s)
- waves._parameter_study.read_parameter_schema(input_file: str | Path | TextIOWrapper | None) dict[source]
Read a YAML dictionary from STDIN or a file.
- Parameters:
input_file – STDIN stream or file path
- Returns:
dictionary
- Raises:
RuntimeError – if not STDIN and the file name does not exist
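The STDIN-or-file branching can be sketched as below. This is an illustrative sketch, not the WAVES source: it uses ``json`` as a dependency-free stand-in for the YAML parsing the real function performs, and the function name is hypothetical.

```python
import json
from io import StringIO
from pathlib import Path


def read_schema(input_file):
    """Read a dictionary from an open text stream (e.g. STDIN) or a file path.

    Sketch only: ``json`` stands in for the YAML parsing done by the real
    function. Raises RuntimeError when a file path does not exist, mirroring
    the documented behavior.
    """
    if hasattr(input_file, "read"):
        # STDIN arrives as an already-open text stream
        return json.load(input_file)
    path = Path(input_file)
    if not path.is_file():
        raise RuntimeError(f"File '{path}' does not exist.")
    return json.loads(path.read_text())


schema = read_schema(StringIO('{"parameter_1": [1, 2]}'))
```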
_print_study.py#
Internal API module implementing the print_study subcommand behavior.
Should raise RuntimeError or a derived class of waves.exceptions.WAVESError to allow the CLI implementation
to convert stack-trace/exceptions into STDERR message and non-zero exit codes.
- waves._print_study.get_parser() ArgumentParser[source]
Return a ‘no-help’ parser for the print_study subcommand.
- Returns:
parser
- waves._print_study.main(parameter_study_file: Path) None[source]
Open and print a WAVES parameter study file as a table.
- Parameters:
parameter_study_file – The parameter study file to open
- Raises:
RuntimeError – If one or more files fails to open
_utilities.py#
Internal API module storing project utilities.
Functions that are limited in use to a public API should prefer to raise Python built-in exceptions.
Functions that may be used in a CLI implementation should raise RuntimeError or a derived class of
waves.exceptions.WAVESError to allow the CLI implementation to convert stack-trace/exceptions into STDERR
message and non-zero exit codes.
- class waves._utilities._AtSignTemplate(template)[source]
Bases:
Template
Use the CMake ‘@’ delimiter in a Python ‘string.Template’ to avoid clashing with bash variable syntax.
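The ‘@’-delimiter technique is a one-line subclass of the standard library ``string.Template``; a minimal sketch (the class name here is illustrative):

```python
from string import Template


class AtSignTemplate(Template):
    """Swap the default '$' delimiter for '@' so template substitution does
    not clash with bash variable syntax such as $PWD or $HOME."""

    delimiter = "@"


# '@number' is substituted; '$PWD' passes through untouched
result = AtSignTemplate("$PWD/parameter_set@number").substitute(number=5)
```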
- waves._utilities.cache_environment(
- command: str,
- shell: str = 'bash',
- cache: str | Path | None = None,
- overwrite_cache: bool = False,
- verbose: bool = False,
Retrieve cached environment dictionary or run a shell command to generate environment dictionary.
Warning
Currently assumes a *nix flavored shell: sh, bash, zsh, csh, tcsh. May work with any shell supporting command construction as below.
{shell} -c "{command} && env -0"
The method may fail if the command produces stdout that does not terminate in a newline. Redirect command output away from stdout if this causes problems, e.g.
command = 'command > /dev/null && command two > /dev/null' in most shells.
If the environment is created successfully and a cache file is requested, the cache file is always written. The overwrite_cache behavior forces the shell command execution, even when the cache file is present. If the command fails (raising a subprocess.CalledProcessError) the captured output is printed to STDERR before re-raising the exception.
- Parameters:
command – the shell command to execute
shell – the shell to use when executing command by absolute or relative path
cache – absolute or relative path to read/write a shell environment dictionary. Will be written as YAML formatted file regardless of extension.
overwrite_cache – Ignore previously cached files if they exist.
verbose – Print SCons configuration-like action messages when True
- Returns:
shell environment dictionary
- Raises:
subprocess.CalledProcessError – Print the captured output and re-raise exception when the shell command returns a non-zero exit status.
- waves._utilities.create_valid_identifier(identifier: str) str[source]
Create a valid Python identifier from an arbitrary string by replacing invalid characters with underscores.
- Parameters:
identifier – String to convert to valid Python identifier
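The replacement can be sketched with a regular expression; the exact pattern and the leading-digit handling below are illustrative assumptions, not necessarily the WAVES implementation:

```python
import re


def create_valid_identifier(identifier: str) -> str:
    """Replace characters that are invalid in a Python identifier with underscores.

    Sketch only: the pattern and leading-digit prefix are assumptions for
    illustration.
    """
    # Replace every character that is not alphanumeric or underscore
    candidate = re.sub(r"\W", "_", identifier)
    if candidate and candidate[0].isdigit():
        # Identifiers may not start with a digit
        candidate = f"_{candidate}"
    return candidate
```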
- waves._utilities.cubit_os_bin() str[source]
Return the OS specific Cubit bin directory name.
Making Cubit importable requires putting the Cubit bin directory on PYTHONPATH. On MacOS, the directory is “MacOS”. On other systems it is “bin”.
- Returns:
bin directory name, e.g. “bin” or “MacOS”
- Return type:
str
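The OS switch described above can be sketched with the standard library ``platform`` module; an illustrative sketch, not the WAVES source:

```python
import platform


def cubit_os_bin() -> str:
    """Return the OS specific Cubit bin directory name: 'MacOS' on macOS
    (where platform.system() reports 'Darwin'), 'bin' everywhere else."""
    if platform.system() == "Darwin":
        return "MacOS"
    return "bin"


bin_directory = cubit_os_bin()
```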
- waves._utilities.find_command(options: Sequence[str]) str[source]
Return first found command in list of options.
- Parameters:
options – alternate command options
- Returns:
command absolute path
- Raises:
FileNotFoundError – If no matching command is found
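The first-found search can be sketched with ``shutil.which``; an illustrative sketch, not the WAVES source (the example assumes ``sh`` exists on a *nix system):

```python
import shutil
from typing import Sequence


def find_command(options: Sequence[str]) -> str:
    """Return the absolute path of the first option found on PATH."""
    for option in options:
        absolute_path = shutil.which(option)
        if absolute_path is not None:
            return absolute_path
    raise FileNotFoundError(f"None of the commands {options} were found")


# Falls through the missing first option to the 'sh' shell
shell_path = find_command(["nonexistent-command", "sh"])
```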
- waves._utilities.find_cubit_bin(options: Sequence[str], bin_directory: str | None = None) Path[source]
Search for the Cubit bin directory given a few options for the Cubit executable.
Recommend first checking to see if cubit will import.
- Parameters:
options – Cubit command options
bin_directory – Cubit’s bin directory name. Override the bin directory returned by
waves._utilities.cubit_os_bin().
- Returns:
Cubit bin directory absolute path
- Raises:
FileNotFoundError – If the Cubit command or bin directory is not found
- waves._utilities.find_cubit_python(options: Sequence[str], python_command: str = 'python3*') Path[source]
Search for the Cubit Python interpreter given a few options for the Cubit executable.
Recommend first checking to see if cubit will import.
- Parameters:
options – Cubit command options
python_command – Cubit’s Python executable file basename or
pathlib.Path.rglob pattern
- Returns:
Cubit Python interpreter executable absolute path
- Raises:
FileNotFoundError – If the Cubit command or Cubit Python interpreter is not found
- waves._utilities.return_environment(
- command: str,
- shell: str = 'bash',
- string_option: str = '-c',
- separator: str = '&&',
- environment: str = 'env -0',
Run a shell command and return the shell environment as a dictionary.
{shell} {string_option} "{command} {separator} {environment}"
Warning
The method may fail if the command produces stdout that does not terminate in a newline. Redirect command output away from stdout if this causes problems, e.g.
command = 'command > /dev/null && command two > /dev/null'in most shells.- Parameters:
command – the shell command to execute
shell – the shell to use when executing command by absolute or relative path
string_option – the shell’s option to execute a string command
separator – the shell’s command separator, e.g.
; or &&.
environment – environment command to print the environment on STDOUT with null terminated separators
- Returns:
shell environment dictionary
- Raises:
subprocess.CalledProcessError – When the shell command returns a non-zero exit status
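The command template above can be sketched with ``subprocess.run``. This is an illustrative sketch of the technique, not the WAVES source; the null-terminated parsing details are assumptions:

```python
import subprocess


def return_environment(
    command: str,
    shell: str = "sh",
    string_option: str = "-c",
    separator: str = "&&",
    environment: str = "env -0",
) -> dict:
    """Run a shell command and return the resulting environment as a dictionary.

    ``env -0`` prints NAME=VALUE pairs with null terminators so values
    containing newlines survive parsing. Sketch only.
    """
    result = subprocess.run(
        [shell, string_option, f"{command} {separator} {environment}"],
        capture_output=True,
        check=True,  # raises subprocess.CalledProcessError on non-zero exit
        text=True,
    )
    entries = (entry for entry in result.stdout.split("\0") if "=" in entry)
    return dict(entry.split("=", 1) for entry in entries)


env = return_environment("export EXAMPLE_VARIABLE=1")
```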
- waves._utilities.search_commands(options: Sequence[str]) str | None[source]
Return the first found command in the list of options. Return None if none are found.
- Parameters:
options (list) – executable path(s) to test
- Returns:
command absolute path
- waves._utilities.set_name_substitution(
- original: Iterable[str | Path] | str | Path,
- replacement: str,
- identifier: str = 'set_name',
- suffix: str = '/',
Replace
@identifier with replacement text in a list of strings and pathlib Path objects.
If the original is not a string, Path, or an iterable of strings and Paths, return without modification.
- Parameters:
original – List of strings
replacement – substitution string for the identifier
identifier – template identifier to replace, e.g.
@identifier becomes replacement
suffix – text to insert after the replacement
- Returns:
string or list of strings with identifier replacements
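The substitution can be sketched by combining the ‘@’ template with a type check. Illustrative sketch only: the braced ``@{set_name}`` spelling and the pass-through behavior shown here are assumptions, not the WAVES source.

```python
from pathlib import Path
from string import Template


class AtSignTemplate(Template):
    delimiter = "@"  # CMake-style '@' delimiter, as documented above


def set_name_substitution(original, replacement, identifier="set_name", suffix="/"):
    """Replace '@identifier' with replacement + suffix in strings and Paths.

    Sketch only; returns non-string, non-Path, non-iterable input unmodified.
    """

    def substitute(item):
        if isinstance(item, (str, Path)):
            text = AtSignTemplate(str(item)).safe_substitute(
                {identifier: f"{replacement}{suffix}"}
            )
            return type(item)(text)
        return item

    if isinstance(original, (str, Path)):
        return substitute(original)
    try:
        return [substitute(item) for item in original]
    except TypeError:
        # Not a string, Path, or iterable: return without modification
        return original


targets = set_name_substitution(["@{set_name}output.h5", "common.log"], "parameter_set_0")
```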
- waves._utilities.tee_subprocess(command: list[str], **kwargs) tuple[int, str][source]
Stream STDOUT to terminal while saving buffer to variable.
- Parameters:
command – Command to execute provided a list of strings
kwargs (dict) – Any additional keyword arguments are passed through to subprocess.Popen
- Returns:
integer return code, string STDOUT
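The tee behavior can be sketched with ``subprocess.Popen`` and a line loop; an illustrative sketch, not the WAVES source:

```python
import subprocess
import sys


def tee_subprocess(command: list, **kwargs) -> tuple:
    """Stream child process STDOUT to the terminal while saving it to a buffer."""
    buffer = []
    with subprocess.Popen(
        command, stdout=subprocess.PIPE, text=True, **kwargs
    ) as process:
        for line in process.stdout:
            sys.stdout.write(line)  # live echo to the terminal
            buffer.append(line)     # saved copy returned to the caller
    return process.returncode, "".join(buffer)


return_code, output = tee_subprocess([sys.executable, "-c", "print('hello')"])
```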
- waves._utilities.warn_only_once(function: Callable) Callable[source]
Suppress warnings raised by successive function calls.
- Parameters:
function – The function to wrap
- Returns:
function wrapped in the warning suppression logic
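The suppression can be sketched as a decorator; the flag-plus-``warnings.simplefilter`` strategy below is an illustrative assumption, not necessarily the WAVES implementation:

```python
import functools
import warnings


def warn_only_once(function):
    """Decorator: let the first call emit warnings normally, silence repeats."""

    @functools.wraps(function)
    def wrapper(*args, **kwargs):
        with warnings.catch_warnings():
            if wrapper._called_before:
                # Every call after the first ignores warnings
                warnings.simplefilter("ignore")
            wrapper._called_before = True
            return function(*args, **kwargs)

    wrapper._called_before = False
    return wrapper


@warn_only_once
def deprecated_helper():
    # Hypothetical function used only to demonstrate the decorator
    warnings.warn("deprecated", DeprecationWarning)
    return 42
```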
odb_extract.py#
Extracts data from an Abaqus odb file. Writes two files ‘output_file.h5’ and ‘output_file_datasets.h5’.
Calls odbreport feature of Abaqus, parses resultant file, and creates output file. Most simulation data lives in a group path following the instance and set name, e.g. ‘/INSTANCE/FieldOutputs/ELEMENT_SET’, and can be accessed with xarray as
import xarray
xarray.open_dataset("sample.h5", group="/INSTANCE/FieldOutputs/ELEMENT_SET")
You can view all group paths with ‘h5ls -r sample.h5’. Additional ODB information is available in the ‘/odb’ group path. The ‘/xarray/Dataset’ group path contains a list of group paths that contain an xarray dataset.
/ # Top level group required in all hdf5 files
/<instance name>/ # Groups containing data of each instance found in an odb
FieldOutputs/ # Group with multiple xarray datasets for each field output
<field name>/ # Group with datasets containing field output data for a specified set or surface
# If no set or surface is specified, the <field name> will be 'ALL_NODES' or 'ALL_ELEMENTS'
HistoryOutputs/ # Group with multiple xarray datasets for each history output
<region name>/ # Group with datasets containing history output data for specified history region name
# If no history region name is specified, the <region name> will be 'ALL NODES'
Mesh/ # Group written from an xarray dataset with all mesh information for this instance
/<instance name>_Assembly/ # Group containing data of assembly instance found in an odb
Mesh/ # Group written from an xarray dataset with all mesh information for this instance
/odb/ # Catch all group for data found in the odbreport file not already organized by instance
info/ # Group with datasets that mostly give odb meta-data like name, path, etc.
jobData/ # Group with datasets that contain additional odb meta-data
rootAssembly/ # Group with datasets that match odb file organization per Abaqus documentation
sectionCategories/ # Group with datasets that match odb file organization per Abaqus documentation
/xarray/ # Group with a dataset that lists the location of all data written from xarray datasets
- waves._abaqus.odb_extract.get_odb_report_args(odb_report_args: str, input_file: Path, job_name: Path) str[source]
Generate odb_report arguments.
- Parameters:
odb_report_args – String of command line options to pass to
abaqus odbreport.
input_file – .odb file.
job_name – Report file.
- waves._abaqus.odb_extract.get_parser() ArgumentParser[source]
Get parser object for command line options.
- Returns:
argument parser
- Return type:
parser
- waves._abaqus.odb_extract.odb_extract(
- input_file: list[Path],
- output_file: str | None = None,
- output_type: Literal['h5', 'yaml', 'json'] = 'h5',
- odb_report_args: str | None = None,
- abaqus_command: str = 'abq2024',
- delete_report_file: bool = False,
- verbose: bool = False,
Run the odb_extract Abaqus data extraction tool.
Users should use the associated command line interface, not this API.
- Parameters:
input_file – A list of
*.odb files to extract. The current implementation only supports extraction of the first file in the list.
output_file – The output file name to extract to. Extension should match one of the supported output types.
output_type – Output file type. Defaults to h5. Options are: h5, yaml, json.
odb_report_args – String of command line options to pass to abaqus odbreport.
abaqus_command – The abaqus command name or absolute path to the Abaqus executable.
delete_report_file – Boolean to delete the intermediate Abaqus generated report file after producing the output_file.
verbose – Boolean to print more verbose messages
- waves._abaqus.odb_extract.run_external(cmd: str) tuple[source]
Execute an external command and get its exitcode, stdout and stderr.
- Parameters:
cmd – command line command to run
- Returns:
output, return_code, error_code
Odb Report File Parser#
- class waves._abaqus.abaqus_file_parser.OdbReportFileParser(input_file: str | Path, *args, verbose: bool = False, **kwargs)[source]
Bases:
AbaqusFileParser
Class for parsing Abaqus odbreport files.
Expected input includes only files that are in the csv format and which have used the ‘blocked’ option.
Results are stored either in a dictionary which mimics the format of the odb file (see Abaqus documentation), or stored in a specialized ‘extract’ format written to an hdf5 file.
Format of HDF5 file#
/ # Top level group required in all hdf5 files
/<instance name>/ # Groups containing data of each instance found in an odb
FieldOutputs/ # Group with multiple xarray datasets for each field output
<field name>/ # Group with datasets containing field output data for a specified set or surface
# If no set or surface is specified, the <field name> will be 'ALL_NODES' or 'ALL_ELEMENTS'
HistoryOutputs/ # Group with multiple xarray datasets for each history output
<region name>/ # Group with datasets containing history output data for specified history region name
# If no history region name is specified, the <region name> will be 'ALL NODES'
Mesh/ # Group written from an xarray dataset with all mesh information for this instance
/<instance name>_Assembly/ # Group containing data of assembly instance found in an odb
Mesh/ # Group written from an xarray dataset with all mesh information for this instance
/odb/ # Catch all group for data found in the odbreport file not already organized by instance
info/ # Group with datasets that mostly give odb meta-data like name, path, etc.
jobData/ # Group with datasets that contain additional odb meta-data
rootAssembly/ # Group with datasets that match odb file organization per Abaqus documentation
sectionCategories/ # Group with datasets that match odb file organization per Abaqus documentation
/xarray/ # Group with a dataset that lists the location of all data written from xarray datasets
Dataset # HDF5 Dataset that lists the location within the hdf5 file of all xarray datasets
- create_extract_format(odb_dict: dict, h5_file: str, time_stamp: str) None[source]
Format
self.parsed dictionary odb data into something that resembles previous abaqus extract method.
- Parameters:
odb_dict – Dictionary with odb data
h5_file – Name of h5_file to use for storing data
time_stamp – Time stamp for possibly appending to hdf5 file names
- get_position_index(position: int, position_type: str, values: dict) tuple[int, bool][source]
Return the index of the position (node or element) currently used.
- Parameters:
position – integer representing a node or element
position_type – string of either ‘nodes’ or ‘elements’
values – dictionary where values are stored
- Returns:
index, just_added
- pad_none_values(
- step_number: int,
- frame_number: int,
- position_length: int,
- data_length: int,
- element_size: int | None,
- values: list,
Pad the values list with None or lists of None values in the locations indicated by the parameters.
- Parameters:
step_number (int) – index of current step
frame_number (int) – index of current frame
position_length (int) – number of nodes or elements
data_length (int) – length of data given in field
element_size (int) – number of element lines that could be listed, e.g. for a hex this value would be 6
values (list) – list that holds the data values
- parse(
- data_format: Literal['odb', 'extract'] = 'extract',
- h5_file: str = 'extract.h5',
- time_stamp: str | None = None,
Parse the file and store the results in the self.parsed dictionary.
Can parse csv formatted output with the blocked option from the odbreport command.
- Parameters:
data_format – Format in which to store data; can be ‘odb’ or ‘extract’
h5_file – Name of hdf5 file to store data into when using the extract format
time_stamp – Time stamp for possibly appending to hdf5 file names
- parse_analytic_surface(f: TextIO, instance: dict, line: str) None[source]
Parse the section that contains analytic surface.
- Parameters:
f – open file
instance – dictionary for storing the analytic surface
line – current line of file
- parse_components_of_field(f: TextIO, line: str, field: dict) str[source]
Parse the section that contains the data for field outputs found after the ‘Components of field’ heading.
- Parameters:
f – open file
line – current line of file
field – dictionary for storing field output
- Returns:
current line of file
- parse_element_classes(f: TextIO, instance: dict, number_of_element_classes: int) None[source]
Parse the section that contains element classes.
- Parameters:
f – open file
instance – dictionary for storing the elements
number_of_element_classes – number of element classes to parse
- parse_element_set(f: TextIO, instance: dict, number_of_element_sets: int) None[source]
Parse the section that contains element sets.
- Parameters:
f – open file
instance – dictionary for storing the element sets
number_of_element_sets – number of element sets to parse
- parse_elements(f: TextIO, instance: dict, number_of_elements: int) None[source]
Parse the section that contains elements.
- Parameters:
f – open file
instance – dictionary for storing the elements
number_of_elements – number of elements to parse
- parse_field_values(f: TextIO, line: str, values: list | dict) str[source]
Parse the section that contains the data for field values.
- Parameters:
f – open file
line – current line
values – list for storing the field values
- Returns:
current line of file
- parse_fields(f: TextIO, fields: dict, line: str) str[source]
Parse the section that contains the data for field outputs.
- Parameters:
f – open file
fields – dictionary for storing the field outputs
line – current line of file
- Returns:
current line of file
- parse_frames(f: TextIO, frames: list, number_of_frames: int) str[source]
Parse the section that contains the data for frames.
- Parameters:
f – open file
frames – list for storing the frames
number_of_frames – number of frames to parse
- Returns:
current line of file
- parse_history_outputs(f: TextIO, outputs: dict, line: str) str[source]
Parse the section that contains history outputs.
- Parameters:
f (file object) – open file
outputs – dict for storing the history output data
line – current line of file
- Returns:
current line of file
- parse_history_regions(
- f: TextIO,
- line: str,
- regions: dict,
- number_of_history_regions: int,
Parse the section that contains history regions.
- Parameters:
f – open file
line – current line of file
regions – dict for storing the history region data
number_of_history_regions – number of history regions to parse
- Returns:
current line of file
- parse_instances(f: TextIO, instances: dict, number_of_instances: int) None[source]
Parse the section that contains instances.
- Parameters:
f – open file
instances – dictionary for storing the instances
number_of_instances – number of instances to parse
- parse_node_set(f: TextIO, instance: dict, number_of_node_sets: int) None[source]
Parse the section that contains node sets.
- Parameters:
f – open file
instance – dictionary for storing the node sets
number_of_node_sets – number of node sets to parse
- parse_nodes(f: TextIO, instance: dict, number_of_nodes: int, embedded_space: str) None[source]
Parse the section that contains nodes.
- Parameters:
f – open file
instance – dictionary for storing the nodes
number_of_nodes – number of nodes to parse
embedded_space – type of embedded space
- parse_rigid_bodies(f: TextIO, instance: dict, number_of_rigid_bodies: int) None[source]
Parse the section that contains rigid bodies.
- Parameters:
f – open file
instance – dictionary for storing the rigid bodies
number_of_rigid_bodies – number of rigid bodies to parse
- parse_section_categories(f: TextIO, categories: dict, number_of_categories: int) None[source]
Parse the section that contains section categories.
- Parameters:
f – open file
categories – dictionary for storing the section categories
number_of_categories – number of section categories to parse
- parse_steps(f: TextIO, steps: dict, number_of_steps: int) None[source]
Parse the section that contains the data for steps.
- Parameters:
f (object) – open file
steps – dictionary for storing the steps
number_of_steps – number of steps to parse
- parse_surfaces(f: TextIO, instance: dict, number_of_surfaces: int) None[source]
Parse the section that contains surfaces.
- Parameters:
f – open file
instance – dictionary for storing the surfaces
number_of_surfaces – number of surfaces to parse
- save_dict_to_group(
- h5file: File,
- path: str,
- data_member: dict,
- output_file: str,
Recursively save data from python dictionary to hdf5 file.
This method can handle data types of int, float, str, and xarray Datasets, as well as lists or dictionaries of the aforementioned types. Tuples are assumed to have ints or floats.
- Parameters:
h5file – file stream to write data into
path – name of hdf5 group to write into
data_member – member of dictionary
output_file – name of h5 output file
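The recursion can be sketched without an HDF5 dependency by walking a nested dictionary and yielding group paths instead of writing them. This is an illustrative sketch of the traversal only; the real method writes each leaf into an open h5py file, and the function name here is hypothetical.

```python
def iterate_group_paths(data: dict, path: str = "/"):
    """Yield (HDF5-style group path, leaf value) pairs from a nested dictionary.

    Sketch only: dictionaries map to groups, sequences map to numbered
    children, and all other values are treated as leaves.
    """
    for key, value in data.items():
        child = f"{path.rstrip('/')}/{key}"
        if isinstance(value, dict):
            yield from iterate_group_paths(value, child)
        elif isinstance(value, (list, tuple)):
            # Sequence entries become numbered children of the group
            for index, item in enumerate(value):
                yield f"{child}/{index}", item
        else:
            yield child, value


odb_dict = {"odb": {"info": {"name": "sample.odb"}, "jobData": {"version": (2024,)}}}
paths = dict(iterate_group_paths(odb_dict))
```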
- setup_extract_field_format(field: dict, line: str) dict[source]
Do setup of field output formatting for extract format.
- Parameters:
field – dictionary with field data
line – current line of file
- Returns:
dictionary for which to store field values
- setup_extract_history_format(output: dict, current_history_output: int) None[source]
Do setup of history output formatting for extract format.
- Parameters:
output – dictionary with history output data
current_history_output – current history output count