.. _vortex:

Vortex basics
#############

.. toctree::
   :maxdepth: 1

Experiment
==========

A vortex experiment should be run in a dedicated directory::

    $WORKDIR/vapp/vconf/xpid

with:

.. list-table::

   * - **$WORKDIR**
     - working directory environment variable
   * - **vapp**
     - application of the experiment, for instance *hycom3d*
   * - **vconf**
     - configuration of the experiment, for instance *manga*
   * - **xpid**
     - name of the experiment, in the format *myexp@location*, where *location* defaults to the username

The experiment directory then has to be organized as follows::

    conf/
    jobs/
    tasks/
    vortex/

.. list-table::

   * - conf
     - the directory hosting the configuration file
   * - jobs
     - the directory hosting the vortex job files to be submitted
   * - tasks
     - the directory hosting the vortex task files
   * - vortex
     - a link to a vortex repository

-----------------------

Configuration
=============

The configuration file sets up the experiment, the vortex jobs and the vortex tasks. Through its sections, the user passes settings to the vortex jobs and tasks.

Example
^^^^^^^

.. literalinclude:: examples/hycom3d_manga.ini
   :language: ini

-----------------------

Jobs
====

The vortex jobs are generated by a vortex tool named *mkjob*, which builds a job script from the application configuration file and arguments given on the command line::

    $ vortex/bin/mkjob.py --job name={jobname} task={taskname}

-----------------------

Tasks
=====

The *tasks* directory holds the tasks of the experiment.

- **Driver**

  A task is a Python script that implements a *Driver*, defined through a *setup* function. The *setup* function of a *Vortex task* is called and executed by a *Vortex job*. In the following example, the *Driver* consists of a family *spinup*, split into three *Vortex objects* of type *Task*. These three *Tasks* would be run sequentially.

  .. code-block:: python

      def setup(t, **kw):
          return Driver(
              tag='run_ibc_spinup',
              ticket=t,
              nodes=[
                  Family(tag="spinup", ticket=t, nodes=[
                      HycomIBCRunTime(tag='ibc_time', ticket=t, **kw),
                      HycomIBCRunHoriz(tag='ibc_horizontal', ticket=t, **kw),
                      HycomIBCRunVertSpinup(tag='ibc_vertical', ticket=t, **kw),
                  ], **kw),
              ],
              options=kw,
          )

- **Task Object**

  A *Task* object has access to the parameters of the configuration via *self.conf*, and has a *process* method split into three steps.

  1. *Fetch*

     The *fetch* step fetches any data *resource* the task needs to run correctly, via the *input* function of the *Vortex module* named *toolbox*. In the following example, the task would fetch a script executable hosted in the sloop library. Note that command line arguments can be specified via the *rawopts* attribute, among other means. The next section is dedicated to the *Vortex resources*.

     .. code-block:: python

         tbpreproc = toolbox.input(
             kind     = "script",
             language = "python",
             remote   = t.sh.path.join(
                 SLOOP_DIR, "bin",
                 "sloop-hycom3d-run-ibc-2-vert-0-preproc.py"),
             filename = "run-vert-preproc.py",
             rank     = self.conf.rank,
             rawopts  = "--rank [rank] ssh_hyc.cdf"
         )

  2. *Compute*

     The *compute* step is dedicated to the algo component vortex resource. The tasks have access to the algo component through the *algo* function of the module *toolbox*. For instance, in the following example, the sloop script executable fetched above would be run through the *component_runner* of the task, thanks to an algo component defined by its attributes *engine* and *interpreter*.

     .. code-block:: python

         tbrunpreproc = toolbox.algo(
             engine       = 'exec',
             interpreter  = "python",
             extendpypath = [
                 self.sh.path.join(self.conf.rootapp, d)
                 for d in ['sloop', 'sloop-env/lib/python3.7/site-packages']],
         )
         self.component_runner(tbrunpreproc, tbpreproc)
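     For clarity, the *extendpypath* entry above simply builds absolute paths to the extra Python packages. Below is a hedged, self-contained sketch of that list comprehension, with *os.path.join* standing in for *self.sh.path.join* and a hypothetical *rootapp* value in place of *self.conf.rootapp*:

     .. code-block:: python

         import os

         # Hypothetical application root; in the task this is self.conf.rootapp.
         rootapp = "/home/user/app"

         # Same comprehension as in the compute step above.
         extendpypath = [
             os.path.join(rootapp, d)
             for d in ["sloop", "sloop-env/lib/python3.7/site-packages"]
         ]

         print(extendpypath)
         # ['/home/user/app/sloop',
         #  '/home/user/app/sloop-env/lib/python3.7/site-packages']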
  3. *Backup*

     The files produced in the *compute* step can be stored in the *backup* step via the *output* function of the module *toolbox*. In the following example, the task would store the nesting hycom3d files in the vortex cache.

     .. code-block:: python

         tb06 = toolbox.output(
             vapp       = self.conf.vapp,
             vconf      = self.conf.vconf,
             experiment = self.conf.experiment,
             namespace  = self.conf.namespace,
             intent     = "out",
             role       = 'Output',
             model      = self.conf.model,
             date       = self.conf.rundate,
             geometry   = self.conf.geometry,
             kind       = "gridpoint",
             block      = ('preproc/' + (self.tag if "spnudge" not in self.tag
                                         else self.tag.split("_")[1])),
             field      = ["s", "t", "u", "v", "h"],
             format     = ["a", "b"],
             nativefmt  = "[format]",
             filename   = "[field]-nest.[format]",
         )

-----------------------

Resources
=========

Vortex manages two types of resources: *data* and *algo components*.

- **Data**

  The data resources are described by a *Resource* (i.e. what the data contain), a *Provider* (i.e. where to get/put the data), and a *Container* (i.e. where the data will be stored locally). The *Resource*, *Provider*, and *Container* are defined through attributes associated with the available data resources. For instance, the data resources implemented for sloop usage are stored in *vortex/src/shom/data*. There, three types of data resources are coded: consts, resources, and executables.

  1. **consts**

     The module *consts.py* covers the static files, such as a namelist of model parameters or any other file that does not change through time.

  2. **resources**

     The resources changing through time, such as the model outputs, are found in the module *resources.py*.

  3. **executables**

     Any executable (Fortran, Python, ...) is implemented in the module *executables.py*.

  Below is an example of an executable resource describing the program *inicon* for hycom3d:
  .. code-block:: python

      class Hycom3dIBCIniconBinary(Binary):
          """Binary that computes initial conditions for HYCOM."""

          _footprint = [
              gvar,
              gdomain,
              dict(
                  info="Binary that computes initial conditions for HYCOM",
                  attr=dict(
                      gvar=dict(default="master_hycom3d_ibc_inicon_[gdomain]"),
                      kind=dict(values=["vertical_regridder"]),
                      model=dict(values=["hycom3d"])
                  ),
              ),
          ]

          @property
          def realkind(self):
              return "hycom3d_ibc_inicon_binary"

          def command_line(self, **opts):
              return ("{datadir} {sshfile} {tempfile} {salnfile} "
                      "{nx} {ny} {nz} {cmoy} "
                      "{sshmin} {cstep}").format(**opts)

  Note that useful data resources are also implemented in other packages, such as *vortex/src/vortex/data*. Before coding a new resource, the developer has to check carefully through the other packages that it does not already exist.

- **Algo components**

  The algo components allow setting command line arguments or environment variables depending on the program to run. The algo components dedicated to sloop are implemented in the module *vortex/src/shom/algo/hycom3d.py*. Below is an example of an algo component implemented for running the program *inicon* for hycom3d:

  .. code-block:: python

      class Hycom3dIBCRunVerticalInicon(BlindRun, Hycom3dSpecsFileDecoMixin):
          """TODO Class Documentation.

          :note: Inputs::

              ${repmod}/regional.depth.a
              ${repmod}/regional.grid.a
              ${repparam}/ports.input
              ${repparam}/blkdat.input
              ${repparam}/defstrech.input

          :note: Exe::

              ${repbin}/inicon $repdatahorgrille ssh_hyc.cdf temp_hyc.cdf
              saln_hyc.cdf "$idm" "$jdm" "$kdm" "$CMOY" "$SSHMIN"
          """

          _footprint = [
              dict(
                  info="Run the initial and boundary conditions vertical interpolator",
                  attr=dict(
                      kind=dict(values=["hycom3d_ibc_run_vert_inicon"]),
                      sshmin=dict(),
                      cmoy=dict()
                  ),
              ),
          ]

          @property
          def realkind(self):
              return "hycom3d_ibc_run_vert_inicon"

          def prepare(self, rh, opts):
              """Get specs data from JSON."""
              super(Hycom3dIBCRunVerticalInicon, self).prepare(rh, opts)
              self._specs = self._get_specs_and_link("inicon.json")

          def spawn_command_options(self):
              """Prepare options for the resource's command line."""
              return dict(
                  datadir="./",
                  sshfile="ssh_hyc.cdf",
                  tempfile="temp_hyc.cdf",
                  salnfile="saln_hyc.cdf",
                  nx=self._specs["nx"],
                  ny=self._specs["ny"],
                  nz=self._specs["nz"],
                  cmoy=self.cmoy,
                  sshmin=self.sshmin,
                  cstep="0")

-----------------------

Caches
======

Before running a vortex job, the data have to be stored locally in caches. The location of the caches is set via the environment variable *MTOOLDIR*.

- **Vortex cache**

  The vortex cache covers the data changing through time. For instance, the raw location of the Mercator forecast on the MERCGM12 grid, in netcdf format, of the psy3 model at term 0h of the 2021/11/09 00UTC run, used in the hycom3d application, manga configuration and OPER experiment, would be::

      ${MTOOLDIR}/cache/vortex/hycom3d/manga/OPER/20211109T0000P/mercator/cpl.psy4.62n29n-16w37e-00deg083+0000:00.netcdf

- **Uget cache**

  The uget cache covers the data which do not change through time. These resources are directly linked with the constants and executables (see for instance `vortex/src/shom/data/consts.py` and `vortex/src/shom/data/executables.py`).
  For instance, for vapp = hycom3d and vconf = manga, the 0.03 version of the `savefield_*.input` files are stored in a tar file named `[vapp].savefield.[vconf]_[version]`. The raw location of this resource would be::

      ${MTOOLDIR}/cache/uget/${LOGNAME}/data/hycom3d.savefield.manga_0.03
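  Putting the naming convention together, such a raw location can be rebuilt from its components. The following is a hedged sketch using plain string formatting (not a vortex API); *mtooldir* and *logname* stand for the corresponding environment variables:

  .. code-block:: python

      # Components taken from the example above.
      mtooldir = "${MTOOLDIR}"
      logname = "${LOGNAME}"
      vapp, vconf, version = "hycom3d", "manga", "0.03"

      # The tar file name follows the [vapp].savefield.[vconf]_[version] convention.
      tarname = "{vapp}.savefield.{vconf}_{version}".format(
          vapp=vapp, vconf=vconf, version=version)

      # Raw uget cache location of this resource.
      path = "/".join([mtooldir, "cache", "uget", logname, "data", tarname])
      print(path)
      # ${MTOOLDIR}/cache/uget/${LOGNAME}/data/hycom3d.savefield.manga_0.03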