Developer documentation for the development of plugins

Merged: Adham Hashibon requested to merge ``developer-documentation`` into ``master``
Design
------

The application is based on four entities, as written in the introduction:
- Multi Criteria Optimizer (MCO)
- DataSources
- Notification Listeners
- UI Hooks

There are a few core assumptions about each of these entities:
- The MCO provides numerical values and injects them into a pipeline
  made of multiple layers. Each layer is composed of multiple data sources.
  The MCO can execute this pipeline directly, or indirectly by invoking
  ``force_bdss`` with the option ``--evaluate``. This invocation will produce,
  given a vector in the input space, a vector in the output space.
- The DataSources are entities that are arranged in layers. Each DataSource has
  inputs and outputs, called slots. These slots may depend on the configuration
  options of the specific data source. The pipeline is created by binding
  the outputs of data sources in the layers above to the inputs of data
  sources in a given layer. Each output can be designated as a KPI and thus
  be transmitted back to the MCO for optimization.
- The Notification Listener listens to the state of the MCO (Started/New step
  of the computation/Finished). It can be a remote database which is filled
  with the MCO results during the computation (e.g. the GUI ``force_wfmanager``
  bdss, before saving the workflow). Those operations won't be executed by the
  command line interface of the bdss.
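The layer and slot mechanics described above can be sketched in plain Python. This is an illustrative model only, not the actual force_bdss classes: the dictionary-based data sources and the ``run_pipeline`` helper are invented for the example.

```python
def run_pipeline(layers, mco_parameters):
    """Illustrative sketch: run layers top to bottom, binding output
    slots of earlier layers to input slots of later ones by name."""
    # Values visible to data sources; seeded with the MCO parameter vector.
    namespace = dict(mco_parameters)
    kpis = {}
    for layer in layers:
        # No ordering is guaranteed among data sources in the same layer,
        # so each one may only read values produced by earlier layers.
        results = {}
        for source in layer:
            inputs = [namespace[name] for name in source["input_slots"]]
            outputs = source["run"](*inputs)
            for name, value in zip(source["output_slots"], outputs):
                results[name] = value
                if name in source.get("kpis", ()):
                    # Outputs designated as KPIs go back to the MCO.
                    kpis[name] = value
        namespace.update(results)
    return kpis

# Two layers: the first squares the MCO parameter "x"; the second binds
# the "x2" output to its input and produces a KPI called "cost".
layers = [
    [{"input_slots": ["x"], "output_slots": ["x2"],
      "run": lambda x: (x * x,)}],
    [{"input_slots": ["x2"], "output_slots": ["cost"], "kpis": {"cost"},
      "run": lambda x2: (x2 + 1.0,)}],
]
kpis = run_pipeline(layers, {"x": 3.0})  # -> {"cost": 10.0}
```

The binding-by-name step is the essential idea: a data source never calls another data source directly, it only consumes named values that earlier layers have published.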
The result can be represented with the following data flow:
1. The MCO produces and, potentially by means of a Communicator (if the
   execution model is based on invoking ``force_bdss --evaluate``),
   injects a given vector of MCO parameter values in the pipeline.
2. These values are passed to one or more DataSources, organised in layers,
   each performing some computation or data extraction and producing results.
   Layers are executed in order from top to bottom. There is no order among
   DataSources on the same layer.
3. Results that have been classified as KPIs are then returned to the MCO
   (again, potentially via the Communicator).
4. The KPI values are then sent to the notification listeners with the
   associated MCO parameter values.
5. The cycle repeats until all evaluations have been performed.
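The five steps above can be condensed into one loop. Again a hypothetical sketch: ``propose``, ``evaluate_pipeline``, and the event names stand in for the real MCO, pipeline, and notification machinery.

```python
def mco_loop(propose, evaluate_pipeline, listeners, n_steps):
    """Sketch of the MCO data flow: inject parameters, collect KPIs,
    notify listeners, repeat until all evaluations are done."""
    for listener in listeners:
        listener("STARTED", None)                     # run begins
    history = []
    for step in range(n_steps):
        parameters = propose(step)                    # step 1: new input vector
        kpis = evaluate_pipeline(parameters)          # steps 2-3: pipeline + KPIs
        history.append((parameters, kpis))
        for listener in listeners:
            listener("NEW_STEP", (parameters, kpis))  # step 4: notify
    for listener in listeners:
        listener("FINISHED", None)                    # step 5: all done
    return history

# Toy usage: one parameter per step, the "pipeline" just doubles it.
events = []
history = mco_loop(
    propose=lambda step: [float(step)],
    evaluate_pipeline=lambda params: [2.0 * params[0]],
    listeners=[lambda event, payload: events.append(event)],
    n_steps=2,
)
# history -> [([0.0], [0.0]), ([1.0], [2.0])]
# events  -> ["STARTED", "NEW_STEP", "NEW_STEP", "FINISHED"]
```

Note how the listeners see every state transition but never influence the optimization itself, which matches their role as passive observers (a database, the ``force_wfmanager`` GUI).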