- The Fortran and Python implementations are interchangeable for the setup, the diffusion computation and the forecasts.

In addition:

- Implemented visualisation tools for trajectories and Lowes spectra.

- Possibility to save files in a format readable by [pygeodyn](https://gricad-gitlab.univ-grenoble-alpes.fr/Geodynamo/pygeodyn)

- Unit and functional tests covering around 35% of the code.

- Documentation generated using Sphinx

0.2

---

Numerous improvements aiming to optimise the code, the most important being:

* The possibility to run forecasts in parallel using `mpi4py`

* The possibility to save results during computation in `hdf5` or `ASCII` format.

* The numbers of forecasts and analyses are now determined **ONLY** from the config file
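As an illustration of the parallel-forecast idea (a hypothetical sketch, not the package's actual API: `run_forecast`, the member count and the round-robin split are made up here), ensemble members can be distributed across MPI ranks as follows:

```python
import numpy as np

try:  # fall back to a serial run when mpi4py is not installed
    from mpi4py import MPI
    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()
except ImportError:
    comm, rank, size = None, 0, 1

def run_forecast(member):
    """Placeholder for one ensemble-member forecast."""
    rng = np.random.default_rng(member)
    return rng.standard_normal(3)

n_forecasts = 8  # in the package this would come from the config file

# Round-robin split: rank r handles members r, r + size, r + 2*size, ...
local_results = [run_forecast(m) for m in range(rank, n_forecasts, size)]

# Gather everything on rank 0 (no-op in the serial fallback)
all_results = comm.gather(local_results, root=0) if comm else [local_results]
```

Since each member's forecast is independent, this split requires no communication until the final gather.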

In addition:

* Possibility to run analyses in parallel (of limited interest, as the algorithm used in this case is much slower while scientifically equivalent...)

* Some clean-up:

  - All raw data are now in the data folder

  - The *forecast* folder containing the Fortran sources was renamed to *fortran*

* Launching an algorithm can now be done in one line in `run_algo.py` thanks to the encapsulation of the `run` module.

0.3

---

Oops! Computation of diffusion using cross-covariances only works for samples of the same statistical ensemble... As a consequence:

* Deprecation of the AugKF algorithm with diffusion computation (now `legacy.augkf` in the deprecated module)

* Deprecation of the CorDE algorithm (now in the deprecated module as well)

* New implementation of the AugKF algorithm (formerly AugKF_dpe) using the master equation $`\dot{b} = A(b)u + e`$, where diffusion is taken as a contribution to ER through:

  - an AR-1 process on ER

  - an analysis on the augmented state $`z = [u^T e^T]`$
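A toy sketch of this structure (made-up sizes and coefficients, not the package's API): ER evolves through an AR-1 process and is stacked with the core flow into the augmented state:

```python
import numpy as np

rng = np.random.default_rng(0)

def ar1_step(e, phi, sigma):
    """One AR-1 update, e_{t+1} = phi * e_t + innovation.

    |phi| < 1 sets the memory of ER, sigma the innovation amplitude."""
    return phi * e + sigma * rng.standard_normal(e.shape)

u = rng.standard_normal(5)                     # core-flow coefficients (toy size)
e = ar1_step(np.zeros(5), phi=0.9, sigma=0.1)  # ER coefficients

# Augmented state analysed by the Kalman filter: z = [u^T e^T]
z = np.concatenate([u, e])
```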

Changes in observations:

* VO and GO data were added to the repository and can be used as observations for analyses.

* Dates in the algorithm are now handled with [NumPy datetime64](https://docs.scipy.org/doc/numpy/reference/arrays.datetime.html#arrays-dtypes-dateunits). Notably, this allows $`\Delta t`$ to be given in months in the config files rather than as floating-point dates that can have infinite decimals.
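For instance, month-resolution `datetime64` values step exactly, with no accumulating decimals:

```python
import numpy as np

# Month-resolution dates avoid floating-point epochs with infinite decimals
t0 = np.datetime64("1980-01", "M")
dt = np.timedelta64(6, "M")          # a 6-month step

analysis_times = t0 + dt * np.arange(4)
# array(['1980-01', '1980-07', '1981-01', '1981-07'], dtype='datetime64[M]')
```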

In addition:

* $`\Delta t`$ for Euler schemes and $`\Delta t_{forecasts}`$ are no longer different.

* Covariance matrices can be supplied as files in the prior folders. They will be computed from priors if not found.

* Deprecated files were put in a dedicated module that triggers a warning if imported.

* Various code improvements: clean-ups, docs written and tests added.

0.4

---

:warning: **This version introduced a bug in analysis_step (P_zz not scaled)! Use version 0.5 instead (same features but with the bug fixed).**

Scientific improvements of the code:

* Implementation of an AR-1 process using dense drift matrices (#42 and #48)

* Implementation of several scaling methods for dense matrices: v1 behaves identically for diagonal and dense AR; v2 is the recommended one for dense AR.

This was possible thanks to the following improvements in the code:

* Added high-resolution midpath data as an hdf5 file

* Redesigned CoreState to be able to dynamically set the measures in it (#46)

* Implemented the possibility to do the computation on a principal component analysis of U (PCA_U)

* The covariance of the analysed states Z is computed at each analysis and now enters the expression of the covariance matrix P_zz used for the augmented-state Kalman filter.
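The PCA_U idea can be sketched with plain NumPy (toy dimensions and variable names; the package's actual reduction may differ):

```python
import numpy as np

rng = np.random.default_rng(0)
U_ens = rng.standard_normal((200, 30))   # ensemble of core-flow coefficients

# PCA via SVD of the centred ensemble
mean = U_ens.mean(axis=0)
_, s, Vt = np.linalg.svd(U_ens - mean, full_matrices=False)

n_pc = 10                                 # retained principal components
U_reduced = (U_ens - mean) @ Vt[:n_pc].T  # computation happens in this space
U_back = U_reduced @ Vt[:n_pc] + mean     # back-projection to flow coefficients
```

Working in the reduced space shrinks the state handled by the filter from 30 to `n_pc` coefficients per member, at the cost of discarding the variance carried by the dropped components.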

Other minor improvements:

* Changed magnetic field key from 'B' to 'MF'

* Spectral observations are now truncated at the number of coefficients requested for the computation (Nb, Nsv)

* Asking for maximum degrees higher than the degree of the priors now raises a ValueError (#41)

* Fixed the docstrings of functions decorated by `with_core_state_of_dimensions`

* Added instructions to run the code on IST-OAR

0.5

---

:warning: **Important bug fix**:

* Fixed the covariance matrix of the analysed states Z that was not scaled (this led to inconsistent results when using GO/VO)

Improvements towards a usable package:

* Explained the arguments of `run_algo.py` in README.md

* Added an in-depth guide in `doc` to explain how to change the input data (priors/observations) and how the low-level features (forecast/analysis/_CoreState_) work (#50)

* Refactored the reading of observations/priors into separate functions that are called dynamically according to the types requested in the config (#49)

* Redesigned the observations to have a single object _Observation_ handling data, operator and errors.

* Renamed the _Forecasts_/_Analysis_ objects to _Forecaster_/_Analyser_

Testing:

* Implemented the use of `hypothesis.strategies` and composite strategies as input for several tests

* Added basic functional tests of `run_algo.py` and other tests (coverage: 78%!)

0.6

---

**Important bug fixes:**

- Fixed the time sampling for dense AR matrices that led to no variation in forecasts (#54)

- Observation error matrices are now properly set for COVOBS (#57)

In addition:

- The seed arg is now working (#56): it allows reproducible stochastic processes **when using the same number of MPI processes**. Noising of the GO_VO observations does not use this global seed but prints the seeds used in the debug logs.

- Misfits between the analysed states and the observations are now saved at analysis times (CoreState-like behaviour) (#58)

- Updated GO_VO data with 2018 data

- Updated guide for advanced users

- Added some tests (coverage: 79%)
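One way such rank-dependent reproducibility can be obtained (a sketch using NumPy's `SeedSequence`, not necessarily what the package does) is to spawn one child stream per rank from the global seed; the global result then only reproduces for a fixed number of MPI processes, because changing the process count changes how the ensemble members are split across the streams:

```python
import numpy as np

def rank_rng(global_seed, rank, n_ranks):
    """Derive an independent, reproducible random stream for each MPI rank.

    Each rank gets its own child of the global SeedSequence, so the same
    (global_seed, rank) pair always yields the same draws."""
    children = np.random.SeedSequence(global_seed).spawn(n_ranks)
    return np.random.default_rng(children[rank])

# Same global seed, same rank -> identical draws
a = rank_rng(42, rank=0, n_ranks=4).standard_normal(3)
b = rank_rng(42, rank=0, n_ranks=4).standard_normal(3)
```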

0.7

---

New features:

* PCA_U can now be performed with energy normalisation (PCA is performed on core-flow energy coefficients rather than directly on core-flow coefficients)

* CoreState can now be initialised in several fashions:

  - *constant*: equal to the average prior (for MF, U and ER)

  - *normal*: normal draw around the average prior (variance = deviation of the prior) (for MF, U and ER)

  - *from_file*: equal to the corestate of a given file (*init_file*) at a given date (*init_date*) (for all measures)
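A minimal sketch of such a dispatch (the function and argument names are illustrative, not the package's API; the *from_file* branch, which would read a corestate at *init_date* from *init_file*, is only hinted at):

```python
import numpy as np

def init_corestate(fashion, prior_mean, prior_std, rng=None):
    """Return an initial state following the requested fashion."""
    if fashion == "constant":
        return prior_mean.copy()
    if fashion == "normal":
        rng = rng or np.random.default_rng()
        return prior_mean + prior_std * rng.standard_normal(prior_mean.shape)
    # "from_file" would read the corestate at init_date from init_file
    raise ValueError(f"unknown initialisation fashion: {fashion!r}")

mean, std = np.zeros(4), np.ones(4)
state = init_corestate("normal", mean, std, rng=np.random.default_rng(1))
```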

* Added higher resolution COVOBS files (COVOBS_hd). The COVOBS loading method now computes the error matrix R directly from the files rather than using a `var_*` file.

* The configuration file can now be generated from the hdf5 file of a previous calculation

Other improvements:

* Matrix inversions required by misfits computation were refactored