as a side effect, two-point gradients are now used by default for the
vertex-centered finite volume discretization. (P1-FE gradients require
the FE shape functions, and those are only provided by the optional
dune-localfunctions module.)
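
schematically, the two-point approximation only needs the values and
positions of the two degrees of freedom, no shape functions. a minimal
sketch (illustrative names, not the actual eWoms API):

    #include <array>

    using Vector = std::array<double, 3>;

    // hypothetical sketch: two-point gradient between two degrees of
    // freedom at positions xI and xJ with values uI and uJ. the
    // gradient points along the line connecting the DOF positions.
    Vector twoPointGradient(const Vector& xI, const Vector& xJ,
                            double uI, double uJ)
    {
        Vector grad;
        double dist2 = 0.0;
        for (unsigned i = 0; i < 3; ++i) {
            grad[i] = xJ[i] - xI[i]; // direction between the DOF positions
            dist2 += grad[i]*grad[i]; // squared distance |xJ - xI|^2
        }
        for (unsigned i = 0; i < 3; ++i)
            grad[i] *= (uJ - uI)/dist2; // slope along that direction
        return grad;
    }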
as a consequence, this triggered an update of quite a few reference
solutions: the differences are measurable, but as far as I can see,
the results are comparable. also, this commit slightly regresses the
performance of the test for the reservoir problem with the
vertex-centered finite volume scheme. While I would not bet a house on
the reason, I'm pretty sure that this is caused by the switch from P1-FE
gradients to two-point ones.
Note that even though I'm the author of this patch, it shamelessly
rips off substantial parts of @dr-robertk's patch:
https://github.com/OPM/ewoms/pull/69
this problem did not work properly anyway: it oscillated like hell
(very likely due to the spatial discretization used being
inappropriate) and it did not even converge if more than a single
iteration was required.
i.e., in most cases the initial solution will be written to the file
$SIM_NAME-00000.vtu and each time step will be written to the file
with the same number as the one printed during the simulation; e.g.,
the solution of time step 17 ends up in $SIM_NAME-00017.vtu. (note
that this does not apply if visualization files are not written for
all time steps.)
now, the dune-alugrid module is required if these tests are to be
run. (note that since the OPM build system has not been detecting the
legacy ALUGrid library for a while, the practical implications of this
patch should be small to non-existent.)
the in-file lists of authors have been removed in favor of a global
list of authors in the LICENSE file. this is done because (a)
maintaining a list of authors at the beginning of a file is a major
pain in the a**, (b) the list of authors was inaccurate in about 85%
of the cases where more than one person was involved, and (c) such
lists are not legally binding in any way: the copyright stays with the
person who authored a given change, and if these lists had any legal
relevance, one could "acquire" the copyright of the module by forking
it and removing the lists...
the only exception to this is the eWoms fork of dune-istl's solvers.hh
file, which keeps its in-file list because the authors of that file do
not appear in the global list. Further, carrying the fork of that file
is required because we would like to use a reasonable convergence
criterion for the linear solver. (the solvers from dune-istl neither
support user-defined convergence criteria nor do the developers want
to support them; my patch was rejected a few years ago.)
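
to illustrate what the fork enables, here is a rough sketch of a
pluggable convergence criterion interface (hypothetical names; the
actual classes in the forked solvers.hh differ):

    // hypothetical interface: the linear solver calls update() once
    // per iteration and stops as soon as converged() returns true,
    // instead of hard-coding a fixed reduction of the defect like the
    // stock dune-istl solvers do
    template <class Vector>
    class ConvergenceCriterion
    {
    public:
        virtual ~ConvergenceCriterion() = default;

        // inspect the current iterate and its defect
        virtual void update(const Vector& x, const Vector& defect) = 0;

        // decide whether the linear solver may stop
        virtual bool converged() const = 0;
    };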
this has slowly become a hassle to support (i.e., it cluttered the
source with many #if's and, in particular, the code was not tested
with Dune 2.2 on a regular basis). Also, Dune 2.3 has been out for
more than two years, so IMO it is not asking too much to expect people
who want to use the latest and greatest version of eWoms to upgrade
their Dune.
the changes enable the storage cache and the intensive quantity cache
for all simulators of the lens problem and automatic differentiation
for the one which uses the ECFV discretization.
while the performance improvements are not worthwhile for the problem
in its default incarnation (using automatic differentiation even
slightly degrades performance), linearization speeds up by about
30% if the grid exhibits 16 times as many elements (e.g., by passing
the --grid-global-refinements=2 parameter).
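
for context, the intensive quantity cache avoids recomputing the
quantities derived from the primary variables on every residual
evaluation; a conceptual sketch (illustrative only, not the eWoms
implementation):

    #include <cstddef>
    #include <vector>

    // conceptual sketch of an intensive quantity cache: derived values
    // are computed at most once per DOF and reused until the primary
    // variables of that DOF change
    template <class IntensiveQuantities>
    class IntensiveQuantityCache
    {
    public:
        explicit IntensiveQuantityCache(std::size_t numDof)
            : cache_(numDof), valid_(numDof, false)
        {}

        // call this after the primary variables of a DOF changed
        void invalidate(std::size_t dofIdx)
        { valid_[dofIdx] = false; }

        // recompute via 'update' only if the cached value is stale
        template <class UpdateFn>
        const IntensiveQuantities& get(std::size_t dofIdx, UpdateFn&& update)
        {
            if (!valid_[dofIdx]) {
                cache_[dofIdx] = update(dofIdx);
                valid_[dofIdx] = true;
            }
            return cache_[dofIdx];
        }

    private:
        std::vector<IntensiveQuantities> cache_;
        std::vector<bool> valid_;
    };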
dune-alugrid >= 2.4 changed the element ordering from lexicographic to
one defined by a space-filling curve. the old reference solutions are
still valid (and are obtained if older versions of dune-alugrid are
used) and are thus retained.
- start with an initial "do nothing" episode of 100 days to get
hydrostatic conditions.
- after that, produce oil and inject water for 900 days. (thereafter
the reservoir will be empty.)
- make the problem work with element-centered FV discretizations. this
  requires making the injection/production areas at least one cell
  wide, which is achieved by the new "WellWidth" property specifying
  the width of wells as a factor of the total domain width. (see the
  sketch after this list.)
- make the problem work with fully compositional models. This requires
  calculating the full composition for the fluid states which specify
  the initial condition and the thermodynamic state at the wells.
- add tests and reference solutions for any combination of the {ECFV,
VCFV} discretizations and the {black-oil, NCP} models.
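
a minimal sketch of how the relative well width could translate into
an absolute extent (hypothetical function, not the verbatim problem
code):

    #include <algorithm>

    // hypothetical sketch: compute the absolute extent of a well area
    // from the "WellWidth" property (a factor of the total domain
    // width) and ensure that it covers at least one cell, as required
    // by the element-centered FV discretizations
    double wellExtent(double wellWidthFactor, // value of the WellWidth property
                      double domainWidth,     // total width of the domain
                      double cellWidth)       // width of a single grid cell
    {
        return std::max(wellWidthFactor*domainWidth, cellWidth);
    }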
- the residual no longer considers constraints
- instead, the central place for constraints is the linearizer:
- it gets a constraintsMap() method which is analogous to residual()
  but stores (DOF index, constraints vector) pairs because typically
  only very few DOFs need to be constrained. (see the sketch below.)
- the Newton method consults the linearizer's constraints map to update
  the error and the current iterative solution. the primary variables
  for constrained degrees of freedom are now directly copied from the
  'Constraints' object to correctly handle pseudo primary variables.
- the ability to specify partial constraints is removed, i.e., it is
  no longer possible to constrain some equations/primary variables of
  a degree of freedom without specifying all of them. The reason is
  that, AFAICS, with partially constrained DOFs it is impossible to
  specify the pseudo primary variables for models which require them
  (PVS, black-oil).
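
a rough sketch of the data structure involved (hypothetical types, not
the verbatim eWoms code):

    #include <map>

    // hypothetical sketch: the linearizer stores constraints sparsely
    // as (DOF index, constraints vector) pairs because typically only
    // very few DOFs are constrained
    template <class Constraints>
    class Linearizer
    {
    public:
        using ConstraintsMap = std::map<unsigned, Constraints>;

        // analogous to residual(), but sparse
        const ConstraintsMap& constraintsMap() const
        { return constraintsMap_; }

    private:
        ConstraintsMap constraintsMap_;
    };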
because of this, the reference solution for the Navier-Stokes test
is updated. the test still oscillates like hell, but fixing this
would require implementing spatial discretizations that are either
better in general (e.g., DG methods) or adapted to Navier-Stokes
problems (e.g., staggered-grid FV methods). since both of these are
currently quite low on my list of priorities, let's just accept the
oscillations for now.
the utility is now more verbose (it actually prints what it does or
does not do), the test converts the example ART file shipped with
eWoms, and the art2dgf utility is no longer considered a "unit test"
(instead, it is an application).
they are required because the element ordering of the latest
dune-alugrid has changed. (it now uses a space-filling curve instead
of lexicographic ordering.)
I cannot really say which one is better, but at least the new one
looks more reasonable: gas appears at the top of the reservoir where
the pressure is lower instead of close to the bottom.
the solution itself did not change, but yesterday's change of the
phase indices of the black-oil fluid system caused the fields to be
written in a different order.
The most important issue was that PR#20 removed the possibility to use
a test driver script. In eWoms, this script is responsible for
(fuzzily) comparing the results of the test runs with reference
solutions, for running the simulators in parallel and comparing the
results, and for making sure that the --help option continues to work
as intended.
the approach taken to bring this functionality back is to specify the
test driver script and its parameters using the new macro
'opm_set_test_driver(${DRIVER_COMMAND} ${DRIVER_DEFAULT_ARGUMENTS})'
before the first call to 'opm_add_test'. If no test driver is
specified, the binary is run 'naked', so nothing should change in this
case.
this patch requires the changes for opm-cmake to be merged first.
seems like the changes for vaporized oil caused it to be
different. Since SPE1 and SPE9 still deliver the same result and the
new reference solution is closer to what one would expect, the new
result is probably the correct one...
this helps to keep the core blackoil model code lean and mean and it
is also less confusing for newbies because the ECL blackoil simulator
is not a "test" anymore.
in case somebody wonders, "ebos" stands for "&eWoms &Black-&Oil
&Simulator". I picked this name because it is short, a single
syllable, has not been taken by anything else (as far as I know), and
"descriptive" names are rare for programs anyway: everyone who does
not yet know about 'git' or 'emacs' and tells me that based on their
names alone they must be a source-code management system and an editor
gets a crate of beer sponsored by me!
before this, gradients had the direction of the face between two
finite volumes (i.e., its normal); now they exhibit the direction
connecting the two FV centers.
For axis-aligned grids the result is identical for interior faces, but
it differs for boundary faces or if faces are not axis-aligned. This
patch fixes the SPE9 troubles with anisotropic permeabilities on
tilted grids...
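
schematically, for DOF values u_i, u_j at positions x_i, x_j and a
face with unit normal n_f, the change amounts to:

    % old: the slope is assigned to the face-normal direction
    \nabla u \approx \frac{u_j - u_i}{\|x_j - x_i\|} \, n_f

    % new: the slope is assigned to the direction connecting the DOF centers
    \nabla u \approx \frac{u_j - u_i}{\|x_j - x_i\|^2} \, (x_j - x_i)

for an axis-aligned interior face, n_f and (x_j - x_i)/\|x_j - x_i\|
coincide, so nothing changes there; on boundary faces or tilted grids
they differ.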
the goal is to make it faster on computers with many cores: the
easiest way to do this is to ensure that the longest-running tests do
not take too much time and that they all need about the same time.
Thus this patch contains the following changes, which limit the CPU
time taken by each test to about two minutes in debug mode on my
machine:
- the water-air problem using the non-isothermal primary variable
  switching model now uses a 16x16 instead of a 32x32 grid. to
  compensate, it now runs for a year instead of 5000 seconds, and
  global grid refinement is now tested.
- the end time of the lens problem ctests is now 3000 instead of
30000 seconds. The binary itself does not change at all.
- sort the tests in the CMakeLists.txt roughly in the order of their
  required time. (this keeps ctest from having to wait too long for
  long-running tests which were started late.)
this is required so that the element-centered finite volume method
does not handle each partition of the domain separately. (i.e., so
that fluxes across faces on the process boundaries are considered.)
mea culpa!
The fix for this is to also include these entries in the matrix which
uses domestic indices. This required some rather extensive changes to
the blacklisting mechanism, because it must now be possible to
translate the index of a blacklisted entity (i.e., an entity in a
ghost or overlap cell) to a domestic index (i.e., the corresponding
index in the algebraic overlap).
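
in rough terms, the blacklist now also carries this translation
(hypothetical sketch, not the verbatim code):

    #include <map>

    // hypothetical sketch: map the native index of a blacklisted
    // entity (one in a ghost or overlap cell) to its domestic index
    // in the algebraic overlap
    class BlackList
    {
    public:
        void registerIndex(int nativeIdx, int domesticIdx)
        { nativeToDomestic_[nativeIdx] = domesticIdx; }

        // returns -1 if the entity is not blacklisted
        int domesticIndexOf(int nativeIdx) const
        {
            auto it = nativeToDomestic_.find(nativeIdx);
            return (it == nativeToDomestic_.end()) ? -1 : it->second;
        }

    private:
        std::map<int, int> nativeToDomestic_;
    };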
Note that the code for algebraic overlaps is *fun* and the person who
wrote it should be tarred and feathered. (*ouch* ;)) Seriously: Better
approaches than "lets-throw-this-away-and-use-grid-overlaps" are
deeply appreciated. (The grid overlap is not really useful in Dune
because only "Mickey Mouse grids" like Dune::YaspGrid support it.)
For some reason, it changed because of the transition to the primary
variable switching approach. Because I trust the new result more than
the old, let's make this the new reference solution.
this also comes with moving responsibilities around and some smaller
cleanups for the grid creation. (although grid creation could possibly
be done by the simulator now, the GridCreator concept has not been
abandoned yet...)
this allows retrieving the name of the problem before it is
instantiated, which is required to print the "Initializing problem"
message at the correct point (i.e., before instantiating the
problem).
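
a minimal sketch of the mechanism, assuming it boils down to a static
method on the problem class (names illustrative):

    #include <iostream>
    #include <string>

    // hypothetical sketch: a static name() method can be queried
    // without constructing the (potentially expensive) problem object
    class LensProblem
    {
    public:
        static std::string name()
        { return "lens"; }
    };

    int main()
    {
        // print the message before the problem object exists
        std::cout << "Initializing problem \"" << LensProblem::name()
                  << "\"\n";
        return 0;
    }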