For this particular model WetGasPVT::saturationPressure threw because
the Newton solver did not converge within 20 iterations. Unfortunately,
the exception was only seen on one MPI rank and the others continued.
With this commit we communicate the problem and throw on all MPI
processes. The time step will be cut as a result.
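Below is a minimal sketch of that pattern, assuming plain MPI calls rather
than the communication wrappers actually used in OPM; the helper name
invokeAndThrowEverywhere() is illustrative, not the real code.

```cpp
#include <mpi.h>
#include <stdexcept>

// Catch a rank-local exception, agree on the failure across the
// communicator, and then throw on every rank so that all of them fail the
// nonlinear iteration and cut the time step together.
template <class Func>
void invokeAndThrowEverywhere(Func&& func, MPI_Comm comm)
{
    int localFailure = 0;
    try {
        func();                       // e.g. the saturationPressure() evaluation
    } catch (const std::exception&) {
        localFailure = 1;             // remember the failure, do not rethrow yet
    }

    int globalFailure = 0;
    MPI_Allreduce(&localFailure, &globalFailure, 1, MPI_INT, MPI_MAX, comm);

    if (globalFailure) {
        // Every rank reaches this point and throws, so the failure is
        // handled collectively instead of on a single process.
        throw std::runtime_error("Exception on at least one MPI process");
    }
}
```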
For multisegment wells the underlying call to
MultisegmentWell::updateWellStateWithTarget (at least if
updateWellStateWithTHPTargetProd is called for a producer under THP
control) might throw, as there might be a singular matrix during the
solve needed in MultisegmentWell::iterateWellEqWithControl.
Previously, if that happened, the MPI process where it happened would
stop the nonlinear iteration as failed and retry with a chopped time
step. The others might go on with the current time step and we would
see MPI errors about truncated messages.
Now we communicate any exception happening during this part of
WellModel::updateAndCommunicate and all processes will stop the
nonlinear iteration as failed and chop the time step.
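As a rough illustration, the guard sketched above could wrap the well
update; updateWellControlsForWell() is a stand-in for the call chain down
to MultisegmentWell::updateWellStateWithTarget, not the actual interface.

```cpp
void updateWellControlsForWell();   // stand-in declaration, may throw on a singular matrix

void updateAndCommunicateSketch(MPI_Comm comm)
{
    // Turn a rank-local failure into a collective one before any further
    // communication between the processes happens.
    invokeAndThrowEverywhere([]() { updateWellControlsForWell(); }, comm);

    // ... only after the collective check does the usual well-data exchange run.
}
```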
We sometimes experience singular matrices when solving multisegment
wells. In that case (here during
BlackoilModelEbos::assembleReservoir <-
BlackoilModelEbos::initialLinearization <-
BlackoilModelEbos::nonlinearIterationNewton) an exception is thrown
when updating the controls of a well.
The problem here is that this exception only happens on one
process. That process goes to the catch block in
NonLinearSolverEbos::step, marks the nonlinear solve as failed and
cuts the time step. The others move on to the collective communication
below. Eventually all ranks end up in non-matching collective
communications with different data types and we get an MPI error
saying that a message was truncated.
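The toy example below (not OPM code) shows how such a mismatch arises:
the throwing rank skips a collective that the other ranks enter, so later
collectives get paired with the wrong calls and the wrong data types.

```cpp
#include <mpi.h>
#include <stdexcept>
#include <vector>

void nonlinearStepSketch(MPI_Comm comm)
{
    int rank = 0;
    MPI_Comm_rank(comm, &rank);

    try {
        if (rank == 0) {
            // Stands in for the singular-matrix failure during the well update.
            throw std::runtime_error("Singular matrix in well solve");
        }

        // Every rank except rank 0 enters this collective; rank 0 never does.
        std::vector<double> residual(3, 1.0);
        MPI_Allreduce(MPI_IN_PLACE, residual.data(), 3, MPI_DOUBLE, MPI_SUM, comm);
    } catch (const std::exception&) {
        // Rank 0 cuts the time step locally and later issues different
        // collectives, which then mismatch with the Allreduce above.
    }
}
```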
Now all processes will throw, terminate the nonlinear solver and cut
the time step, as they should.
Using bool here is at least frowned upon. To be honest, I have no idea
what happens underneath if we pass a bool. In contrast to other POD
types we do not associate it with a builtin MPI type (I am not even
sure what one would use). Hence we probably create a custom type for
sending and receiving. That should work, but I have no idea what will
be used for summation.
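A minimal sketch of the kind of workaround this suggests, assuming plain
MPI calls rather than the Dune/OPM communication wrappers; the helper name
anyRankFailed() is illustrative only.

```cpp
#include <mpi.h>

// Map the bool onto an int so that both the MPI datatype and the reduction
// operation are unambiguously defined.
bool anyRankFailed(bool localFailed, MPI_Comm comm)
{
    int localFlag  = localFailed ? 1 : 0;
    int globalFlag = 0;

    // MPI_MAX (or MPI_LOR) over MPI_INT acts as a logical "or" across ranks.
    MPI_Allreduce(&localFlag, &globalFlag, 1, MPI_INT, MPI_MAX, comm);

    return globalFlag != 0;
}
```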
BTW: I am debugging a case that previously crashed and now suddenly
works and this seems to be the only relevant change I made in the
meantime.
This commit adds a new flag data member,
wellStructureChangedDynamically_
to the generic black-oil well model. This flag captures the
well_structure_changed
value from the 'SimulatorUpdate' structure in the updateEclWells()
member function. Then, in BlackoilWellModel::beginTimeStep(), we
key a well structure update off this flag when set. This, in turn,
enables creating or opening wells as a result of an ACTIONX block
updating the structure in the middle of a report step.
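A condensed sketch of that flag hand-off is shown below; the real
BlackoilWellModel classes carry far more state, and initWellContainer()
here is only a placeholder for whatever rebuild the actual implementation
performs.

```cpp
struct SimulatorUpdate {
    bool well_structure_changed = false;
    // ... other dynamic schedule updates
};

class BlackoilWellModelSketch {
public:
    void updateEclWells(const SimulatorUpdate& sim_update)
    {
        // Capture the fact that an ACTIONX block changed the well
        // structure in the middle of the report step.
        wellStructureChangedDynamically_ = sim_update.well_structure_changed;
    }

    void beginTimeStep()
    {
        if (wellStructureChangedDynamically_) {
            // Rebuild the well structure so newly created or reopened wells
            // are picked up before the next time step, then reset the flag.
            initWellContainer();
            wellStructureChangedDynamically_ = false;
        }
    }

private:
    void initWellContainer() { /* placeholder for the actual rebuild */ }

    bool wellStructureChangedDynamically_ = false;
};
```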