def __init__ (self, name, patch_size, bool amend_priorities, KernelParallelisation parallelisation_of_kernels)
def create_action_sets (self)
def __init__ (self, name, patch_size, min_volume_h, max_volume_h, pde_terms_without_state)
def add_tracer (self, name, coordinates, project, number_of_entries_between_two_db_flushes, data_delta_between_two_snapsots, time_delta_between_two_snapsots, clear_database_after_flush, tracer_unknowns)
def __init__ (self)
def enable_second_order (self)
def add_all_solver_constants (self)
def add_makefile_parameters (self, peano4_project, path_of_ccz4_application)
|
Construct the Finite Volume (limiter) scheme
We assume that the underlying Finite Differences scheme has a patch
size of 6x6x6. To make the Finite Volume scheme's time stepping (and
accuracy) match this patch size, we have to employ a 16 times finer
mesh.
It is interesting to see that the limiter does not really have a min
and max mesh size. The point is that the higher order solver dictates
the mesh structure, and we then follow this structure with the
Finite Volume scheme.
Definition at line 45 of file SBH.py.
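The following is a purely illustrative sketch of the resolution relation described above. Only the numbers 6 (FD patch size) and 16 (refinement factor) come from this documentation; the helper name and the exact reading of "16 times finer" are assumptions, not the actual SBH.py code.

# Illustrative sketch only: relate the Finite Differences resolution to the
# Finite Volume (limiter) resolution. Only 6 (FD patch size) and 16
# (refinement factor) stem from the documentation text above.
FD_PATCH_SIZE     = 6    # underlying Finite Differences scheme uses 6x6x6 patches
REFINEMENT_FACTOR = 16   # the limiter works on a 16 times finer mesh

def fv_volume_width(fd_cell_width):
    """Width of a single finite volume, assuming the limiter subdivides the
    FD grid spacing (cell width divided by patch size) another 16 times."""
    fd_grid_spacing = fd_cell_width / FD_PATCH_SIZE
    return fd_grid_spacing / REFINEMENT_FACTOR

if __name__ == "__main__":
    # Example: an FD cell of width 0.3 yields finite volumes of width 0.3/96.
    print(fv_volume_width(0.3))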
def SBH.Limiter.create_action_sets (self)
There is not a lot to do here. The one really important exception is
that we have to ensure that we only solve inside the local domain of
the FV solver. By default, ExaHyPE 2 solves the PDE everywhere. Even
if data is not stored persistently or loaded from the persistent
stacks, it still solves, as it then assumes that such data arises
from dynamic AMR. In this particular case, we really have to mask
out certain subdomains.
This is not just a nice optimisation. It is absolutely key, as
applying the compute kernel to garbage data would leave us with
invalid eigenvalues.
Definition at line 179 of file SBH.py.
References coupling.StaticCoupling.StaticCoupling._name.
Referenced by mgccz4.MGCCZ4Solver.add_derivative_calculation(), and ccz4.CCZ4Solver.add_Psi4W().
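A minimal, self-contained sketch of the masking pattern described for create_action_sets() is given below. The stub classes, the attribute names _action_set_update_cell and guard, and the predicate isCellInLimiterDomain are assumptions for illustration only; the real limiter builds on the ExaHyPE 2 solver classes instead of these stubs.

# Hypothetical sketch of masking out subdomains in create_action_sets().
# All class, attribute and predicate names below are assumptions; they merely
# illustrate the "solve only inside the local FV domain" idea described above.

class _UpdateCellActionSet:
    """Stand-in for the action set that applies the FV compute kernel."""
    def __init__(self):
        # By default, ExaHyPE 2 would solve the PDE everywhere.
        self.guard = "true"

class _BaseFVScheme:
    """Stand-in for the Finite Volume base solver."""
    def create_action_sets(self):
        self._action_set_update_cell = _UpdateCellActionSet()

class Limiter(_BaseFVScheme):
    def create_action_sets(self):
        super().create_action_sets()
        # Mask out subdomains: apply the kernel only inside the local FV
        # domain, so it never runs on garbage data and hence never produces
        # invalid eigenvalues. The predicate name below is hypothetical.
        self._action_set_update_cell.guard += \
            " and repositories::isCellInLimiterDomain(marker)"

if __name__ == "__main__":
    limiter = Limiter()
    limiter.create_action_sets()
    print(limiter._action_set_update_cell.guard)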