How to use the multigrid "Trilinos/ml" preconditioner when FEniCSx is installed from conda-forge?

Hello :wave: :grinning:
In the "Component-wise Dirichlet BC" example provided in the FEniCSx tutorial, I changed the mesh to 3D, increased the number of elements, and tried to solve with the "trilinos/ml" (multilevel) preconditioner.
(I installed FEniCSx from conda-forge; the version is 0.7.2.)

The code I modified is below.

L = 2
H = 1
W = 1
lambda_ = 1.25
mu = 1
rho = 1
g = 1

from mpi4py import MPI
from petsc4py import PETSc
from petsc4py.PETSc import ScalarType
from dolfinx.fem import (Constant, dirichletbc, Function, FunctionSpace, locate_dofs_geometrical,
                         locate_dofs_topological, form)
from dolfinx.fem.petsc import assemble_matrix, assemble_vector, apply_lifting, set_bc
from dolfinx.mesh import create_box, locate_entities_boundary, CellType, GhostMode
from ufl import Identity, Measure, TestFunction, TrialFunction, VectorElement, dot, dx, inner, grad, nabla_div, sym
import numpy as np

mesh = create_box(MPI.COMM_WORLD, np.array([[0.0, 0.0, 0.0], [L, H, W]]), [256, 128, 128], cell_type=CellType.tetrahedron,
                  ghost_mode=GhostMode.shared_facet)
element = VectorElement("CG", mesh.ufl_cell(), 1)
V = FunctionSpace(mesh, element)
uh = Function(V)

def clamped_boundary(x):
    return np.isclose(x[1], 0)

u_zero = np.array((0,)*mesh.geometry.dim, dtype=ScalarType)
bc = dirichletbc(u_zero, locate_dofs_geometrical(V, clamped_boundary), V)

def right(x):
    return np.logical_and(np.isclose(x[0], L), x[1] < H)
boundary_facets = locate_entities_boundary(mesh, mesh.topology.dim-1, right)
boundary_dofs_x = locate_dofs_topological(V.sub(0), mesh.topology.dim-1, boundary_facets)

bcx = dirichletbc(ScalarType(0), boundary_dofs_x, V.sub(0))
bcs = [bc, bcx]
T = Constant(mesh, ScalarType((0, 0, 0)))

ds = Measure("ds", domain=mesh)

def epsilon(u):
    return sym(grad(u))
def sigma(u):
    return lambda_*nabla_div(u)*Identity(len(u)) + 2*mu*epsilon(u)

u = TrialFunction(V)
v = TestFunction(V)
f = Constant(mesh, ScalarType((0, -rho*g, 0)))
a = inner(sigma(u), epsilon(v)) * dx
L = dot(f, v) * dx + dot(T, v) * ds

_a = form(a)
_L = form(L)

_A = assemble_matrix(_a, bcs)
_A.assemble()

_solver = PETSc.KSP().create(mesh.comm)
opts = PETSc.Options()

# ML ---------------------------------
opts["ksp_type"] = "cg"
opts["ksp_rtol"] = 1.0e-8
opts["pc_type"] = "ml"
opts["mg_levels_pc_factor_levels"] = 4
_solver.setFromOptions()

_b = assemble_vector(_L)
apply_lifting(_b, [_a], bcs=[bcs])
_b.ghostUpdate(addv=PETSc.InsertMode.ADD, mode=PETSc.ScatterMode.REVERSE)  # accumulate ghost values before set_bc (needed in parallel)
set_bc(_b, bcs)
_solver.setOperators(_A)

_solver.solve(_b, uh.vector)
_solver.view()

When I ran the code above, I found that the "trilinos/ml" package does not exist in this PETSc build (I also checked with the conda list command):

  File "petsc4py/PETSc/KSP.pyx", line 616, in petsc4py.PETSc.KSP.setFromOptions
petsc4py.PETSc.Error: error code 86
[0] KSPSetFromOptions() at /home/conda/feedstock_root/build_artifacts/petsc_1702328665571/work/src/ksp/ksp/interface/itcl.c:359
[0] PCSetFromOptions() at /home/conda/feedstock_root/build_artifacts/petsc_1702328665571/work/src/ksp/pc/interface/pcset.c:146
[0] PCSetType() at /home/conda/feedstock_root/build_artifacts/petsc_1702328665571/work/src/ksp/pc/interface/pcset.c:59
[0] Unknown type. Check for miss-spelling or missing package: https://petsc.org/release/install/install/#external-packages
[0] Unable to find requested PC type ml
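
For reference, a quick way to check from petsc4py which preconditioner types a given PETSc build accepts is a probe like the sketch below (PCSetType raises the same "Unknown type" error, code 86, for any type that was not compiled in):

# Minimal sketch: probe which PC types this PETSc build accepts.
from petsc4py import PETSc

for pc_type in ["gamg", "hypre", "ml"]:
    pc = PETSc.PC().create()
    try:
        pc.setType(pc_type)
        print(f"{pc_type}: available")
    except PETSc.Error as e:
        print(f"{pc_type}: not available (error code {e.ierr})")
    pc.destroy()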

My questions are:
a. Is there another way to use the "trilinos/ml" preconditioner?

b. Or, how do I install additional external PETSc packages in an environment where FEniCSx was installed from conda-forge? Is there a way to re-configure a user-installed PETSc from source?

It's been a while since I posted my question, but I realized that I didn't leave any information about my OS. That was careless of me, sorry. :sweat_smile:

My OS is Ubuntu 22.04 LTS. I later tried to install trilinos/ml on my machine to solve the problem; in short, the installation was not successful.
The reason I want to install Trilinos is that ML is a preconditioner supported by PETSc, like hypre and gamg, and I would like to compare their wall-time performance.
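
As a sketch of the comparison I have in mind (reusing the _A, _b, uh, and mesh objects from the code in my question; gamg is built into PETSc and hypre ships with the conda-forge build, so only ml is missing):

# Sketch: time the same CG solve with the AMG preconditioners that are
# available in the conda-forge PETSc build (gamg, hypre/boomeramg).
import time
from petsc4py import PETSc

for pc_type in ["gamg", "hypre"]:
    solver = PETSc.KSP().create(mesh.comm)
    opts = PETSc.Options()
    opts["ksp_type"] = "cg"
    opts["ksp_rtol"] = 1.0e-8
    opts["pc_type"] = pc_type
    solver.setFromOptions()
    solver.setOperators(_A)
    t0 = time.perf_counter()
    solver.solve(_b, uh.vector)
    print(pc_type, "iterations:", solver.getIterationNumber(),
          "wall time:", time.perf_counter() - t0)

(For elasticity, gamg usually also benefits from attaching the rigid-body near-nullspace to the matrix, as shown in the dolfinx elasticity demo.)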

I made two installation attempts in total, both with the FEniCSx conda env activated (the same 0.7.2 version mentioned in the question).
I am leaving the details as a reply, since I thought it would be good to work the problem out with the opinions of everyone who reads this thread, in case anyone knows a solution.
(Actually, I tried to delete the question and post a new one, but I realized that was impossible.)

a) Problem encountered while installing trilinos via conda-forge
I found that trilinos and pytrilinos packages exist on conda-forge (though they are far behind the versions provided on GitHub), so I attempted to install them with

conda install -c conda-forge trilinos
conda install -c conda-forge pytrilinos

but the solver reported version conflicts between the FEniCSx packages already in the conda env and the dependencies required by trilinos and pytrilinos.

(trilinos "Solving environment" failure)

Solving environment: \ warning  libmamba Added empty dependency for problem type SOLVER_RULE_UPDATE
failed

LibMambaUnsatisfiableError: Encountered problems while solving:
  - package hdf5-1.14.3-mpi_mpich_ha2c2bf8_0 requires libgfortran5 >=12.3.0, but none of the providers can be installed

Could not solve for environment specs
The following packages are incompatible
β”œβ”€ fenics-dolfinx is installable with the potential options
β”‚  β”œβ”€ fenics-dolfinx 0.7.2 would require
β”‚  β”‚  β”œβ”€ fenics-libdolfinx [0.7.2 h962240e_103|0.7.2 he8d8b1c_3] with the potential options
β”‚  β”‚  β”‚  β”œβ”€ fenics-libdolfinx 0.7.2 would require
β”‚  β”‚  β”‚  β”‚  β”œβ”€ hdf5 >=1.14.3,<1.14.4.0a0 mpi_mpich_*, which requires
β”‚  β”‚  β”‚  β”‚  β”‚  β”œβ”€ libgfortran-ng with the potential options
β”‚  β”‚  β”‚  β”‚  β”‚  β”‚  β”œβ”€ libgfortran-ng [10.3.0|10.4.0|...|9.5.0], which can be installed;
β”‚  β”‚  β”‚  β”‚  β”‚  β”‚  └─ libgfortran-ng [7.2.0|7.3.0|7.5.0] conflicts with any installable versions previously reported;
β”‚  β”‚  β”‚  β”‚  β”‚  └─ libgfortran5 >=12.3.0  but there are no viable options
β”‚  β”‚  β”‚  β”‚  β”‚     β”œβ”€ libgfortran5 13.2.0 would require
β”‚  β”‚  β”‚  β”‚  β”‚     β”‚  └─ libgfortran-ng 13.2.0 , which conflicts with any installable versions previously reported;
β”‚  β”‚  β”‚  β”‚  β”‚     β”œβ”€ libgfortran5 12.3.0 would require
β”‚  β”‚  β”‚  β”‚  β”‚     β”‚  └─ libgfortran-ng 12.3.0 , which conflicts with any installable versions previously reported;
β”‚  β”‚  β”‚  β”‚  β”‚     └─ libgfortran5 13.1.0 would require
β”‚  β”‚  β”‚  β”‚  β”‚        └─ libgfortran-ng 13.1.0 , which conflicts with any installable versions previously reported;
β”‚  β”‚  β”‚  β”‚  └─ libboost-devel with the potential options
β”‚  β”‚  β”‚  β”‚     β”œβ”€ libboost-devel [1.82.0|1.83.0|1.84.0] would require
β”‚  β”‚  β”‚  β”‚     β”‚  └─ libboost [1.82.0 h6fcfa73_3|1.82.0 h6fcfa73_4|...|1.84.0 h6fcfa73_0], which requires
β”‚  β”‚  β”‚  β”‚     β”‚     └─ icu >=73.2,<74.0a0 , which can be installed;
β”‚  β”‚  β”‚  β”‚     └─ libboost-devel 1.82.0 would require
β”‚  β”‚  β”‚  β”‚        β”œβ”€ boost-cpp 1.82.0* , which can be installed;
β”‚  β”‚  β”‚  β”‚        └─ libboost 1.82.0 h1bacd13_2, which requires
β”‚  β”‚  β”‚  β”‚           └─ icu >=72.1,<73.0a0 , which can be installed;
β”‚  β”‚  β”‚  └─ fenics-libdolfinx 0.7.2 would require
β”‚  β”‚  β”‚     └─ libboost-devel, which can be installed (as previously explained);
β”‚  β”‚  β”œβ”€ hdf5 >=1.14.3,<1.14.4.0a0 mpi_mpich_*, which cannot be installed (as previously explained);
β”‚  β”‚  └─ mpich >=4.1.2,<5.0a0  with the potential options
β”‚  β”‚     β”œβ”€ mpich [3.3.1|3.3.2|...|4.1.2] would require
β”‚  β”‚     β”‚  └─ mpi 1.0 mpich, which can be installed;
β”‚  β”‚     └─ mpich 4.1.2 would require
β”‚  β”‚        β”œβ”€ libgfortran-ng with the potential options
β”‚  β”‚        β”‚  β”œβ”€ libgfortran-ng [10.3.0|10.4.0|...|9.5.0], which can be installed;
β”‚  β”‚        β”‚  └─ libgfortran-ng [7.2.0|7.3.0|7.5.0] conflicts with any installable versions previously reported;
β”‚  β”‚        β”œβ”€ libgfortran5 >=12.3.0 , which cannot be installed (as previously explained);
β”‚  β”‚        └─ mpi 1.0 mpich, which can be installed;
β”‚  β”œβ”€ fenics-dolfinx [0.4.1|0.5.0|...|0.7.2] would require
β”‚  β”‚  └─ python >=3.10,<3.11.0a0 , which can be installed;
β”‚  β”œβ”€ fenics-dolfinx 0.4.1 would require
β”‚  β”‚  └─ python >=3.7,<3.8.0a0 , which can be installed;
β”‚  β”œβ”€ fenics-dolfinx [0.4.1|0.5.0|...|0.7.2] would require
β”‚  β”‚  └─ python >=3.8,<3.9.0a0 , which can be installed;
β”‚  β”œβ”€ fenics-dolfinx [0.4.1|0.5.0|...|0.7.2] would require
β”‚  β”‚  └─ python >=3.9,<3.10.0a0 , which can be installed;
β”‚  β”œβ”€ fenics-dolfinx [0.7.0|0.7.1|0.7.2] would require
β”‚  β”‚  └─ python >=3.11,<3.12.0a0 , which can be installed;
β”‚  β”œβ”€ fenics-dolfinx 0.7.2 would require
β”‚  β”‚  β”œβ”€ fenics-libdolfinx 0.7.2 hcfc32b3_3, which requires
β”‚  β”‚  β”‚  β”œβ”€ libadios2 >=2.9.2,<2.9.3.0a0 mpi_openmpi_*, which requires
β”‚  β”‚  β”‚  β”‚  β”œβ”€ libgfortran-ng with the potential options
β”‚  β”‚  β”‚  β”‚  β”‚  β”œβ”€ libgfortran-ng [10.3.0|10.4.0|...|9.5.0], which can be installed;
β”‚  β”‚  β”‚  β”‚  β”‚  └─ libgfortran-ng [7.2.0|7.3.0|7.5.0] conflicts with any installable versions previously reported;
β”‚  β”‚  β”‚  β”‚  β”œβ”€ libgfortran5 >=12.3.0 , which cannot be installed (as previously explained);
β”‚  β”‚  β”‚  β”‚  └─ openmpi >=4.1.6,<5.0a0 , which requires
β”‚  β”‚  β”‚  β”‚     └─ mpi 1.0 openmpi, which conflicts with any installable versions previously reported;
β”‚  β”‚  β”‚  β”œβ”€ libboost-devel, which can be installed (as previously explained);
β”‚  β”‚  β”‚  └─ slepc >=3.20.1,<3.21.0a0 complex_* with the potential options
β”‚  β”‚  β”‚     β”œβ”€ slepc 3.20.1 would require
β”‚  β”‚  β”‚     β”‚  └─ libgfortran-ng with the potential options
β”‚  β”‚  β”‚     β”‚     β”œβ”€ libgfortran-ng [10.3.0|10.4.0|...|9.5.0], which can be installed;
β”‚  β”‚  β”‚     β”‚     └─ libgfortran-ng [7.2.0|7.3.0|7.5.0] conflicts with any installable versions previously reported;
β”‚  β”‚  β”‚     └─ slepc 3.20.1 would require
β”‚  β”‚  β”‚        └─ mpich >=4.1.2,<5.0a0  with the potential options
β”‚  β”‚  β”‚           β”œβ”€ mpich [3.3.1|3.3.2|...|4.1.2], which can be installed (as previously explained);
β”‚  β”‚  β”‚           └─ mpich 4.1.2, which cannot be installed (as previously explained);
β”‚  β”‚  └─ hdf5 >=1.14.3,<1.14.4.0a0 mpi_openmpi_*, which requires
β”‚  β”‚     β”œβ”€ libgfortran-ng with the potential options
β”‚  β”‚     β”‚  β”œβ”€ libgfortran-ng [10.3.0|10.4.0|...|9.5.0], which can be installed;
β”‚  β”‚     β”‚  └─ libgfortran-ng [7.2.0|7.3.0|7.5.0] conflicts with any installable versions previously reported;
β”‚  β”‚     └─ libgfortran5 >=12.3.0 , which cannot be installed (as previously explained);
β”‚  └─ fenics-dolfinx 0.7.2 would require
β”‚     β”œβ”€ fenics-libdolfinx 0.7.2 h1829583_103, which requires
β”‚     β”‚  β”œβ”€ libadios2 >=2.9.2,<2.9.3.0a0 mpi_openmpi_*, which cannot be installed (as previously explained);
β”‚     β”‚  β”œβ”€ libboost-devel, which can be installed (as previously explained);
β”‚     β”‚  └─ slepc >=3.20.1,<3.21.0a0 real_* with the potential options
β”‚     β”‚     β”œβ”€ slepc 3.20.1, which can be installed (as previously explained);
β”‚     β”‚     └─ slepc 3.20.1 would require
β”‚     β”‚        β”œβ”€ libgfortran-ng with the potential options
β”‚     β”‚        β”‚  β”œβ”€ libgfortran-ng [10.3.0|10.4.0|...|9.5.0], which can be installed;
β”‚     β”‚        β”‚  └─ libgfortran-ng [7.2.0|7.3.0|7.5.0] conflicts with any installable versions previously reported;
β”‚     β”‚        └─ petsc >=3.20.1,<3.21.0a0 real_* with the potential options
β”‚     β”‚           β”œβ”€ petsc [3.19.6|3.20.0|3.20.1|3.20.2|3.20.3] would require
β”‚     β”‚           β”‚  └─ hdf5 * mpi_mpich_* with the potential options
β”‚     β”‚           β”‚     β”œβ”€ hdf5 1.14.3, which cannot be installed (as previously explained);
β”‚     β”‚           β”‚     β”œβ”€ hdf5 1.10.6, which can be installed;
β”‚     β”‚           β”‚     β”œβ”€ hdf5 1.12.1, which can be installed;
β”‚     β”‚           β”‚     β”œβ”€ hdf5 1.12.2, which can be installed;
β”‚     β”‚           β”‚     β”œβ”€ hdf5 1.14.0, which can be installed;
β”‚     β”‚           β”‚     β”œβ”€ hdf5 1.14.1, which can be installed;
β”‚     β”‚           β”‚     └─ hdf5 [1.10.4|1.10.5|1.12.0|1.14.2], which can be installed;
β”‚     β”‚           └─ petsc [3.19.6|3.20.0|3.20.1|3.20.2|3.20.3] would require
β”‚     β”‚              └─ fftw [* mpi_openmpi_*|>=3.3.10,<4.0a0 ] with the potential options
β”‚     β”‚                 β”œβ”€ fftw 3.3.10, which can be installed;
β”‚     β”‚                 β”œβ”€ fftw 3.3.10 would require
β”‚     β”‚                 β”‚  └─ libgfortran-ng with the potential options
β”‚     β”‚                 β”‚     β”œβ”€ libgfortran-ng [10.3.0|10.4.0|...|9.5.0], which can be installed;
β”‚     β”‚                 β”‚     └─ libgfortran-ng [7.2.0|7.3.0|7.5.0] conflicts with any installable versions previously reported;
β”‚     β”‚                 └─ fftw [3.3.8|3.3.9] conflicts with any installable versions previously reported;
β”‚     β”œβ”€ hdf5 >=1.14.3,<1.14.4.0a0 mpi_openmpi_*, which cannot be installed (as previously explained);
β”‚     └─ petsc [* real_*|>=3.20.2,<3.21.0a0 ] with the potential options
β”‚        β”œβ”€ petsc [3.19.6|3.20.0|3.20.1|3.20.2|3.20.3], which can be installed (as previously explained);
β”‚        β”œβ”€ petsc [3.15.0|3.15.1|3.15.2|3.15.3] would require
β”‚        β”‚  └─ hdf5 >=1.10.6,<1.10.7.0a0 , which can be installed;
β”‚        β”œβ”€ petsc [3.15.0|3.15.1|...|3.17.1] would require
β”‚        β”‚  └─ hdf5 >=1.10.6,<1.10.7.0a0 mpi_openmpi_*, which can be installed;
β”‚        β”œβ”€ petsc [3.15.0|3.15.1|...|3.17.1] would require
β”‚        β”‚  └─ hdf5 >=1.10.6,<1.10.7.0a0 mpi_mpich_*, which can be installed;
β”‚        β”œβ”€ petsc [3.15.3|3.15.4|...|3.17.1] would require
β”‚        β”‚  └─ hdf5 >=1.12.1,<1.12.2.0a0 , which can be installed;
β”‚        β”œβ”€ petsc [3.17.1|3.17.2|3.17.3] would require
β”‚        β”‚  └─ hdf5 >=1.12.1,<1.12.2.0a0 mpi_mpich_*, which can be installed;
β”‚        β”œβ”€ petsc [3.17.1|3.17.2|3.17.3] would require
β”‚        β”‚  └─ hdf5 >=1.12.1,<1.12.2.0a0 mpi_openmpi_*, which can be installed;
β”‚        β”œβ”€ petsc [3.17.3|3.17.4|...|3.18.4] would require
β”‚        β”‚  └─ hdf5 >=1.12.2,<1.12.3.0a0 mpi_openmpi_*, which can be installed;
β”‚        β”œβ”€ petsc [3.17.3|3.17.4|...|3.18.4] would require
β”‚        β”‚  └─ hdf5 >=1.12.2,<1.12.3.0a0 mpi_mpich_*, which can be installed;
β”‚        β”œβ”€ petsc [3.18.4|3.18.5|3.19.0|3.19.1|3.19.2] would require
β”‚        β”‚  └─ hdf5 >=1.14.0,<1.14.1.0a0 mpi_mpich_*, which can be installed;
β”‚        β”œβ”€ petsc [3.18.4|3.18.5|3.19.0|3.19.1|3.19.2] would require
β”‚        β”‚  └─ hdf5 >=1.14.0,<1.14.1.0a0 mpi_openmpi_*, which can be installed;
β”‚        β”œβ”€ petsc [3.19.2|3.19.3|3.19.4|3.19.5] would require
β”‚        β”‚  └─ hdf5 >=1.14.1,<1.14.2.0a0 mpi_mpich_*, which can be installed;
β”‚        β”œβ”€ petsc [3.19.2|3.19.3|3.19.4|3.19.5] would require
β”‚        β”‚  └─ hdf5 >=1.14.1,<1.14.2.0a0 mpi_openmpi_*, which can be installed;
β”‚        β”œβ”€ petsc [3.19.6|3.20.0|3.20.1|3.20.2|3.20.3], which can be installed (as previously explained);
β”‚        └─ petsc [3.20.2|3.20.3] would require
β”‚           └─ fftw * mpi_openmpi_* but there are no viable options
β”‚              β”œβ”€ fftw 3.3.10, which cannot be installed (as previously explained);
β”‚              └─ fftw [3.3.8|3.3.9] conflicts with any installable versions previously reported;
β”œβ”€ mpich is installable with the potential options
β”‚  β”œβ”€ mpich [3.3.1|3.3.2|...|4.1.2], which can be installed (as previously explained);
β”‚  β”œβ”€ mpich 4.1.2, which cannot be installed (as previously explained);
β”‚  β”œβ”€ mpich [3.2.1|3.3.1|3.3.2] would require
β”‚  β”‚  └─ libgfortran-ng >=7,<8.0a0 , which conflicts with any installable versions previously reported;
β”‚  β”œβ”€ mpich 3.2.1 would require
β”‚  β”‚  └─ mpi 1.0 mpich, which can be installed;
β”‚  β”œβ”€ mpich 3.3.2 would require
β”‚  β”‚  └─ libgfortran4 >=7.5.0  but there are no viable options
β”‚  β”‚     β”œβ”€ libgfortran4 7.5.0 would require
β”‚  β”‚     β”‚  └─ libgfortran-ng 7.5.0 *_18, which conflicts with any installable versions previously reported;
β”‚  β”‚     β”œβ”€ libgfortran4 7.5.0 would require
β”‚  β”‚     β”‚  └─ libgfortran-ng 7.5.0 *_19, which conflicts with any installable versions previously reported;
β”‚  β”‚     β”œβ”€ libgfortran4 7.5.0 would require
β”‚  β”‚     β”‚  └─ libgfortran-ng 7.5.0 *_20, which conflicts with any installable versions previously reported;
β”‚  β”‚     └─ libgfortran4 7.5.0 would require
β”‚  β”‚        └─ libgfortran-ng 7.5.0 *_17, which conflicts with any installable versions previously reported;
β”‚  └─ mpich [3.4|3.4.1|...|4.1.1] would require
β”‚     β”œβ”€ libgfortran-ng with the potential options
β”‚     β”‚  β”œβ”€ libgfortran-ng [10.3.0|10.4.0|...|9.5.0], which can be installed;
β”‚     β”‚  └─ libgfortran-ng [7.2.0|7.3.0|7.5.0] conflicts with any installable versions previously reported;
β”‚     └─ mpi 1.0 mpich, which can be installed;
β”œβ”€ pin-1 is not installable because it requires
β”‚  └─ python 3.12.* , which conflicts with any installable versions previously reported;
└─ trilinos is installable with the potential options
   β”œβ”€ trilinos [12.10.1|12.12.1] would require
   β”‚  └─ openmpi but there are no viable options
   β”‚     β”œβ”€ openmpi [3.1.0|3.1.2|...|5.0.1], which cannot be installed (as previously explained);
   β”‚     β”œβ”€ openmpi [3.1.2|3.1.3|...|4.0.5] would require
   β”‚     β”‚  └─ libgfortran-ng >=7,<8.0a0 , which conflicts with any installable versions previously reported;
   β”‚     └─ openmpi [4.0.5|4.1.3] would require
   β”‚        └─ libgfortran4 >=7.5.0 , which cannot be installed (as previously explained);
   β”œβ”€ trilinos [12.10.1|12.12.1] would require
   β”‚  β”œβ”€ boost-cpp >=1.67.0,<1.67.1.0a0  but there are no viable options
   β”‚  β”‚  β”œβ”€ boost-cpp [1.67.0|1.68.0|1.70.0] would require
   β”‚  β”‚  β”‚  β”œβ”€ icu >=58.2,<59.0a0 , which conflicts with any installable versions previously reported;
   β”‚  β”‚  β”‚  └─ libboost 1.67.0 h46d08c1_4, which requires
   β”‚  β”‚  β”‚     └─ icu >=58.2,<59.0a0 , which conflicts with any installable versions previously reported;
   β”‚  β”‚  └─ boost-cpp 1.67.0 would require
   β”‚  β”‚     └─ icu 58.* , which conflicts with any installable versions previously reported;
   β”‚  └─ mpich >=3.2,<3.3.0a0 , which cannot be installed (as previously explained);
   β”œβ”€ trilinos [12.10.1|12.12.1] would require
   β”‚  └─ boost-cpp >=1.67.0,<1.67.1.0a0 , which cannot be installed (as previously explained);
   β”œβ”€ trilinos 12.12.1 would require
   β”‚  β”œβ”€ boost-cpp >=1.68.0,<1.68.1.0a0 , which cannot be installed (as previously explained);
   β”‚  └─ mpich >=3.2,<3.3.0a0 , which cannot be installed (as previously explained);
   β”œβ”€ trilinos 12.12.1 would require
   β”‚  β”œβ”€ boost-cpp >=1.70.0,<1.70.1.0a0  but there are no viable options
   β”‚  β”‚  β”œβ”€ boost-cpp 1.70.0 would require
   β”‚  β”‚  β”‚  └─ icu >=67.1,<68.0a0 , which conflicts with any installable versions previously reported;
   β”‚  β”‚  β”œβ”€ boost-cpp 1.70.0 would require
   β”‚  β”‚  β”‚  └─ icu >=64.2,<65.0a0 , which conflicts with any installable versions previously reported;
   β”‚  β”‚  └─ boost-cpp [1.67.0|1.68.0|1.70.0], which cannot be installed (as previously explained);
   β”‚  └─ mpich >=3.2.1,<3.3.0a0 , which cannot be installed (as previously explained);
   β”œβ”€ trilinos 12.12.1 would require
   β”‚  └─ boost-cpp >=1.68.0,<1.68.1.0a0 , which cannot be installed (as previously explained);
   β”œβ”€ trilinos [12.12.1|12.18.1] would require
   β”‚  └─ boost-cpp >=1.70.0,<1.70.1.0a0 , which cannot be installed (as previously explained);
   β”œβ”€ trilinos 12.12.1 would require
   β”‚  β”œβ”€ boost-cpp >=1.68.0,<1.68.1.0a0 , which cannot be installed (as previously explained);
   β”‚  └─ openmpi >=3.1.3,<3.2.0a0 , which cannot be installed (as previously explained);
   └─ trilinos 12.18.1 would require
      └─ scikit-umfpack with the potential options
         β”œβ”€ scikit-umfpack [0.2.1|0.2.3|0.3.1|0.3.2] would require
         β”‚  └─ python [2.7* |>=2.7,<2.8.0a0 ], which can be installed;
         β”œβ”€ scikit-umfpack [0.2.1|0.2.3] would require
         β”‚  └─ python 3.4* , which can be installed;
         β”œβ”€ scikit-umfpack [0.2.1|0.2.3] would require
         β”‚  └─ python 3.5* , which can be installed;
         β”œβ”€ scikit-umfpack [0.2.3|0.3.1] would require
         β”‚  └─ python 3.6* , which can be installed;
         β”œβ”€ scikit-umfpack [0.3.2|0.3.3] would require
         β”‚  └─ python >=3.10,<3.11.0a0 , which can be installed;
         β”œβ”€ scikit-umfpack 0.3.2 would require
         β”‚  └─ python >=3.6,<3.7.0a0 , which can be installed;
         β”œβ”€ scikit-umfpack [0.3.2|0.3.3] would require
         β”‚  └─ python >=3.7,<3.8.0a0 , which can be installed;
         β”œβ”€ scikit-umfpack [0.3.2|0.3.3] would require
         β”‚  └─ python >=3.8,<3.9.0a0 , which can be installed;
         β”œβ”€ scikit-umfpack [0.3.2|0.3.3] would require
         β”‚  └─ python >=3.9,<3.10.0a0 , which can be installed;
         └─ scikit-umfpack 0.3.3 would require
            └─ python >=3.11,<3.12.0a0 , which can be installed.

Pins seem to be involved in the conflict. Currently pinned specs:
 - python 3.12.* (labeled as 'pin-1')

(pytrilinos "Solving environment" failure)

Solving environment: - warning  libmamba Added empty dependency for problem type SOLVER_RULE_UPDATE
failed

LibMambaUnsatisfiableError: Encountered problems while solving:
  - package pytrilinos-12.10.1-py27h7127e17_2 requires python >=2.7,<2.8.0a0, but none of the providers can be installed

Could not solve for environment specs
The following packages are incompatible
β”œβ”€ pin-1 is installable and it requires
β”‚  └─ python 3.12.* , which can be installed;
└─ pytrilinos is not installable because there are no viable options
   β”œβ”€ pytrilinos [12.10.1|12.18.1] would require
   β”‚  └─ python >=2.7,<2.8.0a0 , which conflicts with any installable versions previously reported;
   β”œβ”€ pytrilinos 12.18.1 would require
   β”‚  └─ python >=3.6,<3.7.0a0 , which conflicts with any installable versions previously reported;
   └─ pytrilinos 12.18.1 would require
      └─ python >=3.7,<3.8.0a0 , which conflicts with any installable versions previously reported.

Pins seem to be involved in the conflict. Currently pinned specs:
 - python 3.12.* (labeled as 'pin-1')

(The trilinos and pytrilinos packages on conda have not been updated since those versions were uploaded; I think that is why the solve fails...)

b) Problem encountered while building trilinos from source
Since the previous method failed, I attempted to build Trilinos from the source files.
(I wrote the shell script below for convenience.)

#! /bin/bash

if [ ! -d build ]; then
    mkdir build
fi
if [ ! -d install ]; then
    mkdir install
fi
cd build/

CONDA_DIR=/home/miniconda3/envs/dolfinx-0.7.2/
SRC_DIR=/home/trilinos-15.0.0/

BUILD_DIR=$SRC_DIR/build/
INSTALL_DIR=$SRC_DIR/install/

rm -rf CMakeCache.txt CMakeFiles

cmake \
    -D CMAKE_INSTALL_PREFIX:PATH=$INSTALL_DIR \
    -D CMAKE_C_COMPILER=$CONDA_DIR/bin/mpicc \
    -D CMAKE_CXX_COMPILER=$CONDA_DIR/bin/mpic++ \
    -D TPL_ENABLE_DLlib:BOOL=OFF \
    -D TPL_ENABLE_MPI:BOOL=ON \
    -D MPI_BASE_DIR:PATH=$CONDA_DIR/lib/python3.12/site-packages/mpi4py \
    -D MPI_EXEC:FILEPATH=$CONDA_DIR/bin/mpiexec \
    -D PYTHON_EXECUTABLE:FILEPATH=$CONDA_DIR/bin/python3 \
    -D Trilinos_ENABLE_Fortran:BOOL=OFF \
    -D Trilinos_ENABLE_TESTS:BOOL=OFF \
    -D Trilinos_ENABLE_EXAMPLES:BOOL=OFF \
    -D Trilinos_ENABLE_ALL_PACKAGES:BOOL=OFF \
    -D Trilinos_ENABLE_ALL_OPTIONAL_PACKAGES:BOOL=OFF \
    -D Trilinos_ENABLE_Epetra:BOOL=ON \
    -D Trilinos_ENABLE_EpetraExt:BOOL=ON \
    -D Trilinos_ENABLE_Triutils:BOOL=ON \
    -D Trilinos_ENABLE_Teuchos:BOOL=ON \
    -D Trilinos_ENABLE_Tpetra:BOOL=ON \
    -D Trilinos_ENABLE_Domi:BOOL=ON \
    -D Trilinos_ENABLE_Isorropia:BOOL=OFF \
    -D Trilinos_ENABLE_Pliris:BOOL=OFF \
    -D Trilinos_ENABLE_AztecOO:BOOL=ON \
    -D Trilinos_ENABLE_Galeri:BOOL=ON \
    -D Trilinos_ENABLE_Amesos:BOOL=ON \
    -D Trilinos_ENABLE_Ifpack:BOOL=ON \
    -D Trilinos_ENABLE_Komplex:BOOL=ON \
    -D Trilinos_ENABLE_ML:BOOL=ON \
    -D Trilinos_ENABLE_Anasazi:BOOL=ON \
    -D Trilinos_ENABLE_NOX:BOOL=OFF \
    -D Trilinos_ENABLE_PyTrilinos:BOOL=ON \
    -D PyTrilinos_ENABLE_Tpetra:BOOL=OFF \
    -D PyTrilinos_ENABLE_TESTS:BOOL=OFF \
    -D PyTrilinos_ENABLE_EXAMPLES:BOOL=OFF \
    -D BUILD_SHARED_LIBS:BOOL=ON \
    $SRC_DIR
make -j $CPU_COUNT
$PYTHON packages/PyTrilinos/util/configFix.py
ctest --output-on-failure
make install

However, while building the Teuchos package inside Trilinos (I don't know exactly what role the Teuchos library plays), an error occurred involving the "mpi.h" header file expected inside the FEniCSx conda env.

[ 96%] Generating Teuchos.RCP.i
Traceback (most recent call last):
  File "/home/trilinos-15.0.0/build/packages/PyTrilinos/src/gen_teuchos_rcp.py", line 659, in <module>
    main()
  File "/home/trilinos-15.0.0/build/packages/PyTrilinos/src/gen_teuchos_rcp.py", line 557, in main
    print("#define MPI_VERSION %s" % get_mpi_version())
                                     ^^^^^^^^^^^^^^^^^
  File "/home/trilinos-15.0.0/build/packages/PyTrilinos/src/gen_teuchos_rcp.py", line 58, in get_mpi_version
    for line in open(header, 'r').readlines():
                ^^^^^^^^^^^^^^^^^
FileNotFoundError: [Errno 2] No such file or directory: '/home/miniconda3/envs/dolfinx-0.7.2/lib/python3.12/site-packages/mpi4py/include/mpi.h'
make[2]: *** [packages/PyTrilinos/src/CMakeFiles/PyTrilinos_Teuchos_RCP.dir/build.make:81: packages/PyTrilinos/src/Teuchos.RCP.i] Error 1
make[2]: *** Deleting file 'packages/PyTrilinos/src/Teuchos.RCP.i'
make[1]: *** [CMakeFiles/Makefile2:4231: packages/PyTrilinos/src/CMakeFiles/PyTrilinos_Teuchos_RCP.dir/all] Error 2
make: *** [Makefile:166: all] Error 2

I looked into the error but could not find a solution.

After hitting this wall, I thought of a new approach beyond methods a) & b): delete the PETSc (petsc4py) installed in the FEniCSx conda env, build PETSc (petsc4py) from source with the desired external packages enabled, and, if the FEniCSx conda env can be made to recognize the path of the source-built PETSc, run FEniCSx on top of it.
(I haven't tried this yet; this is also what the last paragraph of my original post refers to.)
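
If I do try it, I suppose the first sanity check would be to confirm which petsc4py/PETSc the conda env actually resolves, along the lines of:

# Sketch of a sanity check: which petsc4py / PETSc does this env pick up?
import petsc4py
print(petsc4py.__file__)       # the petsc4py module actually imported
print(petsc4py.get_config())   # the PETSC_DIR / PETSC_ARCH it was built against
from petsc4py import PETSc
print(PETSc.Sys.getVersion())  # version of the linked PETSc library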

If anyone reading this reply has implemented the idea in the last paragraph, or has a clue about methods a) & b) that I tried above, please leave a reply so we can think it through together.
("The direction you approached from is wrong.", "How about trying this for method a) or b)?", "You can do it this way.", etc. Any replies are welcome. :raised_hands:)

Thank you for reading! :pray:

On my machine, there is also no "mpi.h" in the mpi4py/include directory:

ls /mnt/d/software_install/fenics/lib/python3.12/site-packages/mpi4py/include/mpi4py/
mpi.pxi  mpi4py.MPI.h  mpi4py.MPI_api.h  mpi4py.h  mpi4py.i

So maybe the CMakeLists.txt of Trilinos is outdated. I also find it very difficult to install Trilinos :joy:, so I gave up.
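
You can also see this from mpi4py itself; its get_include() only points at mpi4py's own headers, while the real mpi.h lives under the environment prefix (a small sketch):

# Small check: mpi4py ships only its own headers; the real mpi.h comes
# from the MPI package installed under the environment prefix.
import os
import sys
import mpi4py

print(mpi4py.get_include())  # .../site-packages/mpi4py/include (no mpi.h here)
mpi_h = os.path.join(sys.prefix, "include", "mpi.h")
print(mpi_h, "exists:", os.path.exists(mpi_h))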

Thank you for your comment, @Smith_Jack! :+1:
I see you have had the same experience as me!

Regarding the directory shown in your comment, did you install FEniCSx using apt (or apt-get)?
Would the same problem occur if I installed it that way?

No, I installed fenics with conda, because it's the simplest way. By the way, I think this line is not correct:

-D MPI_BASE_DIR:PATH=$CONDA_DIR/lib/python3.12/site-packages/mpi4py

It should be the directory where MPI is installed in your conda environment, which I think is just $CONDA_DIR: "$CONDA_DIR/include" contains mpi.h, and "$CONDA_DIR/lib" contains the *.so files.

 ls /mnt/d/software_install/fenics/lib/ | grep mpi
libadios2_c_mpi.so
libadios2_c_mpi.so.2
libadios2_c_mpi.so.2.9.2
libadios2_core_mpi.so
libadios2_core_mpi.so.2
libadios2_core_mpi.so.2.9.2
libadios2_cxx11_mpi.so
libadios2_cxx11_mpi.so.2
libadios2_cxx11_mpi.so.2.9.2
libadios2_fortran_mpi.so
libadios2_fortran_mpi.so.2
libadios2_fortran_mpi.so.2.9.2
libfftw3_mpi.so
libfftw3_mpi.so.3
(some other files)
ls /mnt/d/software_install/fenics/include | grep mpi
H5FDmpi.h
H5FDmpio.h
fftw3-mpi.f03
fftw3-mpi.h
fftw3l-mpi.f03
hcompi.h
mpi.h
mpi.mod
mpi_base.mod
(some other files)

Thank you for your helpful comments, @Smith_Jack!

When I modified the script with the suggested change,

#! /bin/bash

if [ ! -d build ]; then
    mkdir build
fi
if [ ! -d install ]; then
    mkdir install
fi
cd build/

CONDA_DIR=/home/miniconda3/envs/dolfinx-0.7.2/
SRC_DIR=/home/trilinos-15.0.0/

BUILD_DIR=$SRC_DIR/build/
INSTALL_DIR=$SRC_DIR/install/

rm -rf CMakeCache.txt CMakeFiles

cmake \
    -D CMAKE_INSTALL_PREFIX:PATH=$INSTALL_DIR \
    -D CMAKE_C_COMPILER=$CONDA_DIR/bin/mpicc \
    -D CMAKE_CXX_COMPILER=$CONDA_DIR/bin/mpic++ \
    -D TPL_ENABLE_DLlib:BOOL=OFF \
    -D TPL_ENABLE_MPI:BOOL=ON \
    -D MPI_BASE_DIR:PATH=$CONDA_DIR \
    -D MPI_EXEC:FILEPATH=$CONDA_DIR/bin/mpiexec \
    -D PYTHON_EXECUTABLE:FILEPATH=$CONDA_DIR/bin/python3 \
    -D Trilinos_ENABLE_Fortran:BOOL=OFF \
    -D Trilinos_ENABLE_TESTS:BOOL=OFF \
    -D Trilinos_ENABLE_EXAMPLES:BOOL=OFF \
    -D Trilinos_ENABLE_ALL_PACKAGES:BOOL=OFF \
    -D Trilinos_ENABLE_ALL_OPTIONAL_PACKAGES:BOOL=OFF \
    -D Trilinos_ENABLE_Epetra:BOOL=ON \
    -D Trilinos_ENABLE_EpetraExt:BOOL=ON \
    -D Trilinos_ENABLE_Triutils:BOOL=ON \
    -D Trilinos_ENABLE_Teuchos:BOOL=ON \
    -D Trilinos_ENABLE_Tpetra:BOOL=ON \
    -D Trilinos_ENABLE_Domi:BOOL=ON \
    -D Trilinos_ENABLE_Isorropia:BOOL=OFF \
    -D Trilinos_ENABLE_Pliris:BOOL=OFF \
    -D Trilinos_ENABLE_AztecOO:BOOL=ON \
    -D Trilinos_ENABLE_Galeri:BOOL=ON \
    -D Trilinos_ENABLE_Amesos:BOOL=ON \
    -D Trilinos_ENABLE_Ifpack:BOOL=ON \
    -D Trilinos_ENABLE_Komplex:BOOL=ON \
    -D Trilinos_ENABLE_ML:BOOL=ON \
    -D Trilinos_ENABLE_Anasazi:BOOL=ON \
    -D Trilinos_ENABLE_NOX:BOOL=OFF \
    -D Trilinos_ENABLE_PyTrilinos:BOOL=ON \
    -D PyTrilinos_ENABLE_Tpetra:BOOL=OFF \
    -D PyTrilinos_ENABLE_TESTS:BOOL=OFF \
    -D PyTrilinos_ENABLE_EXAMPLES:BOOL=OFF \
    -D BUILD_SHARED_LIBS:BOOL=ON \
    $SRC_DIR
make -j $CPU_COUNT
$PYTHON packages/PyTrilinos/util/configFix.py
ctest --output-on-failure
make install

the following error occurs.
(The error mentioned above no longer occurs...! :raised_hands: But more errors appear. :joy:)
The error follows:

[ 96%] Swig source /home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:64: Error: Unable to find 'Teuchos_DLLExportMacro.h'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:65: Error: Unable to find 'Epetra_DLLExportMacro.h'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:66: Error: Unable to find 'Anasaziepetra_DLLExportMacro.h'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:81: Error: Unable to find 'Teuchos_LabeledObject.hpp'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:82: Error: Unable to find 'Teuchos_Describable.hpp'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:83: Error: Unable to find 'Epetra_Object.h'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:84: Error: Unable to find 'Epetra_CompObject.h'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:85: Error: Unable to find 'Epetra_SrcDistObject.h'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:86: Error: Unable to find 'Epetra_DistObject.h'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:87: Error: Unable to find 'Epetra_BlockMap.h'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:88: Error: Unable to find 'Epetra_Map.h'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:89: Error: Unable to find 'Epetra_BLAS.h'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:90: Error: Unable to find 'Epetra_MultiVector.h'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:91: Error: Unable to find 'Epetra_SerialDenseOperator.h'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:92: Error: Unable to find 'Epetra_Operator.h'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:93: Error: Unable to find 'Epetra_RowMatrix.h'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:94: Error: Unable to find 'Epetra_VbrMatrix.h'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:95: Error: Unable to find 'Ifpack_Preconditioner.h'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:96: Error: Unable to find 'AnasaziTypes.hpp'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:97: Error: Unable to find 'AnasaziOutputManager.hpp'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:98: Error: Unable to find 'AnasaziSortManager.hpp'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:99: Error: Unable to find 'AnasaziEigenproblem.hpp'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:100: Error: Unable to find 'AnasaziOrthoManager.hpp'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:101: Error: Unable to find 'AnasaziBasicEigenproblem.hpp'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:102: Error: Unable to find 'AnasaziBasicOrthoManager.hpp'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:103: Error: Unable to find 'AnasaziBasicOutputManager.hpp'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:104: Error: Unable to find 'AnasaziBasicSort.hpp'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:105: Error: Unable to find 'AnasaziMatOrthoManager.hpp'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:106: Error: Unable to find 'AnasaziMultiVec.hpp'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:108: Error: Unable to find 'AnasaziOperator.hpp'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:110: Error: Unable to find 'AnasaziSVQBOrthoManager.hpp'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:111: Error: Unable to find 'AnasaziStatusTest.hpp'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:112: Error: Unable to find 'AnasaziStatusTestCombo.hpp'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:113: Error: Unable to find 'AnasaziStatusTestMaxIters.hpp'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:114: Error: Unable to find 'AnasaziStatusTestOutput.hpp'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:115: Error: Unable to find 'AnasaziStatusTestResNorm.hpp'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:116: Error: Unable to find 'AnasaziEpetraAdapter.hpp'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:117: Error: Unable to find 'EpetraExt_ModelEvaluator.h'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:118: Error: Unable to find 'Epetra_BasicRowMatrix.h'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:119: Error: Unable to find 'Epetra_Comm.h'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:120: Error: Unable to find 'Epetra_CrsGraph.h'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:121: Error: Unable to find 'Epetra_CrsMatrix.h'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:122: Error: Unable to find 'Epetra_Distributor.h'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:123: Error: Unable to find 'Epetra_Export.h'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:124: Error: Unable to find 'Epetra_FECrsGraph.h'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:125: Error: Unable to find 'Epetra_FECrsMatrix.h'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:126: Error: Unable to find 'Epetra_FEVbrMatrix.h'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:127: Error: Unable to find 'Epetra_FEVector.h'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:128: Error: Unable to find 'Epetra_Import.h'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:129: Error: Unable to find 'Epetra_IntSerialDenseMatrix.h'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:130: Error: Unable to find 'Epetra_IntSerialDenseVector.h'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:131: Error: Unable to find 'Epetra_IntVector.h'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:132: Error: Unable to find 'Epetra_InvOperator.h'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:133: Error: Unable to find 'Epetra_JadMatrix.h'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:134: Error: Unable to find 'Epetra_LAPACK.h'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:135: Error: Unable to find 'Epetra_LinearProblem.h'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:136: Error: Unable to find 'Epetra_LocalMap.h'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:137: Error: Unable to find 'Epetra_MapColoring.h'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:139: Error: Unable to find 'Epetra_MpiComm.h'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:142: Error: Unable to find 'Epetra_MpiDistributor.h'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:144: Error: Unable to find 'Epetra_OffsetIndex.h'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:145: Error: Unable to find 'Epetra_SerialComm.h'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:146: Error: Unable to find 'Epetra_SerialDenseMatrix.h'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:147: Error: Unable to find 'Epetra_SerialDenseSVD.h'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:148: Error: Unable to find 'Epetra_SerialDenseSolver.h'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:149: Error: Unable to find 'Epetra_SerialDenseVector.h'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:150: Error: Unable to find 'Epetra_SerialDistributor.h'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:151: Error: Unable to find 'Epetra_SerialSymDenseMatrix.h'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:152: Error: Unable to find 'Epetra_Time.h'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:153: Error: Unable to find 'Epetra_Vector.h'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:154: Error: Unable to find 'Ifpack_Amesos.h'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:155: Error: Unable to find 'Ifpack_IC.h'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:156: Error: Unable to find 'Ifpack_ICT.h'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:157: Error: Unable to find 'Ifpack_ILU.h'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:158: Error: Unable to find 'Ifpack_ILUT.h'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:159: Error: Unable to find 'Ifpack_PointRelaxation.h'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:160: Error: Unable to find 'MLAPI_MultiVector.h'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:161: Error: Unable to find 'MLAPI_EpetraBaseOperator.h'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:162: Error: Unable to find 'MLAPI_Operator_Box.h'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:163: Error: Unable to find 'ml_MultiLevelPreconditioner.h'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:165: Error: Unable to find 'Teuchos_DefaultComm.hpp'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:167: Error: Unable to find 'Teuchos_DefaultMpiComm.hpp'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:169: Error: Unable to find 'Teuchos_ParameterList.hpp'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:170: Error: Unable to find 'Teuchos_DefaultSerialComm.hpp'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:171: Error: Unable to find 'Teuchos_SerialDenseMatrix.hpp'
/home/trilinos-15.0.0/build/packages/PyTrilinos/src/Teuchos.RCP.i:172: Error: Unable to find 'Teuchos_Time.hpp'
make[2]: *** [packages/PyTrilinos/src/CMakeFiles/PyTrilinos_Teuchos_RCP.dir/build.make:74: packages/PyTrilinos/src/Teuchos.RCPPYTHON_wrap.cpp] Error 1
make[1]: *** [CMakeFiles/Makefile2:4231: packages/PyTrilinos/src/CMakeFiles/PyTrilinos_Teuchos_RCP.dir/all] Error 2
make: *** [Makefile:166: all] Error 2

I think in this case it cannot find the other header files. :smiling_face_with_tear:

Maybe you can try a simpler cmake command?

export C_INCLUDE_PATH=/mnt/d/software_install/fem/include
export CPLUS_INCLUDE_PATH=/mnt/d/software_install/fem/include
export LD_LIBRARY_PATH=/mnt/d/software_install/fem/lib

cmake \
-DTPL_ENABLE_MPI=ON \
-DCMAKE_INSTALL_PREFIX=/mnt/d/software_install/Trilinos-15.0.0/ \
-DBUILD_SHARED_LIBS=ON \
-DTrilinos_ENABLE_ALL_PACKAGES=ON \
-DTrilinos_ENABLE_TESTS=OFF \
-DTPL_ENABLE_Boost=OFF  \
-DTPL_ENABLE_X11=OFF  ..

Remember to install "netcdf" and "matio"; make will fail if these two libs are not installed.

I haven't run "make install". Sorry, I don't want to go through this horrible compilation process again :joy:. I just ran

python3 build/packages/PyTrilinos/src/gen_teuchos_rcp.py

but it didn't generate the "Teuchos.RCP.i" file; I think some earlier compilation step is required. However, I checked the files under build/packages/PyTrilinos/src/CMakeFiles, and the paths seem to be correct.

In fact, even if the compilation completes successfully, Trilinos may not work. In a previous attempt, my build linked against the wrong libc.so; when I ran the program, a segmentation fault occurred. (I was trying to compile deal.II, which needs Trilinos.) After that problem was solved, the library linked against some wrong MPI lib, so a segmentation fault occurred again. Even if none of these happen, I think you also need to compile PETSc, which will be another hard piece of work. So good luck :joy: