To start, I’m using FEniCS 2019.1.0. Suppose I’m working with a differential equation \partial_t u + Lu = f, where u: [0,T] \times [0,1]^2 \to \mathbb{R} and the analytical solution (I’ll call it u_e) is given by an integral over time. Suppose also that I have created a mesh and a function space V_h for the domain [0,1]^2, and that I can compute values of u_e at the nodal points of the mesh using some numerical integration technique. If I want to compute the (say, L_2) error between my finite element solution and the analytical solution, how would I interpolate/project u_e onto V_h?
Without a specific formula for u_e, it is quite hard to give a general rule that would work for your case. Please provide a minimal toy example that illustrates what u_e could be, i.e. what is u_e(x, y, t)?