Unable to run diffusion

Hi, I am trying to do a simulation of diffusion over time, but my field values are not being updated. What seems to be the issue here?

from __future__ import print_function
from mshr import *
import matplotlib
matplotlib.use('Agg')
from matplotlib import ticker
import numpy as np
import matplotlib.pyplot as plt
from dolfin import *

# Units are seconds, OD, uM, mm
T = 50*60*60           # final time
num_steps = 1000    # number of time steps
dt = T / num_steps # time step size
D = Constant(50.2e-3)
Dn = Constant(50.2e-5)
Da = Constant(50.2e-4)
chi_0 = Constant(315e-6)
am = Constant(1e-3)
folder = 'outputs_test/'

# Read mesh from file
mesh = generate_mesh(Circle(Point(0.0, 0.0), 10), 128)

# Define function space for velocity
W = VectorFunctionSpace(mesh, 'P', 2)

# Define function space for system of concentrations
P1 = FiniteElement('P', triangle, 1)
element = MixedElement([P1, P1, P1])
V = FunctionSpace(mesh, element)

# Define test functions
rho_test, n_test, a_test = TestFunctions(V)

# Define functions for velocity and concentrations
w = Function(W)
u = Function(V)
u_n = Function(V)

# Split system functions to access components
rho_trial, n_trial, a_trial = split(u)
rho_iter, n_iter, a_iter = split(u_n)

rho_iter = Expression('(tanh((4-x[0]*x[0]-x[1]*x[1]))+1)*.029/2', degree=3, R=1)
n_iter   = Constant('40.0')
a_iter   = Constant('0.1')

# Define source terms
f_1 = Constant(0)
f_2 = Constant(0)
f_3 = Constant(0)
k = Constant(dt)

# Define variational problem
w=project(chi_0/am*grad(a_trial))
F = (((rho_trial - rho_iter) / k)*rho_test*dx 
    #+ dot(w, grad(rho_trial))*rho_test*dx # Advection
    + D*dot(grad(rho_trial), grad(rho_test))*dx #Diffusion
    + ((n_trial - n_iter) / k)*n_test*dx 
    + Dn*dot(grad(a_trial), grad(n_test))*dx #Diffusion
    + ((a_trial - a_iter) / k)*a_test*dx
    + Da*dot(grad(n_trial), grad(a_test))*dx #Diffusion
    - f_1*rho_test*dx - f_2*n_test*dx - f_3*a_test*dx)

# Time-stepping
t = 0
for n in range(num_steps):
        
    dpi=200
    # Update current time
    t += dt

    # Define variational problem
    w=project(chi_0/am*grad(a_trial))

    # Solve variational problem for time step
    solve(F == 0, u)

    _rho_trial, _n_trial, _a_trial = u.split()
    
    # Update previous solution
    u_n.assign(u)
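For reference, here is my reading of what the residual F above solves at each step (a sketch, with the advection term commented out as in the code, writing $v_\rho, v_n, v_a$ for the test functions and $k = \Delta t$):

$$\int_\Omega \frac{\rho - \rho^n}{k}\,v_\rho + D\,\nabla\rho\cdot\nabla v_\rho - f_1\,v_\rho \,dx = 0$$

$$\int_\Omega \frac{n - n^n}{k}\,v_n + D_n\,\nabla a\cdot\nabla v_n - f_2\,v_n \,dx = 0$$

$$\int_\Omega \frac{a - a^n}{k}\,v_a + D_a\,\nabla n\cdot\nabla v_a - f_3\,v_a \,dx = 0$$

Note that, as written, the n equation diffuses grad(a_trial) and the a equation diffuses grad(n_trial), i.e. the two fields diffuse each other rather than themselves. If that cross-coupling is not intentional, it is worth double-checking, since it changes the character of the system.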

I should note that the main problem I run into, even after varying the problem formulation, is the appearance of singularities. For example, the field of a starts out looking as follows, and then singularities begin to emerge

The way you’re assigning the initial conditions rebinds the names rho_iter, n_iter, and a_iter to new objects, disconnecting them from u_n. Try something like:

#rho_iter = Expression('(tanh((4-x[0]*x[0]-x[1]*x[1]))+1)*.029/2', degree=3, R=1)
#n_iter   = Constant('40.0')
#a_iter   = Constant('0.1')
u_n.interpolate(Expression(('(tanh((4-x[0]*x[0]-x[1]*x[1]))+1)*.029/2',
                            '40.0','0.1'),degree=3, R=1))

instead.
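The underlying issue is plain Python semantics and can be seen without FEniCS at all: rebinding a name never modifies the object that name previously pointed to, so assigning a new Expression to rho_iter leaves u_n untouched. A minimal sketch, with a dict of lists standing in for the split components of a Function:

```python
# Plain-Python analogue of the bug: names obtained from a "split" are
# just references into the parent object.
u_n = {"rho": [0.0, 0.0], "n": [0.0, 0.0]}  # stands in for Function(V)
rho_iter = u_n["rho"]                        # stands in for split(u_n)

# Rebinding the name creates a NEW object; the parent is not updated.
rho_iter = [40.0, 40.0]
print(u_n["rho"])   # still [0.0, 0.0]

# Mutating through the reference (the analogue of interpolating into
# u_n itself, or using a FunctionAssigner) does update the parent.
rho_iter = u_n["rho"]
rho_iter[:] = [40.0, 40.0]
print(u_n["rho"])   # now [40.0, 40.0]
```

This is why interpolating directly into u_n works: it writes into the object that the variational form actually references.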

As @kamensky said, you overwrite your variables before you define the variational formulation. The same problem applies to w, whose advection term is currently commented out in the code.
Alternatively, you can use a FunctionAssigner to set initial conditions; see the function assigner test for example usage.

Thanks! That definitely helped. However, I still have the issue that singularities emerge. Is there any way to regularise the solutions? Examples of the emerging singularity are shown below: