I made an interesting observation while studying the 2D convection-diffusion equation. As the time step Δt increased, the RMSE initially increased, as expected. However, for larger Δt values I noticed an unexpected phenomenon: the error actually decreases and fluctuates (see attached figure).
The equation parameters are as follows:
Total simulation time: 1s
Domain: 1m × 1m
Flow velocity: 0.25 m/s
Diffusion coefficient: 1.12e-8 m²/s
Concentration field: values range from 1 to 100
Time step (dt): 1.00e-4 s
Spatial step (dx): 0.1 m
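For reference, here is a quick back-of-the-envelope check I wrote (not part of any solver) of the dimensionless numbers these parameters imply, assuming the diffusion coefficient is in m²/s:

```python
# Dimensionless numbers implied by the parameters above (my own quick check).
u = 0.25       # flow velocity, m/s
D = 1.12e-8    # diffusion coefficient, assumed m^2/s
dt = 1.00e-4   # time step, s
dx = 0.1       # spatial step, m

courant = u * dt / dx       # advective Courant number
diff_num = D * dt / dx**2   # diffusion number
peclet = u * dx / D         # cell Peclet number

print(f"Courant number:   {courant:.2e}")   # 2.50e-04
print(f"Diffusion number: {diff_num:.2e}")  # 1.12e-10
print(f"Cell Peclet:      {peclet:.2e}")    # 2.23e+06
```

So the scheme is far inside typical explicit stability limits, but the cell Péclet number is enormous, i.e. the problem is strongly advection-dominated on this grid.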
I think this might be related to the relationship between the number of computational steps and error accumulation. Are there any theoretical foundations that could explain this behavior? Has anyone encountered similar cases in numerical analysis?
Nothing too remarkable here. If you’re using too coarse a mesh, you’ll end up with huge errors, well over 100%. That can manifest in all kinds of ways: sometimes 200%, sometimes 250%, and sometimes, coincidentally, 75%.
Convergence theory guarantees that the error is bounded by a constant times some power of dx. As dx becomes small, this bound is typically sharp; when dx is large, the error below the bound can look essentially random. That is called the pre-asymptotic range.
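Not your PDE, but the same effect in miniature (a toy sketch I wrote for illustration): forward Euler applied to y' = -5y shows clean first-order convergence once dt is small, while for coarse dt the error is large and non-monotone — at dt = 1/4 it happens to be smaller than at dt = 1/8, purely by coincidence.

```python
import math

# Toy illustration (not the original PDE): forward Euler on y' = -lam*y,
# y(0) = 1, integrated to t = 1.  Exact answer: exp(-lam).
def euler_error(lam, dt):
    n = round(1.0 / dt)        # number of steps (dt chosen so this is exact)
    y = 1.0
    for _ in range(n):
        y += dt * (-lam * y)   # forward Euler update
    return abs(y - math.exp(-lam))

lam = 5.0
for k in range(1, 11):         # dt = 1/2, 1/4, ..., 1/1024
    dt = 2.0**-k
    print(f"dt = {dt:9.6f}   error = {euler_error(lam, dt):.3e}")
```

For the smallest dt values the error halves each time dt halves (the asymptotic range); for the coarse dt values it jumps around with no clear trend (the pre-asymptotic range).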
Ah, I referred to dx because I read your other post first. The same thing applies to dt.
There was no need to create two posts for roughly the same question…
Thank you very much for your response! Your explanation of the “pre-asymptotic range” and of how coarse grids affect error behavior is invaluable, especially your point about seemingly random fluctuations in this regime.
I found some literature related to the “pre-asymptotic range,” such as: Pollock S., “An Improved Method for Solving Quasi-linear Convection Diffusion Problems on a Coarse Mesh,” SIAM Journal on Scientific Computing, 2016, 38(2): A1121-A1145.
If I understand correctly, using dt or dx within the “pre-asymptotic range” is still acceptable as long as stability is ensured. Is this common in practice? For instance, in scenarios with large computational domains (e.g., 10 km × 10 km), it might be necessary to use dx = 1 km and dt = 1 hour.
Additionally, I noticed that for larger time steps (dt), the RMSE not only fluctuates but also decreases at certain points. Could this behavior be related to round-off error effects?
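To make my round-off question concrete, here is the textbook example of the truncation/round-off trade-off (a separate toy case, unrelated to the convection-diffusion setup): the central-difference approximation of d/dx sin(x) at x = 1, whose total error first shrinks as h decreases and then grows again once round-off dominates.

```python
import math

# Classic example (unrelated to the PDE) of round-off competing with
# truncation error: central-difference approximation of cos(1).
def central_diff_error(h):
    approx = (math.sin(1.0 + h) - math.sin(1.0 - h)) / (2.0 * h)
    return abs(approx - math.cos(1.0))

for h in [1e-1, 1e-3, 1e-5, 1e-7, 1e-9, 1e-11]:
    print(f"h = {h:.0e}   error = {central_diff_error(h):.3e}")
```

In double precision the error bottoms out around h ≈ 1e-5 and then increases again for smaller h. I wonder whether a similar competition between step count and per-step error could be at play in my RMSE plot.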
I’d greatly appreciate any further thoughts or experiences you have with such cases. Thank you very much again!
Thank you for pointing this out. I created two separate posts because I thought there might be unique aspects to each parameter. Specifically, for dx I performed uniform sampling from a dense Gaussian distribution, which I felt might introduce some effects worth isolating in a separate discussion.
I hope this clarifies my reasoning for splitting the posts, and I’m very grateful for your feedback on this post.