I have a diffusion model that contains a channel (125 microns) inside a cylinder 3 mm in diameter.
The initial condition I set is C = 100 ng/ml in the channel and C = 0 in the cylinder.
But when I solve over a period of 30 days, the average C in the channel at t = 0 comes out around 55 ng/ml instead of 100.
I also noticed that when I shorten the simulated time range, say to 12 hours, the result improves and the average concentration at t = 0 is about 90 ng/ml.
Why is this happening? The model is very simple: just a time-dependent diffusion model.
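For context, here is a rough 1D finite-difference sketch (Python, not my actual model) of the same kind of setup: a 125 micron channel centred in a 3 mm domain, C = 100 ng/ml inside and 0 outside. The diffusion coefficient D and the two "first output interval" values are placeholders I made up. It only illustrates how quickly the channel average drops, so if the value reported at t = 0 reflects any averaging or smoothing over an initial interval that grows with the study length, a drop like the one I see would be expected.

```python
import numpy as np

# Rough 1D sketch of the setup (not the actual model): a 125 um channel
# centred in a 3 mm domain, C = 100 ng/ml inside the channel, 0 outside.
# D is an assumed placeholder, not a measured value.
D = 1e-12            # m^2/s, assumed diffusion coefficient
L = 3e-3             # m, 1D stand-in for the 3 mm cylinder diameter
channel = 125e-6     # m, channel width
N = 601
x = np.linspace(0.0, L, N)
dx = x[1] - x[0]
dt = 0.4 * dx**2 / D             # stable explicit time step

in_channel = np.abs(x - L / 2) <= channel / 2
C0 = np.where(in_channel, 100.0, 0.0)    # ng/ml initial condition

def channel_average_after(t_first, C_init):
    """Average channel concentration after diffusing for t_first seconds."""
    C = C_init.copy()
    t = 0.0
    while t < t_first:
        C[1:-1] += D * dt / dx**2 * (C[2:] - 2.0 * C[1:-1] + C[:-2])
        t += dt
    return C[in_channel].mean()

print("initial channel average:", C0[in_channel].mean())   # exactly 100
# Hypothetical "first output" intervals for a long vs a short study:
print("after 1 day:  ", channel_average_after(86400.0, C0))
print("after 10 min: ", channel_average_after(600.0, C0))
```

The exact numbers depend entirely on the assumed D, but the trend is the point: the longer the first interval over which anything is averaged or smoothed, the lower the apparent channel concentration "at the start".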