I referred to an answer in the thread "Program killed when running compute_gradient on transient problem" and updated my code as follows:
# Build the reduced functional (this does not yet perform the gradient computation)
compute_grad_start_time = time.time()
dLdk = ReducedFunctional(L, control_k)
compute_grad_end_time = time.time()
# derivative() triggers the adjoint solve; average the resulting gradient entries
vector_start_time = time.time()
grad_L_k = dLdk.derivative().vector().get_local().mean()
vector_end_time = time.time()
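As a side note on the measurement itself: in pyadjoint the `ReducedFunctional` constructor is cheap, while `derivative()` is what triggers the adjoint solve, so the second timer is the one capturing the actual gradient work. A small helper keeps the two phases clearly separated (a sketch; `timed` and the `results` dict are my own names, not part of pyadjoint):

```python
import time
from contextlib import contextmanager

@contextmanager
def timed(label, results):
    """Record the wall-clock duration of the enclosed block under `label`."""
    start = time.time()
    try:
        yield
    finally:
        results[label] = time.time() - start

# Hypothetical usage around the two phases above:
# results = {}
# with timed("build_reduced_functional", results):
#     dLdk = ReducedFunctional(L, control_k)
# with timed("derivative", results):
#     grad_L_k = dLdk.derivative().vector().get_local().mean()
```

This avoids scattering `*_start_time` / `*_end_time` pairs through the loop and makes it easy to log all phase durations per epoch.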
However, even with this modification, the execution time continues to increase as the epochs progress. Specifically, vector_time grows as follows:
- Epoch 1: vector_time = 0:05:52
- Epoch 2: vector_time = 0:06:24
- Epoch 3: vector_time = 0:06:56
- Epoch 4: vector_time = 0:07:24
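For what it's worth, the growth looks close to linear: each epoch adds roughly 30 seconds. A quick sanity check on the figures above (plain Python, nothing FEniCS-specific):

```python
# vector_time per epoch, as reported above (h:mm:ss)
times = ["0:05:52", "0:06:24", "0:06:56", "0:07:24"]

def to_seconds(t):
    """Convert an h:mm:ss string to total seconds."""
    h, m, s = (int(p) for p in t.split(":"))
    return 3600 * h + 60 * m + s

secs = [to_seconds(t) for t in times]
deltas = [b - a for a, b in zip(secs, secs[1:])]
print(deltas)  # → [32, 32, 28], i.e. a roughly constant increase per epoch
```

A near-constant per-epoch increase like this is what I would expect if some per-epoch state (for example, the annotation tape) keeps accumulating rather than being reset between epochs.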