### Abstract

In this paper, we consider the fast numerical solution of an optimal control formulation of the Keller--Segel model for bacterial chemotaxis. Upon discretization, this problem requires the solution of huge-scale saddle point systems to guarantee accurate solutions. We consider the derivation of effective preconditioners for these matrix systems, which may be embedded within suitable iterative methods to accelerate their convergence. We also construct low-rank tensor-train techniques which enable us to present efficient and feasible algorithms for problems that are finely discretized in the space and time variables. Numerical results demonstrate that the number of preconditioned GMRES iterations depends mildly on the model parameters. Moreover, the low-rank solver makes the computing time and memory costs sublinear in the original problem size.

Original language | English
---|---
Journal | SIAM Journal on Scientific Computing
Volume | accepted
Publication status | Accepted/In press - 19 Aug 2019
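The preconditioned-GMRES approach the abstract describes can be illustrated with a minimal, generic sketch: a block-diagonal preconditioner for a saddle-point system of the form [[A, Bᵀ], [B, 0]], applied inside SciPy's GMRES. This is not the paper's preconditioner; the matrices `A`, `B` and the Schur-complement approximation `S` below are made-up stand-ins chosen only to show the solver structure.

```python
# Illustrative sketch only (not the paper's method): block-diagonal
# preconditioning of a generic saddle-point system [[A, B^T], [B, 0]]
# with GMRES. A, B, and the Schur approximation S are hypothetical.
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

n, m = 50, 20
rng = np.random.default_rng(0)
A = sp.diags(rng.uniform(1.0, 2.0, n))            # SPD (1,1) block
B = sp.random(m, n, density=0.3, random_state=0)  # constraint block
K = sp.bmat([[A, B.T], [B, None]], format="csr")  # saddle-point matrix

# Preconditioner P = diag(A, S), with S = B diag(A)^{-1} B^T, a common
# Schur-complement approximation (lightly regularized for safety).
S = (B @ sp.diags(1.0 / A.diagonal()) @ B.T + 1e-10 * sp.eye(m)).tocsc()
A_solve = spla.factorized(A.tocsc())
S_solve = spla.factorized(S)

def apply_P(r):
    # Apply P^{-1} blockwise to a residual vector r = [r_A; r_S].
    return np.concatenate([A_solve(r[:n]), S_solve(r[n:])])

P = spla.LinearOperator(K.shape, matvec=apply_P)
b = rng.standard_normal(n + m)
x, info = spla.gmres(K, b, M=P)
print("converged:", info == 0)
```

With a good Schur-complement approximation, the preconditioned iteration count is small and, as the abstract notes for the actual problem, depends only mildly on the parameters; the toy system here converges in a handful of iterations.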

### Keywords

- math.NA
- 35Q93, 65F08, 65F10, 65N22, 92C17

### Cite this

**Preconditioners and Tensor Product Solvers for Optimal Control Problems from Chemotaxis.** / Dolgov, Sergey; Pearson, John W.

Research output: Contribution to journal › Article

*SIAM Journal on Scientific Computing*, vol. accepted.

```
TY - JOUR
T1 - Preconditioners and Tensor Product Solvers for Optimal Control Problems from Chemotaxis
AU - Dolgov, Sergey
AU - Pearson, John W.
N1 - 23 pages
PY - 2019/8/19
Y1 - 2019/8/19
N2 - In this paper, we consider the fast numerical solution of an optimal control formulation of the Keller--Segel model for bacterial chemotaxis. Upon discretization, this problem requires the solution of huge-scale saddle point systems to guarantee accurate solutions. We consider the derivation of effective preconditioners for these matrix systems, which may be embedded within suitable iterative methods to accelerate their convergence. We also construct low-rank tensor-train techniques which enable us to present efficient and feasible algorithms for problems that are finely discretized in the space and time variables. Numerical results demonstrate that the number of preconditioned GMRES iterations depends mildly on the model parameters. Moreover, the low-rank solver makes the computing time and memory costs sublinear in the original problem size.
AB - In this paper, we consider the fast numerical solution of an optimal control formulation of the Keller--Segel model for bacterial chemotaxis. Upon discretization, this problem requires the solution of huge-scale saddle point systems to guarantee accurate solutions. We consider the derivation of effective preconditioners for these matrix systems, which may be embedded within suitable iterative methods to accelerate their convergence. We also construct low-rank tensor-train techniques which enable us to present efficient and feasible algorithms for problems that are finely discretized in the space and time variables. Numerical results demonstrate that the number of preconditioned GMRES iterations depends mildly on the model parameters. Moreover, the low-rank solver makes the computing time and memory costs sublinear in the original problem size.
KW - math.NA
KW - 35Q93, 65F08, 65F10, 65N22, 92C17
M3 - Article
VL - accepted
JO - SIAM Journal on Scientific Computing
JF - SIAM Journal on Scientific Computing
SN - 1064-8275
ER -
```
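The abstract's claim of sublinear memory costs rests on low-rank tensor-train (TT) compression. A minimal NumPy sketch of the standard TT-SVD decomposition (not the authors' space-time solver) shows how a smooth tensor is stored in far fewer parameters than its full size; the test tensor `1/(1+i+j+k)` is a hypothetical example chosen for its rapidly decaying ranks.

```python
# Illustrative sketch only: TT-SVD compression of a smooth 3D tensor,
# demonstrating the storage savings behind tensor-train methods.
import numpy as np

def tt_svd(T, eps=1e-8):
    """Decompose tensor T into TT cores via successive truncated SVDs."""
    dims = T.shape
    cores, r = [], 1
    M = T.reshape(r * dims[0], -1)
    for k in range(len(dims) - 1):
        U, s, Vt = np.linalg.svd(M, full_matrices=False)
        rk = max(1, int(np.sum(s > eps * s[0])))   # truncated TT rank
        cores.append(U[:, :rk].reshape(r, dims[k], rk))
        M = (s[:rk, None] * Vt[:rk]).reshape(rk * dims[k + 1], -1)
        r = rk
    cores.append(M.reshape(r, dims[-1], 1))
    return cores

# Smooth example tensor T[i,j,k] = 1/(1+i+j+k): TT ranks decay quickly.
n = 20
i, j, k = np.meshgrid(*[np.arange(n)] * 3, indexing="ij")
T = 1.0 / (1.0 + i + j + k)
cores = tt_svd(T)
tt_size = sum(c.size for c in cores)
print("TT storage:", tt_size, "vs full:", T.size)
```

For finely discretized problems the full tensor would never be formed; all arithmetic is carried out directly on the small TT cores, which is what makes time and memory costs sublinear in the nominal problem size.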