A Kullback–Leibler divergence method for input–system–state identification

Research output: Contribution to journal › Article › peer-review

9 Citations (SciVal)
3 Downloads (Pure)

Abstract

The capability of a novel Kullback–Leibler divergence method is examined herein within the Kalman filter framework to select the input–parameter–state estimation run with the most plausible results. Such identification suffers from the uncertainty of obtaining different results from different initial parameter guesses, and the examined approach addresses this issue using the information gained from the data in going from the prior to the posterior distribution. First, the Kalman filter is run for a number of different initial parameter sets, providing the system input–parameter–state estimation. Second, the resulting posterior distributions are simultaneously compared with the initial prior distributions using the Kullback–Leibler divergence. Finally, the identification with the smallest Kullback–Leibler divergence is selected as the one with the most plausible results. Importantly, the method is shown to select the better-performing identification in linear, nonlinear, and limited-information applications, providing a powerful tool for system monitoring.
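The selection step described above can be sketched in a few lines. Since Kalman-type filters maintain Gaussian (mean, covariance) summaries of the parameters, the prior-to-posterior Kullback–Leibler divergence has a closed form, and the run with the smallest divergence is selected. The snippet below is a minimal illustration of that idea, not the paper's implementation; the divergence direction (posterior relative to prior) and the function names `kl_gaussian` and `select_run` are assumptions for the sketch.

```python
import numpy as np

def kl_gaussian(mu0, cov0, mu1, cov1):
    """Closed-form KL divergence KL(N(mu0, cov0) || N(mu1, cov1)) for Gaussians."""
    k = mu0.shape[0]
    inv1 = np.linalg.inv(cov1)
    diff = mu1 - mu0
    return 0.5 * (
        np.trace(inv1 @ cov0)          # covariance mismatch term
        + diff @ inv1 @ diff           # mean-shift term
        - k                            # dimension offset
        + np.log(np.linalg.det(cov1) / np.linalg.det(cov0))  # log-volume ratio
    )

def select_run(priors, posteriors):
    """Pick the filter run whose posterior has the smallest KL divergence
    from its prior (direction assumed: KL(posterior || prior))."""
    kls = [
        kl_gaussian(mu_post, cov_post, mu_pri, cov_pri)
        for (mu_pri, cov_pri), (mu_post, cov_post) in zip(priors, posteriors)
    ]
    return int(np.argmin(kls)), kls

# Toy usage: two candidate runs started from different initial parameter guesses.
priors = [(np.zeros(2), np.eye(2)), (np.zeros(2), np.eye(2))]
posteriors = [
    (np.array([0.1, 0.0]), 0.9 * np.eye(2)),  # posterior close to its prior
    (np.array([3.0, 3.0]), 0.1 * np.eye(2)),  # posterior far from its prior
]
best, kls = select_run(priors, posteriors)
```

In this toy case the first run is selected, since its posterior moved least from the prior in the KL sense.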
Original language: English
Article number: 117965
Number of pages: 32
Journal: Journal of Sound and Vibration
Volume: 569
Early online date: 29 Jul 2023
DOIs
Publication status: Published - 20 Jan 2024

Data Availability Statement

Data will be made available on request.

Acknowledgements

The author would like to gratefully acknowledge the reviewers for their constructive comments, and Andrew W. Smyth for previous insightful discussions on the topic.

Keywords

  • Kullback–Leibler divergence
  • Relative entropy
  • Bayesian inference
  • Kalman filter
  • Unscented Kalman filter (UKF)
  • Residual-based Kalman filter (RKF)
  • Particle filter
  • Extended Kalman filter (EKF)
  • Gaussian process latent force model
  • Bayesian expectation–maximization
  • Input–parameter–state estimation
  • Output-only system identification
  • Unknown input estimation
  • Joint input-state-parameter estimation
  • Nonlinear system identification
  • Linear system identification
  • Limited information identification
  • Real-time system monitoring
  • Dynamic state estimation
  • Operational modal analysis
  • Stochastic subspace identification
  • Structural health monitoring (SHM)
  • Damage detection
  • Fault detection
  • Sensor optimization
  • Sparse sensing
  • Pseudo-measurements
  • Ambient vibration monitoring
  • Mechanical systems
  • Civil engineering structures
  • Multi-degree-of-freedom (MDOF) systems
  • Duffing nonlinear system
  • Large-scale structural systems
  • Wind-excited systems
  • Traffic and wind load estimation
  • Bayesian model evidence
  • Uncertainty quantification
  • Regularization parameter
  • Sensitivity analysis
  • Markov chain Monte Carlo (MCMC)
  • Gaussian distributions
  • Real-time filtering
  • Recursive estimation
  • Convergence analysis
  • Measurement noise handling
  • Limited sensor data
  • Parameter identifiability
  • Observability rank condition
  • Noise calibration
  • Algorithmic parameter tuning
  • Machine learning in structural monitoring
  • Information-theoretic methods
  • Model updating
  • Data-driven identification
  • Online estimation algorithms
  • Non-collocated sensing
  • Harmonic disturbance rejection

ASJC Scopus subject areas

  • Chemical Engineering (all)
  • Computer Science (all)
  • Economics, Econometrics and Finance (all)
  • Energy (all)
  • Engineering (all)
  • Materials Science (all)
  • Mathematics (all)
  • Physics and Astronomy (all)
  • Decision Sciences (all)
  • Environmental Science (all)
