Research
My research lies at the intersection of probabilistic machine learning, computational statistics, and deep learning. I am broadly interested in developing rigorous, uncertainty-aware methods that are both theoretically grounded and practically effective.
Probabilistic Machine Learning
I am interested in Bayesian approaches to learning, where uncertainty is treated as a first-class citizen. Key topics include:
- Bayesian inference and approximate posterior methods
- Uncertainty quantification in neural network predictions
- Probabilistic forecasting for real-world time series
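As a minimal illustration of the first two bullets, a conjugate Gaussian model admits a closed-form posterior, making the effect of data on uncertainty explicit. The sketch below (plain Python; the prior and noise parameters are illustrative, not tied to any specific project) shows how observing data both shifts the posterior mean and shrinks posterior uncertainty below the prior's.

```python
import math

def gaussian_posterior(xs, mu0=0.0, tau0=1.0, sigma=1.0):
    """Closed-form posterior for the mean of a Gaussian likelihood with
    known noise sigma, under a conjugate N(mu0, tau0^2) prior."""
    n = len(xs)
    prec = 1.0 / tau0**2 + n / sigma**2               # posterior precision
    mean = (mu0 / tau0**2 + sum(xs) / sigma**2) / prec
    return mean, math.sqrt(1.0 / prec)                # posterior mean, std

# Five observations near 2.0: the posterior mean is pulled from the
# prior mean (0.0) toward the sample mean (2.0), and the posterior
# standard deviation drops below the prior's tau0 = 1.0.
mean, std = gaussian_posterior([1.9, 2.1, 2.0, 1.8, 2.2])
```

The same shrinkage logic underlies more elaborate approximate-posterior methods, where the update is no longer available in closed form.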
Computational Statistics
A significant part of my work focuses on the computational aspects of statistical inference:
- Monte Carlo methods and sequential Monte Carlo
- Kernel-based methods for statistical testing and discrepancy measures
- Stein operators and related techniques in modern computational statistics
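A standard kernel-based discrepancy of the kind mentioned above is the maximum mean discrepancy (MMD). The sketch below implements the unbiased squared-MMD estimator with an RBF kernel on scalar samples (bandwidth and sample sizes are illustrative); samples from the same distribution give an estimate near zero, while a mean shift makes it clearly positive.

```python
import math
import random

def rbf(x, y, bw=1.0):
    """RBF (Gaussian) kernel on scalars."""
    return math.exp(-(x - y) ** 2 / (2 * bw**2))

def mmd2_unbiased(xs, ys, bw=1.0):
    """Unbiased estimate of the squared MMD between two samples,
    a kernel two-sample discrepancy measure."""
    m, n = len(xs), len(ys)
    kxx = sum(rbf(a, b, bw) for i, a in enumerate(xs)
              for j, b in enumerate(xs) if i != j) / (m * (m - 1))
    kyy = sum(rbf(a, b, bw) for i, a in enumerate(ys)
              for j, b in enumerate(ys) if i != j) / (n * (n - 1))
    kxy = sum(rbf(a, b, bw) for a in xs for b in ys) / (m * n)
    return kxx + kyy - 2 * kxy

random.seed(0)
ref = [random.gauss(0, 1) for _ in range(200)]
same = [random.gauss(0, 1) for _ in range(200)]     # same distribution
shift = [random.gauss(3, 1) for _ in range(200)]    # mean-shifted
```

Stein discrepancies follow a similar pattern but replace the plain kernel with one built from a Stein operator, so that only samples from one of the two distributions are needed.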
Deep Learning
I apply and develop deep learning architectures for structured data problems:
- Sequence modelling — Temporal Convolutional Networks, Transformers
- Graph Neural Networks — CoGraphNet and co-occurrence graph representations
- Sparse attention mechanisms for efficient long-range dependency modelling
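CoGraphNet's precise graph construction follows the paper; as a generic illustration of co-occurrence graphs (not the paper's specific recipe), a sliding-window builder that weights edges by how often two words appear near each other can be sketched as:

```python
from collections import Counter

def cooccurrence_edges(tokens, window=2):
    """Weighted undirected co-occurrence edges: two distinct words are
    linked whenever they appear within `window` positions of each other,
    and the edge weight counts how often that happens."""
    edges = Counter()
    for i, w in enumerate(tokens):
        for v in tokens[i + 1 : i + 1 + window]:
            if w != v:
                edges[tuple(sorted((w, v)))] += 1
    return edges

tokens = "graph neural networks learn graph structure".split()
edges = cooccurrence_edges(tokens, window=2)
# "graph" and "networks" co-occur twice within the window, so that
# edge carries weight 2; most other pairs carry weight 1.
```

The resulting weighted adjacency can then be fed to a graph neural network as the text's structural representation.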
Selected Publications
CoGraphNet for enhanced text classification with co-occurrence graph-based representation learning
Published in Scientific Reports, 2025
CoGraphNet is a graph neural network framework that builds word co-occurrence graphs and learns graph-based representations of text to improve classification performance.
Recommended citation: Chen, J., et al. (2025). "CoGraphNet for enhanced text classification with co-occurrence graph-based representation learning." Scientific Reports.
A novel offshore wind power prediction model based on TCN-DANet-sparse transformer
Published in Energy, 2024
A deep learning model that combines a Temporal Convolutional Network (TCN), a Dual Attention Network (DANet), and a Sparse Transformer for accurate offshore wind power prediction, featuring dual-channel feature extraction and multi-scale temporal fusion.
Recommended citation: Chen, J., et al. (2024). "A novel offshore wind power prediction model based on TCN-DANet-sparse transformer." Energy.