Abstract
Sand is a crucial material in construction, and analyzing its particle size distribution is vital for the safety and durability of civil infrastructure. Traditional sieve analysis is labor-intensive, while modern convolutional neural network (CNN) methods often lack depth information and comprehensive evaluation metrics, leading to inaccuracies. We propose a LiDAR-fused DenseNet framework that combines visual and depth data from sand images using a densely connected neural network and LiDAR technology. For evaluation, we employ the Kolmogorov-Smirnov statistic to measure differences between predicted and actual particle size distributions. Extensive experiments validate the model's effectiveness, showing that it outperforms various neural networks. The integration of depth information allows our LiDAR-fused DenseNet to achieve a Kolmogorov-Smirnov value of 0.125, an Anderson-Darling value of 0.0443, a Wasserstein distance of 0.043, a mean squared error of 0.0033, and a mean absolute error of 0.043. Case studies demonstrate its practical utility and robustness, suggesting that this framework improves material assessment and automation in construction. © 2024 Elsevier Ltd.
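The abstract compares predicted and measured particle size distributions with several distributional metrics. Below is a minimal sketch, assuming both distributions are available as cumulative passing fractions at common sieve sizes; the sieve sizes and values are illustrative placeholders, not data from the paper, and the Anderson-Darling statistic (which additionally weights tail discrepancies) is omitted for brevity.

```python
import numpy as np

# Hypothetical example: cumulative passing fractions at standard sieve sizes (mm).
# These numbers are illustrative only, not results from the paper.
sieve_sizes_mm = np.array([0.075, 0.15, 0.30, 0.60, 1.18, 2.36, 4.75])
psd_true = np.array([0.02, 0.10, 0.28, 0.55, 0.78, 0.93, 1.00])  # sieve analysis
psd_pred = np.array([0.03, 0.13, 0.30, 0.51, 0.75, 0.95, 1.00])  # model prediction

# Kolmogorov-Smirnov statistic: largest absolute gap between the two CDFs.
ks = np.max(np.abs(psd_pred - psd_true))

# Wasserstein (earth mover's) distance, discretized here as the area between
# the two CDFs integrated over particle size.
wasserstein = np.trapz(np.abs(psd_pred - psd_true), sieve_sizes_mm)

# Mean squared error and mean absolute error over the sieve points.
mse = np.mean((psd_pred - psd_true) ** 2)
mae = np.mean(np.abs(psd_pred - psd_true))

print(f"KS = {ks:.4f}, Wasserstein = {wasserstein:.4f}, "
      f"MSE = {mse:.4f}, MAE = {mae:.4f}")
```

The actual paper may evaluate these metrics on a finer size grid or directly on per-particle measurements; this sketch only illustrates how the quoted quantities relate to the predicted and true cumulative distributions.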
| Original language | English |
|---|---|
| Article number | 111663 |
| Journal | Journal of Building Engineering |
| Volume | 100 |
| Online published | 24 Dec 2024 |
| DOIs | |
| Publication status | Published - 15 Apr 2025 |
Funding
This research was financially supported by HK Tech 300 of the City University of Hong Kong, the Ideation Programme Fund of the Hong Kong Science and Technology Parks Corporation (Project No. CityU22–04), and the Postgraduate Research & Practice Innovation Program of Jiangsu Province (Grant Numbers: KYCX23_0274).
Research Keywords
- Neural networks
- DenseNet
- LiDAR
- Particle size distribution
- Sand