1. College of Optoelectronic Engineering, Taiyuan University of Technology, Taiyuan 030024, Shanxi, China
2. Key Laboratory of Advanced Transducers and Intelligent Control System of the Ministry of Education, Taiyuan University of Technology, Taiyuan 030024, Shanxi, China
3. Department of Clinical Laboratory, Shanxi Bethune Hospital and Shanxi Academy of Medical Sciences, Taiyuan 030032, Shanxi, China
4. Medical Department, Shanxi Bethune Hospital and Shanxi Academy of Medical Sciences, Taiyuan 030032, Shanxi, China
YANG Lingzhen, female, professor, Ph.D., doctoral supervisor; council member of the Shanxi Physical Society and the Shanxi Optical Society. Her current research focuses on experimental and theoretical work on fiber laser technology and its applications. She has led and completed three National Natural Science Foundation of China projects and four provincial- and ministerial-level projects, and was selected as a Young and Middle-aged Top Innovative Talent of Shanxi Higher Education Institutions. Email: yanglingzhen@tyut.edu.cn
Published in print: 2023-03-25
Received: 2023-01-20
Revised: 2023-02-25
ZHANG Yujing, YANG Lingzhen, SHANG Huifeng, et al. Intelligent classification of breast diffuse optical tomography images using NIRFAST[J]. Emerging Science and Technology, 2023, 2(1): 19-27. DOI: 10.12405/j.issn.2097-1486.2023.01.003.
This paper uses NIRFAST, an open-source software package, to simulate breast diffuse optical tomography (DOT) and applies a deep learning method to classify the resulting DOT images intelligently. The images are simulated and reconstructed with NIRFAST. In these images the size, location, and optical properties of breast tumors vary; different types of breast tumors are distinguished by their light absorption coefficients, and a computer-aided diagnosis method analyzes the images to separate benign tumors from malignant tumors of different grades. To improve feature extraction, a DOTResNet model is proposed, obtained by replacing the 7×7 convolution in the input stem of ResNet-34 with three 3×3 convolutions and adding batch normalization (BN) and rectified linear unit (ReLU) layers. Compared with ResNet-34 and the pre-trained ResNet-34, DOTResNet reduces the FLOPs and raises the classification accuracy to 99.38%. Compared with the canonical ResNet-34, the tweaked DOTResNet model achieves higher classification accuracy on breast diffuse optical tomography images.
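The input-stem modification described above can be written as a short PyTorch sketch. This is a minimal illustration under stated assumptions, not the authors' implementation: the channel widths of the three 3×3 convolutions (32, 32, 64), the use of torchvision's resnet34, and the number of output classes are assumptions made for the example.

```python
# Sketch of a ResNet-34 whose 7x7 stride-2 input stem is replaced by three
# stacked 3x3 convolutions, each followed by BatchNorm and ReLU (assumed
# channel widths 32 -> 32 -> 64; the first conv keeps stride 2 so the stem's
# overall downsampling is unchanged).
import torch
import torch.nn as nn
from torchvision.models import resnet34


def conv_bn_relu(in_ch, out_ch, stride=1):
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=3, stride=stride, padding=1, bias=False),
        nn.BatchNorm2d(out_ch),
        nn.ReLU(inplace=True),
    )


def build_dotresnet(num_classes=4):  # num_classes is a placeholder assumption
    model = resnet34(weights=None)
    # Replace the original 7x7/stride-2 convolution with three 3x3 convolutions.
    model.conv1 = nn.Sequential(
        conv_bn_relu(3, 32, stride=2),
        conv_bn_relu(32, 32),
        conv_bn_relu(32, 64),
    )
    # BN and ReLU are already applied inside the new stem, so the stand-alone
    # bn1/relu of the stock ResNet-34 can be bypassed with identity mappings.
    model.bn1 = nn.Identity()
    model.relu = nn.Identity()
    model.fc = nn.Linear(model.fc.in_features, num_classes)
    return model


if __name__ == "__main__":
    net = build_dotresnet(num_classes=4)
    x = torch.randn(1, 3, 224, 224)
    print(net(x).shape)  # torch.Size([1, 4])
```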
Keywords: deep learning; diffuse optical tomography; breast tumor; ResNet