Citation: WANG Hong-ye, QIAN Quan, WU Xing. Incremental learning of material absorption coefficient regression based on parameter penalty and experience replay[J]. Chinese Journal of Engineering, 2023, 45(7): 1225-1231. doi: 10.13374/j.issn2095-9389.2022.05.03.006
[1]
梁李斯, 郭文龙, 马洪月, 等. 多孔吸声材料吸声性能预测及吸声模型研究进展. 材料导报, 2022(23): 1
Liang L S, Guo W L, Ma H Y, et al. Research progress of sound absorption performance prediction and sound absorption model of porous sound-absorbing materials. Mater Rep, 2022(23): 1
[2]
Ciaburro G, Iannace G, Ali M, et al. An artificial neural network approach to modelling absorbent asphalts acoustic properties. J King Saud Univ Eng Sci, 2021, 33(4): 213
[3]
Iannace G, Ciaburro G, Trematerra A. Modelling sound absorption properties of broom fibers using artificial neural networks. Appl Acoust, 2020, 163: 107239 doi: 10.1016/j.apacoust.2020.107239
[4]
翟婷婷, 高阳, 朱俊武. 面向流数据分类的在线学习综述. 软件学报, 2020, 31(4): 912 doi: 10.13328/j.cnki.jos.005916
Zhai T T, Gao Y, Zhu J W. Survey of online learning algorithms for streaming data classification. J Softw, 2020, 31(4): 912 doi: 10.13328/j.cnki.jos.005916
[5]
董家源, 杨小渝. 材料数据挖掘与机器学习工具的集成与优化. 数据与计算发展前沿, 2020, 2(4): 105
Dong J Y, Yang X Y. Integration and optimization of material data mining and machine learning tools. Front Data & Comput, 2020, 2(4): 105
[6]
Kirkpatrick J, Pascanu R, Rabinowitz N, et al. Overcoming catastrophic forgetting in neural networks. PNAS, 2017, 114(13): 3521 doi: 10.1073/pnas.1611835114
[7]
Mai Z D, Li R W, Jeong J, et al. Online continual learning in image classification: An empirical survey. Neurocomputing, 2022, 469: 28 doi: 10.1016/j.neucom.2021.10.021
[8]
Parisi G I, Kemker R, Part J L, et al. Continual lifelong learning with neural networks: A review. Neural Netw, 2019, 113: 54 doi: 10.1016/j.neunet.2019.01.012
[9]
Li Z Z, Hoiem D. Learning without forgetting. IEEE Trans Pattern Anal Mach Intell, 2018, 40(12): 2935 doi: 10.1109/TPAMI.2017.2773081
[10]
Zenke F, Poole B, Ganguli S. Continual learning through synaptic intelligence // Proceedings of the 34th International Conference on Machine Learning. Sydney, 2017: 3987
[11]
Chaudhry A, Dokania P K, Ajanthan T, et al. Riemannian walk for incremental learning: Understanding forgetting and intransigence // Proceedings of the European Conference on Computer Vision. Munich, 2018: 556
[12]
Rebuffi S A, Kolesnikov A, Sperl G, et al. iCaRL: Incremental classifier and representation learning // Conference on Computer Vision and Pattern Recognition. Honolulu, 2017: 5533
[13]
Aljundi R, Caccia L, Belilovsky E, et al. Online continual learning with maximally interfered retrieval // Proceedings of the 33rd International Conference on Neural Information Processing Systems. Vancouver, 2019: 11872
[14]
Aljundi R, Lin M, Goujaud B, et al. Gradient based sample selection for online continual learning // Proceedings of the 33rd International Conference on Neural Information Processing Systems. Vancouver, 2019: 11817
[15]
Prabhu A, Torr P H S, Dokania P K. GDumb: A simple approach that questions our progress in continual learning // European Conference on Computer Vision. Glasgow, 2020: 524
[16]
Mallya A, Lazebnik S. PackNet: Adding multiple tasks to a single network by iterative pruning // 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition. Salt Lake City, 2018: 7765
[17]
Li X L, Zhou Y, Wu T, et al. Learn to grow: A continual structure learning framework for overcoming catastrophic forgetting // International Conference on Machine Learning. Long Beach, 2019: 3925
[18]
Lange M D, Aljundi R, Masana M, et al. A continual learning survey: Defying forgetting in classification tasks. IEEE Trans Pattern Anal Mach Intell, 2022, 44(7): 3366
[19]
Mai Z D, Li R W, Kim H, et al. Supervised contrastive replay: Revisiting the nearest class mean classifier in online class-incremental continual learning // Conference on Computer Vision and Pattern Recognition. Online, 2021: 1177
[20]
Hayes T L, Cahill N D, Kanan C. Memory efficient experience replay for streaming learning // International Conference on Robotics and Automation. Montreal, 2019: 9769
[21]
Liu Y Y, Su Y T, Liu A N, et al. Mnemonics training: Multi-class incremental learning without forgetting // 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition. Seattle, 2020: 12242
[22]
Chaudhry A, Dokania P K, Ajanthan T, et al. Riemannian walk for incremental learning: Understanding forgetting and intransigence // Proceedings of the European Conference on Computer Vision. Munich, 2018: 556
[23]
Lesort T, Lomonaco V, Stoian A, et al. Continual learning for robotics: Definition, framework, learning strategies, opportunities and challenges. Inf Fusion, 2020, 58: 52 doi: 10.1016/j.inffus.2019.12.004
[24]
Lopez-Paz D, Ranzato M A. Gradient episodic memory for continual learning // Proceedings of the 31st International Conference on Neural Information Processing Systems. Long Beach, 2017: 6470
[25]
Aljundi R, Babiloni F, Elhoseiny M, et al. Memory aware synapses: Learning what (not) to forget // Proceedings of the European Conference on Computer Vision. Munich, 2018: 144