Cao Yan, Dong Yihong, Wu Shaoqing, et al. Dynamic network representation learning: a review[J]. Acta Electronica Sinica, 2020, 48(10): 2047-2059.
[11] Mikolov T, Sutskever I, Chen Kai, et al. Distributed representations of words and phrases and their compositionality[C]//26th International Conference on Neural Information Processing Systems. North Miami Beach: Curran Associates Incorporated, 2013: 3111-3119.
[12] Cui Peng, Wang Xiao, Pei Jian, et al. A survey on network embedding[J]. IEEE Transactions on Knowledge and Data Engineering, 2019, 31(5): 833-852.
[13] Ke Guolin, Meng Qi, Finley T, et al. LightGBM: A highly efficient gradient boosting decision tree[C]//31st International Conference on Neural Information Processing Systems. Long Beach: Curran Associates Incorporated, 2017: 3146-3157.
[14] Chen Tianqi, Guestrin C. XGBoost: A scalable tree boosting system[C]//22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. San Francisco: ACM, 2016: 785-794.
[15] Yang Gun, Zhou Fangrong, Ma Yi, et al. Identifying lightning channel-base current function parameters by Powell particle swarm optimization method[J]. IEEE Transactions on Electromagnetic Compatibility, 2018, 60(1): 182-187.
[16] Meier L, van de Geer S, Bühlmann P. The group lasso for logistic regression[J]. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 2008, 70(1): 53-71.
[17] Lee J S. AUC4.5: AUC-based C4.5 decision tree algorithm for imbalanced data classification[J]. IEEE Access, 2019, 7: 106034-106042.
[18] Breiman L. Random forests[J]. Machine Learning, 2001, 45(1): 5-32.