
Analysis Method of Personnel Behavior in the Technology Project Review Process Based on Improved YOLOv5


LU Xingjian  YANG Danni  JIAO Zeyu

(Institute of Intelligent Manufacturing, Guangdong Academy of Sciences / Guangdong Key Laboratory of Modern Control Technology, Guangzhou, Guangdong 510070, China)

Abstract: To improve the quality of organizing and implementing technology project review meetings and to regulate participants' behavior during the review process, a personnel behavior analysis method based on an improved YOLOv5 is proposed. It analyzes on-site surveillance video from review meetings in real time and identifies participants' violations. First, a small-object detection network for surveillance video is built on the improved YOLOv5: the TCANet attention mechanism is fused into the YOLOv5 backbone to capture the key target regions in the review-meeting surveillance video, and an additional feature-map upsampling step is added to the head network, with the upsampled feature maps fused with shallow feature maps from the backbone, enabling detection of small objects such as mobile phones and business cards at the review meeting. Then, a participant behavior analysis algorithm is proposed: a human target tracking network model tracks participants' movement trajectories in real time, and a spatiotemporal association discriminant combining region attributes with the experts' location domains is established to identify violations such as participants contacting or conversing with experts. Experimental results show that the method achieves a detection accuracy of 0.657 for small objects (mobile phones and business cards) at the review meeting, an mAP improvement of 0.196 over YOLOv5m; the participant tracking accuracy reaches a Rank-1 of 0.938 at an image processing frame rate of 21 frames/s. The method accurately identifies participants' contact and conversation behaviors and is of significant value for the intelligent management of personnel behavior at review meetings.
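As a rough illustration of the small-object detection idea described in the abstract, the following PyTorch sketch shows how an extra upsampling stage in the detection head can be fused with a shallow, high-resolution backbone feature map to add a finer detection scale. The module structure, channel sizes, and names are illustrative assumptions and are not taken from the paper's implementation.

```python
# A minimal sketch (not the authors' code) of the small-object detection idea:
# an extra upsampling stage in the head whose output is fused with a shallow,
# high-resolution backbone feature map. Channel sizes and module names are
# illustrative assumptions.
import torch
import torch.nn as nn

class SmallObjectHead(nn.Module):
    """Adds one more upsample + concat stage so detection also runs on a
    higher-resolution feature map, which helps with small objects such as
    mobile phones and business cards."""
    def __init__(self, deep_ch=256, shallow_ch=64, num_classes=2):
        super().__init__()
        self.reduce = nn.Conv2d(deep_ch, shallow_ch, kernel_size=1)  # align channels
        self.up = nn.Upsample(scale_factor=2, mode="nearest")        # stride 8 -> stride 4
        self.fuse = nn.Sequential(                                   # fuse deep + shallow features
            nn.Conv2d(shallow_ch * 2, shallow_ch, kernel_size=3, padding=1),
            nn.BatchNorm2d(shallow_ch),
            nn.SiLU(),
        )
        # per-cell prediction: box (4) + objectness (1) + class scores
        self.pred = nn.Conv2d(shallow_ch, num_classes + 5, kernel_size=1)

    def forward(self, deep_feat, shallow_feat):
        x = self.up(self.reduce(deep_feat))      # upsample the deeper head feature
        x = torch.cat([x, shallow_feat], dim=1)  # concat with shallow backbone feature
        return self.pred(self.fuse(x))           # extra small-object detection scale

if __name__ == "__main__":
    deep = torch.randn(1, 256, 80, 80)      # e.g. stride-8 head feature for a 640x640 input
    shallow = torch.randn(1, 64, 160, 160)  # e.g. stride-4 backbone feature
    out = SmallObjectHead()(deep, shallow)
    print(out.shape)                        # torch.Size([1, 7, 160, 160])
```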

Keywords: technology project review; artificial intelligence; behavior analysis; object detection; target tracking

CLC number: TP3              Document code: A           Article ID: 1674-2605(2024)06-0010-09

DOI: 10.3969/j.issn.1674-2605.2024.06.010                     Open Access

Analysis Method of Personnel Behavior in the Technology Project Review Process Based on Improved YOLOv5

LU Xingjian  YANG Danni  JIAO Zeyu

(Institute of Intelligent Manufacturing, Guangdong Academy of Sciences / Guangdong Key Laboratory of Modern Control Technology, Guangzhou 510070, China)

Abstract: To enhance the quality of organizing and implementing technology project review meetings and to regulate participants' behavior during the review process, a behavior analysis method based on an improved YOLOv5 is proposed. This method enables real-time analysis of surveillance video data from review meetings to identify participants' violations. First, an improved YOLOv5-based small-object detection network is constructed for the surveillance video data. By integrating the TCANet attention mechanism into the YOLOv5 backbone network, the model focuses on key target areas within the surveillance footage of review meetings. Additionally, the head network incorporates an upsampling process, where the upsampled feature maps are fused with shallow feature maps from the backbone network to achieve detection of small objects such as mobile phones and business cards in the meeting environment. Next, a participant behavior analysis algorithm is proposed. Using a human target tracking network model, the system tracks participants' movement trajectories in real time. A spatiotemporal correlation model is established by combining regional attributes with the spatial domain of expert locations, enabling the detection of participant behaviors, such as interactions and conversations with experts, which may constitute violations. Experimental results demonstrate that the method achieves a detection accuracy of 0.657 for small objects like mobile phones and business cards, with an mAP improvement of 0.196 compared to YOLOv5m. The participant tracking accuracy (Rank-1) reaches 0.938, with an image processing frame rate of 21 frames per second. This approach effectively identifies participant behaviors such as contact and conversation, making significant contributions to the intelligent management of participant behavior during review meetings.
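The spatiotemporal association rule described above can be illustrated with a simple sketch: tracked participant positions are checked against the experts' location domains, and sustained presence over consecutive frames is flagged as possible contact or conversation. The data structures, threshold, and function names below are illustrative assumptions, not the authors' actual discriminant.

```python
# A minimal sketch, under assumed data structures, of a region/location-domain
# association rule: a participant's tracked center point is checked against the
# experts' location domains, and sustained proximity over consecutive frames is
# flagged as a possible contact/conversation violation. Thresholds and names are
# illustrative assumptions, not values from the paper.
from dataclasses import dataclass

@dataclass
class Region:
    """Axis-aligned region (e.g. an expert's seating area) in image coordinates."""
    x1: float
    y1: float
    x2: float
    y2: float

    def contains(self, x: float, y: float) -> bool:
        return self.x1 <= x <= self.x2 and self.y1 <= y <= self.y2

def flag_contact(track, expert_regions, min_frames=30):
    """track: list of (frame_idx, x, y) centers from the person tracker.
    Returns the frame indices at which a participant has stayed inside any
    expert region for at least `min_frames` consecutive frames."""
    violations = []
    streak = 0
    for frame_idx, x, y in track:
        if any(r.contains(x, y) for r in expert_regions):
            streak += 1
            if streak == min_frames:   # sustained presence -> possible violation
                violations.append(frame_idx)
        else:
            streak = 0                 # reset when the participant leaves the region
    return violations

if __name__ == "__main__":
    experts = [Region(100, 50, 300, 200)]
    # simulated trajectory: the participant lingers near the expert's seat
    track = [(i, 150.0, 120.0) for i in range(40)]
    print(flag_contact(track, experts))  # [29] at the assumed 30-frame threshold
```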

Keywords: technology project review; artificial intelligence; behavioral analysis; object detection; target tracking
