Catenary
Pantograph
Structural engineering
Stress (mechanics)
Materials science
Finite element method
Transient (physics)
Service life
Track (rail transport)
Engineering
Beam (structure)
Stress concentration
Stress field
Strain gauge
Eddy current
Thick plate
Contact mechanics
Ultimate tensile strength
Test data
Tensile test
Point (geometry)
Contact analysis
Creep
Vibration fatigue
Cross section (geometry)
Test method
Fatigue limit
Fatigue testing
Bending
Composite material
Authors
Xu Zhao,Yang Song,Zhigang Liu
Identifier
DOI:10.1109/tim.2022.3144747
Abstract
In an electrified railway system, the pantograph on the train roof collects electric current through sliding contact with the contact wire (CW) of the catenary. The CW is mounted in a lateral zigzag relative to the track centerline with the help of a steady arm, which ensures symmetrical wear on the pantograph strip, slows the wear process, and extends the service life. Under the pantograph's impact, the CW around the steady arm develops a stress concentration and has been recognized as a vulnerable point in the catenary system. In this article, a uniaxial tensile test and a high-cycle fatigue test are performed to measure the mechanical properties and fatigue characteristics of the Cu–Mg alloy used to manufacture high-speed CW. A stress analysis method is proposed that combines the pantograph–catenary interaction model with a solid CW specimen model. In the pantograph–catenary interaction simulation, the catenary is modeled with Euler–Bernoulli beam elements, while the CW segment is modeled with 3-D solid elements. The nodal displacements of the CW segment obtained from the interaction simulation are transferred to the CW specimen model, and the stress distribution on the CW section and the entire multiaxial transient stress field are analyzed. Based on the fatigue test results and the simulated stress time histories, uniaxial and multiaxial fatigue analyses using the nominal stress method reveal the fatigue characteristics at different speed classes. The results indicate that the weakest points appear at both the wing point and the top point of a CW section, and that the CW's fatigue life decreases by more than 50% when the speed class is upgraded from 350 to 400 km/h.
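The nominal stress method described in the abstract pairs a material S-N curve (from the fatigue tests) with counted stress cycles (from the simulated stress time history) and accumulates damage linearly. The sketch below illustrates that workflow with a Basquin-form S-N relation and Miner's rule; the material constants and cycle counts are hypothetical placeholders for illustration, not the measured Cu–Mg alloy data or simulation results from the paper.

```python
# Illustrative sketch of the nominal stress method for fatigue life
# estimation: a Basquin S-N curve plus Miner's linear damage rule.
# SIGMA_F and B_EXP are assumed placeholder values, NOT the Cu-Mg
# alloy properties measured in the article.

SIGMA_F = 600.0   # hypothetical fatigue strength coefficient (MPa)
B_EXP = -0.1      # hypothetical Basquin exponent (dimensionless)

def cycles_to_failure(stress_amplitude_mpa: float) -> float:
    """Invert the Basquin relation sigma_a = SIGMA_F * (2N)**B_EXP for N."""
    return 0.5 * (stress_amplitude_mpa / SIGMA_F) ** (1.0 / B_EXP)

def miner_damage(amplitude_counts):
    """Miner's rule: total damage = sum(n_i / N_i) over amplitude bins."""
    return sum(n / cycles_to_failure(sa) for sa, n in amplitude_counts)

# Hypothetical counted cycles per pantograph pass: (amplitude MPa, count),
# standing in for a rainflow count of the simulated stress time history.
history = [(120.0, 40), (180.0, 10), (250.0, 2)]

damage_per_pass = miner_damage(history)
passes_to_failure = 1.0 / damage_per_pass  # failure predicted at damage = 1
```

Raising the train speed class increases the stress amplitudes in the counted history, which drives the per-pass damage up and the predicted life down, consistent with the abstract's reported life reduction from 350 to 400 km/h.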