Materials science
Composite material
Work hardening
Alloy
Composite number
Cryogenic treatment
Microstructure
Delamination (geology)
Surface roughness
Deformation (meteorology)
Aluminium
Brittleness
Hardening (computing)
Metallurgy
Layer (electronics)
Paleontology
Biology
Subduction
Tectonics
Authors
Zeju Weng, Ran Pan, Baosheng Liu, Kai Gu, Mingli Zhang, Chen Cui, Junjie Wang
Identifier
DOI:10.1016/j.ceramint.2023.02.076
Abstract
In the present work, the subsurface deformation and wear behavior of a 15% SiCp/2009Al aluminum alloy matrix composite subjected to sliding wear at room and cryogenic temperatures were investigated. An external cryogenic device was designed to maintain a steady cryogenic environment during the experiments. Microstructure evolution in the subsurface layer and the wear mechanisms on the worn surface were analyzed by various characterization methods. The results showed that sliding at cryogenic temperature caused greater plastic deformation in the subsurface layer than sliding at room temperature. This is attributed to the increased plasticity of the aluminum alloy matrix and the increased brittleness of the SiC particles at cryogenic temperature, which weaken the blocking effect of the SiC particles on deformation of the matrix. The wear rate of the composite was significantly reduced after cryogenic sliding, and the reduction was much larger under higher applied loads. The worn surface exhibited lower roughness and greater evenness after cryogenic sliding. The wear mechanism changed from severe oxidative, adhesive, and delamination wear at room temperature to slight delamination wear with cracking features under cryogenic sliding. The deformation layer formed during cryogenic sliding acted as a nanocrystalline work-hardening layer, since both the work-hardening rate and the dislocation density increased at cryogenic temperature; this suppressed crack generation and correspondingly improved the wear performance of the material. This work also offers a new strategy for designing aluminum matrix composites with high surface performance through the self-adaptive formation of a nano-hardening layer in a cryogenic environment.
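The abstract compares wear rates at room and cryogenic temperature without defining the quantity. In sliding-wear studies the specific wear rate is commonly taken as the volume loss normalized by applied load and sliding distance (an Archard-type normalization). The sketch below is a minimal illustration of that common definition only; the function, the placeholder test values, and the assumed composite density are hypothetical and are not taken from the paper.

```python
# Minimal sketch of the commonly used specific wear rate,
# k = V / (F * s), i.e. volume loss per unit load per unit sliding distance.
# All numbers below are illustrative assumptions, not data from the study.

def specific_wear_rate(mass_loss_g: float,
                       density_g_per_mm3: float,
                       load_N: float,
                       sliding_distance_m: float) -> float:
    """Return the specific wear rate in mm^3 / (N*m)."""
    volume_loss_mm3 = mass_loss_g / density_g_per_mm3  # convert mass loss to volume loss
    return volume_loss_mm3 / (load_N * sliding_distance_m)

# Example with assumed values for a SiCp/Al composite pin:
# a density of about 2.8e-3 g/mm^3 is an assumption for 15% SiCp/2009Al.
rate = specific_wear_rate(mass_loss_g=0.012,
                          density_g_per_mm3=2.8e-3,
                          load_N=20.0,
                          sliding_distance_m=500.0)
print(f"specific wear rate ~ {rate:.2e} mm^3/(N*m)")
```

Under this definition, a lower value at cryogenic temperature for the same load and sliding distance corresponds to the reduced wear rate reported in the abstract.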