Keywords: kesterite; materials science; band gap; power conversion efficiency; solar cells; optoelectronics; sputtering; quantum efficiency; thin films; buffer layer; deposition; layers (electronics); nanotechnology; electrical engineering; engineering
Authors
Jiwon Lee, Temujin Enkhbat, Gyuho Han, Hamim Sharif, Enkhjargal Enkhbayar, Hyesun Yoo, Jihun Kim, SeongYeon Kim, Junho Kim
Source
Journal: Nano Energy [Elsevier BV]
Date: 2020-12-01
Volume/Article: 78, 105206
Cited by: 38
Identifier
DOI: 10.1016/j.nanoen.2020.105206
Abstract
For high-efficiency kesterite Cu2ZnSn(S,Se)4 (CZTSSe) solar cells, a CdS thin film is usually used as the buffer layer. However, because of the toxicity of Cd and the pollution associated with solution-based chemical bath deposition, eco-friendly, high-efficiency CZTSSe solar cells with a Cd-free buffer are needed. As a Cd-free buffer layer, we investigated (Zn,Sn)O (ZTO) films deposited by sputtering. To achieve high power conversion efficiency, we controlled the energy band gaps of both the CZTSSe absorber and the ZTO buffer, which was required to optimize the conduction band offset (CBO) between the absorber and the buffer and to increase the open-circuit voltage (Voc) and fill factor (FF). The CBO was optimized by controlling the band gap of ZTO, which was successfully adjusted by varying the Sn/(Zn + Sn) ratio and the deposition temperature. Experimental and computational results showed that solar cell performance was strongly affected by the CBO between the absorber and the buffer. Besides CBO matching, a larger ZTO band gap improved the short-circuit current density (Jsc) by enhancing the external quantum efficiency in the blue region of the spectrum. As an additional route to higher power conversion efficiency, a band-graded CZTSSe absorber was developed using a spray-based two-step process. The sprayed CZTSSe film was engineered to have an S-enriched surface, which widens the surface band gap and passivates the surface, thereby increasing Voc, Jsc, and FF. By controlling the band gaps of both the CZTSSe absorber and the ZTO buffer, we obtained an environment-friendly CZTSSe solar cell with 11.22% efficiency without an MgF2 anti-reflection coating.
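The two band-gap levers described in the abstract can be illustrated with textbook relations. A minimal sketch, assuming the common electron-affinity rule for estimating the conduction band offset (CBO = χ_absorber − χ_buffer, where a small positive "spike" is generally preferred over a negative "cliff") and the standard photon-energy-to-wavelength conversion λ(nm) ≈ 1239.84 / Eg(eV) for the buffer's blue-light cutoff. All numerical values are generic illustrative figures, not measurements reported in this paper:

```python
def conduction_band_offset(chi_absorber_eV, chi_buffer_eV):
    """Electron-affinity rule: CBO = chi_absorber - chi_buffer.
    Positive -> 'spike' (buffer CBM above absorber CBM);
    negative -> 'cliff', which tends to hurt Voc and FF."""
    return chi_absorber_eV - chi_buffer_eV


def absorption_cutoff_nm(band_gap_eV):
    """Wavelength above which the layer is transparent:
    lambda (nm) = 1239.84 / Eg (eV)."""
    return 1239.84 / band_gap_eV


if __name__ == "__main__":
    # Hypothetical electron affinities (eV) for illustration only.
    cbo = conduction_band_offset(4.50, 4.35)
    print(f"CBO: {cbo:+.2f} eV (small spike)")

    # A wider-gap buffer transmits more blue photons to the absorber,
    # raising the blue-region external quantum efficiency and Jsc.
    for eg in (2.4, 3.0, 3.3):  # CdS is ~2.4 eV; ZTO can be tuned wider.
        print(f"Eg = {eg} eV -> cutoff ~ {absorption_cutoff_nm(eg):.0f} nm")
```

This makes the abstract's argument concrete: widening the ZTO gap pushes the buffer's absorption edge from the green (~517 nm for a CdS-like 2.4 eV gap) into the ultraviolet, so blue photons reach the absorber instead of being lost in the buffer.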