Computer science
Pipeline (software)
Deep learning
Software deployment
Energy consumption
Efficient energy use
Computing
Artificial intelligence
Distributed computing
Computer engineering
Algorithm
Software engineering
Ecology
Electrical engineering
Biology
Programming language
Engineering
Authors
Chunyu Yuan, Sos S. Agaian
Identifier
DOI: 10.1007/s10462-023-10464-w
Abstract
Deep learning (DL) has recently changed the development of intelligent systems and is widely adopted in many real-life applications. Despite its benefits and potential, there is high demand for DL processing on computationally limited and energy-constrained devices. It is therefore natural to study game-changing technologies such as Binary Neural Networks (BNN) to extend DL's reach. Remarkable progress has recently been made in BNN, since they can be implemented and embedded on tiny, restricted devices and save a significant amount of storage, computation cost, and energy consumption. However, nearly all BNN methods trade some accuracy for these savings in memory, computation, and energy. This article provides a complete overview of recent developments in BNN. Unlike previous surveys, in which low-bit works are mixed in, this article focuses exclusively on convolution networks with 1-bit activations and 1-bit weights. It conducts a complete investigation of BNN's development, from their predecessors to the latest BNN algorithms and techniques, presenting a broad design pipeline and discussing each module's variants. Along the way, it examines BNN (a) purpose: their early successes and challenges; (b) optimization: selected representative works that contain essential optimization techniques; (c) deployment: open-source frameworks for BNN modeling and development; (d) terminals: efficient computing architectures and devices for BNN; and (e) applications: diverse applications of BNN. Moreover, this paper discusses potential directions and future research opportunities in each section.
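The efficiency gains described above come from the core BNN trick: constraining weights and activations to {-1, +1} so that a dot product reduces to bitwise XNOR plus a popcount instead of floating-point multiply-accumulates. The sketch below is illustrative only (not from the surveyed paper) and uses NumPy to show sign binarization and the standard XNOR/popcount identity, where for n-element ±1 vectors the dot product equals n - 2·popcount(a XOR b):

```python
import numpy as np

def binarize(x):
    # Sign binarization: map real values to {-1, +1} (0 maps to +1 by convention).
    return np.where(x >= 0, 1, -1).astype(np.int8)

def xnor_popcount_dot(a_bits, b_bits):
    # a_bits/b_bits encode +1 as 1 and -1 as 0. Matching bits contribute +1
    # to the dot product, differing bits contribute -1, so:
    #   dot = (n - d) - d = n - 2*d, with d = popcount(a XOR b).
    # Equivalently, dot = 2*popcount(a XNOR b) - n.
    n = a_bits.size
    d = int((a_bits ^ b_bits).sum())
    return n - 2 * d

# Check the identity against a direct dot product of the binarized vectors.
rng = np.random.default_rng(0)
w = rng.standard_normal(64)
x = rng.standard_normal(64)
wb, xb = binarize(w), binarize(x)
bits_w = (wb > 0).astype(np.uint8)  # +1 -> 1, -1 -> 0
bits_x = (xb > 0).astype(np.uint8)
assert xnor_popcount_dot(bits_w, bits_x) == int(wb @ xb)
```

In a real deployment the bit vectors are packed into machine words so that one XNOR and one hardware popcount instruction process 32 or 64 weight-activation pairs at once, which is the source of the storage and energy savings the survey discusses.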