Parametric flatten T-swish: An adaptive nonlinear activation function for deep learning
Pages: 19
File type: pdf
Size: 776.02 KB
Document information:
Two shortcomings of ReLU hinder deep neural networks: 1) the negative cancellation property of ReLU tends to treat negative inputs as unimportant information for learning, resulting in performance degradation; 2) the inherent predefined nature of ReLU is unlikely to promote additional flexibility, expressivity, and robustness in the networks.
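These two shortcomings can be illustrated with a short sketch. The `pfts` function below assumes a flatten-T-swish-style definition (a swish output `x * sigmoid(x)` shifted by a threshold `T`, with the constant `T` returned for negative inputs) and uses a fixed illustrative value `T = -0.20` in place of the learnable per-layer parameter the paper proposes; it is not the paper's exact formulation.

```python
import math

def relu(x):
    # ReLU zeroes out every negative input ("negative cancellation"):
    # the network receives no signal from the negative region.
    return max(0.0, x)

def pfts(x, T=-0.20):
    # Sketch of a parametric flatten T-swish-style activation.
    # For x >= 0: swish (x * sigmoid(x)) shifted by threshold T.
    # For x < 0: the constant T, so negative inputs are not cancelled
    # to exactly zero. In the parametric variant T would be learned;
    # here it is a fixed illustrative guess.
    sigmoid = 1.0 / (1.0 + math.exp(-x))
    return x * sigmoid + T if x >= 0 else T
```

Because `T` is nonzero, negative inputs still contribute a (small) constant signal instead of being discarded, and making `T` trainable removes the fixed, predefined shape that the abstract identifies as ReLU's second limitation.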
Content extracted from the document:
Parametric flatten T-swish: An adaptive nonlinear activation function for deep learning
Related keywords:
Information and communication technology; Parametric flatten T-swish; Adaptive nonlinear activation function for deep learning; Adaptive nonlinear activation; Activation function; Deep learning