
Convergence of Two-Layer Regression with Nonlinear Units

Authors:
Deng, Yichuan
Song, Zhao
Xie, Shenghao
Publication Year:
2023

Abstract

Large language models (LLMs), such as ChatGPT and GPT-4, have shown outstanding performance in many tasks of human life. Attention computation plays an important role in training LLMs. The softmax unit and the ReLU unit are key structures in attention computation. Inspired by them, we put forward a softmax ReLU regression problem. Generally speaking, our goal is to find an optimal solution to the regression problem involving the ReLU unit. In this work, we derive a closed-form representation of the Hessian of the loss function. Under certain assumptions, we prove the Lipschitz continuity and the positive semidefiniteness (PSD) of the Hessian. We then introduce a greedy algorithm based on the approximate Newton method, which converges in the sense of distance to the optimal solution. Finally, we relax the Lipschitz condition and prove convergence in the sense of loss value.
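To make the setup concrete, one plausible formalization of a softmax ReLU regression objective is sketched below; the exact composition of the two units is defined in the paper itself, and the notation here ($A$, $b$, $\mathbf{1}_n$) is our assumption for illustration:

\[
\min_{x \in \mathbb{R}^d} \; L(x) := \frac{1}{2} \bigl\| \mathrm{ReLU}\bigl( \langle \exp(Ax), \mathbf{1}_n \rangle^{-1} \exp(Ax) \bigr) - b \bigr\|_2^2,
\]

where $A \in \mathbb{R}^{n \times d}$ is the data matrix, $b \in \mathbb{R}^n$ the target vector, $\exp(\cdot)$ acts entrywise, and the normalized inner factor is the softmax of $Ax$. The closed-form Hessian mentioned in the abstract is then the second derivative $\nabla^2 L(x)$ of such a composite loss.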
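The convergence guarantee concerns a greedy algorithm based on the approximate Newton method. Below is a minimal generic sketch of one such iteration in NumPy; the function names, the regularization shift, and the damped step size are illustrative assumptions, not the authors' exact algorithm:

```python
import numpy as np

def approximate_newton(grad, hess, x0, steps=100, reg=1e-6, lr=0.9):
    """Damped approximate Newton iteration: x <- x - lr * H^{-1} g.

    A generic sketch: `grad`/`hess` return the gradient and an
    (approximate) Hessian of the loss; `reg` shifts the Hessian to
    keep it positive definite, matching the PSD condition in the
    abstract; `lr` < 1 is the damping factor.
    """
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(steps):
        g = grad(x)
        H = hess(x) + reg * np.eye(x.size)  # PSD + shift => positive definite
        x = x - lr * np.linalg.solve(H, g)  # Newton step on the local model
    return x

# Toy usage on a quadratic loss 0.5 * ||Ax - b||^2, where an undamped
# Newton step (lr=1.0) reaches the optimum immediately.
A = np.array([[2.0, 0.0], [0.0, 1.0]])
b = np.array([1.0, 1.0])
grad = lambda x: A.T @ (A @ x - b)
hess = lambda x: A.T @ A
x_star = approximate_newton(grad, hess, np.zeros(2), steps=5, lr=1.0)
```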

Details

Database: arXiv
Publication Type: Report
Accession Number: edsarx.2308.08358
Document Type: Working Paper