
How do hidden layers work? #1022

Answered by LuluW8071
pwang1092 asked this question in Q&A
Jul 25, 2024 · 1 comment · 3 replies

Yes, we would have 3 linear layers:

  • 1st as the input layer, followed by ReLU
  • 2nd as the hidden layer, followed by ReLU
  • 3rd as the output layer
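
In PyTorch, a minimal sketch of that stack could look like the following (the in/out feature sizes are hypothetical, just to make it runnable):

```python
import torch
from torch import nn

# Three linear layers: input -> hidden -> output,
# with ReLU after the first two (hypothetical sizes).
model = nn.Sequential(
    nn.Linear(in_features=2, out_features=8),   # 1st: input layer
    nn.ReLU(),
    nn.Linear(in_features=8, out_features=8),   # 2nd: hidden layer
    nn.ReLU(),
    nn.Linear(in_features=8, out_features=1),   # 3rd: output layer (no ReLU)
)

x = torch.rand(4, 2)   # dummy batch of 4 samples, 2 features each
print(model(x).shape)  # torch.Size([4, 1])
```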

ReLU is the activation function that introduces non-linearity into the layer: if a neuron's weighted sum of inputs is greater than 0, ReLU passes that value through unchanged (the neuron "fires"); otherwise it outputs 0 (the neuron doesn't fire).
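
For instance (a quick sketch with plain torch), ReLU leaves positive pre-activation values unchanged and zeroes out the rest:

```python
import torch

z = torch.tensor([-2.0, -0.5, 0.0, 0.7, 3.0])  # hypothetical pre-activation values
print(torch.relu(z))  # tensor([0.0000, 0.0000, 0.0000, 0.7000, 3.0000])
```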

Replies: 1 comment · 3 replies (from @Luismbpr, @LuluW8071, and @mrdbourke)

Answer selected by mrdbourke
Category: Q&A · Labels: none yet · 4 participants