Xavier Initialization

For tanh-activated neural networks, draw each weight uniformly from $[-a, a]$ with

$a = \sqrt{\dfrac{6}{d_{\text{in}} + d_{\text{out}}}}$

where $d_{\text{in}}$ and $d_{\text{out}}$ are the layer's fan-in and fan-out.

For batch normalization layers, initialize the scale and shift to $\gamma = 1$ and $\beta = 0$.
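A minimal sketch of this scheme, assuming NumPy (the helper name `xavier_uniform` is hypothetical, not from the source):

```python
import numpy as np

def xavier_uniform(d_in: int, d_out: int, rng=None):
    """Xavier (Glorot) uniform init: draw a (d_in, d_out) weight
    matrix from U[-a, a] with a = sqrt(6 / (d_in + d_out))."""
    rng = np.random.default_rng() if rng is None else rng
    a = np.sqrt(6.0 / (d_in + d_out))
    return rng.uniform(-a, a, size=(d_in, d_out))

# Example: weights for a 256 -> 128 tanh layer
W = xavier_uniform(256, 128)

# Batch norm parameters start as identity: scale gamma = 1, shift beta = 0
gamma = np.ones(128)
beta = np.zeros(128)
```

Initializing $\gamma = 1$, $\beta = 0$ makes each batch norm layer start as a pure normalization, so it does not distort activations before training begins.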