Python error in a two-layer neural network

Posted 2024-04-20 13:37:38


I am trying to implement a two-layer neural network from scratch, but something is wrong: after a few iterations my loss becomes nan. Any idea why?

'''
We are implementing a two-layer neural network.
'''
import numpy as np

x, y = np.random.rand(64, 1000), np.random.randn(64, 10)
w1, w2 = np.random.rand(1000, 100), np.random.rand(100, 10)
learning_rate = 1e-4
x -= np.mean(x, axis=0)  # normalizing the training data set

for t in range(2000):
    h = np.maximum(0, x.dot(w1))  # applying ReLU non-linearity
    ypred = h.dot(w2)  # output layer

    loss = np.square(ypred - y).sum()
    print('Step', t, '\tLoss:- ', loss)

    # Gradient descent
    grad_ypred = 2.0 * (ypred - y)
    gradw2 = h.transpose().dot(grad_ypred)
    grad_h = grad_ypred.dot(w2.transpose())
    gradw1 = x.transpose().dot(grad_h * h * (1 - h))

    w1 -= learning_rate * gradw1
    w2 -= learning_rate * gradw2

I also implemented linear regression using a Softmax classifier and a multi-class SVM loss, and the same problem occurs there. How can I fix this?

Output:

D:\Study Material\Python 3 Tutorial\PythonScripts\Machine Learning>python TwoLayerNeuralNet.py
Step 0  Loss:-  19436393.79233052
Step 1  Loss:-  236820315509427.38
Step 2  Loss:-  1.3887002186558748e+47
Step 3  Loss:-  1.868219503527502e+189
Step 4  Loss:-  inf
TwoLayerNeuralNet.py:23: RuntimeWarning: invalid value encountered in multiply
  gradw1 = (x.transpose()).dot(grad_h*h*(1-h))
TwoLayerNeuralNet.py:12: RuntimeWarning: invalid value encountered in maximum
  h = np.maximum(0,x.dot(w1))    # Applying Relu Non linearity
Step 5  Loss:-  nan
Step 6  Loss:-  nan
Step 7  Loss:-  nan
Step 8  Loss:-  nan
Step 9  Loss:-  nan
Step 10         Loss:-  nan
Step 11         Loss:-  nan
Step 12         Loss:-  nan
Step 13         Loss:-  nan
Step 14         Loss:-  nan
Step 15         Loss:-  nan
Step 16         Loss:-  nan
Step 17         Loss:-  nan
Step 18         Loss:-  nan
Step 19         Loss:-  nan
Step 20         Loss:-  nan

1 answer
Forum user
#1 · Posted 2024-04-20 13:37:38

Your loss is blowing up. Try this:

loss = np.square(ypred - y).mean()

If that still does not work, try lowering the learning rate to 1e-8 and watch whether the loss goes up or down. A decreasing loss is a good sign; an increasing loss is a bad one, in which case you may want to use a better dataset and check your weight updates.
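Putting both suggestions together, here is a minimal sketch of the adjusted loop (my reconstruction, not the answerer's exact code). With a mean loss the gradient picks up a 1/N factor, and one thing that shows up when checking the weight updates is that the original backward pass applies the sigmoid derivative h*(1-h) to a ReLU activation; ReLU's derivative is the 0/1 mask (h > 0):

import numpy as np

np.random.seed(0)  # hypothetical seed, just for reproducibility
x, y = np.random.rand(64, 1000), np.random.randn(64, 10)
w1, w2 = np.random.rand(1000, 100), np.random.rand(100, 10)
learning_rate = 1e-8  # lowered as the answer suggests
x -= np.mean(x, axis=0)  # normalize the training data

for t in range(2000):
    h = np.maximum(0, x.dot(w1))  # ReLU hidden layer
    ypred = h.dot(w2)  # linear output layer

    loss = np.square(ypred - y).mean()  # mean instead of sum, as suggested
    print('Step', t, '\tLoss:- ', loss)

    grad_ypred = 2.0 * (ypred - y) / y.size  # gradient of the *mean* loss
    gradw2 = h.T.dot(grad_ypred)
    grad_h = grad_ypred.dot(w2.T)
    gradw1 = x.T.dot(grad_h * (h > 0))  # ReLU derivative: 1 where h > 0, else 0

    w1 -= learning_rate * gradw1
    w2 -= learning_rate * gradw2

Even with these fixes, the uniform np.random.rand weight initialization keeps the starting loss large, so the learning rate may still need tuning.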
