Converting a MATLAB method to Python

Posted 2024-05-26 20:44:43


MATLAB has a very useful method called getwb(). For developers writing neural networks, this method returns the weights and biases from the final training iteration. I have a neural network (built with TensorFlow). Is there some way to port this method?

I have tried tf.train.Saver() and restore() many times, but I don't fully understand how they work.
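For reference, a minimal save/restore round trip with tf.train.Saver might look like the sketch below. It uses the tf.compat.v1 API so it also runs under TensorFlow 2; the variable names, shapes, and checkpoint path are made up for illustration:

```python
import os
import tempfile

import tensorflow as tf

tf.compat.v1.disable_eager_execution()
tf.compat.v1.reset_default_graph()

# hypothetical weights and biases, standing in for a trained network
w = tf.compat.v1.get_variable("w", initializer=tf.random.normal([3, 2]))
b = tf.compat.v1.get_variable("b", initializer=tf.zeros([2]))

saver = tf.compat.v1.train.Saver()
ckpt = os.path.join(tempfile.mkdtemp(), "model.ckpt")

with tf.compat.v1.Session() as sess:
    sess.run(tf.compat.v1.global_variables_initializer())
    saver.save(sess, ckpt)  # write the current variable values to disk

# later (e.g. in another script): rebuild variables with the same names, then restore
tf.compat.v1.reset_default_graph()
w = tf.compat.v1.get_variable("w", shape=[3, 2])
b = tf.compat.v1.get_variable("b", shape=[2])
saver = tf.compat.v1.train.Saver()

with tf.compat.v1.Session() as sess:
    saver.restore(sess, ckpt)  # no initializer run needed; values come from the checkpoint
    w_val, b_val = sess.run([w, b])  # numpy arrays, like MATLAB's getwb()
```

The key point is that Saver matches variables by name, so the restoring graph must define variables with the same names and shapes as the saved one.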

Thanks!

Edit: my model is:

def neuralNetworkModel(x):
  # first step: (input * weights) + bias, a linear operation like y = ax + b
  # the connection between layer i and layer i+1 is a nodes(i) x nodes(i+1) weight matrix
  hiddenLayers = []
  layers = []

  for i in range(numberOfLayers):
    if i == 0:
      hiddenLayers.append({"weights": tensorFlow.Variable(tensorFlow.random_normal([sizeOfRow, nodesLayer[i]])),
                           "biases": tensorFlow.Variable(tensorFlow.random_normal([nodesLayer[i]]))})
    elif i < numberOfLayers - 1:
      hiddenLayers.append({"weights": tensorFlow.Variable(tensorFlow.random_normal([nodesLayer[i-1], nodesLayer[i]])),
                           "biases": tensorFlow.Variable(tensorFlow.random_normal([nodesLayer[i]]))})
    else:
      outputLayer = {"weights": tensorFlow.Variable(tensorFlow.random_normal([nodesLayer[i-1], classes])),
                     "biases": tensorFlow.Variable(tensorFlow.random_normal([classes]))}

  # create the layers
  for i in range(numberOfLayers - 1):
    previous = x if i == 0 else layers[-1]
    layers.append(tensorFlow.add(tensorFlow.matmul(previous, hiddenLayers[i]["weights"]), hiddenLayers[i]["biases"]))
    layers.append(tensorFlow.nn.relu(layers[-1]))  # pass values through the activation function (could also be sigmoid, softmax, ...)

  output = tensorFlow.matmul(layers[-1], outputLayer["weights"]) + outputLayer["biases"]
  return output
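For a getwb()-style dump of every weight and bias at once, one option is to iterate over the graph's trainable variables. A sketch (the two variables here are hypothetical stand-ins for the model above, written with the tf.compat.v1 API):

```python
import tensorflow as tf

tf.compat.v1.disable_eager_execution()
tf.compat.v1.reset_default_graph()

# hypothetical stand-ins for the model's weights and biases
w1 = tf.Variable(tf.random.normal([4, 3]), name="w1")
b1 = tf.Variable(tf.zeros([3]), name="b1")

def get_wb(sess):
    # map variable name -> current numpy value, like MATLAB's getwb()
    return {v.name: sess.run(v) for v in tf.compat.v1.trainable_variables()}

with tf.compat.v1.Session() as sess:
    sess.run(tf.compat.v1.global_variables_initializer())
    wb = get_wb(sess)
```

Calling get_wb() after training finishes gives the final weights and biases without having to keep references to each individual Variable.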

1 Answer

In your code, you create a set of variables for the weights and biases of the hidden and output layers. You should be able to fetch their values at any point (while the session is active) with tf.Session.run(), like this:

import tensorflow as tf

tf.reset_default_graph()

v = tf.Variable(tf.random_normal((5, 5)))

init = tf.global_variables_initializer()

with tf.Session() as sess:
  sess.run(init)
  v_val = sess.run(v)  # the variable's current value, as a numpy array
  print(v_val)

I would also recommend using the tf.learn library, which contains useful abstractions such as the fully_connected layer.
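tf.learn has since been folded into other APIs; in current TensorFlow, tf.keras.layers.Dense plays the role of fully_connected, and Model.get_weights() returns every weight and bias as numpy arrays, much like getwb(). A sketch with made-up layer sizes:

```python
import tensorflow as tf

# hypothetical sizes: 8 inputs, one hidden layer of 16 units, 3 output classes
model = tf.keras.Sequential([
    tf.keras.Input(shape=(8,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(3),
])

# getwb()-style dump: [hidden weights, hidden biases, output weights, output biases]
wb = model.get_weights()
```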
