Trying to get the attention layer weights, but getting an error: FailedPreconditionError: Could not find variable attention_score_vec_1/kernel

Posted 2024-04-19 18:38:04


I initialize and fit the model with the following code (using two epochs just to test that it runs):

import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense
from attention import Attention  # Attention layer from the external `attention` package

tf.compat.v1.disable_eager_execution()

verbose, epoch, batch_size = 1, 2, 64
activationFunction='relu'


def getattnModel():
  attnmodel = Sequential()
  attnmodel.add(LSTM(128, return_sequences=True, input_shape=(X_train1.shape[1],X_train1.shape[2])))
  attnmodel.add(Attention(64)) 
  attnmodel.add(Dense(128, activation=tf.nn.relu))    
  attnmodel.add(Dense(32, activation=tf.nn.relu))
  attnmodel.add(Dense(9, activation='softmax'))
  attnmodel.compile(optimizer='adam', loss='categorical_crossentropy',metrics=['accuracy'])
  attnmodel.summary()
  return attnmodel

attnmodel = getattnModel()
    
attnhistory= attnmodel.fit(X_train1, y_train1, epochs=epoch, verbose=verbose, validation_split=0.2, batch_size = batch_size)
attnpredictions = attnmodel.predict(X_test1, verbose=1)

The model summary printed by attnmodel.summary() is:

Model: "sequential_2"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
lstm_2 (LSTM)                (None, 275, 128)          66560     
_________________________________________________________________
last_hidden_state (Lambda)   (None, 128)               0         
_________________________________________________________________
attention_score_vec (Dense)  (None, 275, 128)          16384     
_________________________________________________________________
attention_score (Dot)        (None, 275)               0         
_________________________________________________________________
attention_weight (Activation (None, 275)               0         
_________________________________________________________________
context_vector (Dot)         (None, 128)               0         
_________________________________________________________________
attention_output (Concatenat (None, 256)               0         
_________________________________________________________________
attention_vector (Dense)     (None, 128)               32768     
_________________________________________________________________
dense_7 (Dense)              (None, 128)               16512     
_________________________________________________________________
dense_8 (Dense)              (None, 32)                4128      
_________________________________________________________________
dense_9 (Dense)              (None, 9)                 297       
=================================================================
Total params: 136,649
Trainable params: 136,649
Non-trainable params: 0
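
For reference, the data shapes the code above assumes can be read off the summary: 275 timesteps, a single input feature (this follows from the LSTM parameter count, 66560 = 4·128·(1+128+1)), and 9 one-hot classes. A minimal placeholder reproducing that setup might look like this; the sample counts here are arbitrary stand-ins of my own:

import numpy as np

# Placeholder arrays only (my assumption): shapes inferred from the summary above.
n_train, n_test, timesteps, n_features, n_classes = 1000, 200, 275, 1, 9
X_train1 = np.random.rand(n_train, timesteps, n_features).astype("float32")
y_train1 = tf.keras.utils.to_categorical(np.random.randint(n_classes, size=n_train), num_classes=n_classes)
X_test1 = np.random.rand(n_test, timesteps, n_features).astype("float32")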

After fitting the model successfully, I want to get the weights of the attention layer, so I run this code:

from tensorflow import keras

#### max number of instances
num_inst = 10000

#### Get the attention layer output
get_attn = keras.backend.function([attnmodel.layers[0].input, keras.backend.learning_phase()], [attnmodel.layers[2].output])
attn_weights = get_attn([X_test1[:num_inst]])[0]

#### Softmax layer
get_softmax1_attn = keras.backend.function([attnmodel.layers[0].input, keras.backend.learning_phase()], [attnmodel.layers[-1].output])
softmax1_attn = get_softmax1_attn([X_test1[:num_inst]])[0]
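
(As an aside, I understand the same intermediate output can also be read by building a sub-model on top of the fitted model; a minimal sketch of what I mean, assuming the layer name attention_weight shown in the summary is the one whose output I want:)

# Sketch only: expose the per-timestep attention weights via a sub-model
# (layer name taken from the summary above; this is my assumption).
attn_submodel = keras.Model(inputs=attnmodel.input, outputs=attnmodel.get_layer('attention_weight').output)
attn_weights_alt = attn_submodel.predict(X_test1[:num_inst], verbose=0)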

But when I run the keras.backend.function code above, this is the error I get:

---------------------------------------------------------------------------

FailedPreconditionError                   Traceback (most recent call last)

<ipython-input-24-6cb36ea36444> in <module>()
      8 #### Get last conv layer
      9 get_attn = keras.backend.function([attnmodel.layers[0].input, keras.backend.learning_phase()], [attnmodel.layers[4].output])
---> 10 attn_weights = get_attn([X_test1[:num_inst]])[0]
     11 
     12 #### Softmax layer

1 frames

/usr/local/lib/python3.7/dist-packages/keras/backend.py in __call__(self, inputs)
   4018 
   4019     fetched = self._callable_fn(*array_vals,
-> 4020                                 run_metadata=self.run_metadata)
   4021     self._call_fetch_callbacks(fetched[-len(self._fetches):])
   4022     output_structure = tf.nest.pack_sequence_as(

/usr/local/lib/python3.7/dist-packages/tensorflow/python/client/session.py in __call__(self, *args, **kwargs)
   1480         ret = tf_session.TF_SessionRunCallable(self._session._session,
   1481                                                self._handle, args,
-> 1482                                                run_metadata_ptr)
   1483         if run_metadata:
   1484           proto_data = tf_session.TF_GetBuffer(run_metadata_ptr)

FailedPreconditionError: Could not find variable attention_score_vec_1/kernel. This could mean that the variable has been deleted. 
In TF1, it can also mean the variable is uninitialized. Debug info: container=localhost, status=Not found: 
Container localhost does not exist. (Could not find resource: localhost/attention_score_vec_1/kernel)
     [[{{node attention_score_vec_1/Tensordot/ReadVariableOp}}]]

What does this error mean? Is there another way to get the attention layer's output? Please let me know how I can get the attention layer weights so that I can visualize them for my time-series data.
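
For context, what I ultimately want to do with the weights is roughly the heatmap below (a minimal sketch, assuming attn_weights comes out with shape (num_inst, 275), matching the attention_weight row in the summary):

import matplotlib.pyplot as plt

# Sketch only: per-timestep attention weights for the first 50 test instances.
plt.figure(figsize=(10, 4))
plt.imshow(attn_weights[:50], aspect='auto', cmap='viridis')
plt.xlabel('timestep')
plt.ylabel('test instance')
plt.colorbar(label='attention weight')
plt.title('Attention weights over time')
plt.show()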

