The added layer must be an instance of class Layer. Found: <class 'tensorflow.python.keras.layers.dense_attention.Attention'>

Posted 2024-03-28 13:38:23


I know this question has been posted before, but I couldn't find an answer to my problem.

I am trying to add TensorFlow's Attention layer on top of my bidirectional LSTM layer, but I get the following error message: The added layer must be an instance of class Layer. Found: <class 'tensorflow.python.keras.layers.dense_attention.Attention'>

Here is my code:

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM,Dense,Dropout,GRU,BatchNormalization,Activation,Bidirectional,TimeDistributed,RepeatVector
from tensorflow.keras.layers import Attention

def create_attention(units,n_steps_in,n_features,dr):
    model = Sequential()
    
    lstm = LSTM(units, activation='sigmoid',return_sequences=True,
                       input_shape=(n_steps_in, n_features),
                       dropout = dr)
    
    model.add(Bidirectional(lstm, input_shape=(n_steps_in, n_features)))
    model.add(Attention)
    model.add(TimeDistributed(Dense(1, activation='sigmoid')))
    return model

Here is the full error message:

TypeError                                 Traceback (most recent call last)
<ipython-input-107-de8d4616e3c7> in <module>
     74                 #model = create_lstm(u ,n_steps_in,n_features,name,name)
     75                 #model = create_bidirectional(u,n_steps_in,n_features,name)
---> 76                 model = create_attention(u,n_steps_in,n_features,name)
     77                 model.compile(optimizer='adam',loss='binary_crossentropy',metrics=['Precision','Recall',fbeta])
     78                 mode = scrng[s]['mode']

<ipython-input-106-c7dd103a8ded> in create_attention(units, n_steps_in, n_features, dr)
     84     model.add(Bidirectional(lstm, input_shape=(n_steps_in, n_features)))
     85     
---> 86     model.add(Attention)
     87     model.add(TimeDistributed(Dense(1, activation='sigmoid')))
     88     return model

~/.local/share/virtualenvs/predictionpa-SrVGL0Nv/lib/python3.7/site-packages/tensorflow_core/python/training/tracking/base.py in _method_wrapper(self, *args, **kwargs)
    455     self._self_setattr_tracking = False  # pylint: disable=protected-access
    456     try:
--> 457       result = method(self, *args, **kwargs)
    458     finally:
    459       self._self_setattr_tracking = previous_value  # pylint: disable=protected-access

~/.local/share/virtualenvs/predictionpa-SrVGL0Nv/lib/python3.7/site-packages/tensorflow_core/python/keras/engine/sequential.py in add(self, layer)
    159       raise TypeError('The added layer must be '
    160                       'an instance of class Layer. '
--> 161                       'Found: ' + str(layer))
    162 
    163     tf_utils.assert_no_legacy_layers([layer])

TypeError: The added layer must be an instance of class Layer. Found: <class 'tensorflow.python.keras.layers.dense_attention.Attention'>
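For what it's worth, the TypeError itself comes from `model.add(Attention)` passing the `Attention` *class object* rather than an instance; writing `model.add(Attention())` would get past this check. (Note, though, that `tf.keras.layers.Attention` expects a list of `[query, value]` tensors when called, so even an instantiated Attention layer usually cannot be dropped into a plain `Sequential` stack and needs the functional API instead.) The stand-in classes below are a minimal TensorFlow-free sketch of the `isinstance` check that `Sequential.add` performs:

```python
# Minimal sketch (no TensorFlow required) of why Keras raises this TypeError:
# Sequential.add() requires a Layer *instance*, but the question's code
# passes the Attention *class* itself. The classes here are stand-ins,
# not the real Keras implementations.

class Layer:
    """Stand-in for tf.keras.layers.Layer."""

class Attention(Layer):
    """Stand-in for tf.keras.layers.Attention."""

class Sequential:
    def __init__(self):
        self.layers = []

    def add(self, layer):
        # Mirrors the check in keras/engine/sequential.py shown in the traceback
        if not isinstance(layer, Layer):
            raise TypeError('The added layer must be '
                            'an instance of class Layer. '
                            'Found: ' + str(layer))
        self.layers.append(layer)

model = Sequential()
model.add(Attention())      # instance -> accepted
try:
    model.add(Attention)    # class object -> rejected, as in the question
except TypeError as e:
    print(e)
```

The same distinction applies to the real code: `Attention` without parentheses is the class, `Attention()` is a layer instance.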


Tags: in, self, add, layer, input, model, layers, tensorflow