I'm having problems when trying to use activations with the Keras functional API. My initial goal was to be able to choose between ReLU and leaky ReLU, so I came up with the following piece of code:

import keras
from keras import activations
from keras.layers import Conv2D

def activation(x, activation_type):
    if activation_type == 'leaky_relu':
        # relu with a non-zero alpha behaves as leaky relu
        return activations.relu(x, alpha=0.3)
    else:
        return activations.get(activation_type)(x)


# building the model (input_shape and filters are defined elsewhere)
inputs = keras.Input(input_shape, dtype='float32')
x = Conv2D(filters, (3, 3), padding='same')(inputs)
x = activation(x, 'relu')  # this line triggers the error below

but something like this gives the error AttributeError: 'Tensor' object has no attribute '_keras_history'. I found out that this may indicate that the inputs and outputs of my Model are not connected.

Is keras.advanced_activations the only way to achieve functionality like this in the functional API?

EDIT: here's the version of the activation function that worked (as a method on my model-building class):

    def activation(self, x):
        if self.activation_type == 'leaky_relu':
            act = lambda x: activations.relu(x, alpha=0.3)
        else:
            act = activations.get(self.activation_type)
        # wrapping the function in an Activation layer keeps the graph connected
        return layers.Activation(act)(x)
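
For reference, here is a minimal sketch of the built-in alternative I asked about, assuming the LeakyReLU advanced-activation layer from keras.layers is available:

from keras.layers import LeakyReLU

x = Conv2D(filters, (3, 3), padding='same')(inputs)
x = LeakyReLU(alpha=0.3)(x)  # LeakyReLU is a proper Layer, so the graph stays connected

Since LeakyReLU is itself a Layer, it preserves _keras_history just like the Activation wrapper above.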


1 Answer

You want to add an activation to your model by means of an Activation layer. Currently, you are adding an object that is not a Keras layer, which is what causes the error. (In Keras, layer class names always start with a capital letter.) Try something like this (minimal example):

from keras.layers import Input, Dense, Activation
from keras import activations

def activation(x, activation_type):
    if activation_type == 'leaky_relu':
        return activations.relu(x, alpha=0.3)
    else:
        return activations.get(activation_type)(x)


# building the model
inputs = Input((5,), dtype='float32')
x = Dense(128)(inputs)
# Wrap inside an Activation layer
x = Activation(lambda x: activation(x, 'sigmoid'))(x)
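
To verify that the graph is now properly connected, here is a short follow-up sketch, assuming the inputs and x tensors from the example above:

from keras.models import Model

model = Model(inputs=inputs, outputs=x)
model.compile(optimizer='adam', loss='binary_crossentropy')
model.summary()  # builds cleanly; no _keras_history AttributeError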
