Keras Layer

Brief Introduction

Keras implements many layers, including core layers, convolutional layers, RNN layers, and many other commonly used network structures.

Core Layers

Source

# Module-level imports used by this excerpt (keras/layers/core.py):
from keras import backend as K
from keras import constraints


class Layer(object):
    '''Abstract base layer class.

    All Keras layers accept certain keyword arguments:

        trainable: boolean. Set to "False" before model compilation
            to freeze layer weights (they won't be updated further
            during training).

        input_shape: a tuple of integers specifying the expected shape
            of the input samples. Does not include the batch size
            (e.g. `(100,)` for 100-dimensional inputs).

        batch_input_shape: a tuple of integers specifying the expected
            shape of a batch of input samples. Includes the batch size
            (e.g. `(32, 100)` for a batch of 32 100-dimensional inputs).
    '''
    def __init__(self, **kwargs):
        allowed_kwargs = {'input_shape',
                          'trainable',
                          'batch_input_shape',
                          'cache_enabled'}
        for kwarg in kwargs:
            assert kwarg in allowed_kwargs, 'Keyword argument not understood: ' + kwarg
        if 'input_shape' in kwargs:
            self.set_input_shape((None,) + tuple(kwargs['input_shape']))
        if 'batch_input_shape' in kwargs:
            self.set_input_shape(tuple(kwargs['batch_input_shape']))
        if 'trainable' in kwargs:
            self._trainable = kwargs['trainable']
        if not hasattr(self, 'params'):
            self.params = []
        self._cache_enabled = True
        if 'cache_enabled' in kwargs:
            self._cache_enabled = kwargs['cache_enabled']

    @property
    def cache_enabled(self):
        return self._cache_enabled

    @cache_enabled.setter
    def cache_enabled(self, value):
        self._cache_enabled = value

    def __call__(self, X, mask=None, train=False):
        # set temporary input
        tmp_input = self.get_input
        tmp_mask = None
        if hasattr(self, 'get_input_mask'):
            tmp_mask = self.get_input_mask
            self.get_input_mask = lambda _: mask
        self.get_input = lambda _: X
        Y = self.get_output(train=train)
        # return input to what it was
        if hasattr(self, 'get_input_mask'):
            self.get_input_mask = tmp_mask
        self.get_input = tmp_input
        return Y

    def set_previous(self, layer, connection_map={}):
        '''Connect a layer to its parent in the computational graph.
        '''
        assert self.nb_input == layer.nb_output == 1, 'Cannot connect layers: input count and output count should be 1.'
        if hasattr(self, 'input_ndim'):
            assert self.input_ndim == len(layer.output_shape), ('Incompatible shapes: layer expected input with ndim=' +
                                                                str(self.input_ndim) +
                                                                ' but previous layer has output_shape ' +
                                                                str(layer.output_shape))
        if layer.get_output_mask() is not None:
            assert self.supports_masked_input(), 'Cannot connect non-masking layer to layer with masked output.'
        self.previous = layer
        self.build()

    def build(self):
        '''Instantiation of layer weights.

        Called after `set_previous`, or after `set_input_shape`,
        once the layer has a defined input shape.
        Must be implemented on all layers that have weights.
        '''
        pass

    @property
    def trainable(self):
        if hasattr(self, '_trainable'):
            return self._trainable
        else:
            return True

    @trainable.setter
    def trainable(self, value):
        self._trainable = value

    @property
    def nb_input(self):
        return 1

    @property
    def nb_output(self):
        return 1

    @property
    def input_shape(self):
        # if layer is not connected (e.g. input layer),
        # input shape can be set manually via _input_shape attribute.
        if hasattr(self, 'previous'):
            return self.previous.output_shape
        elif hasattr(self, '_input_shape'):
            return self._input_shape
        else:
            raise Exception('Layer is not connected. Did you forget to set "input_shape"?')

    def set_input_shape(self, input_shape):
        if type(input_shape) not in [tuple, list]:
            raise Exception('Invalid input shape - input_shape should be a tuple of int.')
        input_shape = tuple(input_shape)
        if hasattr(self, 'input_ndim') and self.input_ndim:
            if self.input_ndim != len(input_shape):
                raise Exception('Invalid input shape - Layer expects input ndim=' +
                                str(self.input_ndim) +
                                ', was provided with input shape ' + str(input_shape))
        self._input_shape = input_shape
        self.input = K.placeholder(shape=self._input_shape)
        self.build()

    @property
    def output_shape(self):
        # default assumption: tensor shape unchanged.
        return self.input_shape

    def get_output(self, train=False):
        return self.get_input(train)

    def get_input(self, train=False):
        if hasattr(self, 'previous'):
            # to avoid redundant computations,
            # layer outputs are cached when possible.
            if hasattr(self, 'layer_cache') and self.cache_enabled:
                previous_layer_id = '%s_%s' % (id(self.previous), train)
                if previous_layer_id in self.layer_cache:
                    return self.layer_cache[previous_layer_id]
            previous_output = self.previous.get_output(train=train)
            if hasattr(self, 'layer_cache') and self.cache_enabled:
                previous_layer_id = '%s_%s' % (id(self.previous), train)
                self.layer_cache[previous_layer_id] = previous_output
            return previous_output
        elif hasattr(self, 'input'):
            return self.input
        else:
            raise Exception('Layer is not connected and is not an input layer.')

    def supports_masked_input(self):
        '''Whether or not this layer respects the output mask of its previous
        layer in its calculations.
        If you try to attach a layer that does *not* support masked_input to
        a layer that gives a non-None output_mask(), an error will be raised.
        '''
        return False

    def get_output_mask(self, train=None):
        '''For some models (such as RNNs) you want a way of being able to mark
        some output data-points as "masked",
        so they are not used in future calculations.
        In such a model, get_output_mask() should return a mask
        of one less dimension than get_output()
        (so if get_output is (nb_samples, nb_timesteps, nb_dimensions),
        then the mask is (nb_samples, nb_timesteps),
        with a one for every unmasked datapoint,
        and a zero for every masked one).

        If there is *no* masking then it shall return None.
        For instance if you attach an Activation layer (they support masking)
        to a layer with an output_mask, then that Activation shall
        also have an output_mask.
        If you attach it to a layer with no such mask,
        then the Activation's get_output_mask shall return None.

        Some layers have an output_mask even if their input is unmasked,
        notably Embedding which can turn the entry "0" into
        a mask.
        '''
        return None

    def set_weights(self, weights):
        '''Set the weights of the layer.

        weights: a list of numpy arrays. The number
            of arrays and their shapes must match
            the weights of the layer (i.e. it should
            match the output of `get_weights`).
        '''
        assert len(self.params) == len(weights), ('Provided weight array does not match layer weights (' +
                                                  str(len(self.params)) + ' layer params vs. ' +
                                                  str(len(weights)) + ' provided weights)')
        for p, w in zip(self.params, weights):
            if K.get_value(p).shape != w.shape:
                raise Exception('Layer shape %s not compatible with weight shape %s.' %
                                (K.get_value(p).shape, w.shape))
            K.set_value(p, w)

    def get_weights(self):
        '''Return the weights of the layer,
        as a list of numpy arrays.
        '''
        weights = []
        for p in self.params:
            weights.append(K.get_value(p))
        return weights

    def get_config(self):
        '''Return the parameters of the layer, as a dictionary.
        '''
        config = {'name': self.__class__.__name__}
        if hasattr(self, '_input_shape'):
            config['input_shape'] = self._input_shape[1:]
        if hasattr(self, '_trainable'):
            config['trainable'] = self._trainable
        config['cache_enabled'] = self.cache_enabled
        return config

    def get_params(self):
        consts = []
        updates = []

        if hasattr(self, 'regularizers'):
            regularizers = self.regularizers
        else:
            regularizers = []

        if hasattr(self, 'constraints') and len(self.constraints) == len(self.params):
            for c in self.constraints:
                if c:
                    consts.append(c)
                else:
                    consts.append(constraints.identity())
        elif hasattr(self, 'constraint') and self.constraint:
            consts += [self.constraint for _ in range(len(self.params))]
        else:
            consts += [constraints.identity() for _ in range(len(self.params))]

        if hasattr(self, 'updates') and self.updates:
            updates += self.updates

        return self.params, regularizers, consts, updates

    def count_params(self):
        '''Return the total number of floats (or ints)
        composing the weights of the layer.
        '''
        return sum([K.count_params(p) for p in self.params])
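To make the lifecycle concrete, here is a small usage sketch (assuming the 0.x-era API this source is taken from, with Dense standing in for any layer with weights): passing input_shape to the constructor triggers set_input_shape, which in turn calls build, after which the weight accessors can be used.

    from keras.layers.core import Dense

    dense = Dense(64, input_shape=(100,))  # __init__ -> set_input_shape((None, 100)) -> build()
    weights = dense.get_weights()          # list of numpy arrays, here [W, b]
    dense.set_weights(weights)             # array count and shapes must match, else an Exception
    print(dense.count_params())            # 100 * 64 + 64 = 6464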

set_previous

Sets the previous layer, connecting it to the current layer, and then calls the build method to initialize parameters such as regularizers and weights.

build

Called by set_previous (or set_input_shape) to initialize parameters such as regularizers and weights.

input_shape

A Python property. If this layer is an input layer, it returns the layer's own input shape; otherwise it returns the previous layer's output shape.

set_input_shape

Sets the input shape (a tuple or list) and calls the build method to initialize parameters such as regularizers and weights.

get_input

Returns the output of the previous layer; if the current layer is an input layer, it returns the layer's own input. The sketch after this paragraph shows how these methods fit together in a custom layer.
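Scale below is a hypothetical layer (not from the Keras source) that learns one multiplicative factor per input feature: build creates the weight once the input shape is known, and get_output transforms whatever get_input returns.

    import numpy as np
    from keras import backend as K
    from keras.layers.core import Layer

    class Scale(Layer):
        '''Hypothetical layer: learns one multiplicative factor per feature.'''
        def build(self):
            # input_shape is only defined once the layer is connected
            # (set_previous) or given an explicit shape (set_input_shape).
            input_dim = self.input_shape[-1]
            self.gamma = K.variable(np.ones((input_dim,)))
            self.params = [self.gamma]  # registers the weight for training

        def get_output(self, train=False):
            X = self.get_input(train)   # previous layer's output, or self.input
            return X * self.gamma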

Activation Layer

Computes the output after applying an activation function. Available activation functions include softmax, softplus, relu, tanh, sigmoid, hard_sigmoid, and linear.

Source

# Module-level import used by this excerpt; MaskedLayer is defined
# earlier in keras/layers/core.py.
from keras import activations


class Activation(MaskedLayer):
    '''Apply an activation function to an output.

    # Input shape
        Arbitrary. Use the keyword argument `input_shape`
        (tuple of integers, does not include the samples axis)
        when using this layer as the first layer in a model.

    # Output shape
        Same shape as input.

    # Arguments:
        activation: name of activation function to use
            (see: [activations](../activations.md)),
            or alternatively, a Theano or TensorFlow operation.
    '''
    def __init__(self, activation, **kwargs):
        super(Activation, self).__init__(**kwargs)
        self.activation = activations.get(activation)

    def get_output(self, train=False):
        X = self.get_input(train)
        return self.activation(X)

    def get_config(self):
        config = {'name': self.__class__.__name__,
                  'activation': self.activation.__name__}
        base_config = super(Activation, self).get_config()
        return dict(list(base_config.items()) + list(config.items()))
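A typical usage sketch (assuming the 0.x-era Sequential API): an Activation layer is added after a weighted layer, and the activation name passed as a string is resolved by activations.get.

    from keras.models import Sequential
    from keras.layers.core import Dense, Activation

    model = Sequential()
    model.add(Dense(64, input_shape=(100,)))
    model.add(Activation('relu'))  # 'relu' is resolved via activations.get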

Lambda Layer

The output of this layer is the result of applying the given function: if the layer is an input layer, the function is applied to the layer's own input; otherwise it is applied to the output of the previous layer.

Example

kerasmodel.add_node(Lambda(lambda x: x.sum(2)), name='merge',
                    inputs=['embedding', 'embedpoint'], merge_mode='mul')

Source

# Module-level imports used by this excerpt (keras/layers/core.py):
import sys
import marshal
import types


class Lambda(Layer):
    '''Used for evaluating an arbitrary Theano / TensorFlow expression
    on the output of the previous layer.

    # Input shape
        Arbitrary. Use the keyword argument input_shape
        (tuple of integers, does not include the samples axis)
        when using this layer as the first layer in a model.

    # Output shape
        Specified by `output_shape` argument.

    # Arguments
        function: The function to be evaluated.
            Takes one argument: the output of previous layer
        output_shape: Expected output shape from function.
            Could be a tuple or a function of the shape of the input
    '''
    def __init__(self, function, output_shape=None, **kwargs):
        super(Lambda, self).__init__(**kwargs)
        py3 = sys.version_info[0] == 3
        if py3:
            self.function = marshal.dumps(function.__code__)
        else:
            assert hasattr(function, 'func_code'), ('The Lambda layer "function"'
                                                    ' argument must be a Python function.')
            self.function = marshal.dumps(function.func_code)
        if output_shape is None:
            self._output_shape = None
        elif type(output_shape) in {tuple, list}:
            self._output_shape = tuple(output_shape)
        else:
            if py3:
                self._output_shape = marshal.dumps(output_shape.__code__)
            else:
                self._output_shape = marshal.dumps(output_shape.func_code)
        super(Lambda, self).__init__()

    @property
    def output_shape(self):
        if self._output_shape is None:
            return self.input_shape
        elif type(self._output_shape) == tuple:
            return (self.input_shape[0], ) + self._output_shape
        else:
            output_shape_func = marshal.loads(self._output_shape)
            output_shape_func = types.FunctionType(output_shape_func, globals())
            shape = output_shape_func(self.previous.output_shape)
            if type(shape) not in {list, tuple}:
                raise Exception('output_shape function must return a tuple')
            return tuple(shape)

    def get_output(self, train=False):
        func = marshal.loads(self.function)
        func = types.FunctionType(func, globals())
        if hasattr(self, 'previous'):
            return func(self.previous.get_output(train))
        else:
            return func(self.input)
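As a usage sketch (assuming the 0.x-era Sequential API, and the Theano backend to match the x.sum style of the example above): for an elementwise function the shape is unchanged and output_shape can be omitted; for a reducing function, output_shape must be given as a tuple or as a function of the input shape.

    from keras.models import Sequential
    from keras.layers.core import Lambda

    model = Sequential()
    # elementwise square: output shape equals input shape, so output_shape is omitted
    model.add(Lambda(lambda x: x ** 2, input_shape=(100,)))
    # reducing sum: the shape changes from (None, 100) to (None, 1),
    # so output_shape (without the batch axis) must be supplied
    model.add(Lambda(lambda x: x.sum(axis=1, keepdims=True), output_shape=(1,)))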

Embedding Layer

When implementing word2vec-style word vectors with Keras, you need the Embedding layer, which provides the trainable lookup table.
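A minimal sketch (assuming the 0.x-era keras.layers.embeddings.Embedding signature; the vocabulary size, vector dimension, and sequence length are illustrative): each integer word index is mapped to a trainable dense vector.

    from keras.models import Sequential
    from keras.layers.embeddings import Embedding

    model = Sequential()
    # input:  (nb_samples, 20) integer word indices in [0, 10000)
    # output: (nb_samples, 20, 128) dense word vectors
    model.add(Embedding(10000, 128, input_length=20))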
