
How to use DynamicRNNLayer? #18


Description

@cobnut

Thanks for your debugging, but there are still two problems:

  1. "max_length = tf.shape(self.outputs)[1]
    self.outputs = tf.reshape(tf.concat(1, outputs), [-1, max_length, n_hidden])", this is your new code
    but also wrong, "max_length = tf.shape(self.outputs)[1]" no self.outputs, maybe like this
    "max_length = tf.shape(outputs)[1]"

  2. I cannot find any complete DynamicRNNLayer example; the only one in the docs is this snippet:

     input_seqs = tf.placeholder(dtype=tf.int64, shape=[batch_size, None], name="input_seqs")
     network = tl.layers.EmbeddingInputlayer(
                  inputs = input_seqs,
                  vocabulary_size = vocab_size,
                  embedding_size = embedding_size,
                  name = 'seq_embedding')
     network = tl.layers.DynamicRNNLayer(network,
                  cell_fn = tf.nn.rnn_cell.BasicLSTMCell,
                  n_hidden = embedding_size,
                  dropout = 0.7,
                  sequence_length = tl.layers.retrieve_seq_length_op2(input_seqs),
                  return_seq_2d = True,     # stack denselayer or compute cost after it
                  name = 'dynamic_rnn')
     network = tl.layers.DenseLayer(network, n_units=vocab_size,
                  act=tf.identity, name="output")
    

    I do not understand shape=[batch_size, None]: should I just write None for the time dimension, or the maximum number of steps n_step(max)?
    My data looks like this (sentiment analysis):

    x = [[2, 1, 8, 9, 2],
         [2, 4],
         [1, 1, 3, 5],
         [6, 3, 2]]
    y = [1, 0, 1, 1]
    

    I pad it with zeros:

    x = [[2, 1, 8, 9, 2],
         [2, 4, 0, 0, 0],
         [1, 1, 3, 5, 0],
         [6, 3, 2, 0, 0]]
    

    How should I use DynamicRNNLayer with data like this? (See the second sketch below for my current attempt.)
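
For point 1, here is a minimal runnable sketch of the reshape I would expect (this is not the actual TensorLayer source; the names embeddings and seq_len are just stand-ins for this example, and the point is only where max_length is read from):

    import tensorflow as tf

    n_hidden = 200
    # Stand-in inputs: a batch of already-embedded sequences and their true lengths.
    embeddings = tf.placeholder(tf.float32, shape=[None, None, n_hidden], name="embeddings")
    seq_len = tf.placeholder(tf.int64, shape=[None], name="seq_len")

    cell = tf.nn.rnn_cell.BasicLSTMCell(n_hidden)
    outputs, _ = tf.nn.dynamic_rnn(cell, embeddings, sequence_length=seq_len, dtype=tf.float32)

    # max_length has to be read from outputs, which exists at this point,
    # not from self.outputs, which has not been assigned yet.
    max_length = tf.shape(outputs)[1]
    outputs_3d = tf.reshape(outputs, [-1, max_length, n_hidden])  # [batch, time, n_hidden]
    outputs_2d = tf.reshape(outputs, [-1, n_hidden])              # [batch*time, n_hidden], ready for a DenseLayer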
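
For point 2, here is my current understanding of how the zero-padded batch above would be fed in, written as a runnable sketch. The hyperparameters are made up for illustration, I left out the dropout argument to keep it minimal, and I am assuming that retrieve_seq_length_op2 infers the true lengths from the zero padding; please correct me if any of this is wrong:

    import numpy as np
    import tensorflow as tf
    import tensorlayer as tl

    batch_size = 4
    vocab_size = 10        # toy vocabulary: word ids 1..9, with 0 reserved for padding
    embedding_size = 8

    # None in the time dimension, so each batch only has to be padded to the
    # length of its own longest sentence.
    input_seqs = tf.placeholder(dtype=tf.int64, shape=[batch_size, None], name="input_seqs")

    network = tl.layers.EmbeddingInputlayer(
                 inputs = input_seqs,
                 vocabulary_size = vocab_size,
                 embedding_size = embedding_size,
                 name = 'seq_embedding')
    network = tl.layers.DynamicRNNLayer(network,
                 cell_fn = tf.nn.rnn_cell.BasicLSTMCell,
                 n_hidden = embedding_size,
                 sequence_length = tl.layers.retrieve_seq_length_op2(input_seqs),
                 return_seq_2d = True,     # flatten to 2D so a DenseLayer can follow
                 name = 'dynamic_rnn')
    network = tl.layers.DenseLayer(network, n_units=vocab_size,
                 act=tf.identity, name="output")

    # The zero-padded batch from the question above.
    x_pad = np.array([[2, 1, 8, 9, 2],
                      [2, 4, 0, 0, 0],
                      [1, 1, 3, 5, 0],
                      [6, 3, 2, 0, 0]], dtype=np.int64)

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())  # or tf.initialize_all_variables() on older TF
        out = sess.run(network.outputs, feed_dict={input_seqs: x_pad})
        print(out.shape)   # I expect (batch_size * max_len, vocab_size) because return_seq_2d=True

Even if this runs, I still do not see how to attach the per-sentence labels y = [1, 0, 1, 1], since return_seq_2d=True gives one output row per time step rather than one per sentence.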
