
How to connect an RNN layer to another RNN layer in Keras

dass Member Posts: 12 Learner III
edited June 2019 in Help

Hi everyone,

I have been facing this problem for quite some time and I really hope to get a reply on it. Is this due to a problem with the Keras installation, or did I simply not connect the layers the proper way? Thank you, guys.

Answers

  • sgenzer Administrator, Moderator, Employee-RapidMiner, RapidMiner Certified Analyst, Community Manager, Member, University Professor, PM Moderator Posts: 2,959 Community Manager

    tagging @pschlunder

     

    Scott

     

  • pschlunder Employee-RapidMiner, RapidMiner Certified Analyst, RapidMiner Certified Expert, RMResearcher, Member Posts: 96 RM Research

    Hi @dass,

     

    In general, you can put one recurrent layer after another. Under the hood, the output shape of the previous layer is passed as the input shape of the current one, so you don't need to handle that yourself, except for the first layer: its input shape is set with the "input shape" parameter of the Keras Model operator. Please check the help text of that operator for details on how to set it.
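
    For illustration, here is a rough sketch of what two stacked recurrent layers correspond to in plain Keras. This is only an approximation (assuming SimpleRNN layers and, as an example, a window of 30 values with one attribute), not the exact code the extension generates:

    from keras.models import Sequential
    from keras.layers import SimpleRNN, Dense

    model = Sequential()
    # Only the first layer needs an explicit input shape: (timesteps, features),
    # e.g. (30, 1) for a window of 30 values with one attribute.
    # return_sequences=True makes this layer emit the full sequence of hidden
    # states, shape (30, 8), which is the 3D input the next recurrent layer expects.
    model.add(SimpleRNN(8, input_shape=(30, 1), return_sequences=True))
    # The second recurrent layer infers its input shape from the previous layer.
    model.add(SimpleRNN(8))
    # Final regression output.
    model.add(Dense(1, activation='linear'))
    model.compile(optimizer='adam', loss='mean_squared_error')
    model.summary()

    The important bit when stacking is that every recurrent layer except the last one has to return the full sequence, so that the following recurrent layer receives a 3D input again.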

     

    If you're still facing problems, make sure to post your process here. You can use the XML view to get an XML representation of the process that you can copy and paste here.

     

    Hope this helps,

    Philipp

     
  • dass Member Posts: 12 Learner III

    Hi @pschlunder,

     

    Thank you for the reply, I really appreciate it. Sadly, the model still displays an error whenever I add another layer. With only one layer the model runs perfectly fine, but as soon as I add a second layer, the output of the first layer does not seem to match the input expected by the second layer. I'm not sure whether that is really the case, so I have included the XML process and the error here as well.

    <?xml version="1.0" encoding="UTF-8"?><process version="8.1.003">
    <context>
    <input/>
    <output/>
    <macros/>
    </context>
    <operator activated="true" class="process" compatibility="8.1.003" expanded="true" name="Process">
    <process expanded="true">
    <operator activated="true" class="retrieve" compatibility="8.1.003" expanded="true" height="68" name="Retrieve s&amp;p-500-data" width="90" x="45" y="85">
    <parameter key="repository_entry" value="//Keras Samples/sp_500_regression/s&amp;p-500-data"/>
    </operator>
    <operator activated="true" class="select_attributes" compatibility="8.1.003" expanded="true" height="82" name="Select Attributes" width="90" x="179" y="85">
    <parameter key="attribute_filter_type" value="single"/>
    <parameter key="attribute" value="Close"/>
    <parameter key="attributes" value="Date|Open|Close"/>
    </operator>
    <operator activated="true" class="normalize" compatibility="8.1.003" expanded="true" height="103" name="Normalize" width="90" x="313" y="85">
    <parameter key="attribute_filter_type" value="single"/>
    <parameter key="attribute" value="Close"/>
    </operator>
    <operator activated="true" class="series:windowing" compatibility="7.4.000" expanded="true" height="82" name="Windowing" width="90" x="447" y="85">
    <parameter key="window_size" value="30"/>
    <parameter key="label_attribute" value="Close"/>
    </operator>
    <operator activated="true" class="set_role" compatibility="8.1.003" expanded="true" height="82" name="Set Role" width="90" x="581" y="85">
    <parameter key="attribute_name" value="Close-29"/>
    <parameter key="target_role" value="label"/>
    <list key="set_additional_roles"/>
    </operator>
    <operator activated="true" class="split_data" compatibility="8.1.003" expanded="true" height="103" name="Split Data" width="90" x="581" y="238">
    <enumeration key="partitions">
    <parameter key="ratio" value="0.9"/>
    <parameter key="ratio" value="0.1"/>
    </enumeration>
    <parameter key="sampling_type" value="linear sampling"/>
    </operator>
    <operator activated="true" class="keras:sequential" compatibility="1.0.003" expanded="true" height="166" name="Keras Model" width="90" x="782" y="136">
    <parameter key="input shape" value="(30,1)"/>
    <parameter key="optimizer" value="Adam"/>
    <parameter key="learning rate" value="0.001"/>
    <enumeration key="metric"/>
    <parameter key="epochs" value="5"/>
    <enumeration key="callbacks">
    <parameter key="callbacks" value="TensorBoard(log_dir='./logs', histogram_freq=0, write_graph=True, write_images=False, embeddings_freq=0, embeddings_layer_names=None, embeddings_metadata=None)"/>
    </enumeration>
    <process expanded="true">
    <operator activated="true" class="keras:core_layer" compatibility="1.0.003" expanded="true" height="82" name="Add Core Layer (2)" width="90" x="648" y="136">
    <parameter key="layer_type" value="Reshape"/>
    <parameter key="target_shape" value="(30,1)"/>
    <parameter key="dims" value="1.1"/>
    </operator>
    <operator activated="true" class="keras:recurrent_layer" compatibility="1.0.003" expanded="true" height="82" name="Add Recurrent Layer" width="90" x="782" y="136">
    <parameter key="no_units" value="8"/>
    </operator>
    <operator activated="true" class="keras:recurrent_layer" compatibility="1.0.003" expanded="true" height="82" name="Add Recurrent Layer (2)" width="90" x="916" y="136">
    <parameter key="no_units" value="8"/>
    </operator>
    <operator activated="true" class="keras:core_layer" compatibility="1.0.003" expanded="true" height="82" name="Add Core Layer" width="90" x="1050" y="136">
    <parameter key="activation_function" value="'linear'"/>
    <parameter key="dims" value="1.1"/>
    </operator>
    <connect from_op="Add Core Layer (2)" from_port="layers 1" to_op="Add Recurrent Layer" to_port="layers"/>
    <connect from_op="Add Recurrent Layer" from_port="layers 1" to_op="Add Recurrent Layer (2)" to_port="layers"/>
    <connect from_op="Add Recurrent Layer (2)" from_port="layers 1" to_op="Add Core Layer" to_port="layers"/>
    <connect from_op="Add Core Layer" from_port="layers 1" to_port="layers 1"/>
    <portSpacing port="sink_layers 1" spacing="0"/>
    <portSpacing port="sink_layers 2" spacing="0"/>
    </process>
    </operator>
    <operator activated="true" class="keras:apply" compatibility="1.0.003" expanded="true" height="82" name="Apply Keras Model" width="90" x="849" y="391">
    <parameter key="batch_size" value="128"/>
    </operator>
    <connect from_op="Retrieve s&amp;p-500-data" from_port="output" to_op="Select Attributes" to_port="example set input"/>
    <connect from_op="Select Attributes" from_port="example set output" to_op="Normalize" to_port="example set input"/>
    <connect from_op="Normalize" from_port="example set output" to_op="Windowing" to_port="example set input"/>
    <connect from_op="Windowing" from_port="example set output" to_op="Set Role" to_port="example set input"/>
    <connect from_op="Set Role" from_port="example set output" to_op="Split Data" to_port="example set"/>
    <connect from_op="Split Data" from_port="partition 1" to_op="Keras Model" to_port="training set"/>
    <connect from_op="Split Data" from_port="partition 2" to_op="Apply Keras Model" to_port="unlabelled data"/>
    <connect from_op="Keras Model" from_port="model" to_op="Apply Keras Model" to_port="model"/>
    <connect from_op="Apply Keras Model" from_port="labelled data" to_port="result 1"/>
    <portSpacing port="source_input 1" spacing="0"/>
    <portSpacing port="sink_result 1" spacing="0"/>
    <portSpacing port="sink_result 2" spacing="0"/>
    </process>
    </operator>
    </process>

    (Attachment: error.PNG — screenshot of the error)
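
    For what it's worth, here is my rough guess at what the layer stack above does in plain Keras (only my approximation, assuming the recurrent layers are SimpleRNN, not what the extension actually generates). It reproduces the same kind of mismatch, because the first recurrent layer only returns its final state (2D), while the second one expects a full sequence (3D):

    from keras.models import Sequential
    from keras.layers import Reshape, SimpleRNN

    model = Sequential()
    # Window of 30 values reshaped to (timesteps, features) = (30, 1).
    model.add(Reshape((30, 1), input_shape=(30,)))
    # Without "return sequences", this layer outputs only its final state: shape (8,).
    model.add(SimpleRNN(8))
    # The second recurrent layer expects a 3D input (batch, timesteps, features),
    # so stacking it on top of the 2D output above raises a shape/ndim error:
    try:
        model.add(SimpleRNN(8))
    except ValueError as err:
        print("shape mismatch:", err)

    So it looks to me like the second layer never receives the sequence it expects. Is there a setting on the recurrent layer operator that makes the first layer return the full sequence?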

  • varunm1 Member Posts: 1,207 Unicorn
    @dass did your issue get resolved? If so, can you share the XML of your working process?
    Regards,
    Varun
    https://www.varunmandalapu.com/

    Be Safe. Follow precautions and Maintain Social Distancing
