You're reading from Codeless Deep Learning with KNIME
1st Edition, Packt, November 2020, ISBN-13: 9781800566613
Authors:

Kathrin Melcher
Kathrin Melcher is a data scientist at KNIME. She holds a master's degree in mathematics from the University of Konstanz, Germany. She joined the evangelism team at KNIME in 2017 and has a strong interest in data science and machine learning algorithms. She enjoys teaching and sharing her data science knowledge with the community, for example, in the book From Excel to KNIME, in various blog posts, and at training courses, workshops, and conference presentations.
Rosaria Silipo
Rosaria Silipo, Ph.D., now head of data science evangelism at KNIME, has spent 25+ years in applied AI, predictive analytics, and machine learning at Siemens, Viseca, Nuance Communications, and private consulting. Sharing her practical experience in a broad range of industries and deployments, including IoT, customer intelligence, financial services, social media, and cybersecurity, Rosaria has authored 50+ technical publications, including her recent books Guide to Intelligent Data Science (Springer) and Codeless Deep Learning with KNIME (Packt).

Introducing RNNs

Let's start with an overview of RNNs.

RNNs are a family of neural networks whose architecture is not constrained to be purely feedforward.

Important note

RNNs are obtained by introducing auto-connections or backward connections – that is, recurrent connections – into feedforward neural networks.

Introducing a recurrent connection also introduces the concept of time. This allows RNNs to take context into account – that is, to remember inputs from the past by capturing the dynamics of the signal.
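The chapter builds its networks with KNIME nodes rather than code, but the effect of a recurrent connection is easy to see in a minimal sketch. The following Python/NumPy snippet (all names and dimensions are hypothetical, chosen only for illustration) unrolls a single recurrent unit over a short sequence: at every time step, the hidden state is computed from the current input and from the previous state fed back through the recurrent connection, so the state accumulates context from the past.

```python
import numpy as np

# Hypothetical sizes, for illustration only: 3 input features, 4 hidden units.
n_in, n_hidden = 3, 4
rng = np.random.default_rng(0)

W_x = rng.normal(scale=0.1, size=(n_hidden, n_in))      # feedforward weights (input -> hidden)
W_h = rng.normal(scale=0.1, size=(n_hidden, n_hidden))  # recurrent weights (hidden -> hidden)
b = np.zeros(n_hidden)

def step(x_t, h_prev):
    # The new state depends on the current input x_t and on the previous
    # state h_prev, fed back through the recurrent connection.
    return np.tanh(W_x @ x_t + W_h @ h_prev + b)

sequence = rng.normal(size=(5, n_in))  # 5 time steps of a toy signal
h = np.zeros(n_hidden)                 # initial state: no past yet
for x_t in sequence:
    h = step(x_t, h)                   # the state carries context forward in time

print(h)  # the final state summarizes the whole sequence
```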

Introducing recurrent connections changes the nature of the neural network from static to dynamic, making it suitable for analyzing time series. Indeed, RNNs are often used to build solutions to problems involving time-ordered sequences, such as time series analysis, language modeling, free text generation, automatic machine translation, speech recognition, image captioning, and other problems investigating the time evolution of a given signal...
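In this book, such networks are assembled visually with KNIME's Keras-based deep learning nodes. Purely as a point of reference, a sequence model of the kind described above might look like the following minimal Keras sketch; the task, window length, and layer sizes are assumptions for illustration, not taken from the chapter.

```python
from tensorflow.keras import layers, models

# Assumed toy task: predict the next value of a univariate time series
# from windows of 50 time steps with 1 feature each.
model = models.Sequential([
    layers.Input(shape=(50, 1)),  # (time steps, features)
    layers.SimpleRNN(16),         # recurrent layer: maintains a hidden state across time steps
    layers.Dense(1),              # regression output: the next value of the series
])
model.compile(optimizer="adam", loss="mse")
model.summary()
```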
