
Types and Topologies of Neural Networks

Modern technological applications increasingly employ a class of computational models called artificial neural networks (ANNs). These networks loosely emulate the workings of the human brain: rather than following explicitly programmed rules, their behavior is learned from data. ANNs are the building blocks of deep learning, a branch of artificial intelligence, and their performance generally improves as the amount of training data grows. A neural network is composed of layers of processing units and activation functions; the number of layers depends on the type of network, and information passes through each layer in succession.

Types of Neural Networks

Neural networks come in many types, depending on the kind of function they perform. Some important types are described below.

Feedforward Neural Networks

This is one of the most fundamental neural network types. In a feedforward neural network, information moves through each layer in sequence until it reaches the output layer. Because data flows in one direction only, it is also called a front-propagated network. A feedforward network can have a single layer or several hidden layers. Each layer multiplies its inputs by learned weights, sums the products, and passes the result through an activation function, typically one suited to classification at the output. Feedforward networks are relatively tolerant of noisy data. Their applications include classification, computer vision, and face recognition, among others.
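
The following is a minimal sketch, in Python with NumPy, of how such a forward pass can be written. The layer sizes, random weights, and sigmoid activation are illustrative assumptions rather than a fixed recipe.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, weights, biases):
    # Data moves in one direction only: input -> hidden layers -> output.
    activation = x
    for W, b in zip(weights, biases):
        # Weighted sum of the inputs, then a nonlinear activation.
        activation = sigmoid(W @ activation + b)
    return activation

rng = np.random.default_rng(0)
layer_sizes = [4, 8, 3]  # 4 inputs, one hidden layer of 8 units, 3 outputs (assumed)
weights = [rng.standard_normal((n_out, n_in))
           for n_in, n_out in zip(layer_sizes[:-1], layer_sizes[1:])]
biases = [np.zeros(n_out) for n_out in layer_sizes[1:]]

print(forward(rng.standard_normal(4), weights, biases))
```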

Convolutional Neural Networks

A convolutional neural network (CNN) is a specialized kind of multilayer perceptron. Its defining component is the convolutional layer; a CNN may contain one or many of them, usually interleaved with pooling layers and followed by fully connected layers. A convolutional layer applies small filters (kernels) whose weights are shared across the input, which greatly reduces the number of parameters to process. CNNs are widely used for object detection in images and video, image classification, semantic parsing, natural language processing, and more.
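
The sketch below illustrates the core operation of a convolutional layer: a small filter sliding over the input and producing a feature map. The 3x3 filter, the 8x8 input, and the hand-rolled loop are illustrative assumptions; real CNNs rely on optimized library implementations.

```python
import numpy as np

def convolve2d(image, kernel):
    # Slide the kernel over the image and take a weighted sum at each position.
    kh, kw = kernel.shape
    out_h = image.shape[0] - kh + 1
    out_w = image.shape[1] - kw + 1
    out = np.zeros((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

image = np.random.default_rng(0).random((8, 8))   # a tiny grayscale "image" (assumed)
kernel = np.array([[-1, 0, 1],                    # one shared 3x3 filter
                   [-1, 0, 1],
                   [-1, 0, 1]])

feature_map = convolve2d(image, kernel)
print(feature_map.shape)  # (6, 6): 9 shared weights instead of one weight per pixel
```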

Recurrent Neural Networks

In a recurrent neural network (RNN), the output of a layer is fed back into its input. The first layer of an RNN behaves much like a feedforward layer; after that, each step retains some of the information it has already seen, functioning as a memory cell, which makes RNNs well suited to prediction over sequences. If the network predicts inaccurately, backpropagation adjusts its weights and the prediction is attempted again, and this continues until the prediction is accurate enough. RNNs are commonly used for tasks such as text-to-speech conversion.
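
A minimal sketch of a recurrent step is shown below, assuming a plain tanh cell with arbitrary sizes and untrained weights; the point is that the hidden state computed at one step is fed back in at the next, acting as memory.

```python
import numpy as np

def rnn_step(x_t, h_prev, W_xh, W_hh, b_h):
    # The new hidden state depends on the current input AND the previous state.
    return np.tanh(W_xh @ x_t + W_hh @ h_prev + b_h)

rng = np.random.default_rng(0)
input_size, hidden_size = 3, 5                    # illustrative sizes
W_xh = rng.standard_normal((hidden_size, input_size)) * 0.1
W_hh = rng.standard_normal((hidden_size, hidden_size)) * 0.1
b_h = np.zeros(hidden_size)

h = np.zeros(hidden_size)                         # the memory starts empty
sequence = rng.standard_normal((4, input_size))   # a sequence of 4 time steps
for x_t in sequence:
    h = rnn_step(x_t, h, W_xh, W_hh, b_h)         # the state carries information forward
print(h)
```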

Modular and Sequence Neural Networks

A modular neural network contains multiple sub-networks that work independently toward a common goal. A large, complex task can be broken into subtasks and processed faster, because the modules run in parallel. A sequence (sequence-to-sequence) network consists of two recurrent neural networks: an encoder that reads the input and a decoder that produces the output. This arrangement is particularly useful when the input and output differ in length. Applications of sequence networks include natural language processing, chatbots, translation, and automated answering systems.

Apart from these general types, there are many application-specific neural networks. For example, a two-layer radial basis function network is particularly useful in power systems, while a fully connected multilayer perceptron with nonlinear activation functions is helpful for classifying data that cannot be separated linearly and is used in translation and speech recognition.
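
Below is a minimal sketch of the encoder-decoder idea behind sequence networks, assuming tiny untrained recurrent layers: the encoder compresses an input sequence into one fixed-size vector, and the decoder unrolls that vector into an output of a different length.

```python
import numpy as np

rng = np.random.default_rng(0)
hidden = 6                                          # illustrative hidden size
W_enc = rng.standard_normal((hidden, hidden + 2)) * 0.1   # encoder weights (assumed)
W_dec = rng.standard_normal((hidden, hidden)) * 0.1       # decoder weights (assumed)

def encode(sequence):
    h = np.zeros(hidden)
    for x_t in sequence:                            # read the whole input sequence
        h = np.tanh(W_enc @ np.concatenate([h, x_t]))
    return h                                        # fixed-size summary vector

def decode(h, n_steps):
    outputs = []
    for _ in range(n_steps):                        # emit as many steps as needed
        h = np.tanh(W_dec @ h)
        outputs.append(h)
    return outputs

summary = encode(rng.standard_normal((5, 2)))       # input of length 5
print(len(decode(summary, 3)))                      # output of length 3
```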
