R-prop training for Multi-Layer Perceptron (MLP)

This is a legacy algorithm: the platform API has changed since it was implemented, so new versions and forks will need to be updated.

Algorithms have at least one input and one output. All algorithm endpoints are organized in groups. Groups are used by the platform to indicate which inputs and outputs are synchronized together. The first group is automatically synchronized with the channel defined by the block in which the algorithm is deployed.

Unnamed group

Endpoint Name  Data Format               Nature
class_id       system/uint64/1           Input
image          system/array_2d_floats/1  Input
model          tutorial/mlp/1            Output

Parameters allow users to change the configuration of an algorithm when scheduling an experiment.

Name                    Description                           Type    Default  Range/Choices
number-of-hidden-units  Number of units in the hidden layer   uint32  10
number-of-iterations    Number of training iterations         uint32  50
seed                    Seed for the random number generator  uint32  0
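As a hedged illustration of how these parameters reach the algorithm, the sketch below follows the general shape of a legacy BEAT Python algorithm (a class with setup() and process() hooks). The class and method names here are assumptions based on that old sequential API, not the exact code of this algorithm, and the current platform API differs.

```python
# Hypothetical skeleton of a legacy BEAT algorithm; names and the
# setup()/process() interface are assumptions, not the platform's exact API.

class Algorithm:

    def setup(self, parameters):
        # Defaults mirror the parameter table above; all three are uint32.
        self.number_of_hidden_units = parameters.get('number-of-hidden-units', 10)
        self.number_of_iterations = parameters.get('number-of-iterations', 50)
        self.seed = parameters.get('seed', 0)
        return True

    def process(self, inputs, outputs):
        # The training loop would consume the synchronized 'image' and
        # 'class_id' inputs here and write the trained 'model' output.
        return True
```

Calling setup() with an empty dictionary leaves the table defaults in place; scheduling an experiment with different values simply changes the dictionary the platform passes in.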

The code for this algorithm in Python

This algorithm implements a training procedure for a multi-layer perceptron (MLP), a feed-forward neural network architecture. This implementation supports zero or one hidden layer, and the training procedure is based on R-prop [Ri93].
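To make the R-prop rule of [Ri93] concrete, here is a minimal pure-Python sketch that tunes a single weight on a toy loss. It is not the MLP training code of this algorithm; the step-size factors (1.2 / 0.5) and bounds are the commonly used defaults from the paper.

```python
# Sketch of the R-prop update rule: the step size adapts to the sign history
# of the gradient, not its magnitude. Illustrative only, with common defaults.

def rprop_minimize(grad, w=0.0, steps=50, delta=0.1,
                   eta_plus=1.2, eta_minus=0.5,
                   delta_min=1e-6, delta_max=50.0):
    prev_g = 0.0
    for _ in range(steps):
        g = grad(w)
        if g * prev_g > 0:        # same sign as before: accelerate
            delta = min(delta * eta_plus, delta_max)
        elif g * prev_g < 0:      # sign flip: we overshot, slow down
            delta = max(delta * eta_minus, delta_min)
            g = 0.0               # skip the weight update after a sign change
        if g > 0:
            w -= delta
        elif g < 0:
            w += delta
        prev_g = g
    return w

# Example: minimize (w - 3)^2, whose gradient is 2 * (w - 3).
w_star = rprop_minimize(lambda w: 2.0 * (w - 3.0))
```

In the real algorithm the same per-weight rule is applied to every weight of the MLP, using gradients obtained by backpropagation.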

This implementation relies on the Bob library.

The inputs are:

  • image: a two-dimensional array of floats (64 bits), which is flattened for
    the training procedure
  • class_id: an identifier for the class of the image, such that supervised
    training is possible
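The flattening mentioned above can be pictured as follows; row-major order is assumed here for illustration.

```python
# A 2-D array of floats becomes a 1-D feature vector before it is fed to
# the MLP: one entry per pixel, rows concatenated in order.

image = [[0.0, 0.1],
         [0.2, 0.3]]
flat = [pixel for row in image for pixel in row]
# flat is now [0.0, 0.1, 0.2, 0.3]
```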

The output model is the MLP in a Bob-related format.

  [Ri93] M. Riedmiller and H. Braun. A Direct Adaptive Method for Faster Backpropagation Learning: The RPROP Algorithm. In Proceedings of the IEEE International Conference on Neural Networks, 1993, pp. 586-591.


Name                                                         Databases/Protocols  Analyzers
smarcel/tutorial/digit/2/mnist-mlp-nhu10-niter100-seed2001   mnist/1@idiap        tutorial/multiclass_postperf/2

This table shows the number of times this algorithm has been successfully run using the given environment. Note that this is not sufficient to conclude the algorithm will run under different conditions.

Terms of Service | Contact Information | BEAT platform version 2.1.1b0 | © Idiap Research Institute - 2013-2020