Generalization of protein structure from sequence using a large scale backpropagation network

George L. Wilcox, Marius O. Poliac

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution



Summary form only given. The authors have applied a simple backpropagation neural network on a very large scale in an attempt to associate many primary sequences with representations of the corresponding three-dimensional structures. The training set consisted of 25 sequences (the input layer, 130 amino acids long) associated with 25 130 × 130 distance matrices (the output layer, 16,900 neurons). Each amino acid was coded according to its hydrophobicity (range ±1; the degree to which it avoids contact with water), and the Euclidean distances in the distance matrices were normalized to the largest distance in the training set (range 0-1; about 40 angstroms). The network was configured with a single fully connected hidden layer of 50 to 1000 neurons using the network description language (NDL, also called BigNet). The network simulation was run on a Cray-2 supercomputer with four processors and 512 million words of random-access memory. The network achieved rates of 2 million connections per second in full backpropagation learning mode and was able to learn some aspects of the sequence-to-structure mapping.
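The architecture described above can be sketched as a single-hidden-layer backpropagation network mapping a hydrophobicity-encoded sequence to a flattened, normalized distance matrix. The following is a minimal illustrative sketch, not the authors' NDL/BigNet code: the layer sizes are scaled down from the paper's (130 inputs, 50-1000 hidden units, 16,900 outputs), and the training pair is synthetic.

```python
# Hedged sketch of the described sequence-to-distance-matrix network.
# Assumptions: toy sizes, one synthetic training example, plain SGD.
import numpy as np

rng = np.random.default_rng(0)

seq_len = 10                  # paper: 130 residues
hidden = 20                   # paper: 50 to 1000 hidden neurons
out_dim = seq_len * seq_len   # paper: 130 x 130 = 16,900 outputs

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Input: hydrophobicity codes in [-1, 1].
x = rng.uniform(-1.0, 1.0, seq_len)
# Target: pairwise Euclidean distances of random 3-D coordinates,
# normalized to [0, 1] by the largest distance, then flattened.
coords = rng.uniform(0.0, 1.0, (seq_len, 3))
d = np.linalg.norm(coords[:, None] - coords[None, :], axis=-1)
y = (d / d.max()).ravel()

W1 = rng.normal(0.0, 0.1, (hidden, seq_len))
W2 = rng.normal(0.0, 0.1, (out_dim, hidden))
lr = 0.5

for _ in range(2000):
    h = sigmoid(W1 @ x)            # hidden activations
    yhat = sigmoid(W2 @ h)         # predicted distance matrix (flat)
    err = yhat - y
    # Backpropagation: squared-error gradient through both layers.
    g2 = err * yhat * (1.0 - yhat)
    g1 = (W2.T @ g2) * h * (1.0 - h)
    W2 -= lr * np.outer(g2, h)
    W1 -= lr * np.outer(g1, x)

mse = float(np.mean((sigmoid(W2 @ sigmoid(W1 @ x)) - y) ** 2))
```

With a single training pair the network simply memorizes the mapping; the paper's point was whether such a network, trained on many pairs, generalizes any of the sequence-to-structure relationship.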

Original language: English (US)
Title of host publication: IJCNN Int Jt Conf Neural Network
Editors: Anon
Publisher: Publ by IEEE
Number of pages: 1
State: Published - Dec 1 1989
Event: IJCNN International Joint Conference on Neural Networks - Washington, DC, USA
Duration: Jun 18 1989 - Jun 22 1989



