Open Access System for Information Sharing

Article
Cited 6 times in Web of Science; cited 7 times in Scopus

Generalization in a two-layer neural network with multiple outputs SCIE SCOPUS

Title
Generalization in a two-layer neural network with multiple outputs
Authors
Kang, KJ; Oh, JH; Kwon, C; Park, Y
Date Issued
1996-08
Publisher
AMERICAN PHYSICAL SOC
Abstract
We study generalization in a fully connected two-layer neural network with multiple output nodes. As in learning with the fully connected committee machine, the learning is characterized by a discontinuous phase transition between the permutation-symmetric phase and the permutation-symmetry-breaking phase. We find that the learning curve in the permutation-symmetric phase is universal, irrespective of the number of output nodes. The first-order phase transition point, i.e., the critical number of examples required for perfect learning, is inversely proportional to the number of outputs. The replica calculation shows good agreement with Monte Carlo simulation.
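The architecture the abstract describes — a fully connected two-layer network with several binary output nodes — can be sketched in NumPy as follows. This is an illustrative sketch only, not the paper's code: the sizes `N`, `K`, `M`, the sign activations, and the output-weight convention are assumptions made here for concreteness (the paper works in the thermodynamic limit).

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes; the paper's analysis is in the limit of large N.
N, K, M = 100, 3, 2   # input dimension, hidden units, output nodes

# Fully connected first layer: each of the K hidden units sees all N inputs.
J = rng.standard_normal((K, N)) / np.sqrt(N)
# Second layer: each of the M output nodes is connected to all K hidden units.
# (The exact output-weight convention is an assumption of this sketch.)
W = rng.standard_normal((M, K))

def forward(x):
    """Binary outputs of a fully connected two-layer network."""
    h = np.sign(J @ x)      # hidden-unit activations, shape (K,)
    return np.sign(W @ h)   # output activations, shape (M,)

x = rng.standard_normal(N)
y = forward(x)
print(y)                    # M values, each +1 or -1
```

In this picture, "perfect learning" means a student network of this form reproducing all M outputs of a teacher of the same architecture on every input; the abstract's result is that the critical number of training examples for this scales as 1/M.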
URI
https://oasis.postech.ac.kr/handle/2014.oak/12336
DOI
10.1103/PhysRevE.54.1811
ISSN
1063-651X
Article Type
Article
Citation
PHYSICAL REVIEW E, vol. 54, no. 2, pp. 1811-1815, 1996-08

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher

OH, JONG HOON (오종훈)
Grad Program for Tech Innovation & Mgmt
