OSTI.GOV title logo U.S. Department of Energy
Office of Scientific and Technical Information

Title: Limited weights neural networks: Very tight entropy based bounds

Abstract

Given a set of m examples (i.e., a data-set) from ℝⁿ belonging to k different classes, the problem is to compute the number of bits (i.e., the entropy) required to correctly classify the data-set. Very tight upper and lower bounds for a dichotomy (i.e., k = 2) are presented; they remain valid for the general case.
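The bounds themselves are in the full text and are not reproduced in this record. As a hedged illustration of the counting argument behind such entropy-based bounds (not the paper's own result): for a dichotomy, specifying which m₁ of the m examples belong to class 1 takes ⌈log₂ C(m, m₁)⌉ bits, which is at most m·H(m₁/m), where H is the binary entropy function. A minimal sketch:

```python
import math


def binary_entropy(p: float) -> float:
    """Shannon binary entropy H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)


def dichotomy_bits(m: int, m1: int) -> tuple[float, float]:
    """Bits needed to identify which m1 of m examples are in class 1:
    the exact count log2(C(m, m1)) and its classic entropy upper
    bound m * H(m1 / m)."""
    exact = math.log2(math.comb(m, m1))
    bound = m * binary_entropy(m1 / m)
    return exact, bound


# Example: 100 examples, 30 in class 1.
exact, bound = dichotomy_bits(100, 30)
# The exact bit count never exceeds the entropy bound.
```

This is only the standard bound on binomial coefficients, log₂ C(m, m₁) ≤ m·H(m₁/m); the paper tightens such estimates for limited-weight neural networks.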

Authors:
 Beiu, V. [1]; Draghici, S. [2]
  1. Los Alamos National Lab., NM (United States)
  2. Wayne State Univ., Detroit, MI (United States). Vision and Neural Networks Lab.
Publication Date:
April 1, 1997
Research Org.:
Los Alamos National Lab., NM (United States)
Sponsoring Org.:
USDOE Assistant Secretary for Human Resources and Administration, Washington, DC (United States)
OSTI Identifier:
527896
Report Number(s):
LA-UR-97-294; CONF-970939-1
ON: DE97004781; TRN: 97:005163
DOE Contract Number:
W-7405-ENG-36
Resource Type:
Conference
Resource Relation:
Conference: SOCO '97: 2nd International Symposium on Soft Computing, Fuzzy Logic, Artificial Neural Networks, and Genetic Algorithms, Nimes (France), 17-19 Sep 1997; Other Information: PBD: 1997
Country of Publication:
United States
Language:
English
Subject:
99 MATHEMATICS, COMPUTERS, INFORMATION SCIENCE, MANAGEMENT, LAW, MISCELLANEOUS; NEURAL NETWORKS; ALGORITHMS; ENTROPY

Citation Formats

Beiu, V., and Draghici, S. Limited weights neural networks: Very tight entropy based bounds. United States: N. p., 1997. Web.
Beiu, V., & Draghici, S. Limited weights neural networks: Very tight entropy based bounds. United States.
Beiu, V., and Draghici, S. 1997. "Limited weights neural networks: Very tight entropy based bounds". United States. https://www.osti.gov/servlets/purl/527896.
@article{osti_527896,
  title = {Limited weights neural networks: Very tight entropy based bounds},
  author = {Beiu, V. and Draghici, S.},
  abstractNote = {Given a set of m examples (i.e., a data-set) from $\mathbb{R}^n$ belonging to k different classes, the problem is to compute the number of bits (i.e., the entropy) required to correctly classify the data-set. Very tight upper and lower bounds for a dichotomy (i.e., k = 2) are presented; they remain valid for the general case.},
  place = {United States},
  year = {1997},
  month = {apr}
}

Other availability
Please see Document Availability for additional information on obtaining the full-text document. Library patrons may search WorldCat to identify libraries that hold this conference proceeding.
