Cerebellar Model Articulation Controller

[[File:CMAC system block diagram.jpg|thumb|320px|A block diagram of the CMAC system for a single joint. The vector S is presented as input to all joints. Each joint separately computes an S → A* mapping and a joint actuator signal p<sub>i</sub>. The adjustable weights for all joints may reside in the same physical memory.<ref name="Albus75">J.S. Albus (1975). "A New Approach to Manipulator Control: the Cerebellar Model Articulation Controller (CMAC)". In: ''Trans. ASME, Series G. Journal of Dynamic Systems, Measurement and Control'', Vol. 97, pp. 220-233, 1975.</ref>]]
The '''Cerebellar Model Articulation Controller''' (CMAC) is a type of neural network based on a model of the mammalian [[cerebellum]]. It is also known as the Cerebellar Model Arithmetic Computer. It is a type of [[associative memory]].<ref>J.S. Albus (1979). "Mechanisms of Planning and Problem Solving in the Brain". In: ''Mathematical Biosciences''. Vol. 45, pp. 247-293, 1979.</ref>
 
== Overview ==
The CMAC was first proposed as a function modeler for robotic controllers by [[James Albus]] in 1975<ref name="Albus75"/> (hence the name), but has been extensively used in [[reinforcement learning]] and also for automated classification in the [[machine learning]] community. CMAC computes a function <math>f(x_1, \ldots, x_n)</math>, where <math>n</math> is the number of input dimensions. The input space is divided up into hyper-rectangles, each of which is associated with a memory cell. The contents of the memory cells are the weights, which are adjusted during training. Usually, more than one quantisation of input space is used, so that any point in input space is associated with a number of hyper-rectangles, and therefore with a number of memory cells. The output of a CMAC is the algebraic sum of the weights in all the memory cells activated by the input point.
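
This computation can be illustrated with a minimal sketch in Python. It is not Albus's original formulation; the number of tilings, the tile width, the grid offsets and the function names are assumptions chosen purely for illustration.

<syntaxhighlight lang="python">
import numpy as np

# Minimal CMAC forward-pass sketch: several overlapping grid quantisations
# ("tilings") of the input space, each activated cell holding one adjustable
# weight; the output is the algebraic sum of those weights.

def active_tiles(x, n_tilings=4, tile_width=1.0):
    """Return one (tiling, cell coordinates) key per tiling for input vector x."""
    x = np.asarray(x, dtype=float)
    tiles = []
    for t in range(n_tilings):
        offset = t * tile_width / n_tilings          # shift each grid slightly
        coords = tuple(np.floor((x + offset) / tile_width).astype(int))
        tiles.append((t, coords))
    return tiles

def cmac_output(x, weights, n_tilings=4, tile_width=1.0):
    """CMAC output: sum of the weights of all memory cells activated by x."""
    return sum(weights.get(tile, 0.0)
               for tile in active_tiles(x, n_tilings, tile_width))
</syntaxhighlight>

Here the weights are held in a dictionary keyed by cell, so storage is only allocated for cells that have actually been visited; this anticipates the hashing technique discussed below.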
 
A change in the value of the input point results in a change in the set of activated hyper-rectangles, and therefore a change in the set of memory cells participating in the CMAC output. The CMAC output is therefore stored in a distributed fashion, such that the output corresponding to any point in input space is derived from the values stored in a number of memory cells (hence the name associative memory). This provides generalisation.
 
== Building blocks ==
[[Image:CmacHashing.jpg|thumb|CMAC, represented as a two-dimensional space]]
In the image on the right, there are two inputs to the CMAC, represented as a two-dimensional space. Two quantising functions have been used to divide this space with two overlapping grids (one shown in heavier lines). A single input is shown near the middle, and this has activated two memory cells, corresponding to the shaded area. If another point occurs close to the one shown, it will share some of the same memory cells, providing generalisation.
 
The CMAC is trained by presenting pairs of input points and output values, and adjusting the weights in the activated cells by a proportion of the error observed at the output. This simple training algorithm has a proof of convergence.<ref>Y. Wong, CMAC Learning is Governed by a Single Parameter, IEEE International Conference on Neural Networks, San Francisco, Vol. 1, pp. 1439-1443, 1993.</ref>
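
Continuing the earlier sketch, one possible form of this training rule is shown below. The learning rate and the equal division of the error among the activated cells are assumptions for illustration, not a prescribed choice.

<syntaxhighlight lang="python">
def cmac_train(x, target, weights, lr=0.1, n_tilings=4, tile_width=1.0):
    """One training step: spread a fraction of the observed output error
    equally over the cells activated by x."""
    tiles = active_tiles(x, n_tilings, tile_width)
    error = target - cmac_output(x, weights, n_tilings, tile_width)
    delta = lr * error / len(tiles)      # share the correction between cells
    for tile in tiles:
        weights[tile] = weights.get(tile, 0.0) + delta
    return error

# Example: learn a one-dimensional sine function from sampled input/output pairs.
weights = {}
for _ in range(200):
    for x in np.linspace(0.0, 6.28, 50):
        cmac_train([x], np.sin(x), weights, tile_width=0.5)
</syntaxhighlight>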
 
It is normal to add a kernel function to the hyper-rectangle, so that points falling towards the edge of a hyper-rectangle have a smaller activation than those falling near the centre.<ref>P.C.E. An, W.T. Miller, and P.C. Parks, Design Improvements in Associative Memories for Cerebellar Model Articulation Controllers, Proc. ICANN, pp. 1207-1210, 1991. </ref>
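
One possible kernel is a triangular one, sketched below as an extension of the earlier example; the kernel shape is an assumption made for illustration, and other graded kernels could be substituted.

<syntaxhighlight lang="python">
def kernel_activation(x, tile, n_tilings=4, tile_width=1.0):
    """Triangular kernel: activation is 1 at the cell centre and falls
    linearly to 0 at the cell edge, per input dimension."""
    t, coords = tile
    offset = t * tile_width / n_tilings
    centres = (np.asarray(coords) + 0.5) * tile_width - offset
    dist = np.abs(np.asarray(x, dtype=float) - centres) / (tile_width / 2.0)
    return float(np.prod(np.clip(1.0 - dist, 0.0, 1.0)))

def cmac_output_kernel(x, weights, n_tilings=4, tile_width=1.0):
    """Output with graded cell activations instead of all-or-nothing cells."""
    return sum(weights.get(tile, 0.0) * kernel_activation(x, tile, n_tilings, tile_width)
               for tile in active_tiles(x, n_tilings, tile_width))
</syntaxhighlight>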
 
One of the major problems cited in practical use of CMAC is the memory size required, which is directly related to the number of cells used. This is usually ameliorated by using a [[hash function]], and only providing memory storage for the actual cells that are activated by inputs.
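
A sketch of this idea, again building on the earlier example, is given below. The memory size and the use of Python's built-in hash are arbitrary choices for illustration; collisions between distinct cells are simply accepted as a small amount of noise.

<syntaxhighlight lang="python">
MEMORY_SIZE = 4096                       # assumed fixed physical memory size
weight_table = np.zeros(MEMORY_SIZE)

def tile_index(tile, memory_size=MEMORY_SIZE):
    """Map a (tiling, coordinates) key into the fixed-size weight table.
    Distinct cells may collide, trading a little noise for far less memory."""
    return hash(tile) % memory_size

def cmac_output_hashed(x, n_tilings=4, tile_width=1.0):
    """Output using hashed storage instead of one cell per hyper-rectangle."""
    return sum(weight_table[tile_index(tile)]
               for tile in active_tiles(x, n_tilings, tile_width))
</syntaxhighlight>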
 
== References ==
{{reflist}}
 
== Further reading ==
* Albus, J.S. (1971). "[http://www.isd.mel.nist.gov/documents/albus/Loc_01.pdf Theory of Cerebellar Function]". In: ''Mathematical Biosciences'', Volume 10, Numbers 1/2, February 1971, pp. 25-61.
* Albus, J.S. (1975). "[http://www.isd.mel.nist.gov/documents/albus/Loc_04.pdf New Approach to Manipulator Control: The Cerebellar Model Articulation Controller (CMAC)]". In: ''Transactions of the ASME Journal of Dynamic Systems, Measurement, and Control'', September 1975, pp. 220-227.
* Albus, J.S. (1979). "[http://www.isd.mel.nist.gov/documents/albus/Loc_5.pdf Mechanisms of Planning and Problem Solving in the Brain]". In: ''Mathematical Biosciences'' 45, pp. 247-293, 1979.
 
== External links ==
* [http://seit.unsw.adfa.edu.au/staff/sites/dcornforth/CMAC.html The Cerebellar Model Articulation Controller (CMAC)] by David Cornforth 2007, University of NSW, Australian Defence Force Academy, Australia.
 
[[Category:Computational neuroscience]]
[[Category:Neural networks]]
[[Category:Artificial intelligence]]
[[Category:Network architecture]]
[[Category:Networks]]
