
User:Mjirina/sandbox


Probability distribution mapping function

Mapping a distribution

Informally, the probability distribution mapping function (DMF) is a mapping of the probability distribution of points in n-dimensional space to the distribution of points in the one-dimensional space of distances [1]. The distribution density mapping function (DDMF) is a one-dimensional analogue of the probability density function. The power approximation of the probability distribution mapping function has the form $D(x, r) \approx \text{const} \cdot r^q$, where $r$ is a distance from the fixed point $x$ and the exponent $q$ is the distribution mapping exponent (DME); see the cited reference. The function $z = r^q$ transforms the true distribution of points so that the distribution density mapping function, as a function of the variable $z$, is constant, at least in a neighborhood of the fixed point. For exact definitions see [1]. These notions are local, i.e. they are related to a particular fixed point. The distribution mapping exponent $q$ is something like a local value of the correlation dimension of Grassberger and Procaccia [2]. It can also be viewed as the local dimension of an attractor, a singularity exponent, or a scaling exponent in a multifractal system.
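As a simple numerical sketch (an illustration only, not code from [1]): the distribution mapping exponent q at a fixed point can be estimated by fitting the empirical distribution mapping function, i.e. the fraction of points lying within distance r of that point, with the power approximation const·r^q on a log-log scale over the k nearest neighbours. The function name, the value k = 20, and the least-squares fit below are choices made for this example.

import numpy as np

def distribution_mapping_exponent(x, data, k=20):
    # Distances from the fixed point x to all points, sorted ascending.
    r = np.sort(np.linalg.norm(data - x, axis=1))
    # Drop the zero distance of x to itself (if x is in data) and keep the k nearest.
    r = r[r > 0][:k]
    # Empirical distribution mapping function: fraction of points within distance r.
    D = np.arange(1, len(r) + 1) / len(data)
    # Power approximation D ~ const * r**q  =>  log D = q * log r + const.
    q, _ = np.polyfit(np.log(r), np.log(D), 1)
    return q

rng = np.random.default_rng(0)
points = rng.normal(size=(1000, 3))          # a cloud of points in 3-dimensional space
print(distribution_mapping_exponent(points[0], points))   # typically near 3 for this smooth 3-D sample (noisy for small k)

For a sample drawn from a smooth 3-dimensional density the estimate comes out near 3, in line with reading q as a local dimension.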

Decomposition of the correlation integral into local functions

The correlation integral was defined by Grassberger and Procaccia [2] and, taking all pairs of points of a set of $N$ points, can be written in the form

$$C(r) = \frac{1}{N(N-1)} \sum_{i \neq j} h\!\left(r - \lVert x_i - x_j \rVert\right),$$

where $h(\cdot)$ is the Heaviside step function. It holds [1] that

$$C(r) = \frac{1}{N} \sum_{i=1}^{N} D(x_i, r),$$

where $D(x_i, r) = \frac{1}{N-1} \sum_{j \neq i} h\!\left(r - \lVert x_i - x_j \rVert\right)$ is the distribution mapping function related to the point $x_i$.
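The decomposition can be checked numerically. The sketch below (an illustration under the normalisation written above, not code from [1]) computes the correlation integral of a small random sample and the average of the local distribution mapping functions D(x_i, r) over all points; the two values coincide. The names pair_indicators, correlation_integral and local_dmf are chosen for this example.

import numpy as np

def pair_indicators(data, r):
    # Heaviside indicators h(r - ||x_i - x_j||) for all pairs, with the i == j pairs excluded.
    d = np.linalg.norm(data[:, None, :] - data[None, :, :], axis=2)
    ind = (d <= r).astype(float)
    np.fill_diagonal(ind, 0.0)
    return ind

def correlation_integral(data, r):
    N = len(data)
    return pair_indicators(data, r).sum() / (N * (N - 1))

def local_dmf(data, i, r):
    # Distribution mapping function D(x_i, r) related to the point x_i.
    return pair_indicators(data, r)[i].sum() / (len(data) - 1)

rng = np.random.default_rng(1)
pts = rng.uniform(size=(200, 2))
r = 0.1
lhs = correlation_integral(pts, r)
rhs = np.mean([local_dmf(pts, i, r) for i in range(len(pts))])
print(lhs, rhs)    # the two numbers agree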

The distribution mapping exponent in classification methods

The DME can be used for constructing a classifier. Methods are described in [1], and each individual method in more detail in [3][4][5] and in freely available reports [6][7].
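As a loose illustration only, and not the exact algorithms of [3], [4] or [5], the sketch below shows one simple way such a local exponent can enter a nearest-neighbour style classifier: q is estimated around the query point, and each class is then scored by the sum of r^(-q) over its training points, so that nearer points of a class contribute more. All function names, the parameter k and the scoring rule are assumptions made for this example.

import numpy as np

def estimate_dme(x, data, k=20):
    # Local distribution mapping exponent q at the query point x (log-log fit, as above).
    r = np.sort(np.linalg.norm(data - x, axis=1))
    r = r[r > 0][:k]
    D = np.arange(1, len(r) + 1) / len(data)
    q, _ = np.polyfit(np.log(r), np.log(D), 1)
    return q

def classify(x, data, labels, k=20):
    q = estimate_dme(x, data, k)
    scores = {}
    for c in np.unique(labels):
        r = np.linalg.norm(data[labels == c] - x, axis=1)
        r = r[r > 0]
        # Distance-weighted score: nearer points of class c contribute more.
        scores[c] = np.sum(r ** (-q))
    return max(scores, key=scores.get)

rng = np.random.default_rng(2)
class0 = rng.normal(loc=0.0, size=(100, 2))
class1 = rng.normal(loc=3.0, size=(100, 2))
X = np.vstack([class0, class1])
y = np.array([0] * 100 + [1] * 100)
print(classify(np.array([0.2, -0.1]), X, y))    # expected output: 0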

References

  1. ^ Jiřina, M., Jiřina, M., Jr.: Fractal based data separation in data mining. Proceedings of the Third International Conference on Digital Information Processing and Communications (ICDIPC2013), Dubai, UAE, Jan. 30 - Feb. 1, 2013, pp. 287-295 (paper No. 202). ISBN 978-0-9853483-3-5. SDIWC, 2013. Available at http://sdiwc.net/digital-library/fractal-based-data-separation-in-data-mining and at http://sdiwc.net/digital-library/web-admin/upload-pdf/00000419.pdf
  2. ^ Grassberger, P., Procaccia, I.: Measuring the strangeness of strange attractors. Physica 9D, 189–208 (1983)
  3. ^ Jiřina, M., Jiřina, M., Jr.: Correlation Dimension-Based Classifier. IEEE Transactions on Cybernetics, Vol. 44, No. 12 (2014), pp. 2253-2263. ISSN 2168-2267.
  4. ^ Jiřina, M., Jiřina, M., Jr.: Utilization of Singularity Exponent in Nearest Neighbor Based Classifier. Journal of Classification (Springer), Vol. 30, No. 1, pp. 3-29 (2013)
  5. ^ Jiřina, M., Jiřina, M., Jr.: Classification Using Zipfian Kernel. Journal of Classification (Springer), Vol. 32, 12 April 2015, pp. 305-326. ISSN 0176-4268. Electronic version available at http://www.library.sk/arl-cav/cs/contapp/?idx=cav_un_epca*0420984&repo=crepo1&key=84163352979
  6. ^ Jiřina, M.: IINC Software. Technical Report No. V-1225, Institute of Computer Science, Academy of Sciences of the Czech Republic, October 2015, 51 pp.
  7. ^ Jiřina, M.: IINC classifier for MS Excel. The principle, method and program. Technical Report No. V-1199, Institute of Computer Science, Academy of Sciences of the Czech Republic, October 2014, 9 pp.