Some More Noiseless Coding Theorems on Generalized R-Norm Entropy

A parametric mean length is defined as the quantity

L_R = (R/(R−1)) [ 1 − (∑_{i=1}^N p_i^β D^{−n_i((R−1)/R)}) / (∑_{j=1}^N p_j^β) ],

where R > 0 (R ≠ 1), β > 0, p_i > 0, ∑ p_i = 1, i = 1, 2, …, N. This is the mean length of code words. Lower and upper bounds for L_R are derived in terms of the R-norm information measure for the incomplete power distribution.

AMS Subject Classification: 94A15, 94A17, 94A24, 26D15.
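As a quick numerical illustration of this mean length, the sketch below evaluates L_R directly from the displayed formula; the distribution, codeword lengths, and parameter values are illustrative assumptions, not taken from the paper.

```python
def mean_length_LR(p, n, D=2, R=2.0, beta=1.0):
    # L_R = R/(R-1) * [1 - (sum_i p_i^beta * D^(-n_i*(R-1)/R)) / (sum_j p_j^beta)]
    num = sum(pi**beta * D**(-ni * (R - 1) / R) for pi, ni in zip(p, n))
    den = sum(pi**beta for pi in p)
    return R / (R - 1) * (1 - num / den)

# Uniform 4-symbol source with the natural binary lengths n_i = 2:
# for R = 2, beta = 1 this gives L_R = 1.0, which equals the R-norm
# entropy 2*(1 - (4 * 0.25**2)**0.5) of the uniform distribution.
print(mean_length_LR([0.25] * 4, [2] * 4))  # 1.0
```

Note that for β = 1 the weights p_i^β / ∑ p_j^β reduce to the probabilities themselves, so L_R depends only on p, n, D and R.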

Setting r = 1/R in (1.1), we get

H_r(P) = (1/(1−r)) [ 1 − (∑_{i=1}^N p_i^{1/r})^r ],   (1.2)

which is a measure mentioned by Arimoto [1971] as an example of a generalized class of information measures. It may be noted that (1.2) also approaches Shannon's [1948] entropy as r → 1.
Let Δ_N = {P = (p_1, p_2, …, p_N) : p_i ≥ 0, ∑ p_i = 1}, N ≥ 2, be the set of all finite discrete probability distributions. For any probability distribution P = (p_1, p_2, …, p_N) ∈ Δ_N, Shannon [1948] defined the entropy

H(P) = − ∑_{i=1}^N p_i log_D p_i.   (1.3)

Throughout this paper, ∑ will stand for ∑_{i=1}^N unless otherwise stated, and logarithms are taken to base D (D > 1).
Let a finite set of N input symbols X = {x_1, x_2, …, x_N} be encoded using an alphabet of D symbols. It has been shown by Feinstein [1956] that there is a uniquely decipherable instantaneous code with lengths n_1, n_2, …, n_N iff

∑_{i=1}^N D^{−n_i} ≤ 1,   (1.4)

where D is the size of the code alphabet.
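The Kraft-type condition (1.4) is easy to check numerically; a minimal sketch, with illustrative example lengths:

```python
def kraft_sum(lengths, D=2):
    # Left-hand side of (1.4): sum_i D^(-n_i)
    return sum(D ** (-n) for n in lengths)

# Lengths (1, 2, 3, 3) over a binary alphabet meet (1.4) with equality;
# a corresponding instantaneous code is {0, 10, 110, 111}.
print(kraft_sum([1, 2, 3, 3]))          # 1.0
# Lengths (1, 1, 2) violate (1.4), so no uniquely decipherable
# instantaneous binary code with these lengths exists.
print(kraft_sum([1, 1, 2]) <= 1)        # False
```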
Let

L = ∑_{i=1}^N p_i n_i   (1.5)

be the average codeword length. For a code satisfying (1.4), it has also been shown by Feinstein [1956] that

L ≥ H(P),

with equality iff n_i = −log_D p_i for all i, and that by suitably encoding long sequences of source symbols, the average length can be made arbitrarily close to H(P). This is Shannon's noiseless coding theorem.
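Shannon's construction behind this theorem can be sketched as follows, assuming a binary code alphabet (D = 2); the dyadic distribution is an illustrative choice for which the bound is met with equality.

```python
import math

def shannon_code_lengths(p):
    # Shannon's choice n_i = ceil(-log2 p_i) always satisfies (1.4)
    # and gives H(P) <= L < H(P) + 1.
    return [math.ceil(-math.log2(pi)) for pi in p]

p = [0.5, 0.25, 0.125, 0.125]          # dyadic, so -log2 p_i is an integer
n = shannon_code_lengths(p)            # [1, 2, 3, 3]
H = -sum(pi * math.log2(pi) for pi in p)
L = sum(pi * ni for pi, ni in zip(p, n))
print(H, L)  # 1.75 1.75 -- equality since n_i = -log_D p_i exactly
```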
By considering Rényi's [1961] entropy

H_α(P) = (1/(1−α)) log_D ∑ p_i^α,   α > 0 (α ≠ 1),

a coding theorem analogous to the noiseless coding theorem above was established by Campbell [1965], who obtained bounds for his mean length in terms of H_α(P). Kieffer [1979] defined a class of decision rules and showed that H_α(P) is the best decision rule for deciding which of two sources can be coded with expected cost of sequences of length n when n → ∞, where the cost of encoding a sequence is assumed to be a function of its length only. Further, Jelinek [1980] showed that coding with respect to Campbell's [1965] mean length is useful in minimizing the problem of buffer overflow, which occurs when the source symbols are produced at a fixed rate and the code words are stored temporarily in a finite buffer. Hooda and Bhaker [1997] considered a generalization of Campbell's [1965] mean length and proved bounds for it in terms of H_α^β(P), the generalized entropy of order α = 1/(1+t) and type β studied by Aczél and Daróczy [1963] and Kapur [1967]. The mean codeword length (1.5) has thus been generalized parametrically, and its bounds have been studied in terms of generalized measures of entropy. Here we give another generalization of (1.5) and study its bounds in terms of generalized entropy of order α and type β. Longo [1976], Gurdial and Pessoa [1977], Singh, Kumar and Tuteja [2003], Parkash and Sharma [2004], Hooda and Bhaker [1997], and Khan, Bhat and Pirzada [2005] have studied generalized coding theorems by considering different generalized measures of (1.3) and (1.5) under the condition (1.4) of unique decipherability.
In this paper we study some coding theorems by considering a new function depending on the parameters R and β. Our motivation for studying this new function is that it generalizes some entropy functions already existing in the literature, such as the R-norm entropy of Boekee and Van der Lubbe [1980], which is used in physics.
The generalized R-norm entropy of order R and type β is

H_R^β(P) = (R/(R−1)) [ 1 − ( ∑_i p_i^{βR} / ∑_j p_j^β )^{1/R} ],   (2.1)

and the corresponding mean codeword length is

L_R^β(P) = (R/(R−1)) [ 1 − (∑_i p_i^β D^{−n_i((R−1)/R)}) / ∑_j p_j^β ].   (2.5)

For β = 1 and R → 1, (2.5) reduces to the mean code length defined by Shannon [1948], i.e., L = ∑ n_i p_i.
Also, we have used the condition

∑_{i=1}^N D^{−n_i} ≤ ∑_{j=1}^N p_j^β   (2.7)

to find the bounds. It may be seen that when β = 1, (2.7) reduces to the Kraft inequality (1.4).
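A minimal numeric sketch of this condition, assuming it has the form ∑_i D^(−n_i) ≤ ∑_j p_j^β (an assumption consistent with its reduction to the Kraft inequality (1.4) at β = 1); the example lengths and distribution are illustrative:

```python
def generalized_kraft_ok(lengths, p, D=2, beta=1.0):
    # Assumed form of condition (2.7): sum_i D^(-n_i) <= sum_j p_j^beta.
    # At beta = 1 the right-hand side is sum_j p_j = 1, recovering (1.4).
    return sum(D ** (-n) for n in lengths) <= sum(pi ** beta for pi in p)

p = [0.5, 0.25, 0.125, 0.125]
print(generalized_kraft_ok([1, 2, 3, 3], p, beta=1.0))  # True
print(generalized_kraft_ok([1, 2, 3, 3], p, beta=2.0))  # False: sum p_j^2 < 1
```

For β > 1 the right-hand side shrinks (since ∑ p_j^β ≤ 1), so the condition forces longer codewords than the ordinary Kraft inequality does.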
We establish a result that, in a sense, provides a characterization of H_R^β(P) under the condition (2.7).

Theorem 2.1. For all integers D > 1, under the condition (2.7),

L_R^β(P) ≥ H_R^β(P),   (2.8)

with equality iff

n_i = −log_D ( p_i^{βR} ∑_j p_j^β / ∑_j p_j^{βR} ).   (2.9)

Proof. By Hölder's inequality we have

∑ x_i y_i ≥ (∑ x_i^p)^{1/p} (∑ y_i^q)^{1/q},   (2.10)

where p^{−1} + q^{−1} = 1; p < 1 (p ≠ 0), q < 0, or q < 1 (q ≠ 0), p < 0; and x_i, y_i > 0 for each i.
Putting x_i = p_i^{βR/(R−1)} D^{−n_i}, y_i = p_i^{βR/(1−R)}, p = (R−1)/R and q = 1−R in (2.10) and using (2.7), we get

∑_j p_j^β ≥ ( ∑_i p_i^β D^{−n_i((R−1)/R)} )^{R/(R−1)} ( ∑_i p_i^{βR} )^{1/(1−R)}.   (2.11)

Here two cases arise.

Case 1. When 0 < R < 1, raising both sides of (2.11) to the power (1−R)/R (which is positive) and rearranging, we have

( ∑_i p_i^β D^{−n_i((R−1)/R)} ) / ∑_j p_j^β ≥ ( ∑_i p_i^{βR} / ∑_j p_j^β )^{1/R};   (2.12)

since R/(R−1) < 0 for 0 < R < 1, (2.12) yields (2.8).

Case 2. Similarly, we can prove (2.8) for R > 1.

Published by Canadian Center of Science and Education
Theorem 2.2. By properly choosing the lengths n_1, n_2, …, n_N in the code of Theorem 2.1, L_R^β(P) can be made to satisfy the following inequality:

H_R^β(P) ≤ L_R^β(P) < D^{(1−R)/R} H_R^β(P) + (R/(R−1)) (1 − D^{(1−R)/R}),   (2.13)

where H_R^β(P) and L_R^β(P) are given by (2.1) and (2.5) respectively.

Proof. It can be proved that there is equality in (2.8) if and only if n_i = −log_D ( p_i^{βR} ∑_j p_j^β / ∑_j p_j^{βR} ). We choose the codeword lengths n_i as the integers satisfying

−log_D ( p_i^{βR} ∑_j p_j^β / ∑_j p_j^{βR} ) ≤ n_i < −log_D ( p_i^{βR} ∑_j p_j^β / ∑_j p_j^{βR} ) + 1.   (2.14)

From the left inequality of (2.14) we have D^{−n_i} ≤ p_i^{βR} ∑_j p_j^β / ∑_j p_j^{βR}; taking the sum over i, we get the generalized inequality (2.7). So there exists a uniquely decipherable code with lengths n_i.
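As a numeric sanity check, the sketch below chooses integer lengths by the ceiling rule of (2.14) and verifies the lower bound L_R^β(P) ≥ H_R^β(P); the specific forms of H_R^β and L_R^β, the distribution, and the parameter values are assumptions for illustration, not definitive implementations of the paper's formulas.

```python
import math

def H_R_beta(p, R=2.0, beta=1.0):
    # Assumed generalized R-norm entropy of order R and type beta (no log appears).
    num = sum(pi ** (beta * R) for pi in p)
    den = sum(pi ** beta for pi in p)
    return R / (R - 1) * (1 - (num / den) ** (1 / R))

def L_R_beta(p, n, D=2, R=2.0, beta=1.0):
    # Mean codeword length as displayed in the abstract.
    num = sum(pi ** beta * D ** (-ni * (R - 1) / R) for pi, ni in zip(p, n))
    den = sum(pi ** beta for pi in p)
    return R / (R - 1) * (1 - num / den)

def lengths_from_214(p, D=2, R=2.0, beta=1.0):
    # Smallest integers n_i satisfying the assumed left inequality of (2.14).
    sb = sum(pi ** beta for pi in p)
    sbr = sum(pi ** (beta * R) for pi in p)
    return [math.ceil(-math.log(pi ** (beta * R) * sb / sbr, D)) for pi in p]

p = [0.5, 0.3, 0.2]
n = lengths_from_214(p)                           # integer lengths per (2.14)
assert sum(2 ** (-ni) for ni in n) <= sum(p)      # generalized Kraft condition holds
assert L_R_beta(p, n) >= H_R_beta(p)              # lower bound of the theorem
```

At β = 1 both functions reduce to the R-norm quantities, and the assertions mirror the noiseless coding theorem's H ≤ L side.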