Maximum Entropy Functions of Discrete Fuzzy Random Variables

This work was partially supported by the Guangxi National Science Foundation of China (Gui No. 0832261).

Abstract: Due to a deficiency of information, the probability distribution and membership function of a fuzzy random variable often cannot be obtained explicitly. It is a challenging task to find an appropriate probability distribution and membership function when only certain partial information about a fuzzy random variable is given, such as its expected value or moments. This paper solves such problems by finding the maximum entropy functions of discrete fuzzy random variables under certain constraints. A genetic algorithm is designed to solve the general maximum entropy model for discrete fuzzy random variables, and its effectiveness is illustrated by a numerical experiment.


Introduction
Probability theory is a powerful tool to deal with randomness. In order to describe randomness in a mathematical way, a random variable is defined as a measurable function from a probability space to the set of real numbers. Fuzzy set theory was proposed by Zadeh (1965, p.338-353) to deal with fuzziness. The term 'fuzzy variable' was first introduced by Kaufmann (1975) as a fuzzy set of real numbers used to describe fuzzy phenomena. Although probability theory and fuzzy set theory are two different systems, it is possible to apply probabilistic methods to the theoretical analysis of fuzzy set theory.
With the development of probability theory, many researchers turned to generalizations of random variables, and many new concepts were presented, such as Banach-space-valued random variables, random sets, fuzzy random variables, and so on. The concept of a fuzzy random variable was introduced by Kwakernaak (1978) as a mathematical description of fuzzy stochastic phenomena. Roughly speaking, a fuzzy random variable is a measurable function from a probability space to the set of fuzzy variables. Up to now, fuzzy random theory has been developed by several researchers, such as Kruse and Meyer (1987), Puri and Ralescu (1986, p.409-422), Li et al. (2001, p.7-21), Nguyen et al. (2004, p.99-109), Liu and Liu (2003, p.143-160), Liu (2001, p.713-720; 2001, p.721-726; 2002), and so on.
Entropy provides a quantitative measure of the degree of uncertainty and has been widely applied in transportation (Samanta, 2005, p.419-428; Islam, 2006, p.387-404), risk analysis (Feng, 2009, p.2934-2938), signal processing (M. M., 2008, p.639-669) and economics (Wu, 2003, p.347-354). After the Shannon entropy of random variables was proposed by Shannon (1949), Jaynes (1957, p.620-630) provided the maximum entropy principle for random variables subject to given constraints. Fuzzy entropy was first initialized by Zadeh (1968, p.421-427) to quantify fuzziness; he defined the entropy of a fuzzy event as a weighted Shannon entropy. Up to now, fuzzy entropy has been studied by many researchers, such as De Luca and Termini (1972, p.301-312), Kaufmann (1985), Yager (1979, p.221-229), Kosko (1986, p.165-174), Pal and Pal (1989, p.284-295), Bhandari and Pal (1993, p.209-228), Pal and Bezdek (1994, p.107-118), and Li and Liu (2008, p.123-129). Li and Liu (2009, p.105-115) provided the concept of hybrid entropy so as to measure the uncertainty of hybrid variables. However, given some constraints, for example the expected value and variance, there usually exist multiple compatible probability distributions and membership functions. Which probability distribution and membership function shall we take? Because randomness and fuzziness appear simultaneously in a system, we cannot obtain the maximum entropy of a hybrid variable through the Euler-Lagrange equation. For fuzzy variables, Li and Liu (2007, p.43-52) and Gao and You (2009, p.2353-2361) respectively gave analytical methods to find the maximum entropy membership functions of continuous and discrete fuzzy variables. On the basis of their work, we extend their ideas to solve the problem of maximum entropy functions of discrete fuzzy random variables. The organization of our work is as follows. In Section 2, some basic concepts and results on fuzzy random variables are reviewed. In Section 3, we introduce some moment constraints. In Sections 4 and 5, an effective genetic algorithm is introduced to solve general maximum entropy models for discrete fuzzy random variables, and a computational experiment is given to illustrate it. Finally, a conclusion is given in the last section.

Preliminaries
Fuzzy set theory has been well developed and applied to a wide variety of real problems. Let ξ be a fuzzy variable with membership function μ and B a set of real numbers. Then the possibility, necessity, and credibility measures of the fuzzy event {ξ ∈ B} can be represented by

Pos{ξ ∈ B} = sup_{x ∈ B} μ(x), (2.1)

Nec{ξ ∈ B} = 1 − sup_{x ∈ B^c} μ(x), (2.2)

Cr{ξ ∈ B} = (Pos{ξ ∈ B} + Nec{ξ ∈ B}) / 2. (2.3)

Let ξ be a fuzzy variable with membership function μ(x) satisfying the normalization condition, i.e., sup_x μ(x) = 1.
In the setting of credibility theory, the credibility measure of the fuzzy event {ξ ∈ B} deduced from μ(x) is given by

Cr{ξ ∈ B} = (sup_{x ∈ B} μ(x) + 1 − sup_{x ∈ B^c} μ(x)) / 2, (2.4)

where B is any subset of the real numbers R and B^c is the complement of B. Conversely, for a fuzzy variable ξ, its membership function can be derived from the credibility measure by

μ(x) = (2 Cr{ξ = x}) ∧ 1, x ∈ R. (2.5)

Fuzzy random variables have been defined in several ways. In this paper, we shall adopt the following definitions of fuzzy random variables and the chance measure of fuzzy random events.
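The possibility, necessity, and credibility of a fuzzy event can be computed directly from a discrete membership function; a minimal sketch, with Pos{ξ ∈ B} = sup over B of μ, Nec{ξ ∈ B} = 1 − sup over the complement, and Cr their average:

```python
def possibility(mu, B):
    """Pos{xi in B} = sup of mu(x) over x in B."""
    return max((m for x, m in mu.items() if x in B), default=0.0)

def necessity(mu, B):
    """Nec{xi in B} = 1 - sup of mu(x) over the complement of B."""
    return 1.0 - max((m for x, m in mu.items() if x not in B), default=0.0)

def credibility(mu, B):
    """Cr{xi in B} = (Pos{xi in B} + Nec{xi in B}) / 2."""
    return 0.5 * (possibility(mu, B) + necessity(mu, B))

# A normalized membership function of a discrete fuzzy variable
mu = {1: 0.5, 2: 1.0, 3: 0.5}
print(credibility(mu, {2, 3}))  # 0.5 * (1.0 + (1 - 0.5)) = 0.75
```

Note that credibility is self-dual, Cr{ξ ∈ B} + Cr{ξ ∈ B^c} = 1, which possibility and necessity individually are not.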
Definition 2.1. (Liu and Liu, 2003, p.143-160) A fuzzy random variable is a function ξ from a probability space (Ω, A, Pr) to the set of fuzzy variables such that Pos{ξ(ω) ∈ B} is a measurable function of ω for any Borel set B of R.
Theorem 2.1. (Li and Liu, 2009, p.105-115) Let (Θ, P, Cr) × (Ω, A, Pr) be a chance space and Ch a chance measure. Then for any event Λ, the chance measure Ch{Λ} satisfies the bounds given in the cited reference. Proof. It follows from the basic properties of probability and credibility. The conclusion is proved.
Example 2.2. Let a_1, a_2, ..., a_m be fuzzy variables, and let p_1, p_2, ..., p_m be nonnegative numbers with p_1 + p_2 + ... + p_m = 1. Then

ξ = a_i with probability p_i, i = 1, 2, ..., m, (2.10)

is a fuzzy random variable.

Definition 2.3. (Li and Liu, 2007, p.43-52) Let ξ be a fuzzy random variable. Then the expected value of ξ is defined by

E[ξ] = ∫_0^{+∞} Ch{ξ ≥ r} dr − ∫_{−∞}^0 Ch{ξ ≤ r} dr

provided that at least one of the two integrals is finite.
In fact, the expected value E[ξ] of ξ may also be defined by (Liu and Liu, 2003, p.143-160)

E[ξ] = ∫_0^{+∞} Pr{ω ∈ Ω | E[ξ(ω)] ≥ r} dr − ∫_{−∞}^0 Pr{ω ∈ Ω | E[ξ(ω)] ≤ r} dr

provided that at least one of the two integrals is finite, where E[ξ(ω)] is the expected value of the fuzzy variable ξ(ω). Following Li and Liu (2008, p.123-129), we give a definition of the entropy of a fuzzy random variable ξ.

Definition 2.4. Suppose that ξ is a discrete fuzzy random variable taking values in {x_1, x_2, ...}. Then its entropy is defined by

H[ξ] = Σ_i S(Ch{ξ = x_i}),

where S(t) = −t ln t − (1 − t) ln(1 − t). If there exists some index k such that Ch{ξ = x_k} = 1 and Ch{ξ = x_i} = 0 for all i ≠ k, then H[ξ] = 0. If ξ is a simple fuzzy random variable taking values in {x_1, x_2, ..., x_n} with Ch{ξ = x_i} = 0.5 for all i = 1, 2, ..., n, then H[ξ] = n ln 2. More generally, for any discrete fuzzy random variable ξ taking values in {x_1, x_2, ...}, we have H[ξ] ≥ 0, with equality if and only if ξ is essentially a deterministic (crisp) number.
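The entropy of Definition 2.4 is straightforward to evaluate once the chance values Ch{ξ = x_i} are known; a minimal sketch, checking the two boundary cases of the definition (a crisp value gives H = 0, and Ch ≡ 0.5 gives H = n ln 2):

```python
import math

def S(t):
    """S(t) = -t ln t - (1 - t) ln(1 - t), with S(0) = S(1) = 0."""
    if t <= 0.0 or t >= 1.0:
        return 0.0
    return -t * math.log(t) - (1.0 - t) * math.log(1.0 - t)

def entropy(chances):
    """H[xi] = sum of S(Ch{xi = x_i}) over all values x_i."""
    return sum(S(ch) for ch in chances)

# A crisp value: Ch = 1 at one point and 0 elsewhere, so H = 0
print(entropy([1.0, 0.0, 0.0]))
# Maximal uncertainty: Ch = 0.5 everywhere, so H = n ln 2
n = 4
print(entropy([0.5] * n), n * math.log(2))
```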

Moment constraints
In this section, we consider discrete fuzzy random variables. Let ξ be a discrete fuzzy random variable taking values in {x_1, x_2, ..., x_n} (in this paper we always assume x_1 < x_2 < ... < x_n) with probabilities {p_1, p_2, ..., p_n} and membership degrees {μ_1, μ_2, ..., μ_n}, respectively, where p_1 + p_2 + ... + p_n = 1. Then the expected value of ξ can be written as

E[ξ] = ω_1 x_1 + ω_2 x_2 + ... + ω_n x_n,

where the weights ω_i are determined by the chance measure Ch{ξ = x_i}. It is easy to verify that all ω_i ≥ 0. Furthermore, V[ξ] = E[(ξ − E[ξ])^2] is called the variance of ξ, and E[ξ^n] the n-th moment of ξ. If the fuzzy random variable ξ reduces to a random variable, i.e., μ_i ≡ 1 for all i, then the expected value reduces to

E[ξ] = p_1 x_1 + p_2 x_2 + ... + p_n x_n,

which is just the expected value of the discrete random variable ξ. Thus, the expected value of a discrete fuzzy random variable is a natural extension of that of a discrete random variable.
Let ξ be a nonnegative discrete fuzzy random variable taking values in {x_1, x_2, ..., x_n} with probabilities {p_1, p_2, ..., p_n} and membership degrees {μ_1, μ_2, ..., μ_n}, respectively, where p_1 + p_2 + ... + p_n = 1, and let k be a positive number. Then E[ξ^k] is the k-th moment of ξ. If the fuzzy random variable ξ reduces to a random variable, i.e., μ_i ≡ 1 for all i ∈ {1, 2, ..., n}, then the k-th moment reduces to

E[ξ^k] = p_1 x_1^k + p_2 x_2^k + ... + p_n x_n^k,

which is just the k-th moment of the discrete random variable ξ. If k = 1, this is just the expected value of ξ.
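In the reduced case μ_i ≡ 1, the k-th moment is the ordinary weighted power sum; a minimal sketch, confirming that k = 1 recovers the expected value:

```python
def kth_moment(xs, ps, k):
    """k-th moment of a discrete random variable: sum of p_i * x_i**k."""
    return sum(p * x**k for x, p in zip(xs, ps))

xs = [1, 2, 3]
ps = [0.2, 0.5, 0.3]
print(kth_moment(xs, ps, 1))  # expected value: 0.2 + 1.0 + 0.9 = 2.1
print(kth_moment(xs, ps, 2))  # second moment: 0.2 + 2.0 + 2.7 = 4.9
```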

Genetic algorithm for general maximum entropy model
A genetic algorithm is a stochastic search method for global optimization problems based on the mechanics of natural selection and natural genetics. Genetic algorithms have demonstrated enormous success in providing good solutions to many complex optimization problems. In this section, we design an effective genetic algorithm, integrated with fuzzy random simulation, for solving the maximum entropy model for discrete fuzzy random variables.
Let ξ be a discrete fuzzy random variable taking values in {x_1, x_2, ..., x_n} with probabilities {p_1, p_2, ..., p_n} and membership degrees {μ_1, μ_2, ..., μ_n}, respectively. We have the natural relations 0 ≤ μ_i ≤ 1, 0 ≤ p_i ≤ 1 and Σ_{1≤i≤n} p_i = 1. By the maximum entropy principle, we obtain the following maximum entropy model (4.1): maximize the entropy H[ξ] over all probabilities {p_i} and membership degrees {μ_i} subject to the expected value constraint and the natural relations above. In general, the expected value constraint can be replaced by other moment constraints. Because the search space of the maximum entropy model (4.1) is particularly irregular, a genetic algorithm is well suited to providing good solutions under complex moment conditions.
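The model can be encoded as a single objective function that returns the entropy for feasible points and −∞ for infeasible ones; a minimal sketch. Both the chance measure Ch{ξ = x_i} = p_i · Cr_i (with Cr_i the credibility of the point x_i) and the induced expected value are illustrative assumptions for this sketch, not the paper's own formulas:

```python
import math

def S(t):
    """Entropy kernel S(t) = -t ln t - (1 - t) ln(1 - t), with S(0) = S(1) = 0."""
    if t <= 0.0 or t >= 1.0:
        return 0.0
    return -t * math.log(t) - (1.0 - t) * math.log(1.0 - t)

def chance(ps, mus):
    """Illustrative chance measure Ch{xi = x_i} = p_i * Cr_i, where
    Cr_i = (mu_i + 1 - max_{j != i} mu_j) / 2.  An assumption for the
    sketch, not the paper's definition."""
    n = len(ps)
    out = []
    for i in range(n):
        other = max(mus[j] for j in range(n) if j != i) if n > 1 else 0.0
        out.append(ps[i] * 0.5 * (mus[i] + 1.0 - other))
    return out

def objective(xs, ps, mus, bound):
    """Entropy of model (4.1); -inf encodes an infeasible point."""
    if abs(sum(ps) - 1.0) > 1e-9:
        return float("-inf")
    if any(not 0.0 <= v <= 1.0 for v in list(ps) + list(mus)):
        return float("-inf")
    chs = chance(ps, mus)
    # Illustrative expected-value constraint E[xi] <= bound
    if sum(ch * x for ch, x in zip(chs, xs)) > bound:
        return float("-inf")
    return sum(S(ch) for ch in chs)

xs = [1, 2, 3, 4, 5, 6]
print(objective(xs, [1.0 / 6] * 6, [1.0] * 6, 3.5))  # feasible, entropy > 0
```

Encoding infeasibility as −∞ lets any maximizer, the genetic algorithm included, treat constraints and objective uniformly.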
As an illustration, the following steps show how the genetic algorithm works.
Step 1: Initialize pop_size feasible chromosomes U^t = {μ^t_1, μ^t_2, ..., μ^t_n} and P^t = {p^t_1, p^t_2, ..., p^t_n}, t = 1, 2, ..., pop_size.

Step 2: Calculate the expected values for all chromosomes U^t and P^t, t = 1, 2, ..., pop_size, respectively. If the expected values do not satisfy the constraints, we regenerate the chromosome to replace the original one until it is feasible.
Step 3: Calculate the entropy of the fuzzy random variable represented by each chromosome. The entropy, denoted by H(U^t, P^t), is used to assign a probability of reproduction to each chromosome pair (U^t, P^t), so that its likelihood of being selected is proportional to its entropy relative to the other chromosomes in the population. That is, chromosomes with larger entropy have a greater chance of producing offspring under roulette wheel selection.
Step 4: Select the chromosomes for a new population by spinning the roulette wheel according to the value of the entropy of all chromosomes.
Step 5: Renew the chromosomes by crossover operations with a predetermined parameter P_c, called the probability of crossover. To determine the parents for the crossover operation, repeat the following process for t = 1 to pop_size: generate a random number r from the interval [0, 1]; the chromosome pair (U^t, P^t) is selected as a parent if r < P_c. We denote the selected parents by U^1, U^2, U^3, ... and P^1, P^2, P^3, ... and divide them into the pairs (U^1, U^2), (U^3, U^4), (U^5, U^6), ...,
(P^1, P^2), (P^3, P^4), (P^5, P^6), .... Let us illustrate the crossover operator on the pair (U^1, U^2) and (P^1, P^2). First, we generate a random number c from the open interval (0, 1). Then the crossover operator on U^1 and U^2, and likewise on P^1 and P^2, produces the two children

X = c·U^1 + (1 − c)·U^2, Y = (1 − c)·U^1 + c·U^2.

Then we check whether the children satisfy the constraints. If both children are feasible, we replace the parents with them.
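The crossover step with a random c from (0, 1) is the standard arithmetic crossover, in which each child is a convex combination of the two parents; a minimal sketch:

```python
import random

def crossover(U1, U2, c=None):
    """Arithmetic crossover: children X = c*U1 + (1-c)*U2 and
    Y = (1-c)*U1 + c*U2 are convex combinations of the parents."""
    if c is None:
        c = random.random()  # random number from the open interval (0, 1)
    X = [c * a + (1.0 - c) * b for a, b in zip(U1, U2)]
    Y = [(1.0 - c) * a + c * b for a, b in zip(U1, U2)]
    return X, Y

P1 = [0.2, 0.5, 0.3]  # a probability chromosome, sums to 1
P2 = [0.4, 0.1, 0.5]
X, Y = crossover(P1, P2)
print(sum(X), sum(Y))  # each sums to 1 (up to rounding)
```

A convenient consequence of this choice is that children of two feasible probability chromosomes automatically satisfy Σ p_i = 1, so only the moment constraints need to be rechecked.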
If not, we keep the feasible child if one exists, and keep the other parent unchanged.

Step 6: Update the chromosomes by mutation operations with a predetermined probability of mutation P_m. In a manner similar to selecting parents for the crossover operation, repeat the following for t = 1 to pop_size: generate a random number r from the interval [0, 1]; the chromosome pair (U^t, P^t) is selected as a parent if r < P_m. For each selected parent, denoted by U^t = {μ^t_1, μ^t_2, ..., μ^t_n} and P^t = {p^t_1, p^t_2, ..., p^t_n}, we mutate it as follows: randomly select entries μ^t_i, p^t_i and μ^t_j, p^t_j of the chromosome and regenerate their values, then renormalize so that Σ_{1≤i≤n} p^t_i = 1 and check feasibility.
Step 7: Repeat Steps 3 to 6 N times, where N is a sufficiently large integer.
Step 8: Report the best chromosome U t and P t as the optimal solution.
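The eight steps above can be sketched as a single program. This is a minimal sketch, assuming an illustrative chance measure Ch{ξ = x_i} = p_i · Cr_i (not the paper's own definition) together with the induced expected-value constraint, and using the data of the numerical example (values {1, ..., 6}, E[ξ] ≤ 3.5):

```python
import math
import random

def S(t):
    return 0.0 if t <= 0.0 or t >= 1.0 else -t * math.log(t) - (1 - t) * math.log(1 - t)

def chance(ps, mus):
    # Illustrative chance measure Ch{xi = x_i} = p_i * Cr_i (an assumption
    # for this sketch): Cr_i = (mu_i + 1 - max_{j != i} mu_j) / 2.
    n = len(ps)
    return [ps[i] * 0.5 * (mus[i] + 1.0 - max(mus[j] for j in range(n) if j != i))
            for i in range(n)]

def entropy(ps, mus):
    return sum(S(ch) for ch in chance(ps, mus))

def feasible(xs, ps, mus, bound):
    ev = sum(ch * x for ch, x in zip(chance(ps, mus), xs))  # illustrative E[xi]
    return ev <= bound

def random_solution(xs, bound):
    # Steps 1-2: generate a chromosome and regenerate until it is feasible.
    while True:
        ps = [random.random() for _ in xs]
        total = sum(ps)
        ps = [p / total for p in ps]          # enforce sum p_i = 1
        mus = [random.random() for _ in xs]
        if feasible(xs, ps, mus, bound):
            return ps, mus

def roulette(pop, fits):
    # Steps 3-4: selection probability proportional to entropy.
    r = random.uniform(0.0, sum(fits))
    acc = 0.0
    for sol, f in zip(pop, fits):
        acc += f
        if acc >= r:
            return sol
    return pop[-1]

def maximize_entropy(xs, bound, pop_size=30, N=200, Pc=0.3, Pm=0.2):
    pop = [random_solution(xs, bound) for _ in range(pop_size)]
    for _ in range(N):
        fits = [entropy(ps, mus) for ps, mus in pop]
        new = [roulette(pop, fits) for _ in range(pop_size)]
        for i in range(0, pop_size - 1, 2):   # Step 5: arithmetic crossover
            if random.random() < Pc:
                (p1, m1), (p2, m2) = new[i], new[i + 1]
                c = random.random()
                for k, w in enumerate((c, 1.0 - c)):
                    kid = ([w * a + (1 - w) * b for a, b in zip(p1, p2)],
                           [w * a + (1 - w) * b for a, b in zip(m1, m2)])
                    if feasible(xs, kid[0], kid[1], bound):
                        new[i + k] = kid
        for i in range(pop_size):             # Step 6: mutation
            if random.random() < Pm:
                ps, mus = list(new[i][0]), list(new[i][1])
                j = random.randrange(len(xs))
                ps[j], mus[j] = random.random(), random.random()
                total = sum(ps)
                ps = [p / total for p in ps]  # renormalize sum p_i = 1
                if feasible(xs, ps, mus, bound):
                    new[i] = (ps, mus)
        pop = new                             # Step 7: iterate N generations
    return max(pop, key=lambda s: entropy(s[0], s[1]))  # Step 8

random.seed(1)
xs = [1, 2, 3, 4, 5, 6]
best_p, best_mu = maximize_entropy(xs, bound=3.5, pop_size=20, N=100)
print(round(entropy(best_p, best_mu), 4))
```

Because the arithmetic crossover preserves Σ p_i = 1 and the mutation renormalizes it explicitly, only the expected-value constraint has to be rechecked when accepting offspring.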

Numerical example
In order to illustrate the effectiveness of the proposed genetic algorithm, let us consider Example 1 (ξ is a discrete fuzzy random variable taking values in {1, 2, 3, 4, 5, 6}, with expected value satisfying E[ξ] ≤ 3.5) and compare the algorithm with respect to different parameters. We compare the solutions obtained for different values of the parameters P_c, P_m, pop_size and N in the genetic algorithm. The maximum entropy values are plotted in the accompanying figure, the results are shown in Tables 1 and 2, and the errors, computed as (actual value − optimal value)/optimal value × 100%, are shown in Table 3. It follows from Table 3 that the error does not exceed 2%, which shows that the proposed algorithm is effective for solving the above model.

Conclusions
In this paper, we extend the idea of Gao and You (2009, p.2353-2361) to solve the problem of maximum entropy functions of discrete fuzzy random variables. With the further development of uncertainty theory, the method may also be applied to other classes of uncertain variables, such as random fuzzy variables. In future work, we will continue to focus on this area.