
It has long been recognized by operational weather forecasters around the globe that uncertainties in the initial conditions used to initialize numerical weather prediction models, as well as errors in the models themselves, result in uncertainties in the forecast. It has also been emphasized in the literature that forecasts may benefit from the greater amount of information contained in a probabilistic forecast than in a deterministic one \cite{n1,n2}. In view of the above, Roulston and Smith \cite{roult} demonstrated how information theory can provide a useful theoretical framework for understanding and quantifying weather and climate predictability.

The modern age of ergodic theory began with the development of the idea of entropy by Rudolf Clausius in 1850; around 1870 it was endowed with a statistical meaning by Ludwig Boltzmann, who established its relation with statistical mechanics. Subsequently, the concept of entropy was advanced in the work of J. Willard Gibbs in thermodynamics and John von Neumann in quantum mechanics. Claude Shannon reintroduced it in information theory in 1948. The entropy provides useful information about the long-term behavior of a random process, and this behavior is the key factor in developing the coding theorems of information theory. For a detailed discussion of the Shannon entropy, see Gray \cite{gray} and the references therein. The idea of entropy is based on the notion of mutual information between two processes, introduced by Shannon as
\begin{equation}\label{1}
I(X,Y)=H(X)+H(Y)-H(X,Y),
\end{equation}
where $X$ and $Y$ are two random variables. The mutual information is interpreted as the information contained in one process minus the information contained in that process when the other is known. Despite the existence of another notion of conditional entropy, the form
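As a small illustration of the identity $I(X,Y)=H(X)+H(Y)-H(X,Y)$, the following sketch computes the mutual information of two discrete random variables from their joint distribution. The joint probability table `p_xy` is a made-up example, not taken from the text; the entropy function is the standard Shannon entropy in bits.

```python
import numpy as np

# Hypothetical joint distribution p(x, y) of two binary random
# variables (illustrative values only, not from the text).
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

def entropy(p):
    """Shannon entropy in bits, skipping zero-probability cells."""
    p = np.asarray(p).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Marginal distributions of X and Y.
p_x = p_xy.sum(axis=1)
p_y = p_xy.sum(axis=0)

# Mutual information via I(X,Y) = H(X) + H(Y) - H(X,Y).
mi = entropy(p_x) + entropy(p_y) - entropy(p_xy)
```

For this table both marginals are uniform, so $H(X)=H(Y)=1$ bit, and the dependence between the variables makes $H(X,Y)$ less than 2 bits, giving a strictly positive mutual information; for independent variables the same computation returns zero.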