Mutual information (MI) can effectively capture the dependencies between variables in nonlinear models, and it is widely used as a measure in feature selection. In contrast to the linear correlation coefficient, which often forms the basis of empirical input variable selection methods, MI can measure both linear and nonlinear dependencies, making it more appropriate for complex nonlinear systems. A high MI score indicates a strong dependency between two variables. When two continuous variables are perfectly dependent, the MI tends to infinity; for a bivariate Gaussian, for instance, I(X;Y) = −(1/2) ln(1 − ρ²), which diverges as |ρ| → 1. MI is bounded below by zero but has no finite upper bound, unlike the correlation coefficient, which is confined to the interval [−1, 1]. Thus, MI is theoretically unbounded from above. In this paper we investigate the behavior of MI for linear and nonlinear models.
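
As a minimal illustrative sketch (not part of the present study), the contrast between the two measures can be demonstrated numerically. The example below assumes scikit-learn's kNN-based mutual_info_regression estimator and NumPy's corrcoef; the synthetic linear and nonlinear relationships are hypothetical choices made purely for illustration. For the symmetric nonlinear relationship, the Pearson correlation is close to zero while the estimated MI remains clearly positive, which is the behavior described above.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_regression

rng = np.random.default_rng(0)
x = rng.uniform(-3.0, 3.0, size=2000)
noise = rng.normal(scale=0.1, size=x.size)

# Hypothetical test relationships: one linear, one nonlinear (even function,
# so its linear correlation with x is near zero by symmetry).
y_linear = 2.0 * x + noise
y_nonlinear = np.sin(x) ** 2 + noise

for name, y in [("linear", y_linear), ("nonlinear", y_nonlinear)]:
    r = np.corrcoef(x, y)[0, 1]                 # Pearson correlation, in [-1, 1]
    mi = mutual_info_regression(                 # kNN-based MI estimate, >= 0
        x.reshape(-1, 1), y, random_state=0
    )[0]
    print(f"{name:9s}  Pearson r = {r:+.3f}   MI estimate = {mi:.3f}")
```

Note that the printed MI values are estimates in nats and depend on the estimator and sample size; they are bounded below by zero but, unlike the correlation coefficient, have no fixed upper limit.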