The coefficient of determination, R², is a very popular measure of association between two variables. However, R² measures only the degree of linear association and cannot be applied to nonlinear models. An alternative approach is to use another measure of dependence between two variables, called mutual information (MI). The attractive feature of MI is that it is capable of measuring both linear and nonlinear relationships, making it more appropriate for complex nonlinear systems. The value of MI ranges from 0 to infinity: when two variables are perfectly dependent, MI approaches infinity, while a value of MI close to zero indicates no association between the two variables. MI is therefore unbounded, in contrast to the correlation coefficient, which is bounded between -1 and 1. Because MI is unbounded, it does not provide a clear picture of the degree of association between variables. In this paper, an attempt has been made to restructure MI so that its value ranges from 0 to 1. We call the restructured MI the Relative MI (RMI), and it is now bounded. An RMI value close to 1 indicates strong dependence, while a value close to 0 signifies weak dependence between the variables. The advantage of the RMI is that it provides an easier interpretation of the dependence between two variables. The computation of the RMI is based on the nonparametric kernel method, which characterizes the joint probability distribution of the variables.