Abstract
Uniform emitter discharge is one of the key factors ensuring the desirable operation of drip irrigation systems, and it is affected by operating pressure and water temperature. Accurate estimation of this parameter is therefore crucial for optimal management and operation of the irrigation system. In this research, emitter outflow discharge was simulated with artificial intelligence (AI)-based approaches over a wide range of water temperatures (13-53 °C) and operating pressures (0-240 kPa). The applied AI models were artificial neural networks (ANN), neuro-fuzzy with subtractive clustering (NF-SC), neuro-fuzzy with fuzzy c-means clustering (NF-FCM), and the least squares support vector machine (LS-SVM). The input matrix consisted of operating pressure, water temperature, discharge coefficient, pressure exponent, and nominal discharge, while the output of the models was the ratio of measured discharge to nominal discharge (the modified coefficient, M). The models were assessed with a robust k-fold scan of the testing data, and the Global Performance Indicator (GPI), combining five statistical indices, was used for the final ranking. The results showed that all applied AI models estimated the modified coefficient M with acceptable accuracy, with an average mean absolute error (MAE) of 8.8%. According to the GPI, the LS-SVM model had the lowest error, followed closely by the NF-SC model.
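To make the modeling setup concrete, the sketch below illustrates, under stated assumptions, how the target M and the k-fold MAE assessment described above could be computed. The synthetic data, the feature ranges, and the use of scikit-learn's KernelRidge as a stand-in for the paper's LS-SVM are assumptions for illustration only, not the authors' actual pipeline or results.

```python
# Hypothetical sketch of the abstract's workflow: build the input matrix
# (pressure, temperature, discharge coefficient, pressure exponent, nominal
# discharge), form the target M = measured / nominal discharge, and evaluate
# a kernel regressor with a k-fold scan of the testing data using MAE.
import numpy as np
from sklearn.model_selection import KFold
from sklearn.kernel_ridge import KernelRidge
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
n = 200

# Assumed feature ranges, loosely matching those reported in the abstract.
P = rng.uniform(0, 240, n)        # operating pressure (kPa)
T = rng.uniform(13, 53, n)        # water temperature (degC)
k = rng.uniform(0.5, 1.5, n)      # discharge coefficient
x = rng.uniform(0.4, 0.6, n)      # pressure exponent
q_nom = rng.uniform(2.0, 8.0, n)  # nominal discharge (L/h)

# Synthetic "measured" discharge following the emitter power law q = k * P**x
# with noise, purely to make the sketch runnable; M is the modified coefficient.
q_meas = k * P**x * (1 + 0.05 * rng.standard_normal(n))
M = q_meas / q_nom

X = np.column_stack([P, T, k, x, q_nom])

# Five-fold scan of the testing data, reporting the MAE of each fold.
maes = []
for train_idx, test_idx in KFold(n_splits=5, shuffle=True, random_state=0).split(X):
    model = KernelRidge(kernel="rbf", alpha=1.0, gamma=0.1)  # LS-SVM stand-in
    model.fit(X[train_idx], M[train_idx])
    maes.append(mean_absolute_error(M[test_idx], model.predict(X[test_idx])))

print(f"mean MAE across folds: {np.mean(maes):.3f}")
```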