Comparison of tree-based model with deep learning model in predicting effluent pH and concentration by capacitive deionization

Publication Type

Journal Article

Publication Date (Issue Year)

2023

Journal Name

Desalination

Abstract

Capacitive deionization (CDI) is an emerging technique for water treatment and electroadsorption processes (e.g., brackish water desalination). Various numerical modeling methods have been developed to predict and optimize CDI performance, and artificial intelligence techniques have recently been applied to overcome the limitations of numerical models, such as the difficulty of capturing all environmental complexities. However, complex neural networks (i.e., deep learning (DL)) have their own limitations: their architectures are difficult to design, they take a long time to train, and they require massive computing resources. Therefore, in this study, a tree-based model, which is more effective than a neural network model for processing tabular data, was developed to predict effluent pH and concentration in the CDI process. Using a binary feature concept, the tree-based ensemble models achieved a remarkably lower computational cost (100 times less than the DL model) with almost the same prediction accuracy (R2 = 0.998 for the steady random forest model and R2 = 0.986 for the DL model). These findings will contribute to further examining the use of tree-based models for predicting and optimizing the CDI process, reducing computing requirements and minimizing modeling complexity.
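The approach summarized above (a tree-based ensemble predicting effluent concentration and pH from tabular operating data) can be sketched as follows. This is a minimal illustration, not the paper's actual pipeline: the feature set (voltage, flow rate, influent concentration, and a binary charging/discharging flag standing in for the "binary feature concept") and the synthetic data are assumptions for demonstration only.

```python
# Hypothetical sketch of tree-based regression on CDI-like tabular data.
# All features, targets, and data here are synthetic placeholders.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000
# Assumed operating variables: voltage (V), flow rate (mL/min),
# influent concentration (mM), and a binary charging/discharging flag.
X = np.column_stack([
    rng.uniform(0.8, 1.4, n),
    rng.uniform(5, 25, n),
    rng.uniform(5, 50, n),
    rng.integers(0, 2, n).astype(float),
])
# Synthetic targets loosely mimicking effluent concentration and pH.
y_conc = X[:, 2] - 4.0 * X[:, 0] * X[:, 3] + rng.normal(0, 0.5, n)
y_ph = 7.0 + 0.8 * (1 - X[:, 3]) - 0.3 * X[:, 0] + rng.normal(0, 0.05, n)
y = np.column_stack([y_conc, y_ph])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_tr, y_tr)
r2 = r2_score(y_te, model.predict(X_te))  # averaged over both targets
```

A multi-output `RandomForestRegressor` handles both targets in one model; on real CDI data the reported accuracies would depend on the actual features and binary-feature encoding used in the study.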

Keywords

Capacitive deionization, Prediction, Effluent concentration, pH, Tree-based ensemble model, Deep learning

Rsif Scholar Name

Bethwel Kipchirchir Tarus

Rsif Scholar Nationality

Kenya

Cohort

Cohort 2

Thematic Area

Minerals, Mining and Materials Engineering

Africa Host University (AHU)

Nelson Mandela African Institution of Science and Technology (NM-AIST), Tanzania

Funding Statement

This work was supported by a National Research Foundation of Korea (NRF) grant funded by the Korean government (MSIT) (No. 2021R1C1C2005643). This study was supported by the Institutional Program of KIST (grant numbers 2E31932, 2E31933, and 2E32442).
