In recent years, the design of experiments has emerged as a dynamic research field, attracting significant attention from scholars and practitioners. Experimental outcomes inherently exhibit variability due to measurement error and the complex, non-linear dependence of system responses on unidentified input factors. Within this context, the Taguchi method, with its use of orthogonal arrays, offers an effective framework for identifying optimal input parameters from a reduced number of experiments, typically validated against empirical test data. Conventional statistical techniques, such as the modified Taguchi model and response surface methodology, remain widely employed for parameter estimation and optimization; however, recent advances in machine learning offer powerful alternatives. In this study, support vector regression, random forest regression, and XGBoost regression models were compared with these traditional approaches to assess their relative efficiency. The machine learning–based methodologies achieved superior predictive accuracy while significantly reducing experimental cost, preserving essential process insight, and minimizing performance variability. Among the models evaluated, XGBoost regression delivered the most reliable performance, with the lowest prediction error and a very high coefficient of determination (R² = 0.99).
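The model comparison described above can be sketched as follows. This is a minimal illustration, not the study's actual pipeline: the data are synthetic stand-ins for a designed experiment with three input factors, the hyperparameters are arbitrary, and scikit-learn's `GradientBoostingRegressor` is used as a readily available stand-in for XGBoost. Each model is fit on a training split and scored by the coefficient of determination R² on held-out data.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

# Synthetic stand-in for a small designed experiment:
# three input factors, a smooth non-linear response, and additive noise.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(200, 3))
y = np.sin(3.0 * X[:, 0]) + X[:, 1] ** 2 + 0.5 * X[:, 2] \
    + rng.normal(0.0, 0.05, size=200)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# Candidate regressors; gradient boosting stands in for XGBoost here.
models = {
    "Support vector regression": SVR(C=10.0),
    "Random forest regression": RandomForestRegressor(
        n_estimators=200, random_state=0
    ),
    "Gradient boosting regression": GradientBoostingRegressor(
        n_estimators=300, random_state=0
    ),
}

# Fit each model and report held-out R² as the comparison metric.
scores = {}
for name, model in models.items():
    model.fit(X_train, y_train)
    scores[name] = r2_score(y_test, model.predict(X_test))
    print(f"{name}: R^2 = {scores[name]:.3f}")
```

In a real study, the same loop would run over the actual experimental design points, and cross-validation would replace the single train/test split to account for the small sample sizes typical of orthogonal-array experiments.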