Showing 401 - 420 results of 449 for search '(improved OR improve) (coot OR root) optimization algorithm', query time: 0.30s
  1. 401

    Rapid Detection of Key Phenotypic Parameters in Wheat Grains Using Linear Array Camera by Wenjing Zhu, Kaiwen Duan, Xiao Li, Kai Yu, Changfeng Shao

    Published 2025-05-01
    “…For the comprehensive grain length of five wheat varieties estimated with the extraction algorithm developed in this study, the determination coefficient and root mean square error relative to manual measurements were 0.986 and 0.0887, respectively. …”
    Get full text
    Article
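A minimal, illustrative sketch of how the two metrics quoted above (the determination coefficient and the root mean square error against manual measurements) are typically computed; the grain-length values below are invented placeholders, not data from the cited study.

```python
# Illustrative only: compute R^2 and RMSE between algorithm estimates and
# manual measurements. Sample values are made-up placeholders.
import numpy as np

def r_squared(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

def rmse(y_true, y_pred):
    """Root mean square error."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

manual = [6.21, 6.05, 5.88, 6.40, 6.12]      # placeholder manual grain lengths (mm)
estimated = [6.25, 6.01, 5.95, 6.33, 6.10]   # placeholder algorithm estimates (mm)
print(r_squared(manual, estimated), rmse(manual, estimated))
```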
  2. 402

    Post-integration based point-line feature visual SLAM in low-texture environments by Yanli Liu, Zhengyuan Feng, Heng Zhang, Wang Dong

    Published 2025-04-01
    “…To address the weak robustness and low accuracy of traditional SLAM data-processing algorithms in low-texture environments such as low light and low contrast, this paper first improves the feature extraction method, optimizing the AGAST-based algorithm to adjust its extraction threshold adaptively according to the gradient magnitude of different features. …”
    Get full text
    Article
  3. 403

    Explainable Ensemble Learning Model for Residual Strength Forecasting of Defective Pipelines by Hongbo Liu, Xiangzhao Meng

    Published 2025-04-01
    “…The development of this model contributes to improving the integrity management of oil and gas pipelines and provides decision support for the intelligent management of defective pipelines in oil and gas fields.…”
    Get full text
    Article
  4. 404

    Elastic net with Bayesian Density Estimation model for feature selection for photovoltaic energy prediction by Venkatachalam Mohanasundaram, Balamurugan Rangaswamy

    Published 2025-03-01
    “…Research investigations demonstrate that the ELNET-BDE model attains significantly lower Mean Absolute Error (MAE) and Root Mean Square Error (RMSE) than contesting Machine Learning (ML) algorithms like Artificial Neural Network (ANN), Support Vector Machine (SVM), Random Forest (RF), and Gradient Boosting Machines (GBM). …”
    Get full text
    Article
  5. 405
  6. 406

    Estimation of Current RMS for DC Link Capacitor of S-PMSM Drive System by ZHANG Zhigang, CHANG Jiamian, ZHANG Pengcheng

    Published 2023-10-01
    “…The Cotes method eliminates numerous integration calculations, thus improving calculation accuracy. The proposed technique simplifies the tedious calculation process of traditional algorithms and guarantees high calculation accuracy, providing guidance for optimizing the selection of DC link capacitors and the design of life monitoring controllers. …”
    Get full text
    Article
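To make the quadrature idea concrete, here is a small sketch of estimating an RMS value with the five-point Newton–Cotes (Cotes) rule; the ripple waveform, period, and segment count are assumptions for illustration and are not parameters from the cited drive-system study.

```python
# Illustrative sketch: RMS = sqrt((1/T) * integral of i(t)^2 dt), with the
# integral approximated piecewise by the five-point Newton-Cotes (Cotes) rule.
import math

def cotes_5pt(f, a, b):
    """Five-point Newton-Cotes rule: (b-a)/90 * (7f0 + 32f1 + 12f2 + 32f3 + 7f4)."""
    h = (b - a) / 4.0
    x = [a + k * h for k in range(5)]
    w = [7, 32, 12, 32, 7]
    return (b - a) / 90.0 * sum(wk * f(xk) for wk, xk in zip(w, x))

def rms(i, period, segments=8):
    """Root mean square of i(t) over one period, integrated segment by segment."""
    seg = period / segments
    energy = sum(cotes_5pt(lambda t: i(t) ** 2, k * seg, (k + 1) * seg)
                 for k in range(segments))
    return math.sqrt(energy / period)

# Placeholder ripple current: 10 A at 50 Hz plus a 3 A third harmonic.
i = lambda t: 10 * math.sin(2 * math.pi * 50 * t) + 3 * math.sin(2 * math.pi * 150 * t)
print(rms(i, period=1 / 50))   # analytic value: sqrt(10**2/2 + 3**2/2) ≈ 7.38 A
```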
  7. 407

    PSO Tuned Super-Twisting Sliding Mode Controller for Trajectory Tracking Control of an Articulated Robot by Zewdalem Abebaw Ayinalem, Abrham Tadesse Kassie

    Published 2025-01-01
    “…Numerical simulations revealed that the tracking error and root mean square error (RMSE) improvements were approximately 18.33%, 16.66%, and 14.29% for PSO–STSMC compared to STSMC, and 79.50%, 78.04%, and 25.0% compared to PSO–SMC for each of the three joints under ideal conditions, respectively. …”
    Get full text
    Article
  8. 408

    Edge-Fog Computing-Based Blockchain for Networked Microgrid Frequency Support by Ying-Yi Hong, Francisco I. Alano, Yih-der Lee, Chia-Yu Han

    Published 2025-01-01
    “…The parameters and hyperparameters of the LSTM-MFPC are optimized using the Bayesian Adaptive Direct Search (BADS) algorithm. …”
    Get full text
    Article
  9. 409

    Comparison of Machine Learning Methods for Predicting Electrical Energy Consumption by Retno Wahyusari, Sunardi Sunardi, Abdul Fadlil

    Published 2025-02-01
    “…Data pre-processing, specifically min-max normalization, is crucial for improving the accuracy of distance-based algorithms like KNN. …”
    Get full text
    Article
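A minimal sketch of the pre-processing step highlighted in the entry above: min-max normalization rescales each feature to [0, 1] so that no single feature dominates the Euclidean distances used by KNN. The feature columns and values are invented placeholders.

```python
# Illustrative only: rescale every column of X to the [0, 1] range.
import numpy as np

def min_max_normalize(X):
    """Min-max normalization per column: (x - min) / (max - min)."""
    X = np.asarray(X, dtype=float)
    col_min, col_max = X.min(axis=0), X.max(axis=0)
    span = np.where(col_max > col_min, col_max - col_min, 1.0)  # avoid divide-by-zero
    return (X - col_min) / span

# Placeholder consumption features: [monthly kWh, household size, floor area in m^2]
X = [[320, 4, 95], [110, 1, 40], [540, 5, 150]]
print(min_max_normalize(X))
```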
  10. 410

    Deep Mining on the Formation Cycle Features for Concurrent SOH Estimation and RUL Prognostication in Lithium-Ion Batteries by Dongchen Yang, Weilin He, Xin He

    Published 2025-04-01
    “…Models that integrate all formation-related data yielded the lowest root mean square error (RMSE) of 2.928% for capacity estimation and 16 cycles for RUL prediction, highlighting the significant role of surface-level physical features in improving accuracy. …”
    Get full text
    Article
  11. 411

    Rapid Quality Assessment of Polygoni Multiflori Radix Based on Near-Infrared Spectroscopy by Bin Jia, Ziying Mai, Chaoqun Xiang, Qiwen Chen, Min Cheng, Longkai Zhang, Xue Xiao

    Published 2024-01-01
    “…After optimizing the model using CARS, R2C increased by 0.15%, 0.41%, and 0.34%, RMSECV decreased by 0.53%, 0.32%, and 0.24%, R2P increased by 0.21%, 0.63%, and 0.35%, RMSEP decreased by 0.36%, 0.41%, and 0.31%, and RPD increased by 1.1, 0.9, and 0.6, significantly improving the predictive capacity of the model. …”
    Get full text
    Article
  12. 412

    Prediction Model of Household Carbon Emission in Old Residential Areas in Drought and Cold Regions Based on Gene Expression Programming by Shiao Chen, Yaohui Gao, Zhaonian Dai, Wen Ren

    Published 2025-07-01
    “…., electricity usage and heating energy consumption) were selected using Pearson correlation analysis and the Random Forest (RF) algorithm. Subsequently, a hybrid prediction model was constructed, with its parameters optimized by minimizing the root mean square error (RMSE) as the fitness function. …”
    Get full text
    Article
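As a rough illustration of using RMSE as the fitness function in parameter optimization, the sketch below scores candidate parameters by their RMSE and keeps the best one; the linear surrogate model and the random-search loop are assumptions standing in for the paper's GEP-based procedure, not a reproduction of it.

```python
# Illustrative only: minimize RMSE(y, model(X; params)) over candidate parameters.
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(50, 2))                          # placeholder predictors
y = 3.0 * X[:, 0] + 1.5 * X[:, 1] + rng.normal(0, 0.1, 50)   # placeholder emissions

def rmse_fitness(params):
    """Fitness = RMSE of a simple linear surrogate with weight vector `params`."""
    pred = X @ params
    return float(np.sqrt(np.mean((y - pred) ** 2)))

best_params, best_rmse = None, np.inf
for _ in range(2000):               # crude random search standing in for the optimizer
    candidate = rng.uniform(0, 5, size=2)
    score = rmse_fitness(candidate)
    if score < best_rmse:
        best_params, best_rmse = candidate, score
print(best_params, best_rmse)
```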
  13. 413

    Predicting hydrocarbon reservoir quality in deepwater sedimentary systems using sequential deep learning techniques by Xiao Hu, Jun Xie, Xiwei Li, Junzheng Han, Zhengquan Zhao, Hamzeh Ghorbani

    Published 2025-07-01
    “…Three sequential deep learning models—Recurrent Neural Network, Long Short-Term Memory, and Gated Recurrent Unit—were developed and optimized using the Adam algorithm. The Adam-LSTM model outperformed the others, achieving a Root Mean Square Error of 0.009 and a coefficient of determination (R2) of 0.9995, indicating excellent predictive performance. …”
    Get full text
    Article
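Since the entry above (and several others on this page) names the Adam optimizer, here is a self-contained sketch of a single Adam update step applied to a toy quadratic loss; the hyperparameters are the commonly used defaults, stated as assumptions rather than the paper's settings.

```python
# Illustrative only: one Adam step = update biased moment estimates, correct the
# bias, then move the parameters against the rescaled gradient.
import numpy as np

def adam_step(theta, grad, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    """Return updated (theta, m, v) after one Adam step at iteration t (1-based)."""
    m = b1 * m + (1 - b1) * grad
    v = b2 * v + (1 - b2) * grad ** 2
    m_hat = m / (1 - b1 ** t)
    v_hat = v / (1 - b2 ** t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

theta = np.array([5.0, -3.0])             # toy parameters
m, v = np.zeros_like(theta), np.zeros_like(theta)
for t in range(1, 2001):                  # minimize ||theta||^2; gradient is 2*theta
    theta, m, v = adam_step(theta, 2 * theta, m, v, t, lr=0.05)
print(theta)                              # converges toward [0, 0]
```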
  14. 414

    Predictive modelling of hexagonal boron nitride nanosheets yield through machine and deep learning: An ultrasonic exfoliation parametric evaluation by Jerrin Joy Varughese, Sreekanth M.S.

    Published 2025-03-01
    “…A suite of machine learning regression models including Adaptive Boosting (AdaBoost) Regressor, Random Forest (RF) Regressor, Linear Regressor (LR), and Classification and Regression Tree (CART) Regressor, was employed alongside a deep neural network (DNN) architecture optimized using various algorithms such as Adaptive Moment Estimation (Adam), Root Mean Square Propagation (RMS Prop), Stochastic Gradient Descent (SGD), and Limited-memory Broyden-Fletcher-Goldfarb-Shanno (L-BFGS). …”
    Get full text
    Article
  15. 415

    Predicting hospital outpatient volume using XGBoost: a machine learning approach by Lingling Zhou, Qin Zhu, Qian Chen, Ping Wang, Hao Huang

    Published 2025-05-01
    “…Accurate prediction of outpatient demand can significantly enhance operational efficiency and optimize the allocation of medical resources. This study aims to develop a predictive model for daily hospital outpatient volume using the XGBoost algorithm. …”
    Get full text
    Article
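For orientation, a hedged sketch of the kind of model the entry above describes: an XGBoost regressor fitted to a synthetic daily-volume series built from simple calendar features. The features, hyperparameters, and data are assumptions made for illustration, not the study's configuration.

```python
# Illustrative only: fit an XGBoost regressor to synthetic daily outpatient volumes
# and report hold-out RMSE. Requires the xgboost package.
import numpy as np
from xgboost import XGBRegressor

rng = np.random.default_rng(42)
days = np.arange(730)
X = np.column_stack([days % 7, days % 365])                    # weekday index, day of year
y = 800 + 120 * (X[:, 0] < 5) + rng.normal(0, 30, len(days))   # synthetic volumes

model = XGBRegressor(n_estimators=200, max_depth=4, learning_rate=0.1)
model.fit(X[:600], y[:600])                                    # train on the first ~20 months
pred = model.predict(X[600:])
print(float(np.sqrt(np.mean((y[600:] - pred) ** 2))))          # hold-out RMSE
```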
  16. 416

    Calibration of the Composition of Low-Alloy Steels by the Interval Partial Least Squares Using Low-Resolution Emission Spectra with Baseline Correction by M. V. Belkov, K. Y. Catsalap, M. A. Khodasevich, D. A. Korolko, A. V. Aseev

    Published 2024-04-01
    “…Further improvement of calibration accuracy was achieved by using the adaptive iteratively reweighted penalized least squares algorithm for spectrum baseline correction. …”
    Get full text
    Article
  17. 417

    Multi-Fidelity Machine Learning for Identifying Thermal Insulation Integrity of Liquefied Natural Gas Storage Tanks by Wei Lin, Meitao Zou, Mingrui Zhao, Jiaqi Chang, Xiongyao Xie

    Published 2024-12-01
    “…The results of the data experiments demonstrate that the multi-fidelity framework outperforms models trained solely on low- or high-fidelity data, achieving a coefficient of determination of 0.980 and a root mean square error of 0.078 m. Three machine learning algorithms—Multilayer Perceptron, Random Forest, and Extreme Gradient Boosting—were evaluated to determine the optimal implementation. …”
    Get full text
    Article
  18. 418

    Development of Advanced Machine Learning Models for Predicting CO₂ Solubility in Brine by Xuejia Du, Ganesh C. Thakur

    Published 2025-02-01
    “…The results underscore the potential of ML models to significantly enhance prediction accuracy over a wide data range, reduce computational costs, and improve the efficiency of CCUS operations. This work demonstrates the robustness and adaptability of ML approaches for modeling complex subsurface conditions, paving the way for optimized carbon sequestration strategies.…”
    Get full text
    Article
  19. 419
  20. 420

    Research on rock burst prediction based on an integrated model by Junming Zhang, Qiyuan Xia, Hai Wu, Sailei Wei, Zhen Hu, Bing Du, Yuejing Yang, Huaixing Xiong

    Published 2025-05-01
    “…Additionally, the sparrow search algorithm (SSA) is employed to optimize hyperparameters, further improving the model’s performance. …”
    Get full text
    Article