A Novel Method for Computationally Efficacious Linear and Polynomial Regression Analytics of Big Data in Medicine

Ahmed Al-Imam


Background: Machine learning relies on a combination of analytic techniques, including regression analysis. To date, there have been no attempts to deploy a scale-down transformation of data to enhance linear regression models. Objectives: We aim to optimize linear regression models by implementing a data transformation function that scales down all variables in an attempt to minimize the sum of squared errors. Materials and Methods: We implemented non-Bayesian statistics using SPSS and MATLAB. We used Excel to generate 40 trials of linear regression models, each with 1,000 observations. We used SPSS to conduct regression analyses, the Wilcoxon signed-rank test, and Cronbach’s alpha statistics to evaluate the performance of the optimization model. Results: The scale-down transformation significantly reduced the sum of squared errors [absolute Z-score = 5.511, effect size = 0.779, p-value < 0.001, Wilcoxon signed-rank test]. Inter-item reliability testing confirmed the robust internal consistency of the model [Cronbach’s alpha = 0.993]. Conclusions: The optimization model is valuable for high-impact research based on regression, and it can reduce computational processing demands for powerful real-time and predictive analytics of big data.
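The procedure the abstract describes — fitting linear regressions over repeated trials, applying a scale-down transformation to all variables, and comparing the resulting sums of squared errors with a paired Wilcoxon signed-rank test — can be sketched as follows. This is a minimal illustration in Python (NumPy/SciPy standing in for SPSS, MATLAB, and Excel); the scale-down factor `k = 10`, the synthetic data-generating model, and the helper names are assumptions for illustration, not the author's actual implementation.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def sse_of_fit(x, y):
    # Ordinary least-squares fit y ~ a + b*x; return the sum of squared errors.
    b, a = np.polyfit(x, y, 1)
    residuals = y - (a + b * x)
    return float(np.sum(residuals ** 2))

def scale_down(v, k=10.0):
    # Hypothetical scale-down transformation: divide every variable by a constant k.
    return v / k

# 40 trials of 1,000 observations each, mirroring the described setup;
# the linear model y = 3x + 2 + noise is an assumed example.
n_trials, n_obs = 40, 1000
sse_raw, sse_scaled = [], []
for _ in range(n_trials):
    x = rng.normal(size=n_obs)
    y = 3.0 * x + 2.0 + rng.normal(scale=1.5, size=n_obs)
    sse_raw.append(sse_of_fit(x, y))
    sse_scaled.append(sse_of_fit(scale_down(x), scale_down(y)))

# Paired non-parametric comparison of SSE before vs. after scaling.
statistic, p_value = stats.wilcoxon(sse_raw, sse_scaled)
```

Note that dividing both variables by a constant k rescales each OLS residual by 1/k, so the SSE shrinks by a factor of k² for every trial; the Wilcoxon signed-rank test then registers this as a highly significant paired difference.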

This work is licensed under a Creative Commons Attribution 4.0 License.