Prediction of Tanzanian Energy Demand using Support Vector Machine for Regression (SVR)

International Journal of Computer Applications
© 2015 by IJCA Journal
Volume 109 - Number 3
Year of Publication: 2015
Authors: Baraka Kichonge, Geoffrey R. John, Thomas Tesha, Iddi S. N. Mkilaha
DOI: 10.5120/19172-0643

Baraka Kichonge, Geoffrey R. John, Thomas Tesha and Iddi S. N. Mkilaha. Article: Prediction of Tanzanian Energy Demand using Support Vector Machine for Regression (SVR). International Journal of Computer Applications 109(3):34-39, January 2015. Full text available. BibTeX

@article{key:article,
	author = {Baraka Kichonge and Geoffrey R. John and Thomas Tesha and Iddi S. N. Mkilaha},
	title = {Article: Prediction of Tanzanian Energy Demand using Support Vector Machine for Regression (SVR)},
	journal = {International Journal of Computer Applications},
	year = {2015},
	volume = {109},
	number = {3},
	pages = {34-39},
	month = {January},
	note = {Full text available}
}

Abstract

This study examines the influence of economic, energy and environment indicators on the prediction of energy demand for Tanzania using support vector machine for regression (SVR). Models based on economic, energy and environment indicators were formulated from time-series data. The experimental results showed the superiority of the polynomial-SVR kernel function and of the energy indicators model, which provided the transformation that achieved the most accurate predictions. The energy indicators model had a correlation coefficient (CC) of 0.999, compared to 0.9975 and 0.9952 with PUKF-SVR kernels for the economic and environment indicators models. The predicted values of the energy indicators model were also the closest to the actual values, and its root mean squared error (RMSE), mean absolute error (MAE), root relative squared error (RRSE) and relative absolute error (RAE) were the lowest of the three models. Long-run sustainable development of the energy sector can be supported by using the SVR algorithm as a tool for predicting future energy demand.
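To make the modelling pipeline concrete, the sketch below (not the authors' code) fits an SVR with a polynomial kernel to synthetic yearly indicator data and computes the evaluation measures named in the abstract. It assumes Python with NumPy and scikit-learn; the data, feature count, chronological split and hyper-parameters are placeholders, and the PUKF (Pearson VII universal kernel) variant is omitted because it is not built into scikit-learn.

```python
# Illustrative sketch only: polynomial-kernel SVR on synthetic indicator data
# with the error measures used in the paper (CC, RMSE, MAE, RRSE, RAE).
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Hypothetical time series: rows are years, columns are indicators
# (e.g. energy consumption by source); y is total energy demand.
rng = np.random.default_rng(0)
X = rng.normal(size=(30, 4))                                 # 30 years x 4 indicators (synthetic)
y = X @ np.array([2.0, 1.5, 0.5, 1.0]) + rng.normal(scale=0.1, size=30)

train, test = slice(0, 24), slice(24, 30)                    # simple chronological split

model = make_pipeline(
    StandardScaler(),
    SVR(kernel="poly", degree=2, C=10.0, epsilon=0.01),      # placeholder hyper-parameters
)
model.fit(X[train], y[train])
pred = model.predict(X[test])
actual = y[test]

# Evaluation measures (relative measures use the mean of the actual test values)
cc   = np.corrcoef(actual, pred)[0, 1]                       # correlation coefficient
rmse = np.sqrt(np.mean((actual - pred) ** 2))                # root mean squared error
mae  = np.mean(np.abs(actual - pred))                        # mean absolute error
rrse = np.sqrt(np.sum((actual - pred) ** 2)
               / np.sum((actual - actual.mean()) ** 2))      # root relative squared error
rae  = (np.sum(np.abs(actual - pred))
        / np.sum(np.abs(actual - actual.mean())))            # relative absolute error

print(f"CC={cc:.4f}  RMSE={rmse:.4f}  MAE={mae:.4f}  RRSE={rrse:.4f}  RAE={rae:.4f}")
```

In this setup, comparing indicator groups amounts to refitting the same pipeline on different feature subsets and comparing the resulting measures, which mirrors how the economic, energy and environment models are contrasted in the abstract.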

References

  • Kichonge, B., John, G. R., Mkilaha, I. S. and Sameer, H. (2014) Modelling of Future Energy Demand for Tanzania, Journal of Energy Technologies and Policy, Vol. 4, Issue 7.
  • Pai, P. and Hong, W. (2005) Forecasting regional electricity load based on recurrent support vector machines with genetic algorithms, Electric Power Systems Research, 74, 417-425.
  • Pai, P. and Hong, W. (2005) Support vector machines with simulated annealing algorithms in electricity load forecasting, Energy Conversion and Management, 46, 2669-2688.
  • Xie, W., Yu, L., Xu, S. and Wang, S. (2006) A new method for crude oil price forecasting based on support vector machines, Computational Science – ICCS 2006, pp. 444-451 (Springer).
  • Mohandes, M., Halawani, T., Rehman, S. and Hussain, A. A. (2004) Support vector machines for wind speed prediction, Renewable Energy, 29, 939-947.
  • Maji, S., Berg, A. C. and Malik, J. (2008) Classification using intersection kernel support vector machines is efficient, paper presented at the IEEE Conference on Computer Vision and Pattern Recognition (CVPR 2008).
  • Xue, C., Li, F., He, T. et al. (2005) Classification of real and pseudo microRNA precursors using local structure-sequence features and support vector machine, BMC Bioinformatics, 6, 310.
  • Olson, D. L. and Delen, D. (2008) Advanced Data Mining Techniques (Springer-Verlag Berlin Heidelberg).
  • Goel, A. (2009) Application of SVMs algorithms for prediction of evaporation in reservoirs, paper presented at the World Environmental and Water Resources Congress.
  • Üstün, B., Melssen, W. J. and Buydens, L. M. (2006) Facilitating the application of Support Vector Regression by using a universal Pearson VII function based kernel, Chemometrics and Intelligent Laboratory Systems, 81, 29-40.
  • Burges, C. J. (1998) A tutorial on support vector machines for pattern recognition, Data Mining and Knowledge Discovery, 2, 121-167.
  • Smola, A. J. and Schölkopf, B. (2004) A tutorial on support vector regression, Statistics and Computing, 14, 199-222.
  • Schölkopf, B., Smola, A. and Müller, K.-R. (1998) Nonlinear component analysis as a kernel eigenvalue problem, Neural Computation, 10, 1299-1319.
  • Cristianini, N. and Shawe-Taylor, J. (2000) An Introduction to Support Vector Machines and Other Kernel-Based Learning Methods (Cambridge University Press).
  • Bishop, C. M. (2006) Pattern Recognition and Machine Learning (Springer, New York).
  • Drucker, H., Burges, C. J., Kaufman, L., Smola, A. and Vapnik, V. (1997) Support vector regression machines, Advances in Neural Information Processing Systems, 9, 155-161.
  • Cortes, C. and Vapnik, V. (1995) Support-vector networks, Machine Learning, 20, 273-297.
  • Vapnik, V. (2000) The Nature of Statistical Learning Theory (Springer).
  • Müller, K., Mika, S., Rätsch, G., Tsuda, K. and Schölkopf, B. (2001) An introduction to kernel-based learning algorithms, IEEE Transactions on Neural Networks, 12, 181-201.
  • Schölkopf, B. and Smola, A. J. (2002) Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond (MIT Press).
  • Vapnik, V. N. (1998) Statistical Learning Theory (Wiley, New York).
  • Lahcene, B. (2013) On Pearson families of distributions and its applications, African Journal of Mathematics and Computer Science Research, 6, 108-117.
  • Lee, R. J. and Nicewander, W. A. (1988) Thirteen ways to look at the correlation coefficient, The American Statistician, 42, 59-66.
  • Armstrong, J. S. and Collopy, F. (1992) Error measures for generalizing about forecasting methods: Empirical comparisons, International Journal of Forecasting, 8, 69-80.
  • Chatterjee, S. and Hadi, A. S. (2006) Regression Analysis by Example (John Wiley & Sons, New Jersey).