Hindawi Publishing Corporation
The Scientific World Journal
Volume 2013, Article ID 292575, 10 pages
http://dx.doi.org/10.1155/2013/292575

Research Article

Electricity Load Forecasting Using Support Vector Regression with Memetic Algorithms

Zhongyi Hu, Yukun Bao, and Tao Xiong

Department of Management Science and Information Systems, School of Management, Huazhong University of Science and Technology, Wuhan 430074, China

Correspondence should be addressed to Yukun Bao; [email protected]

Received 25 September 2013; Accepted 19 November 2013

Academic Editors: C. Koroneos and X. Zhou

Copyright Β© 2013 Zhongyi Hu et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Electricity load forecasting is an important issue that is widely explored and examined in the literature on power systems operation as well as on commercial transactions in electricity markets. Among the existing forecasting models, support vector regression (SVR) has gained much attention. Considering that the performance of SVR highly depends on its parameters, this study proposes a firefly algorithm (FA) based memetic algorithm (FA-MA) to appropriately determine the parameters of the SVR forecasting model. In the proposed FA-MA algorithm, the FA is applied to explore the solution space, and pattern search is used to conduct individual learning and thus enhance the exploitation ability of FA. Experimental results confirm that the proposed FA-MA based SVR model not only yields more accurate forecasts than four other evolutionary algorithm based SVR models and three well-known forecasting models but also outperforms the hybrid algorithms reported in the related literature.

1. Introduction

Electricity load forecasting has always been an essential part of efficient power system planning and operation. In particular, it is not only critical for automatic generation control, reliable operation, and resource dispatch but is also a fundamental piece of information used for energy transactions in competitive electricity markets [1]. An inaccurate load forecast leads to considerable losses for power companies; it has been estimated that a 1% increase in forecasting error implies a 10 million increase in operating costs [2]. However, the electricity load is inevitably affected by various factors, such as climate, social activities, and seasonal effects, making it difficult to predict accurately. During the past several decades, numerous approaches have been proposed for electricity load forecasting. Traditional methods, such as the autoregressive moving average (ARMA) model [3], exponential smoothing models [4, 5], and regression models [6, 7], often struggle to model the electricity load with high accuracy because of its inherent nonlinearity. On the other hand, with the development of intelligence techniques in recent years, many studies have tried to apply artificial intelligence techniques to improve

the forecasting accuracy of load. Among them, neural networks (NN) have received a large share of attention, and a great number of studies have reported successful results in load forecasting [8]. Reference [9] proposed a practical method combining NN with a similar days approach, which produced reliable forecasts for one- to six-hour-ahead electricity load. Reference [10] proposed an adaptive artificial neural network with particle swarm optimization (PSO) used to adjust the network's weights; computational results indicated that the proposed model can obtain higher forecasting precision than the traditional BP algorithm. Reference [11] applied a Bayesian neural network to short-term load forecasting, and the proposed model performed better than conventional neural networks. A neural network with a novel learning algorithm based on a modified harmony search technique also gives better results than several benchmarks [12]. Reference [8] gave a comprehensive review and evaluation of neural networks for short-term load forecasting. However, an NN has a large number of parameters to be tuned and suffers from the danger of overfitting. Different from NN, which minimizes the empirical error based on the empirical risk minimization (ERM) principle, support vector regression (SVR) implements the structural

risk minimization (SRM) principle by minimizing an upper bound on the generalization error [13]. This leads to excellent generalization performance of SVR, which has been shown to outperform other nonlinear forecasting techniques, including NN based forecasting models [14]. As one application field of time series forecasting with SVR [14–20], electricity load forecasting using SVR and its variants has been well studied. Reference [16] proposed a locally weighted support vector regression to solve the short-term load forecasting problem, and the experimental results demonstrated the superior performance of the proposed model compared with several published models. Combined with fuzzy c-means (FCM) clustering and particle swarm optimization (PSO), an SVR based model was studied to forecast the short-term load of a city [21]. Reference [17] proposed a TF-Ξ΅-SVR model with trend fixing and seasonal adjustment to improve the forecasting accuracy of electricity demand. However, the generalization ability of SVMs highly depends on the adequate setting of their parameters [22–25], such as the penalty coefficient, the kernel parameters, and the width of the loss function. Therefore, the selection of optimal parameters is of critical importance for obtaining good performance in the electricity load forecasting task with SVR. Recently, various studies have tried to improve the forecasting accuracy of electricity load when the SVR model is used. Reference [18] employed simulated annealing (SA) to choose the parameters of SVR, and computational results show that the SA based SVR model achieves better performance for load forecasting than the autoregressive integrated moving average (ARIMA) model and the general regression neural network (GRNN) model. Reference [26] proposed a differential evolution algorithm based SVR model to forecast the annual load. Reference [27] proposed an LSSVM based load forecasting model with the fruit fly algorithm used to automatically choose the parameters of the LSSVM; experimental results show that the proposed model outperforms several alternative models. Pai et al. conducted a series of related studies using the genetic algorithm (GA) [28, 29], chaotic particle swarm optimization (CPSO) [30], the artificial bee colony algorithm (ABC) [31], the immune algorithm (IA) [32, 33], and hybrid algorithms [34] for the parameter determination of SVR to improve the forecasting accuracy of the electricity load. However, GA and some other evolutionary algorithms (EAs) are not guaranteed to find the globally optimal parameters of an SVR model, though they are generally good at finding "acceptably good" or near-optimal solutions. More specifically, although they are good at exploring the solution space and detecting the region of attraction of the global optimum efficiently, they lack the ability to perform a refined local search [24, 35]. Memetic algorithms (MAs), a powerful algorithmic paradigm that combines evolutionary algorithms (EAs) with problem-specific local searchers (LS), have been successfully applied in a wide variety of areas [36–38]. MAs have the ability to exploit the complementary advantages of EAs (generality, robustness, and global search efficiency) and of problem-specific local search (exploiting application-specific problem structure and rapid convergence toward local minima) [39]. In our previous study [24], an MA was proposed to tune

the parameters of SVM in classification problems. However, few works, if any, related to MAs have been reported in the load forecasting literature on the issue of SVR parameter optimization. Notice that there are three important parameters in SVR, whereas SVM for classification has only two. With the increased dimensionality and the changed structure of the optimization problem, maintaining the performance of the MA is a challenge. As such, it is of interest to employ MAs for SVR parameter optimization to improve the prediction accuracy of short-term load forecasting (STLF). In this study, by combining the firefly algorithm (FA) and pattern search (PS), an efficient FA based memetic algorithm (FA-MA) is proposed to automatically determine the parameters of SVR and thereby improve the accuracy of electricity load forecasting. In the proposed FA-MA, FA is responsible for the exploration of the search space and the detection of the potential regions containing optimal solutions, while PS is used to perform an effective exploitation of the potential regions obtained by FA. The performance of the proposed FA-MA for parameter optimization in SVR is evaluated on two real-world cases against selected counterparts. The rest of the study is organized as follows. Section 2 presents a brief review of SVMs. Section 3 elaborates on the FA-MA proposed in this study. The results and discussions are reported in Section 4. Finally, we conclude this study in Section 5.

2. Support Vector Regression

Consider a set of training data {(x_1, y_1), (x_2, y_2), ..., (x_n, y_n)} βŠ‚ R^n Γ— R, where x_i is the input vector, y_i is the target, and n denotes the number of data items in the training set. Based on the structural risk minimization (SRM) principle [13], rather than minimizing empirical errors, SVMs aim to generate a decision function (1) by minimizing the regularized risk function (2):

\[ f(x) = \langle w, \phi(x) \rangle + b, \tag{1} \]

\[ R = R_{\mathrm{emp}} + \frac{1}{2}\|w\|^2 = \frac{C}{n}\sum_{i=1}^{n} L\bigl(y_i, f(x_i)\bigr) + \frac{1}{2}\|w\|^2, \tag{2} \]

\[ L\bigl(y_i, f(x_i)\bigr) = \begin{cases} \bigl|y_i - f(x_i)\bigr| - \varepsilon, & \text{if } \bigl|y_i - f(x_i)\bigr| \ge \varepsilon, \\ 0, & \text{otherwise}, \end{cases} \tag{3} \]

where, in (1), βŸ¨Β·,Β·βŸ© denotes the inner product, w is the weight vector that controls the smoothness of the model, and b is a bias parameter. \(\phi(x)\) is the nonlinear mapping from the input space x to a high-dimensional feature space. In the regularized risk function given by (2), the first term, \(R_{\mathrm{emp}}\) or \((C/n)\sum_{i=1}^{n} L(y_i, f(x_i))\), is the empirical risk. In SVR, Vapnik's Ξ΅-insensitive loss function [40], given by (3), is often used to measure the empirical risk, and Ξ΅ is called the tube size. The second term, \((1/2)\|w\|^2\), is the regularization term, used as a measure of the flatness or complexity of the function. Hence, C is referred to as the regularization constant, and it specifies the trade-off between the empirical risk and the regularization term.


Both C and Ξ΅ are user-determined parameters. By introducing two positive slack variables ΞΎ_i and ΞΎ_i*, (2) is transformed into the following constrained form:

\[ \begin{aligned} \text{Minimize} \quad & R = C\sum_{i=1}^{n}\left(\xi_i + \xi_i^{*}\right) + \frac{1}{2}\|w\|^2 \\ \text{subject to} \quad & y_i - \langle w, \phi(x_i) \rangle - b \le \varepsilon + \xi_i, \\ & \langle w, \phi(x_i) \rangle + b - y_i \le \varepsilon + \xi_i^{*}, \\ & \xi_i, \xi_i^{*} \ge 0, \quad i = 1, 2, \ldots, n. \end{aligned} \tag{4} \]

According to Wolfe's dual theorem and the saddle-point condition, the dual optimization problem of the above primal one is obtained in the following form:

\[ \begin{aligned} \max_{\alpha, \alpha^{*}} \quad & -\frac{1}{2}\sum_{i,j=1}^{n}\left(\alpha_i - \alpha_i^{*}\right)\left(\alpha_j - \alpha_j^{*}\right)\left\langle \phi(x_i), \phi(x_j) \right\rangle - \varepsilon\sum_{i=1}^{n}\left(\alpha_i + \alpha_i^{*}\right) + \sum_{i=1}^{n} y_i\left(\alpha_i - \alpha_i^{*}\right) \\ \text{s.t.} \quad & \sum_{i=1}^{n}\left(\alpha_i - \alpha_i^{*}\right) = 0, \qquad \alpha_i, \alpha_i^{*} \in [0, C] \end{aligned} \tag{5} \]

with

\[ w = \sum_{i=1}^{n}\left(\alpha_i - \alpha_i^{*}\right)\phi(x_i), \tag{6} \]

where Ξ±_i and Ξ±_i* are nonnegative Lagrange multipliers that can be obtained by solving the convex quadratic programming problem stated above. Finally, based on (6) and the kernel trick, the decision function given by (1) takes the following explicit form:

\[ f(x) = \sum_{i=1}^{n}\left(\alpha_i - \alpha_i^{*}\right)K(x_i, x) + b. \tag{7} \]

Here, K(x_i, x_j) is the kernel function. Its value equals the inner product of the two vectors x_i and x_j in the feature space, that is, K(x_i, x_j) = ⟨φ(x_i), Ο†(x_j)⟩. The elegance of using the kernel function is that one can deal with feature spaces of arbitrary dimensionality without having to compute the map Ο†(x) explicitly. Any function that satisfies Mercer's condition [40] can be used as the kernel function. There are several typical examples of kernel functions, such as the linear kernel, the polynomial kernel, the radial basis function (RBF), and the sigmoid kernel, each with its own parameters. Among these kernel functions, the RBF kernel (8) is strongly recommended and widely used because of its performance and moderate complexity [41], and thus the SVR with the RBF kernel is the one investigated in this study:

\[ K(x_i, x_j) = \exp\left(\frac{-\left\|x_i - x_j\right\|^2}{2\delta^2}\right), \tag{8} \]

where Ξ΄ is the kernel parameter. The kernel parameter should be carefully chosen, as it implicitly defines the structure of

the high-dimensional feature space Ο†(x) and thus controls the complexity of the model. Overall, SVR is a powerful learning machine with strong theoretical foundations and excellent generalization performance. Note that before implementing the SVR with the RBF kernel, three parameters (the penalty parameter C, the RBF kernel parameter Ξ΄, and the width Ξ΅ of the loss function) need to be set. Previous studies show that these three parameters play an important role in the success of SVR [42]. In this study, to determine these parameters and to improve the forecasting accuracy of SVR in electricity load forecasting, a firefly algorithm (FA) based memetic algorithm (FA-MA) is proposed in Section 3.
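To make the role of the three parameters concrete, the following minimal sketch fits an Ξ΅-SVR with an RBF kernel using scikit-learn on a synthetic load-like series; the values assigned to C, Ξ΄ (entering as gamma = 1/(2Ξ΄Β²)), and Ξ΅ are illustrative placeholders, not the tuned settings reported in Section 4.

```python
import numpy as np
from sklearn.svm import SVR

# Synthetic hourly-load-style series: lagged values as inputs, next value as target.
rng = np.random.default_rng(0)
t = np.arange(500)
load = 100 + 10 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 1, t.size)

lags = 24
X = np.array([load[i:i + lags] for i in range(len(load) - lags)])
y = load[lags:]

# The three user-determined SVR parameters discussed above.
C = 10.0        # penalty (regularization) constant
delta = 2.0     # RBF kernel width, passed to scikit-learn as gamma = 1 / (2 * delta**2)
epsilon = 0.1   # width of the epsilon-insensitive tube

model = SVR(kernel="rbf", C=C, gamma=1.0 / (2 * delta**2), epsilon=epsilon)
model.fit(X[:400], y[:400])
print("sample predictions:", model.predict(X[400:405]))
```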

3. Memetic Algorithm for Parameter Selection of SVR

Memetic algorithms (MAs), one of the growing areas in computational intelligence, were first coined by Moscato and Norman [43]. Inspired by Darwinian principles of natural evolution and Dawkins' notion of the meme, MAs have come to light as a union of population-based stochastic global evolutionary algorithms and local improvement procedures. As a designed hybridization, MAs are expected to balance exploration and exploitation of the search space and thereby combine the advantages of population-based methods and local search methods. Nowadays, MAs have demonstrated success, with high performance and superior robustness, across a wide range of problem domains; detailed reviews are reported in [44, 45]. Since there is no free lunch, the hybridization can be more complex and expensive to implement. Considering the effectiveness of the recently introduced firefly algorithm, which can even be superior to GA and PSO [46–48], this study proposes an FA based memetic algorithm with pattern search as the local individual learner to improve the forecasting accuracy of the SVR based electricity load forecasting model. In the following subsections, we explain the implementation of the proposed FA-MA for parameter optimization in detail.

3.1. Initialization. In the proposed FA-MA, each firefly (or individual) is a parameter set of the SVR model and can be denoted as x_i = ⟨C, Ξ΄, Ρ⟩. A set of fireflies is called a swarm or population. Traditionally, the initial swarm is generated randomly in the firefly algorithm and other evolutionary algorithms. To guarantee an initial swarm with reliability and diversity, the Latin hypercube sampling (LHS) method is applied to generate a random sample set. With LHS, we first split the search space into subspaces and then randomly take values within each subspace to obtain an initial sample set that is representative of the whole search space. Hence, the initial samples are relatively uniformly distributed over each dimension, which has been shown to be superior to random initialization [49].
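As a rough sketch of this initialization step (assuming, for illustration, that the three parameters are sampled directly on their raw scales; the log2-scaled bounds actually used are given in Section 4):

```python
import numpy as np

def latin_hypercube(n_fireflies, bounds, seed=0):
    """Draw one point per stratum in each dimension, then shuffle strata across fireflies."""
    rng = np.random.default_rng(seed)
    sample = np.empty((n_fireflies, len(bounds)))
    for d, (low, high) in enumerate(bounds):
        # Split [0, 1) into n_fireflies equal strata and pick one random point in each.
        points = (np.arange(n_fireflies) + rng.random(n_fireflies)) / n_fireflies
        rng.shuffle(points)  # decouple the strata ordering across dimensions
        sample[:, d] = low + points * (high - low)
    return sample

# Each firefly is a candidate parameter set <C, delta, epsilon>; the bounds are placeholders.
swarm = latin_hypercube(n_fireflies=30, bounds=[(2**-6, 2**6)] * 3)
print(swarm[:3])
```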


3.2. Fitness Function. Since the ultimate goal of the SVR model is to forecast the future electricity load with high accuracy (i.e., with good generalization ability), it is important to choose a fitness function that can estimate the generalization ability when determining the parameters of SVR with FA-MA. In this study, the data are split into three parts: a training set, a validation set, and a testing set. The training set is used to train the SVR model with a given parameter set, and the validation set is reserved to assess the generalization ability of the established forecasting model. The parameter set with the lowest mean absolute percentage error (MAPE) on the validation set (for convenience, the formulation of MAPE is given in Section 4.2) is selected as the optimal solution. That is to say, the MAPE on the validation set is used as the fitness function. In the proposed firefly algorithm based memetic algorithm, the brightness or light intensity of a firefly is determined by this fitness function.
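A minimal sketch of such a fitness evaluation, assuming the scikit-learn SVR used above and a pre-defined training/validation split (the function and variable names are illustrative):

```python
import numpy as np
from sklearn.svm import SVR

def mape(actual, forecast):
    """Mean absolute percentage error (%), as defined in Section 4.2."""
    actual, forecast = np.asarray(actual, float), np.asarray(forecast, float)
    return np.mean(np.abs((actual - forecast) / actual)) * 100

def fitness(params, X_train, y_train, X_val, y_val):
    """Brightness of a firefly: validation MAPE of the SVR trained with its parameter set."""
    C, delta, epsilon = params
    model = SVR(kernel="rbf", C=C, gamma=1.0 / (2 * delta**2), epsilon=epsilon)
    model.fit(X_train, y_train)
    return mape(y_val, model.predict(X_val))  # smaller MAPE means a brighter firefly
```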

3.3. Exploration with the Firefly Algorithm (FA). The firefly algorithm, first introduced by Yang et al. [46, 47], is a swarm-based intelligent metaheuristic. FA mimics the social behavior of fireflies, which move and communicate with each other based on their flashing characteristics, such as brightness, frequency, and time period. In particular, the superiority of FA over genetic algorithms (GAs) and particle swarm optimization (PSO) reported in existing studies [46, 47] motivates us to use FA to explore the search space. In FA, each firefly is assumed to be attracted to the other ones regardless of their sex, and the attractiveness is proportional to their brightness. Besides, as mentioned before, the brightness of a firefly is determined by the fitness function. To minimize the fitness defined in Section 3.2, the brightness can simply be taken as the negative of the MAPE. The movement of a firefly i attracted by another, more attractive firefly j can be formulated as (details of the definition are given in Yang [46])

\[ x_i = x_i + \beta_0 e^{-\gamma r_{ij}^{2}}\left(x_j - x_i\right) + \alpha\left(\mathrm{rand} - \tfrac{1}{2}\right), \tag{9} \]

where the second term is the attraction of firefly j on firefly i and the third term is the randomization of the movement; Ξ³ is the absorption coefficient, r_ij is the Cartesian distance between fireflies i and j, Ξ²_0 is the attractiveness at r_ij = 0, Ξ± is a randomization parameter, and rand is a random number drawn uniformly from [0, 1]. As recommended by [46], Ξ³ = 1, Ξ²_0 = 1, and Ξ± ∈ [0, 1] are used in this study. Besides, Ξ± is often replaced by Ξ±S_k, where the scaling parameter S_k is determined by the actual scales of the problem.
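A compact sketch of this movement rule, with the firefly positions taken to be the SVR parameter vectors (names and defaults are illustrative):

```python
import numpy as np

def move_firefly(x_i, x_j, alpha, s_k, beta0=1.0, gamma=1.0, rng=None):
    """Move firefly i toward a brighter firefly j according to (9)."""
    rng = rng or np.random.default_rng()
    r_ij = np.linalg.norm(x_i - x_j)                          # Cartesian distance between i and j
    attraction = beta0 * np.exp(-gamma * r_ij**2) * (x_j - x_i)
    randomization = alpha * s_k * (rng.random(x_i.shape) - 0.5)
    return x_i + attraction + randomization
```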

3.4. Refinement with Pattern Search. In the proposed FA-MA, pattern search is employed to exploit the parameter solution space. Pattern search (PS), a simple yet effective optimization technique, has already been used successfully for parameter optimization in previous studies [50]. By examining the neighborhood of the current solution, pattern search is very effective at exploiting local regions. In addition, its convergence to local minima for both constrained and unconstrained problems has been proven in [51]. It is therefore well suited to enhancing the local exploitation of the FA in the proposed memetic algorithm. In some sense, the main objective of PS is to conduct individual learning by exploiting small local regions effectively in a relatively short period of time. Pattern search investigates the nearest neighborhood of the current solution and tries to find a better move; if all neighbors fail to produce an improvement, the search step is reduced. The search stops when the step becomes sufficiently small, which ensures convergence to a local minimum. Pattern search is based on a pattern P_k that defines the neighborhood of the current solution. A commonly used choice is the unit-size rood pattern, which for the three-dimensional parameter space can be represented by the generating matrix P_k in (10):

\[ P_k = \begin{bmatrix} 1 & 0 & 0 & -1 & 0 & 0 \\ 0 & 1 & 0 & 0 & -1 & 0 \\ 0 & 0 & 1 & 0 & 0 & -1 \end{bmatrix}. \tag{10} \]

The procedure of pattern search is outlined in Algorithm 1, where Ξ”_0 denotes the default search step of PS, Ξ” is the current search step, p_k is a column of P_k, and Ξ© denotes the neighborhood of the current solution. The termination conditions are that the maximum number of iterations is reached or that the search step falls below a predefined small value. To balance the computational budget allocated to exploration versus exploitation, Ξ”_0/8 is experimentally selected as the minimum search step.
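A rough Python rendering of this individual-learning step (the pseudocode appears as Algorithm 1 below), assuming `fitness` maps a parameter vector to its validation MAPE and that smaller values are better:

```python
import numpy as np

def pattern_search(x0, fitness, step0, min_step, max_iter=100):
    """Explore the rood-pattern neighborhood of x0, shrinking the step whenever no neighbor improves."""
    dim = x0.size
    directions = np.vstack([np.eye(dim), -np.eye(dim)])  # the +/- unit directions (columns of P_k in (10))
    x, fx, step = x0.copy(), fitness(x0), step0
    for _ in range(max_iter):
        if step < min_step:
            break
        neighbors = [x + step * d for d in directions]
        values = [fitness(p) for p in neighbors]
        best = int(np.argmin(values))
        if values[best] < fx:
            x, fx, step = neighbors[best], values[best], step0   # improvement: move and reset the step
        else:
            step /= 2.0                                          # no improvement: halve the step
    return x, fx
```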

3.5. Description of the Proposed FA-MA. Based on the population initialization, the firefly algorithm based exploration, and the pattern search based individual learning, the FA-MA is illustrated in Algorithm 2. It can be seen that FA-MA not only applies the FA to perform exploration for promising solutions in the whole search space but also employs pattern search to perform exploitation, through individual learning, in local regions. To guarantee an initial swarm with diversity, the Latin hypercube sampling method is applied to generate a random sample set. In addition, it is important to balance exploration and exploitation under a limited computational budget in an MA. Hence, in this study, each firefly undergoes local refinement with a specified probability pl(x_k), and this selection probability is defined by a roulette wheel selection scheme with linear scaling [24]:

\[ \mathrm{pl}(x_k) = \frac{f_{\max}(\mathbf{P}) - f(x_k)}{\sum_{y \in \mathbf{P}}\bigl(f_{\max}(\mathbf{P}) - f(y)\bigr)}, \tag{11} \]

where f is the fitness function (i.e., the MAPE in this study) and f_max(P) is the maximum fitness value in the current population P. With this selection probability, a firefly with a better fitness value has a greater chance of being selected for exploitation. Since both exploration and exploitation are stressed and balanced, the proposed algorithm is expected to be well suited to improving load forecasting with SVR. In the next section, we investigate the performance of the proposed FA-MA.
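For illustration, the refinement probabilities in (11) could be computed as follows (a sketch; the degenerate case in which all fireflies have equal fitness is handled uniformly here, an assumption not spelled out in the text):

```python
import numpy as np

def refinement_probabilities(fitness_values):
    """Roulette-wheel probabilities per (11): lower MAPE gives a higher chance of refinement."""
    f = np.asarray(fitness_values, dtype=float)
    scaled = f.max() - f                      # linear scaling against the worst individual
    total = scaled.sum()
    if total == 0.0:                          # all fireflies equally fit: refine uniformly at random
        return np.full_like(f, 1.0 / f.size)
    return scaled / total
```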

4. Experimental Results

4.1. Experimental Setup. To verify the electricity load forecasting performance of the proposed FA-MA based SVR model, two real-life cases are considered in this study.


Begin
    Initialize: predefine the default search step Ξ”_0 and the current solution p_0; set Ξ” = Ξ”_0;
    While (termination conditions are not satisfied)
        Ξ© = {p_0 + Ξ” Β· p_k | for each column p_k in P_k};
        Evaluate the nearest neighbors in Ξ©;
        If (there is an improvement in Ξ©)
            Update the current solution to the best neighbor in Ξ©; Ξ” = Ξ”_0;
        Else
            Decrease the search step: Ξ” = Ξ”/2;
        End If
    End While
End

Algorithm 1: Pseudocode of pattern search for individual learning.

Begin
    Initialize the parameters of the MA, such as the swarm size n, the scaling parameter S_k, the maximum iteration number, and the default search step Ξ”_0 in PS;
    Generate the initial population of fireflies {x_i, i = 1, 2, ..., n} using the Latin hypercube sampling (LHS) method;
    Evaluate the fitness of each firefly in the population: use each firefly to train the SVR based forecasting model and calculate the fitness (MAPE);
    While (termination conditions are not satisfied)
        For i = 1 : n over all n fireflies
            For j = 1 : i
                If (I_j > I_i)
                    Move firefly i to a new position according to (9);
                End If
                Evaluate the fitness of each firefly and update its light intensity;
            End For
        End For
        Rank the fireflies and find the current best;
        %%%% Conduct individual learning with pattern search with probability pl
        For k = 1 : n over all n fireflies
            If (rand ≀ pl(x_k))
                Conduct individual learning for firefly x_k with pattern search (as shown in Algorithm 1);
            End If
        End For
    End While
    Output the best firefly as the final optimal solution
End

Algorithm 2: Pseudocode of the proposed firefly algorithm based memetic algorithm.
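Putting the pieces together, a compact sketch of Algorithm 2 might look as follows; it reuses the illustrative helpers sketched in Section 3 (latin_hypercube, fitness, move_firefly, pattern_search, refinement_probabilities) and is an assumption-laden rendering rather than the authors' original MATLAB implementation.

```python
import numpy as np

def fa_ma(fitness, bounds, n_fireflies=30, max_iter=150,
          alpha=0.5, s_k=1.0, step0=1.0, seed=0):
    """Firefly-algorithm-based memetic search for an SVR parameter set <C, delta, epsilon>."""
    rng = np.random.default_rng(seed)
    lows = np.array([b[0] for b in bounds])
    highs = np.array([b[1] for b in bounds])

    swarm = latin_hypercube(n_fireflies, bounds, seed=seed)    # Section 3.1
    light = np.array([fitness(x) for x in swarm])              # Section 3.2 (MAPE; smaller = brighter)

    for _ in range(max_iter):
        # Exploration: move each firefly toward every brighter one (Section 3.3).
        for i in range(n_fireflies):
            for j in range(n_fireflies):
                if light[j] < light[i]:
                    swarm[i] = np.clip(move_firefly(swarm[i], swarm[j], alpha, s_k, rng=rng),
                                       lows, highs)
                    light[i] = fitness(swarm[i])

        # Exploitation: refine probabilistically selected fireflies with pattern search (Section 3.4).
        probs = refinement_probabilities(light)
        for k in range(n_fireflies):
            if rng.random() <= probs[k]:
                swarm[k], light[k] = pattern_search(swarm[k], fitness, step0, step0 / 8)

    best = int(np.argmin(light))
    return swarm[best], light[best]
```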

The first case uses hourly observations from the Pennsylvania-New Jersey-Maryland (PJM) power system, a well-established electricity market in the U.S. The data consist of 18 months of hourly observations, from January 1, 2010 to June 30, 2011 (data are available from PJM Interconnection, http://www.pjm.com), that is, 13104 hourly observations in total. The second case is the monthly electric load of Northeast China, which has been investigated in the existing literature [34]; these data consist of 64 monthly observations from January 2004 to April 2009. As a preprocessing stage, several missing load values are filled in by the average of the neighboring values. By adopting the linear transformation in (12), the series are linearly scaled to the range [0, 1]. The main advantage of scaling is to avoid

attributes in greater numeric ranges dominating those in smaller numeric ranges; another advantage is to prevent numerical difficulties during the calculation [41]:

\[ x' = \frac{x - \min_A}{\max_A - \min_A}, \tag{12} \]

where x is an original value of attribute A, x' is the scaled value, min_A is the minimum of attribute A, and max_A is the maximum of attribute A. It should be noted that the forecast values are rescaled back by reversing the linear transformation, and the forecasting performance is calculated on the original scale of the data.
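A small sketch of this preprocessing and its inverse (the sample values are placeholders):

```python
import numpy as np

def scale_to_unit(x, x_min, x_max):
    """Linear transformation (12): map attribute values to [0, 1]."""
    return (x - x_min) / (x_max - x_min)

def rescale_back(x_scaled, x_min, x_max):
    """Reverse of (12): bring forecasts back to the original scale before computing the metrics."""
    return x_scaled * (x_max - x_min) + x_min

load = np.array([120.0, 135.5, 150.2, 98.7])
lo, hi = load.min(), load.max()
scaled = scale_to_unit(load, lo, hi)
print(scaled, rescale_back(scaled, lo, hi))
```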


4.2. Performance Measures. To assess the forecasting performance of a model, many accuracy measures can be used [52].

Table 1: Performance metrics and their formulas in regression problems.

MAPE:
\[ \mathrm{MAPE} = \frac{1}{N}\sum_{i=1}^{N}\left|\frac{y_{t+i} - \hat{y}_{t+i}}{y_{t+i}}\right| \times 100 \]

MASE:
\[ \mathrm{MASE} = \frac{1}{N}\sum_{i=1}^{N}\left|\frac{y_{t+i} - \hat{y}_{t+i}}{(1/(t-1))\sum_{j=2}^{t}\left|y_j - y_{j-1}\right|}\right| \]

DS:
\[ \mathrm{DS} = \frac{\sum_{i=2}^{N} d_i}{N-1} \times 100, \qquad d_i = \begin{cases} 1, & \text{if } \left(y_{t+i} - y_{t+i-1}\right)\left(\hat{y}_{t+i} - y_{t+i-1}\right) \ge 0, \\ 0, & \text{otherwise}. \end{cases} \]

Note: N is the number of forecasting periods, y_{t+i} is the actual value at period t+i, Ε·_{t+i} is the forecast value at period t+i, and ȳ is the mean of all values. In this study, the day-ahead (24 hours) short-term load is forecast recursively, so the number of forecasting periods N equals 24.
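The three measures in Table 1 can be computed as in the sketch below; `history` stands for the in-sample series supplying the naΓ―ve scaling term of MASE, and the function names are illustrative.

```python
import numpy as np

def mape(actual, forecast):
    """Mean absolute percentage error (%)."""
    actual, forecast = np.asarray(actual, float), np.asarray(forecast, float)
    return np.mean(np.abs((actual - forecast) / actual)) * 100

def mase(actual, forecast, history):
    """Mean absolute scaled error: errors scaled by the in-sample naive one-step error."""
    actual, forecast, history = (np.asarray(a, float) for a in (actual, forecast, history))
    naive_error = np.mean(np.abs(np.diff(history)))
    return np.mean(np.abs(actual - forecast)) / naive_error

def ds(actual, forecast):
    """Directional symmetry (%): share of steps whose predicted direction matches the actual one."""
    actual, forecast = np.asarray(actual, float), np.asarray(forecast, float)
    same_direction = (actual[1:] - actual[:-1]) * (forecast[1:] - actual[:-1]) >= 0
    return np.mean(same_direction) * 100
```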

However, it is hard to say whether one accuracy measure is better or worse than another [53]. Besides, different metrics may evaluate the quality of the forecasting performance from different perspectives. In this study, three accuracy measures are selected to assess the prediction performance: the mean absolute percentage error (MAPE), the mean absolute scaled error (MASE), and the directional symmetry (DS). Their definitions can be found in Table 1. MAPE is a frequently used metric that measures the percentage error between the actual and predicted values; the smaller the MAPE, the closer the predicted values are to the actual values. MASE is an error scaled by that of a naΓ―ve forecast model; MASE is less than one if the forecast is better than the naΓ―ve method, and the smaller the MASE, the better the forecast. It has been highly recommended in a recent study, as it is less sensitive to outliers and easy to interpret [52]. DS provides an indication of the accuracy of the predicted direction, and a larger value suggests a better predictor. Furthermore, a nonparametric Wilcoxon signed-rank test [54] is performed to determine whether there is a significant difference between two approaches based on the prediction errors on the testing data sets. This test performs a two-sample rank test for the difference between two population medians. Since the population distributions of the performance measures are unknown, a nonparametric test is suggested for the performance comparison of two models [55].
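As an illustration of this significance test, the paired Wilcoxon signed-rank test on the absolute forecast errors of two models could be run with scipy as follows (the error arrays are hypothetical placeholders):

```python
import numpy as np
from scipy.stats import wilcoxon

# Hypothetical absolute forecast errors of two models on the same testing hours.
errors_fa_ma = np.array([1.2, 0.8, 1.5, 0.9, 1.1, 1.3])
errors_rbf = np.array([2.1, 1.7, 2.4, 1.9, 2.2, 2.0])

stat, p_value = wilcoxon(errors_fa_ma, errors_rbf)
print(f"Wilcoxon statistic = {stat:.3f}, p-value = {p_value:.4f}")
# A p-value below 0.05 indicates a significant difference between the two error distributions.
```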

4.3. Results and Discussions. In the first case, the 24-step-ahead electricity load is predicted directly. The data are divided into three parts: a training set, a validation set, and a testing set. The periods and numbers of observations of each set are shown in Table 2. As mentioned in Section 3.2, the training set and validation set are used to determine the optimal parameters, and the forecasting model is then established on the integrated training set (training set plus validation set). Finally, the testing set is used to assess the out-of-sample forecasting performance of the proposed model with the optimal parameters obtained by the memetic algorithm.

Table 2: Training, validation, and testing sets for the first sample case.

Data set          Period                 No. of observations
Training set      1/1/2010–12/31/2010    365 Γ— 24
Validation set    1/1/2011–3/31/2011     90 Γ— 24
Testing set       4/1/2011–6/30/2011     91 Γ— 24

Table 3: MAPE (%) of the SVR model with different parameter determination methods.

Period   FA-MA   FA     GA     PSO    SA
April    1.24    1.51   1.67   1.73   1.95
May      1.34    1.53   1.77   1.83   2.00
June     1.48    1.78   1.71   2.00   2.14
ALL      1.35    1.61   1.72   1.85   2.03

Table 4: MASE of the SVR model with different parameter determination methods.

Period   FA-MA   FA     GA     PSO    SA
April    0.33    0.38   0.44   0.47   0.51
May      0.37    0.43   0.50   0.53   0.51
June     0.41    0.48   0.48   0.50   0.56
ALL      0.37    0.42   0.47   0.50   0.53

Considering the short-run trend and the daily and weekly periodicity of the hourly load, the hourly load values of the previous day and of the similar hours in the previous 30 days are selected as the candidate input variable set of the forecasting model. The input variables are then selected from this candidate set by a filter method that maximizes the mutual information using a forward-backward selection strategy [56]. To verify the improvement in forecasting accuracy achieved by the proposed memetic algorithm in the SVR based forecasting model, four well-known evolutionary algorithms (EAs), namely the genetic algorithm (GA), particle swarm optimization (PSO), simulated annealing (SA), and the firefly algorithm (FA), are selected to determine the parameters (C, Ξ΄, and Ξ΅) of the SVR based load forecasting model. The experiments are implemented in MATLAB 2012a on a computer with an Intel Core 2 Duo CPU T5750 at 2.00 GHz and 2 GB RAM. The parameter search space for SVR is defined as an exponentially growing space: log2 C ∈ [βˆ’6, 6], log2 Ξ΄ ∈ [βˆ’6, 6], and log2 Ξ΅ ∈ [βˆ’6, 6]. The parameters of each EA are set based on initial experiments. More specifically, the population size of each method is set to 30, and the stopping criteria of each method are as follows: the number of iterations reaches 150, or there is no improvement in the fitness for 50 consecutive iterations. The scaling parameter S_k in the firefly algorithm and FA-MA is set to 1, which is one-sixth of the maximum of the search space. The forecasting results of the different EA based SVR forecasting models for each separate month and for the whole testing period are given in Tables 3, 4, and 5. To reduce statistical errors, the results in Tables 3–5 are average results over 30 independent trial runs. From Tables 3–5, several observations can be drawn.


Table 5: DS (%) of the SVR model with different parameter determination methods.

Period   FA-MA    FA      GA      PSO     SA
April    96.49    94.74   93.78   93.15   92.77
May      95.73    94.32   93.69   93.08   92.54
June     95.30    93.30   93.50   92.56   91.50
ALL      95.84    94.12   93.66   92.93   92.27

Table 6: Time consumption of the SVR model with different parameter determination methods.

Method           FA-MA   FA     GA     PSO    SA
CPU time (min)   27.3    20.7   25.6   21.9   22.5

Firstly, compared with GA, PSO, and SA, the FA based forecasting model obtains the best performance in most of the periods for each metric, which indicates its superior ability in determining the parameters of SVR and thus improving the forecasting performance. Secondly, when the proposed MA is used to determine the parameters of the SVR forecasting model, the forecasting results outperform those of the FA based forecasting model. This superior performance over FA can be attributed to the integration of pattern search for fine exploitation and to the balance between exploration and exploitation in the proposed MA. Thirdly, the proposed MA yields the lowest MAPE and MASE and the largest DS, which confirms the superiority of the MA in improving the forecasting accuracy by enhancing the parameter determination process of the SVR forecasting model. Table 6 reports the time consumption of each evolutionary algorithm in selecting the optimal parameters of the SVR forecasting model. From Table 6, we can see that the proposed FA-MA is somewhat more time consuming than the other four methods, mainly because of the fine exploitation with pattern search. However, in real-world applications this computational time is acceptable within a day-ahead decision-making framework, and considering its forecasting performance, FA-MA can be used as an alternative method to improve the forecasting accuracy when support vector regression (SVR) is used. Furthermore, three well-known forecasting models, namely the radial basis function neural network (RBFNN), the MLP neural network trained by LM (Levenberg-Marquardt), and the autoregressive integrated moving average (ARIMA) model, are selected for comparison of the day-ahead forecasting performance with the proposed FA-MA based SVR model. For the sake of a fair comparison, the two neural network based models follow the same data preprocessing, input selection, and parameter tuning process as our proposed FA-MA based SVR model, while for ARIMA the forecast package [57] in R is used to forecast the load. Tables 7, 8, and 9 show the comparison of the average results for the three separate months and for the whole period. From Tables 7–9, it can be observed that ARIMA is the worst model in each month and over the whole period, which is mainly due to its linearity assumption. Besides, the proposed FA-MA based SVR model outperforms all the other forecasting models in terms of MAPE, MASE, and DS.

Table 7: MAPE of four forecasting models.

Period   FA-MA   RBF    MLP-LM   ARIMA
April    1.24    2.37   2.35     4.91
May      1.34    2.38   2.39     5.00
June     1.48    2.45   2.51     5.20
ALL      1.35    2.40   2.42     5.04

Table 8: MASE of four forecasting models.

Period   FA-MA   RBF    MLP-LM   ARIMA
April    0.33    0.60   0.59     0.72
May      0.37    0.60   0.63     0.75
June     0.41    0.64   0.66     0.78
ALL      0.37    0.61   0.63     0.75

Table 9: DS of four forecasting models.

Period   FA-MA   RBF     MLP-LM   ARIMA
April    96.49   90.12   90.34    85.51
May      95.73   89.15   89.01    85.31
June     95.30   89.78   89.19    84.40
ALL      95.84   89.68   89.51    85.07

Moreover, to verify the significance of the accuracy improvement of the proposed FA-MA based SVR model, a nonparametric Wilcoxon signed-rank test is used to test the significance of the difference between the FA-MA based SVR and the three other models. The test shows that our proposed model is statistically superior to the others at the 0.05 significance level. In addition, to give a graphical view of the forecasting performance of the proposed model, the curves of the real values, forecast values, and forecast errors are shown in Figure 1. It is obvious that the forecast curve tracks the real values accurately and only minor errors are obtained, which further illustrates the effectiveness of our proposed forecasting model. In addition, to further verify the performance of the proposed FA-MA against existing hybrid methods, the second case, obtained from previous studies [34], is applied here. As in those studies, the last 7 months are predicted, and the seasonal mechanism effects described in [34] are also taken into consideration. Table 10 shows the actual values and the forecast loads obtained by the different forecasting models. TF-Ξ΅-SVR-SA reports the results from [17], in which the SVR model is optimized by SA. For CGASA and S-CGASA, the forecasts were generated by the SVR model without or with the seasonal mechanism, respectively; the parameters of these two SVR models were tuned by a hybrid algorithm, namely the chaotic genetic algorithm-simulated annealing algorithm (CGASA). In the last two columns, the SVR models were optimized by our proposed FA-MA algorithm, the only difference between them being that S-FA-MA takes the seasonal mechanism effects into account. As illustrated in Table 10, the proposed FA-MA attains smaller MAPE and MASE than TF-Ξ΅-SVR-SA and CGASA. However, the results of the FA-MA based SVR are worse than those of S-CGASA, which is mainly due to the seasonal mechanism involved in S-CGASA. Similar to S-CGASA, S-FA-MA also makes full use of the seasonal effects; it generates superior performance to S-CGASA in terms of MAPE and MASE and has competitive performance


with S-CGASA in terms of DA. Thus, in this case, we can conclude that the proposed FA-MA is superior to the existing hybrid algorithms SA and CGASA in improving the forecasting accuracy of SVR models. The outstanding forecasting performance of our proposed FA-MA against the existing hybrid algorithm (i.e., CGASA) can be explained by the following reasons. Firstly, based on the framework of the memetic algorithm, both global exploration and local exploitation are enhanced in the proposed FA-MA, which not only avoids premature convergence but also ensures searching capability. Secondly, a roulette wheel selection scheme is applied in the proposed FA-MA to select the individuals to be refined, which provides a good balance between exploration and exploitation.

Table 10: Comparison with existing hybrid algorithms.

Period     Actual   TF-Ξ΅-SVR-SA   CGASA      S-CGASA    FA-MA      S-FA-MA
Oct. 08    181.07   184.5035      177.3000   175.6385   175.9047   178.2513
Nov. 08    180.56   190.3608      177.4428   185.2100   184.5484   184.2637
Dec. 08    189.03   202.9795      177.5848   189.9070   195.4447   188.9679
Jan. 09    182.07   195.7532      177.7263   181.9693   185.5828   181.7957
Feb. 09    167.35   167.5795      177.8673   163.2805   161.4537   161.9352
Mar. 09    189.30   185.9358      178.0078   182.1747   184.8540   181.9227
Apr. 09    175.84   180.1648      178.6806   177.6289   177.2037   176.1128
MAPE (%)   β€”        3.799         3.731      1.901      2.433      1.583
MASE       β€”        0.576         0.554      0.237      0.326      0.217
DA (%)     β€”        83.333        33.333     83.333     83.333     83.333

Figure 1: Curves of real values, forecast values, and errors of the proposed FA-MA based SVR model (load, Γ—10^4, on the vertical axis versus hour on the horizontal axis).

5. Conclusions

Electricity load forecasting is an important issue for operating the power system reliably and economically. In this study, to improve the forecasting accuracy of electricity load forecasting using support vector regression (SVR), a firefly algorithm

(FA) based memetic algorithm (FA-MA) was presented. In the proposed FA-MA, FA was employed to explore the search space and detect the potential regions, while pattern search (PS) was used to conduct individual learning to improve the exploitation ability of FA. With the proposed FA-MA used to determine the parameters of SVR, a novel forecasting model, the FA-MA based SVR, was presented and applied to forecast the electricity load in two real cases. In the first case, four evolutionary algorithm (FA, GA, PSO, and SA) based SVR forecasting models and three well-known models (RBFNN, MLP-LM, and ARIMA) were selected for comparison of the forecasting performance. Computational results show that the proposed FA-MA can effectively improve the forecasting accuracy of SVR compared with the other evolutionary algorithm based SVR models, and the FA-MA based SVR forecasting model significantly outperforms the selected counterparts. In the second case, the comparison results show that the proposed FA-MA is superior to the existing hybrid algorithms in the literature. However, in this study only the historical load values are taken into consideration to forecast the electricity load, whereas exogenous variables (e.g., temperature and humidity) are also very important for improving the forecasting accuracy. Other topics include more extensive comparisons with other models, the development of more efficient memetic algorithms, and seasonal adjustment; extensive experimental studies on other forecasting problems and benchmark functions can also be investigated. These issues will be addressed in future work.

Conflict of Interests The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments This work was supported by the Natural Science Foundation of China under Project no. 70771042, the Fundamental Research Funds for the Central Universities (2012QN208HUST), the Ministry of Education in China (MOE) Project of Humanities and Social Science (Project no. 13YJA630002), and a Grant from the Modern Information Management Research Center at Huazhong University of Science and Technology (2013WZ005 2012WJD002).


References [1] S. Fan and L. Chen, β€œShort-term load forecasting based on an adaptive hybrid method,” IEEE Transactions on Power Systems, vol. 21, no. 1, pp. 392–401, 2006. [2] D. W. Bunn and E. D. Farmer, Comparative Models for Electrical Load Forecasting, John Wiley & Sons, New York, NY, USA, 1985. [3] S. Saab, E. Badr, and G. Nasr, β€œUnivariate modeling and forecasting of energy consumption: the case of electricity in Lebanon,” Energy, vol. 26, no. 1, pp. 1–14, 2001. [4] A. P. Douglas, A. M. Breipohl, F. N. Lee, and R. Adapa, β€œThe impacts of temperature forecast uncertainty on bayesian load forecasting,” IEEE Transactions on Power Systems, vol. 13, no. 4, pp. 1507–1513, 1998. [5] J. H. Park, Y. M. Park, and K. Y. Lee, β€œComposite modeling for adaptive short-term load forecasting,” IEEE Transactions on Power Systems, vol. 6, no. 2, pp. 450–457, 1991. [6] A. Goia, C. May, and G. Fusai, β€œFunctional clustering and linear regression for peak load forecasting,” International Journal of Forecasting, vol. 26, no. 4, pp. 700–711, 2010. [7] H. A. Amarawickrama and L. C. Hunt, β€œElectricity demand for Sri Lanka: a time series analysis,” Energy, vol. 33, no. 5, pp. 724– 739, 2008. [8] H. S. Hippert, C. E. Pedreira, and R. C. Souza, β€œNeural networks for short-term load forecasting: a review and evaluation,” IEEE Transactions on Power Systems, vol. 16, no. 1, pp. 44–55, 2001. [9] P. Mandal, T. Senjyu, N. Urasaki, and T. Funabashi, β€œA neural network based several-hour-ahead electric load forecasting using similar days approach,” International Journal of Electrical Power and Energy Systems, vol. 28, no. 6, pp. 367–373, 2006. [10] Z. A. Bashir and M. E. El-Hawary, β€œApplying wavelets to shortterm load forecasting using PSO-based neural networks,” IEEE Transactions on Power Systems, vol. 24, no. 1, pp. 20–27, 2009. [11] P. Lauret, E. Fock, R. N. Randrianarivony, and J.-F. ManicomRamsamy, β€œBayesian neural network approach to short time load forecasting,” Energy Conversion and Management, vol. 49, no. 5, pp. 1156–1166, 2008. [12] N. Amjady and F. Keynia, β€œA new neural network approach to short term load forecasting of electrical power systems,” Energies, vol. 4, no. 3, pp. 488–503, 2011. [13] V. N. Vapnik, The Nature of Statistical Learning Theory, Springer, New York, NY, USA, 1995. [14] N. Sapankevych and R. Sankar, β€œTime series prediction using support vector machines: a survey,” IEEE Computational Intelligence Magazine, vol. 4, no. 2, pp. 24–38, 2009. [15] Y. Bao, T. Xiong, and Z. Hu, β€œForecasting air passenger traffic by support vector machines with ensemble empirical mode decomposition and slope-based method,” Discrete Dynamics in Nature and Society, vol. 2012, Article ID 431512, 12 pages, 2012. [16] E. E. Elattar, J. Goulermas, and Q. H. Wu, β€œElectric load forecasting based on locally weighted support vector regression,” IEEE Transactions on Systems, Man and Cybernetics C, vol. 40, no. 4, pp. 438–447, 2010. [17] J. Wang, W. Zhu, W. Zhang, and D. Sun, β€œA trend fixed on firstly and seasonal adjustment model combined with the πœ€-SVR for short-term forecasting of electricity demand,” Energy Policy, vol. 37, no. 11, pp. 4901–4909, 2009. [18] P.-F. Pai and W.-C. Hong, β€œSupport vector machines with simulated annealing algorithms in electricity load forecasting,” Energy Conversion and Management, vol. 46, no. 17, pp. 2669– 2688, 2005.

[19] T. Xiong, Y. Bao, and Z. Hu, β€œMultiple-output support vector regression with a firefly algorithm for interval-valued stock price index forecasting,” Knowledge-Based Systems, vol. 55, pp. 87–100, 2014. [20] Y. Bao, T. Xiong, and Z. Hu, β€œMulti-step-ahead time series prediction using multiple-output support vector regression,” Neurocomputing, 2013. [21] P. Duan, K. G. Xie, T. T. Guo, and X. G. Huang, β€œShort-term load forecasting for electric power systems using the PSO-SVR and FCM clustering techniques,” Energies, vol. 4, no. 1, pp. 173–184, 2011. [22] O. Chapelle, V. Vapnik, O. Bousquet, and S. Mukherjee, β€œChoosing multiple parameters for support vector machines,” Machine Learning, vol. 46, no. 1–3, pp. 131–159, 2002. [23] Y. K. Bao and Z. T. Liu, β€œA fast grid search method in support vector regression forecasting time series,” in Intelligent Data Engineering and Automated Learningβ€”IDEAL 2006, E. Corchado, H. Yin, V. Botti, and C. Fyfe, Eds., vol. 4224, pp. 504–511, Springer, Berlin, Germany, 2006. [24] Y. Bao, Z. Hu, and T. Xiong, β€œA PSO and pattern search based memetic algorithm for SVMs parameters optimization,” Neurocomputing, vol. 117, pp. 98–106, 2013. [25] T. A. F. Gomes, R. B. C. PrudΓͺncio, C. Soares, A. L. D. Rossi, and A. Carvalho, β€œCombining meta-learning and search techniques to select parameters for support vector machines,” Neurocomputing, vol. 75, no. 1, pp. 3–13, 2012. [26] J. Wang, L. Li, D. Niu, and Z. Tan, β€œAn annual load forecasting model based on support vector regression with differential evolution algorithm,” Applied Energy, vol. 94, pp. 65–70, 2012. [27] H. Li, S. Guo, H. Zhao, C. Su, and B. Wang, β€œAnnual electric load forecasting by a least squares support vector machine with a fruit fly optimization algorithm,” Energies, vol. 5, no. 11, pp. 4430–4445, 2012. [28] P.-F. Pai and W.-C. Hong, β€œForecasting regional electricity load based on recurrent support vector machines with genetic algorithms,” Electric Power Systems Research, vol. 74, no. 3, pp. 417–425, 2005. [29] W. C. Hong, Y. Dong, W. Y. Zhang, L. Y. Chen, and B. K. Panigrahi, β€œCyclic electric load forecasting by seasonal SVR with chaotic genetic algorithm,” International Journal of Electrical Power & Energy Systems, vol. 44, no. 1, pp. 604–614, 2013. [30] W.-C. Hong, β€œChaotic particle swarm optimization algorithm in a support vector regression electric load forecasting model,” Energy Conversion and Management, vol. 50, no. 1, pp. 105–117, 2009. [31] W.-C. Hong, β€œElectric load forecasting by seasonal recurrent SVR (support vector regression) with chaotic artificial bee colony algorithm,” Energy, vol. 36, no. 9, pp. 5568–5578, 2011. [32] W.-C. Hong, β€œElectric load forecasting by support vector model,” Applied Mathematical Modelling, vol. 33, no. 5, pp. 2444–2454, 2009. [33] W.-C. Hong, Y. Dong, C.-Y. Lai, L.-Y. Chen, and S.-Y. Wei, β€œSVR with hybrid chaotic immune algorithm for seasonal load demand forecasting,” Energies, vol. 4, no. 6, pp. 960–977, 2011. [34] W. Y. Zhang, W. C. Hong, Y. C. Dong, G. Tsai, J. T. Sung, and G. F. Fan, β€œApplication of SVR with chaotic GASA algorithm in cyclic electric load forecasting,” Energy, vol. 45, no. 1, pp. 850–858, 2012. [35] S. Areibi and Z. Yang, β€œEffective memetic algorithms for VLSI design = genetic algorithms + local search + multi-level clustering,” Evolutionary Computation, vol. 12, no. 3, pp. 327–353, 2004.

[36] Q. H. Nguyen, Y.-S. Ong, and M. H. Lim, β€œA probabilistic memetic framework,” IEEE Transactions on Evolutionary Computation, vol. 13, no. 3, pp. 604–623, 2009. [37] A. Elhossini, S. Areibi, and R. Dony, β€œStrength pareto particle swarm optimization and hybrid EA-PSO for multi-objective optimization,” Evolutionary Computation, vol. 18, no. 1, pp. 127–156, 2010. [38] Y.-S. Ong, M. H. Lim, and X. Chen, β€œMemetic computationβ€”past, present & future [research frontier],” IEEE Computational Intelligence Magazine, vol. 5, no. 2, pp. 24–31, 2010. [39] J. Tang, M. H. Lim, and Y. S. Ong, β€œDiversity-adaptive parallel memetic algorithm for solving large scale combinatorial optimization problems,” Soft Computing, vol. 11, no. 9, pp. 873–888, 2007. [40] V. N. Vapnik, Statistical Learning Theory, John Wiley & Sons, New York, NY, USA, 1998. [41] C. W. Hsu, C. C. Chang, and C. J. Lin, β€œA practical guide to support vector classification,” Department of Computer Science, National Taiwan University, 2003. [42] F. E. H. Tay and L. Cao, β€œApplication of support vector machines in financial time series forecasting,” Omega, vol. 29, no. 4, pp. 309–317, 2001. [43] P. Moscato and M. G. Norman, β€œA β€œmemetic” approach for the traveling salesman problem implementation of a computational ecology for combinatorial optimization on message-passing systems,” in Proceedings of the International Conference on Parallel Computing and Transputer Applications, 1992. [44] F. Neri and C. Cotta, β€œMemetic algorithms and memetic computing optimization: a literature review,” Swarm and Evolutionary Computation, vol. 2, pp. 1–14, 2012. [45] X. Chen, Y.-S. Ong, M.-H. Lim, and K. C. Tan, β€œA multifacet survey on memetic computation,” IEEE Transactions on Evolutionary Computation, vol. 15, no. 5, pp. 591–607, 2011. [46] X. S. Yang, β€œFirefly algorithms for multimodal optimization,” in Stochastic Algorithms: Foundations and Applications, O. Watanabe and T. Zeugmann, Eds., vol. 5792, pp. 169–178, Springer, Berlin, Germany, 2009. [47] X.-S. Yang, S. S. S. Hosseini, and A. H. Gandomi, β€œFirefly algorithm for solving non-convex economic dispatch problems with valve loading effect,” Applied Soft Computing Journal, vol. 12, no. 3, pp. 1180–1186, 2012. [48] K. Chandrasekaran and S. P. Simon, β€œOptimal deviation based firefly algorithm tuned fuzzy design for multi-objective UCP,” IEEE Transactions on Power Systems, vol. 28, no. 1, pp. 460–471, 2013. [49] M. D. Mckay, R. J. Beckman, and W. J. Conover, β€œA comparison of three methods for selecting values of input variables in the analysis of output from a computer code,” Technometrics, vol. 42, no. 1, pp. 55–61, 2000. [50] M. Momma and K. P. Bennett, β€œA pattern search method for model selection of support vector regression,” in Proceedings of the 2nd SIAM International Conference on Data Mining, pp. 261–274, 2002. [51] E. D. Dolan, R. M. Lewis, and V. Torczon, β€œOn the local convergence of pattern search,” SIAM Journal on Optimization, vol. 14, no. 2, pp. 567–583, 2003. [52] R. J. Hyndman and A. B. Koehler, β€œAnother look at measures of forecast accuracy,” International Journal of Forecasting, vol. 22, no. 4, pp. 679–688, 2006. [53] J. S. Armstrong and F. Collopy, β€œError measures for generalizing about forecasting methods: empirical comparisons,” International Journal of Forecasting, vol. 8, no. 1, pp. 69–80, 1992.

[54] F. X. Diebold and R. Mariano, β€œComparing predictive accuracy,” Journal of Business and Economic Statistics, vol. 13, pp. 253–263, 1995. [55] W. Conover, Practical Nonparametric Statistics, John Wiley & Sons, New York, NY, USA, 2nd edition, 1980. [56] A. Sorjamaa, J. Hao, N. Reyhani, Y. Ji, and A. Lendasse, β€œMethodology for long-term prediction of time series,” Neurocomputing, vol. 70, no. 16–18, pp. 2861–2869, 2007. [57] R. J. Hyndman and Y. Khandakar, β€œAutomatic time series forecasting: the forecast package for R,” Journal of Statistical Software, vol. 27, no. 3, pp. 1–22, 2008.
