American Journal of Operations Research
Vol.07 No.05 (2017), Article ID: 78659, 9 pages
DOI: 10.4236/ajor.2017.75018
Optimal Designs Technique for Locating the Optimum of a Second Order Response Function
Idorenyin Etukudo
Department of Mathematics & Statistics, Akwa Ibom State University, Ikot Akpaden, Mkpat Enin, Akwa Ibom State, Nigeria
Copyright © 2017 by author and Scientific Research Publishing Inc.
This work is licensed under the Creative Commons Attribution International License (CC BY 4.0).
http://creativecommons.org/licenses/by/4.0/
Received: June 2, 2017; Accepted: August 20, 2017; Published: August 23, 2017
ABSTRACT
A more efficient method of locating the optimum of a second order response function was of interest in this work. To this end, the principles of optimal designs of experiment are invoked and used for this purpose. It was found that the noticeable pitfall in response surface methodology (RSM) is circumvented by this method, since the step length is obtained by taking the derivative of the response function rather than by intuition or trial and error, as is the case in RSM. A numerical illustration shows that this method obtains the desired optimizer in just one move, which compares favourably with other known methods such as the Newton-Raphson method, which requires more than one iteration to reach the optimizer.
Keywords:
Optimal Designs of Experiment, Unconstrained Optimization, Response Surface Methodology, Modified Super Convergent Line Series Algorithm, Newton-Raphson Method
1. Introduction
The problem of locating the optimum of a second order response function has already been addressed by a method known as response surface methodology (RSM). RSM is a collection of mathematical and statistical techniques useful for analyzing problems in which several independent variables influence a dependent variable or response. The main objective is to determine the optimum operating conditions for the system, or to determine a region of the factor space in which the operating requirements are satisfied [1]. See also [2] [3] [4] [5] and [6]. For instance, the interest of a chemical engineer may lie in optimizing a process yield that is influenced by two variables: reaction time, x1, and reaction temperature, x2. The observed response can be represented as a function of the two independent variables as
y = f(x1, x2) + ε (1)

where ε is the random error term, while the expected response function is

E(y) = f(x1, x2) = η (2)
When the mathematical form of Equation (2) is not known, the expected response function can be approximated within the experimental region by a first order or a second order response function [7] .
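As a concrete illustration (not part of the original paper), the second order approximation of Equation (2) can be fitted from experimental data by ordinary least squares. The 3^2 factorial settings and yield values below are invented purely for the example.

```python
import numpy as np

# Hypothetical 3^2 factorial experiment in coded units: reaction time x1 and
# reaction temperature x2 at levels -1, 0, 1; the yields y are invented data.
x1 = np.repeat([-1.0, 0.0, 1.0], 3)
x2 = np.tile([-1.0, 0.0, 1.0], 3)
y = np.array([54.3, 60.1, 58.2, 64.8, 73.2, 68.0, 62.5, 70.4, 66.1])

# Model matrix for the second order response function
# y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(coef)  # estimated coefficients b0, b1, b2, b11, b22, b12
```

With nine runs and six parameters the fit is overdetermined, as is usual in response surface work.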
According to [1], the initial estimate of the optimum operating conditions for the system is frequently far from the actual optimum. When this happens, the objective of the experimenter is to move rapidly to the general vicinity of the optimum, and the actual step size or step length is determined by the experimenter based on experience. The determination by experience or trial and error of a step length that could guarantee rapid movement to the vicinity of the optimum is a pitfall. In order to advance the existing RSM procedure, [8] proposed a modification which fused the Newton-Raphson and mean-centre algorithms to obtain the optimum and to explore near-optimal settings within the optimal region. The problem with this modification is that it retains over 90% of the steps of the previous method and then introduces several other steps, thereby increasing computing time and storage space, only to obtain a selection of near-optimal factor settings that is itself iterative in nature. To circumvent this pitfall, this article solves the problem by making use of the principles of optimal designs of experiment. By designing an experiment optimally, we mean selecting N support points within the experimental region so that the aim of the experimenter can be realized. Unlike RSM, where the step length is obtained by trial and error, [9] had already modified an algorithm by [10] to solve unconstrained optimization problems using the principles of optimal designs of experiment, where the step length is obtained by taking the derivative of the response function. As shown by [9], a well-defined method of handling interactive effects in the case of quadratic surfaces has been provided. Since this new technique is a line search algorithm, it relies on a well-defined method of determining the direction of search, as given by [11].
The algorithmic procedure given in the next section requires that the optimal support points forming the initial design matrix, obtained from the entire experimental region, be partitioned into r groups. However, [12] has shown that optimal solutions are obtained with r = 2. This method of locating the optimum of a second order response function is an exact solution method, as against the iterative solution methods used in RSM and other traditional methods.
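The central idea above, a step length obtained from a derivative rather than from trial and error, can be made concrete for a quadratic surface: setting d f(x + λd)/dλ = 0 gives λ in closed form. The sketch below uses generic symbols (g for the gradient, H for the Hessian) and an invented quadratic, not the paper's notation or data.

```python
import numpy as np

def exact_step_length(g, H, d):
    # For a quadratic f, f(x + lam*d) has derivative g'd + lam * d'H d in lam,
    # so setting the derivative to zero gives the step length in closed form.
    return -float(g @ d) / float(d @ H @ d)

# Invented example: f(x) = x1^2 + 2*x2^2, starting from x = (2, 1)
H = np.array([[2.0, 0.0],
              [0.0, 4.0]])
x = np.array([2.0, 1.0])
g = H @ x                      # gradient of f at x
d = -g / np.linalg.norm(g)     # normalized descent direction
lam = exact_step_length(g, H, d)
x_new = x + lam * d            # one move with the derivative-based step length
```

A single derivative-based move is guaranteed to decrease f along the chosen direction, with no tuning by intuition.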
2. The Algorithm
The sequential steps involved in this algorithm are given below:
Initialization: Let the second order response function be defined in the general quadratic form as

f(x) = a0 + Σi ai xi + Σi aii xi² + Σi<j aij xi xj
Select N support points such that
where r = 2 is the number of partitioned groups and by choosing N arbitrarily, make an initial design matrix
Step 1: Compute the optimal starting point,
Step 2: Partition X into r = 2 groups and calculate
1)
2)
Step 3: Calculate the following:
1) The matrices of the interaction effect of the variables, X1I and X2I
2) Interaction vector of the response parameter,
3) Interaction vectors for the groups are
4) Matrices of mean square error for the groups are
5) The Hessian matrices, Hi, and the normalized Hessian matrices
6) The average information matrix
Step 4: Obtain the response vector, z and the direction vector, d.
Normalize d to have d*.
Step 5: Make a move to the point

x2 = x1 − λd*

for a minimization problem, or

x2 = x1 + λd*

for a maximization problem, where λ is the step length obtained by setting the derivative of the response function along d*, df(x1 ± λd*)/dλ, to zero.
Step 6: Termination criterion. Is |λ| < ε, where ε = 0.0001?
1) Yes. Stop and take the current point as the minimizer or maximizer, as the case may be.
2) No. Replace the current point with the new point and return to Step 5. If the new step length is zero, implement Step 6(1).
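The printed formulas for Steps 1-6 did not survive reproduction here, but the overall flow, choose a direction, compute the step length from the derivative, make a move, and test |λ| against ε, can be sketched as a generic line-search loop. This is an analogue under stated assumptions, not the author's exact procedure: in particular, the direction below is the normalized gradient rather than a direction derived from a partitioned design matrix.

```python
import numpy as np

def line_search_minimize(A, b, x0, eps=1e-4, max_moves=50):
    """Minimize f(x) = 0.5*x'Ax + b'x (A positive definite) by repeated
    moves x <- x - lam*d*, with the step length lam obtained by setting
    d f(x - lam*d*)/d lam = 0, and termination when |lam| < eps."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_moves):
        g = A @ x + b                          # gradient of f at x
        gnorm = np.linalg.norm(g)
        if gnorm == 0.0:
            break
        d = g / gnorm                          # normalized direction vector
        lam = float(g @ d) / float(d @ A @ d)  # derivative-based step length
        if abs(lam) < eps:                     # termination criterion
            break
        x = x - lam * d                        # make a move
    return x

# Invented example: f(x) = (x1 - 3)^2 + (x2 + 1)^2, i.e. A = 2I, b = (-6, 2)
A = 2.0 * np.eye(2)
b = np.array([-6.0, 2.0])
x_opt = line_search_minimize(A, b, x0=[0.0, 0.0])
print(x_opt)  # the minimizer (3, -1) is reached in a single move here
```

Because this invented quadratic is spherical, the exact step length reaches the minimizer in a single move, echoing the one-move behaviour reported in the paper; for a general quadratic several moves may be needed.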
3. Numerical Illustration
In this section, we give a numerical illustration, using the optimal designs technique to locate the optimum of a given second order response function.
Solution
Initialization: Given the response function, select N support points such that
where r = 2 is the number of partitioned groups and by choosing N arbitrarily, make an initial design matrix
Step 1: Compute the optimal starting point,
Since
then
Hence, the optimal starting point is
That is,
Step 2: Partitioning X into 2 groups of equal number of support points, we obtain the design matrices,
and
The respective information matrices are
and
and their inverses are
and
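The quantities in Step 2 are of the form Mi = Xi'Xi with inverses Mi^{-1}. Since the paper's actual matrices are not reproduced above, the sketch below computes them for a hypothetical pair of design matrices whose rows are support points (1, x1, x2); the entries are invented.

```python
import numpy as np

# Hypothetical group design matrices; each row is a support point (1, x1, x2).
X1 = np.array([[1.0, -1.0, -1.0],
               [1.0, -1.0,  1.0],
               [1.0,  1.0, -1.0]])
X2 = np.array([[1.0,  1.0,  1.0],
               [1.0,  0.0,  0.0],
               [1.0,  0.5, -0.5]])

M1 = X1.T @ X1              # information matrix of group 1
M2 = X2.T @ X2              # information matrix of group 2
M1_inv = np.linalg.inv(M1)  # inverse information matrices
M2_inv = np.linalg.inv(M2)
```

The support points must be chosen so that each Xi has full column rank; otherwise the information matrix is singular and its inverse does not exist.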
Step 3: Calculate the following:
1) The matrices of the interaction effect of the variables for the groups as
2) Interaction vector of the response parameter,
3) Interaction vectors for the groups are
4) Matrices of mean square error for the groups are
5) Matrices of coefficient of convex combinations of the matrices of mean square error are
and by normalizing Hi, we have
6) The average information matrix is
Step 4: Obtain the response vector
where
and hence, the direction vector
and by normalizing d, we have
Step 5: Obtain the step length, λ, from
That is,
and
Hence,
and by making a move to the next point, we have
Step 6: Since the step length is not less than ε, we make another move, replacing the previous point with the new one.
The new step length is obtained as follows:
That is,
which gives
Since the new step length is zero, the optimizer has been located at the first move, and hence
and
4. Conclusion
By using the optimal designs technique, we have been able to locate the optimum of a second order response function in just one move. This method circumvents the noticeable pitfall in RSM by taking the derivative of the response function to obtain the step length, rather than doing so by intuition or trial and error as is the case in RSM. The numerical illustration, which reaches the optimizer in just one move, compares favourably with other known methods such as the Newton-Raphson method, which requires more than one iteration to reach the optimizer.
Cite this paper
Etukudo, I. (2017) Optimal Designs Technique for Locating the Optimum of a Second Order Response Function. American Journal of Operations Research, 7, 263-271. https://doi.org/10.4236/ajor.2017.75018
References
- 1. Montgomery, D.C. (2001) Design and Analysis of Experiments. John Wiley & Sons, Inc., New York.
- 2. Dayananda, R., Shrikantha, R., Raviraj, S. and Rajesh, N. (2010) Application of Response Surface Methodology on Surface Roughness in Grinding of Aerospace Materials. ARPN Journal of Engineering and Applied Science, 5, 23-28.
- 3. Adinarayana, K. and Ellaiah, O. (2002) Response Surface Optimization of the Critical Medium Components for the Production of Alkaline Phosphate by Newly Isolated Bacillus. Journal of Pharmaceutical Science, 5, 272-278.
- 4. Amayo, T. (2010) Response Surface Methodology and Its Application to Automotive Suspension Designs.
- 5. Arokiyamany, A. and Sivakumaar, R.K. (2011) The Use of Response Surface Methodology in Optimization Process for Bacteriocin. International Journal of Biomedical Research, 2.
- 6. Bradley, D.N. (2007) The Response Surface Methodology. Master's Thesis, Indiana University of South Bend, South Bend. http://www.cs.iusb.edu/thesis/Nbradley.thesis.pdf
- 7. Cochran, W.G. and Cox, G.M. (1992) Experimental Designs. John Wiley & Sons, Inc., New York.
- 8. Akpan, S.S., Usen, J. and Ugbe, T.A. (2013) An Alternative Procedure for Solving Second Order Response Design Problems. International Journal of Scientific & Engineering Research, 4, 2233-2245.
- 9. Umoren, M.U. and Etukudo, I.A. (2010) A Modified Super Convergent Line Series Algorithm for Solving Unconstrained Optimization Problems. Journal of Modern Mathematics and Statistics, 4, 115-122. https://doi.org/10.3923/jmmstat.2010.115.122
- 10. Onukogu, I.B. (2002) Super Convergent Line Series in Optimal Design on Experimental and Mathematical Programming. AP Express Publisher, Nsukka.
- 11. Umoren, M.U. and Etukudo, I.A. (2009) A Modified Super Convergent Line Series Algorithm for Solving Quadratic Programming Problems. Journal of Mathematical Sciences, 20, 55-66.
- 12. Chigbu, P.E. and Ugbe, T.A. (2002) On the Segmentation of the Response Surfaces for Super Convergent Line Series Optimal Solutions of Constrained Linear and Quadratic Programming Problems. Global Journal of Mathematical Sciences, 1, 27-34. https://doi.org/10.4314/gjmas.v1i1.21319