6         OptionsMatlab Examples

6.1        DoE Direct search

Perform a DoE over the problem defined by the input structure, and then plot the results. The results of this DoE are used to build Response Surface Models (RSMs) in many of the subsequent examples.

 

input1 = createBeamStruct;

input1.OMETHD = 2.8;        %Design of Experiments

input1.NITERS = 50;         %Number of iterations

input1.OLEVEL = 2;

input1.MC_TYPE = 4;         %Full factorial DoE

output1 = OptionsMatlab(input1)

 

output1 =

 

       VARS: [2x1 double]

     OBJFUN: 3.6877e+003

       CONS: [5x1 double]

     OBJTRC: [1x1 struct]

    CONSTRC: [1x1 struct]

 

%Print a digest of the optimisation and determine if

%the optimum returned is valid

isvalid = optimisationDigest(output1, input1)

 

============================================================

 

 Optimisation of the problem defined by "beamobjfun" and "beamobjcon"

 Optimisation method: 2.8

 

 Status after 50 evaluations is :-

 

 Trial vector

 

 Lwr Bound       Vector       Uppr Bound    Variable (units)

 

   5.00000000 <  24.68750000 >  50.00000000  BREADTH 

   2.00000000 <  14.93750000 >  25.00000000  HEIGHT 

 

 No of V. Boundary Violations =   0

 

 Objective Function (min.)    =  3687.6953   AREA

 

 

 Constraints vector

 

 Lwr Bound       Vector       Uppr Bound    Variable (units)

 

              <  81.69200669 > 200.00000000  SIGMA-B 

              <   2.03379058 > 100.00000000  TAU 

              <   3.78699170 >   5.00000000  DEFLN 

              <   6.05063291 >  10.00000000  H-ON-B 

 5000.00000000 < 290554.98816615             F-CRIT 

 

 No of Constraint Violations =   0

 

============================================================

 

%Plot the results of the optimisation

plotOptionsSurfaces(output1, input1)

optimisationTerrain(output1, input1)

optimisationTrace(output1, input1)

optimisationHistory({output1}, {'Design of Experiments'}) 

 

isvalid =

 

     1
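For readers unfamiliar with full factorial designs (MC_TYPE = 4): such a DoE evaluates every combination of a fixed number of levels per design variable. A minimal sketch of the idea, illustrated here in Python/NumPy (the bounds are those of the beam problem; the level count is an arbitrary illustrative choice):

```python
from itertools import product

import numpy as np

# Beam-problem bounds: BREADTH in [5, 50], HEIGHT in [2, 25]
lower = np.array([5.0, 2.0])
upper = np.array([50.0, 25.0])
levels = 7  # illustrative number of levels per variable

# Evenly spaced levels for each design variable
grids = [np.linspace(lo, hi, levels) for lo, hi in zip(lower, upper)]

# Full factorial design: the Cartesian product of the per-variable levels
points = np.array(list(product(*grids)))  # shape: (levels**n_vars, n_vars)
```

The number of points grows as levels to the power of the number of design variables, which is why full factorial designs are practical only for low-dimensional problems.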

 

6.2        RSM returning update points

Build and search a Response Surface Model (RSM) using the results of example 6.1. This search will return up to 10 update points at which additional evaluations would best improve the quality of the DoE.

 

input2 = createBeamStruct;

input2.OMETHD = 4;          %Genetic Algorithm

input2.NITERS = 50;

input2.OLEVEL = 2;

input2.OBJMOD = 3.3;        %First order polynomial regression

    %model plus squares

input2.CONMOD = 3.3;        %First order polynomial regression

    %model plus squares

input2.NUMUPDATE = 10;      %10 update points

output2 = OptionsMatlab(input2, output1)

 

output2 =

 

         VARS: [2x1 double]

       OBJFUN: 2.5149e+003

         CONS: [5x1 double]

    DOE_TRACE: [1x1 struct]
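For reference, OBJMOD/CONMOD = 3.3 denotes a first-order polynomial regression model augmented with squared terms, i.e. a fit of the form y = b0 + b1*x1 + b2*x2 + c1*x1^2 + c2*x2^2. A sketch of such a fit by linear least squares, shown in Python/NumPy with synthetic data rather than the beam problem:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic DoE data: two design variables, 50 points
X = rng.uniform([5.0, 2.0], [50.0, 25.0], size=(50, 2))
y = 2.0 * X[:, 0] + 0.5 * X[:, 1] ** 2 + 3.0  # known test function

# Design matrix for a first-order polynomial plus squares:
# columns [1, x1, x2, x1^2, x2^2]
A = np.column_stack([np.ones(len(X)), X, X ** 2])
coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)

def predict(x):
    """Evaluate the fitted surface at a new design point."""
    x = np.asarray(x, dtype=float)
    row = np.concatenate(([1.0], x, x ** 2))
    return float(row @ coeffs)
```

Because the test function lies within the model family, the least-squares fit recovers it essentially exactly; on real DoE data the RSM is only an approximation, which is why update points are requested where the model is poorest.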

 

6.3        DoE evaluating candidate points

Perform a candidate point DoE search to evaluate the update points suggested by example 6.2.

 

input3 = createBeamStruct;

input3.OLEVEL = 2;

input3.OMETHD = 2.8;        %Design of Experiments

                            %Specify update points as

                            %candidate points

input3.DOE_TRACE = output2.DOE_TRACE;

                            %Set the number of iterations

input3.NITERS = output2.DOE_TRACE.NCALLS+1;

input3.MC_TYPE = 7;         %Specify that the DoE uses

                            %candidate points

                            %Note that the meaning of MC_TYPE

                            %has changed since version 0.6.5

output3 = OptionsMatlab(input3)

 

output3 =

 

       VARS: [2x1 double]

     OBJFUN: 6000

       CONS: [5x1 double]

     OBJTRC: [1x1 struct]

    CONSTRC: [1x1 struct]

 

%Concatenate the output structures from examples 6.1 and 6.3

output3_cat = optimisationAppendDataPoints(output1,output3)

 

output3_cat =

 

       VARS: [2x1 double]

     OBJFUN: 3.6877e+003

       CONS: [5x1 double]

     OBJTRC: [1x1 struct]

    CONSTRC: [1x1 struct]

 

6.4        RSM using candidate points

Build and evaluate an RSM at specified points. The utility function optimisationSampleRSM can assist with this (see example 6.9).

 

input4 = createBeamStruct;

input4.OLEVEL = 2;

input4.OMETHD = 2.8;          %Design of Experiments

                              %Specify the candidate points to

                              %be evaluated

input4.DOE_TRACE.NCALLS = output1.OBJTRC.NCALLS;

input4.DOE_TRACE.VARS = output1.OBJTRC.VARS;

input4.NITERS = input4.DOE_TRACE.NCALLS +1;

input4.MC_TYPE = 7;           %DoE using candidate points

input4.OBJMOD = 3.3;          %First order polynomial

                              %regression model plus squares

input4.CONMOD = 3.3;          %First order polynomial

                              %regression model plus squares

output4 = OptionsMatlab(input4, output3_cat)

 

output4 =

 

      VARS: [2x1 double]

    OBJFUN: 2.6319e+003

      CONS: [5x1 double]

    RSMTRC: [1x1 struct]

 

%Plot the RSM

fig = optimisationTerrain(output4, input4, 2);

optimisationTrace(output4, input4, 2, fig);

 

6.5        Direct search with checkpointing

Checkpoint the search history of a direct search every 300 generations to the file 'optimTest5.mat'.

 

input5 = createBeamStruct;

input5.OLEVEL = 2;

input5.OMETHD = 2.8;

input5.NITERS = 500;    %500 iterations

input5.MAXJOBS = 100;   %Submit jobs in groups of 100

input5.CHKPT_INTV = 300;%Checkpoint every 300 generations

input5.CHKPT_FILE = 'optimTest5.mat'; %Checkpoint file name

delete('optimTest5.mat')%Remove existing checkpoint file

output5 = OptionsMatlab(input5)

 

output5 =

 

       VARS: [2x1 double]

     OBJFUN: 2.9455e+003

       CONS: [5x1 double]

     OBJTRC: [1x1 struct]

    CONSTRC: [1x1 struct]

 

load('optimTest5.mat')  %Load checkpoint file

whos CHKPOINT

 

  Name           Size                    Bytes  Class

 

  CHKPOINT       1x1                     49256  struct array

 

Grand total is 6012 elements using 49256 bytes
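The checkpointing pattern itself (periodically saving the accumulated search history so that an interrupted search can be resumed) is generic. A sketch in Python, with pickle standing in for the .mat file and a placeholder objective function; the file name and interval below merely mirror the example:

```python
import pickle

CHKPT_FILE = "optimTest5.pkl"   # illustrative stand-in for optimTest5.mat
CHKPT_INTV = 300                # checkpoint every 300 evaluations

history = []                    # accumulated (x, f(x)) search history

def evaluate(i):
    return i * i                # placeholder objective for illustration

for i in range(1, 501):         # 500 evaluations, as in the example
    history.append((i, evaluate(i)))
    if i % CHKPT_INTV == 0:
        with open(CHKPT_FILE, "wb") as fh:
            pickle.dump(history, fh)   # overwrite with the latest history

# After a crash, the search can be resumed from the last checkpoint
with open(CHKPT_FILE, "rb") as fh:
    restored = pickle.load(fh)
```

Note that, as in OptionsMatlab, the checkpoint lags the live history: here the restored history holds 300 points even though 500 evaluations were completed.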

 

6.6        Parallel job submission with userdata

This example uses the Geodise compute toolbox [3], which provides client functionality for Globus Grid resources that may be used to evaluate computational jobs. The jobs will be submitted to the Globus resource to run concurrently. When the jobs are complete, the results will be retrieved and parsed to determine the objective function values. Note that you must have the Geodise compute toolbox installed, and hold valid credentials with permission to submit jobs to the specified compute resource.

 

%Define the Globus server to which to submit the jobs

GLOBUSSERVER = 'escience-dept2.sesnet.soton.ac.uk';

 

gd_createproxy

 

Paused: Press any key...

 

input6 = createBeamStructParallel;

input6.OLEVEL = 0;

input6.MAXJOBS = 10;    %The number of the jobs to be run

                        %concurrently

input6.NITERS = 20;     %The number of iterations

 

                        %USERDATA field is used to pass the

                        %host name upon which to run the

                        %objective function to the Matlab

                        %function

input6.USERDATA.hostname = GLOBUSSERVER;

output6 = OptionsMatlab(input6)

 

[...]

 

ohandle =

 

https://escience-dept2.sesnet.soton.ac.uk:30040/10303/1134728028/

 

 

uniquedir =

 

20051216T101347_57891/

 

 

EVAL =

 

  3.9666e+003

 

output6 =

 

       VARS: [2x1 double]

     OBJFUN: 3.9666e+003

       CONS: [5x1 double]

     OBJTRC: [1x1 struct]

    CONSTRC: [1x1 struct]

 

6.7        Hyper-parameter tuning

This example will tune hyper-parameters for a Stochastic Process Model over the results of the DoE produced in example 6.1. The hyper-parameters will be tuned using the optimisation algorithm specified by OMETHD. The tuned hyper-parameters will be returned in the fields OBJHYPER (or CONHYPER) of the output structure, which can be supplied in the input structure when building a Stochastic Process Model RSM.

 

input7 = createBeamStruct;

input7.OLEVEL = 0;

input7.OBJMOD = 4.1;     %Tune Stochastic Process Model

                         %hyper-parameters over the

                         %objective function

input7.CONMOD = 4.1;     %Tune Stochastic Process Model

                         %hyper-parameters over the

                         %constraints

input7.TUNEHYPER = 1;    %Tune the hyper-parameters (do not

                         %search the user's problem)

input7.OMETHD = 5;       %Simulated Annealing

 

                         %Note that if OBJHYPER or CONHYPER are

                         %provided these hyper-parameters will

                         %be used in preference to those

                         %generated by OPTRSS

output7 = OptionsMatlab(input7, output1)

 

output7 =

 

    OBJHYPER: [1x1 struct]

     OBJ_CLF: 712.6938

    CONHYPER: [1x1 struct]

     CST_CLF: 824.2750
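The hyper-parameters tuned here parameterise the Stochastic Process Model's correlation function. In kriging-style models this commonly takes the form R(x, x') = exp(-sum_i 10^theta_i * |x_i - x'_i|^p_i), with THETA the width parameters (often searched on a log10 scale, consistent with the bounds of roughly [-10, 3] shown in example 6.10) and EXP the exponents p_i, typically in [1, 2]. A sketch under those assumptions, in Python; this is an illustration, not the OPTIONS implementation:

```python
import numpy as np

def correlation(x1, x2, theta, p):
    """Kriging-style correlation: exp(-sum 10**theta_i * |dx_i|**p_i).

    theta is assumed to be on a log10 scale; p is the EXP exponent,
    typically between 1 and 2.  Both are one value per design variable.
    """
    d = np.abs(np.asarray(x1, float) - np.asarray(x2, float))
    return float(np.exp(-np.sum(10.0 ** np.asarray(theta) * d ** np.asarray(p))))

theta = [0.15, 0.15]   # one value per design variable
p = [2.0, 2.0]

# Correlation is 1 at zero separation and decays with distance
r_same = correlation([1.0, 2.0], [1.0, 2.0], theta, p)
r_near = correlation([1.0, 2.0], [1.5, 2.0], theta, p)
```

Tuning chooses theta and p so that the implied correlation structure best explains the DoE data, which is what the concentrated likelihood values (OBJ_CLF, CST_CLF) above measure.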

 

6.8        User-defined sequential optimiser

This example invokes the sequential optimiser defined by the Matlab function 'optum1.m', which randomly generates search points within the parameter space. It is possible to write a Matlab function that provides alternative behaviour for a sequential optimiser.
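The kind of random sequential search that 'optum1.m' performs can be sketched generically (Python for illustration; the bounds mirror the beam problem, while the objective function below is a placeholder):

```python
import random

lower = [5.0, 2.0]      # beam-problem bounds, as in createBeamStruct
upper = [50.0, 25.0]

def objective(x):
    """Placeholder objective for illustration only."""
    return (x[0] - 20.0) ** 2 + (x[1] - 10.0) ** 2

random.seed(42)
best_x, best_f = None, float("inf")
for _ in range(50):      # one trial point per iteration
    x = [random.uniform(lo, hi) for lo, hi in zip(lower, upper)]
    f = objective(x)
    if f < best_f:       # keep the best point seen so far
        best_x, best_f = x, f
```

A user-defined sequential optimiser differs from this sketch only in how the next trial point is chosen; any rule that proposes one point per iteration fits the same loop.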

 

input8 = createBeamStruct;

input8.OLEVEL = 2;

input8.OMETHD = 2.1;      %User-defined optimiser 1

input8.OPTUM1 = 'optum1'; %Specifies function 'optum1.m' as

                          %user-defined optimiser

output8 = OptionsMatlab(input8)

 

output8 =

 

       VARS: [2x1 double]

     OBJFUN: 2.6409e+003

       CONS: [5x1 double]

     OBJTRC: [1x1 struct]

    CONSTRC: [1x1 struct]

 

6.9        Sample a Response Surface Model

This example uses the utility function optimisationSampleRSM to build an RSM and sample the RSM at 100 evenly spaced points within the parameter space. Compare this method to example 6.4.

 

%Create an input structure to search an RSM

input9 = createBeamStruct;

input9.OLEVEL = 2;

input9.OBJMOD = 3.3;

input9.CONMOD = 3.3;

 

%Sample 100 evenly spaced points

output9 = optimisationSampleRSM(input9, output1, 100)

 

%Plot the points sampled from the RSM

optimisationTerrain(output9, input9)

 

output9 =

 

      VARS: [2x1 double]

    OBJFUN: 2.4349e+003

      CONS: [5x1 double]

    RSMTRC: [1x1 struct]

 

6.10    Build a stochastic process model RSM with quick tuning

This example builds a stochastic process model RSM using quick hyper-parameter tuning (by setting the flag RSM_QCK_HP). Here a single value of each of the hyper-parameters THETA and EXP is tuned across all design variables, rather than one value of THETA and EXP per design variable. The values of OBJ_EXP and OBJ_THETA, and of CST_EXP and CST_THETA (in the structures OBJHYPER and CONHYPER), will therefore be scalars rather than vectors of length NVRS.

 

%Create an input structure to search an SPM RSM with quick tuning

input10 = createBeamStruct;

input10.OLEVEL = 0;

input10.OBJMOD = 4.1;

input10.CONMOD = 4.1;

input10.RSM_QCK_HP = 1;

 

output10 = OptionsMatlab(input10, output1);

output10.OBJHYPER

output10.CONHYPER

 

 

ans =

      OBJ_LAMBDA: -6

    U_OBJ_LAMBDA: 3

    L_OBJ_LAMBDA: -20

       OBJ_THETA: 0.1548

     U_OBJ_THETA: 3

     L_OBJ_THETA: -10

         OBJ_EXP: 2

       U_OBJ_EXP: 2

       L_OBJ_EXP: 1

 

ans =

      CST_LAMBDA: -6

    U_CST_LAMBDA: 3

    L_CST_LAMBDA: -20

       CST_THETA: 0.1563

     U_CST_THETA: 3

     L_CST_THETA: -10

         CST_EXP: 2

       U_CST_EXP: 2

       L_CST_EXP: 1

 

6.11    Search a tuned stochastic process model RSM

This example samples and then searches the stochastic process model RSM built using the quick-tuned hyper-parameters. The scalar hyper-parameter values OBJ_THETA and OBJ_EXP are duplicated across the design variables of the problem and assigned to the field OBJHYPER of the input structure.

 

% Duplicate the scalar hyper-parameter values across the design variables

inputStruct = createBeamStruct;

inputStruct.OBJHYPER.OBJ_THETA  = output10.OBJHYPER.OBJ_THETA*ones(inputStruct.NVRS,1);

inputStruct.OBJHYPER.OBJ_EXP    = output10.OBJHYPER.OBJ_EXP*ones(inputStruct.NVRS,1);

inputStruct.OBJHYPER.OBJ_LAMBDA = output10.OBJHYPER.OBJ_LAMBDA;

 

%Sample the RSM surface

input11a = inputStruct;

input11a.OLEVEL = 0;

input11a.OBJMOD = 4.1;  %Evaluate SPM RSM objective function

input11a.CONMOD = 0.0;  %Evaluate constraint function directly

output11a = optimisationSampleRSM(input11a, output1, 400);

 

%Create an input structure to search the SPM RSM using a GA

input11b = input11a;    %Copy the sampling input structure

input11b.OMETHD = 4;    %Genetic Algorithm

input11b.NITERS = 500;  %10 generations

output11b = OptionsMatlab(input11b, output1);

output11b = optimisationSearchTrace(output11b)  %Retrieve optimum from the trace history

 

output11b =

 

        VARS: [2x1 double]

      OBJFUN: 2.6948e+003

        CONS: [5x1 double]

      RSMTRC: [1x1 struct]

    OBJHYPER: [1x1 struct]

 

%Plot the RSM and optimum point

optimisationTerrain(output11a, input11a, 5)

hold on;

plot3(output11b.VARS(1,1), output11b.VARS(2,1), ...

      output11b.OBJFUN, 'ko', 'MarkerFaceColor', 'k')

 

6.12    Search the root mean square error of a tuned stochastic process model RSM

This example samples and then searches the Root Mean Square Error of the stochastic process model RSM built using the quick-tuned hyper-parameters. The scalar hyper-parameter values OBJ_THETA and OBJ_EXP are duplicated across the design variables of the problem and assigned to the field OBJHYPER of the input structure.

 

The RMSE surface is invariant to a change in the direction of search for the underlying problem. This means that the surface can be searched in either direction for points of maximum or minimum error. The example first verifies that the RMSE surface is identical when the direction of search is reversed.

 

Note that the root mean square error of the SPM falls to zero at all sampled points (since the values of the objective and constraints are known at these points), so searching for the minimum of the surface is of little value. To find the maximum error in the stochastic process model RSM, the direction of search in the input field DIRCTN is always set to +1, regardless of the direction of search of the underlying problem. This is worth highlighting because it differs from searches of the other stochastic process model properties. In the cases of expected improvement (OBJMOD=4.3), constrained expected improvement (OBJMOD=4.31), constrained feasibility of improvement (OBJMOD=4.32) and probability of improvement (OBJMOD=4.33), the RSM surface that is built depends critically on the direction of search of the underlying problem. Searches of these surfaces are hard-coded within OPTIONS to build the surface according to the direction of search of the underlying problem, and to seek the maximum in that surface accordingly. Only in the case of RMSE must the direction of search be explicitly set to +1 to find the maximum in the root mean square error of the RSM.
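Both properties relied on above, namely that the RMSE of an interpolating SPM is zero at sampled points and that it is independent of the sign of the objective, can be seen in a toy unit-variance kriging predictive variance, s2(x) = 1 - r(x)' * inv(R) * r(x), which depends only on the sample locations. A Python sketch under those assumptions (not the OPTIONS implementation):

```python
import numpy as np

def corr(a, b, theta=1.0):
    """Squared-exponential correlation between two points."""
    return float(np.exp(-theta * np.sum((a - b) ** 2)))

X = np.array([[0.0], [0.5], [1.0]])            # sampled (DoE) points
R = np.array([[corr(a, b) for b in X] for a in X])

def mse(x):
    """Predictive variance of a simple unit-variance kriging model.

    It depends only on the point locations X, never on the observed
    objective values, so it cannot change when the direction of
    search (the sign of the objective) is reversed.
    """
    r = np.array([corr(x, a) for a in X])
    return float(1.0 - r @ np.linalg.solve(R, r))

at_sample = mse(np.array([0.5]))   # ~0: the model interpolates here
away      = mse(np.array([0.25]))  # > 0 between samples
```

This is why the minimum of the RMSE surface is uninteresting (it is zero at every DoE point), while its maximum identifies where the model is least trustworthy.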

 

% Duplicate the scalar hyper-parameter values across the design variables

inputStruct = createBeamStruct;

inputStruct.OBJHYPER.OBJ_THETA  = output10.OBJHYPER.OBJ_THETA*ones(inputStruct.NVRS,1);

inputStruct.OBJHYPER.OBJ_EXP    = output10.OBJHYPER.OBJ_EXP*ones(inputStruct.NVRS,1);

inputStruct.OBJHYPER.OBJ_LAMBDA = output10.OBJHYPER.OBJ_LAMBDA;

 

%Sample the RSM surface

input12a = inputStruct;

input12a.OLEVEL = 0;

input12a.OBJMOD = 4.2;  %Evaluate RMSE of SPM RSM over objective function

input12a.CONMOD = 0.0;  %Evaluate constraint function directly

input12a.ONAM = 'RMSE'; %Label objective

input12a.DIRCTN = +1;   %The error surface should not change with DIRCTN

output12a = optimisationSampleRSM(input12a, output1, 400);  %Sample RSM

 

input12b = input12a;    %Copy the sampling input structure

input12b.DIRCTN = -1;   %The error surface should not change with DIRCTN

output12b = optimisationSampleRSM(input12b, output1, 400);

 

%Check that the RMSE surface is invariant under change of DIRCTN

if (sum(abs(output12a.RSMTRC.OBJFUN - output12b.RSMTRC.OBJFUN)) > 0)

    error('*** RMSE of Stochastic Process Model is not invariant under change of DIRCTN ***')

end

 

input12c = input12b;    %Copy the sampling input structure

input12c.DIRCTN = +1;   %Search for maximum in RMSE of the SPM

                        %(NB. This value is set to +1 regardless of the

                        % direction of the underlying problem)

input12c.OMETHD = 4;    %Genetic Algorithm

input12c.NITERS = 500;  %10 generations

output12c = OptionsMatlab(input12c, output1);

output12c = optimisationSearchTrace(output12c)  % Search the trace history for optimum

 

output12c =

 

        VARS: [2x1 double]

      OBJFUN: 746.8510

        CONS: [5x1 double]

      RSMTRC: [1x1 struct]

    OBJHYPER: [1x1 struct]

 

%Plot the RSM and optimum point

optimisationTerrain(output12a, input12a, 5)

hold on;

plot3(output12c.VARS(1,1), output12c.VARS(2,1), ...

      output12c.OBJFUN, 'ko', 'MarkerFaceColor', 'k')

 

6.13    Search the expected improvement of a tuned stochastic process model RSM

This example samples and then searches the Expected Improvement of the stochastic process model RSM built using the quick-tuned hyper-parameters. The scalar hyper-parameter values OBJ_THETA and OBJ_EXP are duplicated across the design variables of the problem and assigned to the field OBJHYPER of the input structure. Note that for a minimisation problem OPTIONS inverts the Expected Improvement calculation, returning a minimum value of the inverted problem at the point of maximum expected improvement of the RSM.
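For a minimisation problem, the standard Expected Improvement at a point with model prediction mu and standard error s, given the best observed value y_min, is EI = (y_min - mu)*Phi(z) + s*phi(z) with z = (y_min - mu)/s, where Phi and phi are the standard normal CDF and PDF. A sketch of that textbook formula in Python (OPTIONS may differ in detail, e.g. the inversion described above):

```python
import math

def expected_improvement(mu, s, y_min):
    """Textbook EI for minimisation; returns 0 when s == 0."""
    if s <= 0.0:
        return 0.0
    z = (y_min - mu) / s
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return (y_min - mu) * cdf + s * pdf

# EI rewards both a low predicted value and high model uncertainty
ei_good = expected_improvement(mu=1.0, s=0.5, y_min=2.0)   # promising point
ei_sure = expected_improvement(mu=3.0, s=1e-9, y_min=2.0)  # confidently worse
```

Searching this surface therefore balances exploitation (low mu) against exploration (high s), which is why EI is a popular criterion for choosing update points.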

 

% Duplicate the scalar hyper-parameter values across the design variables

inputStruct = createBeamStruct;

inputStruct.OBJHYPER.OBJ_THETA  = output10.OBJHYPER.OBJ_THETA*ones(inputStruct.NVRS,1);

inputStruct.OBJHYPER.OBJ_EXP    = output10.OBJHYPER.OBJ_EXP*ones(inputStruct.NVRS,1);

inputStruct.OBJHYPER.OBJ_LAMBDA = output10.OBJHYPER.OBJ_LAMBDA;

 

%Sample the RSM surface

input13a = inputStruct;

input13a.OLEVEL = 0;

input13a.OBJMOD = 4.3;  %Evaluate EI of SPM RSM over objective function

input13a.CONMOD = 0.0;  %Evaluate constraint function directly

input13a.ONAM = 'EI';   %Label objective

output13a = optimisationSampleRSM(input13a, output1, 400);  %Sample RSM

 

%Create an input structure to search the SPM RSM using a GA

input13b = input13a;    %Copy the sampling input structure

input13b.OMETHD = 4;    %Genetic Algorithm

input13b.NITERS = 500;  %10 generations

output13b = OptionsMatlab(input13b, output1);

output13b = optimisationSearchTrace(output13b)  %Search the trace history for optimum

 

output13b =

 

        VARS: [2x1 double]

      OBJFUN: 115.1293

        CONS: [5x1 double]

      RSMTRC: [1x1 struct]

    OBJHYPER: [1x1 struct]

 

%Plot the RSM and optimum point

optimisationTerrain(output13a, input13a, 5)

hold on;

plot3(output13b.VARS(1,1), output13b.VARS(2,1), ...

      output13b.OBJFUN, 'ko', 'MarkerFaceColor', 'k')

 

6.14    Search the probability of improvement of a tuned stochastic process model RSM

This example samples and then searches the Probability of Improvement of the stochastic process model RSM built using the quick-tuned hyper-parameters. The scalar hyper-parameter values OBJ_THETA and OBJ_EXP are duplicated across the design variables of the problem and assigned to the field OBJHYPER of the input structure. Note that for a minimisation problem OPTIONS inverts the Probability of Improvement calculation, returning a minimum value of the inverted problem at the point of maximum probability of improvement of the RSM (this is why the calculation may return a negative value for the probability when searching a minimisation problem).
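The textbook Probability of Improvement is simpler than EI: PI = Phi((y_min - mu)/s), the model's probability that a point improves on the current best value y_min. A sketch of that standard formula in Python (the sign inversion OPTIONS applies for minimisation is omitted here):

```python
import math

def probability_of_improvement(mu, s, y_min):
    """P(prediction at this point beats the current best y_min)."""
    if s <= 0.0:
        return 1.0 if mu < y_min else 0.0
    z = (y_min - mu) / s
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

pi_likely   = probability_of_improvement(mu=1.0, s=0.5, y_min=2.0)
pi_unlikely = probability_of_improvement(mu=3.0, s=0.5, y_min=2.0)
```

Unlike EI, PI does not weight the size of the improvement, only its likelihood, so it tends to favour small but near-certain gains close to the current best.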

 

% Duplicate the scalar hyper-parameter values across the design variables

inputStruct = createBeamStruct;

inputStruct.OBJHYPER.OBJ_THETA  = output10.OBJHYPER.OBJ_THETA*ones(inputStruct.NVRS,1);

inputStruct.OBJHYPER.OBJ_EXP    = output10.OBJHYPER.OBJ_EXP*ones(inputStruct.NVRS,1);

inputStruct.OBJHYPER.OBJ_LAMBDA = output10.OBJHYPER.OBJ_LAMBDA;

 

%Sample the RSM surface

input14a = inputStruct;

input14a.OLEVEL = 0;

input14a.OBJMOD = 4.33; %Evaluate PI of SPM RSM over objective function

input14a.CONMOD = 0.0;  %Evaluate constraint function directly

input14a.ONAM = 'PI';   %Label objective

output14a = optimisationSampleRSM(input14a, output1, 400);  %Sample RSM

 

%Create an input structure to search the SPM RSM using a GA

input14b = input14a;    %Copy the sampling input structure

input14b.OMETHD = 4;    %Genetic Algorithm

input14b.NITERS = 500;  %10 generations

output14b = OptionsMatlab(input14b, output1);

output14b = optimisationSearchTrace(output14b)  %Search the trace history for optimum

 

output14b =

 

        VARS: [2x1 double]

      OBJFUN: -1.0776e-042

        CONS: [5x1 double]

      RSMTRC: [1x1 struct]

    OBJHYPER: [1x1 struct]

 

%Plot the RSM and optimum point

optimisationTerrain(output14a, input14a, 5)

hold on;

plot3(output14b.VARS(1,1), output14b.VARS(2,1), ...

      output14b.OBJFUN, 'ko', 'MarkerFaceColor', 'k')

 

6.15    Search the constrained expected improvement of a tuned stochastic process model RSM

This example samples and then searches the constrained Expected Improvement of the stochastic process model RSM built using the quick-tuned hyper-parameters. The scalar hyper-parameter values OBJ_THETA, OBJ_EXP, CST_THETA and CST_EXP are duplicated across the design variables of the problem and assigned to the fields OBJHYPER and CONHYPER of the input structure. Note that for a minimisation problem OPTIONS inverts the constrained Expected Improvement calculation, returning a minimum value of the inverted problem at the point of maximum expected improvement of the constrained RSM.
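A common form of constrained Expected Improvement weights the unconstrained EI by the modelled probability that each constraint is satisfied, EI_c = EI * product_j P(g_j <= limit_j). A sketch under that assumption, in Python (an illustration, not the OPTIONS implementation):

```python
import math

def norm_cdf(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def expected_improvement(mu, s, y_min):
    """Textbook EI for minimisation; returns 0 when s == 0."""
    if s <= 0.0:
        return 0.0
    z = (y_min - mu) / s
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    return (y_min - mu) * norm_cdf(z) + s * pdf

def constrained_ei(mu, s, y_min, cons):
    """cons: list of (mu_g, s_g, upper_limit), one per constraint model."""
    p_feas = 1.0
    for mu_g, s_g, limit in cons:
        p_feas *= norm_cdf((limit - mu_g) / s_g)  # P(g <= limit)
    return expected_improvement(mu, s, y_min) * p_feas

ei_plain = expected_improvement(1.0, 0.5, 2.0)
ei_con   = constrained_ei(1.0, 0.5, 2.0, [(150.0, 40.0, 200.0)])
```

The feasibility weighting can only reduce EI, which steers update points away from regions the constraint models predict to be infeasible.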

 

% Duplicate the scalar hyper-parameter values across the design variables

inputStruct = createBeamStruct;

inputStruct.OBJHYPER.OBJ_THETA  = output10.OBJHYPER.OBJ_THETA*ones(inputStruct.NVRS,1);

inputStruct.OBJHYPER.OBJ_EXP    = output10.OBJHYPER.OBJ_EXP*ones(inputStruct.NVRS,1);

inputStruct.OBJHYPER.OBJ_LAMBDA = output10.OBJHYPER.OBJ_LAMBDA;

inputStruct.CONHYPER.CST_THETA  = output10.CONHYPER.CST_THETA*ones(inputStruct.NVRS,1);

inputStruct.CONHYPER.CST_EXP    = output10.CONHYPER.CST_EXP*ones(inputStruct.NVRS,1);

inputStruct.CONHYPER.CST_LAMBDA = output10.CONHYPER.CST_LAMBDA;

 

%Sample the RSM surface

input15a = inputStruct;

input15a.OLEVEL = 0;

input15a.OBJMOD = 4.31;       %Evaluate constrained EI of SPM RSM over objective function

input15a.CONMOD = 4.1;        %Evaluate constraint function using SPM RSM

input15a.ONAM = 'CST-EI';     %Label objective

output15a = optimisationSampleRSM(input15a, output1, 400);  %Sample RSM

 

%Create an input structure to search the SPM RSM using a GA

input15b = input15a;    %Copy the sampling input structure

input15b.OMETHD = 4;    %Genetic Algorithm

input15b.NITERS = 500;  %10 generations

output15b = OptionsMatlab(input15b, output1);

output15b = optimisationSearchTrace(output15b)  %Search the trace history for optimum

 

output15b =

 

        VARS: [2x1 double]

      OBJFUN: -7.5469

        CONS: [5x1 double]

      RSMTRC: [1x1 struct]

    OBJHYPER: [1x1 struct]

    CONHYPER: [1x1 struct]

 

%Plot the RSM and optimum point

optimisationTerrain(output15a, input15a, 5)

hold on;

plot3(output15b.VARS(1,1), output15b.VARS(2,1), ...

      output15b.OBJFUN, 'ko', 'MarkerFaceColor', 'k')

 

6.16    Search the constrained feasibility of improvement of a tuned stochastic process model RSM

This example samples and then searches the constrained Feasibility of Improvement of the stochastic process model RSM built using the quick-tuned hyper-parameters. The scalar hyper-parameter values OBJ_THETA, OBJ_EXP, CST_THETA and CST_EXP are duplicated across the design variables of the problem and assigned to the fields OBJHYPER and CONHYPER of the input structure. Note that for a minimisation problem OPTIONS inverts the constrained Feasibility of Improvement calculation, returning a minimum value of the inverted problem at the point of maximum feasibility of improvement of the constrained RSM.
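One common interpretation of constrained Feasibility of Improvement multiplies the probability of improving on the best observed value by the modelled probability that every constraint is met. A sketch under that assumption, in Python (an illustration, not necessarily the exact OPTIONS formulation):

```python
import math

def norm_cdf(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def feasibility_of_improvement(mu, s, y_min, cons):
    """P(improve on y_min) * P(all constraints met).

    cons: list of (mu_g, s_g, upper_limit), one per constraint model.
    """
    p = norm_cdf((y_min - mu) / s)            # probability of improvement
    for mu_g, s_g, limit in cons:
        p *= norm_cdf((limit - mu_g) / s_g)   # P(g <= limit)
    return p

fi = feasibility_of_improvement(1.0, 0.5, 2.0, [(150.0, 40.0, 200.0)])
```

Being a product of probabilities, this criterion lies in [0, 1] and, like constrained EI, is driven towards regions that are both promising and predicted feasible.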

 

% Duplicate the scalar hyper-parameter values across the design variables

inputStruct = createBeamStruct;

inputStruct.OBJHYPER.OBJ_THETA  = output10.OBJHYPER.OBJ_THETA*ones(inputStruct.NVRS,1);

inputStruct.OBJHYPER.OBJ_EXP    = output10.OBJHYPER.OBJ_EXP*ones(inputStruct.NVRS,1);

inputStruct.OBJHYPER.OBJ_LAMBDA = output10.OBJHYPER.OBJ_LAMBDA;

inputStruct.CONHYPER.CST_THETA  = output10.CONHYPER.CST_THETA*ones(inputStruct.NVRS,1);

inputStruct.CONHYPER.CST_EXP    = output10.CONHYPER.CST_EXP*ones(inputStruct.NVRS,1);

inputStruct.CONHYPER.CST_LAMBDA = output10.CONHYPER.CST_LAMBDA;

 

%Sample the RSM surface

input16a = inputStruct;

input16a.OLEVEL = 0;

input16a.OBJMOD = 4.32;       %Evaluate constrained FI of SPM RSM over objective function

input16a.CONMOD = 4.1;        %Evaluate constraint function using SPM RSM

input16a.ONAM = 'CST-FI';     %Label objective

output16a = optimisationSampleRSM(input16a, output1, 400);  %Sample RSM

 

%Create an input structure to search the SPM RSM using a GA

input16b = input16a;    %Copy the sampling input structure

input16b.OMETHD = 4;    %Genetic Algorithm

input16b.NITERS = 500;  %10 generations

output16b = OptionsMatlab(input16b, output1);

output16b = optimisationSearchTrace(output16b)  %Search the trace history for optimum

 

output16b =

 

        VARS: [2x1 double]

      OBJFUN: 0

        CONS: [5x1 double]

      RSMTRC: [1x1 struct]

    OBJHYPER: [1x1 struct]

    CONHYPER: [1x1 struct]

 

%Plot the RSM and optimum point

optimisationTerrain(output16a, input16a, 5)

hold on;

plot3(output16b.VARS(1,1), output16b.VARS(2,1), ...

      output16b.OBJFUN, 'ko', 'MarkerFaceColor', 'k')

 




Copyright © 2007, The Geodise Project, University of Southampton