The Sample Average Approximation Method for Stochastic Discrete Optimization
Frequently Asked Questions (11)
Q2. How can the fourth component be made small?
The fourth component, zα(S²N′(x̂)/N′ + S²M/M)^(1/2), can also be made small with relatively little computational effort by choosing N′ and M sufficiently large.
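As a toy illustration of this sampling-error term, the sketch below estimates zα(S²N′(x̂)/N′ + S²M/M)^(1/2) for an invented one-dimensional stochastic objective; the objective G, the feasible grid, and the candidate solution x̂ are stand-ins, not the paper's test instances:

```python
import random
import statistics
from math import sqrt

# Invented stand-in objective: g(x) = E[G(x, W)], W ~ Uniform(0, 1).
def G(x, w):
    return (x - 0.3) ** 2 + w

grid = [i / 10 for i in range(11)]   # toy feasible set {0.0, 0.1, ..., 1.0}
rng = random.Random(0)
x_hat = 0.3                          # candidate solution being evaluated
N_prime, M, N = 5000, 20, 100
z_alpha = 1.645                      # normal quantile for a one-sided 95% bound

# Upper-bound side: estimate g(x_hat) with a large independent sample of size N'.
vals = [G(x_hat, rng.random()) for _ in range(N_prime)]
g_hat_Nprime = statistics.mean(vals)
S2_Nprime = statistics.variance(vals)

# Lower-bound side: M independent SAA replications, each minimizing the sample
# average over a common sample of size N across the feasible grid.
def saa_optimal_value():
    ws = [rng.random() for _ in range(N)]
    return min(sum(G(x, w) for w in ws) / N for x in grid)

v_reps = [saa_optimal_value() for _ in range(M)]
S2_M = statistics.variance(v_reps)

# The "fourth component": z_alpha * sqrt(S^2_{N'}(x_hat)/N' + S^2_M/M).
fourth = z_alpha * sqrt(S2_Nprime / N_prime + S2_M / M)
print(fourth)   # shrinks as N' and M are increased
```

Increasing N′ and M only requires more function evaluations, not harder optimization, which is why this term is cheap to control.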
Q3. How does the bias of the SAA method depend on the number of decision variables?
The bias seems to decrease more slowly for the instances with more decision variables than for the instances with fewer decision variables.
Q4. What is the effect of the SAA method on the convergence rate?
It was found that this convergence rate depends on the well-conditioning of the problem, which in turn tends to become poorer with an increase in the number of decision variables.
Q5. What is the problem in the second numerical experiment?
As mentioned above, in the second numerical experiment it was noticed that the optimality gap estimator is often large even if an optimal solution has been found, i.e., even if v∗ − g(x̂) = 0 (which is also a common problem in deterministic discrete optimization).
Q6. How can the first component be made small with relatively little computational effort?
The first component, g(x̂) − ĝN′(x̂), can be made small with relatively little computational effort by choosing N′ sufficiently large.
Q7. How much does the probability increase in the sample size?
It was shown that the probability that a replication of the SAA method produces an optimal solution increases at an exponential rate in the sample size N .
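This exponential convergence can be seen on a toy discrete problem (invented here, not one of the paper's instances): minimize g(x) = E[(x − W)²] over x ∈ {0, …, 4} with W uniform on {0, …, 4}, whose true optimum is x* = 2. The fraction of SAA replications that recover x* rises quickly with the sample size N:

```python
import random

# Toy problem: minimize g(x) = E[(x - W)^2], W ~ Uniform{0,...,4}, over
# x in {0,...,4}. The true optimal solution is x* = 2 (the mean of W).
def saa_solution(N, rng):
    """Solve one SAA replication with sample size N; return the argmin."""
    sample = [rng.randrange(5) for _ in range(N)]
    def g_hat(x):
        return sum((x - w) ** 2 for w in sample) / N
    return min(range(5), key=g_hat)

rng = random.Random(1)
reps = 200
rates = {}
for N in (1, 5, 25, 125):
    hits = sum(saa_solution(N, rng) == 2 for _ in range(reps))
    rates[N] = hits / reps
print(rates)   # fraction of replications recovering x* grows with N
```

With N = 1 a replication succeeds only when the single draw happens to equal 2, while for moderate N the success rate is already close to one, consistent with an exponential rate in N.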
Q8. How many times was the optimal solution produced?
For the harder instance with 20 decision variables (instance 20D), the optimal solution was not produced in any of the 270 total replications (but the second-best solution was produced 3 times); for instance 20R1, the optimal solution was first produced after m = 12 replications with sample size N = 150; and for instance 20R5, the optimal solution was first produced after m = 15 replications with sample size N = 50.
Q9. What is the optimality gap in the second numerical experiment?
The second component, the true optimality gap v∗ − g(x̂) is often small after only a few replications m with a small sample size N .
Q10. What is the effect of the bias on the hard instances?
The most noticeable effect is that the bias decreases much more slowly for the harder instances than for the randomly generated instances as the sample size N increases.
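The bias in question is the downward bias of the SAA optimal value: minimizing a sample average over-fits the sample, so E[v̂N] ≤ v*, with the gap shrinking as N grows. A minimal sketch on an invented toy instance (the same kind of problem as above, not one of the paper's):

```python
import random

# Toy instance: minimize g(x) = E[(x - W)^2] over x in {0,...,4},
# W ~ Uniform{0,...,4}. True optimal value: v* = E[(2 - W)^2]
#   = (4 + 1 + 0 + 1 + 4) / 5 = 2.0.
def v_hat(N, rng, k=5):
    """SAA optimal value for one replication with sample size N."""
    sample = [rng.randrange(k) for _ in range(N)]
    return min(sum((x - w) ** 2 for w in sample) / N for x in range(k))

rng = random.Random(2)
v_star = 2.0
biases = {}
for N in (2, 10, 50, 250):
    avg = sum(v_hat(N, rng) for _ in range(500)) / 500   # average over 500 reps
    biases[N] = v_star - avg
    print(N, biases[N])   # nonnegative on average, shrinking with N
```

On hard instances the analogous bias curve flattens out, which is why large sample sizes may be needed before the lower-bound estimator becomes informative.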
Q11. What is the effect of the SAA method on the performance of the algorithm?
A more efficient optimality gap estimator can make a substantial contribution toward improving the performance guarantees of the SAA method during execution of the algorithm.