Preface

2020-10-27  Ka猫

Preface

    This book is for people interested in solving optimization problems. Because of the wide (and growing) use of optimization in science, engineering, economics, and industry, it is essential for students and practitioners alike to develop an understanding of optimization algorithms. Knowledge of the capabilities and limitations of these algorithms leads to a better understanding of their impact on various applications, and points the way to future research on improving and extending optimization algorithms and software. Our goal in this book is to give a comprehensive description of the most powerful, state-of-the-art techniques for solving continuous optimization problems. By presenting the motivating ideas for each algorithm, we try to stimulate the reader's intuition and make the technical details easier to follow. Formal mathematical requirements are kept to a minimum.

    Because of our focus on continuous problems, we have omitted discussion of important optimization topics such as discrete and stochastic optimization. However, there are a great many applications that can be formulated as continuous optimization problems; for instance,

    finding the optimal trajectory for an aircraft or a robot arm;

    identifying the seismic properties of a piece of the earth's crust by fitting a model of the region under study to a set of readings from a network of recording stations;

    designing a portfolio of investments to maximize expected return while maintaining an acceptable level of risk;

    controlling a chemical process or a mechanical device to optimize performance or meet standards of robustness;

    computing the optimal shape of an automobile or aircraft component.
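The examples above are all continuous optimization problems: the variables are real-valued and the objective is typically smooth, so gradient information can drive an iterative method. As a minimal sketch of the kind of algorithm the book develops, here is plain steepest descent on a toy quadratic (the objective, starting point, and step size are invented for illustration, not taken from the book):

```python
def objective(x, y):
    """A smooth toy objective whose minimizer is (1, 2)."""
    return (x - 1.0) ** 2 + 10.0 * (y - 2.0) ** 2

def gradient(x, y):
    """Analytic gradient of the objective above."""
    return 2.0 * (x - 1.0), 20.0 * (y - 2.0)

def steepest_descent(x, y, step=0.04, iters=500):
    """Fixed-step steepest descent: repeatedly move against the gradient."""
    for _ in range(iters):
        gx, gy = gradient(x, y)
        x -= step * gx
        y -= step * gy
    return x, y

x_star, y_star = steepest_descent(0.0, 0.0)
```

With this step size the iterates contract toward (1, 2) at every step; the more sophisticated methods surveyed in the book (Newton, quasi-Newton, trust-region) replace the fixed step with curvature-aware updates to converge far faster.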

    Every year optimization algorithms are being called on to handle problems that are much larger and more complex than in the past. Accordingly, the book emphasizes large-scale optimization techniques such as interior-point methods, inexact Newton methods, limited-memory methods, and the role of partially separable functions and automatic differentiation. It treats important topics such as trust-region methods and sequential quadratic programming more thoroughly than existing texts, and includes comprehensive discussion of such "core curriculum" topics as constrained optimization theory, Newton and quasi-Newton methods, nonlinear least squares and nonlinear equations, the simplex method, and penalty and barrier methods for nonlinear programming.

Emphasis and Writing Style

    We have used a conversational style to motivate the ideas and present the numerical algorithms. Rather than aiming to be as concise as possible, we have tried to make the discussion flow in a natural way. As a result, the book is comparatively long, but we believe that it can be read relatively rapidly. The instructor can assign substantial reading assignments from the text and focus in class only on the main ideas.

    A typical chapter begins with a nonrigorous discussion of the topic at hand, including figures and diagrams and excluding technical details as far as possible. In subsequent sections, the algorithms are motivated and discussed, and then stated explicitly. The major theoretical results are stated, and in many cases proved, in a rigorous fashion. These proofs can be skipped by readers who wish to avoid technical details.

    The practice of optimization depends not only on efficient and robust algorithms, but also on good modeling techniques, careful interpretation of results, and user-friendly software. In this book we discuss the various aspects of the optimization process—modeling, optimality conditions, algorithms, implementation, and interpretation of results—but not with equal weight. Examples throughout the book show how practical problems are formulated as optimization problems, but our treatment of modeling is light and serves mainly to set the stage for algorithmic developments. We refer the reader to Dantzig [86] and Fourer, Gay, and Kernighan [112] for a more comprehensive discussion of this issue. Our treatment of optimality conditions is thorough but not exhaustive; some concepts are discussed more extensively in Mangasarian [198] and Clarke [62]. As mentioned above, we are quite comprehensive in discussing optimization algorithms.

Topics not covered

    We omit some important topics, such as network optimization, integer programming, stochastic programming, nonsmooth optimization, and global optimization. Network and integer optimization are described in some excellent texts: for instance, Ahuja, Magnanti, and Orlin [1] in the case of network optimization and Nemhauser and Wolsey [224], Papadimitriou and Steiglitz [235], and Wolsey [312] in the case of integer programming. Books on stochastic optimization are only now appearing; we mention those of Kall and Wallace [174], Birge and Louveaux [22]. Nonsmooth optimization comes in many flavors. The relatively simple structures that arise in robust data fitting (which is sometimes based on the ℓ1 norm) are treated by Osborne [232] and Fletcher [101]. The latter book also discusses algorithms for nonsmooth penalty functions that arise in constrained optimization; we discuss these briefly, too, in Chapter 18. A more analytical treatment of nonsmooth optimization is given by Hiriart-Urruty and Lemaréchal [170]. We omit detailed treatment of some important topics that are the focus of intense current research, including interior-point methods for nonlinear programming and algorithms for complementarity problems.

Additional Resource

    The material in the book is complemented by an online resource called the NEOS Guide, which can be found on the World-Wide Web at

http://www.mcs.anl.gov/otc/Guide/

    The Guide contains information about most areas of optimization, and presents a number of case studies that describe applications of various optimization algorithms to real-world problems such as portfolio optimization and optimal dieting. Some of this material is interactive in nature and has been used extensively for class exercises.

    For the most part, we have omitted detailed discussions of specific software packages, and refer the reader to Moré and Wright [217] or to the Software Guide section of the NEOS Guide, which can be found at

http://www.mcs.anl.gov/otc/Guide/SoftwareGuide/

    Users of optimization software refer in great numbers to this web site, which is being constantly updated to reflect new packages and changes to existing software.

Acknowledgments

    We are most grateful to the following colleagues for their input and feedback on various sections of this work: Chris Bischof, Richard Byrd, George Corliss, Bob Fourer, David Gay, Jean-Charles Gilbert, Phillip Gill, Jean-Pierre Goux, Don Goldfarb, Nick Gould, Andreas Griewank, Matthias Heinkenschloss, Marcelo Marazzi, Hans Mittelmann, Jorge Moré, Will Naylor, Michael Overton, Bob Plemmons, Hugo Scolnik, David Stewart, Philippe Toint, Luis Vicente, Andreas Wächter, and Ya-xiang Yuan. We thank Guanghui Liu, who provided help with many of the exercises, and Jill Lavelle, who assisted us in preparing the figures. We also express our gratitude to our sponsors at the Department of Energy and the National Science Foundation, who have strongly supported our research efforts in optimization over the years.

    One of us (JN) would like to express his deep gratitude to Richard Byrd, who has taught him so much about optimization and who has helped him in very many ways throughout the course of his career.

Final Remark

    In the preface to his 1987 book [101], Roger Fletcher described the field of optimization as a "fascinating blend of theory and computation, heuristics and rigor." The ever-growing realm of applications and the explosion in computing power are driving optimization research in new and exciting directions, and the ingredients identified by Fletcher will continue to play important roles for many years to come.



