THE FIELDS INSTITUTE FOR RESEARCH IN MATHEMATICAL SCIENCES
20th ANNIVERSARY YEAR, 2012-13

Fields Industrial Optimization Seminar
at 5:00 p.m. at the Fields Institute, 222 College St., Toronto
The inaugural meeting of the Fields Industrial Optimization Seminar
took place on November 2, 2004. The seminar meets in the early evening
of the first Tuesday of each month. Each meeting consists of
two related lectures on a topic in optimization; typically, one
speaker is a university-based researcher and the other is from the
private or government sector. The series welcomes the participation
of everyone in the academic or industrial community with an interest
in optimization theory or practice, whether expert or student. Please
subscribe to the Fields mailing list to be
informed of upcoming seminars.
The Fields Institute makes a video record of this seminar through
FieldsLive. If you present to the seminar, the Institute will record
your presentation and make the video available to the public.
Upcoming Seminars
Talks are streamed live at FieldsLive.
June 4, 2013
Kai Huang (McMaster University) (slides)
Benchmarking Non-First-Come-First-Served Component Allocation in
an Assemble-To-Order System
In a multi-product, multi-component Assemble-To-Order (ATO) system,
component allocation policy has significant influence on
the service and cost performance measures. In this paper,
we study a series of simple non-First-Come-First-Served
(non-FCFS) component allocation rules in a periodic-review
ATO system under a component base-stock policy, namely the Last-Come-First-Served
(LCFS) rule, the Product-Based-Priority-Within-Time-Windows
(PTW) rule, and the Myopic Optimization (MO) rule. For the
LCFS rule and the PTW rule, we express the demand fulfillment
rates analytically. Based on these representations, we can
optimize the base stock levels. Moreover, to test the non-FCFS
allocation rules, we propose to use three mathematical programs
as benchmarks. These mathematical programs maximize the
average cycle service level, maximize the aggregate fill
rate, and minimize the operational cost per period separately
under the FCFS rule. Our computational study shows that
the proposed simple non-FCFS rules can significantly increase
the service measures or decrease the cost measure, and outperform
the benchmarks. Moreover, the values of the LCFS rule and
the PTW rule increase when there is a greater need for customer
service differentiation.
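The difference between FCFS and LCFS component allocation can be illustrated with a toy simulation. The sketch below is a hypothetical illustration, not the authors' model: the order structure, component names, and stock levels are invented for the example.

```python
def allocate(orders, stock, rule="FCFS"):
    """Allocate components to backlogged orders under a simple rule.

    orders: list of (arrival_period, {component: qty}) in arrival order
    stock:  {component: units on hand}
    rule:   "FCFS" serves oldest orders first, "LCFS" newest first
    Returns the arrival periods of the orders that were filled.
    """
    stock = dict(stock)  # work on a copy
    queue = list(orders) if rule == "FCFS" else list(reversed(orders))
    filled = []
    for arrival, req in queue:
        # an order is filled only if every component it needs is on hand
        if all(stock[c] >= q for c, q in req.items()):
            for c, q in req.items():
                stock[c] -= q
            filled.append(arrival)
    return filled

orders = [(1, {"a": 2}), (2, {"a": 1, "b": 1}), (3, {"b": 1})]
stock = {"a": 2, "b": 1}
print(allocate(orders, stock, "FCFS"))  # [1, 3]
print(allocate(orders, stock, "LCFS"))  # [3, 1]
```

With the same stock, the two rules fill different orders, which is exactly why the choice of allocation rule moves the per-product service measures.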
Giles Laurier (Capstone Technology) (slides)
Industrial Plant Optimization in Reduced Dimensional Spaces
The implementation history of real time optimization (RTO) in the
refining industry is a sobering case study of a failed product launch.
After a heady beginning in the early 1990s, with rapid adoption
and published successes, many of these projects have since been abandoned.
It is a cautionary tale to the optimization community that the ability
to successfully go "live" in an operating plant may depend
more on industrial psychology than algorithms. This seminar discusses
the math and methods of the early RTO efforts, and proposes an alternative
approach based on latent space methods which may prove to be more
commercially viable.
Past Seminars
March 19, 2013
Vlad Mahalec (McMaster)
Inventory Pinch Algorithms for Gasoline Blending (slides)
Optimal gasoline blending requires optimization of blend
recipes and scheduling of blends in a manner that minimizes
switching between grades and minimizes total cost of the
blends. Rigorous computation of blend properties requires
solution of complex non-linear models (e.g. EPA reformulated
gasoline). MINLP models with such nonlinear constraints
often require long computation times. This work introduces
a decomposition of the blend planning models into optimization
of blend recipes and allocation of volumes to be produced
based on these optimal blend recipes. It is shown that a
specific blend recipe is optimal for a region delineated
by inventory pinch points, or sometimes by a subregion of it
which can be found iteratively. The top level of the algorithm
minimizes the number of periods which have different blend
recipes by solving a multiperiod NLP (periods delimited
by the pinch points). The lower level computes the blend
plan via fixed-recipe MILP. The algorithm leads to a much
smaller number of blend recipes than the current paradigm.
We also introduce a variation of the algorithm in which
only a single-period NLP model is solved to optimize the
blend recipes.
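One way to picture an inventory pinch point: under the lowest feasible constant production rate, a pinch occurs where projected inventory touches zero, i.e. where the cumulative demand curve touches its concave envelope from the origin. The sketch below is a hypothetical illustration under that reading, not the algorithm from the talk:

```python
def pinch_points(cum_demand):
    """Return period indices (1-based) where the cumulative demand curve
    touches its concave envelope, i.e. candidate inventory pinch points.

    cum_demand[t-1] is total cumulative demand at the end of period t.
    """
    y = [0.0] + [float(v) for v in cum_demand]
    n = len(cum_demand)
    points, cur = [], 0
    while cur < n:
        # From the current touch point, the envelope runs to the period
        # reachable with the steepest average-demand slope.
        best, best_slope = cur + 1, float("-inf")
        for j in range(cur + 1, n + 1):
            slope = (y[j] - y[cur]) / (j - cur)
            if slope > best_slope + 1e-12:
                best_slope, best = slope, j
        points.append(best)
        cur = best
    return points

print(pinch_points([10, 30, 40, 45]))  # [2, 3, 4]
```

In the example, inventory built at a constant rate first touches zero at period 2, splitting the horizon into segments; within each such segment a single blend recipe can remain optimal, which is the intuition behind the decomposition.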
Dimitrios Varvarezos (Aspen Technology, Inc.) (slides)
Refinery Optimization - Recent Advances in Planning and
Blending Operations
In this seminar we discuss recent advances made in the
area of refinery optimization. Two very prominent and challenging
large-scale mixed integer optimization problems are discussed:
the optimization of crude purchasing decisions and the optimal
blending of refinery streams without intermediate storage.
For the crude acquisition problem, we present a robust optimization
framework based on a combination of Pareto-type analysis,
parametric optimization, and goal programming. This approach
allows users to evaluate a range of options close to the
optimal solution that preserve the economic objective
while also taking into account important strategic and operational
goals. Together with novel global optimization techniques
for non-convex models, this methodology offers actionable
information to refinery planners and traders. This presentation
describes the techniques employed and demonstrates their
potential value to refinery planning and trading organizations.
For the rundown blending problem, we present a novel modeling
and optimization approach that determines the optimal sequence
and timing of blend events, as well as rundown component
tank switches in order to handle the blending of "hot"
streams into a finished product tank. This solution incorporates
multiple blend headers and multiple blends in a multi-period,
event-driven campaign, using open-equation based optimization
and modeling technology. The proposed approach is both comprehensive
and practical, and far superior to current practice.
Dec 4, 2012
Ricardo Fukasawa (Waterloo) (slides) (video archive of the talk)
MIP reformulations of some chance-constrained mathematical
programs
In mathematical programs, an often used assumption is that
the problem data is deterministic, or in other words it
is known in advance. This simplifying assumption may be
reasonable in many situations, but it may be too strong
in others. In this talk we will focus on a specific type
of model that addresses uncertain data, namely chance-constrained
mathematical programs with discrete random right-hand sides.
Chance-constrained mathematical programs are optimization
problems where some of the data is assumed to be random
and we are interested in an optimal solution satisfying
constraints with a pre-specified high probability. Luedtke,
Ahmed and Nemhauser (2010) proposed a mixed-integer programming
(MIP) model for dealing with the case where the randomness
is solely on the right-hand side of the inequalities and
the distribution is discrete. They were able to obtain some
strong inequalities for the model by studying the polyhedral
structure of a mixing-type set subject to an additional
constraint. Later, Kucukyavuz (2012) extended and improved
on their results. However, the results in both of these
papers are mostly for the case where the distribution is
uniform.
In this talk, we will give some background on the problem
and present some of our results in extending and generalizing
the results of these papers to the case of general probabilities.
Based on joint work with Ahmad Abdi.
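For concreteness, a generic big-M reformulation of a chance constraint with discrete right-hand-side uncertainty (the scenario notation here is ours, not taken from the papers) looks roughly like:

```latex
\min_{x,\, z} \; c^\top x
\quad \text{s.t.} \quad
Ax \ge \xi^i - M_i z_i, \quad i = 1, \dots, m,
\qquad
\sum_{i=1}^{m} p_i z_i \le \epsilon,
\qquad
z \in \{0,1\}^m,
```

where scenario $\xi^i$ occurs with probability $p_i$, setting $z_i = 1$ allows the constraints to be violated in scenario $i$, and $M_i$ is chosen large enough to deactivate them. This enforces the original constraint $\mathbb{P}(Ax \ge \xi) \ge 1 - \epsilon$; the mixing-type sets studied in the papers above arise when strengthening this basic formulation.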
François Welt (Hatch) (slides) (video archive of the talk)
A Fully Integrated Model for the Optimal Operation of HydroPower
Generation
In this presentation, we will describe the approach and
experience gained in the development of a suite of fully
integrated optimization models for water and power optimization
that has been implemented over many different hydro systems
in North America and various parts of the world. In particular,
the handling of large size problems through various decomposition
schemes and integration of multiple time scales over the
entire planning process will be discussed. The representation
of hydrologic, market and load uncertainty and the challenges
of solving for discrete decisions as related to reserve
allocation and the unit commitment problem within the short
term scheduling problem will be reviewed.
Oct. 2, 2012
Robert McCann (University of Toronto) (video archive of the talk)
Pricing multidimensional products and contracts facing
informational asymmetry
The monopolist's problem of deciding what types of products
to manufacture and how much to charge for each of them,
knowing only statistical information about the preferences
of an anonymous field of potential buyers, is one of the
basic problems analyzed in economic theory. The solution
to this problem when the space of products and of buyers
can each be parameterized by a single variable (say quality
X, and income Y) garnered Mirrlees (1971) and Spence (1974)
their Nobel prizes in 1996 and 2001. The multidimensional
version of this question is a largely open problem, which
arises when pricing products or contracts parameterized
by several variables. It is of both theoretical and computational
interest to know when this optimization problem is convex,
and when it is not.
I describe joint work with A Figalli and Y-H Kim (JET 2011),
identifying structural conditions on the value b(X,Y) of
product X to buyer Y which are sufficient (and nearly necessary)
to reduce this problem to a convex program in a Banach space.
This leads to uniqueness and stability results for its solution,
confirms the robustness of certain economic phenomena observed
by Armstrong (1996) such as the desirability for the monopolist
to raise prices enough to drive a positive fraction of buyers
out of the market, and yields conjectures concerning the
robustness of other phenomena observed by Rochet and Choné
(1998), such as the clumping together of products marketed
into subsets of various dimensions. The passage to several
dimensions relies on ideas from differential geometry /
general relativity, optimal transportation, and nonlinear
partial differential equations.
Ti Wang (RBC Capital Markets) (video archive of the talk)
An Interval-based Smooth Interpolation and Its Applications
in Finance
Finding a suitable interest rate curve construction method
is a basic but crucial step in building
a sophisticated interest rate pricing and risk management
framework in today's banking industry. In this post-crisis
era, building interest rate curves is no longer a trivial
task: financial engineers are working on new curves that
are smooth, stable and fast to compute in order to meet
the increasingly complicated and time-critical requirements
from the business and financial regulators. One of the most
challenging problems is to build a smooth curve from a set
of crowded forward interest rate instruments. We have mapped
this problem to an interval-based interpolation, and assigned
a penalty measure to the problem. Under a mild condition,
we have proved that this problem has a unique solution which
is given by a piecewise quadratic and continuously differentiable
function.
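A piecewise quadratic, continuously differentiable interpolant is easy to construct once the slope at the left endpoint is fixed: matching values and first derivatives at each knot determines every remaining coefficient. The sketch below is a hypothetical illustration of such a C^1 quadratic spline, not the interval-based penalty method from the talk:

```python
def quad_spline(xs, ys, m0=0.0):
    """Build a C^1 piecewise-quadratic interpolant through (xs, ys).

    m0 is the slope at xs[0]; slope continuity then forces the slope at
    each later knot: m[i+1] = 2*(y[i+1]-y[i])/(x[i+1]-x[i]) - m[i].
    Returns a callable that evaluates the spline.
    """
    n = len(xs)
    m = [m0]
    for i in range(n - 1):
        h = xs[i + 1] - xs[i]
        m.append(2.0 * (ys[i + 1] - ys[i]) / h - m[i])

    def f(x):
        # locate the interval containing x (clamped to the knot range)
        i = max(0, min(n - 2, sum(1 for k in xs[1:-1] if x >= k)))
        t = x - xs[i]
        h = xs[i + 1] - xs[i]
        a = (m[i + 1] - m[i]) / (2.0 * h)  # quadratic coefficient
        return ys[i] + m[i] * t + a * t * t

    return f

f = quad_spline([0.0, 1.0, 2.0], [0.0, 1.0, 0.0], m0=0.0)
print(f(1.0))  # 1.0 (the spline interpolates the interior knot)
```

On each piece the derivative at the right endpoint equals m[i+1] by construction, so the first derivative is continuous across knots; the unique-solution result in the talk adds interval constraints and a penalty on top of this basic structure.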