33rd ORSNZ Conference Online Proceedings

Session 1A: OR Applications

Monday 9:10am - 11:15am

Session 1B: Finance and Economics Applications

Monday 9:10am - 11:15am

Keynote Address: Professor Azim Houshyar

Monday 11:35am - 12:35pm
  • A reality check: Are we building reliable and maintainable equipment?

Session 2A: Young Practitioner Prize

Monday 1:35pm - 3:00pm

Session 2B: Young Practitioner Prize

Monday 3:30pm - 4:30pm

Session 3A: Electricity

Tuesday 9am - 10:40am

Session 3B: Theory

Tuesday 9am - 10:40am

Keynote Address: Professor Michael Saunders

Tuesday 11am - 12pm
  • Optimization Algorithms and Applications

Session 4A: Theory and Applications

Tuesday 1pm - 2:40pm

Session 4B: Rostering

Tuesday 1pm - 2:40pm

Session 5A: Practice and Education

Tuesday 3pm - 4:40pm

Session 5B: Theory and Applications

Tuesday 3pm - 4:40pm


    A New Telecommunication Service Provision Model

    PDF File Shane Dye, Jan Audestad, Leen Stougie, Asgeir Tomasgard, Maarten van der Vlerk and Stein W. Wallace
    [email protected]

    The purpose of this talk is to describe a service provision model for telecommunication networks with distributed processing. One of the main advantages of the technological framework is its inherent flexibility, which enables dynamic planning. In presenting the models, emphasis is placed on the shift of focus towards processing. Several factors are important in the technological push that has opened the possibilities for distributed processing in telecommunication networks. Examples of technological factors are digital technology, modern high-speed network architectures like B-ISDN, packet-switched data transmission concepts like ATM, and fibre optic cables. These allow for the provision of more and more complex services. In addition, several standardization initiatives for distributed telecommunication architectures have been developed, for instance the Telecommunications Information Networking Architecture Consortium (TINA-C) architecture scheduled to be completed in 1997. One of the main differences between new networks based on distributed processing and traditional telecommunication networks is the increased flexibility in resource allocation. One impact of the above developments is that an enormous number of services can be provided on a telecommunication network. Services capable of processing information are offered through computing nodes by software applications running on these nodes. While traditional services tend to be transportation oriented, we believe that the extensive growth in newer services will come from those requiring more resources of the computing nodes. At the same time the investment cost of transportation capacity has decreased. This means that the limited resources in these networks will often be the computing resources, such as the processing capacity at the network nodes. Another development is that deregulation all over the world is introducing competition in telecom markets. New players enter the telecommunications scene and the roles of old players change. For example, the difference between distributed computer networks and telecommunication networks is disappearing.


    Ranking Projects Using the ELECTRE Method

    PDF File John Buchanan, Phil Sheppard
    [email protected]

    Ranking and selecting projects is a relatively common, yet often difficult task. It is complicated because there is usually more than one dimension for measuring the impact of each project and more than one decision maker. This paper considers a real application of project selection for ECNZ, Northern Generation, using an approach called ELECTRE. The ELECTRE method has several unique features not found in other solution methods such as decision analysis or AHP. These are the concepts of outranking and indifference and preference thresholds, which provide a different perspective and solution process. The ELECTRE method is explained and applied to the project selection problem using a Visual Basic application within Microsoft Excel. Results show that ELECTRE was well received by the decision makers and, importantly, provided sensible and straightforward rankings. After contrasting the ELECTRE approach with an alternative decision analysis based solution method, also used within ECNZ for project selection, the paper concludes with a brief discussion of the importance of problem structuring for decision making.


    Blending petroleum products at NZ Refining Company

    PDF File Geoffrey Gill,
    [email protected]

    There are many petroleum products which New Zealand Refining Company (NZRC) must constantly blend from component tanks. The "operation" at NZRC runs twenty-four hours a day, which means that at any time a component tank may be filling with a component, being blended into a final product tank, neither, or both. Products can be made from many different combinations of components, but they must meet specifications (such as 96 octane for premium petrol and 91 octane for regular petrol) and be completed on time for loading onto a ship. Developing a blending schedule is a problem that has features of both a continuous and a batch-type nature. This paper is intended to provide an insight into the difficulties that blending poses and the ideas behind developing a good blending schedule.


    Production Management Practice In New Zealand

    PDF File Chuda Basnet,
    [email protected]

    The function of production management (PM) is under increasing competitive pressure. Firms are emphasising customer-orientation, and they compete on the basis of cost, quality, customer-tailoring, and market response time. This pressure to perform is further aggravated by the decreasing product life cycle caused by rapid innovation. Recent developments in manufacturing philosophy, such as just-in-time, flexibility, lead time reduction, and total quality control, and recent developments in manufacturing technology, such as robotics, flexible manufacturing systems, and computer aided manufacturing, have placed further demands on the PM function. How well is the profession of production management in New Zealand standing up to these challenges? Recent evidence shows that while performance is generally poor, some companies are doing particularly well. However, these studies do not go into the tasks and responsibilities faced by production managers. Some research has been done abroad to study PM practice, and there exists a somewhat dated study of the New Zealand production executive done in 1981, but little is known about current production management practice in New Zealand, particularly in the face of the competitive pressures mentioned above. This paper presents a preliminary report of a study of current PM practice in New Zealand with a view to making a useful contribution to curriculum design in this area.


    Descriptive Decision Making: Comparing Theory with Practice

    PDF File Stuart Dillon,
    [email protected]

    The more classical theories of choice emphasise decision making as a process of rational choice. In general, these theories fail to recognise the formulation stages of a decision and typically can only be applied to problems comprising two or more measurable alternatives. In response to such limitations, numerous descriptive theories have been developed over the last forty years, intended to describe how decisions are actually made. This paper presents a framework that classifies descriptive theories using the theme of comparison: comparisons involving attributes, alternatives and situations. The paper also reports on research undertaken within a New Zealand local authority. Twenty-three senior managers were interviewed about their decision making with the aim of comparing the responses of participants with how the descriptive decision making literature suggests decisions are made. Evidence of behaviour consistent with recognised descriptive theories was also investigated. It was found that few managers exhibited behaviour consistent with what is described in the literature. The major difference appears to be the lack of decision formulation contained within most descriptive theories. Descriptive theories are, in general, theories of choice, and few decisions described by participants contained a distinct choice phase.


    Coordination in vehicle routing

    PDF File Catherine M Rivers,
    [email protected]

    A coordination point is a place that exists in space and time for the transfer of materiel and/or personnel and/or information between vehicles. The transfer may be initiated to accommodate payload, driver or vehicle requirements. This is in contrast to a node that exists for the transfer of goods between customer(s) and/or depot(s). Size, location or movement does not limit coordination points. They can be as large as a sub-depot or as small as a single letter drop-point; they can be strips of roads, areas of seas or deserts, or volumes of air space. They can be stationary, as in most land-based applications, or mobile as in the mid-air refuelling of aircraft. They have spatial and temporal characteristics that play a role in determining their classification, and can be serendipitous or pre-determined. In this variation of the VRP, the payload of at least some vehicles is changed after they leave the depot, in a pure delivery scenario with identical vehicles that begin and end daily tasks at a common, single depot. The objectives are to reduce the total distance travelled and to reduce the size of the vehicle fleet, thereby reducing operating costs. Loads can be re-deployed between the depot and the customer node by using coordination points that might involve split deliveries. Although such points can occur anywhere, this talk will concentrate on coordination at existing nodes. Some effects of coordination are to reduce the need to return to the depot between routes for multi-routed vehicles; to allow vehicles to meet en route and to swap loads; or to act as interim steps in route improvement. This is demonstrated by reference to particular examples in both the Euclidean and rectilinear grid network environments and represents work in progress.


    Learning and Forgetting Curves: A Practical Study

    PDF File Andrew Thomassen,
    [email protected]

    There exists a substantial body of literature on learning curves; however, considerably less work has been done on the impact of forgetting. This paper, after surveying the literature on learning and forgetting, describes an experiment involving the processing of circulars (or "junk mail") designed to satisfy three objectives. These are to compare learning curves between production line assembly and single-person assembly, to evaluate the impact of the length of a break on forgetting, and to develop an equation which measures the time lost due to forgetting. In reference to these three objectives, the study first shows that an individual appears to learn at a greater rate than a production line but works more slowly; the production line assembly is therefore more efficient than the individual. In terms of the impact of the break length on forgetting, it was found that the same power model equation used to model forgetting in earlier studies best measured the forgetting phenomenon. The time lost equation parameters showed that the greatest amount of forgetting occurs after only a short break, which is consistent with other studies.
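    The power models referred to above can be sketched as follows; the parameter values here are illustrative placeholders, not the values fitted in the study.

```python
# Hypothetical parameters throughout -- this sketches the standard
# power-law forms discussed in the learning/forgetting literature,
# not the equations fitted in this experiment.
def unit_time(n, t1=10.0, b=0.3):
    """Power-law learning curve: time required for the n-th repetition."""
    return t1 * n ** (-b)

def time_lost(break_len, f1=2.0, c=0.5):
    """Power-model forgetting: time lost after a break of a given length.
    Most of the loss accrues after only a short break, then flattens."""
    return f1 * (1.0 - (1.0 + break_len) ** (-c))
```

Under these forms each successive repetition is faster, while the marginal time lost to forgetting shrinks as the break lengthens, matching the qualitative findings reported above.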


    The rate of false signals for control charts with limits estimated from small samples

    PDF File Dan Trietsch, Diane Bischak
    [email protected]

    Due to more frequent use of short production runs and the need for shorter setup times in order to make these runs cost-effective, there has lately been a great deal of interest in the statistical properties of X-bar control charts with control limits that are based on unknown process parameters. Typically the true process distribution can be assumed to be normal, but the process mean μ and variance σ² are estimated from some number m of initial subgroups of size n each. For short production runs it would be desirable to use only a small number of subgroups to estimate μ and σ² in order to get on with charting the run as soon as possible. However, as a number of authors have discussed, basing the chart on estimates of μ and σ² made from a small number of subgroups may give rise to some unexpected and undesirable effects. In particular, a chart with limits estimated from only a few subgroup means will tend on average to produce a greater number of false signals, which will add to the cost of production. The statistical properties of charts with limits estimated from small samples have usually been studied from the perspective of the effects of estimation on the average (expected) run length for a chart, known as its ARL. We examine the ARL and find that it is easily misunderstood and that, even when it is unambiguously defined, it has only moderate value as a focal point for the study of control charts. We argue that the rate at which false signals occur in a chart is both a more intuitive concept and a more useful one for determining a reasonable number of subgroups to sample in order to construct control limits.
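    The effect discussed can be illustrated with a small Monte Carlo sketch (an assumed subgroup scheme, not the authors' analysis): estimate the limits from m subgroups, then count how often in-control subgroup means fall outside them. The simple s-bar estimate below deliberately omits the c4 bias correction, which by itself inflates the false-signal rate.

```python
import random
import statistics

def xbar_false_signal_rate(m=5, n=4, run_len=500, reps=100, seed=1):
    """Monte Carlo sketch: estimate X-bar chart limits from m subgroups of
    size n drawn from N(0,1), then chart run_len further in-control
    subgroups and count points outside the estimated 3-sigma limits.
    With small m the limits are noisy, so the false-signal rate deviates
    from the nominal rate for known parameters (about 0.27%)."""
    rng = random.Random(seed)
    signals = total = 0
    for _ in range(reps):
        groups = [[rng.gauss(0, 1) for _ in range(n)] for _ in range(m)]
        xbarbar = statistics.mean(statistics.mean(g) for g in groups)
        # naive sigma estimate: mean subgroup standard deviation
        # (no c4 correction, so it is biased low)
        s = statistics.mean(statistics.stdev(g) for g in groups)
        half_width = 3 * s / n ** 0.5
        for _ in range(run_len):
            xb = statistics.mean(rng.gauss(0, 1) for _ in range(n))
            signals += abs(xb - xbarbar) > half_width
            total += 1
    return signals / total
```

Running this with a small m gives a false-signal rate well above the nominal value, which is the phenomenon the paper analyses.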


    Managing Regime-Change

    PDF File Erne Houghton, Victor Portougal
    [email protected]

    Two aspects of regime-change management are transitory planning, which avoids assembly feed-stock shortages in the new processing regime, and the timing of the change, which enhances the meshing of adjacent plans. In this paper, the stock implications of regime changes which incorporate transitory planning are optimised with respect to the timing of the change. To highlight the stock effect, processing within regimes is assumed to operate under JIT processing planning.


    Concave Envelope Analysis in Nonconvex Optimisation

    PDF File Gavin Bell,
    [email protected]

    Common solution techniques for nonconvex optimisation problems, such as branch and bound, cone covering, outer approximation, and inner approximation methodologies, involve the use and solution of relaxations of the original problem. These relaxations generally involve the construction of the convex envelopes or hulls of (some part of) the original problem's objective function, feasible region, or both. One disadvantage of using convex envelopes or hulls is that the shape or behaviour of the objective function or feasible region between extreme points is not explicitly taken into account. In this paper, we develop the concept of underestimating "concave envelopes". Similar in concept to convex envelopes, underestimating concave envelopes incorporate non-extreme point information in their construction. We then use underestimating concave envelopes to develop a method of parametric analysis for a class of nonconvex optimisation problems.


    On the optimal selection of portfolios under limited diversification

    PDF File Jay Sankaran, C. Krishnamurti, A. A. Patil
    [email protected]

    Empirical evidence suggests that individual investors typically hold only a small number of securities. Market imperfections such as fixed transaction costs provide one explanation for the prevalence of undiversified portfolios. Also, a small investor who chooses to invest in only a limited number of securities can devote more attention to the individual behavior of those securities and their mean-variance characteristics. There is also empirical evidence that diversification beyond 8-10 securities may not be worthwhile provided these securities are chosen not randomly but through a systematic, optimum-seeking procedure. Further, it may be superfluous to enlarge the number of securities in a portfolio beyond a limit because the variance-covariance matrix of the returns on the securities in a portfolio that has a large number of securities tends to conceal significant singularities or near-singularities. In light of the above, we address two problems concerning the selection of optimal portfolios of stocks under limited diversification (i.e., an upper limit on the number of securities in the portfolio). The first problem is that of selecting portfolios which maximize the ratio of the average excess return over the riskless rate to the standard deviation, among all those portfolios which comprise at most a pre-specified number, k, of securities from among the n securities that comprise the universe. The second problem is that of optimally selecting index-tracking portfolios which comprise at most k securities. (We measure the degree of tracking through the coefficient of correlation between the return on the portfolio and the return on a given market-index.) For both problems, we develop polynomial algorithms that are optimal under standard assumptions about the correlation structure of stock returns. 
We also prove that in some cases, the marginal benefit from diversification decreases with the number of securities in the portfolio; thus, diversification yields diminishing `returns'.
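    The first problem can be illustrated with a toy exhaustive search. This is emphatically not the authors' polynomial algorithm; it assumes equal weights within the portfolio and a constant-correlation model purely for illustration.

```python
from itertools import combinations

def best_limited_portfolio(mu, sigma, rho, k):
    """Toy sketch of limited diversification (not the paper's algorithm):
    exhaustively search equal-weighted portfolios of at most k securities
    under a constant-correlation model, maximising the ratio of expected
    excess return to standard deviation.
    mu[i]: expected excess return; sigma[i]: std dev; rho: common correlation."""
    n = len(mu)
    best = (float("-inf"), ())
    for size in range(1, k + 1):
        w = 1.0 / size                       # equal weights (assumption)
        for subset in combinations(range(n), size):
            mean = w * sum(mu[i] for i in subset)
            var = sum((w * sigma[i]) ** 2 for i in subset)
            var += sum(2 * w * w * rho * sigma[i] * sigma[j]
                       for i, j in combinations(subset, 2))
            best = max(best, (mean / var ** 0.5, subset))
    return best
```

Even in this crude form, allowing two securities instead of one improves the ratio when the two best securities are imperfectly correlated, while the improvement from each extra security shrinks, echoing the diminishing returns result.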


    The role of the expert's decision making skills in management

    PDF File Peter Gilmour,
    [email protected]

    Experts have been studied in detail by academics; managers have also received considerable attention, but little has been said about experts as managers. The label 'expert' has been well defined by many authors as people who, through training and extensive experience in a particular field of expertise, have achieved a level of performance that is recognised by their peers as that of an expert. Many experts are attracted to management positions that require numerous decisions outside the expert's field of expertise. This paper studies the decision making of experts in managerial positions. Are the decision making skills that are developed during the acquisition of their expertise the key to their success? Experts, such as scientists, teachers, and engineers, bring to management positions a strong personal background of training and experience that is often lacking in specific management expertise, yet they are successful in their new roles. I will discuss my recent observations of experts who are working as managers, and suggest possible explanations for the observed behaviour of the experts in their new role. The discussion will focus on the decision making processes developed by experts in their field of expertise, and the utilisation of these processes by the expert in their new role as a manager. The methodology of the study will be explained and a model of the expert as a manager will be presented.


    Scheduling Generator Outages to Minimise Financial Impact

    PDF File Phil Sheppard,
    [email protected]

    Many papers have been written on short and medium term scheduling of hydro-electric generators on a river system so as to maximise operating profit. Over the last few years the importance of analysing the financial impact of hydro-electric generator maintenance outages has grown significantly. In general, generators are required to be released from service to perform maintenance. Historically, the co-ordination of generator outages on a river system has been performed on a "manual" basis using "rules of thumb" without knowledge of the financial impact. Determining the exact financial impact of a historical outage is a very complex task, since the market equilibrium may be very different with and without the outage. For example, the market clearing price may change significantly and the resulting distribution of generation among market participants may be quite different. However, the financial impact of a set of outages can be approximated with sufficient accuracy to provide sensible data for decision making. A methodology to (i) cost a set of outages, and (ii) minimise their total cost is proposed. The total cost of a set of outages is adjusted by varying the timing of the outages, and a genetic algorithm is used to identify an improved solution. It is shown that the financial impact of a set of outages can be reduced given some freedom in outage timing.
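    A minimal genetic-algorithm loop of the kind described might look as follows. The chromosome encoding (a tuple of outage start weeks) and the clash-based cost function are invented stand-ins for the paper's market-based outage cost.

```python
import random

def ga_schedule(cost, n_outages, horizon, pop_size=30, gens=40, seed=0):
    """Bare-bones genetic algorithm sketch (the paper's cost model, built
    on approximated market impact, is far richer): a chromosome is a tuple
    of outage start weeks, selection is 2-way tournament, crossover is
    uniform, and mutation re-times a single outage."""
    rng = random.Random(seed)
    pop = [tuple(rng.randrange(horizon) for _ in range(n_outages))
           for _ in range(pop_size)]
    best = min(pop, key=cost)
    for _ in range(gens):
        def tournament():
            a, b = rng.sample(pop, 2)
            return a if cost(a) < cost(b) else b
        nxt = []
        for _ in range(pop_size):
            p, q = tournament(), tournament()
            child = tuple(p[i] if rng.random() < 0.5 else q[i]
                          for i in range(n_outages))
            if rng.random() < 0.2:          # mutate: re-time one outage
                i = rng.randrange(n_outages)
                child = child[:i] + (rng.randrange(horizon),) + child[i + 1:]
            nxt.append(child)
        pop = nxt
        best = min([best] + pop, key=cost)
    return best

# Hypothetical stand-in cost: outages clashing in the same week compound
# the price impact quadratically, so spreading outages out is cheaper.
def clash_cost(schedule):
    return sum(schedule.count(w) ** 2 for w in set(schedule))
```

With this toy cost the algorithm quickly spreads the outages across distinct weeks, mirroring the idea of exploiting freedom in outage timing to reduce total cost.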


    Long Term Memory Strategies for Solving the E/T Scheduling Problem

    PDF File Ross James,
    [email protected]

    Long term memory can be an effective way of improving the performance of a heuristic search. In this paper we examine different ways of incorporating long term memory strategies into a tabu search for the early/tardy machine scheduling problem. Computational experiments are carried out on a test set of problems and the performance of the different strategies is reported.
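    One common long term memory strategy is frequency-based diversification, which can be sketched on a toy single-machine early/tardy instance as follows (the adjacent-swap neighbourhood, unit earliness/tardiness weights, and penalty weight are assumptions of this sketch, not the paper's design).

```python
def tabu_et(p, d, iters=50, tenure=5, freq_w=0.5):
    """Toy tabu search for a single-machine early/tardy problem with unit
    weights: neighbourhood = adjacent swaps; short-term memory = tabu list
    of recent swaps; long-term memory = move frequencies, used to penalise
    over-used moves and so push the search into new regions.
    p[j]: processing time of job j; d[j]: due date of job j."""
    def et_cost(seq):
        t, cost = 0, 0
        for j in seq:
            t += p[j]                      # completion time of job j
            cost += abs(t - d[j])          # earliness + tardiness
        return cost

    cur = list(range(len(p)))
    best, best_cost = cur[:], et_cost(cur)
    tabu, freq = {}, {}
    for it in range(iters):
        cand = None
        for m in [(i, i + 1) for i in range(len(cur) - 1)]:
            nb = cur[:]
            nb[m[0]], nb[m[1]] = nb[m[1]], nb[m[0]]
            # long-term memory: penalise frequently used moves
            score = et_cost(nb) + freq_w * freq.get(m, 0)
            if tabu.get(m, -1) >= it and et_cost(nb) >= best_cost:
                continue                   # tabu and no aspiration
            if cand is None or score < cand[0]:
                cand = (score, m, nb)
        if cand is None:
            continue
        _, m, cur = cand
        tabu[m] = it + tenure
        freq[m] = freq.get(m, 0) + 1
        if et_cost(cur) < best_cost:
            best, best_cost = cur[:], et_cost(cur)
    return best, best_cost
```

The frequency penalty only affects which neighbour is chosen, never the incumbent's recorded cost, so diversification cannot make the reported solution worse.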


    Reservoir management with risk aversion

    PDF File Andrew Kerr, E.G. Read, John Kaye
    [email protected]

    We consider the problem faced by a manager planning the operation of a mixed hydro/thermal system, where the manager controls the reservoir release made in each week (single reservoir), as well as the generation from other sources. Demand is deterministic and must be met in each period, while the inflows experienced in each week are uncertain. Stochastic Dynamic Programming (SDP) is a technique often applied to reservoir management problems, with the scheduling horizon divided naturally into discrete time periods, storage as the state variable, and release as the decision variable. If the objective is to minimise the expected annual cost, then a large number of observations with low costs can cancel out a few observations with large costs (because all outcomes are weighted equally). In reality, the manager might want to be able to trade off a reduction in extremely good outcomes for an improvement in extremely bad outcomes, i.e., putting more ‘weight’ on the bad outcomes, but this invalidates the standard dynamic programming recursive relationship. We describe an SDP formulation which accommodates a non-linear end-of-horizon utility function by augmenting the state space. We describe some simple algorithmic modifications which significantly reduce the computational requirements of the optimisation, and illustrate the impact of risk averse attitudes on system performance.
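    The risk-neutral SDP recursion described can be sketched on a toy discretisation as follows; the state augmentation needed for the non-linear utility function is omitted, and all parameters are hypothetical.

```python
def reservoir_sdp(S, T, inflows, probs, demand, thermal_price):
    """Toy backward SDP recursion (not the authors' model): state = storage
    level (0..S-1), decision = release r <= storage, deterministic demand
    met by release plus thermal generation, random inflow next period.
    Water above the top storage level spills; V[T] = 0 (free end of horizon)."""
    V = [[0.0] * S for _ in range(T + 1)]
    policy = [[0] * S for _ in range(T)]
    for t in range(T - 1, -1, -1):
        for s in range(S):
            best = None
            for r in range(s + 1):                      # feasible releases
                exp_cost = 0.0
                for q, p in zip(inflows, probs):
                    nxt = min(S - 1, s - r + q)         # spill at the top
                    thermal = max(0, demand - r)        # make up shortfall
                    exp_cost += p * (thermal * thermal_price + V[t + 1][nxt])
                if best is None or exp_cost < best:
                    best, policy[t][s] = exp_cost, r
            V[t][s] = best
    return V, policy
```

The risk-averse variant discussed in the talk would evaluate a utility of accumulated cost at the end of the horizon, which is why extra state is needed; here only the expected-cost recursion is shown.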


    A Simulation Model of Horseshoe Layouts

    PDF File Rakesh Sharma, Les Foulds
    [email protected]

    With the advent of information systems and the recent developments in computing technology, traditional manufacturing systems are fast becoming obsolete. A new generation of factories is being designed with cellular or linked-cell manufacturing systems. This recent shift from traditional to cellular manufacturing systems has necessitated the redesign of the factory and its layout. Traditional process and product layouts (involving straight and long assembly lines or departments) are now being replaced by horseshoe layouts which feature small, semicircular work cells. We report on a comparison between traditional and horseshoe layouts. We shall discuss when each is the more appropriate with regard to quality and to cost.


    Simulation Results from a Hydro Management Model for a Deregulated Electricity Industry

    PDF File Tristram Scott, Stephen Batstone
    [email protected]

    We have developed a reservoir management model which integrates a Cournot spot market model into a Dual Dynamic Programming framework. Simulations using this model show that the market outcomes depend strongly upon three factors: the number of competitors, the presence of contracts, and the elasticity of demand. Results will be presented showing the effect these factors have on the reservoir management policy, and on the efficiency of the market as a whole.


    The scheduling of forest harvesting with adjacency constraints

    PDF File Alistair McNaughton, M. Ronnqvist, D. M. Ryan
    [email protected]

    A new model of the forest harvesting problem with road construction is presented integrating both strategic (long term) and tactical (short term) planning constraints. The solution algorithm involves innovative column generation and constraint branching techniques. A practical application to a New Zealand production forest will illustrate this talk.


    Tool Carousel Design via Integer Programming

    PDF File Les Foulds, J. M. Wilson
    [email protected]

    We describe a branch and bound algorithm for an assignment problem subject to a special set of side constraints. The problem has application in the design of tool carousels for certain flexible manufacturing systems. The resulting model represents a special case of the restricted facilities layout problem in which it is forbidden to locate any facility in certain zones. The bounds for the algorithm are generated by relaxing the side constraints and using the Hungarian method to solve the resulting assignment problem. Partitioning in a manner similar to subtour elimination for the traveling salesman problem leads to encouraging computational results.
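    For small instances the underlying model can be solved by brute force, which is useful for checking branch and bound results; the Hungarian-method bounding itself is not reproduced here, and the cost matrix and forbidden set below are invented.

```python
from itertools import permutations

def assign_with_forbidden(cost, forbidden):
    """Brute-force sketch of the model (the paper uses branch and bound
    with Hungarian-method bounds on the relaxation): assign each tool t
    to a distinct carousel slot perm[t] at minimum total cost, with
    (tool, slot) pairs in `forbidden` disallowed (forbidden zones)."""
    n = len(cost)
    best = None
    for perm in permutations(range(n)):
        if any((t, s) in forbidden for t, s in enumerate(perm)):
            continue
        c = sum(cost[t][perm[t]] for t in range(n))
        if best is None or c < best[0]:
            best = (c, perm)
    return best
```

Relaxing the forbidden set recovers the plain assignment problem, which is exactly the relaxation whose Hungarian-method solution supplies the bounds in the algorithm described.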


    Analysis of Non-Physical Dispatch in the Gas and Electricity Pricing Models

    PDF File Gavin Bell, Deb Chattopadhyay, E. Grant Read
    [email protected]

    Over the past decade, the New Zealand and the Australian electricity industries have undertaken a process of reform and deregulation. Currently the Victorian gas market is undergoing a similar process. One important aspect of such markets is the short-term dispatch of supply and demand connected to a transmission network, and the consequent calculation of half-hourly spot prices for gas or electricity at all points (or nodes) in the network. In practice the optimisation models used to determine the short-term dispatch are linear programmes that, by definition, assume the system constraints to be linear (or convex piecewise linear). However, both the gas and electricity systems contain system constraints that are nonconvex. A consequence of this is that a linear programming model may occasionally produce a non-physical dispatch. The resultant spot prices are therefore based on a schedule that cannot in reality be achieved. In this paper we first provide a description of the non-physical dispatch problem as it applies to the gas and electricity industries. We then outline some techniques and procedures that have been considered to overcome this problem.


    On Symmetric Traveling Salesman Polytope and Separation

    PDF File Tiru Arthanari,
    [email protected]

    An alternate formulation of the Symmetric Traveling Salesman Problem (STSP) was given in Arthanari (1982). There, the STSP is posed as a multistage decision problem and a mathematical programming formulation of the same is given in three-subscripted variables, X. The slack variables, u, that arise out of this formulation are precisely the edge-tour incidence vectors. In this paper we consider the LP relaxation of this formulation and show that if the X variables further satisfy certain polyhedral inequalities (called star-inequalities), then the u corresponding to X is in the convex hull of tours. It is also shown that, given a u in the subtour elimination polytope and a corresponding X feasible for the above LP relaxation, it can be verified in time polynomial in n, the number of cities, whether X satisfies the star-inequalities; otherwise a violated star-inequality can be found. However, in general it is not true that if X violates a star-inequality then the corresponding u does not belong to the convex hull of tours. Illustrative examples are given.


    A Review of Goldratt's Theory of Constraints (TOC) - lessons from the international literature

    PDF File Steve Balderstone, Vicky Mabin
    [email protected]

    The two authors are currently finalising the first comprehensive bibliography of TOC for publication by North River Press, the publishers of Eli Goldratt's books, and other books on TOC. Based on this extensive search of the literature, this talk will draw on examples of applications of Goldratt's Theory of Constraints (TOC), report on the results of applications both in New Zealand and overseas, and summarise important findings on the theory and practice of TOC.


    Optimal participant behaviour in electricity markets

    PDF File Andy Philpott, Eddie Anderson, John Kaye
    [email protected]


    Yacht Match Race Simulation

    PDF File David Teirney,
    [email protected]

    There are many factors that can determine the winner of a yacht match race. Factors such as the dynamic performance of a yacht, the tactical decisions made while sailing, the weather conditions observed during the race, and the necessity to adhere to the rules of yacht match racing can all determine the winner. By constructing a simulation model that takes all of these factors into consideration, the probability that one yacht wins or loses to another can be calculated. The dynamic motion of a yacht can be simulated using performance measures obtained from a velocity prediction program. Then, by creating observable weather conditions and designating a course to race on, the direction of travel can be calculated using logic that approximately simulates the tactics that would be carried out during an actual match race. By running the simulation many times in slightly different probabilistic weather conditions, the probability that one yacht beats another can be calculated for that set of observed weather conditions. Working in conjunction with Team New Zealand, the dynamic motion, weather and tactical decision models can be refined so that the simulation model is as realistic as possible. Once the simulation model is complete it could be used to choose the most suitable yacht to defend the America’s Cup if the observable weather conditions could be estimated in advance. It could also be useful in the design process, giving the ability to look at different design tradeoffs, such as upwind speed versus downwind speed, to maximise the probability of winning. Finally, it could be used to trial different tactical strategies whilst racing.
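    Stripped of the dynamics, tactics and rules models, the core Monte Carlo idea reduces to something like the following sketch; the speeds, course and noise model are invented for illustration.

```python
import random

def win_probability(speed_a, speed_b, legs, sims=2000, gust_sd=0.05, seed=42):
    """Crude Monte Carlo sketch of the race-simulation idea (the real model
    simulates yacht dynamics, tactics and racing rules): each yacht's time
    on a leg is distance/speed perturbed by multiplicative weather noise;
    the estimate is the fraction of simulated races yacht A finishes first."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(sims):
        ta = sum(d / (speed_a * rng.gauss(1, gust_sd)) for d in legs)
        tb = sum(d / (speed_b * rng.gauss(1, gust_sd)) for d in legs)
        wins += ta < tb
    return wins / sims
```

Repeating such a calculation across candidate designs or weather scenarios is how the full model would compare design tradeoffs such as upwind versus downwind speed.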


    Pedagogical Strategies for the teaching of ORMS

    PDF File John Davies,
    [email protected]

    The early development of ORMS was characterised by the deliberate use of multi-disciplinary project teams, whose composite strengths, creativity and problem-solving abilities were enhanced by the complementarity of the different approaches and perspectives that were brought to bear on a problem. This paper reflects on the view that ORMS education programmes have provided, in the main, a 'traditional' hard systems emphasis on technical analytical skill and the generation of optimal solutions, rather than emphasise the harnessing of complementary approaches and perspectives, or the development of 'softer' problem identification and problem structuring skills. However, in the last decade, various methodologies that value and encourage the development of multiple problem representations and multiple perspectives, have gained credence and increasing use as aids to problem identification and problem structuring. This paper outlines a pedagogical strategy which can be used to guide the design of problem structuring and modelling activities in ORMS courses that attempt to embrace alternative approaches; and to legitimise and integrate the multiple perspectives that arise from the use of multiple problem representations.


    From Design to Support of Air New Zealand Rostering Systems

    PDF File Stephen Miller,
    [email protected]

    Air New Zealand rostering problems come in different shapes and sizes. To solve these problems efficiently, complex mathematical models must be developed. In addition, the implementation of these models is complicated and may take considerable time. The software produced must then be integrated into both the work environment and other existing systems. It must be tuned for speed and performance, and then support must be provided. All these are challenges that face the operations research practitioner. In this paper we will discuss our experiences, both good and bad, from design through to support of one of Air New Zealand's rostering systems. We review the history of this system, which has evolved from a single roster solver to a versatile multi-roster solver. The model we consider involves column generation technology, and we take a close look at the inside of the column generator including state space, merging and rules. We look at various options and techniques that have been successfully used to speed up the software.


    Progressive Hedging in Parallel

    PDF File Michael Somervell,
    [email protected]

    The Progressive Hedging Algorithm is a technique for solving linear discrete stochastic programs. It is of special interest for large-scale problems (which arise easily with discrete stochastic programs) as it decomposes the problem into smaller problems that can be solved independently. My thesis implements the Progressive Hedging Algorithm in parallel, by solving the scenarios in parallel at each stage. The Progressive Hedging Algorithm is very well suited to this as all the scenarios are solved independently. I have then created several extensions to this, most of which are simply heuristics at this stage, as I have not proven that they converge. The extensions include asynchronisation within each iteration and not solving all of the scenarios within each iteration. The most significant results to date are that the problems still obtain an optimal solution when fewer than all of the scenarios are solved before updating, and that this leads to a speedup in the solution time. I also hope to perform an investigation into the choice of the penalty parameter, and into dynamic update schemes for it. Preliminary results suggest that a large speedup is also possible here.
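    As background, the core Progressive Hedging iteration can be sketched on a toy two-scenario problem. This is a minimal illustration only, not the thesis implementation: the scenario data and the quadratic subproblems (which admit closed-form solutions here, standing in for an LP solver call) are invented for clarity.

```python
# Progressive Hedging on a toy problem: each scenario s has subproblem
#   min_x  c_s * (x - t_s)^2
# and nonanticipativity requires every scenario to agree on x.

def progressive_hedging(p, c, t, r=1.0, iters=100):
    """p: scenario probabilities; c, t: per-scenario data; r: penalty parameter."""
    n = len(p)
    x = list(t)            # per-scenario decisions
    w = [0.0] * n          # dual multipliers for nonanticipativity
    xbar = sum(p[s] * x[s] for s in range(n))
    for _ in range(iters):
        # Each augmented subproblem  min c_s(x-t_s)^2 + w_s x + (r/2)(x-xbar)^2
        # has the closed-form minimiser below; in general a solver would be
        # called here, one scenario at a time (hence the scope for parallelism).
        x = [(2 * c[s] * t[s] - w[s] + r * xbar) / (2 * c[s] + r)
             for s in range(n)]
        xbar = sum(p[s] * x[s] for s in range(n))   # implementable solution
        w = [w[s] + r * (x[s] - xbar) for s in range(n)]
    return xbar

# Two equally likely scenarios pulling toward 0 and 2: the implementable
# solution converges to the probability-weighted optimum x = 1.
print(round(progressive_hedging([0.5, 0.5], [1.0, 1.0], [0.0, 2.0]), 4))
```

Solving the per-scenario updates in parallel, as the thesis does, changes nothing in this logic since the subproblems are independent within each iteration.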


    Optimised Cell Batching for New Zealand Aluminium Smelters Ltd

    PDF File David Ryan,
    [email protected]

    New Zealand Aluminium Smelters Ltd operates a smelting facility at Tiwai Point near Invercargill. The smelter produces aluminium by the electrolytic reduction of alumina in 650 cells. The cells are laid out in long lines and are grouped into tapping bays. The purity of the aluminium varies from cell to cell depending on a number of factors, including the age of the cell, the purity of the alumina feed and the manner in which the cell has been operated during its production life. High purity aluminium commands a premium price on the metals market. In the production process molten aluminium is tapped from the cells into crucibles that are used to transport the metal to furnaces, from which finished product is cast in the form of billet or ingot. Each crucible taps the metal from three cells in a tapping bay that are located within some specified limit of spread. The spread limit is imposed to minimise the time taken to fill the crucible. This paper describes the development of a set partitioning optimisation model to batch or group triples of cells so that the total value of metal produced is maximised. The main aim is to minimise the dilution of high purity (high value) metal by low purity (low value) metal. The optimised batches tend to group high purity cells together and leave the lower purity cells to be batched with other lower purity cells. Numerical results show that significant improvements of up to 12% can be achieved in the value of metal by carefully batching cells.
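    The structure of the batching problem can be illustrated on a tiny invented instance. The paper solves a set partitioning integer programme; the sketch below instead brute-forces all feasible partitions into triples, purely to show how the spread constraint and the purity-dilution objective interact. The cell positions, purities and value function here are made up and are not NZAS data.

```python
# Toy cell-batching instance: partition six cells into spread-feasible
# triples, maximising total batch value (illustrative sketch only).
from itertools import combinations

# cell id -> (position along the line, purity %)
cells = {0: (0, 99.9), 1: (1, 99.2), 2: (2, 99.8),
         3: (3, 99.1), 4: (4, 99.7), 5: (5, 99.0)}
SPREAD = 4   # maximum allowed spread of positions within one crucible batch

def batch_value(triple):
    # Value is driven by the *lowest* purity in the crucible, which is why
    # mixing high- and low-purity cells dilutes value.
    return min(cells[c][1] for c in triple)

def feasible(triple):
    pos = [cells[c][0] for c in triple]
    return max(pos) - min(pos) <= SPREAD

def best_partition(remaining):
    """Exhaustively partition the cells into feasible triples, maximising value."""
    if not remaining:
        return 0.0, []
    first = min(remaining)   # lowest-numbered cell must go in some triple
    best = (float('-inf'), None)
    for pair in combinations(remaining - {first}, 2):
        triple = (first,) + pair
        if not feasible(triple):
            continue
        val, rest = best_partition(remaining - set(triple))
        if rest is not None and val + batch_value(triple) > best[0]:
            best = (val + batch_value(triple), [triple] + rest)
    return best

value, batches = best_partition(set(cells))
print(value, batches)
```

On this instance the optimum groups the three high-purity cells (0, 2, 4) together, exactly the behaviour the abstract describes; the set partitioning model achieves the same effect at realistic scale via column enumeration and integer programming.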


    Modelling the International Flight Attendant Tour of Duty Problem

    PDF File Chris Wallace, David Ryan
    [email protected]

    A Tour of Duty (TOD) is an alternating sequence of duty periods and rest periods that makes up a work schedule for a number of people. An airline's flight schedule is fully crewed by the interaction of a large number of TODs, combined in such a way that each flight will have the required number of crew on board. The International Flight Attendant problem is probably the most difficult of all the TOD problems because the number of flight attendants required on a flight depends on the aircraft type, and flight attendants are qualified to operate on all aircraft types. This means that crews can split into smaller groups and recombine with other groups to form different complements. This paper describes the problem and presents a novel approach which will be used to solve the problem. This approach involves a combination of column generation and row generation.


    Commercial development of a general application optimisation-based rostering engine

    PDF File David Neilsen, Andrew Mason
    [email protected]

    This paper presents two products of research undertaken at the University of Auckland on the commercial development of a general application optimisation-based rostering engine. The first product is a zero-one IP solving package (ZIP_R) that employs linear programming and branch and bound methods. This solving package is specialised for the treatment of rostering problems. The second product is a general rostering software framework that uses the ZIP_R package as a core solving engine.


    Call Centre Rostering by Iterating Integer Programming and Simulation

    PDF File Andrew Mason, Shane Henderson
    [email protected]

    We present a new technique (RIIPS) for solving rostering problems in the presence of service uncertainty. RIIPS stands for "Rostering by Iterating Integer Programming and Simulation''. RIIPS allows essentially arbitrary complexity of the stochastic system being rostered. This modelling freedom comes at a price, as the approach can be extremely computationally intensive.
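    One generic way such an iteration between an integer programme and a simulation can be organised is sketched below. This is purely illustrative and is not necessarily the RIIPS scheme: the stand-in "IP" and "simulation" functions, and the rule of raising a period's requirement when its service target is missed, are invented for the sketch.

```python
# Generic IP/simulation iteration sketch (not the authors' RIIPS algorithm).

def solve_ip(req):
    """Stand-in for the IP phase: meet per-period staffing requirements at
    minimum cost (here, trivially, staff level = requirement)."""
    return list(req)

def simulate(staff, lam):
    """Stand-in for the simulation phase: report whether each period's
    service target is met (here a crude load check, not a real model)."""
    return [n > lam_t for n, lam_t in zip(staff, lam)]

def ip_simulation_loop(lam, max_iters=10):
    req = [1] * len(lam)
    staff = solve_ip(req)
    for _ in range(max_iters):
        staff = solve_ip(req)
        ok = simulate(staff, lam)
        if all(ok):
            return staff
        # Tighten requirements only where the simulation reports a miss,
        # then re-solve the IP.
        req = [r + (0 if good else 1) for r, good in zip(req, ok)]
    return staff

print(ip_simulation_loop([2.3, 4.1, 1.2]))
```

The computational cost the abstract warns about is visible even here: every tightening step re-runs both the simulation and the IP solve.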


    Keynote Address: Optimization Algorithms and Applications

    PDF File Michael Saunders,
    [email protected]

    The Systems Optimization Laboratory was formed by George Dantzig and Richard Cottle nearly 25 years ago, to foster research on algorithms for constrained optimization (linear and nonlinear programming). Keywords are computational methods, large-scale systems, and more recently, optimization under uncertainty. We review the main solvers (MINOS, NPSOL, and SNOPT) and some interesting applications that have arisen through their use within larger systems (GAMS, OTIS and DECIS). Excerpts from "Life in NZ" will illustrate some important optimization concepts.


    Keynote Address: A reality check: Are we building reliable and maintainable equipment?

    PDF File Azim Houshyar,
    [email protected]

    A review of two major industries (automotive and nuclear) in the US reveals their lack of attention to the use of statistical tools and techniques in: (i) constructing valid simulation models of manufacturing processes; and (ii) designing, manufacturing, operating, and maintaining reliable machinery and equipment. This presentation focuses on two case studies, and argues that machine tool builders and the manufacturing industry alike have failed to fully appreciate the importance of statistical tools and techniques in improving their productivity and reducing their operational cost.


    Heuristics in Rostering for Call Centres

    PDF File Shane Henderson, Andrew Mason, Ilze Ziedins, Richard Thomson, David Burgess
    [email protected]

    An important new feature on the business scene is the development of call centres, whereby a pool of staff is used to answer incoming calls from customers. This project develops a model that enables staffing levels to be determined to meet specified quality targets on customer wait times. Unlike most previous work, this new model explicitly considers call arrival rates that vary during the day, and exploits linkage between periods to keep staffing costs to a minimum.
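    For context, the classical stationary baseline that such models improve on is the single-period Erlang-C staffing calculation, sketched below. This is standard queueing theory, not the authors' model (which handles time-varying rates and period linkage); the arrival rate, handling time and service target figures are invented for illustration.

```python
# Single-period Erlang-C staffing baseline for an M/M/n call centre.
import math

def erlang_c(n, a):
    """P(wait > 0) with n agents and offered load a = lam/mu (requires a < n)."""
    s = sum(a**k / math.factorial(k) for k in range(n))
    top = a**n / math.factorial(n) * n / (n - a)
    return top / (s + top)

def staff_needed(lam, mu, target_wait, max_p_wait):
    """Smallest n such that P(wait > target_wait) <= max_p_wait, M/M/n FCFS."""
    a = lam / mu
    n = math.ceil(a) + 1                     # need n > a for stability
    while True:
        # Standard M/M/n tail: P(wait > t) = ErlangC * exp(-(n*mu - lam) * t)
        p = erlang_c(n, a) * math.exp(-(n * mu - lam) * target_wait)
        if p <= max_p_wait:
            return n
        n += 1

# e.g. 100 calls/hour, mean handling time 3 minutes (mu = 20/hour),
# target: at most 20% of callers wait longer than 20 seconds.
print(staff_needed(100, 20, 20 / 3600, 0.20))
```

Running this independently for each period of the day ignores carryover of queued calls between periods, which is exactly the linkage the abstract's model exploits to reduce staffing costs.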


    Aggregate Planning for Winstone Aggregates

    PDF File Stuart Mitchell,
    [email protected]

    Winstone Aggregates Ltd are facing major changes as the Auckland based quarries, specifically Mt Wellington, are running out of resource. This project develops a linear programming model of the production of aggregate in the quarries and its transportation to the markets. This system is currently being used at Winstone’s to assist in their strategic planning.


    Improved Shift Generation for NZ Customs

    PDF File David Geary,
    [email protected]

    For many years, the University of Auckland has been involved with the staffing problems faced by NZ Customs at Auckland International Airport. To date, the shift generation phase of this work has been implemented using heuristics that attempt to capture many of the non-linearities associated with Customs' quality measure. This project develops improved linear programming based tools to generate these shifts, and shows that this approach leads to improved quality rosters for Customs staff.


    Risk averse reservoir management in a deregulated electricity market

    PDF File Mark Craddock, A. D. Shaw and B. Graydon
    [email protected]

    In a deregulated electricity market, there are three major influences on the strategy of a generation company: market interactions, contract positions and attitude towards risk. An optimisation tool to address the medium-term scheduling problem should incorporate all three factors when formulating a strategy. Recent research has shown much progress on market interactions and contract positions, and independently progress has also been made in looking at risk aversion. This paper draws together this work and presents a medium-term model which combines all three factors. We provide an overview of the formulation of the model and discuss its behaviour. We also outline some future directions for this model.


    Pricing for congestion in an M/G/1 queue

    PDF File Moshe Haviv,
    [email protected]

    There is no doubt that a customer (or a job) who joins a queue imposes delays on others. He should be charged for that. The question then is by how much. In particular, one looks for a price mechanism with which total waiting costs are shared while customers pay in accordance with their own length of service. In the talk two such price mechanisms will be presented. The first is when customers are charged for the total waiting time accumulated while they are being served. The second is based on applying the Aumann-Shapley cost-sharing price mechanism. These mechanisms will be exemplified in the M/G/1 queueing system with various entrance disciplines, such as first-come first-served and processor sharing.
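    As background for the M/G/1 setting, the mean FCFS waiting time is given by the standard Pollaczek-Khinchine formula, which underpins any calculation of the delay a customer imposes on others. The sketch below computes it; this is textbook queueing theory, not the pricing mechanisms of the talk, and the numerical example is invented.

```python
# Pollaczek-Khinchine mean waiting time for the M/G/1 FCFS queue:
#   W_q = lam * E[S^2] / (2 * (1 - rho)),   rho = lam * E[S].

def pk_mean_wait(lam, es, es2):
    """lam: arrival rate; es: E[S]; es2: E[S^2]; requires rho = lam*es < 1."""
    rho = lam * es
    assert rho < 1, "queue must be stable"
    return lam * es2 / (2 * (1 - rho))

# Exponential service with rate mu = 1 (so E[S] = 1, E[S^2] = 2), lam = 0.5:
# W_q = 0.5 * 2 / (2 * 0.5) = 1.0
print(pk_mean_wait(0.5, 1.0, 2.0))
```

Note the dependence on the second moment E[S^2]: long service times inflate everyone's delay disproportionately, which is precisely why charging customers according to their own length of service is a natural starting point for the price mechanisms discussed.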


    CSL - a knowledge management tool for life

    PDF File John Paynter, Don Sheridan, David White
    [email protected]

    An Internet browser-based computer-supported learning tool was designed and built by MTTU to serve as an assessment vehicle for business students. Now in its third year of operation, it has migrated across computer platforms and been completely redesigned to become a university resource. Hundreds of students use CSL daily. This paper will outline the development of CSL and discuss our work in connecting a taxonomy of knowledge with the multimedia assets needed for learning and assessment. In particular, its use in teaching Accounting, Information Systems and Software Engineering courses and the lessons to be learnt will be explored. The implications for self-directed study and our University's goal to provide life-long learning will be presented.


    An Algorithm for Capacity Expansion in Local Access Networks

    PDF File Andrew Coyle,
    [email protected]

    Growing demand, innovations in telecommunication technologies, and the expansion and diversification of services offered have created a variety of design and expansion problems in telecommunication networks. This paper looks at one of these problems, the Local Access Network Expansion Problem (LANEP), in which existing cables and concentrators must be installed or upgraded to handle growing demand. The general problem can be modelled as an integer linear program, but this is known to be NP-hard, and thus typical network sizes preclude the use of such methods. Various other solution methodologies and algorithms have been proposed, including a dynamic programming algorithm solvable in pseudo-polynomial time. This paper presents an extension to this algorithm which simplifies both its representation and its implementation, and allows two of the major LANEP assumptions (contiguity and non-bifurcation) to be easily switched on or off. Translating this general LANEP algorithm into a real world application does create practical problems which prevent it from being implemented as is. These problems are discussed and current solutions under investigation are presented.


    The Matrix, the Spiderweb & the Influence Diagram: Development of Systems Thinking in the NZ Customs Service

    PDF File Robert Cavana, Leslie V. Clifford
    [email protected]

    This paper discusses the introduction of systems thinking concepts into the strategic planning process at New Zealand Customs Service (NZCS). The recent state sector reforms and restructuring at NZCS are briefly outlined. A Working Group of government officials from NZCS, Treasury and the State Services Commission was engaged to undertake a Baseline Review of Customs activities. This included commissioning a pilot study to investigate the suitability of the system dynamics methodology for determining the desired purchase mix for the NZCS. The pilot study was performed on the "Search for Drugs" output group from the Purchase Agreement between the Minister of Customs and the Chief Executive of NZCS. This paper summarises the pilot study and presents a range of system diagrams showing the interrelationships between the inputs, outputs and outcomes associated with the search activity. A lack of suitable data and outcome measurements prevented the development of a quantitative model. However, the study has provided a framework for strategic analysis at the NZCS.


    Measuring the Effects of Local Government Reform: A New Zealand Highway Maintenance Application

    PDF File Paul Rouse, Martin Putterill
    [email protected]

    In 1989 several hundred New Zealand local government authorities were amalgamated into fewer than one hundred larger entities. This research focuses on determining pre- and post-amalgamation efficiency in highway maintenance activities, the major item of expenditure for most NZ local authorities.



    Last modified August 20, 1998.