**Editor's note: INFORMS will not publish papers that appear in our Proceedings.** If you submit an expanded or modified version of your ORSNZ paper to an INFORMS journal, INFORMS will judge it on its marginal contribution.

Besides that, this editor's reaction to our Conference papers is *Wow!*
After some nerve-racking weeks where it seemed we would have a rather small conference,
the Twenty Naught One is shaping up to be first class, almost exactly the same size as last year's conference
(even a bit bigger). The quality of the papers is very good - many are “journal ready.” Thanks to all of you for contributing! - *jfr*

The abstracts are sorted by title. If the full paper is available, you can click the title for the PDF file.


A Beam Search Heuristic for Multi-Mode Single Resource Constrained Project Scheduling, by Basnet, Tang & Yamaguchi

Chuda Basnet, Dept. of Management Systems, Univ. of Waikato, Private Bag 3105, Hamilton, NZ.

Guochun Tang, School of Management, Shanghai Second Polytechnic Univ., 50 Yuan Ming Yuan Road, Shanghai 200002, China.

Tadashi Yamaguchi, Faculty of Information Media, Hokkaido Information Univ., Nishinopporo 59-2, Ebetsu-shi, Hokkaido, Japan

In this paper, the beam search technique is applied to resource constrained project scheduling where there is a single renewable resource to consider. Such projects occur frequently in practice: for example, the use of labour in construction projects, or limits on the number of programmers available to carry out a software project. Usually the manpower needs are estimated in units such as work-hours or work-days. The multiple modes correspond essentially to the number of people employed to finish an activity. This problem is also called the discrete time/resource trade-off problem. Beam search employs a truncated breadth-first search tree, where all the choices are evaluated at each node, but only a limited number of choices are selected at each level for further search. A number of evaluation techniques are presented for use within the beam search heuristic. Computational results are presented.
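
As an illustration of the general technique the abstract describes (not code from the paper), a beam search keeps only the best few partial solutions at each level of a breadth-first tree. A minimal generic sketch, where `expand` and `score` are hypothetical problem-specific callbacks:

```python
import heapq

def beam_search(root, expand, score, width=3):
    """Generic beam search: evaluate all children at each level, but keep
    only the `width` best-scoring nodes for further search."""
    level = [root]
    best = root
    while level:
        children = [c for node in level for c in expand(node)]
        if not children:
            break
        # truncate the breadth-first frontier to the `width` best choices
        level = heapq.nlargest(width, children, key=score)
        best = max(level, key=score)
    return best
```

In the scheduling setting, a node would be a partial schedule, `expand` would enumerate the available activity/mode choices, and `score` would be one of the evaluation techniques the paper compares.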

Adjacency Constraints in Forest Harvesting, by McNaughton, Page, & Ryan

Alastair McNaughton, Dept. of Mathematics, Univ. of Auckland.

Geoff Page.

David Ryan, Dept. of Engineering Science, Univ. of Auckland.

Area-restricted adjacency constraints, which control the availability for harvest of specific blocks of trees relative to the harvesting of certain adjacent blocks, are a necessary part of many large-scale forest harvesting applications. However, such constraints are difficult to formulate and often precipitate major difficulties in the practical implementation of the solution algorithm. A new model which involves non-standard concepts will be presented along with some test results from a couple of case studies.

Togar M. Simatupang, Institute of Technology and Engineering, Massey Univ., NZ.

R. Sridharan, Institute of Information Sciences and Technology, Massey Univ., NZ

A supply chain typically consists of interrelated members such as raw-material suppliers, manufacturers, distributors, and retailers who have different ranges of private information. Information sharing is a strategy to achieve cohesion of all functions amongst the chain members, so as to provide adequate visibility to enable them to make good decisions that can improve the total chain profitability. It is not surprising that many studies of both theoretical and practical orientation have been devoted to emphasising the benefits of information sharing. However, little attention has been given to a comprehensive characterisation of information sharing in supply chains. This paper provides a simple characterisation of information sharing in terms of research content and approach. Since total visibility is difficult to obtain amongst the chain members, we also conceptualise the conditions that make it optimal to share private information for profit maximisation from the global perspective. Under these conditions, information sharing leads to a win-win resolution of the problem of inequitable benefits amongst participating members.

Key words: information sharing, supply chain management.

A Model for Demand-side Electricity Optimisation, by Pettersen

Erling Pettersen, Norwegian University of Science and Technology.

This paper presents a Stackelberg game model of strategic interaction between an electricity retailer and an end-user in an electricity pool market. The retailer offers a vector of 24 hourly prices to the end user who lives in an intelligent house. The intelligent house minimises the electricity costs based on the offered prices taking into account the residents' habits. The consumption pattern chosen by the consumer must be purchased by the retailer in the risky wholesale market. We assume that the risk averse end-user is unwilling to take the risk in the wholesale market while the risk neutral retailer seeks to maximise expected profit. Thus the price profile offered includes a risk premium. To model the house's cost minimisation we have chosen to view the house as an energy store, where space and water heating are carried out at the lowest possible cost, an approach which results in the house solving a linear program. This is convenient due to the highly non-linear and non-convex nature of the mathematical program with equilibrium constraints (MPEC) which constitutes the overall problem.

Michael Lauren, Head, Operational Analysis Section, Defence Technology Agency, Devonport Naval Base, Private Bag 32901, Auckland, NZ

Many militaries use combat modelling as an Operational Research technique to assess potential risks for various kinds of operations. However, conventional combat equations tend to be firepower centric and of limited use for describing the environment that the modern warfighter must operate in. Significantly, they fail to explore how armed forces can adapt and re-organise themselves on the battlefield to defeat an apparently superior opponent, but rather treat each side as little more than "cannon fodder". A new equation-based method is suggested which deals with dispersed land warfare operations. A generic scenario of a small but powerful Blue force maneuvering through a larger but dispersed Red force is used to illustrate this theoretical framework. The scenario is run within an advanced cellular automaton combat model called MANA, designed by the New Zealand Defence Technology Agency. The results show that a certain ensemble of runs will never reach a pre-determined casualty level. For the runs outside of this ensemble, the role of individual lethality is not as crucial as might be expected from the Lanchester equation, due to the ability of model entities to concentrate firepower. This is symptomatic of the correlated nature of the distribution of entities, both in space and time. These correlations are describable by fractal power laws. Furthermore, the distribution of model outcomes exhibits a fat tail, as is characteristic of other systems which display fractal traits. The model results appear to support the suggested equation.

*YPP competitor:* An Online Optimised Pump Scheduling System, by Pegg

Stephanie Pegg, Beca Consultancy Services, Auckland, NZ.

A common problem facing the suppliers of bulk drinking water is how to minimize the cost of supply. In a typical potable water distribution network, water is supplied from a number of water treatment plants into a mains network. From there it can either be directly piped into domestic reticulation or pumped into a load balancing municipal reservoir. The cost of supply has several aspects: the chemical cost of treating water, the cost of pumping water into the mains network and the cost of pumping water from the mains network into the municipal reservoirs. The demand for water varies seasonally and by time of day, so it is uncommon for water treatment plants and reservoir pumping stations to operate at their peak capacity continuously. The goal of optimization is to find a schedule of water treatment plant outputs and pump running times that will minimize the cost of supplying the demand while meeting operational constraints. This paper describes a solver called Derceto which solves the pump scheduling problem for the Wellington Regional Council (WRC)'s Wainuiomata-Waterloo potable water distribution network. Since installation, WRC have reported a 10% reduction in the cost of supplying water to the Wainuiomata-Waterloo network.

Cameron Walker, Dept. of Engineering Science, School of Engineering, Univ. of Auckland.

This paper describes the development and implementation of an optimisation model used to resolve disruptions to an existing train timetable throughout the day, in real-time. It is the further development of a model designed by J. Snowdon (1999) which simultaneously solved the train timetable and train crewing problems. This parallel construction process was a unique approach to these problems at the time, and is continued in the methodology described in this paper. Results are presented which suggest that the speed of this solution process is sufficient to deal with real-time disruptions.

Asset Management of Water Distribution Systems, by Watson & Mason

Tim Watson, Dept. of Engineering Science, The University of Auckland.

Dr Andrew Mason, Dept. of Engineering Science, The University of Auckland.

The efficient long term management of large-scale publicly funded assets is an area of growing importance. Ageing infrastructure, growth, and limited capital all result in the need for a more robust and rigorous methodology to prioritise rehabilitation and renewal decisions and, more importantly, to forecast future expenditure requirements. Research is currently underway at Harrison Grierson Consultants Limited to develop such a methodology as part of a structured research programme. The overall objective of the research is to develop a Bayesian-based decision support system that will facilitate the identification of efficient asset management policy. This paper focuses on areas of research relating to the long term management of water distribution systems and in particular will present: 1) a general review of existing research; 2) the development and initial results of an object-oriented discrete event simulation; and 3) proposed future research and development.

Bookmobile Routing and Scheduling in Buskerud County, Norway, by Foulds, Wallace, Wilson, & Sagvolden

Les Foulds, Univ. of Waikato, NZ.

Stein W. Wallace, Molde College, Norway.

John Wilson, Univ. of Loughborough, Great Britain.

Liv Sagvolden, Buskerud Fylkesbibliotek, Norway.

A bookmobile is a specially adapted bus or van used as part of the outreach operations of public library systems. Bookmobiles play a significant part in the service of the public library system in Buskerud County, Norway. They are used to deliver and collect library materials (printed books, audio books, periodicals, and music) to and from lender groups throughout the County, many in remote areas. The question of how best to utilize the County's bookmobile resources can be modelled as an interesting variation of one of the classical models of OR: the vehicle scheduling and routing problem. The features that make this scenario non-standard include: multiple depots, simultaneous cost minimization and prize collection objectives, differing customer service levels, time windows, route start time flexibility for some routes, multiple route duration restrictions, route lunch breaks, and overnight stays on certain routes. We report on a model for the bookmobile problem and the outcome of applying it to the Buskerud County bookmobile system.

Key words: Bookmobile operations, vehicle scheduling, integer programming, case study.

Capital Planning in the Paper Industry using COMPASS, by Everett, Aoude, & Philpott

Graeme Everett, Norske Skog Limited.

Sami Aoude, Norske Skog Limited.

Andrew Philpott, Dept. of Engineering Science, Univ. of Auckland.

Norske Skog operate three paper mills in Australia and New Zealand. These mills together run six large paper machines. Decisions on upgrading the paper machines, replacing them, or shutting them down must take account of changing demand for different grades of paper. In upgrading the machines there are many different choices of technology, each of which provides different production possibilities and has different costs. This paper describes the development and implementation of a model used by Norske Skog to assist in determining an optimal capital plan. The model, called COMPASS, is a large mixed integer programming problem formulated with a ten-year planning horizon, and solved using AMPL and Cplex 7.0. We present some of the special features of this model and describe how it was used in the capital planning process.

Congestion Control Algorithms in High Speed Telecommunication Networks, by Haider, Sirisena, Pawlikowski, & Ferguson

Aun Haider, Dept. of EEE, Univ. of Canterbury, Christchurch, NZ.

Harsha R. Sirisena, Dept. of EEE, Univ. of Canterbury, Christchurch, NZ.

Krzysztof Pawlikowski, Dept. of Computer Science, Univ. of Canterbury, Christchurch, NZ.

Michael J. Ferguson, INRS-Telecommunications, Univ. of Quebec, Canada.

Modern telecommunication and computer networks, including the Internet, are being designed for fast transmission of large amounts of data, for which Congestion Control Algorithms (CCAs) are very important. Without proper CCAs, congestion collapse of such networks is a real possibility. Random tele-traffic is a heterogeneous mixture of streams of data packets that have different quality-of-service requirements. By buffering submitted packets at gateway nodes we can regulate rates at which packets enter the network, but this may increase the overall packet delays to an unacceptable level. Therefore it is increasingly important to develop mechanisms for gateways that are able to keep throughput of a network high while maintaining average queue lengths sufficiently small.

Several algorithms proposed recently try to provide an efficient solution to the problem. In one of these, Active Queue Management (AQM) with Explicit Congestion Notification (ECN), packets generated by different data sources are marked at the gateways. Thus, different senders of data can be required to reduce their traffic volume if needed. Communication with the original data senders is maintained by sending back marked acknowledgement packets.

In this paper we give an overall classification, survey and comparison of the major CCAs proposed for gateways. These CCAs strive to provide fairness in bandwidth allocation among the competing flows, low queuing delay and high throughput, and to reduce the packet drop rate at the routers by acting before congestion actually occurs.
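
As an illustration of one member of the AQM family this survey covers (a sketch, not code from the paper), Random Early Detection (RED) marks or drops packets with a probability that grows linearly between two thresholds on the average queue length:

```python
def red_mark_probability(avg_q, min_th, max_th, max_p):
    """RED-style marking: no marking below min_th, certain marking at or
    above max_th, and a linearly increasing probability in between."""
    if avg_q < min_th:
        return 0.0
    if avg_q >= max_th:
        return 1.0
    return max_p * (avg_q - min_th) / (max_th - min_th)
```

Marking early, while the average queue is still moderate, is what lets such gateways signal senders (e.g. via ECN) before congestion collapse sets in.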

*YPP competitor:* Data-mining and Neural Networks from a Commercial Perspective, by Cerny

Portia Cerny, Aim Proximity, Auckland, NZ.

Companies have been collecting data for decades, building massive data warehouses in which to store it. Even though this data is available, very few companies have been able to realize the actual value stored in it. The question these companies are asking is how to extract this value. The answer is data-mining.

There are many technologies available to data-mining practitioners, including Artificial Neural Networks, Regression, and Decision Trees. Many practitioners are wary of Neural Networks due to their black box nature even though they have proven themselves in many situations.

In our current research we are attempting to compare the aforementioned technologies and determine if Neural Networks outperform the more traditional statistical techniques. This paper is an overview of artificial neural networks and questions their position as a preferred tool by data-mining practitioners.

*Plenary address*: Decision Making Under Uncertainty: Is Sensitivity Analysis of Any Use? by Wallace

Stein W. Wallace, Molde College, Norway.

Sensitivity analysis, combined with parametric optimization, is often presented as a way of checking if the solution of a deterministic linear program is reliable even if some of the parameters are not fully known, but are instead replaced by a best guess, often a sample mean. It is customary to claim that if the region over which a certain basis is optimal is large, one is fairly safe by using the solution of the linear program. If not, the parametric analysis will provide us with alternative solutions that can be tested. This way, sensitivity analysis is used to facilitate decision making under uncertainty by means of a deterministic tool, namely parametric linear programming. We show in this talk that this basic idea of stability has little to do with the optimality of an optimization problem whose parameters are uncertain. The tool does not deliver what it promises.

Determinants for Franchise Success, by Paynter & Arthanari

John Paynter, Dept. of Management Science, Univ. of Auckland.

Tiru Arthanari, Dept. of Management Science, Univ. of Auckland.

At a time when the rest of the economy has been gloomy, franchising in New Zealand has seen annual growth rates of between 8% and 30% over the last five years. The first of the annual surveys in 1997 recorded a turnover of $3.5B. This year's estimates exceed $10B with 14,000 units and employment figures of around 70,000. Yet the success and growth have not been uniform across all of the 300 systems involved in this business format. The failure rate of the individual units over a three-year period is around 7%, ranging from 0 to 50%. It is hard to pin down accurate figures for non-franchised businesses but they are commonly reported to be much higher. The franchisor's initial investment is typically in the order of $100,000 and the median start-up cost of a franchised unit is $125,000. It is therefore in the best interests of both parties to ensure that the franchisees receive the correct types of support and training in return for the investment and on-going royalties, to maximise their survival and the overall health of the system. This paper examines some of the factors that contribute to the success of franchise systems. In particular the Return on Investment (ROI), failure rates and overall satisfaction levels are analysed against the training and support services offered by the systems.

Development of FIDO - A Network Design Tool for Fibre Cable Layout, by Mason & Philpott

Dr Andrew Mason, Dept. of Engineering Science, The University of Auckland.

Andrew Philpott, Dept. of Engineering Science, Univ. of Auckland.

There are currently a number of new fibre optic cable networks being laid through several of New Zealand's major cities. This paper discusses a design tool, termed FIDO (Fibre Diversity Optimiser), that has been developed for a network operator to assist them in their network planning. FIDO combines a specialised visualisation GUI with a mix of optimisation and heuristic procedures to create a customised decision support tool for fibre cable layout. FIDO is currently being used by the operator's design team.

*YPP competitor:* Electricity Network Reliability Optimization, by Singh

Kavinesh Singh, Dept. of Engineering Science, Univ. of Auckland.

Electricity distribution networks are subject to random faults. On occurrence of a fault in the distribution network, the circuit breaker on the faulty feeder trips, which disconnects supply to all customers connected to that feeder. Once the fault is located, the faulty component is isolated and then repaired. The cost of isolating and repairing faults depends on the way the network is configured, and the size and location of its components, since a major part of this cost comes from compensating customers if a fault in the network causes the interruption duration to be greater than a given threshold duration. We describe a model developed in collaboration with a local lines company for minimizing the cost of reliability of distribution networks by switch reconfiguration. Three local search heuristics to reconfigure radial distribution networks are investigated. For each component, the time between failures, the isolation duration, and the repair duration are modelled using Weibull distributions. The model allows the company to reconfigure their current radial distribution networks to minimize their total expected cost of reliability, as well as providing a tool for investigating capital investment scenarios.
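
To illustrate the Weibull-based cost model the abstract describes (an illustrative sketch, not the paper's model; all names and parameters are ours), the expected compensation cost of a component can be estimated by Monte Carlo simulation, paying out only when a simulated interruption exceeds the threshold duration:

```python
import random

def expected_compensation(shape, scale, threshold, cost_per_event,
                          n_sims=10_000, seed=1):
    """Monte Carlo sketch: draw interruption durations from a Weibull
    distribution and estimate the expected compensation cost, which is
    incurred only when an interruption exceeds the threshold duration.
    Note Python's weibullvariate takes (scale, shape) in that order."""
    rng = random.Random(seed)
    payouts = 0.0
    for _ in range(n_sims):
        duration = rng.weibullvariate(scale, shape)
        if duration > threshold:
            payouts += cost_per_event
    return payouts / n_sims
```

Summing such terms over components, for each candidate switch configuration, gives the expected reliability cost that the local search heuristics would seek to minimise.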

*YPP competitor:* Estimation and Optimisation in Electricity Pool Markets, by Hu

Pei-Teh Hu, Dept. of Engineering Science, Univ. of Auckland.

In electricity pool markets, generators submit offers of quantity-price pairs to an independent system operator, who dispatches power to meet loads at the lowest cost using these offers. The market-clearing price at any location is the marginal cost of supplying electricity to this location. By withholding capacity or offering at a high price, generators can influence the market-clearing price, which is also affected by their competitors behavior. We represent the uncertainty in the competitor offers and the demand by a single probability distribution called the market distribution function. In this paper, we explore a methodology to estimate the market distribution function and construct an offer stack that gives the maximum expected profit with respect to this distribution. A case study based on historical data from Mighty River Power will be presented.

*YPP competitor:* Fast Shortest Path Algorithms for Large Road Networks, by Engineer

Faram Engineer, Dept. of Engineering Science, Univ. of Auckland.

Shortest path problems are among the most studied network flow optimisation problems, with interesting applications in a range of fields. One such application is in the field of GPS routing systems. These systems need to quickly solve large shortest path problems but are typically embedded in devices with limited memory and external storage. Conventional techniques for solving shortest paths within large networks cannot be used as they are either too slow or require huge amounts of storage. In this project we have tried to reduce the runtime of conventional techniques by exploiting the physical structure of the road network and using network preprocessing techniques. Our algorithms may not guarantee optimal results but can offer significant savings in terms of memory requirements and processing speed.

Our work uses heuristic estimates to bound the search and direct it towards a destination. We also associate a radius with each node that gives a measure of importance for roads in the network. The further we get from either the origin or destination the more selective we become about the roads we travel on, giving priority to roads with greater importance (i.e. roads with larger radii).

By using these techniques we were able to dramatically reduce runtimes compared to conventional techniques while still maintaining an acceptable level of accuracy.
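
The heuristic bounding described above is in the spirit of A* search, where a lower-bound estimate (here, straight-line distance) directs the search toward the destination. A minimal sketch of that core idea (the radius-based road pruning is specific to the paper and omitted; the graph encoding is ours):

```python
import heapq
import math

def astar(adj, coords, origin, dest):
    """A*-style shortest path: nodes are prioritised by distance so far
    plus a straight-line lower bound on the remaining distance, so the
    search expands far fewer nodes than plain Dijkstra."""
    def h(n):
        (x1, y1), (x2, y2) = coords[n], coords[dest]
        return math.hypot(x2 - x1, y2 - y1)

    dist = {origin: 0.0}
    pq = [(h(origin), origin)]
    while pq:
        _, u = heapq.heappop(pq)
        if u == dest:
            return dist[u]
        for v, w in adj.get(u, []):
            nd = dist[u] + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd + h(v), v))
    return float("inf")
```

Because straight-line distance never overestimates road distance, the heuristic is admissible and the path found is still optimal; the paper's radius-based pruning trades away that guarantee for further speed.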

Hard OR, Soft OR, Problem Structuring Methods, Critical Systems Thinking - A Primer, by Daellenbach

H. G. Daellenbach, Professor Emeritus, Dept. of Management, Univ. of Canterbury.

Up until the early 1970s there was a fair consensus among operations researchers about what OR/MS meant, i.e., the use of quantitative methods to model decision-making situations with a view to achieving some well-defined objective. This all changed in the 70s, 80s, and 90s with the development of alternative approaches to decision making and problem solving. These methods or methodologies are also based on systems thinking and claim to be operations research and/or management science. This paper gives an elementary overview of this development, classifying the approaches, highlighting their major features, and giving brief descriptions of some of them.

Hurst Parameter Estimation Techniques: A Critical Review, by Jeong, Pawlikowski, & McNickle

Hae-Duck J. Jeong, Dept. of Computer Science, Univ. of Canterbury, NZ.

Krzysztof Pawlikowski, Dept. of Computer Science, Univ. of Canterbury, NZ.

Don McNickle, Dept. of Management Univ. of Canterbury, NZ.

Many recent studies of real teletraffic data in computer networks have shown evidence of self-similarity. The Hurst parameter H plays an important role in characterising self-similar processes. Thus, the estimation of the Hurst parameter of a sequence with a finite number of values is crucial in determining whether a process is self-similar. Estimation techniques for the Hurst parameter have received considerable attention in recent years. We present a comparative analysis of the most frequently used H estimation techniques: the wavelet-based H estimator, Whittle's MLE estimator, and the periodogram, R/S-statistic, variance-time and IDC estimation techniques. Our results reveal that the wavelet-based H estimator is the least biased of the H estimation techniques considered.
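
To illustrate one of the classical techniques compared above (a sketch of the variance-time method only, not the paper's code), the variance of the m-aggregated series of a self-similar process scales like m^(2H-2), so H can be recovered from the slope of log(variance) against log(m):

```python
import math
import statistics

def hurst_variance_time(x, block_sizes=(1, 2, 4, 8, 16)):
    """Variance-time estimator sketch: aggregate the series over blocks of
    size m, fit log(variance) vs log(m) by least squares, and recover H
    from the slope via Var ~ m**(2H - 2)."""
    pts = []
    for m in block_sizes:
        blocks = [sum(x[i:i + m]) / m for i in range(0, len(x) - m + 1, m)]
        pts.append((math.log(m), math.log(statistics.pvariance(blocks))))
    xs, ys = zip(*pts)
    xbar, ybar = statistics.mean(xs), statistics.mean(ys)
    slope = (sum((a - xbar) * (b - ybar) for a, b in pts)
             / sum((a - xbar) ** 2 for a in xs))
    return 1 + slope / 2
```

For uncorrelated noise the aggregated variance decays like 1/m (slope -1), giving H near 0.5; values of H significantly above 0.5 indicate long-range dependence. The abstract's finding is that this method, like the other classical ones, is more biased than the wavelet-based estimator.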

Keywords: Hurst parameter estimation techniques, self-similar processes, sequential and fixed-length self-similar generators.

Interval Estimators of Proportions in Coverage Analysis of Simulation Output Results, by Lee, McNickle, & Pawlikowski

Jong-Suk R. Lee, Dept. of Computer Science, Univ. of Canterbury, NZ.

Don McNickle, Dept. of Management Univ. of Canterbury, NZ.

Krzysztof Pawlikowski, Dept. of Computer Science, Univ. of Canterbury, NZ.

A normal distribution is commonly used for defining confidence interval estimators for proportions even though alternative, more accurate and efficient estimators have been proposed in the past. This is probably because the normal approximation has been easier to use in practice than other estimators. However, current computing technology can now deal with such estimators quite efficiently. One application of proportions is in the coverage analysis of simulation output results. Coverage analysis is often used to measure the robustness of methods of simulation output data analysis. In this paper we study three interval estimators of proportions to identify a more accurate estimator for applications in coverage analysis. The estimators (based on the normal distribution, the arcsin transformation and the F distribution) were compared with exact values calculated from the binomial probability density function. The numerical results show that estimators based on the F distribution should be used in practice.
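
The comparison against exact binomial probabilities can be sketched as follows (an illustrative computation for the normal-approximation interval only; not the paper's code). The true coverage of an interval estimator is the binomial probability mass of all outcomes k whose interval actually contains the true proportion p:

```python
from math import comb, sqrt

def coverage(p, n, z=1.96):
    """Exact coverage of the normal-approximation (Wald) interval for a
    proportion: sum the binomial probabilities of every outcome k whose
    interval phat +/- z*sqrt(phat*(1-phat)/n) contains the true p."""
    total = 0.0
    for k in range(n + 1):
        phat = k / n
        half = z * sqrt(phat * (1 - phat) / n)
        if phat - half <= p <= phat + half:
            total += comb(n, k) * p**k * (1 - p)**(n - k)
    return total
```

Evaluating this for small p shows the normal approximation covering far less often than the nominal 95%, which is the kind of deficiency that motivates the F-distribution-based (exact) estimator the paper recommends.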

Long-Term Hydro Scheduling in a Pool Market, by Neame, Philpott, & Pritchard

Philip Neame, Dept. of Engineering Science, Univ. of Auckland.

Andrew Philpott, Dept. of Engineering Science, Univ. of Auckland.

Geoff Pritchard, Dept. of Statistics, Univ. of Auckland.

A generator offering power into a pool market like New Zealand's needs to construct an offer stack for each half-hour period, indicating how much power they will generate at each price. For a hydroelectric generator, decisions in each period are linked by the quantity of water in the dam. We consider a simple model where the generator is unable to influence the market price by withholding capacity, and hence we model the uncertainty with a price distribution function. We have devised a dynamic programming algorithm that can then find the optimal stack for each half-hour period. However, this is only practical for short-term models. Hence, we consider a model with weekly periods, initially by offering the same stack for every half-hour period within that week. Because the price is uncertain, the revenue gained is uncertain, and so is the release (the total quantity of water used). We model the release with a normal distribution, and consider the combined problem of selecting a mean and a variance release for each week, and constructing stacks with this mean and variance. This leads to some elegant theoretical results and gives some preliminary insights into constructing a practical algorithm.

Minimizing the Total Completion Time in a Two-Machine Group Scheduling Problem with Carryover Sequence Dependency, by Logendran & Sriskandarajah

Rasaratnam Logendran, Dept. of Industrial and Mfg. Engineering, Oregon State Univ., USA

Chelliah Sriskandarajah, School of Management, The Univ. of Texas at Dallas, USA

The printed circuit board assembly in electronics manufacturing is characterized as a two-machine, carry-over sequence-dependent group-scheduling problem with anticipatory setups. Typically, the board types that exhibit similar features are included in one group and the setup time required on either machine is evaluated based on a surrogate board group, representing all board types that belong to the group. The setup on the second machine is performed in anticipation of the arriving board group. In addition, the setup time required of a surrogate board group on either machine is dependent upon the entire set of preceding surrogate board groups that have so far been processed. This is in marked contrast to the sequence-dependent group-scheduling problem in hardware (discrete parts) manufacturing where the setup time required of a group of parts is only dependent upon the immediately preceding group. As the problem is proven to be strongly NP-hard, we present an effective mechanism for evaluating the lower bound for the total completion time. The solution associated with this lower bound is subsequently used as an initial solution to trigger the application of the higher-level search algorithm based on tabu search, to finally find the best solution for the problem. We present a representative example obtained from industry to demonstrate the applicability of the steps associated with the solution algorithm.

*YPP competitor:* Modifications to the Pure Preferential Bidding System for Rostering Pilots at Air New Zealand, by Rouse

Natalie M. Rouse, Dept. of Engineering Science, Univ. of Auckland.

At Air New Zealand, flight crew are assigned to particular flights by complex computer-based rostering systems. One of these crew rostering systems is called the Preferential Bidding by Seniority (PBS) system. This system constructs rosters for international pilots by accepting bids for certain work structures from the crew and performing a series of individual optimizations (called the squeeze process) which ensures that bids are satisfied in strict order of seniority. While the PBS system produces excellent quality rosters in strict seniority order, the amount of bid satisfaction typically achieved by junior crew members is very low. In addition, the series of optimizations are slow and expensive to solve.

We have demonstrated that it is possible to avoid the squeeze process and improve roster quality for junior crew while maintaining an acceptable degree of the strict seniority enforced by PBS. Improvements have also been made to the branching strategy used by the Branch and Bound algorithm to produce sensible integer solutions. A representative group of Air New Zealand pilots was consulted during the course of the project to ensure that a solution was developed that could be approved by the pilots and implemented as an improvement to the current PBS system.

Mistaken Identity or Multiple Identities? A Case for Multiple Frames, by Mabin & Davies

Vicky Mabin, School of Business and Public Management, Victoria Univ. of Wellington, PO Box 600, Wellington, NZ.

John Davies, School of Business and Public Management, Victoria Univ. of Wellington, PO Box 600, Wellington, NZ.

This paper explores the merits and drawbacks of using a multi-framing approach in problem solving/decision making, by applying it to a mini-case based on a real situation involving/invoking a moral or ethical dilemma. We provide a rationale for such an approach, which is founded on the notions that there is no one correct way to approach most real problems, and that single-frame approaches can induce frame blindness.

The case analysis demonstrates benefits of multi-framing that include building frame awareness, overcoming frame blindness, and understanding the development of perspectives that contribute to more robust and acceptable 'solutions'. We describe our experiences of using framing as a meta-framework for problem solving and illustrate how the approach may be used in the classroom. In doing so, we indicate how the nature and success of problem-solving interventions can be frame dependent.

Optimal Dynamic Auctions for Revenue Management, by Vulcano, van Ryzin & Maglaras

Gustavo Vulcano, Columbia University, Grad. Sch. of Bus., New York, NY 10027.

Garrett J. van Ryzin, Columbia Univ., Grad. Sch. of Bus., 412 Uris Hall, New York, NY 10027.

Constantinos Maglaras, Columbia Univ., 409 Uris Hall, 3022 Broadway, New York, NY 10027.

We consider a revenue management problem in which buyers act strategically and bid for units of a fixed capacity over time. We prove that modified versions of the classic first price (descending or "Dutch" auction) and second price (ascending or "English" auction) mechanisms maximize the seller's revenue. Moreover, we show explicitly how to conduct these optimal auctions. We then compare the optimal mechanisms to traditional list price mechanisms. In several important cases, we show theoretically that the optimal auction is in fact no better than using list prices. In other cases, numerical results show that significant revenue gains are achieved by using an optimal auction mechanism.

*YPP competitor:* Plant Location Modelling for the Concrete and Asphalt Industries, by Clist

Michael Clist, Dept. of Engineering Science, Univ. of Auckland.

The Auckland concrete and asphalt industries currently face substantial change in the supply of their principal ingredient, aggregate rock, as key aggregate quarries have been mined to completion and are scheduled to close. Consequent economic and environmental conditions mean a number of Auckland concrete and asphalt plants are in turn expected to close, with new plants being built in more suitable locations, and possible modifications made to remaining plants for increased capacity. This paper discusses a transportation and blending model used to assist two Auckland companies: Firth Industries, with regard to the location of its ready-mix concrete plants, and Works Infrastructure, for the location of its asphalt plants. The model provides simulation and analysis of the expected costs involved with each of the various scenarios of plant locations currently under consideration by the respective companies, accounting for the raw material, blending and transportation costs. The choice of plant location is shown to impact decisions throughout the industry supply chain, influencing the selection of raw material suppliers, along with the assignment of customer orders to plants for final delivery.

Saleh A. Wasimi, Faculty of Informatics and Communication, Central Queensland Univ., Rockhampton, Queensland 4702

Traditionally, streamflow is estimated from a linear time-invariant basin response function with effective rainfall as input, which in turn is estimated from point rainfall observations using a runoff coefficient. With the advent of powerful computers this concept has been extended to gray-box type conceptual models. In this study the Variable Source Area (VSA) conceptual model has been investigated and its extension by Raper and Kuczera (1991) has been modified to apply to Little River catchment located west of Melbourne, Australia. Model parameters were first optimized using a combination of Gauss-Newton and steepest descent algorithms and later fitted to selected storms by the least squares method.
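The two-stage parameter optimization described above (a Gauss-Newton step, falling back toward a steepest-descent-like damped step when the full step fails) can be sketched on a toy problem. The exponential recession model, starting values, and halving damping scheme below are illustrative assumptions, not the VSA model or the authors' actual algorithm:

```python
import math

def fit_gauss_newton(times, flows, A=1.0, k=0.1, iters=100):
    """Fit q(t) = A*exp(-k*t) to observations by nonlinear least squares.
    Uses a Gauss-Newton step, damped by step halving (a gradient-descent-like
    fallback) whenever the full step would increase the residual sum of squares."""
    def rss(a, b):
        return sum((q - a * math.exp(-b * t)) ** 2 for t, q in zip(times, flows))

    for _ in range(iters):
        # residuals r_i = q_i - A*exp(-k*t_i) and Jacobian rows (dr/dA, dr/dk)
        r, J = [], []
        for t, q in zip(times, flows):
            e = math.exp(-k * t)
            r.append(q - A * e)
            J.append((-e, A * t * e))
        # normal equations (J^T J) d = -J^T r for the two parameters
        g11 = sum(j[0] * j[0] for j in J)
        g12 = sum(j[0] * j[1] for j in J)
        g22 = sum(j[1] * j[1] for j in J)
        b1 = -sum(j[0] * ri for j, ri in zip(J, r))
        b2 = -sum(j[1] * ri for j, ri in zip(J, r))
        det = g11 * g22 - g12 * g12
        if abs(det) < 1e-12:
            break
        dA = (b1 * g22 - b2 * g12) / det
        dk = (g11 * b2 - g12 * b1) / det
        step = 1.0
        while step > 1e-6 and rss(A + step * dA, k + step * dk) > rss(A, k):
            step *= 0.5  # damp: shrink the step until the fit improves
        A += step * dA
        k += step * dk
    return A, k
```

Because the Gauss-Newton direction is always a descent direction for the residual sum of squares, the step-halving fallback guarantees monotone improvement even far from the optimum.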

Radiation Therapy Planning by Multicriteria Optimization, by Ehrgott & Burjony

Matthias Ehrgott, Dept. of Engineering Science, Univ. of Auckland.

Mena Burjony, Dept. of Engineering Science, Univ. of Auckland.

Radiation is one of the major forms of treatment in cancer therapy, alongside chemotherapy and surgery. The determination of a treatment plan is a complex task that involves finding directions of beams, beam intensities and a realization of optimal intensities on the equipment. In this talk we focus on determination of beam intensities. Dose distributions must satisfy the conflicting goals of effectively destroying the tumour while at the same time avoiding dangerous overdosing in surrounding tissue and organs at risk. We present a multicriteria model of the problem and show how to find a starting solution in which deviations from prescribed dose levels are balanced for all organs under consideration. Such a solution can serve as a starting point for the search for a best treatment plan among a precomputed representative set of “efficient” solutions in an on-line database environment.

*YPP competitor:* Relief Staff Rostering for the St John Ambulance Service, by Jasim

Hala Jasim, Dept. of Engineering Science, The University of Auckland.

In an organisation it is important to manage the work periods when staff members are on leave. Within St John, full time staff might take holiday leave or outside training at a known time. The shifts that the full time staff would have covered have to be allocated to other staff. Relief staff is the term given to the staff members who cover the planned leave that is being taken by other staff within the organisation.

A computerised system has been developed that makes use of optimisation techniques that allow for the efficient allocation of staff to roster periods while meeting shift requirements and staff objectives. The rostering problem is modelled as a Set Partitioning Problem, which is a zero-one integer-programming problem. The optimisation methods used to solve this problem are the Revised Simplex Method and Branch and Bound.

A newly developed fatigue model, which predicts the impact of working hours on the fatigue experienced by each staff member, is applied to the optimal solution. The fatigue model will assist in the design of effective rostering schedules that take into account the social, domestic and personal needs of employees.
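The Set Partitioning formulation can be illustrated on a toy instance. Each column is one feasible line of work (a cost and the set of shifts it covers) for one relief staff member, and a solution picks columns so every shift is covered exactly once. The exhaustive search below is only a sketch with hypothetical costs and shifts; the actual system solves the zero-one integer program with the Revised Simplex Method and Branch and Bound:

```python
from itertools import combinations

def solve_set_partitioning(n_shifts, columns):
    """Minimum-cost set partitioning by brute-force enumeration.
    columns: list of (cost, frozenset_of_shift_indices).
    Returns (best_cost, tuple_of_chosen_column_indices)."""
    all_shifts = frozenset(range(n_shifts))
    best_cost, best_pick = None, None
    for r in range(1, len(columns) + 1):
        for pick in combinations(range(len(columns)), r):
            covered = [s for i in pick for s in columns[i][1]]
            # partition constraint: every shift covered exactly once
            if len(covered) == n_shifts and set(covered) == all_shifts:
                cost = sum(columns[i][0] for i in pick)
                if best_cost is None or cost < best_cost:
                    best_cost, best_pick = cost, pick
    return best_cost, best_pick
```

Enumeration is viable only for tiny instances; the point of the LP-based method in the paper is to handle realistic column counts.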

Revenue Management under a General Discrete Choice Model of Demand, by van Ryzin & Talluri

Garrett J. van Ryzin, Columbia Univ., Grad. Sch. of Bus., 412 Uris Hall, New York, NY 10027.

Kalyan Talluri, Univ. Pompeu Fabra, Ramon Trias, Fargas 25-27, Barcelona, 08005, Spain.

We analyze an airline yield management problem on a single flight leg in which the buyers' choice of fare classes is modeled explicitly. The choice model we use is very general and includes a wide range of discrete choice models of practical interest. The optimization problem is to find, at each point in time, the optimal subset of fare classes to offer. We characterize the optimal policy for this problem exactly and show it has a surprisingly simple form. The analysis also provides insights into when so-called "nested allocation policies" (a popular form of control in practice) are optimal.
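The optimization described above (choosing, at each point in time, the best subset of fare classes to offer) can be sketched as a small dynamic program. The multinomial logit choice model, fares, and preference weights below are illustrative assumptions; the paper's result holds for a much more general class of choice models:

```python
from itertools import combinations

def optimal_offer_sets(fares, weights, w0, T, C):
    """Dynamic program over time-to-go t and remaining capacity x.
    Under an assumed multinomial logit model, class j in offer set S
    sells with probability weights[j] / (w0 + sum of weights over S),
    where w0 is the no-purchase weight.  Returns the value table V and
    the optimal offer set for each state."""
    n = len(fares)
    subsets = [s for r in range(n + 1) for s in combinations(range(n), r)]
    V = [[0.0] * (C + 1) for _ in range(T + 1)]
    policy = {}
    for t in range(1, T + 1):
        for x in range(1, C + 1):
            best, best_S = V[t - 1][x], ()  # offering nothing keeps state
            for S in subsets:
                denom = w0 + sum(weights[j] for j in S)
                val = sum(weights[j] / denom * (fares[j] + V[t - 1][x - 1])
                          for j in S)
                val += (w0 / denom) * V[t - 1][x]  # no-purchase branch
                if val > best:
                    best, best_S = val, S
            V[t][x] = best
            policy[(t, x)] = best_S
    return V, policy
```

In this sketch `policy[(t, x)]` is the subset to offer with t periods and x seats remaining; the paper's structural result says these optimal subsets take a simple (nested) form.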

Some Initial Ideas Regarding a Replenishment Decision Involving Partial Postponement, by Silver, Segal, & Minner

Edward A. Silver, Faculty of Management, The Univ. of Calgary, Calgary, Alberta, Canada.

Yannai Segal, Division of Mfg. Engineering, The Univ. of Calgary, Calgary, Alberta, Canada.

Stefan Minner, Faculty of Management, The Univ. of Calgary, Calgary, Alberta, Canada

In this paper we consider the choice of the order quantities for a group of end items, each facing random demand in a period of interest. There is the additional feature of having W units of unfinished stock that can be customized to produce any of the end items once demands materialize. The paper was motivated by a practical situation in the fast food industry. Some interesting analytical results are developed as well as suggestions for dealing with the case where there are a large number of distinct items.
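The role of the W units of unfinished stock can be illustrated with a small Monte-Carlo sketch: after demands materialize, the common units are customized to cover shortfalls. The greedy allocation rule and the shortage measure below are assumptions for illustration, not the paper's analytical results:

```python
import random

def expected_shortage(order, W, demand_sampler, n_sims=2000, seed=1):
    """Monte-Carlo estimate of expected unmet demand.
    order: finished units ordered per end item; W: unfinished common
    units customisable after demands are observed; demand_sampler(rng)
    returns one vector of realised demands."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_sims):
        demands = demand_sampler(rng)
        shortfalls = [max(d - q, 0) for d, q in zip(demands, order)]
        spare = W
        unmet = 0.0
        for s in shortfalls:      # customise common stock greedily
            used = min(s, spare)
            spare -= used
            unmet += s - used
        total += unmet
    return total / n_sims
```

Even this toy version shows the pooling effect: a single stock of W common units hedges against shortfalls in whichever end item turns out to need them.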

*Plenary address*: Some Mathematical Models of Cancer Treatment, by Wein

Lawrence M. Wein, Sloan School of Management, MIT.

This talk will describe several (about five) mathematical problems concerning the analysis, design and/or control of traditional (chemotherapy, radiation, surgery) and novel (gene therapy, angiogenesis inhibitors) cancer therapies. We will omit the mathematical details, and instead focus on the biological aspects of the model formulations and the insights from our analyses.

The Electricity Seller's Psi, by Pritchard, Philpott, Neame, & Zakeri

Geoff Pritchard, Dept. of Statistics, Univ. of Auckland.

Andrew Philpott, Dept. of Engineering Science, Univ. of Auckland.

Philip Neame, Dept. of Engineering Science, Univ. of Auckland.

Golbon Zakeri, Univ. of Auckland.

A producer of electricity, or indeed any other commodity, may be faced with the problem of formulating an "offer curve" representing the quantity that it is willing to supply at any given price. We consider this problem as one of stochastic variational programming - find the offer curve which maximizes expected profit, given stochastic market conditions (demand and competitor behaviour). Sufficient information to solve the problem is contained in the market distribution (or "psi") function, which gives the probability that the market can absorb particular (quantity, price) pairs. We discuss ways to estimate psi functions, using both parametric and non-parametric models. More complex offer-curve optimization problems can arise when offering electricity into a transmission network. In such markets, there is usually a different price at every node. Even if a producer has only one generating plant, it will often have an interest in prices at other nodes, through financial contracts or retail customers located at those nodes. We present some examples of such problems, and a method for solving them.
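A single-step simplification of this problem can be sketched as follows: offer one (quantity, price) pair and maximize expected profit psi(q, p) * (p - cost) * q over a grid. The separable psi function and grid search below are illustrative assumptions; the paper treats full offer curves and both parametric and non-parametric psi estimates:

```python
def best_single_offer(psi, cost, q_grid, p_grid):
    """Grid search for the single (quantity, price) offer maximising
    expected profit psi(q, p) * (p - cost) * q, where psi(q, p) is the
    market distribution function: the probability the market absorbs
    quantity q at price p.  Returns (profit, quantity, price)."""
    best = (0.0, None, None)
    for q in q_grid:
        for p in p_grid:
            profit = psi(q, p) * (p - cost) * q
            if profit > best[0]:
                best = (profit, q, p)
    return best
```

For example, with a hypothetical psi that declines linearly in both price and quantity, the search recovers the trade-off in the abstract: a higher price or larger quantity raises the margin but lowers the probability of being dispatched.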

David Boland, Boland Associates Limited, 4 Hicks Crescent, Waikanae, NZ

Operational research is a process of making decisions, or at least providing insights which lead to decisions, in an objective, scientific way. The primary tool of operational research is the mathematical model.

For operational research to work, everyone concerned must think of the workplace as an object, which can be analysed mathematically, modified, and reshaped. It also requires a strong management structure, so that decisions can be implemented.

But we are moving into a postmodern world in which the workplace is considered to be an organic, living being, and management should be by consensus. Businesses are not mechanical objects. They cannot be manipulated, restructured, or experimented on without damaging their ability to achieve success. There is less confidence now in the hierarchical style of management. It is recognised that all staff should be allowed a share in decision making and everyone has a useful contribution to make.

What will the postmodern business be like? What are the trends? Will there be a place for operational research in it? Is operational research possible in a democratic workplace?

Using Cournot Gaming Models for Deciding Vesting Contracts for an Emerging Electricity Market, by George, Read and Duffy

John A. George, PA Consulting Group, PO Box 1659, Wellington, NZ.

E. Grant Read, Dept. of Management, Univ. of Canterbury, NZ.

Alastair S. Duffy, PA Consulting Group, PO Box 1659, Wellington, NZ.

In oligopoly wholesale electricity markets, there can exist significant potential for market power where generating companies are able to drive up pool prices by withholding capacity or offering it at high prices. One way of curbing market power in such a market is to allocate Vesting Contracts to the generating companies. In this context Vesting Contracts are financial contracts for energy between generating companies and purchasers in the wholesale market, which are imposed (or vested) by the government or regulator upon the industry. This effectively reduces the size of the market that can be gamed. Cournot gaming is a simulation of a wholesale electricity market that focuses on the ability of participants to maximise their revenue by withholding capacity. Our Cournot gaming model finds a Nash equilibrium, where no generating company is able to further benefit from withholding more capacity. By applying contracts to players in the model, it is possible to assess the effectiveness of different levels of Vesting Contracts in curbing market power. We present results from modelling the Singapore wholesale electricity market for both one-way and two-way Vesting Contracts.

*19 Nov 2001, jfr*