ABSTRACTS AND FULL PAPERS

 Title: A Stochastic Programming Approach to the Airline Crew Scheduling Problem
 Authors: Joyce Yen, University of Michigan, Michigan, USA; John R. Birge, Northwestern University, Illinois, USA

Traditional methods model the billion-dollar airline crew scheduling problem as deterministic and do not explicitly include information on potential disruptions. Unfortunately, the effort used to optimize crew schedules and reduce crew costs is often wasted as flight schedules are often disrupted. Consequently, airlines spend a great deal of time and money optimizing crew schedules in the planning phase only to change and readjust the crew schedules in the operational phase in response to disruptions. Disruptions are expensive and lead to loss of time, money, and customer goodwill.

Instead of modeling the crew scheduling problem as deterministic, we consider a stochastic crew scheduling model and devise a solution methodology for integrating disruptions in the evaluation of crew schedules. The goal is to use that information to find robust solutions that better withstand disruptions. Such an approach is important because we can proactively consider the effects of certain scheduling decisions. Identifying more robust schedules minimises cascading delay effects.

In this paper we describe our stochastic integer programming model for the airline crew scheduling problem and offer a branching algorithm to identify expensive flight connections and find alternative solutions. The branching algorithm uses the structure of the problem to branch simultaneously on multiple variables without invalidating the optimality of the algorithm. We present computational results demonstrating the effectiveness of our branching algorithm.

 Authors: Eric Huggins, University of Michigan, USA; Tava Olsen, Washington University, Missouri, USA

We consider a manufacturer that must fill stochastic demand. When shortages occur, the manufacturer may choose to meet the unmet demand with overtime production and/or by premium freight shipments. We derive optimal production policies for regular and overtime production and discuss the tradeoffs involved.
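The overtime/premium-freight trade-off can be sketched with a small simulation. The costs, demand distribution and recourse rule below are invented for illustration, and are much simpler than the optimal policies derived in the paper.

```python
import random

def expected_shortage_cost(q, overtime_frac, c_over, c_freight, demands):
    """Average cost of covering shortages when base production is q.

    A fraction overtime_frac of any shortage is made up with overtime
    (unit cost c_over); the remainder ships by premium freight
    (unit cost c_freight).  All numbers are illustrative assumptions,
    not the authors' model.
    """
    total = 0.0
    for d in demands:
        short = max(0.0, d - q)
        total += overtime_frac * short * c_over + (1 - overtime_frac) * short * c_freight
    return total / len(demands)

random.seed(0)
demands = [random.gauss(100, 15) for _ in range(10_000)]

# With overtime cheaper than premium freight, meeting more of the
# shortage with overtime lowers the expected shortage cost.
cost_all_overtime = expected_shortage_cost(90, 1.0, c_over=4.0, c_freight=7.0, demands=demands)
cost_all_freight = expected_shortage_cost(90, 0.0, c_over=4.0, c_freight=7.0, demands=demands)
print(cost_all_overtime < cost_all_freight)  # True
```

In a fuller model the cheaper recourse would not always win: overtime capacity is limited, which is where the tradeoffs discussed in the paper arise.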

 Title: Automated Large-scale Simulation Experiments in AKAROA-2
 Authors: Don McNickle, Department of Management, University of Canterbury; Krys Pawlikowski and Greg Ewing, Department of Computer Science, University of Canterbury

The availability of cheap under-utilised networked computing resources raises new possibilities for large-scale simulation experiments, especially if the analysis and control of these experiments can be automated, so as to ensure that novice simulation practitioners are guaranteed credible results. AKAROA-2 is a Unix-based simulation controller which can launch multiple independent replications of simulation programs, and control and analyse the results. This simulation scenario raises new requirements for output analysis, especially if the procedure is to be largely automated.

 Title: A Comparison of Two Reference Point Methods in Multiple Objective Mathematical Programming
 Authors: John Buchanan, Department of Management Systems, University of Waikato; Lorraine Gardiner, College of Business, Auburn University, Alabama, USA

When making decisions with multiple criteria, a decision maker often thinks in terms of an aspiration point or levels of achievement for the criteria. In multiple objective mathematical programming, solution methods based on aspiration points can generate nondominated solutions using a variety of scalarizing functions. These reference point solution methods commonly use a scalarizing function that reaches down from the ideal solution, in a direction specified by the aspiration point. Conversely, a similar scalarizing function can push out from the nadir point toward a specified aspiration point. In this paper we explore the structure of these two reference point approaches, examine the discrepancy between resulting solutions and consider the effect of problem framing on decision maker behavior. We discuss the results of a practical experiment that investigates whether decision makers perceive any difference in practice between the two approaches.
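The two scalarizing directions can be illustrated on a toy bi-objective minimisation problem. The achievement functions below are simplified sketches of the reference point idea, not the authors' exact formulations, and all data are invented; note that the two directions can select different nondominated points for the same aspiration point.

```python
def chebyshev_from_ideal(f, ideal, aspiration):
    # Reaching down from the ideal point in the direction of the
    # aspiration point: smaller is better.
    return max((fi - zi) / (ai - zi) for fi, zi, ai in zip(f, ideal, aspiration))

def chebyshev_from_nadir(f, nadir, aspiration):
    # Pushing out from the nadir point toward the aspiration point:
    # larger is better.
    return min((ni - fi) / (ni - ai) for fi, ni, ai in zip(f, nadir, aspiration))

# Three nondominated points of a bi-objective minimisation problem
# (purely illustrative data).
points = [(1.0, 9.0), (4.0, 4.0), (9.0, 1.0)]
ideal, nadir = (1.0, 1.0), (9.0, 9.0)
aspiration = (2.0, 6.0)

best_down = min(points, key=lambda f: chebyshev_from_ideal(f, ideal, aspiration))
best_up = max(points, key=lambda f: chebyshev_from_nadir(f, nadir, aspiration))
print(best_down, best_up)
```

Here the ideal-based function selects (1.0, 9.0) while the nadir-based function selects (4.0, 4.0), a small instance of the discrepancy the paper examines.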

 Author: Ross James, Department of Management, University of Canterbury

Search heuristics, such as Tabu Search and Simulated Annealing, start from a single solution and incrementally change it in order to find better solutions. Given a particular problem, there are a large number of possible ways of formulating a search in order to solve it. The choice of search space, neighbourhood scheme, search heuristic and its many possible additions and enhancements are all elements of a search formulation. This paper identifies some of the key relationships between these different elements and how these could impact the overall performance of the search. The framework developed from these relationships helps to classify different search heuristic modifications and identifies areas where relatively little research has been done.

 Title: Identification of Resource Needs for Learners with Vision Impairment: Using DEA in a Mixed Method Approach
 Author: Nicola Ward Petty, Department of Management, University of Canterbury

Identification of resource needs for learners with vision impairment is one of a class of problems that occurs in areas such as social work, special education, rehabilitation and health. The problem is to identify needs and allocate scarce resources in an efficient and equitable way. A mixed method approach is developed, using qualitative interviewing to inform the development of a DEA model. The use of DEA on data relating to individuals, rather than organisational groups, is illustrated, with parallels drawn from similar studies in the education and health sectors.

 Title: Bonus Incentive Schemes in the Workplace: Their Effect on Operational Research Projects
 Author: David Boland, Boland Associates Limited, Waikanae

Traditionally, bonus schemes were applied to factory workers, but they gradually fell out of fashion. Times have changed, however: the trend now is to say that no staff will perform well unless they are given the incentive of a bonus scheme. In many businesses bonus schemes are applied to white collar and creative workers, specialists, managers and even the most senior managers.

The many different types of bonus scheme have three features in common:

• the intention of the scheme will eventually be subverted, because people do what they can to get the most benefit for themselves, even to the detriment of the business;
• all bonus schemes degenerate as time goes on, and unless they are upgraded will collapse into disputes and cheating;
• incompetent management is made worse, not better, by bonus schemes.

No matter how good an operational research project is, it can be rendered totally ineffective by an ill-conceived bonus scheme, so we shall explore types of schemes, how they work, why they fail, what to avoid, and how they affect operational research projects.

 Author: Mark Stewart, Department of Management, University of Canterbury

Communications services, such as those provided by telecommunications and the Internet, have become an important component in society. The significance of these services means that it is important to investigate how best to provide them, and hence be able to meet the ever increasing level of demand. Significant research has been conducted on determining appropriate routing systems. However, less attention has been given to determining the processing ability needed to meet the demand (higher-level capacity investment decisions), or how to best use the processing capacity given (lower-level operational decisions). A framework is discussed for these operational decisions in the Internet setting. This includes factors such as processing and routing congestion, demand, and node and arc reliability. A mixed-integer program is then presented for the lower level operational processing decisions.

 Author: Stephen R. J. Batstone, Department of Management, University of Canterbury

Electricity generators in most deregulated markets simultaneously operate in both financial (contract) and physical (spot) markets. Decisions in each of these markets are not mutually exclusive, and in the case of imperfectly competitive scenarios generating companies can use their market power to influence spot and contract prices. A model of oligopolistic market equilibrium is presented where the relationship between physical and financial markets is based on the impact spot prices and the variance of spot prices have on contract demand. Risk averse consumers are assumed to maximise a mean-variance utility in purchasing contracts, while risk neutral generators face uncertain input costs and compete with Cournot conjectures in the physical market. Results are presented that show incentives exist for generators to amplify cost variations/uncertainties in their quantity bids, in order to extract higher contract prices out of consumers.
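The Cournot building block of such a model can be sketched as a simple best-response iteration. The linear inverse demand curve and costs below are invented; the paper's model layers contract demand and cost uncertainty on top of this basic structure.

```python
def cournot_equilibrium(a, b, c1, c2, iters=200):
    """Fixed-point iteration on best responses for a two-firm Cournot
    game with inverse demand p = a - b*(q1 + q2) and constant marginal
    costs c1, c2.  Illustrative only: the best response of firm i to
    the other firm's quantity q_j is (a - c_i - b*q_j) / (2*b).
    """
    q1 = q2 = 0.0
    for _ in range(iters):
        q1 = max(0.0, (a - c1 - b * q2) / (2 * b))
        q2 = max(0.0, (a - c2 - b * q1) / (2 * b))
    price = a - b * (q1 + q2)
    return q1, q2, price

q1, q2, p = cournot_equilibrium(a=100.0, b=1.0, c1=10.0, c2=10.0)
# Symmetric closed form: q_i = (a - c) / (3b) = 30, p = 40.
print(round(q1, 4), round(q2, 4), round(p, 4))
```

The iteration converges to the well-known symmetric equilibrium, which the closed form confirms.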

 Authors: Shane Dye, Department of Management, University of Canterbury; Asgeir Tomasgard, SINTEF, Trondheim, Norway; Stein W. Wallace, NTNU, Trondheim, Norway

Benders decomposition is often used to solve two-stage stochastic programming problems. The problem is transformed into a master problem and many subproblems with similar structure. For a two-stage problem with integer variables in the first stage, branch and bound may be used to resolve the integrality requirements. This involves solving many relaxations of the problem. Using Benders decomposition to solve these relaxations means there are a great many structurally similar subproblems to solve.

This paper considers such problems where the subproblems have a network or transportation problem structure. Ways of exploiting the structure of the subproblems in the combination of branch and bound with Benders decomposition are investigated.
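As a minimal illustration of the master/subproblem interaction, the following sketch runs a Benders loop on a toy two-stage problem with one binary first-stage variable. The subproblem is solved in closed form and the master by enumeration rather than branch and bound; all data are invented.

```python
def benders_small(c=5.0, K=10.0, d=7.0, q=2.0, max_iters=10):
    """Toy Benders loop for: min c*x + Q(x), x in {0, 1},
    where Q(x) = min { q*y : y >= d - K*x, y >= 0 }.

    The subproblem dual is max { pi*(d - K*x) : 0 <= pi <= q }, so each
    iteration adds the optimality cut  theta >= pi*(d - K*x).
    Illustrative numbers; with one binary variable the master is solved
    by enumeration.
    """
    cuts = []                     # dual multipliers pi from past iterations
    best_ub = float("inf")
    for _ in range(max_iters):
        # Master: pick (x, theta) satisfying all cuts, minimising c*x + theta.
        candidates = []
        for x in (0, 1):
            theta = max([0.0] + [pi * (d - K * x) for pi in cuts])
            candidates.append((c * x + theta, x))
        lb, x = min(candidates)
        # Subproblem at x: closed-form primal value and dual multiplier.
        shortfall = max(0.0, d - K * x)
        sub_val = q * shortfall
        pi = q if shortfall > 0 else 0.0
        best_ub = min(best_ub, c * x + sub_val)
        if best_ub - lb < 1e-9:   # bounds have met: optimal
            return x, best_ub
        cuts.append(pi)
    return x, best_ub

print(benders_small())  # (1, 5.0): build capacity, no recourse cost
```

Two iterations suffice here; the first cut prices the shortfall at x = 0, which pushes the master to x = 1.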

 Author: John F. Raffensperger, Department of Management, University of Canterbury

Wool prices depend on the quality of the wool. The quality of wool depends on a variety of factors, a key factor being the diameter of the wool fibres. Thinner fibres feel more pleasant to the touch, and are more highly valued. Merino NZ's best and thinnest wool commands prices in the tens of thousands of dollars per kilogram!

NZ PAC, a subsidiary of MNZ, classifies every fleece of Merino wool for each grower. NZ PAC sorts each grower's wool into 16 bins on a mechanical sorting machine. The problem is to choose attributes for each bin to maximise the return for each grower. Usually, small quantities of wool with different diameters must be combined into a single lot to satisfy a minimum lot size constraint. The price of the lot depends on the average diameter of the wool in the lot. Fleeces can be combined into lots in different ways, resulting in different average diameters and quality, and therefore different market prices. This paper presents models for assigning attributes to bins to maximise revenue. The first model is a non-linear assignment problem with blending constraints. The second is a large integer program with blending constraints. We expect this work to be implemented by NZ PAC, resulting in increased revenue for growers.
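The central blending trade-off can be sketched in a few lines: the lot price depends on the weight-averaged fibre diameter, so combining a fine fleece with a coarse one to reach the minimum lot size may destroy value. The price curve, fleece data and brute-force search below are invented for illustration, and are far simpler than the bin-assignment models in the paper.

```python
from itertools import combinations

def lot_price(avg_micron):
    # Hypothetical price curve per kg: thinner (lower micron) wool is
    # worth more.  Real Merino price schedules are steeper and nonlinear.
    return max(0.0, 300.0 - 18.0 * avg_micron)

def best_lot(fleeces, min_kg):
    """Pick the revenue-maximising combination of fleeces meeting a
    minimum lot size.  Each fleece is (weight_kg, micron).  Brute force
    over subsets: a sketch of the blending trade-off, not NZ PAC's
    bin-assignment model."""
    best = (0.0, None)
    for r in range(1, len(fleeces) + 1):
        for combo in combinations(fleeces, r):
            kg = sum(w for w, _ in combo)
            if kg < min_kg:
                continue
            avg = sum(w * m for w, m in combo) / kg  # weight-averaged diameter
            revenue = kg * lot_price(avg)
            if revenue > best[0]:
                best = (revenue, combo)
    return best

fleeces = [(40.0, 15.0), (35.0, 16.0), (50.0, 18.0)]
revenue, combo = best_lot(fleeces, min_kg=70.0)
print(revenue, combo)
```

With this (invented) price curve the best lot combines only the two fine fleeces: adding the 18-micron fleece raises the lot weight but dilutes the average diameter enough to lower total revenue.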

 Title: The Must-run Auctions in an Electricity Market
 Author: Geoffrey Pritchard, Department of Statistics, University of Auckland

We consider a wholesale spot market for electricity, of the type operated in New Zealand, Australia, and some parts of the UK and USA. In these markets there are circumstances in which generators may wish to offer energy at negative prices, e.g. to avoid being shut down for a short period. Such behaviour creates some severe computational difficulties. The "must-run auction" is a system used to handle such cases in New Zealand; we consider its properties under stochastic demands.

 Title: Feeding Strategies for Maximising Gross Margin in Pig Production
 Authors: David Alexander, Institute of Information Sciences and Technology, Massey University; Patrick Morel, Institute of Food, Nutrition and Human Health, Massey University; Graham Wood, Institute of Information Sciences and Technology, Massey University

Pig farming is, by weight, the largest meat industry in the world. This paper reports on the successful application of stochastic optimisation algorithms to the problem of choosing feeding strategies to maximise gross margin. A pig growth model and a combination of optimisation techniques allow the composition and quantity of feed to be optimised so as to maximise the return to the farmer. The nature of the objective function is examined, providing a theoretical explanation of the observed level of superiority of optimisation algorithms over pure random search.

 Authors: Alastair McNaughton and Simon Leong, Division of Science and Technology, Tamaki Campus, University of Auckland

The practical aspects involved in the construction of a schedule for the delivery of petrol from a bulk supply depot to a network of service stations are discussed. An optimisation model will be presented which incorporates the use of multiple trucks of varying capacity, time windows for delivery, and multiple days or shifts. A solution algorithm will be outlined. Some examples of performance levels attained will be given. Directions for on-going research will be discussed.

 Title: Optimal Offer Stacks for Hydroelectric Generators in Electricity Markets
 Authors: Phil Neame, Andy Philpott, Geoff Pritchard and Golbon Zakeri, University of Auckland

We look to extend some results on offering strategies in an electricity market to the case of a hydroelectric generator, whose costs are dominated by the marginal cost of using water now instead of having it available for later use. In particular, we consider the case of a generation company controlling a number of dams on a river chain. We consider how to extend value of water functions to many dams, as well as considering the effect of the choice of value of water function at the time horizon.

 Title: IPManager: A Microcomputer-based DSS for Intellectual Property Management
 Authors: Chuda Basnet and Les Foulds, Department of Management Systems, University of Waikato; Warren Parker, AgResearch, Ruakura Research Centre, Hamilton

We describe a menu-driven decision support system (DSS) that can be used to manage the costs associated with the development of an organization's intellectual property (IP). IPManager is a DSS developed for use by managers of IP and is designed to aid them in creating or improving IP registration and maintenance strategies, using their experience and preferences. The DSS uses multiple, resizable, overlapping windows to assist the managers in their tasks. The system can be used to analyze a wide variety of "what-if?" scenarios with potential cost impacts. The system is written in Microsoft Access, within the Microsoft Windows environment. A case study is reported, involving the successful use of the system in the environment that motivated its development, namely for the management of intellectual property at AgResearch, Hamilton, New Zealand. The system provides a useful addition to the intellectual property manager's toolkit and an interesting application of the DSS approach in an important area.

 Title: Maui Gas Outage - A Case Study in Energy Demand Side Response
 Author: Graeme Everett, Norske Skog Tasman

In May 2000 the Maui Gas field was taken out of service for three days in order to replace a major valve in the pipeline. This had a large effect on the electricity market in the North Island, removing 900 MW of gas-fired generation. The Norske Skog Tasman pulp and paper mill is a major user of electricity, and wished to understand the possibility of high prices whilst the Maui gas was unavailable, so that options to minimise additional costs could be considered.

Tasman developed a small version of SPD, the New Zealand Electricity Market pricing software. This model was used to develop a number of possible scenarios for energy and reserve prices. A model of the major mill processes was developed which allowed planners to find the best operational strategy for each price scenario, and strategies that were optimal for all scenarios.
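The idea of a strategy that performs well across all price scenarios can be sketched as a min-max choice. The strategy names, energy usages, penalties and price scenarios below are invented for illustration.

```python
def robust_choice(strategies, scenarios, cost):
    """Pick the strategy whose worst-case cost over the price scenarios
    is smallest (min-max).  A sketch of choosing strategies that hold up
    under every scenario; names and numbers are invented, not the mill's
    actual options."""
    return min(strategies, key=lambda s: max(cost(s, p) for p in scenarios))

# (energy use in MWh, production-loss penalty in $) for three
# hypothetical operating strategies.
strategies = {
    "run_full": (100.0, 0.0),
    "curtail_partial": (60.0, 2000.0),
    "shut_down": (0.0, 25000.0),
}
price_scenarios = [40.0, 120.0, 300.0]  # $/MWh

def cost(name, price):
    mwh, penalty = strategies[name]
    return mwh * price + penalty

best = robust_choice(strategies, price_scenarios, cost)
print(best)
```

Here partial curtailment has the smallest worst-case cost: running flat out is exposed to the high-price scenario, while shutting down pays its full production penalty in every scenario.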

Production was re-scheduled, and as many contingencies as possible were put in place prior to the outage. During the outage itself, special attention was paid to dispatch prices as they were published. Decision making was simplified as a result of the prior modelling work.

In this paper we discuss the design of the models and how they were used to develop operational strategies.

 Authors: Graeme Everett, Norske Skog Tasman; Andy Philpott, Department of Engineering Science, University of Auckland; Gordon Cook, Fletcher Challenge Canada Limited

Fletcher Challenge Canada Limited (FCCL) operates three pulp and paper mills in British Columbia. FCCL has traditionally been a newsprint producer and has kept pace with increasingly stringent quality requirements by continually rebuilding existing paper machines and ancillary plant. The company wished to take a longer term view and consider other options, including converting machines to different grades of paper.

Because of the complexity associated with the large number of possible options available, the company decided to develop an optimisation model to assist with this strategic decision making. Market forecasts, capital requirements, production and other pertinent data were collated for a ten year planning horizon.

Initially this model proved to be extremely difficult to solve. Based on knowledge of the business a number of extra constraints were added that improved its performance and allowed optimal solutions to be obtained. The model has proved to be successful in challenging entrenched views within FCCL regarding strategic direction, and stimulating wide ranging thought and discussion.

In this paper we describe the mathematical formulation of this model in some detail.

 Title: Market Analysis Illustrated by the Mandelbrot Model
 Author: Hiroshi Kumakura, Tokyu Agency Co. Ltd and Graduate School of Decision Science and Technology, Tokyo Institute of Technology, Tokyo, Japan

We consider the global order of the packaged goods market. First, using home scanning panel data obtained from 13 markets, we show that there is global order in the packaged goods market: the relationship between the ranking and the size of a product obeys the Mandelbrot model, s(r) = b(r + k)^(-a), where r is the ranking of product sales and s(r) is the purchasing dollar volume per 100 households of the r-th ranked product. This means that markets are characterised by both the rate of new entry of products into a market (shown by parameter a) and the expected growth rate of products (shown by parameter k). Next, based on this we classify markets into four types, namely:

• the Concentrated Market: market share is concentrated,
• the Dispersed Market: market share is dispersed,
• the Stagnant Market: market share is not dispersed, because new products rarely emerge, yet the higher-ranked products are not attractive,
• the Growth Market: market share is not concentrated, because new products emerge vigorously, yet the higher-ranked products are attractive.

Then, we discuss the character of these markets by interpreting concrete cases from a cooked curry market and a heavy-duty detergent market.
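The Mandelbrot relationship above is easy to explore numerically. The parameter values below are invented, but they show how the exponent controls the concentration of market share across ranks, the axis along which the four market types above are distinguished.

```python
def mandelbrot_share(r, a, b, k):
    # s(r) = b * (r + k) ** (-a): sales of the r-th ranked product.
    return b * (r + k) ** (-a)

def top_n_share(n, a, b, k, n_products=50):
    """Fraction of total sales captured by the top n of n_products
    ranked products under the Mandelbrot model."""
    sales = [mandelbrot_share(r, a, b, k) for r in range(1, n_products + 1)]
    return sum(sales[:n]) / sum(sales)

# A larger exponent a concentrates share in the top-ranked products
# (a 'Concentrated Market'); a small a disperses it.  Parameter values
# are illustrative, not fitted to the panel data.
concentrated = top_n_share(5, a=1.5, b=100.0, k=0.5)
dispersed = top_n_share(5, a=0.3, b=100.0, k=0.5)
print(concentrated > dispersed)  # True
```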

 Title: Bicriteria Robustness versus Cost Optimisation in Tour of Duty Planning at Air New Zealand
 Authors: Matthias Ehrgott and David M. Ryan, Department of Engineering Science, University of Auckland

Current optimisation-based computer systems, which solve all aspects of both the planning and rostering processes for national and international airline operations, allow the construction of minimal-cost tours of duty and rosters. However, today major airlines require not only cost-effective solutions, but also robust ones. A more robust solution is understood to be one in which disruptions in the schedule (due to delays) are less likely to propagate into the future, causing delays of subsequent flights. Current scheduling systems based solely on cost do not automatically provide robust solutions.

These considerations lead to a multiobjective framework, as the maximisation of robustness will be in conflict with the minimisation of cost: for example, a crew changing aircraft between duties is discouraged if inadequate ground time is provided. We develop a bicriteria optimisation framework to generate "efficient" schedules for the domestic airline. An efficient schedule is one which does not allow an improvement in cost and robustness at the same time.

In this talk, we present preliminary results on the incorporation of a robustness objective in the optimisation process. We also outline future research to be undertaken in this area.
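The notion of an efficient schedule can be sketched as a simple dominance filter over (cost, robustness penalty) pairs; the candidate schedules below are invented for illustration.

```python
def efficient(schedules):
    """Filter schedules to those not dominated in (cost, delay_risk):
    a schedule is inefficient if another is at least as good on both
    criteria and strictly better on one.  A toy version of the paper's
    notion of an 'efficient' schedule; data are invented."""
    def dominates(a, b):
        return a[0] <= b[0] and a[1] <= b[1] and a != b
    return [s for s in schedules if not any(dominates(t, s) for t in schedules)]

# (crew cost, robustness penalty) for four candidate schedules.
schedules = [(100.0, 8.0), (110.0, 5.0), (130.0, 2.0), (120.0, 6.0)]
print(efficient(schedules))
```

The fourth schedule is dominated: another schedule is both cheaper and more robust, so no decision maker trading off these two criteria would choose it.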

 Title: Types of Market Research and its Usefulness: An Empirical Study
 Authors: Ganeshasundaram Raguragavan, Tony Lewis and Zane Kearns, Department of Marketing, Massey University

Much of the literature on market research begins with the premise that information plays a critical role in the success or failure of organisations. However, information acquired by decision makers will have little impact on company performance if it is not actually put to use in the making of decisions. Clearly, the type of information available, and the use to which it is put, are the deciding features in the value of information. However, with very few exceptions, the literature has tended to focus on the how and what of gathering market information, rather than on the usefulness of such information, once it has been acquired.

Kohli and Jaworski (1990) write that if market intelligence and the information generated by marketing research are to play a critical role in a firm's quest to become more market oriented, relevant information must be produced and disseminated to the various departments in the most appropriate form to enhance its use. Moreover, there is likely to be "bad" research and "good" research, though this distinction carries within it the seeds of circular reasoning. Hart and Diamantopoulos (1993) found that the level of use of marketing research had no apparent effect on organisations' performance. They argued that what really matters is not what information is collected or from where it is obtained; the crucial questions are how good the information gathered is, and how effectively it is used. Hart and Diamantopoulos (1993) did not attempt to explore possible explanations for their finding of an insignificant relationship between research activity and success. It is probably the case that different types of marketing research have different effects.

The literature on research use has revealed the existence of two key dimensions in the evaluation of research: namely, instrumental and conceptual. Instrumental use or what in this paper is termed 'decision research' has been defined as the direct application of research findings and conclusions to solve a specific problem, or to make a particular decision (Deshpande and Zaltman, 1982). Conceptual use or 'background research' information refers to the indirect application of information, in the sense that information is used to broaden the managerial knowledge base without serving any one particular project (Moorman, 1995).

This paper reports on part of a large-scale study investigating the relationship between the type of marketing research on the one hand, and company performance on the other. We specifically explore the relationship between the purpose for which research has been commissioned, and managers' perceptions of the usefulness of the information provided by the research. The study extends the univariate findings reported in an earlier conference paper by using multivariate analysis. The data were collected by personal interviews and a mail questionnaire. The results, from 34 New Zealand organisations and 775 research projects, suggest that if the research is conducted with the specific purpose of helping to make a decision, it is evaluated more favourably than if it is conducted to provide background information. Background research predominates over decision research as the research activity of choice, yet is regarded as less useful. A possible explanation for this predominance is that it is the prevailing convention among research companies and marketers. The results of our research suggest that the emphasis should be changed.

 Title: Cell Batching Optimisation for the New Zealand Aluminium Smelter
 Author: Thorsten Piehl, Department of Engineering Science, University of Auckland

New Zealand Aluminium Smelter (NZAS) at Tiwai Point produces aluminium in tapping bays of 48 to 51 electrolysis cells. Three cells are tapped into one batch, and a tapping bay consists of up to 17 batches. The purity of the aluminium produced differs from cell to cell, and the premium of a batch increases more than linearly with increasing purity. The objective of the cell batching process is to maximise the overall premium of a tapping bay by choosing the optimal combination of batches.

The cell batching process is modelled as a Set Partitioning Model. This model is solved using the Revised Simplex Method and Branch-and-Bound (B&B) with a constraint branching approach.

This approach has difficulty solving the smelter problem to optimality, because both the LP-IP gap and the B&B tree that has to be explored are large.

Improvements of the existing approach for dealing with the LP-IP gap and the tree size have been implemented.

This paper outlines the improvement of the software currently used for the cell batching process of NZAS.
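The flavour of the batching optimisation can be conveyed by a miniature brute-force version. The purity data, the minimum-purity batch rule and the squared premium below are invented simplifications of the smelter's actual premium schedule; a real instance (up to 17 batches from about 50 cells) needs the set partitioning approach described above rather than enumeration.

```python
from itertools import combinations

def batch_premium(purities):
    # Assume batch purity is the minimum cell purity (impurities
    # dominate) and the premium grows faster than linearly with purity.
    return min(purities) ** 2

def best_batching(cells):
    """Brute-force the best partition of six cells into two batches of
    three, maximising total premium: a miniature of the set partitioning
    model for cell batching."""
    ids = set(range(len(cells)))
    best = (float("-inf"), None)
    for first in combinations(sorted(ids), 3):
        if 0 not in first:   # fix cell 0 in the first batch so each
            continue         # partition is counted once
        second = tuple(sorted(ids - set(first)))
        value = (batch_premium([cells[i] for i in first])
                 + batch_premium([cells[i] for i in second]))
        if value > best[0]:
            best = (value, (first, second))
    return best

purity = [99.90, 99.10, 99.85, 99.20, 99.88, 99.15]
value, (b1, b2) = best_batching(purity)
print(b1, b2)
```

Because the premium is convex in purity, the best partition groups the three purest cells together rather than spreading them across batches.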

 Title: BartSim: A Tool for Analysing and Improving Ambulance Performance in Auckland, New Zealand
 Authors: Shane G. Henderson, Industrial and Operations Engineering Department, University of Michigan, Michigan, USA; Andrew J. Mason, Department of Engineering Science, University of Auckland

The problem of determining where ambulances should be stationed, and at what times they operate, has received a great deal of attention in the literature. The trade-off driving this work is between additional resources to improve response times to emergency calls, and the cost of those additional resources. We discuss a simulation and analysis software tool BartSim that has been developed by the authors as a strategic decision support tool for use within the St John Ambulance Service (Auckland Region) in New Zealand (St Johns). The novel features incorporated within this study include the use of a detailed time-varying travel model for modelling travel times in the simulation, methods for reducing the computational overhead associated with computing time-dependent shortest paths in the travel model, and the development of a geographical information sub-system (GIS) within BartSim that provides spatial data visualisation for exploring both historical data and the results of simulations. Our experience with St Johns, and discussions with emergency operators in Australia and Europe, suggest that the emergency services do not have good tools available to support their operations management at the strategic level. Our experience has shown that a customised system such as BartSim can successfully combine GIS and simulation approaches to provide a quantitative decision support tool highly valued by managers.
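The time-dependent travel model idea can be sketched as a label-setting shortest path search in which arc travel times depend on the departure clock time. The network and peak profile below are invented, and the sketch assumes FIFO arcs (leaving later never means arriving earlier), under which a Dijkstra-style search remains valid; it is not BartSim's implementation.

```python
import heapq

def td_shortest_path(graph, source, target, depart):
    """Time-dependent Dijkstra: graph[u] is a list of (v, travel_fn)
    where travel_fn(t) gives the travel time when leaving u at clock
    time t.  Returns the earliest arrival time at target."""
    arrival = {source: depart}
    heap = [(depart, source)]
    while heap:
        t, u = heapq.heappop(heap)
        if u == target:
            return t
        if t > arrival.get(u, float("inf")):
            continue  # stale heap entry
        for v, travel in graph.get(u, []):
            t_v = t + travel(t)
            if t_v < arrival.get(v, float("inf")):
                arrival[v] = t_v
                heapq.heappush(heap, (t_v, v))
    return float("inf")

# Travel on arc ('a', 'b') is slower during a morning peak
# (t is in minutes of the day; all numbers are invented).
def peak(t):
    return 10.0 if 420 <= t < 540 else 4.0

graph = {
    "s": [("a", lambda t: 5.0)],
    "a": [("b", peak)],
    "b": [("h", lambda t: 3.0)],
}
print(td_shortest_path(graph, "s", "h", depart=430.0))  # in the peak: 448.0
print(td_shortest_path(graph, "s", "h", depart=600.0))  # off peak: 612.0
```

The same route takes longer when the departure falls in the peak window, which is exactly the effect a time-varying travel model captures and a static one misses.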

 Title: Optimal Sailing Routes with Uncertain Weather
 Authors: Toby Allsopp, Andrew J. Mason and Andy Philpott, Department of Engineering Science, University of Auckland

We consider the problem of finding a route that minimises the expected sailing time between two points on the ocean under uncertain weather conditions. This has applications in long distance offshore yacht racing. The uncertainty in the weather is modelled by a branching scenario tree that captures the serial correlation inherent in the evolution of weather systems over time. The stochastic solution method extends a deterministic dynamic programming approach to include the weather scenario as a state variable, yielding a stochastic dynamic programming algorithm. Careful attention to implementation details yields an approach that optimises under uncertainty while maintaining acceptable solution times on a PC. This paper summarises the work presented in Allsopp (1998).
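The stochastic dynamic programming recursion can be sketched on a tiny scenario tree. The lane-switching cost, leg times and weather probabilities below are invented, but they show why the weather scenario must enter the state: the best course now depends on where the boat will want to be under future weather.

```python
def sail_dp(node, lane, leg_time, switch_cost=1.0):
    """Minimum expected remaining sailing time from a scenario-tree node
    when the yacht is currently in lane ('north' or 'south').

    node = (weather, [(prob, child_node), ...]).  Changing lanes costs
    extra time, so the decision couples stages; this is a toy of the
    paper's recursion, with lane standing in for continuous position.
    """
    weather, children = node
    best = float("inf")
    for nxt in ("north", "south"):
        t = (switch_cost if nxt != lane else 0.0) + leg_time[weather][nxt]
        if children:
            t += sum(p * sail_dp(child, nxt, leg_time, switch_cost)
                     for p, child in children)
        best = min(best, t)
    return best

# Invented leg times (hours) by weather and lane.
leg_time = {"calm": {"north": 5.0, "south": 6.0},
            "storm": {"north": 9.0, "south": 6.5}}

# Two-leg race: calm now, then storm with probability 0.8.
tree = ("calm", [(0.8, ("storm", [])), (0.2, ("calm", []))])
print(sail_dp(tree, "north", leg_time))  # 12.0
```

Here the optimal policy stays north in the calm leg and only switches south if the storm actually arrives; a deterministic plan built for the expected weather could not express that recourse.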

 Title: On an Application of Pedigree Approach to Symmetric Traveling Salesman Problem
 Author: Tiru Arthanari, Department of Engineering Science, University of Auckland

Arthanari (1982) gave an alternative formulation for the symmetric traveling salesman problem (STSP), based on multi-stage insertion decisions. In a recent paper, Arthanari and Usha (2000) study some of the properties of this formulation. The classical STSP polytope is generally studied as embedded in the standard subtour elimination polytope (SEP). Boyd and Pulleyblank (1991) study the structure of the vertices of SEP and identify two classes of fractional vertices to show how complex they can be. Arthanari (1999) introduced objects called pedigrees, which are in one-to-one correspondence with tours. The convex hull of these pedigrees yields a new polytope, called the pedigree polytope. In this paper, we apply some of the necessary and sufficient conditions for membership in the pedigree polytope developed in Arthanari (1999) to show that some vertices of SEP identified by Boyd and Pulleyblank (1991) are not in the corresponding STSP polytope.

 Author: Simon A. Brandon, PA Consulting Group, Wellington

 Title: A Gibbs Sampler Approach to Estimate the Number of Faults in a System Using Capture-recapture Sampling
 Authors: Yu Hayakawa, Victoria University of Wellington; Paul S.F. Yip, The University of Hong Kong, Hong Kong, China

A new recapture debugging model is suggested to estimate the number of faults in a system, $\nu$, and the failure intensity of each fault, $\phi$. The Gibbs sampler and the Metropolis algorithm are utilised in the proposed inference procedure. A numerical illustration suggests a notable improvement on the estimation of $\nu$ and $\phi$ compared with that of a removal debugging model.
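The capture-recapture idea behind the model can be illustrated with the classical two-sample (Lincoln-Petersen) estimator. This frequentist sketch, with made-up fault IDs, is far simpler than the Bayesian Gibbs-sampler procedure of the paper; it only shows why recaptures carry information about the total number of faults.

```python
def lincoln_petersen(first_caught, second_caught):
    """Classical two-sample capture-recapture estimate of the number of
    faults: nu_hat = n1 * n2 / m, where n1 and n2 are the faults seen in
    each debugging pass and m is the number seen in both.  Shown only to
    illustrate the capture-recapture idea, not the authors' inference
    procedure."""
    n1, n2 = len(first_caught), len(second_caught)
    m = len(set(first_caught) & set(second_caught))
    if m == 0:
        raise ValueError("no recaptures; the estimate is undefined")
    return n1 * n2 / m

# Fault IDs detected in two independent test passes (made-up data).
pass1 = {1, 2, 3, 4, 5, 6, 7, 8}
pass2 = {5, 6, 7, 8, 9, 10}
print(lincoln_petersen(pass1, pass2))  # 8 * 6 / 4 = 12.0
```

The fewer faults the two passes share, the larger the estimated pool of undiscovered faults; the paper's model extends this logic to repeated recaptures and per-fault failure intensities.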

 Title: A Dynamic Model for Structuring Decision Problems
 Authors: Jim Corner and John Buchanan, Department of Management Systems, University of Waikato; Mordecai Henig, Tel Aviv University, Tel Aviv, Israel

This paper develops a new model of decision problem structuring which synthesises a number of models and approaches cited in the decision making literature in general and the multi-criteria literature in particular. The model advocates a dynamic interaction between criteria and alternatives as a decision maker understands his preferences and expands the set of alternatives. This model endeavours to bridge the gap between prescriptive and descriptive approaches. It is prescriptive in its orientation, recommending an approach based on earlier prescriptive work. The model, however, is also validated empirically, based on the descriptive decision making literature and reported case studies of actual decision making.

 Title: Flight Schedule Optimisation for Air New Zealand's International Fleet
 Author: Rochelle Meehan, Air New Zealand

Air New Zealand had done very little work on aircraft scheduling using optimisation techniques until last year, when a basic optimisation model was developed to allocate flights and maintenance to aircraft over a period of time. This project implements several extensions to this model. Departure time windows were incorporated to allow flexibility within the flight schedule, and each individual sector was allocated a priority. These priorities were then used to reduce the number of aircraft in the fleet. A linear programming (LP) relaxation and branch-and-bound approach was used to solve this model as a generalised set-partitioning problem. This approach led to the efficient construction of a good-quality legal solution for the Boeing 747-400 fleet.

 Title: Electricity Market Models: Primal/Dual Formulation Issues
 Authors: E. Grant Read and Deb Chattopadhyay, Department of Management, University of Canterbury

Recent years have seen the development of electricity markets in many parts of the world, with most of those markets being "cleared" by linear programming models. Such models must produce not only optimal "primal" dispatch schedules but also "dual" pricing schedules which match those dispatch schedules and comply with the market pricing rules. By and large, these twin goals can be met using a formulation in which the system representation closely matches physical reality, but many markets rely on simplified system representations. Typically, the natural "nodal" structure of the transmission network may be reduced to a relatively small set of transmission zones, constraints may be derived using statistical correlations, or variables appearing in constraints may be modelled as constants. We examine the impact of each of these simplifications on pricing, and show that there are often alternative constraint representations which produce exactly the same primal solution but very different prices. Traditionally, though, constraint representations have been chosen for ease of expression or computation, with no regard to their dual implications. We conclude that the choice of constraint representation can have significant implications in market-clearing models for electricity, and presumably for other commodities too.
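The primal/dual point can be made with a toy merit-order dispatch: algebraically rescaling the energy-balance constraint leaves the dispatch (and total cost) unchanged but halves the reported shadow price. The generator data below are invented, and shadow prices are estimated by perturbing the constraint's right-hand side rather than taken from an LP solver's duals.

```python
def dispatch_cost(demand, gens):
    """Merit-order dispatch: meet `demand` from the cheapest generators
    first.  gens is a list of (marginal_cost $/MWh, capacity MW) pairs.
    A toy stand-in for an LP market-clearing engine (illustrative only)."""
    cost, remaining = 0.0, demand
    for mc, cap in sorted(gens):
        take = min(cap, remaining)
        cost += mc * take
        remaining -= take
    assert remaining <= 1e-9, "demand exceeds capacity"
    return cost

gens = [(10.0, 100.0), (50.0, 100.0)]   # invented offer stack
d = 120.0
eps = 1.0

# Shadow price = d(cost)/d(RHS) of the energy-balance constraint.
price = (dispatch_cost(d + eps, gens) - dispatch_cost(d, gens)) / eps

# Write the same balance constraint scaled by 2 (RHS = 2d): the primal
# dispatch is identical, but the dual on that constraint halves.
scaled_price = (dispatch_cost((2 * d + eps) / 2, gens)
                - dispatch_cost(d, gens)) / eps
print(price, scaled_price)
```

Here the marginal generator sets a price of 50 $/MWh, but the scaled constraint reports a dual of 25 $/MWh for the identical dispatch, which is exactly the kind of representation-dependent pricing the paper examines.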

 Title: Stakeholder Analysis for Systems Thinking and Modelling (postscript version or pdf version)
 Authors: Arun Abraham Elias, Robert Y. Cavana, School of Business and Public Management, Victoria University of Wellington

Problem structuring is a common first step in most problem-solving approaches. In the systems thinking and modelling framework, the problem-structuring phase is followed by causal loop modelling, dynamic modelling, scenario planning and modelling, and implementation and organisational learning. In this paper we explain how and why stakeholder analysis enriches the problem-structuring phase of the systems thinking and modelling process. We also argue that stakeholder analysis is applicable to other problem-solving approaches. We demonstrate its usefulness in systems thinking and modelling by presenting a New Zealand case study.

 Title: A Practical Application of Multi Criteria Methods Using Decision Support Software
 Author: Kieran Turner, Decision Lab Limited, Auckland

The aim of this presentation is to discuss the role of decision support software in the decision process. Multi-criteria decisions are by nature complex, and many methods for structuring and evaluating these decisions have been formulated. In this paper I present an overview of our work and experience to date using Logical Decisions, a decision support software package that utilises Multi-Attribute Utility Theory (MAUT) and the Analytic Hierarchy Process (AHP). This type of decision support software can greatly assist both the decision makers and the analyst/decision team: it allows greater insight to be obtained and a better-quality decision to be achieved. A case study will be presented demonstrating how the software has been of assistance in a real-world application.
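The additive aggregation at the heart of MAUT can be sketched in a few lines. The criteria, weights and single-attribute utility functions below are invented for illustration; packages such as Logical Decisions elicit and manage these interactively.

```python
def maut_score(weights, utilities, alternative):
    """Additive multi-attribute utility: the sum over criteria of
    weight times single-attribute utility.  A minimal sketch of MAUT
    aggregation; all names and numbers here are hypothetical."""
    return sum(w * utilities[a](alternative[a]) for a, w in weights.items())

# Hypothetical two-criterion choice between two options.
weights = {"cost": 0.6, "quality": 0.4}       # swing weights, sum to 1
utilities = {                                 # each scaled to [0, 1]
    "cost":    lambda x: 1.0 - x / 100.0,     # cheaper is better
    "quality": lambda x: x / 10.0,            # higher is better
}
alternatives = {
    "Option A": {"cost": 40.0, "quality": 7.0},
    "Option B": {"cost": 70.0, "quality": 9.0},
}
ranked = sorted(alternatives,
                key=lambda n: maut_score(weights, utilities, alternatives[n]),
                reverse=True)
print(ranked)
```

With these invented weights, Option A's lower cost outweighs Option B's higher quality, which is the kind of trade-off the software makes explicit.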

 Title: Optimal Reactive Power Management in Electric Power Systems
 Authors: Bhujanga B. Chakrabarti, Network Planning Group, Transpower NZ Limited E. Grant Read and Deb Chattopadhyay, Department of Management, University of Canterbury

In an electric power system, the loads at the customer end consume both active and reactive power. The active power is converted into heat, light or other forms of useful energy. Reactive power, on the other hand, cannot be converted into a useful form, but its presence is an inherent feature of electrical loads. Reactive power must be carefully managed to ensure optimal utilisation of real power, to maintain the voltage profile, and to avoid system-wide catastrophic events such as voltage collapse.

Reactive power management in a broad sense covers a range of issues: decisions on the installation of reactive power sources (where, how much, and when), allocation of the associated costs among the responsible agents (e.g., customers/loads), and finally the dispatch/pricing of reactive power in real time. Voltage stability is a critical consideration in all these facets of reactive power management, because the ultimate goal is to avoid a catastrophic event.

The reactive power (VAR) planning process involves deciding the worst contingency the system is to be protected against, the locations at which VAR equipment should be installed, the optimal amount of VAR required at each location, and the voltage stability margin. The objective of VAR planning is to maintain a satisfactory voltage profile and voltage stability across the network under normal and contingency conditions. The operating point should be maintained at a distance, in terms of MVA, from the point of voltage collapse (also called the critical point, or PoC) to provide a safety margin against unforeseen events beyond the planning criteria.

The need for reactive power pricing is now well recognised, and its theory and computational aspects have been topics of research for almost a decade. The development of optimal dispatch/pricing market models that can account for the various complex power system phenomena involving reactive power has started receiving attention in recent years. Although various short-run reactive power pricing schemes have been discussed and debated in the literature, voltage stability, which is often a critical determinant of reactive power allocation in many systems, has rarely been discussed in the dispatch/pricing literature. Voltage-stability-constrained power systems pose a difficult challenge for model development, analysis and implementation. Stability aspects have often been incorporated in real-life dispatch/pricing procedures by trial and error, or approximated directly in the dispatch optimisation as a set of linear constraints on generation/transmission.

This paper briefly discusses a VAR planning model. A contingency-constrained optimal power flow (OPF)-based model incorporating (static) voltage stability constraints is developed to analyse VAR support decisions. The model is contingency constrained: given a set of contingencies, it performs the contingency ranking, decides the optimal locations and quantity of VAR support to cover the worst contingency, and simultaneously ensures that no other contingency constraint is violated. The contingencies are ranked endogenously based on the stability margin. DICOPT++, a state-of-the-art mixed integer non-linear programming (MINLP) algorithm, is proposed to solve the model efficiently, and the model is tested on a test system.
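The endogenous ranking step can be pictured with a small sketch: each contingency is scored by its (static) voltage stability margin, the MVA distance from the post-contingency operating point to the point of collapse, and the smallest margin identifies the worst case. The contingency names, margins and required margin below are invented; in the paper this ranking sits inside the MINLP model rather than a standalone routine.

```python
def rank_contingencies(margins, required_margin):
    """Rank contingencies by stability margin (MVA distance to the
    point of collapse).  The smallest margin is the worst contingency;
    any contingency below the required margin needs VAR support.
    Illustrative stand-in only -- all data here are hypothetical."""
    ranked = sorted(margins.items(), key=lambda kv: kv[1])
    worst = ranked[0][0]
    violated = [c for c, m in ranked if m < required_margin]
    return worst, violated

margins = {"line 1-2 out": 180.0,     # MVA to collapse, invented
           "line 2-3 out": 95.0,
           "gen 3 out": 240.0}
worst, violated = rank_contingencies(margins, required_margin=150.0)
print(worst, violated)
```

Here the "line 2-3 out" contingency is worst and is the only one violating the 150 MVA requirement, so VAR support would be sized to cover it.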

The paper also presents preliminary experience with the development of an OPF model that optimally allocates real and dynamic reactive reserves among the generators to meet a pre-specified voltage stability margin. The model deals with two states: a base dispatch to meet the current loads, and a contingency/collapse-related reserve dispatch to ensure that the system has enough MW and MVAr reserve available to meet the required stability margin. The impact of the voltage stability constraints on nodal prices is of special interest. Experiments conducted on a simple 3-node system with zero and non-zero reserve offers for reactive power reveal some interesting impacts of the voltage stability constraint on real and reactive power prices.

 Title: Te Wananga: Developing a Bicultural Model Using Critical Systems (postscript version or pdf version)
 Authors: Wayne Taurima and Michael Cash, The Open Polytechnic of New Zealand

(No abstract submitted)