Friday, December 26, 2008

Pour Feliciter 2009

if ($celebrateXmas) {
    echo "Merry Christmas!";
} else {
    echo "Happy Holidays!";
}

//but we prefer: "Merry Christmas!" :)

Warmest wishes from ThinkOR.org

Sunday, December 21, 2008

Operations Research's Irrational Fear of IT

IT is so embedded in the everyday running of businesses that almost no one can do their work without proper IT support these days. When was the last time your company's network, email, or a critical application was down for maintenance and employees seriously discussed whether they should go home early because they had nothing to do? That alone shows the reach of IT and its effect on almost every department of a company. OR cannot escape from IT either; in fact, I see OR depending on IT even more than most other lines of work. Advanced computing power is what helped OR flourish in the first place.

However, in practice, what I see more often than not is a fear of involving IT in OR projects, even though an IT solution would often avoid the painful change management required when process changes involve people. Let's face it: we only do change management because we have to, not because we like it. Yet OR professionals do not like dealing with IT folks, because IT often has lengthy, formal processes for getting things moving and done. The funny thing is, isn't this exactly where OR people could lend IT a hand in making those processes better? Just because a rule is defined (or in this case a set of IT processes is defined), it doesn't mean we cannot invent new ways of doing things that suit the situation better. Rules are made to guide the norm. When exceptions occur, new rules are adopted. Pre-existing rules should not be a barrier to solving problems with simple solutions (simpler than changing people's behaviours, that is).

On the other hand, to the IT folks I would like to say the following, and bear in mind that I am on your side, since I used to be one of you. As much as IT would like to think of itself as the smart cookies, or worse, God's gift to humanity, IT should always remember that its entire purpose is to simplify other people's lives with technological solutions. It is mostly a means to an end in the business world. So remember who your customers are, and serve them well, like everybody else does. Saying "no" should not be an option, since your purpose is to serve. IT is known as the "cutting edge" industry; being innovative should be the virtue IT people hold above everything else.

I was reading an article in Analytics titled "What's an IT Guy Doing at an O.R. Conference?" by Jeremy Yang from Cisco. He talked about some interesting differences between IT and OR professionals that were discussed at a session of the Practice Conference held in Vancouver, BC, in 2007. One interesting point raised was the quest for an "end state":
In practice, IT always needs an "end state" to the requirements of a project in order to start the development and testing process. However, in O.R. there is never a true "end state", since O.R. is constantly changing to find the optimal model. By nature, this is a cause of conflict and frustration between O.R. and IT.
Am I being silly, or is this a prime candidate for well-known agile, iterative development? Wait, isn't Google doing this constantly with its perpetual beta releases?

The two groups that I am proud to be a member of, Operations Research and IT, are full of highly intelligent people. Don't be afraid of each other. Use each other wisely. Foster a better working relationship between the two groups so that each other's work has a greater impact. What about the idea of having an Operations Research liaison to IT? Personally, I think it would be an idea worth trying, because it would help simplify IT processes and reduce miscommunication and re-work. IT is not a black box to be avoided. OR and IT are both very logical groups of people. We should get each other, not be afraid of each other!

Tuesday, November 4, 2008

ORdinary Spreadsheets and ORdnances

The July-August 2008 issue of Interfaces ran the theme "The Use of Spreadsheet Software in the Application of Management Science and Operations Research". In their article from that issue, "A Spreadsheet Implementation of an Ammunition Requirements Planning Model for the Canadian Army", Hurley and Balez describe a successful spreadsheet model for planning training ammunition expenditures.

Ammunition is expended in training courses in a highly uncertain environment. Course registration rates, failure rates (before completing the entire course), and other uncertainties make it difficult to accurately forecast ammunition consumption. Planners must choose a course portfolio that will not result in an ammunition shortage. Of course, to minimize the chances of running out of ammo, planners to date had been planning to the maximum expenditure per course. The consequence was that the program came in repeatedly under budget, by 38.7% in 2002-03. This is far from ideal when attempting to allocate scarce resources. Naturally this was an opportunity to apply risk management principles in order to either request a smaller budget to accomplish the same goals or to do more with the same budget.

The solution took the form of a spreadsheet tool. An Excel spreadsheet combined with Visual Basic for Applications (VBA) provided an easy and intuitive interface for planners to interact with the risk model. As a result, in 2004-05 the program was only 3.1% under budget. I will not go into too many more details, as they are available in the article, but I had two interesting thoughts:

[1] A big advantage of using spreadsheets is the familiarity most managers have with them. Leveraging this, the team built a simple spreadsheet simulation to demonstrate the portfolio effect of running several courses. With repeated "F9-Simulations" (my term) they were able to demonstrate that while 10 course sections will never use 10 times the per-course maximum (as presently budgeted), the total is reliably much less than that. Moving up a level and using @Risk to run 10,000 simulations, they were able to demonstrate the concept convincingly (a toy sketch of this portfolio effect follows below).
We cannot overemphasize the value of this type of spreadsheet demonstration in selling the potential of an OR model.
Interestingly enough, their experience differs from my own. I tried to convince an ultrasound department supervisor that if appointments averaging 45 minutes but of uncertain length are booked every 45 minutes, her technologists would reliably work overtime. To do this I built a simple spreadsheet simulation, but it was totally lost on her. This is not meant as a knock against the approach, but rather to emphasize the importance of manager familiarity with spreadsheets. My ultrasound supervisor, a senior medical radiation technologist, thinks differently from a rising Canadian colonel.
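For readers who want to experiment off-spreadsheet, here is a minimal Python sketch of the same portfolio-effect idea, the programmatic equivalent of an "F9-Simulation". It is not the authors' model; the number of courses, the attrition rate, and the rounds-per-student figures are all invented for illustration.

# Toy portfolio-effect simulation: the total ammunition used by 10 course
# sections is reliably far below 10x the per-course planning maximum.
# All parameters below are invented for illustration.
import random

random.seed(42)

N_COURSES = 10          # course sections run in the plan (assumed)
MAX_PER_COURSE = 1000   # budgeted maximum rounds per course (assumed)
N_TRIALS = 10_000       # Monte Carlo replications

def course_expenditure():
    """One course's uncertain ammunition use (toy model): enrolment varies,
    some students drop out before the live-fire portion, and each remaining
    student fires an uncertain number of rounds, capped at the maximum."""
    enrolled = random.randint(15, 25)
    completing = sum(random.random() > 0.2 for _ in range(enrolled))  # ~20% attrition
    rounds = sum(random.triangular(20, 50, 35) for _ in range(completing))
    return min(rounds, MAX_PER_COURSE)

totals = [sum(course_expenditure() for _ in range(N_COURSES)) for _ in range(N_TRIALS)]
totals.sort()

print(f"Budgeted (10 x max):      {N_COURSES * MAX_PER_COURSE:,.0f}")
print(f"Mean total expenditure:   {sum(totals) / N_TRIALS:,.0f}")
print(f"95th percentile of total: {totals[int(0.95 * N_TRIALS)]:,.0f}")

Run it a few times (or crank up N_TRIALS) and the total expenditure of ten sections comes in reliably far below ten times the per-course maximum, which is exactly the point the spreadsheet demonstration makes.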

[2] When selecting a portfolio of courses to fit the approved budget (less than requested), the Army chose to manually optimize using the tool rather than accept a priority-optimized result from a linear program. This perplexed the authors, and I think they wrongly blamed themselves for a failure to achieve buy-in. It is my experience that when dealing with problems of a magnitude that an individual can wrap their head around, clients prefer to leverage their intuition and optimize by hand. As OR practitioners we may not trust the client to achieve a truly optimal result, but as clients they do not trust a model to capture all of the nuances they know intuitively, and the answer, of course, is somewhere in between.

The idea of doing OR with Excel probably wasn't what got you started in the field, but if you like seeing results it might just keep you in it.

Hurley, W.J., M. Balez. 2008. A Spreadsheet Implementation of an Ammunition Requirements Planning Model for the Canadian Army. Interfaces 38(4) 271-280.

Monday, November 3, 2008

Computer Age Workers Suffer Digital Fatigue - Can OR Help?

The October/November Issue of Scientific American Mind magazine highlights the increasing digital fatigue we are facing as a result of always being plugged into technology.

According to a study conducted by the article's authors, our neural circuitry is actually rewired as we become more computer savvy. Internet-naive subjects, after only five consecutive days of internet use (one hour/day), had already (unconsciously, of course) rewired their brains to match those of the computer-savvy subjects in the study.

While the prospect of adapting our brains to optimize our use of the internet may sound exciting, the authors warn that the computer age has plunged us into a state of "continuous partial attention" - which they describe as keeping tabs on everything, while never truly focusing on anything.

This can result, according to the authors, in a state of "techno-brain burnout", where people place their brain in a heightened state of stress by paying continuous partial attention to anything and everything. Because our brains were not built to sustain this level of monitoring for such extended periods of time, this "techno-brain burnout" is threatening to become an epidemic.

While these heightened stress levels can have short-term productivity benefits, they are proving to be significant hindrances to medium- and long-term productivity, due to worker fatigue and inability to concentrate.

So where does OR come in? Well, let's review the characteristics of this vexing problem of the computer age, with workers who:
  • Have too much to do and too little time
  • Are overwhelmed by incoming stimuli
  • Are fatigued and drained
Perhaps an old school approach - from the early days of scientific management - could be the right prescription.

Our old friend Frank Gilbreth, father of motion study, on his second day on the job as a bricklayer, questioned why he was being taught several different methods for laying bricks. So Frank developed motion and fatigue study, and created a process for laying bricks that was vastly more efficient than the processes then in place.

In fact, Frank's new method increased productivity by nearly 200%, while simultaneously reducing worker fatigue. Here's a nice two-minute overview of Frank's bricklaying study from YouTube - a nice refresher course if it's been a while:

It seems that computer age workers could greatly benefit from motion study analysis - and who better to deliver it than OR practitioners?

If you walked into a factory, and saw everyone on the assembly line improvising and doing their job any way they pleased, without any knowledge of best practices or recommended techniques, wouldn't you be stunned? Yet to this point in time, this is how the computer age workforce operates.

Time management and productivity experts have been at the forefront of efforts to tackle these problems, but their recommendations are usually fairly general, and without quantification. While advice such as "don't check email first thing in the morning" may indeed be worth practicing, eventually, we should be able to help guide specific individuals and workers to their optimal level of productivity.

And if that isn't ripe for OR, I don't know what is.

Tuesday, October 28, 2008

Approaches to Reporting Access to Diagnostic Imaging in Health Care

In the publicly funded and administered health care system in Canada, the absence of market forces makes access to services a chief concern. Thus reporting, synthesizing and acting on data regarding access is critical. In the context of diagnostic imaging, an area that I have recently had experience with, access is typically talked about in terms of waiting times or waiting lists. The issue of waiting times in imaging is, like so much in health care, a complex one. Multiple exam types requiring varying specialty resources are performed on patients with a kaleidoscope of urgency levels. Typically data exists at a patient-by-patient level, but the challenge is how to aggregate the information in such a way that waiting times can be reported for the benefit of both the decision maker and the public. The detail-oriented operations research practitioner is tempted to over-deliver on the level of analysis when presenting these metrics; we must seek to trim it back while still including the critical information that informs what are ultimately life-and-death decisions. Below I hope to combine a survey of the current state of public information on CT (Computed Tomography) and MRI (Magnetic Resonance Imaging) waiting times in Canada with a discussion of nuances of the metrics chosen.

Beginning with the worst example I saw in my research, we look at the Nova Scotia Department of Health website. Waiting times are reported by authority and by facility, important data for individuals seeking to balance transportation with access. However, it's how the wait times are measured that worries me the most. Waiting time is defined as the number of calendar days from the day the request arrives to the next available day with three open appointments. I have found that this is the traditional manner in which department supervisors like to define waiting lists, but at a management level it's embarrassingly simplistic. At the time of writing, the wait time at Dartmouth General Hospital for a CT scan is 86 days. I guarantee you that not every patient is waiting 86 days for an appointment. Not even close. Neither is the average 86 days, nor is the median 86 days. The question of urgency requires that we discuss our level of access for varying urgencies. Additionally, three available appointments 86 days from now say nothing about what day my schedule and the hospital's schedule will allow for an appointment. If there's that much wrong with this measurement method, then why do they do it? The simple fact is that it is very easy to implement. In healthcare, where good data can be oh so lacking, this way of measuring "waiting lists" is cheap: no patient data is required; one simply calls up the area supervisor or a booking clerk and asks for the information. So hats off to Nova Scotia for doing something rather than nothing, which is indeed better than some of the provinces, but there's much work to be done.

Next, we'll look at the Manitoba Health Wait Time Information website. Again we have data reported by health authority and facility. Here we see the "Estimated Maximum Wait Time" as measured in weeks. The site says, "Diagnostic wait times are reported as estimated maximum wait times rather than averages or medians. In most cases patients typically wait much less than the reported wait time; very few patients may wait longer." If this is true, and it is, then this is pretty useless information, isn't it? Indeed I am reconsidering my accusation of Nova Scotia being the worst of the lot. If this information represents something like the 90th or 95th percentile then I apologize because, as I discuss later, this is a decent figure to report. However, it is not explicitly described as such.

Heading west to Alberta, we visit the Alberta Waitlist Registry. Here we can essentially see the waiting time distribution of most patients scanned in MRI or CT across the province in the last 90 days. The site reports the "median" (50th) and "majority" (90th) percentiles of waiting time, and then reports the percentage of patients served within time bands ranging from under 3 weeks to over 18 months. Two key elements are lacking in this data. For one, both day patients and inpatients are included. This means that the patient waiting for months to get an MRI on their knee and the patient waiting for hours to get one on their head are treated as equal. Patients admitted to the hospital and outpatients experience waiting times on time scales of different orders of magnitude and should not be considered together. The percentage of patients seen in less than 3 weeks must therefore include many inpatients and thus overstates the true level of service. The other missing element is the notion of priority. Once again, for an individual in the population looking for information about how long they might wait, or for a manager or politician looking to quantify the level-of-care consequences of current access levels, this data isn't very useful because it lacks priority. If urgent patients are only being served at the median waiting time, that signals significant problems in the system, but without data reported by urgency we can only guess that this is being done well. As someone who has seen it from the inside, I would NOT be confident that it is.

Now I return to what westerners would rather not admit is the heart of Canada, Ontario, and the Ontario Ministry of Health and Long-Term Care website. This site measures wait times as the time between request and completion. It reports the 90th percentile wait times in days, by facility and provincially, and calls it the "point at which 9 out of 10 patients have had their exam." The data excludes inpatients and urgent outpatients scanned the same day, addressing a critical issue I had with the Alberta data. Priorities are lacking, but with a little digging you can find the province's targets by priority, so there is, perhaps, hope. Reporting the 90th percentile seems like good practice to me. With the funky distributions we see when measuring waiting times, averages are certainly of no use. Additionally, the median isn't of great interest because it is not an indication of what any one individual's experience will be. This leaves the 90th percentile, which expresses what might be called a "reasonable worst case scenario".

Finally I turn to the organization whose explicit business is communicating complex issues to the public, the Canadian Broadcasting Corporation. Their CBC News Interactive Map from November 2006 assigned letter grades from A-F, converted from the percentage of the population treated within benchmark. Who knows whether this glosses over the lack of priority data or includes the percentage that met the benchmark for each priority, but it's a start. Letter grades given were: BC N/A, AB D, SK N/A, MN F, ON C, QC N/A, NB N/A, NS N/A, PEI F, NF N/A. So with over half not reporting, there wasn't much they could do.

So what have we learned from this survey? Well, we have certainly learned that the writer has a love of detail and is dissatisfied with each province that omits any. This is, as discussed in the introduction, natural for an operations research practitioner. If I were advising someone on the development of diagnostic access-to-care metrics, I would tell them this: (1) Focus on the patient experience. Averages and medians don't tell me what my experience might be; 90th percentiles do a much better job of this. (2) Focus on the context. Waiting times in the context of an inpatient are in a different universe from those of an outpatient and should be treated as such. Waiting times of urgent cases vs. routine cases bear different significance and should be similarly separated.
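To make that advice concrete, here is a toy sketch (not any province's actual methodology) of reporting from patient-level records while keeping patient class and priority separate. The record layout, field names and dates are all invented.

# Toy wait-time reporting from patient-level records, keeping inpatients,
# outpatients and priority levels separate. All data below is invented.
from datetime import date
import math

records = [
    # (patient class, priority, request date, exam date)
    ("outpatient", "routine", date(2008, 6, 2),  date(2008, 9, 10)),
    ("outpatient", "routine", date(2008, 6, 5),  date(2008, 8, 1)),
    ("outpatient", "urgent",  date(2008, 6, 7),  date(2008, 6, 20)),
    ("inpatient",  "urgent",  date(2008, 6, 7),  date(2008, 6, 8)),
    ("outpatient", "routine", date(2008, 6, 9),  date(2008, 10, 30)),
    ("inpatient",  "routine", date(2008, 6, 10), date(2008, 6, 12)),
]

def percentile(values, p):
    """Nearest-rank percentile; good enough for reporting purposes."""
    ordered = sorted(values)
    rank = max(1, math.ceil(p / 100 * len(ordered)))
    return ordered[rank - 1]

# Group waits (in days) by (class, priority) so inpatients and urgent cases
# are never mixed in with routine outpatients.
groups = {}
for pclass, priority, requested, completed in records:
    groups.setdefault((pclass, priority), []).append((completed - requested).days)

for (pclass, priority), waits in sorted(groups.items()):
    print(f"{pclass:10s} {priority:8s}  n={len(waits):2d}  "
          f"median={percentile(waits, 50):4d} days  "
          f"90th pct={percentile(waits, 90):4d} days")

The point is simply that once data is kept at the patient level, reporting a median and a 90th percentile per class and priority is a few lines of work, and far more informative than a single province-wide number.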

Monday, October 20, 2008

Healthcare and OR in Canada: selected talks at INFORMS 2008

The Ontario Ministry of Health in Canada would like to reduce the delay in transfer of care from the ambulance to hospital emergency department. The delay usually occurs when the ambulance is at the hospital site waiting to transfer patients to the emergency wards. The ministry would like to use alternative sites, UCCs (Urgent Care Centres), to accommodate the ambulance patients who would typically be discharged on the same day, so as to free up time at the ED needed to deal with these type of patients. The good news is that the ministry has the smarts to research the feasibility of this solution before doing anything. However, the bad news is that the two databases necessary (EMS & hospital databases) for doing this study do not have identifiers for patients. What’s new, right? Healthcare and bad data almost always go hand in hand. Therefore, the team lead by Ali Vahit Esensoy at the University of Toronto cannot identify the same patient in both databases. However, using accurate GPS timestamps and various triage indicators, the team was able to come up with an algorithm to match over 90% of the patients in the two databases. Then with the help of the physicians and staff, the team was able to devise a set of decision rules to filter out the patients that would be candidates for UCC. The result of the study is that the proposed UCC solution is in fact not a good idea, because there are simply not enough such patients. This is a classic case illustrating the importance of quantitative analysis for informed decision making.
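For a flavour of what linking two identifier-free databases can look like, here is a minimal sketch of matching records by timestamp proximity and triage level. It is emphatically not the Toronto team's algorithm; the field names, the triage codes and the 15-minute window are assumptions for illustration only.

# Toy record linkage across two databases with no shared patient identifier,
# matching on triage level and timestamp proximity. All data is invented.
from datetime import datetime, timedelta

ems_records = [
    {"id": "EMS-1", "arrival": datetime(2008, 3, 1, 10, 2), "triage": 3},
    {"id": "EMS-2", "arrival": datetime(2008, 3, 1, 10, 40), "triage": 2},
]
ed_records = [
    {"id": "ED-17", "triage_time": datetime(2008, 3, 1, 10, 9), "triage": 3},
    {"id": "ED-18", "triage_time": datetime(2008, 3, 1, 11, 55), "triage": 2},
]

TOLERANCE = timedelta(minutes=15)  # assumed matching window

def match(ems, ed):
    """Greedily pair each EMS record with the closest unmatched ED record
    that has the same triage level and a triage time within the window."""
    matches, used = [], set()
    for e in ems:
        candidates = [
            d for d in ed
            if d["id"] not in used
            and d["triage"] == e["triage"]
            and abs(d["triage_time"] - e["arrival"]) <= TOLERANCE
        ]
        if candidates:
            best = min(candidates, key=lambda d: abs(d["triage_time"] - e["arrival"]))
            matches.append((e["id"], best["id"]))
            used.add(best["id"])
    return matches

print(match(ems_records, ed_records))  # -> [('EMS-1', 'ED-17')]

A real linkage would also have to handle ties, missing triage codes, and the validation work needed before anyone can claim a match rate of over 90%.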

On the west coast of Canada, two groups within the CIHR (Canadian Institutes of Health Research) team in Operations Research to improve cancer care are making an impact at the BC Cancer Agency. They are calling out to the OR community to join them in establishing an online community for sharing resources among OR people working in cancer care.

The British Columbia Cancer Agency (BCCA) is the sole cancer treatment provider for the entire province. The problem to be resolved at the facility is a lack of space (examination rooms and space for physicians to dictate) at the ambulatory care unit (ACU). However, again, the process-flow data was not available. The BCCA OR team of Pablo Santibáñez and Vincent Chow mapped the patient flow process and then manually collected time-and-motion data to track the movement of patients and physicians. The data fed a simulation model used to evaluate what-if scenarios: different appointment scheduling methods and room allocation methods. As a result, the team was able to achieve up to a 70% reduction in patient appointment wait time with minimal impact on clinic duration. They were also able to free up to 26% of the exam rooms to accommodate other physician duties.
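To illustrate the kind of what-if question such a simulation answers, here is a deliberately tiny sketch, nothing like the BCCA model and with invented numbers, that compares average waits under different exam-room counts.

# Toy what-if simulation: patients arrive at booked times, occupy an exam
# room for an uncertain duration, and wait if no room is free. We compare
# average waits for different room allocations. All numbers are invented.
import heapq
import random

random.seed(1)

def simulate_clinic(n_rooms, n_patients=300, minutes_between_arrivals=10):
    """Return the average patient wait (in minutes) for a given room count."""
    room_free_at = [0.0] * n_rooms          # when each room next becomes free
    heapq.heapify(room_free_at)
    total_wait = 0.0
    for i in range(n_patients):
        scheduled = i * minutes_between_arrivals      # booked arrival time
        exam_time = random.triangular(15, 60, 30)     # uncertain exam duration
        free_at = heapq.heappop(room_free_at)         # earliest-available room
        start = max(scheduled, free_at)
        total_wait += start - scheduled
        heapq.heappush(room_free_at, start + exam_time)
    return total_wait / n_patients

for rooms in (4, 5, 6):
    print(f"{rooms} exam rooms -> average wait {simulate_clinic(rooms):6.1f} minutes")

A real model would also capture dictation space, physician schedules and the actual scheduling rules being compared, but the structure, uncertain durations flowing through scarce rooms under each scenario, is the same.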

On the academic front, a Ph.D. student at the Sauder School of Business at the University of British Columbia, Antoine Sauré, has been helping the BCCA in another area: radiation therapy treatment units. This research is motivated by the adverse effects of delays on patients' health, such as physical distress and deterioration of quality of life, and by inefficiencies in the use of expensive resources. Rather than maximizing revenue, the main purpose of the work is to identify good scheduling policies for dynamically allocating available treatment capacity to incoming demand so as to achieve wait time targets in a cost-effective manner. Currently, the proportion of urgent treatments starting within the recommended target time is significantly below the target. The work involves the development of a Markov Decision Process model and simulation models for identifying and testing different types of policies. This is still ongoing research, and no results are available yet. However, the team is ready to test algorithms for determining the optimal scheduling policy, based on an affine approximation of the value function and a column generation algorithm, to tackle the otherwise very large MDP problem.
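As a very rough illustration of the modelling style (and nothing more), here is a toy finite-horizon MDP solved by plain value iteration, where the state is the urgent-case backlog and the action is the number of daily slots reserved for urgent cases. All numbers are invented, and the actual research handles a vastly larger state space with affine value-function approximation and column generation rather than brute-force enumeration.

# Toy MDP: how many daily slots to reserve for urgent cases, given an
# uncertain number of new urgent arrivals each day. All numbers invented.
S_MAX = 20        # largest backlog we track
CAPACITY = 5      # slots that could be reserved for urgent cases each day
HORIZON = 30      # planning horizon in days
WAIT_COST = 4.0   # penalty per urgent case carried over to the next day
SLOT_COST = 1.0   # opportunity cost of reserving a slot for urgent cases
ARRIVALS = {0: 0.2, 1: 0.3, 2: 0.3, 3: 0.2}   # daily urgent arrivals and probabilities

value = [0.0] * (S_MAX + 1)   # cost-to-go with 0 days remaining
policy = {}

for days_left in range(1, HORIZON + 1):
    new_value = [0.0] * (S_MAX + 1)
    for s in range(S_MAX + 1):
        best_cost, best_action = float("inf"), 0
        for a in range(CAPACITY + 1):
            carried = max(0, s - a)                 # urgent cases not served today
            cost = SLOT_COST * a + WAIT_COST * carried
            for d, p in ARRIVALS.items():           # expectation over tomorrow's arrivals
                cost += p * value[min(S_MAX, carried + d)]
            if cost < best_cost:
                best_cost, best_action = cost, a
        new_value[s] = best_cost
        policy[(days_left, s)] = best_action
    value = new_value

# Today's decision with a 30-day horizon and a backlog of 3 urgent cases:
print("reserve", policy[(HORIZON, 3)], "slots; expected cost-to-go", round(value[3], 1))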

The papers for the above two projects are available online at http://www.orincancercare.org/cihrteam/ if you wish to obtain more information.

Credits: These 3 talks were given at the INFORMS 2008 conference in Washington DC. The track sessions were TB21, TB34, and TB34. Speakers are Ali Vahit Esensoy, University of Toronto, Canada; Pablo Santibanez, Operations Research Scientist, British Columbia Cancer Agency, Canada; and Antoine Saure, PhD Student, Sauder School of Business, University of British Columbia, Canada. The talks were titled "Evaluation of Ambulance Transfers into UCCs to Improve Ambulance Availability & Reduce Offload Delay", "Reducing Wait Times & Improving Resource Utilization at the BC Cancer Agency’s Ambulatory Care Unit", and "Radiation Therapy Treatment Scheduling".

Friday, October 17, 2008

Doing Good with Good OR: Energy & The Environment

How can analysis and Operations Research help us foresee trends and make intelligent, informed political decisions? Philip Sharp, president of Resources for the Future and a former Congressman from Indiana in the US House of Representatives, attempts to answer this question in the Doing Good with Good OR series of plenaries. The humble but frank former congressman speaks to the scientific crowd about the importance of rigorous analysis for important political issues. Furthermore, Sharp points to institutional connections and the ability to communicate complex issues simply as the crucial factors in making the analysis matter.

From the 1970s to now, we have been on a roller coaster ride with oil prices. In the 1970s, amid the belief in an energy shortage, the National Energy Act was signed to protect the US from the overconsumption of oil. Tax credits were handed out to encourage people to convert gasoline cars to natural gas. The go-ahead for ethanol came around the same time. However, in 1986, oil prices crashed, which caused policies and investors that relied on conventional wisdom to withdraw. Then came 2004, when oil prices rose unexpectedly and the government had to update the fuel economy mandate to adjust to the rise. A similar rise and fall played out in the natural gas industry. The ups and downs of natural gas influenced the development of liquefied natural gas (LNG) terminals, the talks of an Alaska pipeline to serve the lower 48 states, the energy ties with the northern neighbor, Canada, and much more. Oil and gas prices have an enormous impact on the US. When energy prices go through the roof, "all bets are off", says Sharp, just like the stock market crash the country is now deeply suffering from.

However, being open and frank, Sharp thinks analysis is powerful and important, but it is not the means to all ends. "It will help us organize and attack so many unknowns and uncertainties", but persistence in getting the ideas across to politicians is crucial. Responding to an audience member's question, Sharp stresses the importance of identifying institutional connections. In the world of politics, the voice of an individual scientist may have some impact, but the collective agreement of a group of politicians will have far greater reach. "The most difficult thing in Congress is to get people to agree on something", says Sharp. The scientist's ability to communicate complex things simply, and the level of confidence and trust he or she can offer the politician, is golden. Politicians need to find out who and what to trust, especially when reports are being thrown around to counter other reports. It is too tempting to just pick the easiest path, or whatever supports one's self-interest.

It is always a pity to see brilliant ideas sitting on the bookshelves because of ineffective communication. Politicians are typically the most powerful figures in a country and can make the biggest impact. If Operations Researchers want to do good with good OR, then, as Sharp suggests, identifying institutional connections will be the key.

Credits: The talk was given at the INFORMS 2008 conference in Washington DC as a plenary in the Doing Good with Good OR series.

Doing Good with Good OR: Health Policy, AHRQ


Dr. Steven Cohen, Director at the Center for Financing, Access & Cost Trends at the Agency for Healthcare Research & Quality (AHRQ) in the Department of Health & Human Services, USA, shared with the audience the various topics that the Center has been working on.


The Center employs statisticians, economists, social scientists and clinicians to achieve their motto, “advancing excellence in healthcare”. They also monitor and analyze the current trends in healthcare, ranging from cost, to coverage, to access, and to quality of care. Some figures that were shared were:
  • Every $1 out of $6 of US GDP (or 16%) goes to healthcare – the largest component of the federal and state budgets.
  • Western European nations, by comparison, spend less than 10% of GDP on healthcare.
  • In 2006, $2.1 trillion was spent on healthcare. That is $7,026 per capita, a 6.7% increase over 2005.
  • At this rate of growth, it is projected that $4.1 trillion will be spent on healthcare in 2016 (one fifth of GDP).
  • 5% of the US population accounts for 50% of healthcare expenditures.
  • Prescribed medication expenditures almost doubled, from 12% to 21% of spending, in roughly 10 years.
  • The largest expenditure is on inpatients (over 30%).
  • The second largest is on ambulatory care (over 20%).
  • Chronic diseases (heart, cancer, trauma-related disorders, mental health, and pulmonary) account for a majority of the expenditures.
  • Medical errors account for 44,000 avoidable deaths in hospitals a year.
  • Americans are less healthy: a 33% obesity rate and high rates of chronic diseases.

AHRQ aims to provide long-term, system-wide improvement in healthcare quality and safety. To provide policy makers with informed recommendations, surveys, simulations, forecasting models, and other OR tools are often employed to answer difficult questions. Through these methods, AHRQ is able to establish evidence and assess the risks associated with alternative care choices.

AHRQ's focus on the efficient allocation of resources and the structuring of care processes that meet patient care needs helps policy makers establish the necessary high-level strategies and policies. Especially in dire times like these, issues of rationing become the center of discussion. It is AHRQ's responsibility to have the right information to help policy makers make the right trade-offs.

Credits: The talk was given at the INFORMS 2008 conference in Washington DC as a plenary in the Doing Good with Good OR series.

Tuesday, October 14, 2008

Social Media and Operations Research

Social media and Web 2.0 have been the buzzwords of the internet marketing world for a few years now. Of course, we can count on the Numerati (the new term for Operations Researchers, in reference to the title of Stephen Baker's new book) to start scratching their heads and eventually come up with systematic ways of mining vast amounts of data (i.e. analytics), and then applying knowledge harvested from other disciplines, such as psychology, to study people's behaviours (hence diverging from the traditional OR application field of mechanical processes). Claudia Perlich from IBM and Sanmay Das from Rensselaer Polytechnic Institute each explain ways they have used OR to dissect the worlds of blogs and Wikipedia, providing insight to marketers and demonstrating the convergence of Wikipedia's highly edited pages to a stable and credible source.

Ever since the existence of blogs, marketers have been nervous about the reputation of their products. Lucky for the IBMers, when the marketers at IBM are nervous about the client response to a product (e.g. Lotus), help is within reach from the IBM Research team. Marketers want to know: 1. What are the relevant blogs? 2. Who are the influencers? 3. What are they saying about us? 4. What are the emerging trends in the broad-topic space? Perlich's team tackled these four questions by starting with a small set of blogs identified by the marketers (the core "snowball" of blogs), then "rolling" the "snowball" over twice to discover other blogs related to the core set (i.e. at most 2 degrees of separation from the core). To find the authoritative sources in the snowball, Google's PageRank algorithm came to the rescue. Using sentiment labeling, the team was able to say whether the overall message of a blog was positive or negative. Then, to allow the users (i.e. the marketers) to interact with and leverage the data, a visual representation was rendered to show the general trend in the blogosphere about the product in question (see image). At that stage, marketers can dig into each blog identified as positive or negative, and the problem seems much more manageable.
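To make the ranking step concrete, here is a toy, dependency-free PageRank over an invented blog link graph. It is only a sketch of the idea, not IBM's BANTER system, and the blog names and links are made up.

# Toy power-iteration PageRank on a tiny, invented blog link graph.
links = {                     # who links to whom
    "core-blog": ["blog-a", "blog-b"],
    "blog-a":    ["blog-b", "blog-c"],
    "blog-b":    ["core-blog"],
    "blog-c":    ["blog-b"],
}

def pagerank(graph, damping=0.85, iterations=50):
    nodes = list(graph)
    rank = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iterations):
        new_rank = {n: (1.0 - damping) / len(nodes) for n in nodes}
        for node, outlinks in graph.items():
            share = damping * rank[node] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank
    return rank

for blog, score in sorted(pagerank(links).items(), key=lambda kv: -kv[1]):
    print(f"{blog:10s} {score:.3f}")

On the real snowball, the same kind of score would separate the handful of authoritative blogs worth a marketer's attention from the long tail.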

Das' talk fits in rather nicely with Perlich's, as he examines the trend of blog comment activity and Wikipedia's edit histories to try to demonstrate the eventual convergence of these information sources to a stable state. The underlying assumptions are that the more edits a Wikipedia page has, the more reliable its information and hence the higher its quality; and the higher the quality of a page, the less likely there will be further talk/edits on it, because most of what needed to be said has already been said (the informational limit). Das took the data dump of all Wikipedia pages from May 2008 and extracted the roughly 15,000 pages (out of 13.5 million in total) with over 1,000 edits. Using dynamic programming to model a stochastic process, Das was able to find a model for the edit rate of an aggregation of these highly edited Wikipedia pages. He then applied the same model to blog comment activity. In both cases the model fit the data extremely well, and surprisingly the shapes of the activity patterns over time for blog comments and Wikipedia page edits looked very much alike. An interesting inference made by Das was that people contribute less of their knowledge to Wikipedia pages than to blogs.

This is the beauty of Operations Research. It is a horizontal plane that can cut through and be applied to many sciences and industries. Aren’t you proud to be a dynamic Numerati?

Credits: These talks were given at the INFORMS 2008 conference in Washington DC. The track sessions were MA12 & MB12. Speakers are Claudia Perlich, T.J. Watson IBM Research, and Sanmay Das, Assistant Professor, Rensselaer Polytechnic Institute, Department of Computer Science. The talks were titled "BANTER: Anything Interesting Going On Out There in the Blogosphere?" and "Why Wikipedia Works: Modeling the Stability of Collective Information Update Processes".

Similarity between “A Single EU Sky” and “Ford Suppliers”? Optimization

How should EUROCONTROL select which improvement projects to invest in for a unified European sky for air traffic by the year 2020? Will Ford's Automotive Components Holdings shut down its facilities and outsource everything? Optimization models come to the rescue, and they are big! Two of the six finalists for the Daniel H. Wagner Prize for Excellence in Operations Research presented their exciting projects at INFORMS on Monday.

Air traffic is getting more and more congested in Europe, and a solution is needed to unite the European sky under a single set of protocols for all EU countries. EUROCONTROL is in the position to do so. It is the European Organisation for the Safety of Air Navigation, and it currently has 37 member states. As part of its modernization activities, EUROCONTROL needs to select a set of technological improvement projects from four major categories: network efficiency, airport operations, sector productivity, and safety net. For example, considering only a subset of 5 projects out of a list of 20+, each with 2 to 5 implementation options, there are already on the order of 300 possible combinations. The question, therefore, is which set of projects to select to satisfy the various objectives and constraints of multiple stakeholders, such as the airports, the airlines, society (the environment), and more. Grushka's team from the London Business School was asked by EUROCONTROL to provide rigorous and transparent analysis of this problem by involving all stakeholders. The team used an integrative and iterative framework to approach the problem, with a mixed integer programming model to find the optimal combination of projects (a small sketch of this kind of model follows below). As a result of this team's effort, the EU may now use a common language to talk about uniting the European sky, with a common understanding of the problem.
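For readers unfamiliar with this kind of model, here is a deliberately tiny project-selection MIP in the same spirit, not the actual EUROCONTROL model: each project offers several implementation options, at most one option may be chosen per project, and the portfolio must respect a budget. The open-source PuLP library is used here as an assumption about tooling, and all benefit and cost figures are invented.

# Toy project-portfolio MIP: pick at most one implementation option per
# project to maximize benefit within a budget. All numbers are invented.
from pulp import LpProblem, LpMaximize, LpVariable, lpSum, LpBinary

# (project, option): (benefit score, cost)
options = {
    ("network_efficiency", "basic"):    (10, 4),
    ("network_efficiency", "advanced"): (18, 9),
    ("airport_ops", "basic"):           (7, 3),
    ("airport_ops", "advanced"):        (12, 7),
    ("sector_productivity", "basic"):   (9, 5),
    ("safety_net", "basic"):            (6, 2),
    ("safety_net", "advanced"):         (14, 8),
}
BUDGET = 15

prob = LpProblem("project_portfolio", LpMaximize)
x = {k: LpVariable(f"x_{k[0]}_{k[1]}", cat=LpBinary) for k in options}

# Objective: total benefit of the chosen implementation options.
prob += lpSum(options[k][0] * x[k] for k in options)

# Budget constraint across everything selected.
prob += lpSum(options[k][1] * x[k] for k in options) <= BUDGET

# At most one implementation option per project.
for project in {k[0] for k in options}:
    prob += lpSum(x[k] for k in options if k[0] == project) <= 1

prob.solve()
print("Chosen options:", [k for k in options if x[k].value() == 1])

The real model juggles many stakeholders' objectives and constraints rather than a single benefit score and one budget, but the selection structure is the same.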

The Ford team had a very different but extremely urgent problem to solve. They needed to figure out whether two of their automotive interior supply production facilities should be closed down and their work outsourced to other suppliers, because of unprofitability and underutilization at the plants. The number one problem was time – 8 weeks was all they had. The optimization problem they faced, however, was extremely large. The mixed integer non-linear programming (MINLP) model had around 450,000 variables and 200,000 constraints. After removing the nonlinearity from the model, the resulting mixed integer programming (MIP) model had 4.3 million variables and 1.8 million constraints. Just imagine the data gathering process and the model formulation! What a nightmare. Luckily, the team had unconditional support from the CEO and was able to obtain a complete set of data – 150,000 data points in the model. After 8 weeks of 20-hour days, the team was able to deliver a model to test out what-if scenarios, thereby removing the subjective decision making that is so common in most enterprises. As a result, Ford will be able to maintain one facility and outsource only a certain percentage of the work, representing a saving of $50 million over five years compared to the alternative of outsourcing all the work.

The two projects were fascinating to listen to. They both showcased the importance of quantitative decision making in business. It will be a tough job to select a winner out of these two. Good luck to both teams!

Credits: The talks were given at the INFORMS 2008 conference in Washington DC. The track session was MC32. Speakers are: Yael Grushka-Cockayne, London Business School, Regent's Park, London, United Kingdom, and Erica Klampfl, Technical Leader, Ford Research & Advanced Engineering, Systems Analytics & Env. Sciences Department. The talks, both in the Daniel H. Wagner Prize Competition, were titled "Towards a Single European Sky" and "Using OR to Make Urgent Sourcing Decisions in a Distressed Supplier Environment".