WorldCat Identities

Leaders for Global Operations Program

Overview
Works: 279 works in 279 publications in 1 language and 284 library holdings
Classifications: HD57.5
Publication Timeline
Most widely held works by Leaders for Global Operations Program
Reducing the demand forecast error due to the bullwhip effect in the computer processor industry by Emily Smith

1 edition published in 2010 in English and held by 1 WorldCat member library worldwide

Intel's current demand-forecasting processes rely on customers' demand forecasts. Customers do not revise demand forecasts as demand decreases until the last minute. Intel's current demand models provide little guidance for judging customer orders when the market changes. As a result, during the economic downturn of Q3 and Q4 '08, Intel's model could not predict how much billings would decrease. The demand forecast had large amounts of error caused by the bullwhip effect (order amplification in a supply chain). This project creates a new demand forecast model in two phases. The first phase investigated the supply chain of OEMs and retailers. The second phase of the project used the supply chain information discovered in phase one to create a new demand forecast that reduces the error caused by the bullwhip effect. The first phase determined that the average time it takes a CPU to go from Intel to end customer purchase is seventeen weeks. The first phase also identified ownership of products throughout the supply chain and the parties making purchase decisions. The supply chain information was then used in the second phase of the project to create a demand forecast model. The new model is a heuristic model that simulates quarterly purchase decisions of retailers and OEMs, including lead times and inventory. The resulting model allows Intel to monitor and react to consumption changes faster than waiting for customers to change their demand forecasts. The model also provides a better forecast during times of change. The model reduces the error due to the bullwhip effect and identifies early when a downturn or upturn is going to happen in ordering behavior
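
To make the mechanism concrete, here is a minimal sketch of order amplification in a two-tier chain, assuming an order-up-to policy with a moving-average forecast; all parameters (lead times, inventory targets, the downturn) are illustrative and not Intel's actual model:

```python
# Minimal bullwhip sketch: two tiers (retailer, OEM) each use an
# order-up-to policy driven by a moving-average forecast. The OEM sees
# only retailer orders, not end demand, so demand swings are amplified
# upstream. All parameters are illustrative, not Intel's model.
import random

def simulate(weeks=52, lead_time=4, seed=1):
    random.seed(seed)
    demand = [100 + random.gauss(0, 5) for _ in range(weeks)]
    demand[30:] = [x * 0.7 for x in demand[30:]]      # mid-year downturn
    retailer_orders, oem_orders = [], []
    r_inv, o_inv = 400.0, 400.0                       # starting inventories
    for t, d in enumerate(demand):
        # Retailer forecasts from recent end demand, orders up to cover
        # lead-time demand plus what was just consumed.
        window = demand[max(0, t - 4):t + 1]
        forecast = sum(window) / len(window)
        r_order = max(0.0, forecast * lead_time - r_inv + d)
        r_inv += r_order - d
        retailer_orders.append(r_order)
        # OEM forecasts from retailer orders only.
        o_window = retailer_orders[max(0, t - 4):t + 1]
        o_forecast = sum(o_window) / len(o_window)
        o_order = max(0.0, o_forecast * lead_time - o_inv + r_order)
        o_inv += o_order - r_order
        oem_orders.append(o_order)
    return demand, retailer_orders, oem_orders

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

d, r, o = simulate()
print("variance: demand %.0f, retailer orders %.0f, OEM orders %.0f"
      % (variance(d), variance(r), variance(o)))
```

Order variance grows at each tier away from end demand; tracking consumption directly, as the thesis proposes, removes that amplification from the forecast.
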
Modeling neuroscience patient flow and inpatient bed management by Jonas Hiltrop

1 edition published in 2014 in English and held by 1 WorldCat member library worldwide

Massachusetts General Hospital (MGH) experiences consistently high demand for its more than 900 inpatient beds. On an average weekday, the hospital admits about 220 patients, with the emergency department (ED) and the operating rooms (OR) being the main sources of admissions. Given MGH's high occupancy rates, a comparable number of discharges have to occur daily, and the intraday time distributions of admissions and discharges have to be aligned in order to avoid long wait times for beds. The situation is complicated by the specialization of beds and the medical needs of patients, which place constraints on the possible bed-patient assignments. The hospital currently manages these processes using fairly manual and static approaches, and without clear prioritization rules. The timing of discharges is not aligned with the timing of new admissions, with discharges generally occurring later in the day. For this reason MGH experiences consistent bed capacity constraints, which may cause long wait times for patients, throughput limitations, disruptions in the ED and in the perioperative environment, and adverse clinical outcomes. This project develops a detailed patient flow simulation based on historical data from MGH. The model is focused on the neuroscience clinical specialties as a microcosm of the larger hospital since the neuroscience units (22 ICU beds and 64 floor beds) are directly affected by the hospital's important capacity issues (e.g., patient overflows into other units, ICU-to-floor transfer delays). We use the model to test the effectiveness of the following three interventions: 1. Assigning available inpatient beds to newly admitted patients adaptively on a just-in-time basis; 2. Discharging patients earlier in the day; 3. Reserving beds at inpatient rehabilitation facilities, thereby reducing the MGH length of stay by one or more days for patients who need these services after discharge from the hospital. Intervention effectiveness is measured using several performance metrics, including patient wait times for beds, bed utilization, and delays unrelated to bed availability, which capture the efficiency of bed usage. We find that the simulation model captures the current state of the neuroscience services in terms of intraday wait times, and that all modeled interventions lead to significant wait time reductions for patients in the ED and in the perioperative environment. Just-in-time bed assignments reduce average wait times for patients transferring to the neuroscience floor and ICU beds by up to 35% and 48%, respectively, at current throughput levels. Discharges earlier in the day and multi-day length of stay reductions (i.e., interventions 2 and 3) lead to smaller wait time reductions. However, multi-day length of stay reductions decrease bed utilization by up to 4% under our assumptions, and create capacity for throughput increases. Considering the expected cost of implementing these interventions and the reductions in patient wait times, we recommend adopting just-in-time bed assignments to address some of the existing capacity issues. Our simulation shows that this intervention can be combined effectively with earlier discharges and multi-day length of stay reductions at a later point in order to reduce wait times even further
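
As a rough illustration of the kind of patient flow simulation described, the following sketch models a single bed pool as a queue and reports patient wait times; the arrival rate, length of stay, and bed count are assumed values, not MGH data:

```python
# Minimal bed-queue sketch: patients arrive at random, wait for the
# earliest-free bed in a fixed pool, and stay for a random length of
# stay. Arrival rate, stay length, and bed count are assumed values.
import heapq
import random

def simulate(n_beds=20, n_patients=2000, arrival_rate=0.9,
             mean_los=20.0, seed=7):
    random.seed(seed)
    t = 0.0
    free_at = [0.0] * n_beds          # time each bed next becomes free
    heapq.heapify(free_at)
    waits = []
    for _ in range(n_patients):
        t += random.expovariate(arrival_rate)     # next arrival (hours)
        bed_free = heapq.heappop(free_at)         # earliest-free bed
        start = max(t, bed_free)                  # wait if all beds busy
        waits.append(start - t)
        heapq.heappush(free_at, start + random.expovariate(1.0 / mean_los))
    return waits

waits = simulate()
print("mean wait %.1f h; %.0f%% of patients waited"
      % (sum(waits) / len(waits),
         100 * sum(w > 0 for w in waits) / len(waits)))
```

The thesis's interventions map onto this loop: earlier discharges and shorter stays change the length-of-stay distribution, while just-in-time assignment changes when a bed is released to the queue.
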
Enabling strategic fulfillment : a decision support tool for fulfillment network optimization by Bryan Drake

1 edition published in 2012 in English and held by 1 WorldCat member library worldwide

Dell's Third-Party (3P) Product network uses several different order fulfillment methods, though the determination of which products are fulfilled under which method is not clearly delineated. We have developed a tool to assist in the decision-making process for Dell's 3P distribution network. This tool transparently presents the results of cost modeling and forecast variance simulation while maintaining usability to achieve broad adoption and exert influence on product fulfillment method decisions. The cost model created takes into account product, overhead, logistics, and capital costs and has the capability to deal with volume uncertainties through simulation. This tool grounds the discussion around choosing the correct fulfillment method and is a first step toward quantifying the fulfillment method decision
Diagnosing intensive care units and hyperplane cutting for design of optimal production systems by J. Adam Traina

1 edition published in 2015 in English and held by 1 WorldCat member library worldwide

This thesis provides a new framework for understanding how conditions, people, and environments of the Intensive Care Unit (ICU) affect the likelihood that preventable harm will happen to a patient in the ICU. Two years of electronic medical records from seven adult ICUs totalling 77 beds at Beth Israel Deaconess Medical Center (BIDMC) were analysed. Our approach is based on several new ideas. First, instead of measuring safety through frequency measurement of a few relatively rare harms, we leverage electronic databases in the hospital to measure Total Burden of Harm, which is an aggregated measure of a broad range of harms. We believe that this measure better reflects the true level of harm occurring in Intensive Care Units and also provides hope for more statistical power to understand underlying contributors to harm. Second, instead of analysing root causes of specific harms or risk factors of individual patients, we focus on what we call Risk Drivers, which are conditions of the ICU system, people (staff, patients, families) and environments that affect the likelihood of harms to occur, and potentially their outcomes. The underlying premise is that there is a relatively small number of risk drivers which are common to many harms. Moreover, our hope is that the analysis will lead to system level interventions that are not necessarily aiming at a specific harm, but change the quality and safety of the system. Third, using two years of data that includes measurements of harms and driver values for each shift in each of seven ICUs at BIDMC, we develop an innovative statistical approach that identifies important drivers and High and Low Risky States. Risky States are defined through specific combinations of values of Risk Drivers. They define environmental characteristics of ICUs and shifts that are correlated with higher or lower risk level of harms. To develop a measurable set of Risk Drivers, a survey of current ICU quality metrics was conducted and augmented with the clinical experience of senior critical care providers at BIDMC. A robust machine learning algorithm with a series of validation techniques was developed to determine the importance of and interactions between multiple quality metrics. We believe that the method is adaptable to different hospital environments. Sixteen statistically significant Risky States (p < .02) were identified at BIDMC. The harm rates in the Risky States range over a factor of 10, with high risk states comprising more than 13.9% of the total operational time in the ICU and low risk states comprising 38% of total operating shifts. The new methodology and validation technique was developed with the goal of providing basic tools which are adaptable to different hospitals. The algorithm described within serves as the foundation for software under development by Aptima Human Engineering and the VA Hospital network with the goal of validation and implementation in over 150 hospitals. In the second part of this thesis, a new heuristic is developed to facilitate the optimal design of stochastic manufacturing systems. The heuristic converges to optimal or near-optimal results in all test cases in a reasonable length of time. The heuristic allows production system designers to better understand the balance between operating costs, inventory costs, and reliability
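
The Risky States idea can be illustrated with a small sketch: shifts are grouped by combinations of binarized risk-driver values, and each group's harm rate is compared to the overall rate. The drivers, effect sizes, and data below are synthetic placeholders, not the BIDMC analysis:

```python
# Sketch of the Risky States idea: group shifts by combinations of
# binarized risk-driver values and compare each group's harm rate to
# the overall rate with a two-proportion z-score. Drivers, effect
# sizes, and data are synthetic placeholders.
import itertools
import math
import random

random.seed(3)
# Synthetic shift records: (high_census, many_new_staff, harm_occurred)
shifts = []
for _ in range(5000):
    census, new_staff = random.random() < 0.4, random.random() < 0.3
    p_harm = 0.02 + 0.03 * census + 0.02 * new_staff   # assumed effects
    shifts.append((census, new_staff, random.random() < p_harm))

overall = sum(s[2] for s in shifts) / len(shifts)
for state in itertools.product([False, True], repeat=2):
    group = [s for s in shifts if (s[0], s[1]) == state]
    if not group:
        continue
    rate = sum(s[2] for s in group) / len(group)
    se = math.sqrt(overall * (1 - overall) / len(group))
    print("state %-14s n=%4d  harm rate %.3f  z=%+.1f"
          % (state, len(group), rate, (rate - overall) / se))
```

The thesis's machine learning approach searches a far larger driver space with proper validation, but the output has this shape: driver combinations whose harm rates differ significantly from baseline.
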
Investigation of integrally-heated tooling and thermal modeling methodologies for the rapid cure of aerospace composites by Harrison Scott Bromley

1 edition published in 2015 in English and held by 1 WorldCat member library worldwide

Carbon Fiber Reinforced Polymer (CFRP) composite manufacturing requires the CFRP part on the associated tool to be heated, cured, and cooled via a prescribed thermal profile. Current methods use large fixed structures such as ovens and autoclaves to perform this process step; however, heating these large structures takes significant amounts of energy and time. Further, these methods cannot control for different thermal requirements across a more complex or integrated composite structure. This project focused on the following objectives and approaches: - Gather baseline energy and performance data on ovens and autoclaves to compare with estimations of new technologies; - Determine feasibility, applicability, and preliminary thermal performance of proposed heated tooling technologies on certain part families via heat transfer analyses. The project yielded the following results and conclusions: - Proved the capability of the modeling software to mimic an oven cure with less than 3% error in maximum exothermic temperature prediction; - Provided guidelines on when to use 1D, 2D, and 3D heat transfer analyses based on part thickness; - Concluded which sizes/shapes of parts would work best for the single-sided integral heating technologies; - Calculated energy intensity of incumbent technologies for comparison of future experiments on integrally heated tooling. Overall, this project helped steer the team into the next phase of their research of the technology and its applications. It provided recommendations on what types of parts the technology can be used for, as well as quantifying the energy intensity of incumbents for comparison
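
For intuition on the 1D analyses the guidelines mention, here is a minimal explicit finite-difference sketch of through-thickness conduction in a thin laminate on a heated tool; material properties and boundary conditions are illustrative, and the cure exotherm is omitted:

```python
# Minimal 1D explicit finite-difference sketch of through-thickness
# conduction in a laminate heated from the tool side; properties are
# illustrative and the cure exotherm is omitted.
alpha = 4.0e-7            # thermal diffusivity, m^2/s (assumed)
thickness = 0.01          # 10 mm laminate
n = 21                    # grid points through the thickness
dx = thickness / (n - 1)
dt = 0.4 * dx * dx / alpha          # within the explicit stability limit
T = [20.0] * n                      # initial part temperature, C
T_tool = 180.0                      # heated tool face temperature, C

t, t_end = 0.0, 120.0
while t < t_end:
    T[0] = T_tool                   # tool-side boundary: fixed temperature
    T[-1] = T[-2]                   # bag-side boundary: ~insulated
    T = ([T[0]]
         + [T[i] + alpha * dt / dx ** 2 * (T[i + 1] - 2 * T[i] + T[i - 1])
            for i in range(1, n - 1)]
         + [T[-1]])
    t += dt

print("after %.0f s: tool side %.0f C, mid-plane %.0f C, bag side %.0f C"
      % (t_end, T[0], T[n // 2], T[-1]))
```

For thin parts this one-dimensional gradient dominates, which is why the thesis's guidelines reserve 2D and 3D analyses for thicker or more complex geometry.
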
Using analytics to improve delivery performance by Tacy J Napolillo

1 edition published in 2014 in English and held by 1 WorldCat member library worldwide

Delivery Precision is a key performance indicator that measures Nike's ability to deliver product to the customer in full and on time. The objective of the six-month internship was to quantify the areas in the supply chain where the most opportunity resides for improving delivery precision. The Nike supply chain starts when a new product is conceived and ends when the consumer buys the product at retail. In between conception and selling, there are six critical process steps. The project has provided a method to evaluate the entire supply chain and determine the area that has the most opportunity for improvement and therefore needs the most focus. The first step in quantifying the areas with the most opportunity was to identify a framework of the supply chain. The framework includes the target dates that must be met in order to supply product to the customer on schedule and the actual dates that were met. By comparing the target dates to the actual dates, the area of the supply process that caused the delay can be identified. Next, a data model was created that automatically compares the target dates to actual dates for a large and specified set of purchase orders. The model uses the framework and compiles all orders to quantify the areas in the supply chain that present the greatest opportunity. The model was piloted on the North America geography, Women's Training category, Apparel product engine, and Spring 2013 season, for orders shipped to the Distribution Center (DC). The pilot showed that the greatest opportunity lies in the upstream process (prior to the product reaching the consolidator). In particular, the pilot showed that the area with the most opportunity for the sample set was the PO create process. This conclusion was also confirmed with the Running category. The method developed during the internship provides Nike with a way to measure the entire supply chain. By quantifying the areas in the process, Nike can focus and prioritize their efforts on those areas that need the most improvement. In addition, the model created can be scaled for any region, category, or product engine to ultimately improve delivery precision across the entire company
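
The date-comparison framework can be sketched in a few lines: for each purchase order, lateness is computed at every process step against its target, and the step that introduces the most new lateness is charged with the delay. Step names and dates below are illustrative, not Nike's data:

```python
# Sketch of the date-comparison framework: each step's lateness is
# compared with the lateness inherited from the previous step, and the
# step that adds the most lateness gets the blame. Data is illustrative.
from collections import Counter
from datetime import date

STEPS = ["po_create", "production", "consolidator", "ship", "dc_receipt"]

orders = [  # step -> (target date, actual date); one sample PO shown
    {"po_create":    (date(2013, 1, 10), date(2013, 1, 18)),
     "production":   (date(2013, 2, 20), date(2013, 2, 27)),
     "consolidator": (date(2013, 3, 1),  date(2013, 3, 8)),
     "ship":         (date(2013, 3, 5),  date(2013, 3, 12)),
     "dc_receipt":   (date(2013, 4, 1),  date(2013, 4, 8))},
]

blame = Counter()
for po in orders:
    prev_late = 0
    for step in STEPS:
        target, actual = po[step]
        late = (actual - target).days
        if late - prev_late > 0:          # lateness introduced at this step
            blame[step] += late - prev_late
        prev_late = late

for step, days in blame.most_common():
    print("%-13s introduced %d days of delay" % (step, days))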
Cycle-time analysis and improvement using lean methods within a retail distribution center by Hugh Churchill

1 edition published in 2015 in English and held by 1 WorldCat member library worldwide

Fulfillment cycle-time, or the time it takes to pick an item from inventory, pack it into a box, and load it on a truck for shipment, is one of the main inputs in determining how quickly an online retailer can promise customer order delivery. The faster the fulfillment cycle-time, the later an order can be received and still make the appropriate truck for guaranteed, on-time arrival (e.g. same-day, next day, 3-5 business days). Thus, the customer experience is improved, as they are allowed to place an order later and still receive their purchases quickly. To take advantage of this, the retailer must first be able to measure cycle-time appropriately within their facility. This thesis examines the outbound fulfillment process within an under-performing Amazon fulfillment center (Site A) with the purpose of fully characterizing and measuring fulfillment cycle-time. Comparisons are drawn with like Amazon facilities, and a lean operations approach is taken to identify and eliminate major forms of waste in an effort to shorten cycle-time. The baseline analysis within this thesis provides evidence that current-state cycle-time at Site A is in fact 15% faster than originally thought. However, process improvements were still needed to bring cycle-time in line with the network standard. The remainder of the work within this thesis focuses on these process improvements and develops the following recommendations: 1. Standardize the pick process with a move closer to single piece flow. 2. Reduce and control queue length prior to the pack process in order to reduce non-value-added wait time. 3. Reduce batch size for critical items that must move through the facility the fastest. 4. Rearrange process steps to allow completion in parallel rather than series. The method for evaluating cycle-time and the implementation of lean solutions introduced throughout this thesis are useful as a template for similar analyses throughout the Amazon FC network, as well as within other warehousing and online retailer operations
Reducing energy usage in a manufacturing facility through a behavior change based approach by Michael A Norelli

1 edition published in 2010 in English and held by 1 WorldCat member library worldwide

Many companies have developed energy reduction programs for their manufacturing facilities to reduce their operational costs while also decreasing their greenhouse gas emissions. The majority of these manufacturing facilities have made progress in reducing their energy usage through technology changes, such as purchasing more efficient lighting or replacing old chillers; however, these improvements are often capital-intensive. The goal of this thesis is to explore the use of low-cost employee behavior changes to help a manufacturing facility reduce its energy usage. The author conducted a six-month case study at Raytheon's Integrated Air Defense Center (IADC) in which a new approach for achieving energy-related employee behavior changes was implemented. The framework is unique to the author but builds upon lean manufacturing principles, social psychology research, and energy management fundamentals. The approach first raises awareness and engages employees; second, helps employees develop energy saving improvements; and lastly, creates a mechanism to sustain improvements and behavior changes moving forward. The benefits of using such an approach are greater employee engagement (the percentage of employees who participated in a voluntary energy reduction program rose from 38% to 78%), more energy saving ideas being implemented (over 60 employee-generated energy saving improvements were implemented on the manufacturing floor), and, ultimately, a reduction in wasted energy. Additionally, a real-time feedback system was designed and installed that provided manufacturing employees with information on their cell's energy usage. This real-time feedback system was developed to help sustain improvements and further enable energy reductions through employee behavior changes. While specific tactics and tools of the applied approach may be unique to Raytheon's IADC facility, the strategy and insights can be universally applied
Modeling of ICU nursing workload to inform better staffing decisions by Yiyin Ma

1 edition published in 2015 in English and held by 1 WorldCat member library worldwide

Beth Israel Deaconess Medical Center (BIDMC) has partnered with the Gordon and Betty Moore Foundation to eliminate preventable harm in the Intensive Care Unit (ICU). Many medical publications suggest nursing workload as a major contributor to patient safety. However, BIDMC was not using any tool to measure nursing workload, and as a result, nurse staffing decisions were made solely based on the ad hoc judgment of senior nurses. The objective of this thesis is to create a prospective nursing workload measurement and ultimately use it to improve staffing decisions in ICUs. To create a nursing workload measurement, a widely adopted patient-based scoring system, the Therapeutic Intervention Scoring System (TISS), was adapted to BIDMC's ICUs. With consultation from clinicians and nurses, changes were made to the TISS to reflect BIDMC's workflow, and a new nursing workload scoring system called the Nursing Intensity Score (NIS) was created. The NIS for each patient per shift was calculated over a two-year period to gain further insights to improve staffing decisions. Analysis of the current state showed no correlation between nurse staffing and overall patient workload in the unit. In addition, nurses with one patient (1:1 nurses) had significantly less workload than nurses with two patients (1:2 nurses), even though they were expected to be the same. Finally, there was one overworked nurse (above 150% of median nursing workload) in every three shifts in the ICU. A prospective approach to analyzing patient workload was developed by dividing patients based on clinical conditions and categorizing the results on two axes: the nominal workload level and the variability around the nominal value of workload. This analysis suggests that a majority of patients are predictable, including a few patients with high but predictable load. On the other hand, some patients are highly unpredictable. A nursing backup system was proposed to balance workload between 1:1 and 1:2 nurses. To test the proposal, a simulation was developed to model the ICU with the goal of minimizing the number of overworked nurses. The best backup system was a buddy pairing system based on a predictive model of patient conditions, with the resource nurse as the ultimate backup
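
The overworked-nurse check lends itself to a short sketch: each nurse's shift workload is the sum of her patients' NIS values, and anyone above 150% of the shift median is flagged. Scores and assignments below are invented for illustration:

```python
# Sketch of the overworked-nurse check: per-nurse load is the sum of
# assigned patients' NIS values; anyone above 150% of the shift median
# is flagged. Scores and assignments are invented.
from statistics import median

assignments = {            # nurse -> NIS of each assigned patient
    "nurse_a": [46],       # one high-acuity (1:1) patient
    "nurse_b": [28, 31],
    "nurse_c": [22, 25],
    "nurse_d": [35, 38],
    "nurse_e": [30],
}

loads = {n: sum(scores) for n, scores in assignments.items()}
med = median(loads.values())
for nurse, load in sorted(loads.items()):
    flag = "OVERWORKED" if load > 1.5 * med else ""
    print("%-8s load %3d (%3.0f%% of median) %s"
          % (nurse, load, 100 * load / med, flag))
```

A buddy-pairing backup system amounts to rebalancing patients between flagged and unflagged nurses before the threshold is crossed.
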
Characterizing and improving the service level agreement at Amazon by Alberto Luna

1 edition published in 2015 in English and held by 1 WorldCat member library worldwide

Amazon's Service Level Agreement (SLA) is a promise to its customers that they will receive their orders on time. At the Fulfillment Center (FC) level, the SLA is based on the capability to fulfill open orders scheduled to ship at each departure time. Each center's capability depends on a complex interaction between fluctuating product demand and time-dependent processes. By lowering the SLA, Amazon could provide an enhanced customer experience, especially for same day delivery (SDD). However, providing additional time to the customer also means that the FCs have less time available to fulfill open orders, placing those orders at an increased risk of a missed delivery. This thesis explores cycle time reductions and throughput adjustments required to reduce the SLA at one of Amazon's Fulfillment Centers. First, a method to analyze time-dependent cycle time is used to evaluate the individual truck departure times, revealing that the current process conditions have difficulty meeting current demand. Then, using lean principles, process changes are tested to assess their ability to improve the current processes and allow for an SLA reduction. Although a 1% increase in capacity is possible by improving the processes, system constraints make the changes impractical for full implementation. Consequently, a capacity analysis method reveals that additional capacity of up to 9.38% is needed to improve the current process conditions and meet current demand. The capacity analysis also reveals that reducing the SLA from its current state requires up to 13.79% more capacity to achieve a 50% reduction in SLA. Through capacity adjustments, the added cost of late orders is mitigated, resulting in a reduced incidence of orders late to schedule and a reduced risk of missed deliveries. The methods utilized in this thesis are applicable to other Amazon FCs, providing a common capability and capacity analysis to aid in fulfillment operations
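
A minimal sketch of this kind of capacity analysis: for each truck departure, the rate required to clear all units due by that departure is compared with installed capacity, and the worst shortfall is the extra capacity needed. Capacity and volumes below are assumed numbers, not Amazon data:

```python
# Sketch of the capacity analysis: for each departure, the rate needed
# to clear everything due by that truck is compared with installed
# capacity; the worst shortfall is the extra capacity required.
# Capacity and volumes are assumed numbers.
capacity = 2000.0                 # units/hour the FC can process

# (hours until the truck departs, open units due on that truck)
departures = [(2.0, 3600), (5.0, 7000), (9.0, 7500)]

worst = 0.0
cum_units = 0
for hours_left, units in departures:
    cum_units += units                     # all units due by this truck
    required = cum_units / hours_left
    worst = max(worst, required / capacity - 1.0)
    print("truck in %.0f h: need %4.0f u/h (%+5.1f%% vs capacity)"
          % (hours_left, required, 100 * (required / capacity - 1)))

print("additional capacity needed: %.1f%%" % (100 * max(0.0, worst)))
```

Tightening the SLA shrinks every hours_left value, which is why the required capacity climbs as the SLA comes down.
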
Identification of leading indicators for producibility risk in early-stage aerospace product development by Allen J Ball

1 edition published in 2015 in English and held by 1 WorldCat member library worldwide

Producibility is an emergent property of product development and manufacturing systems that encapsulates quality, product compliance, cost, and schedule. Detailed product definition and process variation have traditionally been the focus for understanding the risk of producibility losses. This investigation proposes that while assumptions inherent to product configuration and process selection can significantly impact producibility, producibility risk and realized producibility losses are primarily indicated by organizational design assumptions and the associated phased implementation of programmatic governance. This premise is systematically explored through an assessment of organizational dynamics and product development performance within Aerospace Corporation X. An extension of the hazard analysis technique System Theoretic Process Analysis (STPA) is invoked for leading indicator derivation from assumptions underlying causality of inadequate producibility control. Indicator integration with risk management processes is outlined, and a combination of expert assessments and quality loss correlation is used to validate indicator significance. As a result of these investigations, it is concluded that functional isolation, phased capability and control, and differing performance incentives are central to producibility loss. In addition, these factors are deemed to be more important than product feature-based sources of producibility risk. The extension of STPA for indicator identification is validated, and recommendations are provided for implementation of a leading indicator monitoring program
Understanding supply chain trade-offs through models and scenario planning with a focus on postponement by Concepcion Alexandra Kafka

1 edition published in 2015 in English and held by 1 WorldCat member library worldwide

The two objectives of this project were to develop an understanding of the challenges and opportunities of the supply chain of a family of currently marketed products manufactured overseas and distributed/sold worldwide, and to increase the agility of the supply chain while achieving a target service level of 99% and maintaining or decreasing costs. A model was created to explore the current supply chain as well as the idea of supply chain agility through the implementation of postponement, since models can easily be used to understand cause-and-effect relationships by analyzing any number of possible outcomes in a time- and cost-effective manner. Monthly demand and forecast data were analyzed to determine if there were any biases in the forecasts and to understand the relation between demand and forecast error through the use of a power law model. The forecast and demand data demonstrated a strong log-log relationship between RMSE and demand, implying that there are economies of scale when demand is aggregated. The model shows that the implementation of postponement can reduce overall inventory levels, leading to decreased supply chain costs (if the cost of implementing postponement is less than the savings achieved through the inventory decrease). In looking at air versus ocean transport, the savings coming from inventory reduction due to decreasing lead times outweighed the increase in costs for both supply chain designs. As expected, increasing forecast accuracy leads to a decrease in safety stocks while decreasing forecast accuracy leads to an increase. Finally, increasing demand led to increasing safety stocks and costs while decreasing demand had the opposite effect. For the forecast accuracy and changing demand scenarios there is a larger magnitude of savings for the current design of the supply chain than for one with postponement
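
The power-law relationship can be sketched by fitting log(RMSE) = a + b log(demand) with ordinary least squares and feeding the fitted RMSE into a standard safety stock formula; the data points, service level, and lead time below are assumptions for illustration:

```python
# Sketch of the power-law fit: least-squares fit of
# log(RMSE) = a + b*log(demand), then the fitted RMSE feeds a standard
# safety stock formula. Data, service level, and lead time are assumed.
import math

# (mean monthly demand, forecast RMSE) for several items -- synthetic
points = [(100, 40), (400, 110), (1600, 300), (6400, 800), (25600, 2100)]

xs = [math.log(d) for d, _ in points]
ys = [math.log(r) for _, r in points]
n = len(points)
b = ((n * sum(x * y for x, y in zip(xs, ys)) - sum(xs) * sum(ys))
     / (n * sum(x * x for x in xs) - sum(xs) ** 2))
a = (sum(ys) - b * sum(xs)) / n
print("fit: RMSE = %.2f * demand^%.2f" % (math.exp(a), b))

# b < 1 means RMSE grows slower than demand: aggregated (postponed)
# demand needs proportionally less safety stock.
z = 2.33                          # ~99% cycle service level
lead_time_months = 2.0
for demand in (1000, 10000):
    ss = z * (math.exp(a) * demand ** b) * math.sqrt(lead_time_months)
    print("demand %6d -> safety stock %6.0f (%4.1f%% of demand)"
          % (demand, ss, 100 * ss / demand))
```

An exponent below one is exactly the economies-of-scale effect the abstract describes: pooling demand through postponement lowers relative forecast error and therefore relative safety stock.
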
Predicting adequacy of supplier responses for multi-year government contracts based on supplier performance metrics by David Allen Hahs

1 edition published in 2015 in English and held by 1 WorldCat member library worldwide

Aerospace Company X (ACX) is a designer and manufacturer of advanced aerospace systems and its primary customer is the United States Government (USG). In order to reduce cost and minimize risk, both parties have embraced a multi-year contracting model in which production agreements are signed for up to five-year periods. This yields significant cost savings over single-year contracts while providing predictable production levels for ACX and its suppliers. At the time of this research, the company was soliciting bids from suppliers for the next five-year multi-year contract. Since this is a sole-source situation, ACX must substantiate all costs to justify that the pricing is fair and reasonable. Costs of purchased hardware are substantiated through three primary means: competition, commerciality, and cost-price analysis. Competition is preferred because the pricing can be justified by free-market forces. However, due to intellectual property rights or unique capabilities, suppliers are often contracted as sole-source. The supplier then can claim commerciality (i.e., the part is for sale commercially) or submit to a complete cost review of material, labor, and overhead rates. In some cases the supplier will not release this data to ACX and a government agency performs the review. The success of the cost substantiation phase hinges on getting complete and accurate data from suppliers in a timely manner. This thesis explores the challenges of obtaining cost data from suppliers and proposes recommendations that can be applied to general supplier management situations. First, a metric of proposal adequacy is developed and used to score the adequacy of each received bid. These scores are then analyzed to determine if there is any correlation with the existing enterprise ACX supplier rating system. Finally, recommendations for process improvements are made, focusing on communication, IT systems, and standard work
Methods for predicting inventory levels in a segmented retail supply chain by Ryan Jacobs

1 edition published in 2015 in English and held by 1 WorldCat member library worldwide

Inventory is the largest asset on Nike's balance sheet ($3.9 billion on May 31, 2014) and a key indicator of supply chain health. With new markets, products, and channels being added to Nike's sales portfolio each year, the environment in which Nike's supply chain must operate is becoming increasingly complex. Nike has responded to this complexity by splintering their supply chain into smaller segments, tailoring each segment to specific market and consumer needs. As a result of these market developments and Nike's organizational response, the task of understanding and predicting inventory movements has become increasingly challenging for Nike's business planning teams. This project creates an analytical method by which Nike can combine historical supply chain performance with sales forecasts to accurately predict future changes to company inventory levels. To achieve this goal and facilitate simple and flexible inventory predictions, a model was developed around the key segmentation dimensions that define Nike's supply chain. Use of this model enables Nike's senior management team to accurately predict movements in inventory due to product mix changes in the baseline sales forecasts. Additionally, the model provides Nike with a mechanism to evaluate sensitivity to forecast errors and the inventory costs associated with key strategic decisions to grow or shrink segments of their business. Preliminary results from the model over the time period FY15 - FY18 show a 2% increase in baseline inventory by the end of FY18 due both to growth in Apparel relative to Footwear and to growth in Direct-to-Consumer relative to Wholesale. This upward pressure on inventory leaves Nike in a precarious spot with Wall Street analysts, who associate inventory growth relative to sales with poor marketplace performance. By carefully segmenting inventory, applying segment-specific forecasts, and analyzing aggregated results through the use of the model, Nike can more accurately predict and explain movements in inventory to shareholders
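
One plausible reading of the segmented method is sketched below: each segment's historical inventory-to-sales ratio is applied to its sales forecast and the results are aggregated, so that mix shifts toward higher-ratio segments push total inventory up faster than sales. Segments, ratios, and forecasts are invented for illustration:

```python
# Sketch of the segmented projection: apply each segment's historical
# inventory-to-sales ratio to its sales forecast and aggregate. All
# segments, ratios, and forecasts are invented.
segments = {
    # segment: (inventory/sales ratio, FY15 sales $B, FY18 forecast $B)
    "Footwear/Wholesale": (0.28, 12.0, 12.6),
    "Apparel/Wholesale":  (0.36,  6.0,  6.9),
    "Footwear/DTC":       (0.40,  3.0,  3.6),
    "Apparel/DTC":        (0.48,  1.5,  2.0),
}

inv = [sum(r * s[i] for r, *s in segments.values()) for i in (0, 1)]
sal = [sum(s[i] for _, *s in segments.values()) for i in (0, 1)]
print("sales     $%.1fB -> $%.1fB (%+.0f%%)"
      % (sal[0], sal[1], 100 * (sal[1] / sal[0] - 1)))
print("inventory $%.1fB -> $%.1fB (%+.0f%%)"
      % (inv[0], inv[1], 100 * (inv[1] / inv[0] - 1)))
# Inventory outpaces sales here because the faster-growing segments
# (Apparel, DTC) carry higher inventory-to-sales ratios.
```
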
Modeling the effects of advanced automation and process design on Cell Line Development by Ryan Shofnos

1 edition published in 2015 in English and held by 1 WorldCat member library worldwide

Research and development of biologic drugs is a time- and resource-intensive process that can span several years and billions of dollars. Any improvements in the efficiency and end-to-end cycle time of this process provide value to producers in the form of reducing at-risk investment in new drug programs and improving speed to market. Cell Line Development (CLD), a major portion of the research and development lifecycle, is responsible for creating the parent cell for these new drug programs. The biotechnology industry has made great gains in CLD technologies and procedures, though many fields continue to advance and can further contribute to improved operational efficiency. This thesis proposes a methodology for evaluating CLD systems, characterizing alternative processes and technologies, and determining the ideal investments that can maximize system efficiency and processing speed. Approaches that are currently available in the industry are reviewed and used as model inputs to determine realistic short-term gains. Furthermore, nascent technologies that may reach industrial applicability are considered for an additional potential system design. Pfizer's CLD system is used as a case study, in which it is shown that total system utilization and cycle time can be improved by 29.6% and 8.8%, respectively, through the use of currently available technologies and procedures. The costs and risks of the new approaches are reviewed and found to be small when compared with these gains. As technologies continue to develop in the future, they may further improve CLD system performance. However, the majority of gains are achieved by applying currently available approaches
Business case assessment of unmanned systems level of autonomy by Edward W Liu

1 edition published in 2012 in English and held by 1 WorldCat member library worldwide

The federal government has continually increased its spending on unmanned aerial vehicles (UAVs) during the past decade. Efforts to drive down UAV costs have primarily focused on the physical characteristics of the UAV, such as weight, size, and shape. Whereas the UAV business in the federal sector is saturated, the civilian sector is far less penetrated. Hence, companies see this as an opportunity to establish themselves as the standard bearer in this sector. This thesis addresses how Boeing can establish guidelines for business strategies in UAV offerings to potential clients. The key innovation introduced is a modeling tool focused on simulation/trending and sensitivity analysis to help provide insight into what these guidelines should be. The modeling tool quantifies many of the benefits and costs of the components and features of the production and utilization of UAVs. Other notable recommendations include defining a new data recording process to obtain sets of sample data to validate the results of the modeling tool, and streamlining the complexity of additional features and enhancements that will be incorporated in future versions of the modeling tool
Driving cycle time reduction through an improved material flow process in the electronics assembly manufacturing cell by Paul Millerd (Book)

1 edition published in 2012 in English and held by 1 WorldCat member library worldwide

Many companies have implemented lean and six sigma programs over the past twenty years. Lean has been a proven system that has eliminated waste and created value at many companies throughout the world. Raytheon IDS's lean program, "Raytheon Six Sigma," became a top priority in the past ten years at the Integrated Air Defense Center (IADC) in Andover, MA. However, as Raytheon's corporate goals state, they want to take this further and bring "Raytheon Six Sigma" to the next level, fully engaging customers and partners. A focus of this continuous improvement effort was the Electronics Assembly (EA) Rack manufacturing cell, which was experiencing high levels of cycle time variability. To help reduce cycle times within the cell, a continuous improvement project was undertaken to improve the material flow process. A current state analysis of the process showed an opportunity to improve process standardization and prioritization while lowering inventory levels. In addition to working with managers from EA to evaluate the material flow process, a kitting cart was developed with a cross-functional project team to serve as a tool to help improve the process. Although the improvements were not rolled out to the entire cell during the project, a successful pilot was conducted that helped improve engagement with operators and create a path for future success
Long range planning of biologics process development and clinical trial material supply process by Emily Edwards

1 edition published in 2011 in English and held by 1 WorldCat member library worldwide

This thesis investigates the feasibility of using a Monte Carlo simulation model to forecast the financial, personnel, and manufacturing capacity resources needed for biologic drug development. Accurate forecasting is integral across industries for making strong long-term, strategic decisions, and it is an area many companies struggle with. The resources required for the development of a biologic drug are especially hard to estimate due to the variability in the time and probability of success of each development phase. However, in the pharmaceutical industry, getting products to market faster allows the company more time to recoup the substantial development investments before the patent expires and also potentially has a large impact on a company's market share. For these reasons, Novartis Biologics wanted to develop a simulation model to provide an objective opinion and assist them in their long-range planning. This thesis describes the design, development, and functionalities of the resultant model. During validation runs, the model demonstrated accuracy of greater than 90% when compared against historical data for headcount, number of campaigns, costs, and projects per year. In addition, the model contains Monte Carlo simulation capabilities to allow users to forecast variability and test the sensitivity of the results. This demonstrates the model can be confidently used by project management, operations, and finance to predict their respective future resource needs
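
A minimal Monte Carlo sketch of this kind of phase-based planning model follows: each development phase draws a duration and passes or fails with some probability, and many simulated programs yield time-to-market and success-rate distributions. The phase parameters are illustrative assumptions, not Novartis figures:

```python
# Monte Carlo sketch of phase-based development planning: each phase
# draws a duration and succeeds with some probability; repeated runs
# give time-to-market and success-rate distributions. Parameters are
# illustrative assumptions.
import random

# phase: (min years, max years, probability of success)
PHASES = [("preclinical", 1.0, 2.0, 0.60),
          ("phase_1",     1.0, 1.5, 0.65),
          ("phase_2",     1.5, 2.5, 0.40),
          ("phase_3",     2.5, 3.5, 0.60)]

def simulate_program(rng):
    """Return (years spent, reached_market) for one drug program."""
    years = 0.0
    for _name, lo, hi, p_success in PHASES:
        years += rng.uniform(lo, hi)
        if rng.random() > p_success:
            return years, False          # program fails in this phase
    return years, True

rng = random.Random(42)
runs = [simulate_program(rng) for _ in range(100_000)]
wins = [y for y, ok in runs if ok]
print("P(reach market) = %.1f%%" % (100 * len(wins) / len(runs)))
print("mean time to market (successes): %.1f years"
      % (sum(wins) / len(wins)))
```

Layering headcount, campaign, and cost rates on top of such simulated program trajectories is what turns the distribution into a resource forecast.
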
Waveless picking : managing the system and making the case for adoption and change by G. Todd Bishop

1 edition published in 2010 in English and held by 1 WorldCat member library worldwide

Wave-based picking systems have been used as the standard for warehouse order fulfillment for many years. Waveless picking has emerged in recent years as an alternative pick scheduling system, with proponents touting the productivity and throughput gains within such a system. This paper analyzes in more depth the differences between these two types of systems and offers insight into the comparative advantages and disadvantages of each. While a select few pieces of literature perform some analyses of wave vs. waveless picking, this paper uses a case study of a waveless picking system in an Amazon.com fulfillment center as a model for how to manage a waveless system once it has been adopted. Optimization methods for decreasing chute-dwell time and increasing throughput by utilizing tote prioritization are also performed using discrete-simulation modeling. The analysis concludes that managing waveless picking warehouse flow by controlling the allowable quantity of partially picked orders to match downstream chute capacity can lead to reduced control over cycle times and customer experience. Suggestions are also made on possible future research for how to optimally implement a cycle-time controlled system
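
The tote prioritization idea can be sketched with a small priority queue: waiting totes are ranked so that those completing long-dwelling, nearly finished orders are inducted first, which works against chute-dwell time. Orders, totes, and dwell values below are invented for illustration:

```python
# Sketch of tote prioritization: rank waiting totes so those that
# finish long-dwelling, nearly complete orders are inducted first.
# Orders, totes, and dwell values are invented.
import heapq

# open orders: id -> (chute dwell so far in minutes, items still missing)
open_orders = {"A": (42, 1), "B": (30, 3), "C": (8, 1)}

# totes waiting for induction: id -> order ids whose items it carries
totes = {"t1": ["B", "C"], "t2": ["A"], "t3": ["C"]}

def priority(tote):
    # Favor the tote whose best order has high dwell and few items left.
    return -max(open_orders[o][0] / open_orders[o][1] for o in totes[tote])

queue = [(priority(t), t) for t in totes]
heapq.heapify(queue)
while queue:
    _, tote = heapq.heappop(queue)
    print("induct %s -> serves orders %s" % (tote, totes[tote]))
```

Here t2 is inducted first because it closes out order A, the longest-dwelling, nearly complete order; a real system would re-score the queue as chute states change.
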
 
Audience Level
Audience level: 0.60 (from 0.34 for Do the rig ... to 0.68 for Predicting ...)

Alternative Names

controlled identity: Leaders for Manufacturing Program

LGO

Massachusetts Institute of Technology. Leaders for Global Operations Program

MIT LGO

Sloan School of Management. Leaders for Global Operations Program

Languages
English (20)