Creative Education
Vol. 07, No. 01 (2016), Article ID: 63087, 9 pages
10.4236/ce.2016.71011

New Product Development―Experience from Distance Learning and Simulation-Based Training

Avraham Shtub

Faculty of Industrial Engineering and Management, Technion-Israel Institute of Technology, Haifa, Israel

Copyright © 2016 by author and Scientific Research Publishing Inc.

This work is licensed under the Creative Commons Attribution International License (CC BY).

http://creativecommons.org/licenses/by/4.0/

Received 25 May 2015; accepted 24 January 2016; published 27 January 2016

ABSTRACT

This paper presents our experience teaching how to manage New Product Development (NPD) projects to students from several universities―members of an international school network. A combination of lectures, problem solving and hands-on experience provided by a Simulation-Based Learning platform made it possible for students from universities distributed around the globe to work as NPD teams that develop new products in a virtual environment.

Keywords:

Flipped Class, Simulation-Based Training, Gamification, Project Management, New Product Development

1. Introduction

Innovators developing new products need a keen awareness of both the global and local environments in which their products will be sold. A new course, titled "New Product Development Projects", was developed at the Technion-Israel Institute of Technology and taught to students at member schools of the Global Network for Advanced Management around the world (http://advancedmanagement.net/whos).

The Global Network for Advanced Management is a group of business schools from both economically strong countries and those on the horizon of economic development. Each network member benefits from the perspective and intellectual contribution of every other member. The member schools are:

1) Asian Institute of Management (The Philippines);

2) EGADE Business School, Tecnológico de Monterrey (Mexico);

3) ESMT European School of Management and Technology (Germany);

4) FGV Escola de Administração de Empresas de São Paulo (Brazil);

5) Fudan University School of Management (China);

6) HEC Paris (France);

7) Hitotsubashi University, Graduate School of International Corporate Strategy (Japan);

8) Hong Kong University of Science and Technology Business School (China);

9) IE Business School (Spain);

10) IMD (Switzerland);

11) INCAE Business School (Costa Rica, Nicaragua);

12) Indian Institute of Management Bangalore (India);

13) INSEAD (France, Singapore);

14) Koç University Graduate School of Business (Turkey);

15) Lagos Business School, Pan-Atlantic University (Nigeria);

16) London School of Economics and Political Science, Department of Management (United Kingdom);

17) National University of Singapore Business School (Singapore);

18) Pontificia Universidad Católica De Chile School of Business (Chile);

19) Renmin University of China School of Business (China);

20) Sauder School of Business, University of British Columbia (Canada);

21) Seoul National University Business School (South Korea);

22) Technion-Israel Institute of Technology (Israel);

23) UCD Michael Smurfit Graduate Business School (Ireland);

24) University of Cape Town Graduate School of Business (South Africa);

25) University of Ghana Business School (Ghana);

26) Universitas Indonesia Faculty of Economics (Indonesia);

27) Yale School of Management (USA).

In the NPD course, students from the member schools attend lectures and discussions delivered through an online video conference platform and develop NPD projects in virtual distributed teams using the Project Team Builder (PTB), a simulator that models the new product development process.

Using the PTB software, student teams composed of students from different GNAM member schools follow the development life cycle of a new product from its inception to its practical implementation, facing questions regarding available resources, time management, and production goals. The students learn how to develop and test an efficient NPD project plan for new products and services.

The focus of the course is on “glocal” products: goods that are global in their conception, but locally targeted. Multinational businesses adapt a popular product from one country or region to another. To be successful, managers must understand how the local culture and environment will impact sales.

In the lectures, specific case studies are discussed―for example, how products like the Big Mac sandwich and Chicken McNuggets were adapted for India, where many consumers don't eat beef and some are entirely vegetarian. This was done by creating the Chicken Maharaja Mac sandwich and Veggie McNuggets. The end result for the company is a product that is more profitable than one that attempts to be universal.

The students learn how to analyze the difference between the needs and expectations of customers within different countries and develop a product to satisfy those needs.

Through the Project Team Builder software, students can take risks without suffering the consequences they could face in the real world. Students can rewind or fast-forward the development process within the software, in order to see what challenges they may face and how one decision can impact choices in the future.

Uncertainty is typical of NPD projects. This uncertainty leads to risks (and opportunities) and to the need for proper risk management. Simulation-Based Training (SBT) offers a unique approach to teaching and training the management of NPD projects: trainees practice in a risky environment while the cost of mistakes remains negligible.

In the following sections, the need for SBT in the NPD project management training arena is discussed, a specific tool for SBT in project management is presented, and the students' feedback and performance are reported.

2. The Need

The need for experienced, well-trained NPD project managers and teams is growing quickly. The number of undergraduate and graduate programs that offer NPD courses is a good indication, as is the number of books on NPD projects and the number of case studies and other teaching materials developed around the globe. As in many other fields, lectures, books and case studies are not enough, and on-the-job training is an important part of the development of NPD teams and project managers. In some fields, sophisticated simulators replace on-the-job training or reduce it to a minimum while ensuring that the quality of training is the highest possible. This is common, for example, in training pilots, who spend many hours on advanced simulators to save the high cost of actual flights. The cost of on-the-job training in this case should also include the cost of risks associated with mistakes frequently made by inexperienced pilots. In a similar way, training NPD project managers and team members on the job is expensive due to the high cost of mistakes made by inexperienced managers.

The fast development of theoretical knowledge, methodologies, tools and techniques for NPD projects has not been accompanied by similar progress in the development of teaching and training tools. Traditional teaching based on textbooks, articles and case studies is still the backbone of most training programs in this area.

3. The Use of Simulation in the Learning Process

Confucius said: “I hear and I forget. I see and I remember. I do and I understand.”

This is the essence of Simulation-Based Training. We must do things ourselves in order to really understand them. In his article, James I. Grieshop (1987) stated that: "Games and simulations (ranging from role playing to case studies, from guided fantasy to problem solving) have become widely recognized methods for instruction and learning. Since the early work in the United States in the late 1950s and in Europe in the late 1960s, gaming/simulation has become increasingly important to training and decision-making processes in academic settings as well as in business, the military, and the social sciences."

More than twenty years later, the same trend persists: games and simulations are recognized as important facilitators of learning and training processes in many fields.

Grieshop (1987) listed some of the benefits of games and simulations:

1) Emphasizes questioning over answering on the part of players.

2) Provides opportunities to examine critically the assumptions and implications that underlie various decisions.

3) Exposes the nature of problems and possible solution paths.

4) Creates an environment for learning that generates discovery learning.

5) Promotes skills in communicating, role-taking, problem solving, leading, and decision-making.

6) Increases the motivation and interest in a subject matter.

Grieshop (1987) states that evidence is offered for:

1) Increased retention,

2) Energizing the learning process,

3) Facilitation of understanding the relationships between areas within a subject matter.

Since the publication of Grieshop's work in 1987, simulation has been used for training in a wide range of fields: in engineering (International Journal of Engineering in Education, Special Issue on Simulators for Engineering Education and for Professional Development, 2009), in the management of quality (Wang, 2004), in supply chain management (Knoppen & Sáenz, 2007), and in process re-engineering (Smeds & Riis, 1998; Thoben, Hauge, Smeds, & Riis, 2007). Empirical research (Millians, 1999; Ruben, 1999; Randel, Morris, Wetzel, & Whitehill, 1992; Wolfe & Keys, 1997; Meijer, Hofstede, Beers, & Omta, 2006) expanded our knowledge of this training approach, presenting new ways of understanding and implementing simulation for training. Today it is widely accepted that learning through simulation is based on three pillars (Keys, 1976; Kolb, 1984; Kirby, 1992):

1) Learning from content―the dissemination of new ideas, principles, or concepts.

2) Learning from experience―an opportunity to apply content.

3) Learning from feedback―the results of actions taken and the relationship between the actions and performance.

A well-designed simulator supports a process of action-based learning. Instead of talking about different ways of doing things, simulators offer an opportunity to experiment with various methods without risking the consequences of doing so in the real world.

Simulators create an environment that requires the participant to be involved in a meaningful task. The source of learning is what the participants do rather than what they are told by the trainer.

Thompson, Purdy, & Fandt (1997) list the advantages of using simulations as a learning tool:

1) Simulators enable the acquisition of practical experience and provide an immediate response of the learned system to the user's decisions and actions.

2) Simulators offer a realistic model of the interdependence of decisions that the trainee makes.

3) Simulation-based training reduces the gaps between the learning environment and the “real” environment.

4) Simulators facilitate training in situations that are difficult to obtain in the “real world”.

5) Simulations promote active learning, especially at the stage of debates that arise because of the complexity, interconnectedness, and novelty of decision-making.

Wolfe (1993) notes that simulations develop critical and strategic thinking skills. He claims that the skills of strategic planning and thinking are not easy to develop and that the advantage of simulators is that they provide a strong tool for dealing with this problem.

An important development in the design of training simulators is to provide learners with automatic or semi-automatic feedback on their progress. A learning history mechanism was used in several simulation-based teaching tools. The user of these systems obtains access to past states and decisions and to the consequences of these decisions. Learning histories encourage users to monitor their behavior and reflect on their progress (Beyerlein, Ford, & Apple, 1996; Guzdial, Kolodner, Hmelo, Narayanan, Carlso, & Rappin, et al., 1996). Learning histories enable analysis of the decision-making process, as opposed to analysis of results only, and are therefore very effective because the direct influence of the user's actions can be seen. For example, learning history is used as a quality improvement tool for programmers (Prechelt, 2001).

The most basic view of history recording and inquiry is the temporal sequence of actions and events. In its simplest form, user actions are logged and recorded, and are then accessible in various ways for recovery and backtracking purposes (Vargo, Brown, & Swierenga, 1992). Such a mechanism is commonly exposed as "undo". Several recovery mechanisms have been developed using simple undo or undo/redo (e.g., see Archer, Conway, & Schneider, 1984).
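As a purely illustrative sketch (assumed here, not code from any of the cited systems), the following Python class shows one way such a temporal action log with undo/redo backtracking could be organized:

```python
# Minimal sketch of an action log supporting undo/redo backtracking.
# Hypothetical structure for illustration only.

class ActionHistory:
    """Temporal log of user actions with undo/redo recovery."""

    def __init__(self):
        self._done = []    # actions already applied, oldest first
        self._undone = []  # actions rolled back, most recent first

    def record(self, action):
        """Log a new action; pending redo history is discarded, as in typical undo models."""
        self._done.append(action)
        self._undone.clear()

    def undo(self):
        """Roll back the most recent action, if any, and return it."""
        if not self._done:
            return None
        action = self._done.pop()
        self._undone.append(action)
        return action

    def redo(self):
        """Re-apply the most recently undone action, if any, and return it."""
        if not self._undone:
            return None
        action = self._undone.pop()
        self._done.append(action)
        return action


# Example: log three decisions, back up one step, then re-apply it.
history = ActionHistory()
for decision in ["hire engineer", "start task A", "split task A"]:
    history.record(decision)
print(history.undo())  # "split task A"
print(history.redo())  # "split task A"
```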

Parush, Hamm, & Shtub (2002) described simulation-based teaching of the order fulfillment process in a manufacturing context, using the Operations Trainer (Shtub, 1999; 2001) with a built-in learning history recording and inquiry mechanism. The study addressed two basic questions:

1) Can history recording and inquiry affect the learning curve during the training phase with the simulator?

2) Can history recording and inquiry affect the transfer of what was learnt with the simulator?

The findings showed that with learning history recording and inquiry available to users, better performance was obtained during the learning process itself. In addition, the performance of learners with the history mechanism transferred better to a different context, compared to learners without the history mechanism. The studies reviewed above demonstrated that having an opportunity to review learning history had a positive impact on learning. However, these studies did not examine whether the mode of history recording could have an impact on learning. History recording can be done either by an automatic mechanism or under learner control. In automatic history recording, the training system (e.g., the simulator) determines when to record a given state in the learning process. These recording points are predetermined by the simulator designer or the instructor who prepares the training program; the learner is not involved in the decision of when to keep a specific state. In contrast, in a learner-controlled mode, the learner determines if and when to keep a specific state in the learning process. It has been shown, however, that giving learners some control over the learning environment by letting them actively construct the acquired knowledge produces better learning (Cuevas, Fiore, Bowers, & Salas, 2004).
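A minimal sketch, in Python and with hypothetical names (LearningHistory, on_tick, save), of how the two recording modes might differ; this is an assumption for illustration, not the actual mechanism of the simulators discussed here:

```python
# Contrast between automatic recording (snapshots taken at points predetermined
# by the scenario designer) and learner-controlled recording (the trainee decides).
import copy

class LearningHistory:
    def __init__(self, auto_points=None):
        # auto_points: simulation times at which a snapshot is taken automatically;
        # if None, recording is entirely under the learner's control.
        self.auto_points = set(auto_points or [])
        self.snapshots = {}  # simulation time -> saved state

    def on_tick(self, sim_time, state):
        """Automatic mode: the simulator calls this every period."""
        if sim_time in self.auto_points:
            self.snapshots[sim_time] = copy.deepcopy(state)

    def save(self, sim_time, state):
        """Learner-controlled mode: the trainee decides when to keep a state."""
        self.snapshots[sim_time] = copy.deepcopy(state)

    def restore(self, sim_time):
        """Inquiry: return a saved state so the trainee can review it or restart from it."""
        return self.snapshots.get(sim_time)


# Usage sketch: automatic snapshots at weeks 5 and 10; the trainee also saves week 7.
history = LearningHistory(auto_points=[5, 10])
state = {"cash": 100_000, "completed_tasks": []}
for week in range(1, 13):
    state["cash"] -= 2_000
    history.on_tick(week, state)
history.save(7, {"cash": 86_000, "completed_tasks": ["concept design"]})
print(sorted(history.snapshots))  # [5, 7, 10]
```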

The successful use of a simulator for teaching project management was reported in several studies (Davidovitch, Parush, & Shtub, 2006; 2008; 2009). The simulator, called the Project Management Trainer (PMT), was used in those studies as a teaching aid designed to facilitate the learning of project management in a dynamic, stochastic environment. The research focused on the effect of the history recording mechanism on the learning process. Two types of history mechanisms were tested: the automatic history mechanism, in which predefined scenario states are always saved, and the manual history mechanism, in which the trainee had to be actively involved and save selected states manually. In Davidovitch, Parush, & Shtub (2006), the study focused on how project managers' decisions to record the history affected the learning process and on the effects of history inquiry when the ability to restart the simulation from a past state is not enabled. In Davidovitch, Parush, & Shtub (2008), the study focused on the forgetting phenomenon and on how the length of a break period and the history mode affected the learning, forgetting, and relearning (LFR) process. Both studies revealed that history recording improved learning; furthermore, learners using the manual history mechanism achieved the best results.

The issue of a simulator’s functional fidelity is also of great interest. The fidelity of a simulator is a measure of its deviation from the real situation; it has three dimensions: perceptual, functional and model fidelity. Perceptual fidelity refers to the level of realism it evokes in terms of its look and feel relative to the real system. Functional fidelity refers to the way users or trainees use and control the simulation, its behavior and responses to user actions. Finally, model fidelity refers to the extent to which the mathematical or logical model underlying the simulation is close to the real processes and phenomena.

The fidelity of the simulator has been recognized as a critical factor influencing the transfer of learning (Alessi, 1998) . In order to provide a higher level of functional fidelity, the PTB simulator includes two functionalities: the ability to control the level of human resources and the ability to control the execution of the tasks. These functionalities are made available to trainees as part of the scenario development. The ability to control the level of human resources refers to the decision to hire or fire employees in accordance with the changing demand for resources during the project execution; the project manager can control the number of employees in the project in order to match availability to needs. The ability to control the execution of the tasks refers to the decision to split tasks during execution―a task can begin, stop for a while and continue later.
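The sketch below is a hypothetical toy model, not the PTB implementation, illustrating what these two functionalities amount to: adjusting the number of employees during execution, and pausing and later resuming (splitting) a task.

```python
# Toy model of resource-level control (hire/fire) and task splitting.
# All names and structure are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class Task:
    name: str
    remaining_work: float          # person-periods still required
    active: bool = True            # False while the task is split (paused)

@dataclass
class ProjectState:
    employees: int
    tasks: list = field(default_factory=list)

    def hire(self, n):             # match resource availability to demand
        self.employees += n

    def fire(self, n):
        self.employees = max(0, self.employees - n)

    def split(self, task):         # pause a task; it can be resumed later
        task.active = False

    def resume(self, task):
        task.active = True

    def advance_period(self):
        """Spread available employees evenly over the active tasks for one period."""
        active = [t for t in self.tasks if t.active and t.remaining_work > 0]
        if not active:
            return
        effort = self.employees / len(active)
        for t in active:
            t.remaining_work = max(0.0, t.remaining_work - effort)


# Usage sketch: two tasks, one of which is split for a period while staff is reduced.
project = ProjectState(employees=4, tasks=[Task("design", 12.0), Task("testing", 6.0)])
project.split(project.tasks[1])   # pause "testing"
project.fire(2)                   # scale the team down while only "design" is active
project.advance_period()
project.resume(project.tasks[1])
project.hire(2)
project.advance_period()
print([(t.name, round(t.remaining_work, 1)) for t in project.tasks])
```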

Davidovitch, Parush, & Shtub (2009) found that higher fidelity improved performance both in the learning phase and in the transfer phase to a different scenario.

4. Specific Example―The Project Team Builder (PTB)

The Project Team Builder (PTB) is a training aid designed to facilitate the training of project management in a dynamic, stochastic environment. The design of PTB is based on the research findings described in the previous sections. Thus, for example, PTB provides high fidelity by supporting the simulation of any (real or imaginary) project. Another example is the history mechanism built into the PTB that allows the user to go back in simulation time to review past decisions and to restart the simulation from any past simulation time.

The PTB is based on the following principles:

・ A simulation approach―the PTB simulates one or more projects or several work packages of the same project. The simulation is controlled by a simple user interface and no knowledge of simulation or simulation languages is required.

・ A case study approach―the PTB is based on the simulation of case studies called scenarios. Each case study is a project or a collection of projects performed in a dynamic stochastic environment. In some scenarios the projects are performed under schedule, budget and resource constraints. The details of these case studies are built into the simulation, while all the data required for analysis and decision-making is easily accessed through the user interface (a minimal illustrative scenario sketch follows this list).

・ A dynamic approach―the case studies built into the PTB are dynamic in the sense that the situation changes over time. A random effect is introduced to simulate the uncertainty in the environment, and decisions made by the user cause changes in the state of the system simulated.

・ A model-based approach―a decision support system is built into the PTB. This system is based on project management concepts. The model base contains well-known models for scheduling, budgeting, resource management, as well as monitoring and control. These models can be consulted at any time.

・ To support decision-making further, a database is built into the PTB. Data on the current state of the simulated system is readily available to users; it is possible to use the data as input to the models in the model base to support decision-making. Furthermore, by using special history mechanisms users can access data on their past decisions and their consequences.

・ User friendliness and GUI―the PTB is designed as a teaching and training tool. As such, its Graphic User Interface (GUI) is friendly and easy to learn. Although quite complicated scenarios can be simulated, and the decision support tools are sophisticated, a typical user can learn how to use the PTB within an hour.

・ An integrated approach―several projects can be managed simultaneously on the PTB. These projects can share the same resources and a common cash flow.

・ Integration of processes: planning processes, executing processes and monitoring and controlling processes. All these processes are performed simultaneously in a dynamic stochastic environment.

・ Integration with commercial project management tools―the PTB is integrated with Microsoft Project so that users can export the data to Microsoft Project in order to analyze the scenario and support their decisions with commercially available tools.
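To make the notion of a scenario concrete, here is a self-contained toy sketch (illustrative data and names only, not a PTB scenario file) of tasks with stochastic durations executed period by period under a cash constraint:

```python
# Toy "scenario": tasks with stochastic durations and costs, executed week by
# week under a cash constraint; the state changes as the simulation clock advances.
import random

random.seed(42)

scenario = {
    "cash": 100_000,
    "tasks": [
        # (name, mean duration in weeks, weekly cost)
        ("concept design", 4, 5_000),
        ("prototype", 6, 8_000),
        ("market test", 3, 4_000),
    ],
}

week, cash = 0, scenario["cash"]
budget_exhausted = False
for name, mean_duration, weekly_cost in scenario["tasks"]:
    # Random effect: the actual duration deviates from the planned mean.
    duration = max(1, round(random.gauss(mean_duration, mean_duration * 0.2)))
    for _ in range(duration):
        week += 1
        cash -= weekly_cost
        if cash < 0:
            budget_exhausted = True
            print(f"Week {week}: budget exhausted during '{name}'")
            break
    if budget_exhausted:
        break

print(f"Simulation ended at week {week} with {cash:,.0f} cash remaining")
```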

5. Specific Example―The GNAM Course

In successful business organizations, a high percentage of sales is tied to the successful introduction of new products and services. Because the failure rate of these introductions is also high, there is a need for tools and techniques to manage NPD projects. The course focuses on the tools, techniques and best practices developed to support projects aimed at developing and marketing new products and systems. It aims to teach the tools and techniques that support the NPD process, to provide insight from real NPD success and failure case studies, and to apply the tools, techniques and insights in a simulated environment.

Each student is assigned to a team. Each team selects two real NPD cases, presents them in class, and leads a discussion on the lessons learned:

1) A product or a service that was thought to be the “next big innovation”, but failed.

2) A product or a service that was a “big innovation success”.

In addition, each team "develops" a new product using the Project Team Builder simulator. The team prepares an NPD plan and executes it on the simulator. A final report is submitted along with the NPD project plan and the results of executing the plan on the PTB.

6. Experiments and Results

Iluz and Shtub (2013) conducted controlled experiments to test the Project Team Builder as a teaching tool:

6.1. Experiment #1: Individual Participants

Three groups participated in this experiment:

・ A group of 16 very experienced project managers with experience of over 5 years.

・ A group of 17 experienced project managers with experience under 5 years.

・ 18 graduate students.

The essence of the experiment was to let the trainees "manage" a new product development project themselves. Their goal was to optimize the ratio between system performance and cost (a cost-benefit analysis). Upon completion of the simulation, each participant was handed a questionnaire focused on tradeoff analysis and decision making.

6.2. Experiment #2: Project Teams

Nineteen project teams participated in this experiment with a sample size (i.e., the number of participants) of N = 57. Both PTB and Microsoft Project (MSP) were used as teaching tools and a crossover (PTB/MSP) experiment was designed to test whether SBT improves tradeoff analysis and decision making.

Participants were randomly divided into teams and roles, each including a Project Manager, a Systems Engineer, and a Quality Assurance Engineer. The teams’ target was to optimize the ratio between system performance and costs.

Upon completion of the PTB/MSP project plans and runs, participants were requested to record the plan results: duration, cost, performance, as well as to fill out a questionnaire focused on tradeoff analysis and decision making.

6.3. Data Analysis

The data were analyzed using two statistical procedures: the Chi-square test and the Analysis of Variance (ANOVA). ANOVA is aimed at testing the differences between the means of more than two samples, and is based on the partitioning of the variance in the data into different sources.

The results of the analysis are reported below in the following format: Chi-square = XX; df = XX, P < XX. The value of the test statistic (Chi-square) is presented first, followed by the number of degrees of freedom (df) used in the test. Finally, significance is indicated by P, the probability of making an error in claiming that the difference is significant. Any probability less than 5% is interpreted in behavioral science as a significant difference.
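For illustration only, the following Python snippet reproduces the two procedures and the reporting format on synthetic, made-up numbers (not the study's data), using SciPy:

```python
# Illustrative re-creation of the two statistical procedures described above,
# run on synthetic data that is NOT taken from the experiments.
import numpy as np
from scipy import stats

# Chi-square test of independence on a synthetic contingency table,
# e.g. performance cluster (rows) vs. cost cluster (columns).
table = np.array([[12, 5],
                  [6, 14],
                  [4, 16]])
chi2, p, df, _ = stats.chi2_contingency(table)
print(f"Chi-square = {chi2:.2f}; df = {df}, P = {p:.3f}")

# One-way ANOVA comparing the means of three synthetic groups.
rng = np.random.default_rng(0)
g1, g2, g3 = rng.normal(50, 10, 20), rng.normal(55, 10, 20), rng.normal(62, 10, 20)
f_stat, p_anova = stats.f_oneway(g1, g2, g3)
print(f"F = {f_stat:.2f}, P = {p_anova:.3f}")  # P < 0.05 is read as significant
```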

6.4. Results

Performance was grouped into three clusters: low (benefit under 20,000), moderate (benefit between 20,000 and 80,000) and high (benefit over 80,000).

1) The effect on tradeoff analysis is shown by a significant correlation between performance and cost as illustrated in Figure 1. (Chi Square = 5.99, df = 2, P < 0.05)―the better the performance, the higher the cost.

2) The relationship between the notion that the tool supports decision making and the will to integrate it before or during project life: There is a significant correlation (F = 3.5, df = 4, P < 0.05) between perceiving the simulator as a supporting tool for making decisions and the will to integrate it as a tool for making decisions before or during project performance.

3) The influence of the tool used on the tradeoff analysis was determined by statistically analyzing the questionnaire answers. A signed-rank test, which resembles a one-sample t-test, was performed on the paired answers: the differences (PTB-MSP) between the answer given after using the PTB and the answer given after using MSP were analyzed (a sketch of this analysis appears below). When the mean difference was positive and the P value was below 0.05, the result was significant and in favor of the PTB. In the other cases the mean difference was negative, but the P value was not significant (>0.05), so no conclusion could be drawn in favor of MSP. An example of the analysis results is depicted in Table 1.
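A minimal sketch of this paired analysis on synthetic questionnaire responses (the numbers below are made up for illustration and are not the experiment's data):

```python
# Signed-rank test on the per-participant differences (PTB - MSP) for one question.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
ptb_answers = rng.integers(3, 6, size=57)   # hypothetical 1-5 Likert answers after using the PTB
msp_answers = rng.integers(2, 5, size=57)   # hypothetical answers after using MSP

diff = ptb_answers - msp_answers
stat, p = stats.wilcoxon(diff)              # Wilcoxon signed-rank test on the paired differences
print(f"mean difference = {diff.mean():.2f}, P = {p:.4f}")
# A positive mean difference with P < 0.05 would be read as favoring the PTB.
```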

Ten out of 16 questions (over 60%) yielded statistically significant results in favor of SBT. Even where no significance was found, there was no advantage in favor of MSP: whenever the observed difference was negative, it was not significant.

The conclusion is, therefore, that SBT improves tradeoff analysis and decision making.

Figure 1. One-way analysis of cash by benefit group.

Table 1. Difference analysis results summary on the question level. Significant results are indicated by yellow highlight.

7. Summary

New Product Development is a combination of art and science. It is the art of dealing with people in a dynamic, frequently uncertain environment and of riding the learning curve in a non-repetitive environment. It is the science of solving hard combinatorial, stochastic problems of NPD project planning, monitoring and control under resource and budget constraints. SBT supports training in both aspects of project management. By using the PTB in team settings, the art of NPD project management can be practiced; by using SBT to plan, monitor, and control NPD projects, the science of NPD project management is mastered.

Cite this paper

Avraham Shtub (2016). New Product Development—Experience from Distance Learning and Simulation-Based Training. Creative Education, 7, 105-113. doi: 10.4236/ce.2016.71011

References

1. Alessi, S. M. (1998). Fidelity in the Design of Instructional Simulations. Journal of Computer-Based Instruction, 15, 40-47.

2. Archer, J. E., Conway, R., & Schneider, F. B. (1984). User Recovery and Reversal in Interactive Systems. ACM Transactions on Programming Languages and Systems, 6, 1-19.
    http://dx.doi.org/10.1145/357233.357234

3. Beyerlein, S., Ford, M., & Apple, D. (1996). The Learning Assessment Journal as a Tool for Structured Reflection in Process Education. IEEE Proceedings of Frontiers in Education '96 (pp. 310-313).

4. Cuevas, H. M., Fiore, S. M., Bowers, C. A., & Salas, E. (2004). Fostering Constructive Cognitive and Metacognitive Task in Computer-Based Complex Task Training Environments. Computers in Human Behavior, 20, 225-241.
    http://dx.doi.org/10.1016/j.chb.2003.10.016

5. Davidovitch, L., Parush, A., & Shtub, A. (2006). Simulation-Based Learning in Engineering Education: Performance and Transfer in Learning Project Management. Journal of Engineering Education, 95, 289-299.
    http://dx.doi.org/10.1002/j.2168-9830.2006.tb00904.x

6. Davidovitch, L., Parush, A., & Shtub, A. (2008). Simulation-Based Learning: The Learning-Forgetting-Relearning Process and Impact of Learning History. Computers and Education, 50, 866-880.
    http://dx.doi.org/10.1016/j.compedu.2006.09.003

7. Davidovitch, L., Parush, A., & Shtub, A. (2009). The Impact of Functional Fidelity in Simulator-Based Learning of Project Management. International Journal of Engineering Education, 25, 333-340.

8. Grieshop, J. I. (1987). Games: Powerful Tools for Learning. Journal of Extension, 25.

9. Guzdial, M., Kolodner, J., Hmelo, C., Narayanan, H., Carlso, D., Rappin, N., Hubscher, R., Turns, J., & Newstetter, W. (1996). The Collaboratory Notebook. Communications of the ACM, 39, 32-33.
    http://dx.doi.org/10.1145/227210.227218

10. Iluz, M., & Shtub, A. (2013). Simulator Based Training to Improve Tradeoffs Analysis and Decision Making in Lean Development Environment. In Advances in Production Management Systems: Sustainable Production and Service Supply Chains, IFIP WG 5.7 International Conference, APMS 2013, Vol. 415 (pp. 108-117). State College.
    http://dx.doi.org/10.1007/978-3-642-41263-9_14

11. International Journal of Engineering in Education, Special Issue on Simulators for Engineering Education and for Professional Development, 25, February 2009.

12. Keys, B. (1976). A Review of Learning Research in Business Gaming. In B. H. Sord (Ed.), Proceedings of the Third Annual Conference of the Association for Business Simulation and Experimental Learning. Knoxville, TN: ABSEL.

13. Kirby, A. (1992). Games for Trainers (Vol. 1). Cambridge: Gower.

14. Knoppen, D., & Sáenz, M. J. (2007). Supply Chain Collaboration Games: A Conceptual Model of the Gaming Process. In M. Taisch, & J. Cassina (Eds.), Learning with Games. Italy: Mar. Co.

15. Kolb, D. A. (1984). Experiential Learning. England: Prentice Hall.

16. Meijer, S., Hofstede, G. J., Beers, G., & Omta, S. W. (2006). Trust and Tracing Game: Learning about Transactions and Embeddedness in a Trade Network. Production Planning and Control, 17, 569-583.
    http://dx.doi.org/10.1080/09537280600866629

17. Millians, D. (1999). Thirty Years and More of Simulations and Games. Simulation & Gaming, 30, 352-355.
    http://dx.doi.org/10.1177/104687819903000311

18. Parush, A., Hamm, H., & Shtub, A. (2002). Learning Histories in Simulation-Based Teaching: The Effects on Self-Learning and Transfer. Computers and Education, 39, 319-332.
    http://dx.doi.org/10.1016/S0360-1315(02)00043-X

19. Prechelt, L. (2001). Accelerating Learning from Experience: Avoiding Defects Faster. IEEE Software, 18, 56-61.
    http://dx.doi.org/10.1109/52.965803

20. Randel, J., Morris, B. A., Wetzel, C. D., & Whitehill, B. V. (1992). The Effectiveness of Games for Educational Purposes: A Review of Recent Research. Simulation & Gaming, 23, 261-276.
    http://dx.doi.org/10.1177/1046878192233001

21. Ruben, B. D. (1999). Simulations, Games and Experience-Based Learning: The Quest for a New Paradigm for Teaching and Learning. Simulation & Gaming, 30, 498-505.
    http://dx.doi.org/10.1177/104687819903000409

22. Shtub, A. (1999). Enterprise Resource Planning: The Dynamics of Operations Management. Norwell, MA: Kluwer.

23. Shtub, A. (2001). Teaching Operations in the Enterprise Resource Planning (ERP) Era. International Journal of Production Research, 39, 567-576.
    http://dx.doi.org/10.1080/00207540010009714

24. Smeds, R., & Riis, J. O. (Eds.) (1998). Experimental Learning in Production Management. London: Chapman and Hall.
    http://dx.doi.org/10.1007/978-0-387-35354-8

25. Thoben, K. D., Hauge, J. B., Smeds, R., & Riis, J. O. (Eds.) (2007). Multidisciplinary Research on New Methods for Learning and Innovation in Enterprise Networks. Aachen: Verlaag Mainz.

26. Thompson, T. H., Purdy, J. M., & Fandt, P. M. (1997). Building a Strong Foundation Using a Computer Simulation in an Introductory Management Course. Journal of Management Education, 21, 418-434.
    http://dx.doi.org/10.1177/105256299702100312

27. Vargo, C. G., Brown, C. E., & Swierenga, S. J. (1992). An Evaluation of Computer-Supported Backtracking in a Hierarchical Database. In Proceedings of the Human Factors Society's 36th Annual Meeting (pp. 356-360). Santa Monica, CA: Human Factors Society.

28. Wang, G. G. (2004). Bringing Games into the Classroom in Teaching Quality Control. International Journal of Engineering Education, 20, 678-689.

29. Wolfe, J. (1993). A History of Business Teaching Games in English-Speaking and Post-Socialist Countries: The Origination and Diffusion of a Management Education and Development Technology. Simulation & Gaming, 24, 446-463.
    http://dx.doi.org/10.1177/1046878193244003

30. Wolfe, J., & Keys, J. B. (Eds.) (1997). Business Simulations, Games and Experiential Learning in International Business Education. New York: International Business Press.